Sep 30 07:33:33 crc systemd[1]: Starting Kubernetes Kubelet...
Sep 30 07:33:33 crc restorecon[4668]: Relabeled /var/lib/kubelet/config.json from system_u:object_r:unlabeled_t:s0 to system_u:object_r:container_var_lib_t:s0
Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/device-plugins not reset as customized by admin to system_u:object_r:container_file_t:s0
Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/device-plugins/kubelet.sock not reset as customized by admin to system_u:object_r:container_file_t:s0
Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/volumes/kubernetes.io~configmap/nginx-conf/..2025_02_23_05_40_35.4114275528/nginx.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/22e96971 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/21c98286 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/0f1869e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/46889d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/5b6a5969 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/6c7921f5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4804f443 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/2a46b283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/a6b5573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4f88ee5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/5a4eee4b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/cd87c521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/38602af4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/1483b002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/0346718b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/d3ed4ada not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/3bb473a5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/8cd075a9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/00ab4760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/54a21c09 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/70478888 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/43802770 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/955a0edc not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/bca2d009 not reset as customized by admin to system_u:object_r:container_file_t:s0:c140,c1009
Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/b295f9bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/bc46ea27 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5731fc1b not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5e1b2a3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/943f0936 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/3f764ee4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/8695e3f9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/aed7aa86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/c64d7448 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/0ba16bd2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/207a939f not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/54aa8cdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/1f5fa595 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/bf9c8153 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/47fba4ea not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/7ae55ce9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7906a268 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/ce43fa69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7fc7ea3a not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/d8c38b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/9ef015fb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/b9db6a41 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/b1733d79 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/afccd338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/9df0a185 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/18938cf8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/7ab4eb23 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/56930be6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_35.630010865 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/0d8e3722 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/d22b2e76 not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/e036759f not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/2734c483 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/57878fe7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/3f3c2e58 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/375bec3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/7bc41e08 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/48c7a72d not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/4b66701f not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/a5a1c202 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_40.1388695756 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/26f3df5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/6d8fb21d not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/50e94777 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208473b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/ec9e08ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3b787c39 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208eaed5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/93aa3a2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3c697968 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/ba950ec9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/cb5cdb37 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/f2df9827 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/fedaa673 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/9ca2df95 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/b2d7460e not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2207853c not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/241c1c29 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2d910eaf not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/c6c0f2e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/399edc97 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8049f7cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/0cec5484 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/312446d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c406,c828
Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8e56a35d not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Sep 30 07:33:33 crc restorecon[4668]:
/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/2d30ddb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/eca8053d not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/c3a25c9a not reset as customized by admin to system_u:object_r:container_file_t:s0:c168,c522 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/b9609c22 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/e8b0eca9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/b36a9c3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711 Sep 30 07:33:33 crc restorecon[4668]: 
/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/38af7b07 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/ae821620 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/baa23338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/2c534809 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c661,c999 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/59b29eae not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/c91a8e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/4d87494a not reset as customized by admin to system_u:object_r:container_file_t:s0:c442,c857 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/1e33ca63 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/8dea7be2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d0b04a99 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d84f01e7 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c12,c18 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/4109059b not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/a7258a3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/05bdf2b6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/f3261b51 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/315d045e not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/5fdcf278 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/d053f757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/c2850dc7 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fcfb0b2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c7ac9b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:33 crc 
restorecon[4668]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fa0c0d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c609b6ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/2be6c296 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/89a32653 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/4eb9afeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/13af6efa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/b03f9724 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/e3d105cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Sep 30 07:33:33 crc restorecon[4668]: 
/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/3aed4d83 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/0765fa6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/2cefc627 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c18 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/3dcc6345 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/365af391 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Sep 30 07:33:33 crc restorecon[4668]: 
/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b1130c0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/236a5913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b9432e26 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/5ddb0e3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/986dc4fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/8a23ff9a not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c9,c12 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/9728ae68 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/665f31d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Sep 30 07:33:33 crc restorecon[4668]: 
/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Sep 30 07:33:33 crc 
restorecon[4668]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/136c9b42 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/98a1575b not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/cac69136 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/5deb77a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/2ae53400 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Sep 30 
07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/e46f2326 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/dc688d3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/3497c3cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/177eb008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c13 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/af5a2afa not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/d780cb1f not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/49b0f374 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Sep 30 07:33:33 crc restorecon[4668]: 
/var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/26fbb125 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/cf14125a not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/b7f86972 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c5,c11 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/e51d739c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/88ba6a69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/669a9acf not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/5cd51231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/75349ec7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/15c26839 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/45023dcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/2bb66a50 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/64d03bdd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Sep 30 07:33:33 crc 
restorecon[4668]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/ab8e7ca0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/bb9be25f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/9a0b61d3 not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/d471b9d2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/8cb76b8e not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/11a00840 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/ec355a92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/992f735e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797/config.yaml not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d59cdbbc not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/72133ff0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/c56c834c not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d13724c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/0a498258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Sep 30 07:33:33 crc restorecon[4668]: 
/var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa471982 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fc900d92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa7d68da not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/4bacf9b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/424021b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/fc2e31a3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/f51eefac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/c8997f2f not reset as customized 
by admin to system_u:object_r:container_file_t:s0:c9,c22 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/7481f599 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/fdafea19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Sep 30 07:33:33 crc restorecon[4668]: 
/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/d0e1c571 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/ee398915 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/682bb6b8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a3e67855 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a989f289 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/915431bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/7796fdab not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/dcdb5f19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/a3aaa88c not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/5508e3e6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/160585de not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/e99f8da3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/8bc85570 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/a5861c91 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/84db1135 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/9e1a6043 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/c1aba1c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/d55ccd6d not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Sep 30 07:33:33 crc restorecon[4668]: 
/var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/971cc9f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/8f2e3dcf not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/ceb35e9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/1c192745 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/5209e501 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/f83de4df not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/e7b978ac not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/c64304a1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/5384386b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/etc-hosts not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c268,c620 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/cce3e3ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/8fb75465 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/740f573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/32fd1134 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/0a861bd3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/80363026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/bfa952a8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c129,c158 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..2025_02_23_05_33_31.333075221 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Sep 30 07:33:33 crc 
restorecon[4668]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/793bf43d not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/7db1bb6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/4f6a0368 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/c12c7d86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/36c4a773 not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/4c1e98ae not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/a4c8115c not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/setup/7db1802e not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Sep 30 07:33:33 crc restorecon[4668]: 
/var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver/a008a7ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-syncer/2c836bac not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-regeneration-controller/0ce62299 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-insecure-readyz/945d2457 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-check-endpoints/7d5c1dd8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:33 crc restorecon[4668]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:33 crc restorecon[4668]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:33 crc restorecon[4668]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/index.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:33 crc restorecon[4668]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:33 crc restorecon[4668]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:33 crc restorecon[4668]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:33 crc restorecon[4668]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:33 crc restorecon[4668]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:33 crc restorecon[4668]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:33 crc restorecon[4668]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:33 crc restorecon[4668]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/bundle-v1.15.0.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/channel.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/package.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:33 crc restorecon[4668]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/bc8d0691 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/6b76097a not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/34d1af30 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/312ba61c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/645d5dd1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/16e825f0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/4cf51fc9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/2a23d348 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/075dbd49 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/dd585ddd not reset as customized by admin to system_u:object_r:container_file_t:s0:c377,c642
Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/17ebd0ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c343
Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/005579f4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_23_11.1287037894 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/bf5f3b9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263
Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/af276eb7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701
Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/ea28e322 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/692e6683 not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263
Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/871746a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701
Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/4eb2e958 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261/console-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/console-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/ca9b62da not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/0edd6fce not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/tls-ca-bundle.pem not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c14,c22 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/containers/controller-manager/89b4555f not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/655fcd71 not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/0d43c002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/e68efd17 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/9acf9b65 not reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/5ae3ff11 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/1e59206a not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/27af16d1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c304,c1017 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/7918e729 not reset as customized by admin to system_u:object_r:container_file_t:s0:c853,c893 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/5d976d0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c585,c981 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c5,c25 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c5,c25 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/d7f55cbb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/f0812073 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/1a56cbeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/7fdd437e not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/cdfb5652 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c336,c787 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 
Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Sep 30 
07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Sep 30 07:33:33 crc restorecon[4668]: 
/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/fix-audit-permissions/fb93119e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver/f1e8fc0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver-check-endpoints/218511f3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Sep 30 07:33:33 crc restorecon[4668]: 
/var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server/serving-certs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/ca8af7b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/72cc8a75 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/6e8a3760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Sep 30 07:33:33 crc 
restorecon[4668]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4c3455c0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/2278acb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4b453e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/3ec09bda not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 07:33:33 crc restorecon[4668]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2/cacerts.bin not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java/cacerts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl/ca-bundle.trust.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 07:33:33 crc 
restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/email-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/objsign-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2ae6433e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fde84897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75680d2e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/openshift-service-serving-signer_1740288168.pem not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/facfc4fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f5a969c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CFCA_EV_ROOT.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9ef4a08a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ingress-operator_1740288202.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2f332aed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/248c8271.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 07:33:33 crc restorecon[4668]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d10a21f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ACCVRAIZ1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a94d09e5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c9a4d3b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40193066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd8c0d63.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b936d1c6.0 not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CA_Disig_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4fd49c6c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM_SERVIDORES_SEGUROS.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b81b93f0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f9a69fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b30d5fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 07:33:33 crc restorecon[4668]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ANF_Secure_Server_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b433981b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93851c9e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9282e51c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7dd1bc4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Actalis_Authentication_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/930ac5d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 07:33:33 crc restorecon[4668]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f47b495.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e113c810.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5931b5bc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Commercial.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2b349938.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e48193cf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/302904dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a716d4ed.0 not reset as customized 
by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Networking.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93bc0acc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/86212b19.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b727005e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbc54cab.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 07:33:33 crc restorecon[4668]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f51bb24c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c28a8a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9c8dfbd4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ccc52f49.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cb1c3204.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ce5e74ef.0 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd08c599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6d41d539.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb5fa911.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e35234b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 07:33:33 crc restorecon[4668]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8cb5ee0f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a7c655d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f8fc53da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/de6d66f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d41b5e2a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/41a3f684.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1df5a75f.0 not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_2011.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e36a6752.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b872f2b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9576d26b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/228f89db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_ECC_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb717492.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 07:33:33 crc restorecon[4668]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d21b73c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b1b94ef.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/595e996b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_RSA_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b46e03d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/128f4b91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_3_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 07:33:33 crc restorecon[4668]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81f2d2b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Autoridad_de_Certificacion_Firmaprofesional_CIF_A62634068.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3bde41ac.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d16a5865.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_EC-384_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0179095f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 07:33:33 crc restorecon[4668]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ffa7f1eb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9482e63a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4dae3dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e359ba6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7e067d03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/95aff9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7746a63.0 not reset as customized 
by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Baltimore_CyberTrust_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/653b494a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3ad48a91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_2_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/54657681.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/82223c44.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 07:33:33 crc restorecon[4668]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8de2f56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d9dafe4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d96b65e2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee64a828.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40547a79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5a3f0ff8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a780d93.0 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/34d996fb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/eed8c118.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/89c02a45.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b1159c4c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 07:33:33 crc restorecon[4668]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d6325660.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4c339cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8312c4c1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_E1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8508e720.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5fdd185d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48bec511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/69105f4f.0 not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b9bc432.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/32888f65.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b03dec0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 07:33:33 crc restorecon[4668]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/219d9499.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5acf816d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbf06781.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc99f41e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 07:33:33 crc restorecon[4668]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AAA_Certificate_Services.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/985c1f52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8794b4e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_BR_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7c037b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 07:33:33 crc restorecon[4668]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ef954a4e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_EV_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2add47b6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/90c5a3c8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0f3e76e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/53a1b57a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 07:33:33 crc restorecon[4668]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_EV_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5ad8a5d6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/68dd7389.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d04f354.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 07:33:33 crc restorecon[4668]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d6437c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/062cdee6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bd43e1dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7f3d5d1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c491639e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 07:33:33 crc restorecon[4668]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3513523f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/399e7759.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/feffd413.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d18e9066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/607986c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c90bc37d.0 not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1b0f7e5c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e08bfd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dd8e9d41.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed39abd0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a3418fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bc3f2570.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 07:33:33 crc restorecon[4668]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_High_Assurance_EV_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/244b5494.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81b9768f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4be590e0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_ECC_P384_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9846683b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 07:33:33 crc restorecon[4668]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/252252d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e8e7201.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_RSA4096_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d52c538d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c44cc0c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 07:33:33 crc restorecon[4668]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Trusted_Root_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75d1b2ed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a2c66da8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ecccd8db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust.net_Certification_Authority__2048_.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/aee5f10d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 07:33:33 crc restorecon[4668]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e7271e8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0e59380.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4c3982f2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b99d060.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf64f35b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0a775a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/002c0b4f.0 not reset 
as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cc450945.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_EC1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/106f3e4d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b3fb433b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4042bcee.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 07:33:33 crc 
restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/02265526.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/455f1b52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0d69c7e1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9f727ac7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5e98733a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0cd152c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 07:33:33 crc restorecon[4668]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc4d6a89.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6187b673.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/FIRMAPROFESIONAL_CA_ROOT-A_WEB.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ba8887ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/068570d1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f081611a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48a195d8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GDCA_TrustAUTH_R5_ROOT.pem 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f6fa695.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab59055e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b92fd57f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GLOBALTRUST_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fa5da96b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ec40989.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7719f463.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 07:33:33 crc restorecon[4668]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1001acf7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f013ecaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/626dceaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c559d742.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1d3472b9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9479c8c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a81e292b.0 not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4bfab552.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e071171e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/57bcb2da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_ECC_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 
Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab5346f4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5046c355.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_RSA_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/865fbdf9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da0cfd1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/85cde254.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_ECC_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 07:33:33 crc restorecon[4668]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbb3f32b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureSign_RootCA11.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5860aaa6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/31188b5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HiPKI_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c7f1359b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 07:33:33 crc restorecon[4668]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f15c80c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hongkong_Post_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/09789157.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/18856ac4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e09d511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Commercial_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 07:33:33 crc restorecon[4668]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cf701eeb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d06393bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Public_Sector_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/10531352.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Izenpe.com.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureTrust_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0ed035a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 07:33:33 crc restorecon[4668]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsec_e-Szigno_Root_CA_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8160b96c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8651083.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2c63f966.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_ECC_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d89cda1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 07:33:33 crc restorecon[4668]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/01419da9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_RSA_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7a5b843.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_RSA_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf53fb88.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9591a472.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3afde786.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 07:33:33 crc restorecon[4668]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Gold_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NAVER_Global_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3fb36b73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d39b0a2c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a89d74c2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd58d51e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7db1890.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 07:33:33 crc restorecon[4668]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NetLock_Arany__Class_Gold__F__tan__s__tv__ny.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/988a38cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/60afe812.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f39fc864.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5443e9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GB_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e73d606e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 07:33:33 crc restorecon[4668]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dfc0fe80.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b66938e9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e1eab7c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GC_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/773e07ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c899c73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d59297b8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ddcda989.0 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_1_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/749e9e03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/52b525c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7e8dc79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a819ef2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 07:33:33 crc restorecon[4668]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/08063a00.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b483515.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/064e0aa9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1f58a078.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6f7454b3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7fa05551.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3.pem not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76faf6c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9339512a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f387163d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee37c333.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e18bfb83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e442e424.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 07:33:33 crc restorecon[4668]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fe8a2cd8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/23f4c490.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5cd81ad7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0c70a8d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7892ad52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SZAFIR_ROOT_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 07:33:33 crc restorecon[4668]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4f316efb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_RSA_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/06dc52d5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/583d0756.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0bf05006.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 07:33:33 crc restorecon[4668]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/88950faa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9046744a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c860d51.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_RSA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6fa5da56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/33ee480d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Secure_Global_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 07:33:33 crc restorecon[4668]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/63a2c897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_ECC_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bdacca6f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ff34af3f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbff3a01.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_ECC_RootCA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_C1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 07:33:33 crc restorecon[4668]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/406c9bb1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_C3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Services_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Silver_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/99e1b953.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 07:33:33 crc restorecon[4668]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/14bc7599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TUBITAK_Kamu_SM_SSL_Kok_Sertifikasi_-_Surum_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a3adc42.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 07:33:33 crc restorecon[4668]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f459871d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_ECC_Root_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_RSA_Root_2023.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TeliaSonera_Root_CA_v1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telia_Root_CA_v2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 07:33:33 crc restorecon[4668]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f103249.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f058632f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-certificates.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9bf03295.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/98aaf404.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 07:33:33 crc restorecon[4668]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1cef98f5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/073bfcc5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2923b3f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f249de83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/edcbddb5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 07:33:33 crc restorecon[4668]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P256_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b5697b0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ae85e5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b74d2bd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P384_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d887a5bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 07:33:33 crc restorecon[4668]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9aef356c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TunTrust_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd64f3fc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e13665f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Extended_Validation_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f5dc4f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da7377f6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 07:33:33 crc restorecon[4668]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Global_G2_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c01eb047.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/304d27c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed858448.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f30dd6ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/04f60c28.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 07:33:33 crc restorecon[4668]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_ECC_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fc5a8f99.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/35105088.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee532fd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/XRamp_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/706f604c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 07:33:33 crc restorecon[4668]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76579174.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d86cdd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/882de061.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f618aec.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a9d40e02.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e-Szigno_Root_CA_2017.pem 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e868b802.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/83e9984f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ePKI_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca6e4ad9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d6523ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4b718d9b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/869fbf79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 07:33:33 crc restorecon[4668]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/containers/registry/f8d22bdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/6e8bbfac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/54dd7996 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/a4f1bb05 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/207129da not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/c1df39e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/15b8f1cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Sep 30 07:33:33 crc restorecon[4668]: 
/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Sep 30 07:33:33 crc restorecon[4668]: 
/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/77bd6913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/2382c1b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/704ce128 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/70d16fe0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/bfb95535 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/57a8e8e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Sep 30 07:33:33 crc restorecon[4668]: 
/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711 not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/1b9d3e5e not reset as customized by admin to system_u:object_r:container_file_t:s0:c107,c917 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/fddb173c not reset as customized by admin to system_u:object_r:container_file_t:s0:c202,c983 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/95d3c6c4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c219,c404 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/bfb5fff5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/2aef40aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/c0391cad not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/1119e69d not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/660608b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/8220bd53 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/85f99d5c not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Sep 30 07:33:33 crc restorecon[4668]: 
/var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/4b0225f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/9c2a3394 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007
Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/e820b243 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/1ca52ea0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007
Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/e6988e45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/6655f00b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/98bc3986 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/08e3458a not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/2a191cb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/6c4eeefb not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/f61a549c not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/24891863 not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572
Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/fbdfd89c not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/9b63b3bc not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572
Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/8acde6d6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/node-driver-registrar/59ecbba3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/csi-provisioner/685d4be3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/containers/route-controller-manager/feaea55e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 07:33:33 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 
crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc 
restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/63709497 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/d966b7fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/f5773757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/81c9edb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/57bf57ee not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/86f5e6aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/0aabe31d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/d2af85c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/09d157d9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 
Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller/catalog.json not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc 
restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 
Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator/catalog.json 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc 
restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc 
restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator/catalog.json not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c0fe7256 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c30319e4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/e6b1dd45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/2bb643f0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/920de426 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/70fa1e87 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/a1c12a2f not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/9442e6c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/5b45ec72 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/3c9f3a59 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/1091c11b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/9a6821c6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/ec0c35e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/517f37e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/6214fe78 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/ba189c8b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/351e4f31 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/c0f219ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/etc-hosts 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/8069f607 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/559c3d82 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/605ad488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/148df488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/3bf6dcb4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/022a2feb not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/938c3924 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/729fe23e not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/1fd5cbd4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c247,c522 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/a96697e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/e155ddca not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/10dd0e0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Sep 30 07:33:34 crc restorecon[4668]: 
/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Sep 30 07:33:34 crc 
restorecon[4668]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Sep 30 07:33:34 crc restorecon[4668]: 
/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/6f2c8392 not reset as customized by admin to system_u:object_r:container_file_t:s0:c267,c588 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/bd241ad9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/plugins not reset as customized by admin to system_u:object_r:container_file_t:s0 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/plugins/csi-hostpath not reset as customized by admin to system_u:object_r:container_file_t:s0 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/plugins/csi-hostpath/csi.sock not reset as customized by admin to system_u:object_r:container_file_t:s0 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/plugins/kubernetes.io not reset as customized by admin to system_u:object_r:container_file_t:s0 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/plugins/kubernetes.io/csi not reset as customized by admin to system_u:object_r:container_file_t:s0 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983 not reset as customized by admin to 
system_u:object_r:container_file_t:s0 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount not reset as customized by admin to system_u:object_r:container_file_t:s0 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/vol_data.json not reset as customized by admin to system_u:object_r:container_file_t:s0 Sep 30 07:33:34 crc restorecon[4668]: /var/lib/kubelet/plugins_registry not reset as customized by admin to system_u:object_r:container_file_t:s0 Sep 30 07:33:34 crc restorecon[4668]: Relabeled /var/usrlocal/bin/kubenswrapper from system_u:object_r:bin_t:s0 to system_u:object_r:kubelet_exec_t:s0 Sep 30 07:33:34 crc kubenswrapper[4760]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Sep 30 07:33:34 crc kubenswrapper[4760]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version. Sep 30 07:33:34 crc kubenswrapper[4760]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Sep 30 07:33:34 crc kubenswrapper[4760]: Flag --register-with-taints has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. 
Sep 30 07:33:34 crc kubenswrapper[4760]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI. Sep 30 07:33:34 crc kubenswrapper[4760]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Sep 30 07:33:34 crc kubenswrapper[4760]: I0930 07:33:34.789776 4760 server.go:211] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Sep 30 07:33:34 crc kubenswrapper[4760]: W0930 07:33:34.794856 4760 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Sep 30 07:33:34 crc kubenswrapper[4760]: W0930 07:33:34.794880 4760 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Sep 30 07:33:34 crc kubenswrapper[4760]: W0930 07:33:34.794886 4760 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Sep 30 07:33:34 crc kubenswrapper[4760]: W0930 07:33:34.794890 4760 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Sep 30 07:33:34 crc kubenswrapper[4760]: W0930 07:33:34.794894 4760 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Sep 30 07:33:34 crc kubenswrapper[4760]: W0930 07:33:34.794898 4760 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Sep 30 07:33:34 crc kubenswrapper[4760]: W0930 07:33:34.794902 4760 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Sep 30 07:33:34 crc kubenswrapper[4760]: W0930 07:33:34.794906 4760 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Sep 30 07:33:34 crc kubenswrapper[4760]: W0930 07:33:34.794911 4760 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Sep 30 07:33:34 crc kubenswrapper[4760]: W0930 
07:33:34.794916 4760 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Sep 30 07:33:34 crc kubenswrapper[4760]: W0930 07:33:34.794921 4760 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Sep 30 07:33:34 crc kubenswrapper[4760]: W0930 07:33:34.794926 4760 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Sep 30 07:33:34 crc kubenswrapper[4760]: W0930 07:33:34.794932 4760 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. Sep 30 07:33:34 crc kubenswrapper[4760]: W0930 07:33:34.794938 4760 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Sep 30 07:33:34 crc kubenswrapper[4760]: W0930 07:33:34.794942 4760 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Sep 30 07:33:34 crc kubenswrapper[4760]: W0930 07:33:34.794954 4760 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. Sep 30 07:33:34 crc kubenswrapper[4760]: W0930 07:33:34.794959 4760 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Sep 30 07:33:34 crc kubenswrapper[4760]: W0930 07:33:34.794964 4760 feature_gate.go:330] unrecognized feature gate: PlatformOperators Sep 30 07:33:34 crc kubenswrapper[4760]: W0930 07:33:34.794967 4760 feature_gate.go:330] unrecognized feature gate: PinnedImages Sep 30 07:33:34 crc kubenswrapper[4760]: W0930 07:33:34.794971 4760 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Sep 30 07:33:34 crc kubenswrapper[4760]: W0930 07:33:34.794975 4760 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Sep 30 07:33:34 crc kubenswrapper[4760]: W0930 07:33:34.794979 4760 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Sep 30 07:33:34 crc kubenswrapper[4760]: W0930 07:33:34.794982 4760 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Sep 30 07:33:34 crc kubenswrapper[4760]: W0930 
07:33:34.794986 4760 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Sep 30 07:33:34 crc kubenswrapper[4760]: W0930 07:33:34.794990 4760 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Sep 30 07:33:34 crc kubenswrapper[4760]: W0930 07:33:34.794993 4760 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Sep 30 07:33:34 crc kubenswrapper[4760]: W0930 07:33:34.794997 4760 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Sep 30 07:33:34 crc kubenswrapper[4760]: W0930 07:33:34.795001 4760 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Sep 30 07:33:34 crc kubenswrapper[4760]: W0930 07:33:34.795005 4760 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Sep 30 07:33:34 crc kubenswrapper[4760]: W0930 07:33:34.795008 4760 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Sep 30 07:33:34 crc kubenswrapper[4760]: W0930 07:33:34.795011 4760 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Sep 30 07:33:34 crc kubenswrapper[4760]: W0930 07:33:34.795015 4760 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Sep 30 07:33:34 crc kubenswrapper[4760]: W0930 07:33:34.795018 4760 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Sep 30 07:33:34 crc kubenswrapper[4760]: W0930 07:33:34.795022 4760 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Sep 30 07:33:34 crc kubenswrapper[4760]: W0930 07:33:34.795025 4760 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Sep 30 07:33:34 crc kubenswrapper[4760]: W0930 07:33:34.795029 4760 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Sep 30 07:33:34 crc kubenswrapper[4760]: W0930 07:33:34.795032 4760 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Sep 30 07:33:34 crc kubenswrapper[4760]: W0930 07:33:34.795035 4760 
feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Sep 30 07:33:34 crc kubenswrapper[4760]: W0930 07:33:34.795040 4760 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Sep 30 07:33:34 crc kubenswrapper[4760]: W0930 07:33:34.795044 4760 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Sep 30 07:33:34 crc kubenswrapper[4760]: W0930 07:33:34.795049 4760 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Sep 30 07:33:34 crc kubenswrapper[4760]: W0930 07:33:34.795052 4760 feature_gate.go:330] unrecognized feature gate: GatewayAPI Sep 30 07:33:34 crc kubenswrapper[4760]: W0930 07:33:34.795056 4760 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Sep 30 07:33:34 crc kubenswrapper[4760]: W0930 07:33:34.795060 4760 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Sep 30 07:33:34 crc kubenswrapper[4760]: W0930 07:33:34.795064 4760 feature_gate.go:330] unrecognized feature gate: InsightsConfig Sep 30 07:33:34 crc kubenswrapper[4760]: W0930 07:33:34.795067 4760 feature_gate.go:330] unrecognized feature gate: NewOLM Sep 30 07:33:34 crc kubenswrapper[4760]: W0930 07:33:34.795071 4760 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Sep 30 07:33:34 crc kubenswrapper[4760]: W0930 07:33:34.795074 4760 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Sep 30 07:33:34 crc kubenswrapper[4760]: W0930 07:33:34.795078 4760 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Sep 30 07:33:34 crc kubenswrapper[4760]: W0930 07:33:34.795081 4760 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Sep 30 07:33:34 crc kubenswrapper[4760]: W0930 07:33:34.795085 4760 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Sep 30 07:33:34 crc kubenswrapper[4760]: W0930 07:33:34.795095 4760 feature_gate.go:330] unrecognized feature gate: 
Example Sep 30 07:33:34 crc kubenswrapper[4760]: W0930 07:33:34.795099 4760 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Sep 30 07:33:34 crc kubenswrapper[4760]: W0930 07:33:34.795102 4760 feature_gate.go:330] unrecognized feature gate: OVNObservability Sep 30 07:33:34 crc kubenswrapper[4760]: W0930 07:33:34.795106 4760 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Sep 30 07:33:34 crc kubenswrapper[4760]: W0930 07:33:34.795109 4760 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Sep 30 07:33:34 crc kubenswrapper[4760]: W0930 07:33:34.795113 4760 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Sep 30 07:33:34 crc kubenswrapper[4760]: W0930 07:33:34.795117 4760 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Sep 30 07:33:34 crc kubenswrapper[4760]: W0930 07:33:34.795121 4760 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Sep 30 07:33:34 crc kubenswrapper[4760]: W0930 07:33:34.795125 4760 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Sep 30 07:33:34 crc kubenswrapper[4760]: W0930 07:33:34.795128 4760 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Sep 30 07:33:34 crc kubenswrapper[4760]: W0930 07:33:34.795131 4760 feature_gate.go:330] unrecognized feature gate: SignatureStores Sep 30 07:33:34 crc kubenswrapper[4760]: W0930 07:33:34.795135 4760 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Sep 30 07:33:34 crc kubenswrapper[4760]: W0930 07:33:34.795139 4760 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Sep 30 07:33:34 crc kubenswrapper[4760]: W0930 07:33:34.795142 4760 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Sep 30 07:33:34 crc kubenswrapper[4760]: W0930 07:33:34.795145 4760 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Sep 30 07:33:34 crc kubenswrapper[4760]: W0930 07:33:34.795149 4760 
feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Sep 30 07:33:34 crc kubenswrapper[4760]: W0930 07:33:34.795153 4760 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. Sep 30 07:33:34 crc kubenswrapper[4760]: W0930 07:33:34.795158 4760 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Sep 30 07:33:34 crc kubenswrapper[4760]: W0930 07:33:34.795161 4760 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Sep 30 07:33:34 crc kubenswrapper[4760]: W0930 07:33:34.795165 4760 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Sep 30 07:33:34 crc kubenswrapper[4760]: I0930 07:33:34.795298 4760 flags.go:64] FLAG: --address="0.0.0.0" Sep 30 07:33:34 crc kubenswrapper[4760]: I0930 07:33:34.795326 4760 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]" Sep 30 07:33:34 crc kubenswrapper[4760]: I0930 07:33:34.795350 4760 flags.go:64] FLAG: --anonymous-auth="true" Sep 30 07:33:34 crc kubenswrapper[4760]: I0930 07:33:34.795356 4760 flags.go:64] FLAG: --application-metrics-count-limit="100" Sep 30 07:33:34 crc kubenswrapper[4760]: I0930 07:33:34.795363 4760 flags.go:64] FLAG: --authentication-token-webhook="false" Sep 30 07:33:34 crc kubenswrapper[4760]: I0930 07:33:34.795367 4760 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s" Sep 30 07:33:34 crc kubenswrapper[4760]: I0930 07:33:34.795374 4760 flags.go:64] FLAG: --authorization-mode="AlwaysAllow" Sep 30 07:33:34 crc kubenswrapper[4760]: I0930 07:33:34.795381 4760 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s" Sep 30 07:33:34 crc kubenswrapper[4760]: I0930 07:33:34.795386 4760 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s" Sep 30 07:33:34 crc kubenswrapper[4760]: I0930 07:33:34.795390 4760 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id" Sep 30 07:33:34 crc kubenswrapper[4760]: I0930 
07:33:34.795396 4760 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig" Sep 30 07:33:34 crc kubenswrapper[4760]: I0930 07:33:34.795400 4760 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki" Sep 30 07:33:34 crc kubenswrapper[4760]: I0930 07:33:34.795404 4760 flags.go:64] FLAG: --cgroup-driver="cgroupfs" Sep 30 07:33:34 crc kubenswrapper[4760]: I0930 07:33:34.795409 4760 flags.go:64] FLAG: --cgroup-root="" Sep 30 07:33:34 crc kubenswrapper[4760]: I0930 07:33:34.795413 4760 flags.go:64] FLAG: --cgroups-per-qos="true" Sep 30 07:33:34 crc kubenswrapper[4760]: I0930 07:33:34.795417 4760 flags.go:64] FLAG: --client-ca-file="" Sep 30 07:33:34 crc kubenswrapper[4760]: I0930 07:33:34.795430 4760 flags.go:64] FLAG: --cloud-config="" Sep 30 07:33:34 crc kubenswrapper[4760]: I0930 07:33:34.795434 4760 flags.go:64] FLAG: --cloud-provider="" Sep 30 07:33:34 crc kubenswrapper[4760]: I0930 07:33:34.795439 4760 flags.go:64] FLAG: --cluster-dns="[]" Sep 30 07:33:34 crc kubenswrapper[4760]: I0930 07:33:34.795448 4760 flags.go:64] FLAG: --cluster-domain="" Sep 30 07:33:34 crc kubenswrapper[4760]: I0930 07:33:34.795454 4760 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf" Sep 30 07:33:34 crc kubenswrapper[4760]: I0930 07:33:34.795460 4760 flags.go:64] FLAG: --config-dir="" Sep 30 07:33:34 crc kubenswrapper[4760]: I0930 07:33:34.795465 4760 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json" Sep 30 07:33:34 crc kubenswrapper[4760]: I0930 07:33:34.795471 4760 flags.go:64] FLAG: --container-log-max-files="5" Sep 30 07:33:34 crc kubenswrapper[4760]: I0930 07:33:34.795478 4760 flags.go:64] FLAG: --container-log-max-size="10Mi" Sep 30 07:33:34 crc kubenswrapper[4760]: I0930 07:33:34.795483 4760 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock" Sep 30 07:33:34 crc kubenswrapper[4760]: I0930 07:33:34.795488 4760 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock" Sep 30 07:33:34 crc kubenswrapper[4760]: 
I0930 07:33:34.795492 4760 flags.go:64] FLAG: --containerd-namespace="k8s.io" Sep 30 07:33:34 crc kubenswrapper[4760]: I0930 07:33:34.795497 4760 flags.go:64] FLAG: --contention-profiling="false" Sep 30 07:33:34 crc kubenswrapper[4760]: I0930 07:33:34.795501 4760 flags.go:64] FLAG: --cpu-cfs-quota="true" Sep 30 07:33:34 crc kubenswrapper[4760]: I0930 07:33:34.795505 4760 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms" Sep 30 07:33:34 crc kubenswrapper[4760]: I0930 07:33:34.795509 4760 flags.go:64] FLAG: --cpu-manager-policy="none" Sep 30 07:33:34 crc kubenswrapper[4760]: I0930 07:33:34.795513 4760 flags.go:64] FLAG: --cpu-manager-policy-options="" Sep 30 07:33:34 crc kubenswrapper[4760]: I0930 07:33:34.795520 4760 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s" Sep 30 07:33:34 crc kubenswrapper[4760]: I0930 07:33:34.795524 4760 flags.go:64] FLAG: --enable-controller-attach-detach="true" Sep 30 07:33:34 crc kubenswrapper[4760]: I0930 07:33:34.795529 4760 flags.go:64] FLAG: --enable-debugging-handlers="true" Sep 30 07:33:34 crc kubenswrapper[4760]: I0930 07:33:34.795533 4760 flags.go:64] FLAG: --enable-load-reader="false" Sep 30 07:33:34 crc kubenswrapper[4760]: I0930 07:33:34.795537 4760 flags.go:64] FLAG: --enable-server="true" Sep 30 07:33:34 crc kubenswrapper[4760]: I0930 07:33:34.795541 4760 flags.go:64] FLAG: --enforce-node-allocatable="[pods]" Sep 30 07:33:34 crc kubenswrapper[4760]: I0930 07:33:34.795554 4760 flags.go:64] FLAG: --event-burst="100" Sep 30 07:33:34 crc kubenswrapper[4760]: I0930 07:33:34.795559 4760 flags.go:64] FLAG: --event-qps="50" Sep 30 07:33:34 crc kubenswrapper[4760]: I0930 07:33:34.795563 4760 flags.go:64] FLAG: --event-storage-age-limit="default=0" Sep 30 07:33:34 crc kubenswrapper[4760]: I0930 07:33:34.795567 4760 flags.go:64] FLAG: --event-storage-event-limit="default=0" Sep 30 07:33:34 crc kubenswrapper[4760]: I0930 07:33:34.795572 4760 flags.go:64] FLAG: --eviction-hard="" Sep 30 07:33:34 crc kubenswrapper[4760]: I0930 
07:33:34.795577 4760 flags.go:64] FLAG: --eviction-max-pod-grace-period="0" Sep 30 07:33:34 crc kubenswrapper[4760]: I0930 07:33:34.795581 4760 flags.go:64] FLAG: --eviction-minimum-reclaim="" Sep 30 07:33:34 crc kubenswrapper[4760]: I0930 07:33:34.795585 4760 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s" Sep 30 07:33:34 crc kubenswrapper[4760]: I0930 07:33:34.795590 4760 flags.go:64] FLAG: --eviction-soft="" Sep 30 07:33:34 crc kubenswrapper[4760]: I0930 07:33:34.795594 4760 flags.go:64] FLAG: --eviction-soft-grace-period="" Sep 30 07:33:34 crc kubenswrapper[4760]: I0930 07:33:34.795601 4760 flags.go:64] FLAG: --exit-on-lock-contention="false" Sep 30 07:33:34 crc kubenswrapper[4760]: I0930 07:33:34.795605 4760 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false" Sep 30 07:33:34 crc kubenswrapper[4760]: I0930 07:33:34.795609 4760 flags.go:64] FLAG: --experimental-mounter-path="" Sep 30 07:33:34 crc kubenswrapper[4760]: I0930 07:33:34.795620 4760 flags.go:64] FLAG: --fail-cgroupv1="false" Sep 30 07:33:34 crc kubenswrapper[4760]: I0930 07:33:34.795625 4760 flags.go:64] FLAG: --fail-swap-on="true" Sep 30 07:33:34 crc kubenswrapper[4760]: I0930 07:33:34.795630 4760 flags.go:64] FLAG: --feature-gates="" Sep 30 07:33:34 crc kubenswrapper[4760]: I0930 07:33:34.795637 4760 flags.go:64] FLAG: --file-check-frequency="20s" Sep 30 07:33:34 crc kubenswrapper[4760]: I0930 07:33:34.795642 4760 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Sep 30 07:33:34 crc kubenswrapper[4760]: I0930 07:33:34.795647 4760 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Sep 30 07:33:34 crc kubenswrapper[4760]: I0930 07:33:34.795653 4760 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Sep 30 07:33:34 crc kubenswrapper[4760]: I0930 07:33:34.795658 4760 flags.go:64] FLAG: --healthz-port="10248" Sep 30 07:33:34 crc kubenswrapper[4760]: I0930 07:33:34.795664 4760 flags.go:64] FLAG: --help="false" Sep 30 07:33:34 crc kubenswrapper[4760]: I0930 
07:33:34.795669 4760 flags.go:64] FLAG: --hostname-override="" Sep 30 07:33:34 crc kubenswrapper[4760]: I0930 07:33:34.795674 4760 flags.go:64] FLAG: --housekeeping-interval="10s" Sep 30 07:33:34 crc kubenswrapper[4760]: I0930 07:33:34.795678 4760 flags.go:64] FLAG: --http-check-frequency="20s" Sep 30 07:33:34 crc kubenswrapper[4760]: I0930 07:33:34.795683 4760 flags.go:64] FLAG: --image-credential-provider-bin-dir="" Sep 30 07:33:34 crc kubenswrapper[4760]: I0930 07:33:34.795688 4760 flags.go:64] FLAG: --image-credential-provider-config="" Sep 30 07:33:34 crc kubenswrapper[4760]: I0930 07:33:34.795692 4760 flags.go:64] FLAG: --image-gc-high-threshold="85" Sep 30 07:33:34 crc kubenswrapper[4760]: I0930 07:33:34.795696 4760 flags.go:64] FLAG: --image-gc-low-threshold="80" Sep 30 07:33:34 crc kubenswrapper[4760]: I0930 07:33:34.795700 4760 flags.go:64] FLAG: --image-service-endpoint="" Sep 30 07:33:34 crc kubenswrapper[4760]: I0930 07:33:34.795704 4760 flags.go:64] FLAG: --kernel-memcg-notification="false" Sep 30 07:33:34 crc kubenswrapper[4760]: I0930 07:33:34.795708 4760 flags.go:64] FLAG: --kube-api-burst="100" Sep 30 07:33:34 crc kubenswrapper[4760]: I0930 07:33:34.795712 4760 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Sep 30 07:33:34 crc kubenswrapper[4760]: I0930 07:33:34.795717 4760 flags.go:64] FLAG: --kube-api-qps="50" Sep 30 07:33:34 crc kubenswrapper[4760]: I0930 07:33:34.795721 4760 flags.go:64] FLAG: --kube-reserved="" Sep 30 07:33:34 crc kubenswrapper[4760]: I0930 07:33:34.795725 4760 flags.go:64] FLAG: --kube-reserved-cgroup="" Sep 30 07:33:34 crc kubenswrapper[4760]: I0930 07:33:34.795729 4760 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Sep 30 07:33:34 crc kubenswrapper[4760]: I0930 07:33:34.795733 4760 flags.go:64] FLAG: --kubelet-cgroups="" Sep 30 07:33:34 crc kubenswrapper[4760]: I0930 07:33:34.795737 4760 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Sep 30 07:33:34 crc 
kubenswrapper[4760]: I0930 07:33:34.795741 4760 flags.go:64] FLAG: --lock-file="" Sep 30 07:33:34 crc kubenswrapper[4760]: I0930 07:33:34.795746 4760 flags.go:64] FLAG: --log-cadvisor-usage="false" Sep 30 07:33:34 crc kubenswrapper[4760]: I0930 07:33:34.795750 4760 flags.go:64] FLAG: --log-flush-frequency="5s" Sep 30 07:33:34 crc kubenswrapper[4760]: I0930 07:33:34.795757 4760 flags.go:64] FLAG: --log-json-info-buffer-size="0" Sep 30 07:33:34 crc kubenswrapper[4760]: I0930 07:33:34.795763 4760 flags.go:64] FLAG: --log-json-split-stream="false" Sep 30 07:33:34 crc kubenswrapper[4760]: I0930 07:33:34.795767 4760 flags.go:64] FLAG: --log-text-info-buffer-size="0" Sep 30 07:33:34 crc kubenswrapper[4760]: I0930 07:33:34.795771 4760 flags.go:64] FLAG: --log-text-split-stream="false" Sep 30 07:33:34 crc kubenswrapper[4760]: I0930 07:33:34.795776 4760 flags.go:64] FLAG: --logging-format="text" Sep 30 07:33:34 crc kubenswrapper[4760]: I0930 07:33:34.795780 4760 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Sep 30 07:33:34 crc kubenswrapper[4760]: I0930 07:33:34.795785 4760 flags.go:64] FLAG: --make-iptables-util-chains="true" Sep 30 07:33:34 crc kubenswrapper[4760]: I0930 07:33:34.795799 4760 flags.go:64] FLAG: --manifest-url="" Sep 30 07:33:34 crc kubenswrapper[4760]: I0930 07:33:34.795805 4760 flags.go:64] FLAG: --manifest-url-header="" Sep 30 07:33:34 crc kubenswrapper[4760]: I0930 07:33:34.795812 4760 flags.go:64] FLAG: --max-housekeeping-interval="15s" Sep 30 07:33:34 crc kubenswrapper[4760]: I0930 07:33:34.795818 4760 flags.go:64] FLAG: --max-open-files="1000000" Sep 30 07:33:34 crc kubenswrapper[4760]: I0930 07:33:34.795824 4760 flags.go:64] FLAG: --max-pods="110" Sep 30 07:33:34 crc kubenswrapper[4760]: I0930 07:33:34.795829 4760 flags.go:64] FLAG: --maximum-dead-containers="-1" Sep 30 07:33:34 crc kubenswrapper[4760]: I0930 07:33:34.795835 4760 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Sep 30 07:33:34 crc 
kubenswrapper[4760]: I0930 07:33:34.795839 4760 flags.go:64] FLAG: --memory-manager-policy="None" Sep 30 07:33:34 crc kubenswrapper[4760]: I0930 07:33:34.795845 4760 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Sep 30 07:33:34 crc kubenswrapper[4760]: I0930 07:33:34.795851 4760 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Sep 30 07:33:34 crc kubenswrapper[4760]: I0930 07:33:34.795856 4760 flags.go:64] FLAG: --node-ip="192.168.126.11" Sep 30 07:33:34 crc kubenswrapper[4760]: I0930 07:33:34.795861 4760 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/control-plane=,node-role.kubernetes.io/master=,node.openshift.io/os_id=rhcos" Sep 30 07:33:34 crc kubenswrapper[4760]: I0930 07:33:34.795875 4760 flags.go:64] FLAG: --node-status-max-images="50" Sep 30 07:33:34 crc kubenswrapper[4760]: I0930 07:33:34.795880 4760 flags.go:64] FLAG: --node-status-update-frequency="10s" Sep 30 07:33:34 crc kubenswrapper[4760]: I0930 07:33:34.795886 4760 flags.go:64] FLAG: --oom-score-adj="-999" Sep 30 07:33:34 crc kubenswrapper[4760]: I0930 07:33:34.795891 4760 flags.go:64] FLAG: --pod-cidr="" Sep 30 07:33:34 crc kubenswrapper[4760]: I0930 07:33:34.795896 4760 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:33549946e22a9ffa738fd94b1345f90921bc8f92fa6137784cb33c77ad806f9d" Sep 30 07:33:34 crc kubenswrapper[4760]: I0930 07:33:34.795904 4760 flags.go:64] FLAG: --pod-manifest-path="" Sep 30 07:33:34 crc kubenswrapper[4760]: I0930 07:33:34.795910 4760 flags.go:64] FLAG: --pod-max-pids="-1" Sep 30 07:33:34 crc kubenswrapper[4760]: I0930 07:33:34.795915 4760 flags.go:64] FLAG: --pods-per-core="0" Sep 30 07:33:34 crc kubenswrapper[4760]: I0930 07:33:34.795919 4760 flags.go:64] FLAG: --port="10250" Sep 30 07:33:34 crc kubenswrapper[4760]: I0930 07:33:34.795924 4760 flags.go:64] FLAG: --protect-kernel-defaults="false" Sep 30 07:33:34 crc kubenswrapper[4760]: I0930 07:33:34.795928 4760 flags.go:64] FLAG: 
--provider-id="" Sep 30 07:33:34 crc kubenswrapper[4760]: I0930 07:33:34.795932 4760 flags.go:64] FLAG: --qos-reserved="" Sep 30 07:33:34 crc kubenswrapper[4760]: I0930 07:33:34.795936 4760 flags.go:64] FLAG: --read-only-port="10255" Sep 30 07:33:34 crc kubenswrapper[4760]: I0930 07:33:34.795943 4760 flags.go:64] FLAG: --register-node="true" Sep 30 07:33:34 crc kubenswrapper[4760]: I0930 07:33:34.795947 4760 flags.go:64] FLAG: --register-schedulable="true" Sep 30 07:33:34 crc kubenswrapper[4760]: I0930 07:33:34.795951 4760 flags.go:64] FLAG: --register-with-taints="node-role.kubernetes.io/master=:NoSchedule" Sep 30 07:33:34 crc kubenswrapper[4760]: I0930 07:33:34.796136 4760 flags.go:64] FLAG: --registry-burst="10" Sep 30 07:33:34 crc kubenswrapper[4760]: I0930 07:33:34.796141 4760 flags.go:64] FLAG: --registry-qps="5" Sep 30 07:33:34 crc kubenswrapper[4760]: I0930 07:33:34.796145 4760 flags.go:64] FLAG: --reserved-cpus="" Sep 30 07:33:34 crc kubenswrapper[4760]: I0930 07:33:34.796148 4760 flags.go:64] FLAG: --reserved-memory="" Sep 30 07:33:34 crc kubenswrapper[4760]: I0930 07:33:34.796154 4760 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Sep 30 07:33:34 crc kubenswrapper[4760]: I0930 07:33:34.796158 4760 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Sep 30 07:33:34 crc kubenswrapper[4760]: I0930 07:33:34.796163 4760 flags.go:64] FLAG: --rotate-certificates="false" Sep 30 07:33:34 crc kubenswrapper[4760]: I0930 07:33:34.796167 4760 flags.go:64] FLAG: --rotate-server-certificates="false" Sep 30 07:33:34 crc kubenswrapper[4760]: I0930 07:33:34.796181 4760 flags.go:64] FLAG: --runonce="false" Sep 30 07:33:34 crc kubenswrapper[4760]: I0930 07:33:34.796185 4760 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Sep 30 07:33:34 crc kubenswrapper[4760]: I0930 07:33:34.796190 4760 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Sep 30 07:33:34 crc kubenswrapper[4760]: I0930 07:33:34.796194 4760 flags.go:64] FLAG: --seccomp-default="false" Sep 
30 07:33:34 crc kubenswrapper[4760]: I0930 07:33:34.796198 4760 flags.go:64] FLAG: --serialize-image-pulls="true" Sep 30 07:33:34 crc kubenswrapper[4760]: I0930 07:33:34.796204 4760 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Sep 30 07:33:34 crc kubenswrapper[4760]: I0930 07:33:34.796210 4760 flags.go:64] FLAG: --storage-driver-db="cadvisor" Sep 30 07:33:34 crc kubenswrapper[4760]: I0930 07:33:34.796216 4760 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Sep 30 07:33:34 crc kubenswrapper[4760]: I0930 07:33:34.796222 4760 flags.go:64] FLAG: --storage-driver-password="root" Sep 30 07:33:34 crc kubenswrapper[4760]: I0930 07:33:34.796227 4760 flags.go:64] FLAG: --storage-driver-secure="false" Sep 30 07:33:34 crc kubenswrapper[4760]: I0930 07:33:34.796232 4760 flags.go:64] FLAG: --storage-driver-table="stats" Sep 30 07:33:34 crc kubenswrapper[4760]: I0930 07:33:34.796236 4760 flags.go:64] FLAG: --storage-driver-user="root" Sep 30 07:33:34 crc kubenswrapper[4760]: I0930 07:33:34.796240 4760 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Sep 30 07:33:34 crc kubenswrapper[4760]: I0930 07:33:34.796245 4760 flags.go:64] FLAG: --sync-frequency="1m0s" Sep 30 07:33:34 crc kubenswrapper[4760]: I0930 07:33:34.796249 4760 flags.go:64] FLAG: --system-cgroups="" Sep 30 07:33:34 crc kubenswrapper[4760]: I0930 07:33:34.796253 4760 flags.go:64] FLAG: --system-reserved="cpu=200m,ephemeral-storage=350Mi,memory=350Mi" Sep 30 07:33:34 crc kubenswrapper[4760]: I0930 07:33:34.796259 4760 flags.go:64] FLAG: --system-reserved-cgroup="" Sep 30 07:33:34 crc kubenswrapper[4760]: I0930 07:33:34.796263 4760 flags.go:64] FLAG: --tls-cert-file="" Sep 30 07:33:34 crc kubenswrapper[4760]: I0930 07:33:34.796267 4760 flags.go:64] FLAG: --tls-cipher-suites="[]" Sep 30 07:33:34 crc kubenswrapper[4760]: I0930 07:33:34.796275 4760 flags.go:64] FLAG: --tls-min-version="" Sep 30 07:33:34 crc kubenswrapper[4760]: I0930 07:33:34.796280 4760 flags.go:64] FLAG: 
--tls-private-key-file="" Sep 30 07:33:34 crc kubenswrapper[4760]: I0930 07:33:34.796286 4760 flags.go:64] FLAG: --topology-manager-policy="none" Sep 30 07:33:34 crc kubenswrapper[4760]: I0930 07:33:34.796290 4760 flags.go:64] FLAG: --topology-manager-policy-options="" Sep 30 07:33:34 crc kubenswrapper[4760]: I0930 07:33:34.796294 4760 flags.go:64] FLAG: --topology-manager-scope="container" Sep 30 07:33:34 crc kubenswrapper[4760]: I0930 07:33:34.796298 4760 flags.go:64] FLAG: --v="2" Sep 30 07:33:34 crc kubenswrapper[4760]: I0930 07:33:34.796321 4760 flags.go:64] FLAG: --version="false" Sep 30 07:33:34 crc kubenswrapper[4760]: I0930 07:33:34.796326 4760 flags.go:64] FLAG: --vmodule="" Sep 30 07:33:34 crc kubenswrapper[4760]: I0930 07:33:34.796332 4760 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Sep 30 07:33:34 crc kubenswrapper[4760]: I0930 07:33:34.796336 4760 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Sep 30 07:33:34 crc kubenswrapper[4760]: W0930 07:33:34.796486 4760 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Sep 30 07:33:34 crc kubenswrapper[4760]: W0930 07:33:34.796494 4760 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. 
Sep 30 07:33:34 crc kubenswrapper[4760]: W0930 07:33:34.796499 4760 feature_gate.go:330] unrecognized feature gate: Example Sep 30 07:33:34 crc kubenswrapper[4760]: W0930 07:33:34.796503 4760 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Sep 30 07:33:34 crc kubenswrapper[4760]: W0930 07:33:34.796507 4760 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Sep 30 07:33:34 crc kubenswrapper[4760]: W0930 07:33:34.796511 4760 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Sep 30 07:33:34 crc kubenswrapper[4760]: W0930 07:33:34.796515 4760 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Sep 30 07:33:34 crc kubenswrapper[4760]: W0930 07:33:34.796525 4760 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Sep 30 07:33:34 crc kubenswrapper[4760]: W0930 07:33:34.796529 4760 feature_gate.go:330] unrecognized feature gate: PlatformOperators Sep 30 07:33:34 crc kubenswrapper[4760]: W0930 07:33:34.796535 4760 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Sep 30 07:33:34 crc kubenswrapper[4760]: W0930 07:33:34.796538 4760 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Sep 30 07:33:34 crc kubenswrapper[4760]: W0930 07:33:34.796542 4760 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Sep 30 07:33:34 crc kubenswrapper[4760]: W0930 07:33:34.796545 4760 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Sep 30 07:33:34 crc kubenswrapper[4760]: W0930 07:33:34.796549 4760 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Sep 30 07:33:34 crc kubenswrapper[4760]: W0930 07:33:34.796552 4760 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Sep 30 07:33:34 crc kubenswrapper[4760]: W0930 07:33:34.796556 4760 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Sep 30 07:33:34 crc kubenswrapper[4760]: W0930 
07:33:34.796559 4760 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Sep 30 07:33:34 crc kubenswrapper[4760]: W0930 07:33:34.796563 4760 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Sep 30 07:33:34 crc kubenswrapper[4760]: W0930 07:33:34.796566 4760 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Sep 30 07:33:34 crc kubenswrapper[4760]: W0930 07:33:34.796571 4760 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Sep 30 07:33:34 crc kubenswrapper[4760]: W0930 07:33:34.796575 4760 feature_gate.go:330] unrecognized feature gate: InsightsConfig Sep 30 07:33:34 crc kubenswrapper[4760]: W0930 07:33:34.796578 4760 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Sep 30 07:33:34 crc kubenswrapper[4760]: W0930 07:33:34.796582 4760 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Sep 30 07:33:34 crc kubenswrapper[4760]: W0930 07:33:34.796585 4760 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Sep 30 07:33:34 crc kubenswrapper[4760]: W0930 07:33:34.796590 4760 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Sep 30 07:33:34 crc kubenswrapper[4760]: W0930 07:33:34.796594 4760 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Sep 30 07:33:34 crc kubenswrapper[4760]: W0930 07:33:34.796598 4760 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Sep 30 07:33:34 crc kubenswrapper[4760]: W0930 07:33:34.796602 4760 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Sep 30 07:33:34 crc kubenswrapper[4760]: W0930 07:33:34.796605 4760 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Sep 30 07:33:34 crc kubenswrapper[4760]: W0930 07:33:34.796608 4760 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Sep 30 07:33:34 crc kubenswrapper[4760]: W0930 
07:33:34.796613 4760 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. Sep 30 07:33:34 crc kubenswrapper[4760]: W0930 07:33:34.796624 4760 feature_gate.go:330] unrecognized feature gate: GatewayAPI Sep 30 07:33:34 crc kubenswrapper[4760]: W0930 07:33:34.796628 4760 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Sep 30 07:33:34 crc kubenswrapper[4760]: W0930 07:33:34.796632 4760 feature_gate.go:330] unrecognized feature gate: PinnedImages Sep 30 07:33:34 crc kubenswrapper[4760]: W0930 07:33:34.796638 4760 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Sep 30 07:33:34 crc kubenswrapper[4760]: W0930 07:33:34.796642 4760 feature_gate.go:330] unrecognized feature gate: OVNObservability Sep 30 07:33:34 crc kubenswrapper[4760]: W0930 07:33:34.796647 4760 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Sep 30 07:33:34 crc kubenswrapper[4760]: W0930 07:33:34.796652 4760 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Sep 30 07:33:34 crc kubenswrapper[4760]: W0930 07:33:34.796656 4760 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Sep 30 07:33:34 crc kubenswrapper[4760]: W0930 07:33:34.796660 4760 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Sep 30 07:33:34 crc kubenswrapper[4760]: W0930 07:33:34.796664 4760 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Sep 30 07:33:34 crc kubenswrapper[4760]: W0930 07:33:34.796672 4760 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. 
Sep 30 07:33:34 crc kubenswrapper[4760]: W0930 07:33:34.796677 4760 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Sep 30 07:33:34 crc kubenswrapper[4760]: W0930 07:33:34.796689 4760 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Sep 30 07:33:34 crc kubenswrapper[4760]: W0930 07:33:34.796694 4760 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Sep 30 07:33:34 crc kubenswrapper[4760]: W0930 07:33:34.796699 4760 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Sep 30 07:33:34 crc kubenswrapper[4760]: W0930 07:33:34.796705 4760 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Sep 30 07:33:34 crc kubenswrapper[4760]: W0930 07:33:34.796710 4760 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Sep 30 07:33:34 crc kubenswrapper[4760]: W0930 07:33:34.796715 4760 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Sep 30 07:33:34 crc kubenswrapper[4760]: W0930 07:33:34.796719 4760 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Sep 30 07:33:34 crc kubenswrapper[4760]: W0930 07:33:34.796724 4760 feature_gate.go:330] unrecognized feature gate: NewOLM Sep 30 07:33:34 crc kubenswrapper[4760]: W0930 07:33:34.796729 4760 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Sep 30 07:33:34 crc kubenswrapper[4760]: W0930 07:33:34.796733 4760 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Sep 30 07:33:34 crc kubenswrapper[4760]: W0930 07:33:34.796738 4760 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Sep 30 07:33:34 crc kubenswrapper[4760]: W0930 07:33:34.796742 4760 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Sep 30 07:33:34 crc kubenswrapper[4760]: W0930 07:33:34.796747 4760 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Sep 30 07:33:34 crc kubenswrapper[4760]: W0930 07:33:34.796755 4760 
feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Sep 30 07:33:34 crc kubenswrapper[4760]: W0930 07:33:34.796759 4760 feature_gate.go:330] unrecognized feature gate: SignatureStores Sep 30 07:33:34 crc kubenswrapper[4760]: W0930 07:33:34.796763 4760 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Sep 30 07:33:34 crc kubenswrapper[4760]: W0930 07:33:34.796767 4760 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Sep 30 07:33:34 crc kubenswrapper[4760]: W0930 07:33:34.796771 4760 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Sep 30 07:33:34 crc kubenswrapper[4760]: W0930 07:33:34.796776 4760 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Sep 30 07:33:34 crc kubenswrapper[4760]: W0930 07:33:34.796780 4760 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Sep 30 07:33:34 crc kubenswrapper[4760]: W0930 07:33:34.796784 4760 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Sep 30 07:33:34 crc kubenswrapper[4760]: W0930 07:33:34.796788 4760 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Sep 30 07:33:34 crc kubenswrapper[4760]: W0930 07:33:34.796793 4760 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Sep 30 07:33:34 crc kubenswrapper[4760]: W0930 07:33:34.796798 4760 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Sep 30 07:33:34 crc kubenswrapper[4760]: W0930 07:33:34.796802 4760 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Sep 30 07:33:34 crc kubenswrapper[4760]: W0930 07:33:34.796806 4760 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Sep 30 07:33:34 crc kubenswrapper[4760]: W0930 07:33:34.796811 4760 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Sep 30 07:33:34 crc kubenswrapper[4760]: W0930 07:33:34.796815 4760 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Sep 30 07:33:34 crc kubenswrapper[4760]: I0930 
07:33:34.796822 4760 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Sep 30 07:33:34 crc kubenswrapper[4760]: I0930 07:33:34.814059 4760 server.go:491] "Kubelet version" kubeletVersion="v1.31.5" Sep 30 07:33:34 crc kubenswrapper[4760]: I0930 07:33:34.814134 4760 server.go:493] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Sep 30 07:33:34 crc kubenswrapper[4760]: W0930 07:33:34.814297 4760 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Sep 30 07:33:34 crc kubenswrapper[4760]: W0930 07:33:34.814371 4760 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Sep 30 07:33:34 crc kubenswrapper[4760]: W0930 07:33:34.814383 4760 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Sep 30 07:33:34 crc kubenswrapper[4760]: W0930 07:33:34.814395 4760 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Sep 30 07:33:34 crc kubenswrapper[4760]: W0930 07:33:34.814405 4760 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Sep 30 07:33:34 crc kubenswrapper[4760]: W0930 07:33:34.814414 4760 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Sep 30 07:33:34 crc kubenswrapper[4760]: W0930 07:33:34.814423 4760 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Sep 30 07:33:34 crc kubenswrapper[4760]: W0930 07:33:34.814431 4760 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Sep 30 07:33:34 crc kubenswrapper[4760]: W0930 07:33:34.814440 4760 feature_gate.go:330] unrecognized feature gate: OVNObservability Sep 30 07:33:34 crc kubenswrapper[4760]: W0930 07:33:34.814449 4760 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Sep 30 07:33:34 crc kubenswrapper[4760]: W0930 07:33:34.814457 4760 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Sep 30 07:33:34 crc kubenswrapper[4760]: W0930 07:33:34.814465 4760 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Sep 30 07:33:34 crc kubenswrapper[4760]: W0930 07:33:34.814472 4760 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Sep 30 07:33:34 crc kubenswrapper[4760]: W0930 07:33:34.814480 4760 feature_gate.go:330] unrecognized feature gate: SignatureStores Sep 30 07:33:34 crc kubenswrapper[4760]: W0930 07:33:34.814488 4760 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Sep 30 07:33:34 crc kubenswrapper[4760]: W0930 07:33:34.814499 4760 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. 
Sep 30 07:33:34 crc kubenswrapper[4760]: W0930 07:33:34.814510 4760 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Sep 30 07:33:34 crc kubenswrapper[4760]: W0930 07:33:34.814519 4760 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Sep 30 07:33:34 crc kubenswrapper[4760]: W0930 07:33:34.814529 4760 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Sep 30 07:33:34 crc kubenswrapper[4760]: W0930 07:33:34.814541 4760 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Sep 30 07:33:34 crc kubenswrapper[4760]: W0930 07:33:34.814550 4760 feature_gate.go:330] unrecognized feature gate: Example Sep 30 07:33:34 crc kubenswrapper[4760]: W0930 07:33:34.814558 4760 feature_gate.go:330] unrecognized feature gate: InsightsConfig Sep 30 07:33:34 crc kubenswrapper[4760]: W0930 07:33:34.814566 4760 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Sep 30 07:33:34 crc kubenswrapper[4760]: W0930 07:33:34.814574 4760 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Sep 30 07:33:34 crc kubenswrapper[4760]: W0930 07:33:34.814582 4760 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Sep 30 07:33:34 crc kubenswrapper[4760]: W0930 07:33:34.814590 4760 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Sep 30 07:33:34 crc kubenswrapper[4760]: W0930 07:33:34.814597 4760 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Sep 30 07:33:34 crc kubenswrapper[4760]: W0930 07:33:34.814605 4760 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Sep 30 07:33:34 crc kubenswrapper[4760]: W0930 07:33:34.814612 4760 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Sep 30 07:33:34 crc kubenswrapper[4760]: W0930 07:33:34.814620 4760 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Sep 30 07:33:34 crc kubenswrapper[4760]: W0930 07:33:34.814628 4760 
feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Sep 30 07:33:34 crc kubenswrapper[4760]: W0930 07:33:34.814636 4760 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Sep 30 07:33:34 crc kubenswrapper[4760]: W0930 07:33:34.814643 4760 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Sep 30 07:33:34 crc kubenswrapper[4760]: W0930 07:33:34.814650 4760 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Sep 30 07:33:34 crc kubenswrapper[4760]: W0930 07:33:34.814659 4760 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Sep 30 07:33:34 crc kubenswrapper[4760]: W0930 07:33:34.814667 4760 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Sep 30 07:33:34 crc kubenswrapper[4760]: W0930 07:33:34.814676 4760 feature_gate.go:330] unrecognized feature gate: NewOLM Sep 30 07:33:34 crc kubenswrapper[4760]: W0930 07:33:34.814683 4760 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Sep 30 07:33:34 crc kubenswrapper[4760]: W0930 07:33:34.814691 4760 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Sep 30 07:33:34 crc kubenswrapper[4760]: W0930 07:33:34.814699 4760 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Sep 30 07:33:34 crc kubenswrapper[4760]: W0930 07:33:34.814706 4760 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Sep 30 07:33:34 crc kubenswrapper[4760]: W0930 07:33:34.814714 4760 feature_gate.go:330] unrecognized feature gate: PlatformOperators Sep 30 07:33:34 crc kubenswrapper[4760]: W0930 07:33:34.814722 4760 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Sep 30 07:33:34 crc kubenswrapper[4760]: W0930 07:33:34.814729 4760 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Sep 30 07:33:34 crc kubenswrapper[4760]: W0930 07:33:34.814737 4760 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Sep 30 07:33:34 crc 
kubenswrapper[4760]: W0930 07:33:34.814745 4760 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Sep 30 07:33:34 crc kubenswrapper[4760]: W0930 07:33:34.814753 4760 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Sep 30 07:33:34 crc kubenswrapper[4760]: W0930 07:33:34.814762 4760 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Sep 30 07:33:34 crc kubenswrapper[4760]: W0930 07:33:34.814773 4760 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. Sep 30 07:33:34 crc kubenswrapper[4760]: W0930 07:33:34.814785 4760 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Sep 30 07:33:34 crc kubenswrapper[4760]: W0930 07:33:34.814794 4760 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Sep 30 07:33:34 crc kubenswrapper[4760]: W0930 07:33:34.814806 4760 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Sep 30 07:33:34 crc kubenswrapper[4760]: W0930 07:33:34.814815 4760 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Sep 30 07:33:34 crc kubenswrapper[4760]: W0930 07:33:34.814823 4760 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Sep 30 07:33:34 crc kubenswrapper[4760]: W0930 07:33:34.814832 4760 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Sep 30 07:33:34 crc kubenswrapper[4760]: W0930 07:33:34.814840 4760 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Sep 30 07:33:34 crc kubenswrapper[4760]: W0930 07:33:34.814848 4760 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Sep 30 07:33:34 crc kubenswrapper[4760]: W0930 07:33:34.814855 4760 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Sep 30 07:33:34 crc kubenswrapper[4760]: W0930 07:33:34.814863 4760 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Sep 30 07:33:34 crc 
kubenswrapper[4760]: W0930 07:33:34.814870 4760 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Sep 30 07:33:34 crc kubenswrapper[4760]: W0930 07:33:34.814881 4760 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. Sep 30 07:33:34 crc kubenswrapper[4760]: W0930 07:33:34.814890 4760 feature_gate.go:330] unrecognized feature gate: PinnedImages Sep 30 07:33:34 crc kubenswrapper[4760]: W0930 07:33:34.814900 4760 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Sep 30 07:33:34 crc kubenswrapper[4760]: W0930 07:33:34.814909 4760 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Sep 30 07:33:34 crc kubenswrapper[4760]: W0930 07:33:34.814918 4760 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Sep 30 07:33:34 crc kubenswrapper[4760]: W0930 07:33:34.814926 4760 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Sep 30 07:33:34 crc kubenswrapper[4760]: W0930 07:33:34.814933 4760 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Sep 30 07:33:34 crc kubenswrapper[4760]: W0930 07:33:34.814941 4760 feature_gate.go:330] unrecognized feature gate: GatewayAPI Sep 30 07:33:34 crc kubenswrapper[4760]: W0930 07:33:34.814949 4760 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Sep 30 07:33:34 crc kubenswrapper[4760]: W0930 07:33:34.814956 4760 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Sep 30 07:33:34 crc kubenswrapper[4760]: W0930 07:33:34.814964 4760 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Sep 30 07:33:34 crc kubenswrapper[4760]: I0930 07:33:34.814978 4760 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false 
NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Sep 30 07:33:34 crc kubenswrapper[4760]: W0930 07:33:34.815219 4760 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Sep 30 07:33:34 crc kubenswrapper[4760]: W0930 07:33:34.815233 4760 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Sep 30 07:33:34 crc kubenswrapper[4760]: W0930 07:33:34.815242 4760 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Sep 30 07:33:34 crc kubenswrapper[4760]: W0930 07:33:34.815250 4760 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Sep 30 07:33:34 crc kubenswrapper[4760]: W0930 07:33:34.815259 4760 feature_gate.go:330] unrecognized feature gate: Example Sep 30 07:33:34 crc kubenswrapper[4760]: W0930 07:33:34.815267 4760 feature_gate.go:330] unrecognized feature gate: OVNObservability Sep 30 07:33:34 crc kubenswrapper[4760]: W0930 07:33:34.815275 4760 feature_gate.go:330] unrecognized feature gate: SignatureStores Sep 30 07:33:34 crc kubenswrapper[4760]: W0930 07:33:34.815283 4760 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Sep 30 07:33:34 crc kubenswrapper[4760]: W0930 07:33:34.815291 4760 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Sep 30 07:33:34 crc kubenswrapper[4760]: W0930 07:33:34.815322 4760 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Sep 30 07:33:34 crc kubenswrapper[4760]: W0930 07:33:34.815331 4760 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Sep 30 07:33:34 crc kubenswrapper[4760]: W0930 07:33:34.815339 4760 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Sep 30 07:33:34 crc kubenswrapper[4760]: W0930 07:33:34.815348 4760 
feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Sep 30 07:33:34 crc kubenswrapper[4760]: W0930 07:33:34.815356 4760 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Sep 30 07:33:34 crc kubenswrapper[4760]: W0930 07:33:34.815364 4760 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Sep 30 07:33:34 crc kubenswrapper[4760]: W0930 07:33:34.815372 4760 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Sep 30 07:33:34 crc kubenswrapper[4760]: W0930 07:33:34.815380 4760 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Sep 30 07:33:34 crc kubenswrapper[4760]: W0930 07:33:34.815387 4760 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Sep 30 07:33:34 crc kubenswrapper[4760]: W0930 07:33:34.815395 4760 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Sep 30 07:33:34 crc kubenswrapper[4760]: W0930 07:33:34.815403 4760 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Sep 30 07:33:34 crc kubenswrapper[4760]: W0930 07:33:34.815411 4760 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Sep 30 07:33:34 crc kubenswrapper[4760]: W0930 07:33:34.815420 4760 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Sep 30 07:33:34 crc kubenswrapper[4760]: W0930 07:33:34.815428 4760 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Sep 30 07:33:34 crc kubenswrapper[4760]: W0930 07:33:34.815435 4760 feature_gate.go:330] unrecognized feature gate: PlatformOperators Sep 30 07:33:34 crc kubenswrapper[4760]: W0930 07:33:34.815443 4760 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Sep 30 07:33:34 crc kubenswrapper[4760]: W0930 07:33:34.815451 4760 feature_gate.go:330] unrecognized feature gate: GatewayAPI Sep 30 07:33:34 crc kubenswrapper[4760]: W0930 07:33:34.815458 4760 feature_gate.go:330] unrecognized feature gate: 
AdminNetworkPolicy Sep 30 07:33:34 crc kubenswrapper[4760]: W0930 07:33:34.815466 4760 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Sep 30 07:33:34 crc kubenswrapper[4760]: W0930 07:33:34.815473 4760 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Sep 30 07:33:34 crc kubenswrapper[4760]: W0930 07:33:34.815481 4760 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Sep 30 07:33:34 crc kubenswrapper[4760]: W0930 07:33:34.815489 4760 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Sep 30 07:33:34 crc kubenswrapper[4760]: W0930 07:33:34.815496 4760 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Sep 30 07:33:34 crc kubenswrapper[4760]: W0930 07:33:34.815504 4760 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Sep 30 07:33:34 crc kubenswrapper[4760]: W0930 07:33:34.815514 4760 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Sep 30 07:33:34 crc kubenswrapper[4760]: W0930 07:33:34.815521 4760 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Sep 30 07:33:34 crc kubenswrapper[4760]: W0930 07:33:34.815529 4760 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Sep 30 07:33:34 crc kubenswrapper[4760]: W0930 07:33:34.815538 4760 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Sep 30 07:33:34 crc kubenswrapper[4760]: W0930 07:33:34.815545 4760 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Sep 30 07:33:34 crc kubenswrapper[4760]: W0930 07:33:34.815553 4760 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Sep 30 07:33:34 crc kubenswrapper[4760]: W0930 07:33:34.815563 4760 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. 
Sep 30 07:33:34 crc kubenswrapper[4760]: W0930 07:33:34.815573 4760 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Sep 30 07:33:34 crc kubenswrapper[4760]: W0930 07:33:34.815582 4760 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Sep 30 07:33:34 crc kubenswrapper[4760]: W0930 07:33:34.815591 4760 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Sep 30 07:33:34 crc kubenswrapper[4760]: W0930 07:33:34.815601 4760 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Sep 30 07:33:34 crc kubenswrapper[4760]: W0930 07:33:34.815610 4760 feature_gate.go:330] unrecognized feature gate: NewOLM Sep 30 07:33:34 crc kubenswrapper[4760]: W0930 07:33:34.815619 4760 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Sep 30 07:33:34 crc kubenswrapper[4760]: W0930 07:33:34.815627 4760 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Sep 30 07:33:34 crc kubenswrapper[4760]: W0930 07:33:34.815636 4760 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Sep 30 07:33:34 crc kubenswrapper[4760]: W0930 07:33:34.815643 4760 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Sep 30 07:33:34 crc kubenswrapper[4760]: W0930 07:33:34.815651 4760 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Sep 30 07:33:34 crc kubenswrapper[4760]: W0930 07:33:34.815659 4760 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Sep 30 07:33:34 crc kubenswrapper[4760]: W0930 07:33:34.815667 4760 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Sep 30 07:33:34 crc kubenswrapper[4760]: W0930 07:33:34.815674 4760 feature_gate.go:330] unrecognized feature gate: PinnedImages Sep 30 07:33:34 crc kubenswrapper[4760]: W0930 07:33:34.815682 4760 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Sep 30 07:33:34 crc kubenswrapper[4760]: W0930 07:33:34.815690 4760 feature_gate.go:330] unrecognized feature gate: 
GCPLabelsTags Sep 30 07:33:34 crc kubenswrapper[4760]: W0930 07:33:34.815700 4760 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Sep 30 07:33:34 crc kubenswrapper[4760]: W0930 07:33:34.815710 4760 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Sep 30 07:33:34 crc kubenswrapper[4760]: W0930 07:33:34.815720 4760 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. Sep 30 07:33:34 crc kubenswrapper[4760]: W0930 07:33:34.815731 4760 feature_gate.go:330] unrecognized feature gate: InsightsConfig Sep 30 07:33:34 crc kubenswrapper[4760]: W0930 07:33:34.815741 4760 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Sep 30 07:33:34 crc kubenswrapper[4760]: W0930 07:33:34.815750 4760 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Sep 30 07:33:34 crc kubenswrapper[4760]: W0930 07:33:34.815759 4760 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Sep 30 07:33:34 crc kubenswrapper[4760]: W0930 07:33:34.815768 4760 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Sep 30 07:33:34 crc kubenswrapper[4760]: W0930 07:33:34.815776 4760 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Sep 30 07:33:34 crc kubenswrapper[4760]: W0930 07:33:34.815784 4760 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Sep 30 07:33:34 crc kubenswrapper[4760]: W0930 07:33:34.815792 4760 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Sep 30 07:33:34 crc kubenswrapper[4760]: W0930 07:33:34.815800 4760 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Sep 30 07:33:34 crc kubenswrapper[4760]: W0930 07:33:34.815810 4760 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Sep 30 07:33:34 crc kubenswrapper[4760]: W0930 07:33:34.815820 4760 feature_gate.go:330] 
unrecognized feature gate: MachineAPIProviderOpenStack Sep 30 07:33:34 crc kubenswrapper[4760]: W0930 07:33:34.815829 4760 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Sep 30 07:33:34 crc kubenswrapper[4760]: W0930 07:33:34.815841 4760 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. Sep 30 07:33:34 crc kubenswrapper[4760]: I0930 07:33:34.815856 4760 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Sep 30 07:33:34 crc kubenswrapper[4760]: I0930 07:33:34.816183 4760 server.go:940] "Client rotation is on, will bootstrap in background" Sep 30 07:33:34 crc kubenswrapper[4760]: I0930 07:33:34.822592 4760 bootstrap.go:85] "Current kubeconfig file contents are still valid, no bootstrap necessary" Sep 30 07:33:34 crc kubenswrapper[4760]: I0930 07:33:34.822749 4760 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-client-current.pem". 
Sep 30 07:33:34 crc kubenswrapper[4760]: I0930 07:33:34.824530 4760 server.go:997] "Starting client certificate rotation" Sep 30 07:33:34 crc kubenswrapper[4760]: I0930 07:33:34.824571 4760 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate rotation is enabled Sep 30 07:33:34 crc kubenswrapper[4760]: I0930 07:33:34.824808 4760 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate expiration is 2026-02-24 05:52:08 +0000 UTC, rotation deadline is 2025-11-25 13:34:12.2169759 +0000 UTC Sep 30 07:33:34 crc kubenswrapper[4760]: I0930 07:33:34.824954 4760 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Waiting 1350h0m37.392027564s for next certificate rotation Sep 30 07:33:34 crc kubenswrapper[4760]: I0930 07:33:34.856031 4760 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Sep 30 07:33:34 crc kubenswrapper[4760]: I0930 07:33:34.859213 4760 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Sep 30 07:33:34 crc kubenswrapper[4760]: I0930 07:33:34.879052 4760 log.go:25] "Validated CRI v1 runtime API" Sep 30 07:33:34 crc kubenswrapper[4760]: I0930 07:33:34.918535 4760 log.go:25] "Validated CRI v1 image API" Sep 30 07:33:34 crc kubenswrapper[4760]: I0930 07:33:34.921176 4760 server.go:1437] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Sep 30 07:33:34 crc kubenswrapper[4760]: I0930 07:33:34.926944 4760 fs.go:133] Filesystem UUIDs: map[0b076daa-c26a-46d2-b3a6-72a8dbc6e257:/dev/vda4 2025-09-30-07-18-49-00:/dev/sr0 7B77-95E7:/dev/vda2 de0497b0-db1b-465a-b278-03db02455c71:/dev/vda3] Sep 30 07:33:34 crc kubenswrapper[4760]: I0930 07:33:34.927018 4760 fs.go:134] Filesystem partitions: map[/dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /dev/vda3:{mountpoint:/boot major:252 minor:3 fsType:ext4 
blockSize:0} /dev/vda4:{mountpoint:/var major:252 minor:4 fsType:xfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /run/user/1000:{mountpoint:/run/user/1000 major:0 minor:42 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:30 fsType:tmpfs blockSize:0} /var/lib/etcd:{mountpoint:/var/lib/etcd major:0 minor:43 fsType:tmpfs blockSize:0}] Sep 30 07:33:34 crc kubenswrapper[4760]: I0930 07:33:34.963413 4760 manager.go:217] Machine: {Timestamp:2025-09-30 07:33:34.959718816 +0000 UTC m=+0.602625298 CPUVendorID:AuthenticAMD NumCores:12 NumPhysicalCores:1 NumSockets:12 CpuFrequency:2800000 MemoryCapacity:33654132736 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:21801e6708c44f15b81395eb736a7cec SystemUUID:ef0efcac-382e-4544-a77e-6adf149d5981 BootID:5c02496b-3bdb-4a64-91fc-57c59208ba25 Filesystems:[{Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16827064320 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6730829824 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/vda4 DeviceMajor:252 DeviceMinor:4 Capacity:85292941312 Type:vfs Inodes:41679680 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:30 Capacity:16827068416 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/vda3 DeviceMajor:252 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/run/user/1000 DeviceMajor:0 DeviceMinor:42 Capacity:3365412864 Type:vfs Inodes:821634 HasInodes:true} {Device:/var/lib/etcd DeviceMajor:0 DeviceMinor:43 Capacity:1073741824 Type:vfs Inodes:4108170 HasInodes:true}] DiskMap:map[252:0:{Name:vda Major:252 Minor:0 Size:214748364800 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:fa:16:3e:41:2a:7b Speed:0 Mtu:1500} {Name:br-int MacAddress:d6:39:55:2e:22:71 Speed:0 Mtu:1400} {Name:ens3 MacAddress:fa:16:3e:41:2a:7b Speed:-1 
Mtu:1500} {Name:ens7 MacAddress:fa:16:3e:f5:49:46 Speed:-1 Mtu:1500} {Name:ens7.20 MacAddress:52:54:00:16:b7:22 Speed:-1 Mtu:1496} {Name:ens7.21 MacAddress:52:54:00:49:7b:fa Speed:-1 Mtu:1496} {Name:ens7.22 MacAddress:52:54:00:dd:99:21 Speed:-1 Mtu:1496} {Name:eth10 MacAddress:0a:eb:f2:04:9d:eb Speed:0 Mtu:1500} {Name:ovn-k8s-mp0 MacAddress:0a:58:0a:d9:00:02 Speed:0 Mtu:1400} {Name:ovs-system MacAddress:ea:2b:2f:1c:50:e8 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33654132736 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:0 Size:16777216 Type:Unified Level:3}] SocketID:0 BookID: DrawerID:} {Id:0 Threads:[1] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:1 Size:16777216 Type:Unified Level:3}] SocketID:1 BookID: DrawerID:} {Id:0 Threads:[10] Caches:[{Id:10 Size:32768 Type:Data Level:1} {Id:10 Size:32768 Type:Instruction Level:1} {Id:10 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:10 Size:16777216 Type:Unified Level:3}] SocketID:10 BookID: DrawerID:} {Id:0 Threads:[11] Caches:[{Id:11 Size:32768 Type:Data Level:1} {Id:11 Size:32768 Type:Instruction Level:1} {Id:11 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:11 Size:16777216 Type:Unified Level:3}] SocketID:11 BookID: DrawerID:} {Id:0 Threads:[2] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:2 Size:16777216 Type:Unified Level:3}] SocketID:2 BookID: DrawerID:} {Id:0 Threads:[3] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:3 Size:16777216 Type:Unified Level:3}] SocketID:3 BookID: DrawerID:} {Id:0 Threads:[4] Caches:[{Id:4 
Size:32768 Type:Data Level:1} {Id:4 Size:32768 Type:Instruction Level:1} {Id:4 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:4 Size:16777216 Type:Unified Level:3}] SocketID:4 BookID: DrawerID:} {Id:0 Threads:[5] Caches:[{Id:5 Size:32768 Type:Data Level:1} {Id:5 Size:32768 Type:Instruction Level:1} {Id:5 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:5 Size:16777216 Type:Unified Level:3}] SocketID:5 BookID: DrawerID:} {Id:0 Threads:[6] Caches:[{Id:6 Size:32768 Type:Data Level:1} {Id:6 Size:32768 Type:Instruction Level:1} {Id:6 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:6 Size:16777216 Type:Unified Level:3}] SocketID:6 BookID: DrawerID:} {Id:0 Threads:[7] Caches:[{Id:7 Size:32768 Type:Data Level:1} {Id:7 Size:32768 Type:Instruction Level:1} {Id:7 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:7 Size:16777216 Type:Unified Level:3}] SocketID:7 BookID: DrawerID:} {Id:0 Threads:[8] Caches:[{Id:8 Size:32768 Type:Data Level:1} {Id:8 Size:32768 Type:Instruction Level:1} {Id:8 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:8 Size:16777216 Type:Unified Level:3}] SocketID:8 BookID: DrawerID:} {Id:0 Threads:[9] Caches:[{Id:9 Size:32768 Type:Data Level:1} {Id:9 Size:32768 Type:Instruction Level:1} {Id:9 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:9 Size:16777216 Type:Unified Level:3}] SocketID:9 BookID: DrawerID:}] Caches:[] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None} Sep 30 07:33:34 crc kubenswrapper[4760]: I0930 07:33:34.963864 4760 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available. 
Sep 30 07:33:34 crc kubenswrapper[4760]: I0930 07:33:34.964124 4760 manager.go:233] Version: {KernelVersion:5.14.0-427.50.2.el9_4.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 418.94.202502100215-0 DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Sep 30 07:33:34 crc kubenswrapper[4760]: I0930 07:33:34.966481 4760 swap_util.go:113] "Swap is on" /proc/swaps contents="Filename\t\t\t\tType\t\tSize\t\tUsed\t\tPriority"
Sep 30 07:33:34 crc kubenswrapper[4760]: I0930 07:33:34.966763 4760 container_manager_linux.go:267] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Sep 30 07:33:34 crc kubenswrapper[4760]: I0930 07:33:34.966816 4760 container_manager_linux.go:272] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"crc","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"200m","ephemeral-storage":"350Mi","memory":"350Mi"},"HardEvictionThresholds":[{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Sep 30 07:33:34 crc kubenswrapper[4760]: I0930 07:33:34.967128 4760 topology_manager.go:138] "Creating topology manager with none policy"
Sep 30 07:33:34 crc kubenswrapper[4760]: I0930 07:33:34.967143 4760 container_manager_linux.go:303] "Creating device plugin manager"
Sep 30 07:33:34 crc kubenswrapper[4760]: I0930 07:33:34.967675 4760 manager.go:142] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock"
Sep 30 07:33:34 crc kubenswrapper[4760]: I0930 07:33:34.967723 4760 server.go:66] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock"
Sep 30 07:33:34 crc kubenswrapper[4760]: I0930 07:33:34.968129 4760 state_mem.go:36] "Initialized new in-memory state store"
Sep 30 07:33:34 crc kubenswrapper[4760]: I0930 07:33:34.968257 4760 server.go:1245] "Using root directory" path="/var/lib/kubelet"
Sep 30 07:33:34 crc kubenswrapper[4760]: I0930 07:33:34.972143 4760 kubelet.go:418] "Attempting to sync node with API server"
Sep 30 07:33:34 crc kubenswrapper[4760]: I0930 07:33:34.972176 4760 kubelet.go:313] "Adding static pod path" path="/etc/kubernetes/manifests"
Sep 30 07:33:34 crc kubenswrapper[4760]: I0930 07:33:34.972220 4760 file.go:69] "Watching path" path="/etc/kubernetes/manifests"
Sep 30 07:33:34 crc kubenswrapper[4760]: I0930 07:33:34.972240 4760 kubelet.go:324] "Adding apiserver pod source"
Sep 30 07:33:34 crc kubenswrapper[4760]: I0930 07:33:34.972256 4760 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Sep 30 07:33:34 crc kubenswrapper[4760]: I0930 07:33:34.977209 4760 kuberuntime_manager.go:262] "Container runtime initialized" containerRuntime="cri-o" version="1.31.5-4.rhaos4.18.gitdad78d5.el9" apiVersion="v1"
Sep 30 07:33:34 crc kubenswrapper[4760]: I0930 07:33:34.978645 4760 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-server-current.pem".
Sep 30 07:33:34 crc kubenswrapper[4760]: W0930 07:33:34.980268 4760 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.201:6443: connect: connection refused
Sep 30 07:33:34 crc kubenswrapper[4760]: E0930 07:33:34.980442 4760 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.201:6443: connect: connection refused" logger="UnhandledError"
Sep 30 07:33:34 crc kubenswrapper[4760]: W0930 07:33:34.980641 4760 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.201:6443: connect: connection refused
Sep 30 07:33:34 crc kubenswrapper[4760]: E0930 07:33:34.980890 4760 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.201:6443: connect: connection refused" logger="UnhandledError"
Sep 30 07:33:34 crc kubenswrapper[4760]: I0930 07:33:34.981632 4760 kubelet.go:854] "Not starting ClusterTrustBundle informer because we are in static kubelet mode"
Sep 30 07:33:34 crc kubenswrapper[4760]: I0930 07:33:34.983513 4760 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume"
Sep 30 07:33:34 crc kubenswrapper[4760]: I0930 07:33:34.983563 4760 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir"
Sep 30 07:33:34 crc kubenswrapper[4760]: I0930 07:33:34.983583 4760 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/git-repo"
Sep 30 07:33:34 crc kubenswrapper[4760]: I0930 07:33:34.983600 4760 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/host-path"
Sep 30 07:33:34 crc kubenswrapper[4760]: I0930 07:33:34.983631 4760 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/nfs"
Sep 30 07:33:34 crc kubenswrapper[4760]: I0930 07:33:34.983651 4760 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/secret"
Sep 30 07:33:34 crc kubenswrapper[4760]: I0930 07:33:34.983671 4760 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/iscsi"
Sep 30 07:33:34 crc kubenswrapper[4760]: I0930 07:33:34.983702 4760 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/downward-api"
Sep 30 07:33:34 crc kubenswrapper[4760]: I0930 07:33:34.983726 4760 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/fc"
Sep 30 07:33:34 crc kubenswrapper[4760]: I0930 07:33:34.983748 4760 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/configmap"
Sep 30 07:33:34 crc kubenswrapper[4760]: I0930 07:33:34.983951 4760 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/projected"
Sep 30 07:33:34 crc kubenswrapper[4760]: I0930 07:33:34.983979 4760 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/local-volume"
Sep 30 07:33:34 crc kubenswrapper[4760]: I0930 07:33:34.985358 4760 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/csi"
Sep 30 07:33:34 crc kubenswrapper[4760]: I0930 07:33:34.986337 4760 server.go:1280] "Started kubelet"
Sep 30 07:33:34 crc kubenswrapper[4760]: I0930 07:33:34.987722 4760 server.go:163] "Starting to listen" address="0.0.0.0" port=10250
Sep 30 07:33:34 crc kubenswrapper[4760]: I0930 07:33:34.987722 4760 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10
Sep 30 07:33:34 crc systemd[1]: Started Kubernetes Kubelet.
Sep 30 07:33:34 crc kubenswrapper[4760]: I0930 07:33:34.988859 4760 server.go:236] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock"
Sep 30 07:33:34 crc kubenswrapper[4760]: I0930 07:33:34.989183 4760 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.201:6443: connect: connection refused
Sep 30 07:33:34 crc kubenswrapper[4760]: I0930 07:33:34.989417 4760 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate rotation is enabled
Sep 30 07:33:34 crc kubenswrapper[4760]: I0930 07:33:34.989452 4760 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer"
Sep 30 07:33:34 crc kubenswrapper[4760]: I0930 07:33:34.990029 4760 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-28 14:21:27.436605589 +0000 UTC
Sep 30 07:33:34 crc kubenswrapper[4760]: I0930 07:33:34.990092 4760 certificate_manager.go:356] kubernetes.io/kubelet-serving: Waiting 2142h47m52.446518074s for next certificate rotation
Sep 30 07:33:34 crc kubenswrapper[4760]: E0930 07:33:34.990054 4760 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Sep 30 07:33:34 crc kubenswrapper[4760]: I0930 07:33:34.990701 4760 volume_manager.go:287] "The desired_state_of_world populator starts"
Sep 30 07:33:34 crc kubenswrapper[4760]: I0930 07:33:34.990879 4760 volume_manager.go:289] "Starting Kubelet Volume Manager"
Sep 30 07:33:34 crc kubenswrapper[4760]: I0930 07:33:34.991382 4760 desired_state_of_world_populator.go:146] "Desired state populator starts to run"
Sep 30 07:33:34 crc kubenswrapper[4760]: E0930 07:33:34.999289 4760 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.201:6443: connect: connection refused" interval="200ms"
Sep 30 07:33:35 crc kubenswrapper[4760]: W0930 07:33:35.001533 4760 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.201:6443: connect: connection refused
Sep 30 07:33:35 crc kubenswrapper[4760]: E0930 07:33:35.001759 4760 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.201:6443: connect: connection refused" logger="UnhandledError"
Sep 30 07:33:35 crc kubenswrapper[4760]: I0930 07:33:35.002898 4760 factory.go:219] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory
Sep 30 07:33:35 crc kubenswrapper[4760]: I0930 07:33:35.002953 4760 factory.go:55] Registering systemd factory
Sep 30 07:33:35 crc kubenswrapper[4760]: I0930 07:33:35.002972 4760 factory.go:221] Registration of the systemd container factory successfully
Sep 30 07:33:35 crc kubenswrapper[4760]: I0930 07:33:35.003803 4760 factory.go:153] Registering CRI-O factory
Sep 30 07:33:35 crc kubenswrapper[4760]: I0930 07:33:35.003836 4760 factory.go:221] Registration of the crio container factory successfully
Sep 30 07:33:35 crc kubenswrapper[4760]: I0930 07:33:35.003888 4760 factory.go:103] Registering Raw factory
Sep 30 07:33:35 crc kubenswrapper[4760]: I0930 07:33:35.003993 4760 server.go:460] "Adding debug handlers to kubelet server"
Sep 30 07:33:35 crc kubenswrapper[4760]: I0930 07:33:35.003923 4760 manager.go:1196] Started watching for new ooms in manager
Sep 30 07:33:35 crc kubenswrapper[4760]: E0930 07:33:35.003438 4760 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": dial tcp 38.102.83.201:6443: connect: connection refused" event="&Event{ObjectMeta:{crc.1869ff199fdd9b4f default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2025-09-30 07:33:34.986255183 +0000 UTC m=+0.629161625,LastTimestamp:2025-09-30 07:33:34.986255183 +0000 UTC m=+0.629161625,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Sep 30 07:33:35 crc kubenswrapper[4760]: I0930 07:33:35.009464 4760 manager.go:319] Starting recovery of all containers
Sep 30 07:33:35 crc kubenswrapper[4760]: I0930 07:33:35.019760 4760 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" seLinuxMountContext=""
Sep 30 07:33:35 crc kubenswrapper[4760]: I0930 07:33:35.019872 4760 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" seLinuxMountContext=""
Sep 30 07:33:35 crc kubenswrapper[4760]: I0930 07:33:35.019897 4760 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" seLinuxMountContext=""
Sep 30 07:33:35 crc kubenswrapper[4760]: I0930 07:33:35.019918 4760 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" seLinuxMountContext=""
Sep 30 07:33:35 crc kubenswrapper[4760]: I0930 07:33:35.019945 4760 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" seLinuxMountContext=""
Sep 30 07:33:35 crc kubenswrapper[4760]: I0930 07:33:35.019972 4760 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" seLinuxMountContext=""
Sep 30 07:33:35 crc kubenswrapper[4760]: I0930 07:33:35.019997 4760 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" seLinuxMountContext=""
Sep 30 07:33:35 crc kubenswrapper[4760]: I0930 07:33:35.020021 4760 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" seLinuxMountContext=""
Sep 30 07:33:35 crc kubenswrapper[4760]: I0930 07:33:35.020050 4760 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" seLinuxMountContext=""
Sep 30 07:33:35 crc kubenswrapper[4760]: I0930 07:33:35.020075 4760 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" seLinuxMountContext=""
Sep 30 07:33:35 crc kubenswrapper[4760]: I0930 07:33:35.020114 4760 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" seLinuxMountContext=""
Sep 30 07:33:35 crc kubenswrapper[4760]: I0930 07:33:35.020139 4760 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" seLinuxMountContext=""
Sep 30 07:33:35 crc kubenswrapper[4760]: I0930 07:33:35.020165 4760 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" seLinuxMountContext=""
Sep 30 07:33:35 crc kubenswrapper[4760]: I0930 07:33:35.020191 4760 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" seLinuxMountContext=""
Sep 30 07:33:35 crc kubenswrapper[4760]: I0930 07:33:35.020209 4760 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" seLinuxMountContext=""
Sep 30 07:33:35 crc kubenswrapper[4760]: I0930 07:33:35.020227 4760 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" seLinuxMountContext=""
Sep 30 07:33:35 crc kubenswrapper[4760]: I0930 07:33:35.020247 4760 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" seLinuxMountContext=""
Sep 30 07:33:35 crc kubenswrapper[4760]: I0930 07:33:35.020264 4760 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" seLinuxMountContext=""
Sep 30 07:33:35 crc kubenswrapper[4760]: I0930 07:33:35.020284 4760 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" seLinuxMountContext=""
Sep 30 07:33:35 crc kubenswrapper[4760]: I0930 07:33:35.020472 4760 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" seLinuxMountContext=""
Sep 30 07:33:35 crc kubenswrapper[4760]: I0930 07:33:35.020498 4760 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" seLinuxMountContext=""
Sep 30 07:33:35 crc kubenswrapper[4760]: I0930 07:33:35.020518 4760 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5" seLinuxMountContext=""
Sep 30 07:33:35 crc kubenswrapper[4760]: I0930 07:33:35.020541 4760 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" seLinuxMountContext=""
Sep 30 07:33:35 crc kubenswrapper[4760]: I0930 07:33:35.020563 4760 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" seLinuxMountContext=""
Sep 30 07:33:35 crc kubenswrapper[4760]: I0930 07:33:35.020586 4760 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" seLinuxMountContext=""
Sep 30 07:33:35 crc kubenswrapper[4760]: I0930 07:33:35.020612 4760 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" seLinuxMountContext=""
Sep 30 07:33:35 crc kubenswrapper[4760]: I0930 07:33:35.020634 4760 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" seLinuxMountContext=""
Sep 30 07:33:35 crc kubenswrapper[4760]: I0930 07:33:35.020657 4760 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d751cbb-f2e2-430d-9754-c882a5e924a5" volumeName="kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl" seLinuxMountContext=""
Sep 30 07:33:35 crc kubenswrapper[4760]: I0930 07:33:35.020678 4760 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" seLinuxMountContext=""
Sep 30 07:33:35 crc kubenswrapper[4760]: I0930 07:33:35.020711 4760 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" seLinuxMountContext=""
Sep 30 07:33:35 crc kubenswrapper[4760]: I0930 07:33:35.020732 4760 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" seLinuxMountContext=""
Sep 30 07:33:35 crc kubenswrapper[4760]: I0930 07:33:35.020754 4760 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" seLinuxMountContext=""
Sep 30 07:33:35 crc kubenswrapper[4760]: I0930 07:33:35.020794 4760 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" seLinuxMountContext=""
Sep 30 07:33:35 crc kubenswrapper[4760]: I0930 07:33:35.020814 4760 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" seLinuxMountContext=""
Sep 30 07:33:35 crc kubenswrapper[4760]: I0930 07:33:35.020835 4760 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" seLinuxMountContext=""
Sep 30 07:33:35 crc kubenswrapper[4760]: I0930 07:33:35.020857 4760 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" seLinuxMountContext=""
Sep 30 07:33:35 crc kubenswrapper[4760]: I0930 07:33:35.020879 4760 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" seLinuxMountContext=""
Sep 30 07:33:35 crc kubenswrapper[4760]: I0930 07:33:35.020906 4760 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" seLinuxMountContext=""
Sep 30 07:33:35 crc kubenswrapper[4760]: I0930 07:33:35.020930 4760 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" seLinuxMountContext=""
Sep 30 07:33:35 crc kubenswrapper[4760]: I0930 07:33:35.020950 4760 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" seLinuxMountContext=""
Sep 30 07:33:35 crc kubenswrapper[4760]: I0930 07:33:35.020970 4760 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" seLinuxMountContext=""
Sep 30 07:33:35 crc kubenswrapper[4760]: I0930 07:33:35.020993 4760 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" seLinuxMountContext=""
Sep 30 07:33:35 crc kubenswrapper[4760]: I0930 07:33:35.021013 4760 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" seLinuxMountContext=""
Sep 30 07:33:35 crc kubenswrapper[4760]: I0930 07:33:35.021033 4760 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" seLinuxMountContext=""
Sep 30 07:33:35 crc kubenswrapper[4760]: I0930 07:33:35.021053 4760 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" seLinuxMountContext=""
Sep 30 07:33:35 crc kubenswrapper[4760]: I0930 07:33:35.021074 4760 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" seLinuxMountContext=""
Sep 30 07:33:35 crc kubenswrapper[4760]: I0930 07:33:35.021096 4760 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" seLinuxMountContext=""
Sep 30 07:33:35 crc kubenswrapper[4760]: I0930 07:33:35.021118 4760 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" seLinuxMountContext=""
Sep 30 07:33:35 crc kubenswrapper[4760]: I0930 07:33:35.021149 4760 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" seLinuxMountContext=""
Sep 30 07:33:35 crc kubenswrapper[4760]: I0930 07:33:35.021175 4760 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" seLinuxMountContext=""
Sep 30 07:33:35 crc kubenswrapper[4760]: I0930 07:33:35.021202 4760 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" seLinuxMountContext=""
Sep 30 07:33:35 crc kubenswrapper[4760]: I0930 07:33:35.021229 4760 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls" seLinuxMountContext=""
Sep 30 07:33:35 crc kubenswrapper[4760]: I0930 07:33:35.021269 4760 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" seLinuxMountContext=""
Sep 30 07:33:35 crc kubenswrapper[4760]: I0930 07:33:35.021332 4760 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" volumeName="kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" seLinuxMountContext=""
Sep 30 07:33:35 crc kubenswrapper[4760]: I0930 07:33:35.021369 4760 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides" seLinuxMountContext=""
Sep 30 07:33:35 crc kubenswrapper[4760]: I0930 07:33:35.021401 4760 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" seLinuxMountContext=""
Sep 30 07:33:35 crc kubenswrapper[4760]: I0930 07:33:35.021430 4760 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" seLinuxMountContext=""
Sep 30 07:33:35 crc kubenswrapper[4760]: I0930 07:33:35.021461 4760 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" seLinuxMountContext=""
Sep 30 07:33:35 crc kubenswrapper[4760]: I0930 07:33:35.021490 4760 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" seLinuxMountContext=""
Sep 30 07:33:35 crc kubenswrapper[4760]: I0930 07:33:35.021517 4760 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" seLinuxMountContext=""
Sep 30 07:33:35 crc kubenswrapper[4760]: I0930 07:33:35.021541 4760 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" seLinuxMountContext=""
Sep 30 07:33:35 crc kubenswrapper[4760]: I0930 07:33:35.021564 4760 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" seLinuxMountContext=""
Sep 30 07:33:35 crc kubenswrapper[4760]: I0930 07:33:35.021586 4760 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" seLinuxMountContext=""
Sep 30 07:33:35 crc kubenswrapper[4760]: I0930 07:33:35.021606 4760 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" seLinuxMountContext=""
Sep 30 07:33:35 crc kubenswrapper[4760]: I0930 07:33:35.021627 4760 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" seLinuxMountContext=""
Sep 30 07:33:35 crc kubenswrapper[4760]: I0930 07:33:35.021647 4760 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" seLinuxMountContext=""
Sep 30 07:33:35 crc kubenswrapper[4760]: I0930 07:33:35.021666 4760 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" seLinuxMountContext=""
Sep 30 07:33:35 crc kubenswrapper[4760]: I0930 07:33:35.021685 4760 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" seLinuxMountContext=""
Sep 30 07:33:35 crc kubenswrapper[4760]: I0930 07:33:35.021705 4760 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" seLinuxMountContext=""
Sep 30 07:33:35 crc kubenswrapper[4760]: I0930 07:33:35.021724 4760 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" seLinuxMountContext=""
Sep 30 07:33:35 crc kubenswrapper[4760]: I0930 07:33:35.021746 4760 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" seLinuxMountContext=""
Sep 30 07:33:35 crc kubenswrapper[4760]: I0930 07:33:35.021765 4760 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" seLinuxMountContext=""
Sep 30 07:33:35 crc kubenswrapper[4760]: I0930 07:33:35.021784 4760 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" seLinuxMountContext=""
Sep 30 07:33:35 crc kubenswrapper[4760]: I0930 07:33:35.021804 4760 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" seLinuxMountContext=""
Sep 30 07:33:35 crc kubenswrapper[4760]: I0930 07:33:35.021823 4760 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" seLinuxMountContext=""
Sep 30 07:33:35 crc kubenswrapper[4760]: I0930 07:33:35.021842 4760 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" seLinuxMountContext=""
Sep 30 07:33:35 crc kubenswrapper[4760]: I0930 07:33:35.021864 4760 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" seLinuxMountContext=""
Sep 30 07:33:35 crc kubenswrapper[4760]: I0930 07:33:35.021883 4760 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" seLinuxMountContext=""
Sep 30 07:33:35 crc kubenswrapper[4760]: I0930 07:33:35.021902 4760 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" seLinuxMountContext=""
Sep 30 07:33:35 crc kubenswrapper[4760]: I0930 07:33:35.021922 4760 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" seLinuxMountContext=""
Sep 30 07:33:35 crc kubenswrapper[4760]: I0930 07:33:35.021941 4760 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" seLinuxMountContext=""
Sep 30 07:33:35 crc kubenswrapper[4760]: I0930 07:33:35.021959 4760 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" seLinuxMountContext=""
Sep 30 07:33:35 crc kubenswrapper[4760]: I0930 07:33:35.022087 4760 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" seLinuxMountContext=""
Sep 30 07:33:35 crc kubenswrapper[4760]: I0930 07:33:35.022115 4760 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" seLinuxMountContext=""
Sep 30 07:33:35 crc kubenswrapper[4760]: I0930 07:33:35.022134 4760 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" seLinuxMountContext=""
Sep 30 07:33:35 crc kubenswrapper[4760]: I0930 07:33:35.022154 4760 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" seLinuxMountContext=""
Sep 30 07:33:35 crc kubenswrapper[4760]: I0930 07:33:35.022175 4760 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" seLinuxMountContext=""
Sep 30 07:33:35 crc kubenswrapper[4760]: I0930 07:33:35.022226 4760 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" seLinuxMountContext=""
Sep 30 07:33:35 crc kubenswrapper[4760]: I0930 07:33:35.022254 4760 reconstruct.go:130] "Volume is marked as uncertain and added into the actual
state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" seLinuxMountContext="" Sep 30 07:33:35 crc kubenswrapper[4760]: I0930 07:33:35.022279 4760 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" volumeName="kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" seLinuxMountContext="" Sep 30 07:33:35 crc kubenswrapper[4760]: I0930 07:33:35.022338 4760 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert" seLinuxMountContext="" Sep 30 07:33:35 crc kubenswrapper[4760]: I0930 07:33:35.022367 4760 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" seLinuxMountContext="" Sep 30 07:33:35 crc kubenswrapper[4760]: I0930 07:33:35.022396 4760 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" seLinuxMountContext="" Sep 30 07:33:35 crc kubenswrapper[4760]: I0930 07:33:35.022467 4760 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" seLinuxMountContext="" Sep 30 07:33:35 crc kubenswrapper[4760]: I0930 07:33:35.022497 4760 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" 
volumeName="kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" seLinuxMountContext="" Sep 30 07:33:35 crc kubenswrapper[4760]: I0930 07:33:35.022553 4760 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" seLinuxMountContext="" Sep 30 07:33:35 crc kubenswrapper[4760]: I0930 07:33:35.022576 4760 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" seLinuxMountContext="" Sep 30 07:33:35 crc kubenswrapper[4760]: I0930 07:33:35.022598 4760 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" seLinuxMountContext="" Sep 30 07:33:35 crc kubenswrapper[4760]: I0930 07:33:35.022624 4760 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" seLinuxMountContext="" Sep 30 07:33:35 crc kubenswrapper[4760]: I0930 07:33:35.022653 4760 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" seLinuxMountContext="" Sep 30 07:33:35 crc kubenswrapper[4760]: I0930 07:33:35.022679 4760 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" 
volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" seLinuxMountContext="" Sep 30 07:33:35 crc kubenswrapper[4760]: I0930 07:33:35.022706 4760 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" seLinuxMountContext="" Sep 30 07:33:35 crc kubenswrapper[4760]: I0930 07:33:35.022735 4760 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" seLinuxMountContext="" Sep 30 07:33:35 crc kubenswrapper[4760]: I0930 07:33:35.022791 4760 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb" seLinuxMountContext="" Sep 30 07:33:35 crc kubenswrapper[4760]: I0930 07:33:35.022908 4760 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" seLinuxMountContext="" Sep 30 07:33:35 crc kubenswrapper[4760]: I0930 07:33:35.022932 4760 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" seLinuxMountContext="" Sep 30 07:33:35 crc kubenswrapper[4760]: I0930 07:33:35.022953 4760 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" 
volumeName="kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" seLinuxMountContext="" Sep 30 07:33:35 crc kubenswrapper[4760]: I0930 07:33:35.022974 4760 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" seLinuxMountContext="" Sep 30 07:33:35 crc kubenswrapper[4760]: I0930 07:33:35.022993 4760 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" seLinuxMountContext="" Sep 30 07:33:35 crc kubenswrapper[4760]: I0930 07:33:35.023013 4760 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" seLinuxMountContext="" Sep 30 07:33:35 crc kubenswrapper[4760]: I0930 07:33:35.023033 4760 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" seLinuxMountContext="" Sep 30 07:33:35 crc kubenswrapper[4760]: I0930 07:33:35.023081 4760 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3b6479f0-333b-4a96-9adf-2099afdc2447" volumeName="kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr" seLinuxMountContext="" Sep 30 07:33:35 crc kubenswrapper[4760]: I0930 07:33:35.023101 4760 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" 
volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" seLinuxMountContext="" Sep 30 07:33:35 crc kubenswrapper[4760]: I0930 07:33:35.023122 4760 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" seLinuxMountContext="" Sep 30 07:33:35 crc kubenswrapper[4760]: I0930 07:33:35.023142 4760 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" seLinuxMountContext="" Sep 30 07:33:35 crc kubenswrapper[4760]: I0930 07:33:35.023160 4760 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf" seLinuxMountContext="" Sep 30 07:33:35 crc kubenswrapper[4760]: I0930 07:33:35.023180 4760 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" seLinuxMountContext="" Sep 30 07:33:35 crc kubenswrapper[4760]: I0930 07:33:35.023202 4760 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" seLinuxMountContext="" Sep 30 07:33:35 crc kubenswrapper[4760]: I0930 07:33:35.023228 4760 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" 
volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" seLinuxMountContext="" Sep 30 07:33:35 crc kubenswrapper[4760]: I0930 07:33:35.023409 4760 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" seLinuxMountContext="" Sep 30 07:33:35 crc kubenswrapper[4760]: I0930 07:33:35.023433 4760 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" seLinuxMountContext="" Sep 30 07:33:35 crc kubenswrapper[4760]: I0930 07:33:35.023509 4760 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" seLinuxMountContext="" Sep 30 07:33:35 crc kubenswrapper[4760]: I0930 07:33:35.023528 4760 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" seLinuxMountContext="" Sep 30 07:33:35 crc kubenswrapper[4760]: I0930 07:33:35.023555 4760 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" seLinuxMountContext="" Sep 30 07:33:35 crc kubenswrapper[4760]: I0930 07:33:35.023575 4760 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" 
seLinuxMountContext="" Sep 30 07:33:35 crc kubenswrapper[4760]: I0930 07:33:35.023631 4760 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" seLinuxMountContext="" Sep 30 07:33:35 crc kubenswrapper[4760]: I0930 07:33:35.023651 4760 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" seLinuxMountContext="" Sep 30 07:33:35 crc kubenswrapper[4760]: I0930 07:33:35.023706 4760 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" seLinuxMountContext="" Sep 30 07:33:35 crc kubenswrapper[4760]: I0930 07:33:35.023725 4760 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" seLinuxMountContext="" Sep 30 07:33:35 crc kubenswrapper[4760]: I0930 07:33:35.023742 4760 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm" seLinuxMountContext="" Sep 30 07:33:35 crc kubenswrapper[4760]: I0930 07:33:35.023759 4760 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" seLinuxMountContext="" Sep 30 07:33:35 crc kubenswrapper[4760]: I0930 07:33:35.023776 4760 
reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49ef4625-1d3a-4a9f-b595-c2433d32326d" volumeName="kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" seLinuxMountContext="" Sep 30 07:33:35 crc kubenswrapper[4760]: I0930 07:33:35.023793 4760 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert" seLinuxMountContext="" Sep 30 07:33:35 crc kubenswrapper[4760]: I0930 07:33:35.023810 4760 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="44663579-783b-4372-86d6-acf235a62d72" volumeName="kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" seLinuxMountContext="" Sep 30 07:33:35 crc kubenswrapper[4760]: I0930 07:33:35.023888 4760 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" seLinuxMountContext="" Sep 30 07:33:35 crc kubenswrapper[4760]: I0930 07:33:35.023972 4760 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" seLinuxMountContext="" Sep 30 07:33:35 crc kubenswrapper[4760]: I0930 07:33:35.023997 4760 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" seLinuxMountContext="" Sep 30 07:33:35 crc kubenswrapper[4760]: I0930 07:33:35.024022 4760 reconstruct.go:130] "Volume is marked as uncertain and added into the 
actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" seLinuxMountContext="" Sep 30 07:33:35 crc kubenswrapper[4760]: I0930 07:33:35.024043 4760 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" seLinuxMountContext="" Sep 30 07:33:35 crc kubenswrapper[4760]: I0930 07:33:35.024061 4760 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" seLinuxMountContext="" Sep 30 07:33:35 crc kubenswrapper[4760]: I0930 07:33:35.024080 4760 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" seLinuxMountContext="" Sep 30 07:33:35 crc kubenswrapper[4760]: I0930 07:33:35.024099 4760 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" seLinuxMountContext="" Sep 30 07:33:35 crc kubenswrapper[4760]: I0930 07:33:35.024118 4760 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script" seLinuxMountContext="" Sep 30 07:33:35 crc kubenswrapper[4760]: I0930 07:33:35.024175 4760 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" seLinuxMountContext="" Sep 30 07:33:35 crc kubenswrapper[4760]: I0930 07:33:35.024193 4760 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" seLinuxMountContext="" Sep 30 07:33:35 crc kubenswrapper[4760]: I0930 07:33:35.024211 4760 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" seLinuxMountContext="" Sep 30 07:33:35 crc kubenswrapper[4760]: I0930 07:33:35.024230 4760 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" seLinuxMountContext="" Sep 30 07:33:35 crc kubenswrapper[4760]: I0930 07:33:35.024246 4760 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" seLinuxMountContext="" Sep 30 07:33:35 crc kubenswrapper[4760]: I0930 07:33:35.024264 4760 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" seLinuxMountContext="" Sep 30 07:33:35 crc kubenswrapper[4760]: I0930 07:33:35.024284 4760 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" 
volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" seLinuxMountContext="" Sep 30 07:33:35 crc kubenswrapper[4760]: I0930 07:33:35.024348 4760 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" seLinuxMountContext="" Sep 30 07:33:35 crc kubenswrapper[4760]: I0930 07:33:35.024395 4760 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" seLinuxMountContext="" Sep 30 07:33:35 crc kubenswrapper[4760]: I0930 07:33:35.024413 4760 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" seLinuxMountContext="" Sep 30 07:33:35 crc kubenswrapper[4760]: I0930 07:33:35.024431 4760 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" seLinuxMountContext="" Sep 30 07:33:35 crc kubenswrapper[4760]: I0930 07:33:35.024451 4760 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" seLinuxMountContext="" Sep 30 07:33:35 crc kubenswrapper[4760]: I0930 07:33:35.024472 4760 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" 
seLinuxMountContext="" Sep 30 07:33:35 crc kubenswrapper[4760]: I0930 07:33:35.024492 4760 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" seLinuxMountContext="" Sep 30 07:33:35 crc kubenswrapper[4760]: I0930 07:33:35.024615 4760 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" seLinuxMountContext="" Sep 30 07:33:35 crc kubenswrapper[4760]: I0930 07:33:35.024633 4760 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" seLinuxMountContext="" Sep 30 07:33:35 crc kubenswrapper[4760]: I0930 07:33:35.024718 4760 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" seLinuxMountContext="" Sep 30 07:33:35 crc kubenswrapper[4760]: I0930 07:33:35.024735 4760 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" seLinuxMountContext="" Sep 30 07:33:35 crc kubenswrapper[4760]: I0930 07:33:35.024754 4760 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" seLinuxMountContext="" Sep 30 07:33:35 crc kubenswrapper[4760]: I0930 07:33:35.024772 4760 
reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" seLinuxMountContext="" Sep 30 07:33:35 crc kubenswrapper[4760]: I0930 07:33:35.024790 4760 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" seLinuxMountContext="" Sep 30 07:33:35 crc kubenswrapper[4760]: I0930 07:33:35.024864 4760 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" seLinuxMountContext="" Sep 30 07:33:35 crc kubenswrapper[4760]: I0930 07:33:35.024881 4760 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" seLinuxMountContext="" Sep 30 07:33:35 crc kubenswrapper[4760]: I0930 07:33:35.024898 4760 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" seLinuxMountContext="" Sep 30 07:33:35 crc kubenswrapper[4760]: I0930 07:33:35.024952 4760 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" seLinuxMountContext="" Sep 30 07:33:35 crc kubenswrapper[4760]: I0930 07:33:35.024970 4760 reconstruct.go:130] "Volume is marked as uncertain and added into 
the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" seLinuxMountContext="" Sep 30 07:33:35 crc kubenswrapper[4760]: I0930 07:33:35.024988 4760 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" seLinuxMountContext="" Sep 30 07:33:35 crc kubenswrapper[4760]: I0930 07:33:35.025005 4760 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" seLinuxMountContext="" Sep 30 07:33:35 crc kubenswrapper[4760]: I0930 07:33:35.025022 4760 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" seLinuxMountContext="" Sep 30 07:33:35 crc kubenswrapper[4760]: I0930 07:33:35.025042 4760 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" seLinuxMountContext="" Sep 30 07:33:35 crc kubenswrapper[4760]: I0930 07:33:35.025059 4760 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" seLinuxMountContext="" Sep 30 07:33:35 crc kubenswrapper[4760]: I0930 07:33:35.025077 4760 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" 
volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" seLinuxMountContext="" Sep 30 07:33:35 crc kubenswrapper[4760]: I0930 07:33:35.025133 4760 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" seLinuxMountContext="" Sep 30 07:33:35 crc kubenswrapper[4760]: I0930 07:33:35.025151 4760 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" seLinuxMountContext="" Sep 30 07:33:35 crc kubenswrapper[4760]: I0930 07:33:35.025198 4760 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" seLinuxMountContext="" Sep 30 07:33:35 crc kubenswrapper[4760]: I0930 07:33:35.025278 4760 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" seLinuxMountContext="" Sep 30 07:33:35 crc kubenswrapper[4760]: I0930 07:33:35.025296 4760 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" seLinuxMountContext="" Sep 30 07:33:35 crc kubenswrapper[4760]: I0930 07:33:35.025338 4760 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" 
volumeName="kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" seLinuxMountContext="" Sep 30 07:33:35 crc kubenswrapper[4760]: I0930 07:33:35.025357 4760 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" seLinuxMountContext="" Sep 30 07:33:35 crc kubenswrapper[4760]: I0930 07:33:35.025377 4760 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" seLinuxMountContext="" Sep 30 07:33:35 crc kubenswrapper[4760]: I0930 07:33:35.025425 4760 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" seLinuxMountContext="" Sep 30 07:33:35 crc kubenswrapper[4760]: I0930 07:33:35.025443 4760 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" seLinuxMountContext="" Sep 30 07:33:35 crc kubenswrapper[4760]: I0930 07:33:35.025462 4760 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" seLinuxMountContext="" Sep 30 07:33:35 crc kubenswrapper[4760]: I0930 07:33:35.025482 4760 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" 
seLinuxMountContext="" Sep 30 07:33:35 crc kubenswrapper[4760]: I0930 07:33:35.025530 4760 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" seLinuxMountContext="" Sep 30 07:33:35 crc kubenswrapper[4760]: I0930 07:33:35.025549 4760 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" seLinuxMountContext="" Sep 30 07:33:35 crc kubenswrapper[4760]: I0930 07:33:35.025595 4760 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" seLinuxMountContext="" Sep 30 07:33:35 crc kubenswrapper[4760]: I0930 07:33:35.025612 4760 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" seLinuxMountContext="" Sep 30 07:33:35 crc kubenswrapper[4760]: I0930 07:33:35.025663 4760 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" seLinuxMountContext="" Sep 30 07:33:35 crc kubenswrapper[4760]: I0930 07:33:35.025711 4760 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" seLinuxMountContext="" Sep 30 07:33:35 crc kubenswrapper[4760]: I0930 07:33:35.025729 4760 reconstruct.go:130] "Volume is 
marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" seLinuxMountContext="" Sep 30 07:33:35 crc kubenswrapper[4760]: I0930 07:33:35.025747 4760 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf" seLinuxMountContext="" Sep 30 07:33:35 crc kubenswrapper[4760]: I0930 07:33:35.025821 4760 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" seLinuxMountContext="" Sep 30 07:33:35 crc kubenswrapper[4760]: I0930 07:33:35.025840 4760 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" seLinuxMountContext="" Sep 30 07:33:35 crc kubenswrapper[4760]: I0930 07:33:35.025857 4760 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" seLinuxMountContext="" Sep 30 07:33:35 crc kubenswrapper[4760]: I0930 07:33:35.025875 4760 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" seLinuxMountContext="" Sep 30 07:33:35 crc kubenswrapper[4760]: I0930 07:33:35.028716 4760 reconstruct.go:144] "Volume is marked device as uncertain and added into the actual state" 
volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" deviceMountPath="/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount" Sep 30 07:33:35 crc kubenswrapper[4760]: I0930 07:33:35.028778 4760 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" seLinuxMountContext="" Sep 30 07:33:35 crc kubenswrapper[4760]: I0930 07:33:35.028804 4760 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" seLinuxMountContext="" Sep 30 07:33:35 crc kubenswrapper[4760]: I0930 07:33:35.028833 4760 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" seLinuxMountContext="" Sep 30 07:33:35 crc kubenswrapper[4760]: I0930 07:33:35.028888 4760 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" seLinuxMountContext="" Sep 30 07:33:35 crc kubenswrapper[4760]: I0930 07:33:35.028950 4760 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" seLinuxMountContext="" Sep 30 07:33:35 crc kubenswrapper[4760]: I0930 07:33:35.028970 4760 reconstruct.go:130] "Volume is marked as uncertain and 
added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" seLinuxMountContext="" Sep 30 07:33:35 crc kubenswrapper[4760]: I0930 07:33:35.028989 4760 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" seLinuxMountContext="" Sep 30 07:33:35 crc kubenswrapper[4760]: I0930 07:33:35.029063 4760 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" seLinuxMountContext="" Sep 30 07:33:35 crc kubenswrapper[4760]: I0930 07:33:35.029078 4760 reconstruct.go:97] "Volume reconstruction finished" Sep 30 07:33:35 crc kubenswrapper[4760]: I0930 07:33:35.029088 4760 reconciler.go:26] "Reconciler: start to sync state" Sep 30 07:33:35 crc kubenswrapper[4760]: I0930 07:33:35.048346 4760 manager.go:324] Recovery completed Sep 30 07:33:35 crc kubenswrapper[4760]: I0930 07:33:35.061194 4760 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4" Sep 30 07:33:35 crc kubenswrapper[4760]: I0930 07:33:35.064876 4760 kubelet_network_linux.go:50] "Initialized iptables rules." 
protocol="IPv6" Sep 30 07:33:35 crc kubenswrapper[4760]: I0930 07:33:35.065046 4760 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 30 07:33:35 crc kubenswrapper[4760]: I0930 07:33:35.065552 4760 status_manager.go:217] "Starting to sync pod status with apiserver" Sep 30 07:33:35 crc kubenswrapper[4760]: I0930 07:33:35.065632 4760 kubelet.go:2335] "Starting kubelet main sync loop" Sep 30 07:33:35 crc kubenswrapper[4760]: E0930 07:33:35.065946 4760 kubelet.go:2359] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Sep 30 07:33:35 crc kubenswrapper[4760]: I0930 07:33:35.067864 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:33:35 crc kubenswrapper[4760]: I0930 07:33:35.067924 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:33:35 crc kubenswrapper[4760]: I0930 07:33:35.067944 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:33:35 crc kubenswrapper[4760]: W0930 07:33:35.068050 4760 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.201:6443: connect: connection refused Sep 30 07:33:35 crc kubenswrapper[4760]: E0930 07:33:35.068266 4760 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.201:6443: connect: connection refused" logger="UnhandledError" Sep 30 07:33:35 crc kubenswrapper[4760]: I0930 07:33:35.073691 4760 
cpu_manager.go:225] "Starting CPU manager" policy="none" Sep 30 07:33:35 crc kubenswrapper[4760]: I0930 07:33:35.073733 4760 cpu_manager.go:226] "Reconciling" reconcilePeriod="10s" Sep 30 07:33:35 crc kubenswrapper[4760]: I0930 07:33:35.073769 4760 state_mem.go:36] "Initialized new in-memory state store" Sep 30 07:33:35 crc kubenswrapper[4760]: E0930 07:33:35.091593 4760 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Sep 30 07:33:35 crc kubenswrapper[4760]: I0930 07:33:35.101422 4760 policy_none.go:49] "None policy: Start" Sep 30 07:33:35 crc kubenswrapper[4760]: I0930 07:33:35.102941 4760 memory_manager.go:170] "Starting memorymanager" policy="None" Sep 30 07:33:35 crc kubenswrapper[4760]: I0930 07:33:35.102977 4760 state_mem.go:35] "Initializing new in-memory state store" Sep 30 07:33:35 crc kubenswrapper[4760]: I0930 07:33:35.157702 4760 manager.go:334] "Starting Device Plugin manager" Sep 30 07:33:35 crc kubenswrapper[4760]: I0930 07:33:35.158641 4760 manager.go:513] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" Sep 30 07:33:35 crc kubenswrapper[4760]: I0930 07:33:35.158661 4760 server.go:79] "Starting device plugin registration server" Sep 30 07:33:35 crc kubenswrapper[4760]: I0930 07:33:35.159265 4760 eviction_manager.go:189] "Eviction manager: starting control loop" Sep 30 07:33:35 crc kubenswrapper[4760]: I0930 07:33:35.159283 4760 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Sep 30 07:33:35 crc kubenswrapper[4760]: I0930 07:33:35.159475 4760 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Sep 30 07:33:35 crc kubenswrapper[4760]: I0930 07:33:35.159645 4760 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Sep 30 07:33:35 crc kubenswrapper[4760]: I0930 07:33:35.159655 4760 plugin_manager.go:118] "Starting 
Kubelet Plugin Manager" Sep 30 07:33:35 crc kubenswrapper[4760]: I0930 07:33:35.166662 4760 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-scheduler/openshift-kube-scheduler-crc","openshift-machine-config-operator/kube-rbac-proxy-crio-crc","openshift-etcd/etcd-crc","openshift-kube-apiserver/kube-apiserver-crc","openshift-kube-controller-manager/kube-controller-manager-crc"] Sep 30 07:33:35 crc kubenswrapper[4760]: I0930 07:33:35.166860 4760 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 30 07:33:35 crc kubenswrapper[4760]: I0930 07:33:35.168958 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:33:35 crc kubenswrapper[4760]: I0930 07:33:35.169005 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:33:35 crc kubenswrapper[4760]: I0930 07:33:35.169018 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:33:35 crc kubenswrapper[4760]: I0930 07:33:35.169173 4760 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 30 07:33:35 crc kubenswrapper[4760]: E0930 07:33:35.169280 4760 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Sep 30 07:33:35 crc kubenswrapper[4760]: I0930 07:33:35.169516 4760 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Sep 30 07:33:35 crc kubenswrapper[4760]: I0930 07:33:35.169556 4760 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 30 07:33:35 crc kubenswrapper[4760]: I0930 07:33:35.169999 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:33:35 crc kubenswrapper[4760]: I0930 07:33:35.170034 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:33:35 crc kubenswrapper[4760]: I0930 07:33:35.170043 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:33:35 crc kubenswrapper[4760]: I0930 07:33:35.170138 4760 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 30 07:33:35 crc kubenswrapper[4760]: I0930 07:33:35.170256 4760 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Sep 30 07:33:35 crc kubenswrapper[4760]: I0930 07:33:35.170286 4760 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 30 07:33:35 crc kubenswrapper[4760]: I0930 07:33:35.170693 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:33:35 crc kubenswrapper[4760]: I0930 07:33:35.170732 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:33:35 crc kubenswrapper[4760]: I0930 07:33:35.170745 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:33:35 crc kubenswrapper[4760]: I0930 07:33:35.170763 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:33:35 crc kubenswrapper[4760]: I0930 07:33:35.170773 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:33:35 crc kubenswrapper[4760]: I0930 07:33:35.170750 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:33:35 crc kubenswrapper[4760]: I0930 07:33:35.170860 4760 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 30 07:33:35 crc kubenswrapper[4760]: I0930 07:33:35.171135 4760 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd/etcd-crc" Sep 30 07:33:35 crc kubenswrapper[4760]: I0930 07:33:35.171213 4760 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 30 07:33:35 crc kubenswrapper[4760]: I0930 07:33:35.171795 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:33:35 crc kubenswrapper[4760]: I0930 07:33:35.171885 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:33:35 crc kubenswrapper[4760]: I0930 07:33:35.171912 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:33:35 crc kubenswrapper[4760]: I0930 07:33:35.171928 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:33:35 crc kubenswrapper[4760]: I0930 07:33:35.171952 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:33:35 crc kubenswrapper[4760]: I0930 07:33:35.171969 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:33:35 crc kubenswrapper[4760]: I0930 07:33:35.172182 4760 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 30 07:33:35 crc kubenswrapper[4760]: I0930 07:33:35.172273 4760 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Sep 30 07:33:35 crc kubenswrapper[4760]: I0930 07:33:35.172354 4760 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 30 07:33:35 crc kubenswrapper[4760]: I0930 07:33:35.173391 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:33:35 crc kubenswrapper[4760]: I0930 07:33:35.173434 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:33:35 crc kubenswrapper[4760]: I0930 07:33:35.173448 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:33:35 crc kubenswrapper[4760]: I0930 07:33:35.173838 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:33:35 crc kubenswrapper[4760]: I0930 07:33:35.173896 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:33:35 crc kubenswrapper[4760]: I0930 07:33:35.173915 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:33:35 crc kubenswrapper[4760]: I0930 07:33:35.173848 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:33:35 crc kubenswrapper[4760]: I0930 07:33:35.173974 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:33:35 crc kubenswrapper[4760]: I0930 07:33:35.173986 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:33:35 crc kubenswrapper[4760]: I0930 07:33:35.174203 4760 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Sep 30 07:33:35 crc kubenswrapper[4760]: I0930 07:33:35.174256 4760 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 30 07:33:35 crc kubenswrapper[4760]: I0930 07:33:35.175326 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:33:35 crc kubenswrapper[4760]: I0930 07:33:35.175358 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:33:35 crc kubenswrapper[4760]: I0930 07:33:35.175392 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:33:35 crc kubenswrapper[4760]: E0930 07:33:35.201036 4760 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.201:6443: connect: connection refused" interval="400ms" Sep 30 07:33:35 crc kubenswrapper[4760]: I0930 07:33:35.232134 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Sep 30 07:33:35 crc kubenswrapper[4760]: I0930 07:33:35.232170 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Sep 30 07:33:35 crc kubenswrapper[4760]: I0930 07:33:35.232197 4760 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Sep 30 07:33:35 crc kubenswrapper[4760]: I0930 07:33:35.232215 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Sep 30 07:33:35 crc kubenswrapper[4760]: I0930 07:33:35.232232 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Sep 30 07:33:35 crc kubenswrapper[4760]: I0930 07:33:35.232252 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Sep 30 07:33:35 crc kubenswrapper[4760]: I0930 07:33:35.232268 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Sep 30 07:33:35 crc kubenswrapper[4760]: I0930 07:33:35.232286 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" 
(UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Sep 30 07:33:35 crc kubenswrapper[4760]: I0930 07:33:35.232366 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Sep 30 07:33:35 crc kubenswrapper[4760]: I0930 07:33:35.232413 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Sep 30 07:33:35 crc kubenswrapper[4760]: I0930 07:33:35.232438 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Sep 30 07:33:35 crc kubenswrapper[4760]: I0930 07:33:35.232550 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Sep 30 07:33:35 crc kubenswrapper[4760]: I0930 07:33:35.232610 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: 
\"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Sep 30 07:33:35 crc kubenswrapper[4760]: I0930 07:33:35.232633 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Sep 30 07:33:35 crc kubenswrapper[4760]: I0930 07:33:35.232662 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Sep 30 07:33:35 crc kubenswrapper[4760]: I0930 07:33:35.259875 4760 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 30 07:33:35 crc kubenswrapper[4760]: I0930 07:33:35.261541 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:33:35 crc kubenswrapper[4760]: I0930 07:33:35.261606 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:33:35 crc kubenswrapper[4760]: I0930 07:33:35.261620 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:33:35 crc kubenswrapper[4760]: I0930 07:33:35.261665 4760 kubelet_node_status.go:76] "Attempting to register node" node="crc" Sep 30 07:33:35 crc kubenswrapper[4760]: E0930 07:33:35.262424 4760 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.201:6443: connect: connection 
refused" node="crc" Sep 30 07:33:35 crc kubenswrapper[4760]: I0930 07:33:35.333748 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Sep 30 07:33:35 crc kubenswrapper[4760]: I0930 07:33:35.334065 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Sep 30 07:33:35 crc kubenswrapper[4760]: I0930 07:33:35.334246 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Sep 30 07:33:35 crc kubenswrapper[4760]: I0930 07:33:35.334399 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Sep 30 07:33:35 crc kubenswrapper[4760]: I0930 07:33:35.334453 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Sep 30 07:33:35 crc kubenswrapper[4760]: I0930 07:33:35.334514 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: 
\"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Sep 30 07:33:35 crc kubenswrapper[4760]: I0930 07:33:35.334552 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Sep 30 07:33:35 crc kubenswrapper[4760]: I0930 07:33:35.334600 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Sep 30 07:33:35 crc kubenswrapper[4760]: I0930 07:33:35.334602 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Sep 30 07:33:35 crc kubenswrapper[4760]: I0930 07:33:35.334647 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Sep 30 07:33:35 crc kubenswrapper[4760]: I0930 07:33:35.334705 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " 
pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Sep 30 07:33:35 crc kubenswrapper[4760]: I0930 07:33:35.334732 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Sep 30 07:33:35 crc kubenswrapper[4760]: I0930 07:33:35.334746 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Sep 30 07:33:35 crc kubenswrapper[4760]: I0930 07:33:35.334812 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Sep 30 07:33:35 crc kubenswrapper[4760]: I0930 07:33:35.334819 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Sep 30 07:33:35 crc kubenswrapper[4760]: I0930 07:33:35.334777 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Sep 30 07:33:35 crc kubenswrapper[4760]: I0930 07:33:35.334848 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Sep 30 07:33:35 crc kubenswrapper[4760]: I0930 07:33:35.334893 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Sep 30 07:33:35 crc kubenswrapper[4760]: I0930 07:33:35.334931 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Sep 30 07:33:35 crc kubenswrapper[4760]: I0930 07:33:35.334947 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Sep 30 07:33:35 crc kubenswrapper[4760]: I0930 07:33:35.335006 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Sep 30 07:33:35 crc kubenswrapper[4760]: I0930 07:33:35.334967 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " 
pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Sep 30 07:33:35 crc kubenswrapper[4760]: I0930 07:33:35.335149 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Sep 30 07:33:35 crc kubenswrapper[4760]: I0930 07:33:35.335195 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Sep 30 07:33:35 crc kubenswrapper[4760]: I0930 07:33:35.335240 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Sep 30 07:33:35 crc kubenswrapper[4760]: I0930 07:33:35.335264 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Sep 30 07:33:35 crc kubenswrapper[4760]: I0930 07:33:35.335297 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Sep 30 07:33:35 crc kubenswrapper[4760]: I0930 07:33:35.335345 4760 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Sep 30 07:33:35 crc kubenswrapper[4760]: I0930 07:33:35.335378 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Sep 30 07:33:35 crc kubenswrapper[4760]: I0930 07:33:35.335520 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Sep 30 07:33:35 crc kubenswrapper[4760]: I0930 07:33:35.463052 4760 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 30 07:33:35 crc kubenswrapper[4760]: I0930 07:33:35.470085 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:33:35 crc kubenswrapper[4760]: I0930 07:33:35.470139 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:33:35 crc kubenswrapper[4760]: I0930 07:33:35.470153 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:33:35 crc kubenswrapper[4760]: I0930 07:33:35.470186 4760 kubelet_node_status.go:76] "Attempting to register node" node="crc" Sep 30 07:33:35 crc kubenswrapper[4760]: E0930 07:33:35.470866 4760 kubelet_node_status.go:99] "Unable to register node with API server" 
err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.201:6443: connect: connection refused" node="crc" Sep 30 07:33:35 crc kubenswrapper[4760]: I0930 07:33:35.492682 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Sep 30 07:33:35 crc kubenswrapper[4760]: I0930 07:33:35.506860 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Sep 30 07:33:35 crc kubenswrapper[4760]: I0930 07:33:35.522513 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-crc" Sep 30 07:33:35 crc kubenswrapper[4760]: I0930 07:33:35.543646 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Sep 30 07:33:35 crc kubenswrapper[4760]: I0930 07:33:35.547534 4760 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Sep 30 07:33:35 crc kubenswrapper[4760]: W0930 07:33:35.548566 4760 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3dcd261975c3d6b9a6ad6367fd4facd3.slice/crio-2eddd13666d164fcda2be432e76decd5bff667d2a284205973434e7339404305 WatchSource:0}: Error finding container 2eddd13666d164fcda2be432e76decd5bff667d2a284205973434e7339404305: Status 404 returned error can't find the container with id 2eddd13666d164fcda2be432e76decd5bff667d2a284205973434e7339404305 Sep 30 07:33:35 crc kubenswrapper[4760]: W0930 07:33:35.556853 4760 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd1b160f5dda77d281dd8e69ec8d817f9.slice/crio-0469e907b48042d2d8448f93cb4c53beae53de4216ebae8822c47fdf2b769e40 WatchSource:0}: Error finding container 0469e907b48042d2d8448f93cb4c53beae53de4216ebae8822c47fdf2b769e40: Status 404 returned error can't find the container with id 0469e907b48042d2d8448f93cb4c53beae53de4216ebae8822c47fdf2b769e40 Sep 30 07:33:35 crc kubenswrapper[4760]: W0930 07:33:35.562102 4760 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2139d3e2895fc6797b9c76a1b4c9886d.slice/crio-734453f3ef37d2da9d77a2c0c09c4629f8b66d62864c95ce81971a34ac526c78 WatchSource:0}: Error finding container 734453f3ef37d2da9d77a2c0c09c4629f8b66d62864c95ce81971a34ac526c78: Status 404 returned error can't find the container with id 734453f3ef37d2da9d77a2c0c09c4629f8b66d62864c95ce81971a34ac526c78 Sep 30 07:33:35 crc kubenswrapper[4760]: W0930 07:33:35.572084 4760 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf614b9022728cf315e60c057852e563e.slice/crio-478b9e157bd89d73bd26eb96243984f958103ad0cc0ba5a22de14bce91e1ce7b WatchSource:0}: Error finding container 478b9e157bd89d73bd26eb96243984f958103ad0cc0ba5a22de14bce91e1ce7b: Status 404 returned error can't find the container with id 478b9e157bd89d73bd26eb96243984f958103ad0cc0ba5a22de14bce91e1ce7b Sep 30 07:33:35 crc kubenswrapper[4760]: W0930 07:33:35.574933 4760 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf4b27818a5e8e43d0dc095d08835c792.slice/crio-90b808bf4759480b1596411b789510bd28f659bd74c3e55511488f1612a3889a WatchSource:0}: Error finding container 90b808bf4759480b1596411b789510bd28f659bd74c3e55511488f1612a3889a: Status 404 returned error can't find the container with id 90b808bf4759480b1596411b789510bd28f659bd74c3e55511488f1612a3889a Sep 30 07:33:35 crc kubenswrapper[4760]: E0930 07:33:35.602722 4760 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.201:6443: connect: connection refused" interval="800ms" Sep 30 07:33:35 crc kubenswrapper[4760]: W0930 07:33:35.868882 4760 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.201:6443: connect: connection refused Sep 30 07:33:35 crc kubenswrapper[4760]: E0930 07:33:35.869031 4760 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.201:6443: connect: connection refused" logger="UnhandledError" Sep 30 
07:33:35 crc kubenswrapper[4760]: I0930 07:33:35.871454 4760 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 30 07:33:35 crc kubenswrapper[4760]: I0930 07:33:35.873440 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:33:35 crc kubenswrapper[4760]: I0930 07:33:35.873495 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:33:35 crc kubenswrapper[4760]: I0930 07:33:35.873517 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:33:35 crc kubenswrapper[4760]: I0930 07:33:35.873557 4760 kubelet_node_status.go:76] "Attempting to register node" node="crc" Sep 30 07:33:35 crc kubenswrapper[4760]: E0930 07:33:35.874115 4760 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.201:6443: connect: connection refused" node="crc" Sep 30 07:33:35 crc kubenswrapper[4760]: I0930 07:33:35.990887 4760 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.201:6443: connect: connection refused Sep 30 07:33:36 crc kubenswrapper[4760]: I0930 07:33:36.078761 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"2eddd13666d164fcda2be432e76decd5bff667d2a284205973434e7339404305"} Sep 30 07:33:36 crc kubenswrapper[4760]: I0930 07:33:36.080354 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" 
event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"90b808bf4759480b1596411b789510bd28f659bd74c3e55511488f1612a3889a"} Sep 30 07:33:36 crc kubenswrapper[4760]: I0930 07:33:36.082028 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"478b9e157bd89d73bd26eb96243984f958103ad0cc0ba5a22de14bce91e1ce7b"} Sep 30 07:33:36 crc kubenswrapper[4760]: I0930 07:33:36.083470 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"734453f3ef37d2da9d77a2c0c09c4629f8b66d62864c95ce81971a34ac526c78"} Sep 30 07:33:36 crc kubenswrapper[4760]: I0930 07:33:36.085010 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"0469e907b48042d2d8448f93cb4c53beae53de4216ebae8822c47fdf2b769e40"} Sep 30 07:33:36 crc kubenswrapper[4760]: W0930 07:33:36.260925 4760 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.201:6443: connect: connection refused Sep 30 07:33:36 crc kubenswrapper[4760]: E0930 07:33:36.262520 4760 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.201:6443: connect: connection refused" logger="UnhandledError" Sep 30 07:33:36 crc kubenswrapper[4760]: W0930 07:33:36.387189 4760 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed 
to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.201:6443: connect: connection refused Sep 30 07:33:36 crc kubenswrapper[4760]: E0930 07:33:36.387313 4760 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.201:6443: connect: connection refused" logger="UnhandledError" Sep 30 07:33:36 crc kubenswrapper[4760]: E0930 07:33:36.403370 4760 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.201:6443: connect: connection refused" interval="1.6s" Sep 30 07:33:36 crc kubenswrapper[4760]: W0930 07:33:36.587538 4760 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.201:6443: connect: connection refused Sep 30 07:33:36 crc kubenswrapper[4760]: E0930 07:33:36.587718 4760 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.201:6443: connect: connection refused" logger="UnhandledError" Sep 30 07:33:36 crc kubenswrapper[4760]: I0930 07:33:36.674958 4760 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 30 07:33:36 crc kubenswrapper[4760]: I0930 07:33:36.677515 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 
07:33:36 crc kubenswrapper[4760]: I0930 07:33:36.677594 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:33:36 crc kubenswrapper[4760]: I0930 07:33:36.677622 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:33:36 crc kubenswrapper[4760]: I0930 07:33:36.677687 4760 kubelet_node_status.go:76] "Attempting to register node" node="crc" Sep 30 07:33:36 crc kubenswrapper[4760]: E0930 07:33:36.678391 4760 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.201:6443: connect: connection refused" node="crc" Sep 30 07:33:36 crc kubenswrapper[4760]: I0930 07:33:36.990584 4760 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.201:6443: connect: connection refused Sep 30 07:33:37 crc kubenswrapper[4760]: I0930 07:33:37.091016 4760 generic.go:334] "Generic (PLEG): container finished" podID="3dcd261975c3d6b9a6ad6367fd4facd3" containerID="2f5de75ed9124ec7a6cd0d20862001c06327d05876de38fdbb1270a9e9e8df54" exitCode=0 Sep 30 07:33:37 crc kubenswrapper[4760]: I0930 07:33:37.091135 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerDied","Data":"2f5de75ed9124ec7a6cd0d20862001c06327d05876de38fdbb1270a9e9e8df54"} Sep 30 07:33:37 crc kubenswrapper[4760]: I0930 07:33:37.091255 4760 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 30 07:33:37 crc kubenswrapper[4760]: I0930 07:33:37.092874 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:33:37 crc kubenswrapper[4760]: 
I0930 07:33:37.092965 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:33:37 crc kubenswrapper[4760]: I0930 07:33:37.092989 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:33:37 crc kubenswrapper[4760]: I0930 07:33:37.097658 4760 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="2cd9c18868b27fcfb67186a86f415b484f009c33c07d18f5d508155ed011b5d4" exitCode=0 Sep 30 07:33:37 crc kubenswrapper[4760]: I0930 07:33:37.097779 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"2cd9c18868b27fcfb67186a86f415b484f009c33c07d18f5d508155ed011b5d4"} Sep 30 07:33:37 crc kubenswrapper[4760]: I0930 07:33:37.097831 4760 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 30 07:33:37 crc kubenswrapper[4760]: I0930 07:33:37.100765 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:33:37 crc kubenswrapper[4760]: I0930 07:33:37.100811 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:33:37 crc kubenswrapper[4760]: I0930 07:33:37.100832 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:33:37 crc kubenswrapper[4760]: I0930 07:33:37.102379 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"bf1b2ce86bf47ce917a20b48f64b7c4419e3a5c551d006d97171311c3b190f0c"} Sep 30 07:33:37 crc kubenswrapper[4760]: I0930 07:33:37.102471 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"43b97af91f474a3b43f207d377ca205e459b782e552a32a134a8b46e89fc8d30"} Sep 30 07:33:37 crc kubenswrapper[4760]: I0930 07:33:37.102503 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"5d748511816054d5f621bd55e203baea2e6e5ac90dec796364b46224308cfd78"} Sep 30 07:33:37 crc kubenswrapper[4760]: I0930 07:33:37.104604 4760 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="ed4236004989a888f1f864535bedd2dc4bcad4c51c87f12be074abbe5f8fe8bd" exitCode=0 Sep 30 07:33:37 crc kubenswrapper[4760]: I0930 07:33:37.104685 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"ed4236004989a888f1f864535bedd2dc4bcad4c51c87f12be074abbe5f8fe8bd"} Sep 30 07:33:37 crc kubenswrapper[4760]: I0930 07:33:37.104780 4760 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 30 07:33:37 crc kubenswrapper[4760]: I0930 07:33:37.106061 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:33:37 crc kubenswrapper[4760]: I0930 07:33:37.106141 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:33:37 crc kubenswrapper[4760]: I0930 07:33:37.106164 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:33:37 crc kubenswrapper[4760]: I0930 07:33:37.107401 4760 generic.go:334] "Generic (PLEG): container finished" podID="d1b160f5dda77d281dd8e69ec8d817f9" 
containerID="96e12039c0756f45b66e8293059b3a009c9b8b93c4dc273662570ed07ed35fbd" exitCode=0 Sep 30 07:33:37 crc kubenswrapper[4760]: I0930 07:33:37.107480 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerDied","Data":"96e12039c0756f45b66e8293059b3a009c9b8b93c4dc273662570ed07ed35fbd"} Sep 30 07:33:37 crc kubenswrapper[4760]: I0930 07:33:37.107560 4760 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 30 07:33:37 crc kubenswrapper[4760]: I0930 07:33:37.109207 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:33:37 crc kubenswrapper[4760]: I0930 07:33:37.109247 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:33:37 crc kubenswrapper[4760]: I0930 07:33:37.109263 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:33:37 crc kubenswrapper[4760]: I0930 07:33:37.110547 4760 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 30 07:33:37 crc kubenswrapper[4760]: I0930 07:33:37.112098 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:33:37 crc kubenswrapper[4760]: I0930 07:33:37.112129 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:33:37 crc kubenswrapper[4760]: I0930 07:33:37.112142 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:33:37 crc kubenswrapper[4760]: W0930 07:33:37.877803 4760 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get 
"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.201:6443: connect: connection refused Sep 30 07:33:37 crc kubenswrapper[4760]: E0930 07:33:37.877914 4760 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.201:6443: connect: connection refused" logger="UnhandledError" Sep 30 07:33:37 crc kubenswrapper[4760]: I0930 07:33:37.989972 4760 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.201:6443: connect: connection refused Sep 30 07:33:38 crc kubenswrapper[4760]: E0930 07:33:38.005174 4760 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.201:6443: connect: connection refused" interval="3.2s" Sep 30 07:33:38 crc kubenswrapper[4760]: I0930 07:33:38.115027 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"e7c9b2cd9f096d99a8b239c0c2dc771e94e6522a3380733b7452969c9fe2dfd2"} Sep 30 07:33:38 crc kubenswrapper[4760]: I0930 07:33:38.115121 4760 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 30 07:33:38 crc kubenswrapper[4760]: I0930 07:33:38.116783 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:33:38 crc kubenswrapper[4760]: I0930 07:33:38.116862 4760 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasNoDiskPressure"
Sep 30 07:33:38 crc kubenswrapper[4760]: I0930 07:33:38.116891 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 30 07:33:38 crc kubenswrapper[4760]: I0930 07:33:38.121030 4760 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="553fa289f27580c1c38da1e9ea12ae7231fc4f60c660a9a3a177391adba9c3f2" exitCode=0
Sep 30 07:33:38 crc kubenswrapper[4760]: I0930 07:33:38.121152 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"553fa289f27580c1c38da1e9ea12ae7231fc4f60c660a9a3a177391adba9c3f2"}
Sep 30 07:33:38 crc kubenswrapper[4760]: I0930 07:33:38.121212 4760 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Sep 30 07:33:38 crc kubenswrapper[4760]: I0930 07:33:38.122396 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 30 07:33:38 crc kubenswrapper[4760]: I0930 07:33:38.122440 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 30 07:33:38 crc kubenswrapper[4760]: I0930 07:33:38.122460 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 30 07:33:38 crc kubenswrapper[4760]: I0930 07:33:38.123170 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"846ac3756cf7adb661714b6f1385401343264f6cc431072cfbfa02a806787efa"}
Sep 30 07:33:38 crc kubenswrapper[4760]: I0930 07:33:38.123201 4760 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Sep 30 07:33:38 crc kubenswrapper[4760]: I0930 07:33:38.124446 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 30 07:33:38 crc kubenswrapper[4760]: I0930 07:33:38.124491 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 30 07:33:38 crc kubenswrapper[4760]: I0930 07:33:38.124526 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 30 07:33:38 crc kubenswrapper[4760]: I0930 07:33:38.127519 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"5ead42377ef36b0c38401509b29432dd9b6fdf73cc39f2dbea0930415492a4ee"}
Sep 30 07:33:38 crc kubenswrapper[4760]: I0930 07:33:38.127570 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"f75ced5ba38de06feef6d8addd42a76b6b14cab6a67a351432fdb580eb0ca3c6"}
Sep 30 07:33:38 crc kubenswrapper[4760]: I0930 07:33:38.127600 4760 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Sep 30 07:33:38 crc kubenswrapper[4760]: I0930 07:33:38.127602 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"d9171c7e60156e3ffabb5954ff097de7c4cb0629967cdddb44f16bf86ae38c0a"}
Sep 30 07:33:38 crc kubenswrapper[4760]: I0930 07:33:38.128818 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 30 07:33:38 crc kubenswrapper[4760]: I0930 07:33:38.128857 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 30 07:33:38 crc kubenswrapper[4760]: I0930 07:33:38.128876 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 30 07:33:38 crc kubenswrapper[4760]: I0930 07:33:38.131108 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"53390cb3cc86d0db1019c461bf6cb987190e6fa8812f735844028dfa1ce96648"}
Sep 30 07:33:38 crc kubenswrapper[4760]: I0930 07:33:38.131161 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"cf87f93832356c96c1c5e90e6459ccdd23700fd12c6d7410305c66f99f6ed18f"}
Sep 30 07:33:38 crc kubenswrapper[4760]: I0930 07:33:38.131184 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"b3e26d92b5602604d33e83e83e677084c642df64c9d0b9f9e022b036091fb92e"}
Sep 30 07:33:38 crc kubenswrapper[4760]: I0930 07:33:38.279078 4760 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Sep 30 07:33:38 crc kubenswrapper[4760]: I0930 07:33:38.281349 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 30 07:33:38 crc kubenswrapper[4760]: I0930 07:33:38.281427 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 30 07:33:38 crc kubenswrapper[4760]: I0930 07:33:38.281441 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 30 07:33:38 crc kubenswrapper[4760]: I0930 07:33:38.281473 4760 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Sep 30 07:33:38 crc kubenswrapper[4760]: E0930 07:33:38.282041 4760 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.201:6443: connect: connection refused" node="crc"
Sep 30 07:33:38 crc kubenswrapper[4760]: W0930 07:33:38.409589 4760 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.201:6443: connect: connection refused
Sep 30 07:33:38 crc kubenswrapper[4760]: E0930 07:33:38.409675 4760 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.201:6443: connect: connection refused" logger="UnhandledError"
Sep 30 07:33:38 crc kubenswrapper[4760]: I0930 07:33:38.806575 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Sep 30 07:33:39 crc kubenswrapper[4760]: I0930 07:33:39.140404 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"d9c416bd4523f4384b4a2283eb2ada263982380925cbc17713726e0790e5fdee"}
Sep 30 07:33:39 crc kubenswrapper[4760]: I0930 07:33:39.140471 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"e4230b2df5bf87138f5a40f146bef562ae63ec6f5b73d023e4e2b60722854882"}
Sep 30 07:33:39 crc kubenswrapper[4760]: I0930 07:33:39.140484 4760 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Sep 30 07:33:39 crc kubenswrapper[4760]: I0930 07:33:39.141938 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 30 07:33:39 crc kubenswrapper[4760]: I0930 07:33:39.142019 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 30 07:33:39 crc kubenswrapper[4760]: I0930 07:33:39.142043 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 30 07:33:39 crc kubenswrapper[4760]: I0930 07:33:39.146520 4760 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="83b9ee5e77d60387298ccac46ad416a5d56bfd1d81a6ef7f3ea0482520c2ca08" exitCode=0
Sep 30 07:33:39 crc kubenswrapper[4760]: I0930 07:33:39.146657 4760 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Sep 30 07:33:39 crc kubenswrapper[4760]: I0930 07:33:39.146692 4760 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Sep 30 07:33:39 crc kubenswrapper[4760]: I0930 07:33:39.146730 4760 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Sep 30 07:33:39 crc kubenswrapper[4760]: I0930 07:33:39.146751 4760 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Sep 30 07:33:39 crc kubenswrapper[4760]: I0930 07:33:39.146748 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"83b9ee5e77d60387298ccac46ad416a5d56bfd1d81a6ef7f3ea0482520c2ca08"}
Sep 30 07:33:39 crc kubenswrapper[4760]: I0930 07:33:39.148013 4760 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Sep 30 07:33:39 crc kubenswrapper[4760]: I0930 07:33:39.148364 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 30 07:33:39 crc kubenswrapper[4760]: I0930 07:33:39.148422 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 30 07:33:39 crc kubenswrapper[4760]: I0930 07:33:39.148440 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 30 07:33:39 crc kubenswrapper[4760]: I0930 07:33:39.149251 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 30 07:33:39 crc kubenswrapper[4760]: I0930 07:33:39.149423 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 30 07:33:39 crc kubenswrapper[4760]: I0930 07:33:39.149460 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 30 07:33:39 crc kubenswrapper[4760]: I0930 07:33:39.150195 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 30 07:33:39 crc kubenswrapper[4760]: I0930 07:33:39.150283 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 30 07:33:39 crc kubenswrapper[4760]: I0930 07:33:39.150350 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 30 07:33:39 crc kubenswrapper[4760]: I0930 07:33:39.153786 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 30 07:33:39 crc kubenswrapper[4760]: I0930 07:33:39.153836 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 30 07:33:39 crc kubenswrapper[4760]: I0930 07:33:39.153853 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 30 07:33:39 crc kubenswrapper[4760]: I0930 07:33:39.213221 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc"
Sep 30 07:33:40 crc kubenswrapper[4760]: I0930 07:33:40.155425 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"c5aaa898c76100f011c3ad0ba61c9735a940b46b0e770f3ede7fc4c881a4722a"}
Sep 30 07:33:40 crc kubenswrapper[4760]: I0930 07:33:40.155536 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"4cc3328dfae8f6d99c152e6a53b86d72f690afbd7bc4847810f745b321e4f3ea"}
Sep 30 07:33:40 crc kubenswrapper[4760]: I0930 07:33:40.155557 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"00d03c8ff6fa9f490ec0550ac57d20786f1f5c10d91011d678116c2249a79e88"}
Sep 30 07:33:40 crc kubenswrapper[4760]: I0930 07:33:40.155613 4760 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Sep 30 07:33:40 crc kubenswrapper[4760]: I0930 07:33:40.156484 4760 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Sep 30 07:33:40 crc kubenswrapper[4760]: I0930 07:33:40.156546 4760 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Sep 30 07:33:40 crc kubenswrapper[4760]: I0930 07:33:40.156778 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 30 07:33:40 crc kubenswrapper[4760]: I0930 07:33:40.156848 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 30 07:33:40 crc kubenswrapper[4760]: I0930 07:33:40.156873 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 30 07:33:40 crc kubenswrapper[4760]: I0930 07:33:40.158073 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 30 07:33:40 crc kubenswrapper[4760]: I0930 07:33:40.158119 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 30 07:33:40 crc kubenswrapper[4760]: I0930 07:33:40.158136 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 30 07:33:40 crc kubenswrapper[4760]: I0930 07:33:40.396637 4760 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Sep 30 07:33:40 crc kubenswrapper[4760]: I0930 07:33:40.406064 4760 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Sep 30 07:33:41 crc kubenswrapper[4760]: I0930 07:33:41.166449 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"7d00b9a56c23ba3b5701a31412e038c073b5f16eb4e6ebea42134f7d4c0b0f21"}
Sep 30 07:33:41 crc kubenswrapper[4760]: I0930 07:33:41.166535 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"b718c4a73fd119f66dd346618976f0b3589b789576e53d367588c2772becd66e"}
Sep 30 07:33:41 crc kubenswrapper[4760]: I0930 07:33:41.166600 4760 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Sep 30 07:33:41 crc kubenswrapper[4760]: I0930 07:33:41.166663 4760 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Sep 30 07:33:41 crc kubenswrapper[4760]: I0930 07:33:41.166679 4760 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Sep 30 07:33:41 crc kubenswrapper[4760]: I0930 07:33:41.166777 4760 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Sep 30 07:33:41 crc kubenswrapper[4760]: I0930 07:33:41.168097 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 30 07:33:41 crc kubenswrapper[4760]: I0930 07:33:41.168187 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 30 07:33:41 crc kubenswrapper[4760]: I0930 07:33:41.168221 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 30 07:33:41 crc kubenswrapper[4760]: I0930 07:33:41.168937 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 30 07:33:41 crc kubenswrapper[4760]: I0930 07:33:41.168962 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 30 07:33:41 crc kubenswrapper[4760]: I0930 07:33:41.169001 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 30 07:33:41 crc kubenswrapper[4760]: I0930 07:33:41.169025 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 30 07:33:41 crc kubenswrapper[4760]: I0930 07:33:41.169004 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 30 07:33:41 crc kubenswrapper[4760]: I0930 07:33:41.169202 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 30 07:33:41 crc kubenswrapper[4760]: I0930 07:33:41.483196 4760 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Sep 30 07:33:41 crc kubenswrapper[4760]: I0930 07:33:41.485555 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 30 07:33:41 crc kubenswrapper[4760]: I0930 07:33:41.485628 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 30 07:33:41 crc kubenswrapper[4760]: I0930 07:33:41.485648 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 30 07:33:41 crc kubenswrapper[4760]: I0930 07:33:41.485695 4760 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Sep 30 07:33:42 crc kubenswrapper[4760]: I0930 07:33:42.169788 4760 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Sep 30 07:33:42 crc kubenswrapper[4760]: I0930 07:33:42.169839 4760 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Sep 30 07:33:42 crc kubenswrapper[4760]: I0930 07:33:42.169847 4760 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Sep 30 07:33:42 crc kubenswrapper[4760]: I0930 07:33:42.171250 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 30 07:33:42 crc kubenswrapper[4760]: I0930 07:33:42.171293 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 30 07:33:42 crc kubenswrapper[4760]: I0930 07:33:42.171332 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 30 07:33:42 crc kubenswrapper[4760]: I0930 07:33:42.171686 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 30 07:33:42 crc kubenswrapper[4760]: I0930 07:33:42.171753 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 30 07:33:42 crc kubenswrapper[4760]: I0930 07:33:42.171797 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 30 07:33:42 crc kubenswrapper[4760]: I0930 07:33:42.823595 4760 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc"
Sep 30 07:33:42 crc kubenswrapper[4760]: I0930 07:33:42.823927 4760 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Sep 30 07:33:42 crc kubenswrapper[4760]: I0930 07:33:42.824014 4760 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Sep 30 07:33:42 crc kubenswrapper[4760]: I0930 07:33:42.825823 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 30 07:33:42 crc kubenswrapper[4760]: I0930 07:33:42.825873 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 30 07:33:42 crc kubenswrapper[4760]: I0930 07:33:42.825891 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 30 07:33:42 crc kubenswrapper[4760]: I0930 07:33:42.827296 4760 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Sep 30 07:33:43 crc kubenswrapper[4760]: I0930 07:33:43.173465 4760 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Sep 30 07:33:43 crc kubenswrapper[4760]: I0930 07:33:43.173562 4760 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Sep 30 07:33:43 crc kubenswrapper[4760]: I0930 07:33:43.175285 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 30 07:33:43 crc kubenswrapper[4760]: I0930 07:33:43.175392 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 30 07:33:43 crc kubenswrapper[4760]: I0930 07:33:43.175412 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 30 07:33:43 crc kubenswrapper[4760]: I0930 07:33:43.245767 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc"
Sep 30 07:33:43 crc kubenswrapper[4760]: I0930 07:33:43.246099 4760 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Sep 30 07:33:43 crc kubenswrapper[4760]: I0930 07:33:43.247783 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 30 07:33:43 crc kubenswrapper[4760]: I0930 07:33:43.247845 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 30 07:33:43 crc kubenswrapper[4760]: I0930 07:33:43.247865 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 30 07:33:43 crc kubenswrapper[4760]: I0930 07:33:43.519448 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-etcd/etcd-crc"
Sep 30 07:33:43 crc kubenswrapper[4760]: I0930 07:33:43.519720 4760 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Sep 30 07:33:43 crc kubenswrapper[4760]: I0930 07:33:43.521435 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 30 07:33:43 crc kubenswrapper[4760]: I0930 07:33:43.521490 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 30 07:33:43 crc kubenswrapper[4760]: I0930 07:33:43.521510 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 30 07:33:45 crc kubenswrapper[4760]: I0930 07:33:45.079205 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Sep 30 07:33:45 crc kubenswrapper[4760]: I0930 07:33:45.079483 4760 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Sep 30 07:33:45 crc kubenswrapper[4760]: I0930 07:33:45.081291 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 30 07:33:45 crc kubenswrapper[4760]: I0930 07:33:45.081408 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 30 07:33:45 crc kubenswrapper[4760]: I0930 07:33:45.081427 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 30 07:33:45 crc kubenswrapper[4760]: E0930 07:33:45.169662 4760 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found"
Sep 30 07:33:45 crc kubenswrapper[4760]: I0930 07:33:45.477279 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Sep 30 07:33:45 crc kubenswrapper[4760]: I0930 07:33:45.477561 4760 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Sep 30 07:33:45 crc kubenswrapper[4760]: I0930 07:33:45.479134 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 30 07:33:45 crc kubenswrapper[4760]: I0930 07:33:45.479187 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 30 07:33:45 crc kubenswrapper[4760]: I0930 07:33:45.479206 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 30 07:33:45 crc kubenswrapper[4760]: I0930 07:33:45.828010 4760 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body=
Sep 30 07:33:45 crc kubenswrapper[4760]: I0930 07:33:45.828141 4760 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Sep 30 07:33:45 crc kubenswrapper[4760]: I0930 07:33:45.830398 4760 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-etcd/etcd-crc"
Sep 30 07:33:45 crc kubenswrapper[4760]: I0930 07:33:45.830660 4760 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Sep 30 07:33:45 crc kubenswrapper[4760]: I0930 07:33:45.832492 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 30 07:33:45 crc kubenswrapper[4760]: I0930 07:33:45.832530 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 30 07:33:45 crc kubenswrapper[4760]: I0930 07:33:45.832544 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 30 07:33:48 crc kubenswrapper[4760]: W0930 07:33:48.600634 4760 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": net/http: TLS handshake timeout
Sep 30 07:33:48 crc kubenswrapper[4760]: I0930 07:33:48.600856 4760 trace.go:236] Trace[1194718869]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (30-Sep-2025 07:33:38.599) (total time: 10001ms):
Sep 30 07:33:48 crc kubenswrapper[4760]: Trace[1194718869]: ---"Objects listed" error:Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": net/http: TLS handshake timeout 10001ms (07:33:48.600)
Sep 30 07:33:48 crc kubenswrapper[4760]: Trace[1194718869]: [10.001515452s] [10.001515452s] END
Sep 30 07:33:48 crc kubenswrapper[4760]: E0930 07:33:48.600915 4760 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": net/http: TLS handshake timeout" logger="UnhandledError"
Sep 30 07:33:48 crc kubenswrapper[4760]: E0930 07:33:48.777765 4760 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": net/http: TLS handshake timeout" event="&Event{ObjectMeta:{crc.1869ff199fdd9b4f default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2025-09-30 07:33:34.986255183 +0000 UTC m=+0.629161625,LastTimestamp:2025-09-30 07:33:34.986255183 +0000 UTC m=+0.629161625,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Sep 30 07:33:48 crc kubenswrapper[4760]: I0930 07:33:48.991486 4760 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": net/http: TLS handshake timeout
Sep 30 07:33:49 crc kubenswrapper[4760]: W0930 07:33:49.194789 4760 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": net/http: TLS handshake timeout
Sep 30 07:33:49 crc kubenswrapper[4760]: I0930 07:33:49.194875 4760 trace.go:236] Trace[2079927844]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (30-Sep-2025 07:33:39.193) (total time: 10001ms):
Sep 30 07:33:49 crc kubenswrapper[4760]: Trace[2079927844]: ---"Objects listed" error:Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": net/http: TLS handshake timeout 10001ms (07:33:49.194)
Sep 30 07:33:49 crc kubenswrapper[4760]: Trace[2079927844]: [10.001678415s] [10.001678415s] END
Sep 30 07:33:49 crc kubenswrapper[4760]: E0930 07:33:49.194899 4760 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": net/http: TLS handshake timeout" logger="UnhandledError"
Sep 30 07:33:49 crc kubenswrapper[4760]: I0930 07:33:49.881809 4760 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403}
Sep 30 07:33:49 crc kubenswrapper[4760]: I0930 07:33:49.881909 4760 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403"
Sep 30 07:33:49 crc kubenswrapper[4760]: I0930 07:33:49.900848 4760 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403}
Sep 30 07:33:49 crc kubenswrapper[4760]: I0930 07:33:49.900950 4760 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403"
Sep 30 07:33:52 crc kubenswrapper[4760]: I0930 07:33:52.828375 4760 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc"
Sep 30 07:33:52 crc kubenswrapper[4760]: I0930 07:33:52.828541 4760 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Sep 30 07:33:52 crc kubenswrapper[4760]: I0930 07:33:52.829609 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 30 07:33:52 crc kubenswrapper[4760]: I0930 07:33:52.829641 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 30 07:33:52 crc kubenswrapper[4760]: I0930 07:33:52.829656 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 30 07:33:52 crc kubenswrapper[4760]: I0930 07:33:52.834703 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc"
Sep 30 07:33:52 crc kubenswrapper[4760]: I0930 07:33:52.957182 4760 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160
Sep 30 07:33:53 crc kubenswrapper[4760]: I0930 07:33:53.027148 4760 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160
Sep 30 07:33:53 crc kubenswrapper[4760]: I0930 07:33:53.983206 4760 apiserver.go:52] "Watching apiserver"
Sep 30 07:33:53 crc kubenswrapper[4760]: I0930 07:33:53.987696 4760 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66
Sep 30 07:33:53 crc kubenswrapper[4760]: I0930 07:33:53.988267 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-network-diagnostics/network-check-target-xd92c","openshift-network-node-identity/network-node-identity-vrzqb","openshift-network-operator/iptables-alerter-4ln5h","openshift-network-operator/network-operator-58b4c7f79c-55gtf","openshift-kube-apiserver/kube-apiserver-crc","openshift-network-console/networking-console-plugin-85b44fc459-gdk6g","openshift-network-diagnostics/network-check-source-55646444c4-trplf"]
Sep 30 07:33:53 crc kubenswrapper[4760]: I0930 07:33:53.988872 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf"
Sep 30 07:33:53 crc kubenswrapper[4760]: I0930 07:33:53.989126 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Sep 30 07:33:53 crc kubenswrapper[4760]: I0930 07:33:53.989184 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb"
Sep 30 07:33:53 crc kubenswrapper[4760]: I0930 07:33:53.989229 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Sep 30 07:33:53 crc kubenswrapper[4760]: E0930 07:33:53.989343 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Sep 30 07:33:53 crc kubenswrapper[4760]: E0930 07:33:53.989626 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Sep 30 07:33:53 crc kubenswrapper[4760]: I0930 07:33:53.990100 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Sep 30 07:33:53 crc kubenswrapper[4760]: I0930 07:33:53.990120 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h"
Sep 30 07:33:53 crc kubenswrapper[4760]: E0930 07:33:53.990199 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Sep 30 07:33:53 crc kubenswrapper[4760]: I0930 07:33:53.992217 4760 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt"
Sep 30 07:33:53 crc kubenswrapper[4760]: I0930 07:33:53.992630 4760 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script"
Sep 30 07:33:53 crc kubenswrapper[4760]: I0930 07:33:53.993382 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls"
Sep 30 07:33:53 crc kubenswrapper[4760]: I0930 07:33:53.993653 4760 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides"
Sep 30 07:33:53 crc kubenswrapper[4760]: I0930 07:33:53.993772 4760 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm"
Sep 30 07:33:53 crc kubenswrapper[4760]: I0930 07:33:53.993855 4760 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt"
Sep 30 07:33:53 crc kubenswrapper[4760]: I0930 07:33:53.993926 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert"
Sep 30 07:33:53 crc kubenswrapper[4760]: I0930 07:33:53.993823 4760 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt"
Sep 30 07:33:53 crc kubenswrapper[4760]: I0930 07:33:53.994785 4760 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt"
Sep 30 07:33:53 crc kubenswrapper[4760]: I0930 07:33:53.994910 4760 desired_state_of_world_populator.go:154] "Finished populating initial desired state of world"
Sep 30 07:33:54 crc kubenswrapper[4760]: 
I0930 07:33:54.057427 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 30 07:33:54 crc kubenswrapper[4760]: I0930 07:33:54.091893 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"031672b4-4779-4226-97aa-f5c81a729234\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:35Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:35Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b3e26d92b5602604d33e83e83e677084c642df64c9d0b9f9e022b036091fb92e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://53390cb3cc86d0db1019c461bf6cb987190e6fa8812f735844028dfa1ce96648\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf87f93832356c96c1c5e90e6459ccdd23700fd12c6d7410305c66f99f6ed18f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e4230b2df5bf87138f5a40f146bef562ae63ec6f5b73d023e4e2b60722854882\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9c416bd4523f4384b4a2283eb2ada263982
380925cbc17713726e0790e5fdee\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:38Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2cd9c18868b27fcfb67186a86f415b484f009c33c07d18f5d508155ed011b5d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2cd9c18868b27fcfb67186a86f415b484f009c33c07d18f5d508155ed011b5d4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T07:33:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T07:33:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T07:33:35Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: 
connect: connection refused" Sep 30 07:33:54 crc kubenswrapper[4760]: I0930 07:33:54.107476 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 30 07:33:54 crc kubenswrapper[4760]: I0930 07:33:54.124498 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 30 07:33:54 crc kubenswrapper[4760]: I0930 07:33:54.141570 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:53Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: 
connect: connection refused" Sep 30 07:33:54 crc kubenswrapper[4760]: I0930 07:33:54.159957 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 30 07:33:54 crc kubenswrapper[4760]: I0930 07:33:54.181588 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 30 07:33:54 crc kubenswrapper[4760]: I0930 07:33:54.198705 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:53Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 30 07:33:54 crc kubenswrapper[4760]: I0930 07:33:54.207915 4760 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Sep 30 07:33:54 crc kubenswrapper[4760]: E0930 07:33:54.880275 4760 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": context deadline exceeded" interval="6.4s" Sep 30 07:33:54 crc kubenswrapper[4760]: I0930 07:33:54.882125 4760 trace.go:236] Trace[209174697]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (30-Sep-2025 07:33:42.072) (total time: 12809ms): Sep 30 07:33:54 crc kubenswrapper[4760]: Trace[209174697]: ---"Objects listed" error: 12809ms (07:33:54.881) Sep 30 07:33:54 crc kubenswrapper[4760]: Trace[209174697]: [12.809584515s] [12.809584515s] END Sep 30 07:33:54 crc kubenswrapper[4760]: I0930 07:33:54.882355 4760 reflector.go:368] Caches populated for *v1.CSIDriver from 
k8s.io/client-go/informers/factory.go:160 Sep 30 07:33:54 crc kubenswrapper[4760]: I0930 07:33:54.883457 4760 trace.go:236] Trace[242072586]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (30-Sep-2025 07:33:43.761) (total time: 11122ms): Sep 30 07:33:54 crc kubenswrapper[4760]: Trace[242072586]: ---"Objects listed" error: 11122ms (07:33:54.883) Sep 30 07:33:54 crc kubenswrapper[4760]: Trace[242072586]: [11.122148362s] [11.122148362s] END Sep 30 07:33:54 crc kubenswrapper[4760]: I0930 07:33:54.883598 4760 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Sep 30 07:33:54 crc kubenswrapper[4760]: E0930 07:33:54.884075 4760 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes \"crc\" is forbidden: autoscaling.openshift.io/ManagedNode infra config cache not synchronized" node="crc" Sep 30 07:33:54 crc kubenswrapper[4760]: I0930 07:33:54.885419 4760 reconstruct.go:205] "DevicePaths of reconstructed volumes updated" Sep 30 07:33:54 crc kubenswrapper[4760]: I0930 07:33:54.934896 4760 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Sep 30 07:33:54 crc kubenswrapper[4760]: I0930 07:33:54.940585 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Sep 30 07:33:54 crc kubenswrapper[4760]: I0930 07:33:54.941807 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Sep 30 07:33:54 crc kubenswrapper[4760]: I0930 07:33:54.943624 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Sep 30 07:33:54 crc kubenswrapper[4760]: I0930 07:33:54.947724 4760 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-kube-controller-manager/kube-controller-manager-crc"] Sep 30 07:33:54 crc kubenswrapper[4760]: I0930 07:33:54.948441 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:53Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 30 07:33:54 crc kubenswrapper[4760]: I0930 07:33:54.961608 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 30 07:33:54 crc kubenswrapper[4760]: I0930 07:33:54.977382 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:53Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 30 07:33:54 crc kubenswrapper[4760]: I0930 07:33:54.985970 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Sep 30 07:33:54 crc kubenswrapper[4760]: I0930 07:33:54.986022 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Sep 30 07:33:54 crc kubenswrapper[4760]: I0930 07:33:54.986049 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Sep 30 07:33:54 crc kubenswrapper[4760]: I0930 07:33:54.986076 4760 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Sep 30 07:33:54 crc kubenswrapper[4760]: I0930 07:33:54.986105 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Sep 30 07:33:54 crc kubenswrapper[4760]: I0930 07:33:54.986127 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Sep 30 07:33:54 crc kubenswrapper[4760]: I0930 07:33:54.986175 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Sep 30 07:33:54 crc kubenswrapper[4760]: I0930 07:33:54.986199 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Sep 30 07:33:54 crc kubenswrapper[4760]: I0930 07:33:54.986222 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") pod 
\"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Sep 30 07:33:54 crc kubenswrapper[4760]: I0930 07:33:54.986264 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Sep 30 07:33:54 crc kubenswrapper[4760]: I0930 07:33:54.986285 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Sep 30 07:33:54 crc kubenswrapper[4760]: I0930 07:33:54.986322 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Sep 30 07:33:54 crc kubenswrapper[4760]: I0930 07:33:54.986345 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Sep 30 07:33:54 crc kubenswrapper[4760]: I0930 07:33:54.986375 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Sep 30 07:33:54 crc kubenswrapper[4760]: I0930 07:33:54.986395 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started 
for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Sep 30 07:33:54 crc kubenswrapper[4760]: I0930 07:33:54.986416 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Sep 30 07:33:54 crc kubenswrapper[4760]: I0930 07:33:54.986445 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Sep 30 07:33:54 crc kubenswrapper[4760]: I0930 07:33:54.986480 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Sep 30 07:33:54 crc kubenswrapper[4760]: I0930 07:33:54.986499 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Sep 30 07:33:54 crc kubenswrapper[4760]: I0930 07:33:54.986520 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Sep 30 07:33:54 crc 
kubenswrapper[4760]: I0930 07:33:54.986567 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Sep 30 07:33:54 crc kubenswrapper[4760]: I0930 07:33:54.986590 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Sep 30 07:33:54 crc kubenswrapper[4760]: I0930 07:33:54.986616 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Sep 30 07:33:54 crc kubenswrapper[4760]: I0930 07:33:54.986659 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Sep 30 07:33:54 crc kubenswrapper[4760]: I0930 07:33:54.986685 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Sep 30 07:33:54 crc kubenswrapper[4760]: I0930 07:33:54.986709 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6ccd8\" (UniqueName: 
\"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Sep 30 07:33:54 crc kubenswrapper[4760]: I0930 07:33:54.986735 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Sep 30 07:33:54 crc kubenswrapper[4760]: I0930 07:33:54.986757 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Sep 30 07:33:54 crc kubenswrapper[4760]: I0930 07:33:54.986778 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Sep 30 07:33:54 crc kubenswrapper[4760]: I0930 07:33:54.986800 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Sep 30 07:33:54 crc kubenswrapper[4760]: I0930 07:33:54.986822 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Sep 30 07:33:54 crc 
kubenswrapper[4760]: I0930 07:33:54.986843 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Sep 30 07:33:54 crc kubenswrapper[4760]: I0930 07:33:54.986864 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Sep 30 07:33:54 crc kubenswrapper[4760]: I0930 07:33:54.986886 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Sep 30 07:33:54 crc kubenswrapper[4760]: I0930 07:33:54.986930 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Sep 30 07:33:54 crc kubenswrapper[4760]: I0930 07:33:54.986964 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Sep 30 07:33:54 crc kubenswrapper[4760]: I0930 07:33:54.986984 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: 
\"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Sep 30 07:33:54 crc kubenswrapper[4760]: I0930 07:33:54.987003 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Sep 30 07:33:54 crc kubenswrapper[4760]: I0930 07:33:54.987025 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Sep 30 07:33:54 crc kubenswrapper[4760]: I0930 07:33:54.987045 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 07:33:54 crc kubenswrapper[4760]: I0930 07:33:54.987069 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Sep 30 07:33:54 crc kubenswrapper[4760]: I0930 07:33:54.987090 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") pod \"44663579-783b-4372-86d6-acf235a62d72\" (UID: \"44663579-783b-4372-86d6-acf235a62d72\") " Sep 30 07:33:54 
crc kubenswrapper[4760]: I0930 07:33:54.987109 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Sep 30 07:33:54 crc kubenswrapper[4760]: I0930 07:33:54.987130 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Sep 30 07:33:54 crc kubenswrapper[4760]: I0930 07:33:54.987150 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Sep 30 07:33:54 crc kubenswrapper[4760]: I0930 07:33:54.987172 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Sep 30 07:33:54 crc kubenswrapper[4760]: I0930 07:33:54.987192 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Sep 30 07:33:54 crc kubenswrapper[4760]: I0930 07:33:54.987214 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"multus-daemon-config\" (UniqueName: 
\"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Sep 30 07:33:54 crc kubenswrapper[4760]: I0930 07:33:54.987236 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Sep 30 07:33:54 crc kubenswrapper[4760]: I0930 07:33:54.987259 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Sep 30 07:33:54 crc kubenswrapper[4760]: I0930 07:33:54.987240 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" (OuterVolumeSpecName: "service-ca") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "service-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 07:33:54 crc kubenswrapper[4760]: I0930 07:33:54.987280 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Sep 30 07:33:54 crc kubenswrapper[4760]: I0930 07:33:54.987319 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Sep 30 07:33:54 crc kubenswrapper[4760]: I0930 07:33:54.987343 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Sep 30 07:33:54 crc kubenswrapper[4760]: I0930 07:33:54.987386 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Sep 30 07:33:54 crc kubenswrapper[4760]: I0930 07:33:54.987407 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Sep 30 07:33:54 crc kubenswrapper[4760]: I0930 07:33:54.987413 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 07:33:54 crc kubenswrapper[4760]: I0930 07:33:54.987434 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Sep 30 07:33:54 crc kubenswrapper[4760]: I0930 07:33:54.987513 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" (OuterVolumeSpecName: "kube-api-access-mnrrd") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "kube-api-access-mnrrd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 07:33:54 crc kubenswrapper[4760]: I0930 07:33:54.987519 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Sep 30 07:33:54 crc kubenswrapper[4760]: I0930 07:33:54.987594 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Sep 30 07:33:54 crc kubenswrapper[4760]: I0930 07:33:54.987619 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Sep 30 07:33:54 crc kubenswrapper[4760]: I0930 07:33:54.987647 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Sep 30 07:33:54 crc kubenswrapper[4760]: I0930 07:33:54.987666 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Sep 30 07:33:54 crc kubenswrapper[4760]: I0930 07:33:54.987685 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Sep 30 07:33:54 crc kubenswrapper[4760]: I0930 07:33:54.987710 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Sep 30 07:33:54 crc kubenswrapper[4760]: I0930 07:33:54.987729 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Sep 30 07:33:54 crc kubenswrapper[4760]: I0930 07:33:54.987737 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" (OuterVolumeSpecName: "kube-api-access-wxkg8") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "kube-api-access-wxkg8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 07:33:54 crc kubenswrapper[4760]: I0930 07:33:54.987749 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Sep 30 07:33:54 crc kubenswrapper[4760]: I0930 07:33:54.987772 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Sep 30 07:33:54 crc kubenswrapper[4760]: I0930 07:33:54.987779 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" (OuterVolumeSpecName: "kube-api-access-gf66m") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "kube-api-access-gf66m". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 07:33:54 crc kubenswrapper[4760]: I0930 07:33:54.987798 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Sep 30 07:33:54 crc kubenswrapper[4760]: I0930 07:33:54.987844 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Sep 30 07:33:54 crc kubenswrapper[4760]: I0930 07:33:54.987867 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Sep 30 07:33:54 crc kubenswrapper[4760]: I0930 07:33:54.987890 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Sep 30 07:33:54 crc kubenswrapper[4760]: I0930 07:33:54.987911 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Sep 30 07:33:54 crc kubenswrapper[4760]: I0930 07:33:54.987939 4760 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Sep 30 07:33:54 crc kubenswrapper[4760]: I0930 07:33:54.987959 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Sep 30 07:33:54 crc kubenswrapper[4760]: I0930 07:33:54.988010 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Sep 30 07:33:54 crc kubenswrapper[4760]: I0930 07:33:54.988046 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Sep 30 07:33:54 crc kubenswrapper[4760]: I0930 07:33:54.988066 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Sep 30 07:33:54 crc kubenswrapper[4760]: I0930 07:33:54.988086 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: 
\"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Sep 30 07:33:54 crc kubenswrapper[4760]: I0930 07:33:54.988106 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Sep 30 07:33:54 crc kubenswrapper[4760]: I0930 07:33:54.988125 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Sep 30 07:33:54 crc kubenswrapper[4760]: I0930 07:33:54.988150 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Sep 30 07:33:54 crc kubenswrapper[4760]: I0930 07:33:54.988162 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "env-overrides". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 07:33:54 crc kubenswrapper[4760]: I0930 07:33:54.988173 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Sep 30 07:33:54 crc kubenswrapper[4760]: I0930 07:33:54.988200 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Sep 30 07:33:54 crc kubenswrapper[4760]: I0930 07:33:54.988220 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Sep 30 07:33:54 crc kubenswrapper[4760]: I0930 07:33:54.988238 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Sep 30 07:33:54 crc kubenswrapper[4760]: I0930 07:33:54.988256 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Sep 30 07:33:54 crc kubenswrapper[4760]: I0930 07:33:54.988273 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: 
\"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 07:33:54 crc kubenswrapper[4760]: I0930 07:33:54.988316 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Sep 30 07:33:54 crc kubenswrapper[4760]: I0930 07:33:54.988346 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Sep 30 07:33:54 crc kubenswrapper[4760]: I0930 07:33:54.988371 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Sep 30 07:33:54 crc kubenswrapper[4760]: I0930 07:33:54.988378 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" (OuterVolumeSpecName: "kube-api-access-tk88c") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "kube-api-access-tk88c". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 07:33:54 crc kubenswrapper[4760]: I0930 07:33:54.988387 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Sep 30 07:33:54 crc kubenswrapper[4760]: I0930 07:33:54.988431 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Sep 30 07:33:54 crc kubenswrapper[4760]: I0930 07:33:54.988458 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") pod \"49ef4625-1d3a-4a9f-b595-c2433d32326d\" (UID: \"49ef4625-1d3a-4a9f-b595-c2433d32326d\") " Sep 30 07:33:54 crc kubenswrapper[4760]: I0930 07:33:54.988484 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 07:33:54 crc kubenswrapper[4760]: I0930 07:33:54.988510 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") pod \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\" (UID: \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\") " Sep 30 07:33:54 crc kubenswrapper[4760]: I0930 07:33:54.988533 4760 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Sep 30 07:33:54 crc kubenswrapper[4760]: I0930 07:33:54.988560 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Sep 30 07:33:54 crc kubenswrapper[4760]: I0930 07:33:54.988582 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Sep 30 07:33:54 crc kubenswrapper[4760]: I0930 07:33:54.988607 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Sep 30 07:33:54 crc kubenswrapper[4760]: I0930 07:33:54.988630 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Sep 30 07:33:54 crc kubenswrapper[4760]: I0930 07:33:54.988671 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: 
\"6ea678ab-3438-413e-bfe3-290ae7725660\") " Sep 30 07:33:54 crc kubenswrapper[4760]: I0930 07:33:54.988695 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Sep 30 07:33:54 crc kubenswrapper[4760]: I0930 07:33:54.988718 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Sep 30 07:33:54 crc kubenswrapper[4760]: I0930 07:33:54.988742 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Sep 30 07:33:54 crc kubenswrapper[4760]: I0930 07:33:54.988766 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Sep 30 07:33:54 crc kubenswrapper[4760]: I0930 07:33:54.988786 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-cliconfig". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 07:33:54 crc kubenswrapper[4760]: I0930 07:33:54.988790 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Sep 30 07:33:54 crc kubenswrapper[4760]: I0930 07:33:54.988836 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Sep 30 07:33:54 crc kubenswrapper[4760]: I0930 07:33:54.988857 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Sep 30 07:33:54 crc kubenswrapper[4760]: I0930 07:33:54.988878 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Sep 30 07:33:54 crc kubenswrapper[4760]: I0930 07:33:54.988903 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Sep 30 07:33:54 crc kubenswrapper[4760]: I0930 07:33:54.992590 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 07:33:54 crc kubenswrapper[4760]: I0930 07:33:54.992640 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" (OuterVolumeSpecName: "config") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 07:33:54 crc kubenswrapper[4760]: I0930 07:33:54.992788 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Sep 30 07:33:54 crc kubenswrapper[4760]: I0930 07:33:54.992831 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Sep 30 07:33:54 crc kubenswrapper[4760]: I0930 07:33:54.992873 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Sep 30 07:33:54 crc kubenswrapper[4760]: I0930 07:33:54.992913 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: 
\"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Sep 30 07:33:54 crc kubenswrapper[4760]: I0930 07:33:54.992961 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Sep 30 07:33:54 crc kubenswrapper[4760]: I0930 07:33:54.992994 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Sep 30 07:33:54 crc kubenswrapper[4760]: I0930 07:33:54.993276 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" (OuterVolumeSpecName: "client-ca") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 07:33:54 crc kubenswrapper[4760]: I0930 07:33:54.993323 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "bound-sa-token". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 07:33:54 crc kubenswrapper[4760]: I0930 07:33:54.993638 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 07:33:54 crc kubenswrapper[4760]: I0930 07:33:54.993853 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" (OuterVolumeSpecName: "utilities") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 07:33:54 crc kubenswrapper[4760]: I0930 07:33:54.993781 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" (OuterVolumeSpecName: "config") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 07:33:54 crc kubenswrapper[4760]: I0930 07:33:54.994115 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 07:33:54 crc kubenswrapper[4760]: I0930 07:33:54.994278 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" (OuterVolumeSpecName: "webhook-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "webhook-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 07:33:54 crc kubenswrapper[4760]: I0930 07:33:54.994150 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 07:33:54 crc kubenswrapper[4760]: I0930 07:33:54.994204 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 07:33:54 crc kubenswrapper[4760]: I0930 07:33:54.994448 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 07:33:54 crc kubenswrapper[4760]: I0930 07:33:54.994493 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" (OuterVolumeSpecName: "kube-api-access-w9rds") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "kube-api-access-w9rds". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 07:33:54 crc kubenswrapper[4760]: I0930 07:33:54.994739 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" (OuterVolumeSpecName: "kube-api-access-6ccd8") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "kube-api-access-6ccd8". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 07:33:54 crc kubenswrapper[4760]: I0930 07:33:54.994935 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" (OuterVolumeSpecName: "kube-api-access-249nr") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "kube-api-access-249nr". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 07:33:54 crc kubenswrapper[4760]: I0930 07:33:54.994993 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 07:33:54 crc kubenswrapper[4760]: I0930 07:33:54.995131 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" (OuterVolumeSpecName: "webhook-certs") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "webhook-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 07:33:54 crc kubenswrapper[4760]: I0930 07:33:54.995336 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" (OuterVolumeSpecName: "kube-api-access-nzwt7") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "kube-api-access-nzwt7". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 07:33:54 crc kubenswrapper[4760]: I0930 07:33:54.995754 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 07:33:54 crc kubenswrapper[4760]: I0930 07:33:54.995748 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 07:33:54 crc kubenswrapper[4760]: I0930 07:33:54.996019 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" (OuterVolumeSpecName: "kube-api-access-s4n52") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "kube-api-access-s4n52". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 07:33:54 crc kubenswrapper[4760]: I0930 07:33:54.996208 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 07:33:54 crc kubenswrapper[4760]: I0930 07:33:54.997113 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 07:33:54 crc kubenswrapper[4760]: I0930 07:33:54.997345 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" (OuterVolumeSpecName: "control-plane-machine-set-operator-tls") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "control-plane-machine-set-operator-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 07:33:54 crc kubenswrapper[4760]: I0930 07:33:54.997585 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" (OuterVolumeSpecName: "kube-api-access-4d4hj") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "kube-api-access-4d4hj". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 07:33:54 crc kubenswrapper[4760]: I0930 07:33:54.997598 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" (OuterVolumeSpecName: "ovn-control-plane-metrics-cert") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovn-control-plane-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 07:33:54 crc kubenswrapper[4760]: I0930 07:33:54.998278 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 07:33:54 crc kubenswrapper[4760]: I0930 07:33:54.998357 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" (OuterVolumeSpecName: "cni-sysctl-allowlist") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-sysctl-allowlist". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 07:33:55 crc kubenswrapper[4760]: I0930 07:33:54.999085 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 07:33:55 crc kubenswrapper[4760]: I0930 07:33:55.000424 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 07:33:55 crc kubenswrapper[4760]: I0930 07:33:55.000673 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" (OuterVolumeSpecName: "kube-api-access-fqsjt") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "kube-api-access-fqsjt". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 07:33:55 crc kubenswrapper[4760]: I0930 07:33:55.001070 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 07:33:55 crc kubenswrapper[4760]: I0930 07:33:55.001608 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 07:33:55 crc kubenswrapper[4760]: I0930 07:33:55.002122 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 07:33:55 crc kubenswrapper[4760]: I0930 07:33:55.002799 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 07:33:55 crc kubenswrapper[4760]: I0930 07:33:55.003321 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" (OuterVolumeSpecName: "cert") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 07:33:55 crc kubenswrapper[4760]: I0930 07:33:55.003713 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 07:33:55 crc kubenswrapper[4760]: I0930 07:33:55.004348 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" (OuterVolumeSpecName: "utilities") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 07:33:55 crc kubenswrapper[4760]: I0930 07:33:55.004648 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" (OuterVolumeSpecName: "kube-api-access-8tdtz") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "kube-api-access-8tdtz". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 07:33:55 crc kubenswrapper[4760]: I0930 07:33:55.004981 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" (OuterVolumeSpecName: "client-ca") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 07:33:55 crc kubenswrapper[4760]: I0930 07:33:55.005840 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" (OuterVolumeSpecName: "available-featuregates") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "available-featuregates". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 07:33:55 crc kubenswrapper[4760]: I0930 07:33:55.006171 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 07:33:55 crc kubenswrapper[4760]: I0930 07:33:55.006467 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 07:33:55 crc kubenswrapper[4760]: I0930 07:33:55.007092 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "service-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 07:33:55 crc kubenswrapper[4760]: I0930 07:33:55.007500 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 07:33:55 crc kubenswrapper[4760]: I0930 07:33:55.008258 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 07:33:55 crc kubenswrapper[4760]: I0930 07:33:55.008569 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" (OuterVolumeSpecName: "kube-api-access-pj782") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "kube-api-access-pj782". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 07:33:55 crc kubenswrapper[4760]: I0930 07:33:55.008812 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "srv-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 07:33:55 crc kubenswrapper[4760]: I0930 07:33:55.009052 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" (OuterVolumeSpecName: "kube-api-access-fcqwp") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "kube-api-access-fcqwp". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 07:33:55 crc kubenswrapper[4760]: I0930 07:33:55.009570 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:53Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: 
connect: connection refused" Sep 30 07:33:55 crc kubenswrapper[4760]: I0930 07:33:55.015866 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 07:33:55 crc kubenswrapper[4760]: I0930 07:33:55.016157 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 07:33:55 crc kubenswrapper[4760]: I0930 07:33:55.016593 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 07:33:55 crc kubenswrapper[4760]: I0930 07:33:55.016681 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 07:33:55 crc kubenswrapper[4760]: I0930 07:33:55.021659 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 07:33:55 crc kubenswrapper[4760]: I0930 07:33:55.023330 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" (OuterVolumeSpecName: "kube-api-access-bf2bz") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "kube-api-access-bf2bz". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 07:33:55 crc kubenswrapper[4760]: I0930 07:33:55.023689 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 07:33:55 crc kubenswrapper[4760]: I0930 07:33:55.024673 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" (OuterVolumeSpecName: "service-ca") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "service-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 07:33:55 crc kubenswrapper[4760]: I0930 07:33:55.024939 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" (OuterVolumeSpecName: "signing-key") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 07:33:55 crc kubenswrapper[4760]: I0930 07:33:55.025216 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" (OuterVolumeSpecName: "samples-operator-tls") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "samples-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 07:33:55 crc kubenswrapper[4760]: I0930 07:33:55.025729 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 07:33:55 crc kubenswrapper[4760]: I0930 07:33:55.025913 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" (OuterVolumeSpecName: "utilities") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 07:33:55 crc kubenswrapper[4760]: I0930 07:33:55.026410 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 07:33:55 crc kubenswrapper[4760]: I0930 07:33:55.026681 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" (OuterVolumeSpecName: "kube-api-access-x4zgh") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "kube-api-access-x4zgh". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 07:33:55 crc kubenswrapper[4760]: I0930 07:33:55.026934 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" (OuterVolumeSpecName: "kube-api-access-vt5rc") pod "44663579-783b-4372-86d6-acf235a62d72" (UID: "44663579-783b-4372-86d6-acf235a62d72"). InnerVolumeSpecName "kube-api-access-vt5rc". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 07:33:55 crc kubenswrapper[4760]: I0930 07:33:55.028769 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" (OuterVolumeSpecName: "config") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 07:33:55 crc kubenswrapper[4760]: I0930 07:33:55.029194 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" (OuterVolumeSpecName: "machine-api-operator-tls") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "machine-api-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 07:33:55 crc kubenswrapper[4760]: I0930 07:33:55.029484 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" (OuterVolumeSpecName: "kube-api-access-279lb") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "kube-api-access-279lb". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 07:33:55 crc kubenswrapper[4760]: I0930 07:33:55.031663 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 07:33:55 crc kubenswrapper[4760]: I0930 07:33:55.031895 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 07:33:55 crc kubenswrapper[4760]: I0930 07:33:55.032651 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 07:33:55 crc kubenswrapper[4760]: I0930 07:33:55.032868 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 07:33:55 crc kubenswrapper[4760]: I0930 07:33:55.033188 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 07:33:55 crc kubenswrapper[4760]: I0930 07:33:55.033220 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "cni-binary-copy". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 07:33:55 crc kubenswrapper[4760]: I0930 07:33:55.033551 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" (OuterVolumeSpecName: "kube-api-access-qg5z5") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "kube-api-access-qg5z5". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 07:33:55 crc kubenswrapper[4760]: I0930 07:33:55.034050 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" (OuterVolumeSpecName: "config") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 07:33:55 crc kubenswrapper[4760]: I0930 07:33:55.034119 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 07:33:55 crc kubenswrapper[4760]: I0930 07:33:55.034890 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" (OuterVolumeSpecName: "serviceca") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "serviceca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 07:33:55 crc kubenswrapper[4760]: I0930 07:33:55.040866 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" (OuterVolumeSpecName: "images") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 07:33:55 crc kubenswrapper[4760]: I0930 07:33:55.041053 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 30 07:33:55 crc kubenswrapper[4760]: I0930 07:33:55.041943 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" (OuterVolumeSpecName: "kube-api-access-9xfj7") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "kube-api-access-9xfj7". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 07:33:55 crc kubenswrapper[4760]: I0930 07:33:55.042267 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" (OuterVolumeSpecName: "kube-api-access-pjr6v") pod "49ef4625-1d3a-4a9f-b595-c2433d32326d" (UID: "49ef4625-1d3a-4a9f-b595-c2433d32326d"). InnerVolumeSpecName "kube-api-access-pjr6v". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 07:33:55 crc kubenswrapper[4760]: I0930 07:33:55.043372 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" (OuterVolumeSpecName: "config") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 07:33:55 crc kubenswrapper[4760]: I0930 07:33:55.044257 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" (OuterVolumeSpecName: "etcd-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 07:33:55 crc kubenswrapper[4760]: I0930 07:33:55.046474 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Sep 30 07:33:55 crc kubenswrapper[4760]: I0930 07:33:55.046593 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Sep 30 07:33:55 crc kubenswrapper[4760]: I0930 07:33:55.046643 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: 
\"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Sep 30 07:33:55 crc kubenswrapper[4760]: I0930 07:33:55.046677 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Sep 30 07:33:55 crc kubenswrapper[4760]: I0930 07:33:55.046706 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Sep 30 07:33:55 crc kubenswrapper[4760]: I0930 07:33:55.046737 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Sep 30 07:33:55 crc kubenswrapper[4760]: I0930 07:33:55.046959 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 07:33:55 crc kubenswrapper[4760]: I0930 07:33:55.047000 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Sep 30 07:33:55 crc kubenswrapper[4760]: I0930 07:33:55.047031 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mg5zb\" 
(UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Sep 30 07:33:55 crc kubenswrapper[4760]: I0930 07:33:55.047062 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Sep 30 07:33:55 crc kubenswrapper[4760]: I0930 07:33:55.047088 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Sep 30 07:33:55 crc kubenswrapper[4760]: I0930 07:33:55.047120 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Sep 30 07:33:55 crc kubenswrapper[4760]: I0930 07:33:55.047155 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Sep 30 07:33:55 crc kubenswrapper[4760]: I0930 07:33:55.047181 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 
07:33:55 crc kubenswrapper[4760]: I0930 07:33:55.047215 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Sep 30 07:33:55 crc kubenswrapper[4760]: I0930 07:33:55.047254 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Sep 30 07:33:55 crc kubenswrapper[4760]: I0930 07:33:55.047285 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Sep 30 07:33:55 crc kubenswrapper[4760]: I0930 07:33:55.047342 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Sep 30 07:33:55 crc kubenswrapper[4760]: I0930 07:33:55.047373 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Sep 30 07:33:55 crc kubenswrapper[4760]: I0930 07:33:55.047404 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Sep 30 07:33:55 crc kubenswrapper[4760]: I0930 07:33:55.047431 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Sep 30 07:33:55 crc kubenswrapper[4760]: I0930 07:33:55.047462 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Sep 30 07:33:55 crc kubenswrapper[4760]: I0930 07:33:55.047490 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Sep 30 07:33:55 crc kubenswrapper[4760]: I0930 07:33:55.047515 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Sep 30 07:33:55 crc kubenswrapper[4760]: I0930 07:33:55.047544 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Sep 30 07:33:55 crc 
kubenswrapper[4760]: I0930 07:33:55.047571 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Sep 30 07:33:55 crc kubenswrapper[4760]: I0930 07:33:55.047599 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Sep 30 07:33:55 crc kubenswrapper[4760]: I0930 07:33:55.047622 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 07:33:55 crc kubenswrapper[4760]: I0930 07:33:55.047653 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Sep 30 07:33:55 crc kubenswrapper[4760]: I0930 07:33:55.047680 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Sep 30 07:33:55 crc kubenswrapper[4760]: I0930 07:33:55.047705 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: 
\"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Sep 30 07:33:55 crc kubenswrapper[4760]: I0930 07:33:55.047732 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Sep 30 07:33:55 crc kubenswrapper[4760]: I0930 07:33:55.047761 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Sep 30 07:33:55 crc kubenswrapper[4760]: I0930 07:33:55.047787 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Sep 30 07:33:55 crc kubenswrapper[4760]: I0930 07:33:55.047835 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") pod \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\" (UID: \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\") " Sep 30 07:33:55 crc kubenswrapper[4760]: I0930 07:33:55.047863 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " 
Sep 30 07:33:55 crc kubenswrapper[4760]: I0930 07:33:55.047892 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Sep 30 07:33:55 crc kubenswrapper[4760]: I0930 07:33:55.047918 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Sep 30 07:33:55 crc kubenswrapper[4760]: I0930 07:33:55.047944 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Sep 30 07:33:55 crc kubenswrapper[4760]: I0930 07:33:55.047974 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Sep 30 07:33:55 crc kubenswrapper[4760]: I0930 07:33:55.047999 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Sep 30 07:33:55 crc kubenswrapper[4760]: I0930 07:33:55.048030 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: 
\"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Sep 30 07:33:55 crc kubenswrapper[4760]: I0930 07:33:55.048060 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Sep 30 07:33:55 crc kubenswrapper[4760]: I0930 07:33:55.048087 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Sep 30 07:33:55 crc kubenswrapper[4760]: I0930 07:33:55.048118 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Sep 30 07:33:55 crc kubenswrapper[4760]: I0930 07:33:55.048158 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Sep 30 07:33:55 crc kubenswrapper[4760]: I0930 07:33:55.048196 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Sep 30 07:33:55 crc kubenswrapper[4760]: 
I0930 07:33:55.048222 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Sep 30 07:33:55 crc kubenswrapper[4760]: I0930 07:33:55.048250 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Sep 30 07:33:55 crc kubenswrapper[4760]: I0930 07:33:55.048279 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Sep 30 07:33:55 crc kubenswrapper[4760]: I0930 07:33:55.048326 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Sep 30 07:33:55 crc kubenswrapper[4760]: I0930 07:33:55.048356 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Sep 30 07:33:55 crc kubenswrapper[4760]: I0930 07:33:55.048383 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") pod 
\"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Sep 30 07:33:55 crc kubenswrapper[4760]: I0930 07:33:55.048414 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Sep 30 07:33:55 crc kubenswrapper[4760]: I0930 07:33:55.048439 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Sep 30 07:33:55 crc kubenswrapper[4760]: I0930 07:33:55.048466 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Sep 30 07:33:55 crc kubenswrapper[4760]: I0930 07:33:55.048495 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Sep 30 07:33:55 crc kubenswrapper[4760]: I0930 07:33:55.048521 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Sep 30 07:33:55 crc kubenswrapper[4760]: I0930 07:33:55.048550 4760 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Sep 30 07:33:55 crc kubenswrapper[4760]: I0930 07:33:55.048578 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Sep 30 07:33:55 crc kubenswrapper[4760]: I0930 07:33:55.048603 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Sep 30 07:33:55 crc kubenswrapper[4760]: I0930 07:33:55.048631 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Sep 30 07:33:55 crc kubenswrapper[4760]: I0930 07:33:55.048658 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Sep 30 07:33:55 crc kubenswrapper[4760]: I0930 07:33:55.048686 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" 
(UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Sep 30 07:33:55 crc kubenswrapper[4760]: I0930 07:33:55.048712 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Sep 30 07:33:55 crc kubenswrapper[4760]: I0930 07:33:55.048742 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Sep 30 07:33:55 crc kubenswrapper[4760]: I0930 07:33:55.048771 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Sep 30 07:33:55 crc kubenswrapper[4760]: I0930 07:33:55.048794 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Sep 30 07:33:55 crc kubenswrapper[4760]: I0930 07:33:55.048829 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Sep 30 07:33:55 crc kubenswrapper[4760]: I0930 07:33:55.048874 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: 
\"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Sep 30 07:33:55 crc kubenswrapper[4760]: I0930 07:33:55.048924 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Sep 30 07:33:55 crc kubenswrapper[4760]: I0930 07:33:55.048950 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 07:33:55 crc kubenswrapper[4760]: I0930 07:33:55.048980 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Sep 30 07:33:55 crc kubenswrapper[4760]: I0930 07:33:55.049008 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Sep 30 07:33:55 crc kubenswrapper[4760]: I0930 07:33:55.049034 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Sep 30 
07:33:55 crc kubenswrapper[4760]: I0930 07:33:55.049064 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Sep 30 07:33:55 crc kubenswrapper[4760]: I0930 07:33:55.049092 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Sep 30 07:33:55 crc kubenswrapper[4760]: I0930 07:33:55.049129 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Sep 30 07:33:55 crc kubenswrapper[4760]: I0930 07:33:55.049160 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 07:33:55 crc kubenswrapper[4760]: I0930 07:33:55.049189 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Sep 30 07:33:55 crc kubenswrapper[4760]: I0930 07:33:55.049269 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: 
\"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Sep 30 07:33:55 crc kubenswrapper[4760]: I0930 07:33:55.049471 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Sep 30 07:33:55 crc kubenswrapper[4760]: I0930 07:33:55.049519 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 07:33:55 crc kubenswrapper[4760]: I0930 07:33:55.049549 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 07:33:55 crc kubenswrapper[4760]: I0930 07:33:55.049593 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Sep 30 07:33:55 crc kubenswrapper[4760]: I0930 
07:33:55.049628 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Sep 30 07:33:55 crc kubenswrapper[4760]: I0930 07:33:55.049663 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Sep 30 07:33:55 crc kubenswrapper[4760]: I0930 07:33:55.049693 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Sep 30 07:33:55 crc kubenswrapper[4760]: I0930 07:33:55.049722 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 07:33:55 crc kubenswrapper[4760]: I0930 07:33:55.049773 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " 
pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Sep 30 07:33:55 crc kubenswrapper[4760]: I0930 07:33:55.049820 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Sep 30 07:33:55 crc kubenswrapper[4760]: I0930 07:33:55.049852 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Sep 30 07:33:55 crc kubenswrapper[4760]: I0930 07:33:55.049880 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 07:33:55 crc kubenswrapper[4760]: I0930 07:33:55.049911 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Sep 30 07:33:55 crc kubenswrapper[4760]: I0930 07:33:55.050054 4760 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: 
\"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Sep 30 07:33:55 crc kubenswrapper[4760]: I0930 07:33:55.050078 4760 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") on node \"crc\" DevicePath \"\"" Sep 30 07:33:55 crc kubenswrapper[4760]: I0930 07:33:55.050097 4760 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") on node \"crc\" DevicePath \"\"" Sep 30 07:33:55 crc kubenswrapper[4760]: I0930 07:33:55.050113 4760 reconciler_common.go:293] "Volume detached for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") on node \"crc\" DevicePath \"\"" Sep 30 07:33:55 crc kubenswrapper[4760]: I0930 07:33:55.050141 4760 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") on node \"crc\" DevicePath \"\"" Sep 30 07:33:55 crc kubenswrapper[4760]: I0930 07:33:55.050161 4760 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") on node \"crc\" DevicePath \"\"" Sep 30 07:33:55 crc kubenswrapper[4760]: I0930 07:33:55.050177 4760 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 07:33:55 crc kubenswrapper[4760]: I0930 07:33:55.050197 4760 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") on node \"crc\" 
DevicePath \"\"" Sep 30 07:33:55 crc kubenswrapper[4760]: I0930 07:33:55.050211 4760 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") on node \"crc\" DevicePath \"\"" Sep 30 07:33:55 crc kubenswrapper[4760]: I0930 07:33:55.050225 4760 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") on node \"crc\" DevicePath \"\"" Sep 30 07:33:55 crc kubenswrapper[4760]: I0930 07:33:55.050239 4760 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") on node \"crc\" DevicePath \"\"" Sep 30 07:33:55 crc kubenswrapper[4760]: I0930 07:33:55.050255 4760 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Sep 30 07:33:55 crc kubenswrapper[4760]: I0930 07:33:55.050402 4760 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Sep 30 07:33:55 crc kubenswrapper[4760]: I0930 07:33:55.050420 4760 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 07:33:55 crc kubenswrapper[4760]: I0930 07:33:55.050435 4760 reconciler_common.go:293] "Volume detached for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") on node \"crc\" DevicePath \"\"" Sep 30 07:33:55 crc kubenswrapper[4760]: I0930 
07:33:55.050454 4760 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") on node \"crc\" DevicePath \"\"" Sep 30 07:33:55 crc kubenswrapper[4760]: I0930 07:33:55.050468 4760 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") on node \"crc\" DevicePath \"\"" Sep 30 07:33:55 crc kubenswrapper[4760]: I0930 07:33:55.050483 4760 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Sep 30 07:33:55 crc kubenswrapper[4760]: I0930 07:33:55.050502 4760 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") on node \"crc\" DevicePath \"\"" Sep 30 07:33:55 crc kubenswrapper[4760]: I0930 07:33:55.050574 4760 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") on node \"crc\" DevicePath \"\"" Sep 30 07:33:55 crc kubenswrapper[4760]: I0930 07:33:55.050589 4760 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") on node \"crc\" DevicePath \"\"" Sep 30 07:33:55 crc kubenswrapper[4760]: I0930 07:33:55.050603 4760 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") on node \"crc\" DevicePath \"\"" Sep 30 07:33:55 crc kubenswrapper[4760]: I0930 07:33:55.050624 4760 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8tdtz\" 
(UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") on node \"crc\" DevicePath \"\""
Sep 30 07:33:55 crc kubenswrapper[4760]: I0930 07:33:55.050639 4760 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") on node \"crc\" DevicePath \"\""
Sep 30 07:33:55 crc kubenswrapper[4760]: I0930 07:33:55.050652 4760 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") on node \"crc\" DevicePath \"\""
Sep 30 07:33:55 crc kubenswrapper[4760]: I0930 07:33:55.050667 4760 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") on node \"crc\" DevicePath \"\""
Sep 30 07:33:55 crc kubenswrapper[4760]: I0930 07:33:55.050686 4760 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") on node \"crc\" DevicePath \"\""
Sep 30 07:33:55 crc kubenswrapper[4760]: I0930 07:33:55.050703 4760 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") on node \"crc\" DevicePath \"\""
Sep 30 07:33:55 crc kubenswrapper[4760]: I0930 07:33:55.050719 4760 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") on node \"crc\" DevicePath \"\""
Sep 30 07:33:55 crc kubenswrapper[4760]: I0930 07:33:55.050735 4760 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") on node \"crc\" DevicePath \"\""
Sep 30 07:33:55 crc kubenswrapper[4760]: I0930 07:33:55.050753 4760 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") on node \"crc\" DevicePath \"\""
Sep 30 07:33:55 crc kubenswrapper[4760]: I0930 07:33:55.050768 4760 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") on node \"crc\" DevicePath \"\""
Sep 30 07:33:55 crc kubenswrapper[4760]: I0930 07:33:55.050783 4760 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") on node \"crc\" DevicePath \"\""
Sep 30 07:33:55 crc kubenswrapper[4760]: I0930 07:33:55.050801 4760 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") on node \"crc\" DevicePath \"\""
Sep 30 07:33:55 crc kubenswrapper[4760]: I0930 07:33:55.050817 4760 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") on node \"crc\" DevicePath \"\""
Sep 30 07:33:55 crc kubenswrapper[4760]: I0930 07:33:55.050831 4760 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") on node \"crc\" DevicePath \"\""
Sep 30 07:33:55 crc kubenswrapper[4760]: I0930 07:33:55.050849 4760 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") on node \"crc\" DevicePath \"\""
Sep 30 07:33:55 crc kubenswrapper[4760]: I0930 07:33:55.051232 4760 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\""
Sep 30 07:33:55 crc kubenswrapper[4760]: I0930 07:33:55.051252 4760 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") on node \"crc\" DevicePath \"\""
Sep 30 07:33:55 crc kubenswrapper[4760]: I0930 07:33:55.051267 4760 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") on node \"crc\" DevicePath \"\""
Sep 30 07:33:55 crc kubenswrapper[4760]: I0930 07:33:55.051284 4760 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") on node \"crc\" DevicePath \"\""
Sep 30 07:33:55 crc kubenswrapper[4760]: I0930 07:33:55.051324 4760 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") on node \"crc\" DevicePath \"\""
Sep 30 07:33:55 crc kubenswrapper[4760]: I0930 07:33:55.051339 4760 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") on node \"crc\" DevicePath \"\""
Sep 30 07:33:55 crc kubenswrapper[4760]: I0930 07:33:55.051354 4760 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") on node \"crc\" DevicePath \"\""
Sep 30 07:33:55 crc kubenswrapper[4760]: I0930 07:33:55.051374 4760 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") on node \"crc\" DevicePath \"\""
Sep 30 07:33:55 crc kubenswrapper[4760]: I0930 07:33:55.051387 4760 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") on node \"crc\" DevicePath \"\""
Sep 30 07:33:55 crc kubenswrapper[4760]: I0930 07:33:55.051400 4760 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") on node \"crc\" DevicePath \"\""
Sep 30 07:33:55 crc kubenswrapper[4760]: I0930 07:33:55.051414 4760 reconciler_common.go:293] "Volume detached for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") on node \"crc\" DevicePath \"\""
Sep 30 07:33:55 crc kubenswrapper[4760]: I0930 07:33:55.051431 4760 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") on node \"crc\" DevicePath \"\""
Sep 30 07:33:55 crc kubenswrapper[4760]: I0930 07:33:55.051446 4760 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") on node \"crc\" DevicePath \"\""
Sep 30 07:33:55 crc kubenswrapper[4760]: I0930 07:33:55.051459 4760 reconciler_common.go:293] "Volume detached for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") on node \"crc\" DevicePath \"\""
Sep 30 07:33:55 crc kubenswrapper[4760]: I0930 07:33:55.051472 4760 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") on node \"crc\" DevicePath \"\""
Sep 30 07:33:55 crc kubenswrapper[4760]: I0930 07:33:55.051493 4760 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") on node \"crc\" DevicePath \"\""
Sep 30 07:33:55 crc kubenswrapper[4760]: I0930 07:33:55.051513 4760 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") on node \"crc\" DevicePath \"\""
Sep 30 07:33:55 crc kubenswrapper[4760]: I0930 07:33:55.051531 4760 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") on node \"crc\" DevicePath \"\""
Sep 30 07:33:55 crc kubenswrapper[4760]: I0930 07:33:55.051545 4760 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") on node \"crc\" DevicePath \"\""
Sep 30 07:33:55 crc kubenswrapper[4760]: I0930 07:33:55.051564 4760 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") on node \"crc\" DevicePath \"\""
Sep 30 07:33:55 crc kubenswrapper[4760]: I0930 07:33:55.051619 4760 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") on node \"crc\" DevicePath \"\""
Sep 30 07:33:55 crc kubenswrapper[4760]: I0930 07:33:55.051633 4760 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") on node \"crc\" DevicePath \"\""
Sep 30 07:33:55 crc kubenswrapper[4760]: I0930 07:33:55.051650 4760 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") on node \"crc\" DevicePath \"\""
Sep 30 07:33:55 crc kubenswrapper[4760]: I0930 07:33:55.051664 4760 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") on node \"crc\" DevicePath \"\""
Sep 30 07:33:55 crc kubenswrapper[4760]: I0930 07:33:55.051676 4760 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") on node \"crc\" DevicePath \"\""
Sep 30 07:33:55 crc kubenswrapper[4760]: I0930 07:33:55.051691 4760 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") on node \"crc\" DevicePath \"\""
Sep 30 07:33:55 crc kubenswrapper[4760]: I0930 07:33:55.051708 4760 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") on node \"crc\" DevicePath \"\""
Sep 30 07:33:55 crc kubenswrapper[4760]: I0930 07:33:55.051769 4760 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") on node \"crc\" DevicePath \"\""
Sep 30 07:33:55 crc kubenswrapper[4760]: I0930 07:33:55.051784 4760 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") on node \"crc\" DevicePath \"\""
Sep 30 07:33:55 crc kubenswrapper[4760]: I0930 07:33:55.051829 4760 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") on node \"crc\" DevicePath \"\""
Sep 30 07:33:55 crc kubenswrapper[4760]: I0930 07:33:55.051844 4760 reconciler_common.go:293] "Volume detached for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") on node \"crc\" DevicePath \"\""
Sep 30 07:33:55 crc kubenswrapper[4760]: I0930 07:33:55.051861 4760 reconciler_common.go:293] "Volume detached for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") on node \"crc\" DevicePath \"\""
Sep 30 07:33:55 crc kubenswrapper[4760]: I0930 07:33:55.051758 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" (OuterVolumeSpecName: "kube-api-access-zgdk5") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "kube-api-access-zgdk5". PluginName "kubernetes.io/projected", VolumeGidValue ""
Sep 30 07:33:55 crc kubenswrapper[4760]: I0930 07:33:55.052172 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" (OuterVolumeSpecName: "kube-api-access-pcxfs") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "kube-api-access-pcxfs". PluginName "kubernetes.io/projected", VolumeGidValue ""
Sep 30 07:33:55 crc kubenswrapper[4760]: I0930 07:33:55.052403 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" (OuterVolumeSpecName: "etcd-service-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Sep 30 07:33:55 crc kubenswrapper[4760]: I0930 07:33:55.052877 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" (OuterVolumeSpecName: "multus-daemon-config") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "multus-daemon-config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Sep 30 07:33:55 crc kubenswrapper[4760]: I0930 07:33:55.053855 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue ""
Sep 30 07:33:55 crc kubenswrapper[4760]: I0930 07:33:55.054438 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" (OuterVolumeSpecName: "kube-api-access-rnphk") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "kube-api-access-rnphk". PluginName "kubernetes.io/projected", VolumeGidValue ""
Sep 30 07:33:55 crc kubenswrapper[4760]: I0930 07:33:55.055981 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" (OuterVolumeSpecName: "config") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Sep 30 07:33:55 crc kubenswrapper[4760]: I0930 07:33:55.056687 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" (OuterVolumeSpecName: "kube-api-access-w4xd4") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "kube-api-access-w4xd4". PluginName "kubernetes.io/projected", VolumeGidValue ""
Sep 30 07:33:55 crc kubenswrapper[4760]: I0930 07:33:55.056884 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" (OuterVolumeSpecName: "package-server-manager-serving-cert") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "package-server-manager-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Sep 30 07:33:55 crc kubenswrapper[4760]: I0930 07:33:55.056963 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Sep 30 07:33:55 crc kubenswrapper[4760]: I0930 07:33:55.057199 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" (OuterVolumeSpecName: "kube-api-access-jhbk2") pod "bd23aa5c-e532-4e53-bccf-e79f130c5ae8" (UID: "bd23aa5c-e532-4e53-bccf-e79f130c5ae8"). InnerVolumeSpecName "kube-api-access-jhbk2". PluginName "kubernetes.io/projected", VolumeGidValue ""
Sep 30 07:33:55 crc kubenswrapper[4760]: I0930 07:33:55.057892 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" (OuterVolumeSpecName: "kube-api-access-d4lsv") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "kube-api-access-d4lsv". PluginName "kubernetes.io/projected", VolumeGidValue ""
Sep 30 07:33:55 crc kubenswrapper[4760]: I0930 07:33:55.059019 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" (OuterVolumeSpecName: "kube-api-access-ngvvp") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "kube-api-access-ngvvp". PluginName "kubernetes.io/projected", VolumeGidValue ""
Sep 30 07:33:55 crc kubenswrapper[4760]: I0930 07:33:55.059037 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" (OuterVolumeSpecName: "kube-api-access-zkvpv") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "kube-api-access-zkvpv". PluginName "kubernetes.io/projected", VolumeGidValue ""
Sep 30 07:33:55 crc kubenswrapper[4760]: I0930 07:33:55.059628 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" (OuterVolumeSpecName: "kube-api-access-sb6h7") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "kube-api-access-sb6h7". PluginName "kubernetes.io/projected", VolumeGidValue ""
Sep 30 07:33:55 crc kubenswrapper[4760]: I0930 07:33:55.059955 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue ""
Sep 30 07:33:55 crc kubenswrapper[4760]: I0930 07:33:55.060121 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" (OuterVolumeSpecName: "utilities") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Sep 30 07:33:55 crc kubenswrapper[4760]: E0930 07:33:55.062032 4760 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 07:33:55.561996778 +0000 UTC m=+21.204903190 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Sep 30 07:33:55 crc kubenswrapper[4760]: I0930 07:33:55.064452 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Sep 30 07:33:55 crc kubenswrapper[4760]: I0930 07:33:55.064527 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" (OuterVolumeSpecName: "signing-cabundle") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-cabundle". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Sep 30 07:33:55 crc kubenswrapper[4760]: I0930 07:33:55.065064 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" (OuterVolumeSpecName: "mcd-auth-proxy-config") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "mcd-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Sep 30 07:33:55 crc kubenswrapper[4760]: I0930 07:33:55.065118 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Sep 30 07:33:55 crc kubenswrapper[4760]: I0930 07:33:55.065203 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" (OuterVolumeSpecName: "kube-api-access-mg5zb") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "kube-api-access-mg5zb". PluginName "kubernetes.io/projected", VolumeGidValue ""
Sep 30 07:33:55 crc kubenswrapper[4760]: I0930 07:33:55.065331 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" (OuterVolumeSpecName: "kube-api-access-xcgwh") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "kube-api-access-xcgwh". PluginName "kubernetes.io/projected", VolumeGidValue ""
Sep 30 07:33:55 crc kubenswrapper[4760]: I0930 07:33:55.068032 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue ""
Sep 30 07:33:55 crc kubenswrapper[4760]: I0930 07:33:55.057337 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Sep 30 07:33:55 crc kubenswrapper[4760]: I0930 07:33:55.069144 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Sep 30 07:33:55 crc kubenswrapper[4760]: I0930 07:33:55.069530 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Sep 30 07:33:55 crc kubenswrapper[4760]: I0930 07:33:55.070224 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue ""
Sep 30 07:33:55 crc kubenswrapper[4760]: I0930 07:33:55.070545 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" (OuterVolumeSpecName: "image-import-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "image-import-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Sep 30 07:33:55 crc kubenswrapper[4760]: I0930 07:33:55.069075 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" (OuterVolumeSpecName: "kube-api-access-2w9zh") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "kube-api-access-2w9zh". PluginName "kubernetes.io/projected", VolumeGidValue ""
Sep 30 07:33:55 crc kubenswrapper[4760]: I0930 07:33:55.071228 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" (OuterVolumeSpecName: "config") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Sep 30 07:33:55 crc kubenswrapper[4760]: I0930 07:33:55.071753 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Sep 30 07:33:55 crc kubenswrapper[4760]: I0930 07:33:55.072282 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" (OuterVolumeSpecName: "config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Sep 30 07:33:55 crc kubenswrapper[4760]: I0930 07:33:55.072359 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Sep 30 07:33:55 crc kubenswrapper[4760]: I0930 07:33:55.072545 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue ""
Sep 30 07:33:55 crc kubenswrapper[4760]: I0930 07:33:55.072635 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" (OuterVolumeSpecName: "audit") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "audit". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Sep 30 07:33:55 crc kubenswrapper[4760]: I0930 07:33:55.072781 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue ""
Sep 30 07:33:55 crc kubenswrapper[4760]: I0930 07:33:55.073366 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Sep 30 07:33:55 crc kubenswrapper[4760]: I0930 07:33:55.073942 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Sep 30 07:33:55 crc kubenswrapper[4760]: I0930 07:33:55.073963 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" (OuterVolumeSpecName: "config") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Sep 30 07:33:55 crc kubenswrapper[4760]: I0930 07:33:55.074635 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" (OuterVolumeSpecName: "images") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Sep 30 07:33:55 crc kubenswrapper[4760]: I0930 07:33:55.077542 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" (OuterVolumeSpecName: "config-volume") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Sep 30 07:33:55 crc kubenswrapper[4760]: I0930 07:33:55.077928 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" (OuterVolumeSpecName: "kube-api-access-d6qdx") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "kube-api-access-d6qdx". PluginName "kubernetes.io/projected", VolumeGidValue ""
Sep 30 07:33:55 crc kubenswrapper[4760]: I0930 07:33:55.078366 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" (OuterVolumeSpecName: "kube-api-access-v47cf") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "kube-api-access-v47cf". PluginName "kubernetes.io/projected", VolumeGidValue ""
Sep 30 07:33:55 crc kubenswrapper[4760]: I0930 07:33:55.080899 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue ""
Sep 30 07:33:55 crc kubenswrapper[4760]: I0930 07:33:55.080958 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue ""
Sep 30 07:33:55 crc kubenswrapper[4760]: I0930 07:33:55.082230 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" (OuterVolumeSpecName: "certs") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Sep 30 07:33:55 crc kubenswrapper[4760]: I0930 07:33:55.082236 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" (OuterVolumeSpecName: "kube-api-access-2d4wz") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "kube-api-access-2d4wz". PluginName "kubernetes.io/projected", VolumeGidValue ""
Sep 30 07:33:55 crc kubenswrapper[4760]: I0930 07:33:55.082705 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Sep 30 07:33:55 crc kubenswrapper[4760]: I0930 07:33:55.083167 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Sep 30 07:33:55 crc kubenswrapper[4760]: I0930 07:33:55.083471 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Sep 30 07:33:55 crc kubenswrapper[4760]: I0930 07:33:55.083693 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue ""
Sep 30 07:33:55 crc kubenswrapper[4760]: I0930 07:33:55.088608 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" (OuterVolumeSpecName: "mcc-auth-proxy-config") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "mcc-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Sep 30 07:33:55 crc kubenswrapper[4760]: I0930 07:33:55.088934 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" (OuterVolumeSpecName: "kube-api-access-lzf88") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "kube-api-access-lzf88". PluginName "kubernetes.io/projected", VolumeGidValue ""
Sep 30 07:33:55 crc kubenswrapper[4760]: I0930 07:33:55.091394 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue ""
Sep 30 07:33:55 crc kubenswrapper[4760]: I0930 07:33:55.091808 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" (OuterVolumeSpecName: "kube-api-access-7c4vf") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "kube-api-access-7c4vf".
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 07:33:55 crc kubenswrapper[4760]: I0930 07:33:55.092107 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" (OuterVolumeSpecName: "stats-auth") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "stats-auth". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 07:33:55 crc kubenswrapper[4760]: I0930 07:33:55.092327 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" (OuterVolumeSpecName: "image-registry-operator-tls") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "image-registry-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 07:33:55 crc kubenswrapper[4760]: I0930 07:33:55.092639 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 07:33:55 crc kubenswrapper[4760]: I0930 07:33:55.092944 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" (OuterVolumeSpecName: "apiservice-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "apiservice-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 07:33:55 crc kubenswrapper[4760]: I0930 07:33:55.093167 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" (OuterVolumeSpecName: "kube-api-access-qs4fp") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "kube-api-access-qs4fp". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 07:33:55 crc kubenswrapper[4760]: I0930 07:33:55.093658 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" (OuterVolumeSpecName: "config") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 07:33:55 crc kubenswrapper[4760]: I0930 07:33:55.094125 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 07:33:55 crc kubenswrapper[4760]: I0930 07:33:55.094267 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 07:33:55 crc kubenswrapper[4760]: I0930 07:33:55.094413 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" (OuterVolumeSpecName: "kube-api-access-lz9wn") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "kube-api-access-lz9wn". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 07:33:55 crc kubenswrapper[4760]: I0930 07:33:55.094860 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" (OuterVolumeSpecName: "tmpfs") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "tmpfs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 07:33:55 crc kubenswrapper[4760]: I0930 07:33:55.095705 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" (OuterVolumeSpecName: "kube-api-access-kfwg7") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "kube-api-access-kfwg7". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 07:33:55 crc kubenswrapper[4760]: E0930 07:33:55.095786 4760 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Sep 30 07:33:55 crc kubenswrapper[4760]: I0930 07:33:55.095938 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). 
InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 07:33:55 crc kubenswrapper[4760]: I0930 07:33:55.095999 4760 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") on node \"crc\" DevicePath \"\"" Sep 30 07:33:55 crc kubenswrapper[4760]: E0930 07:33:55.096147 4760 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-09-30 07:33:55.596123805 +0000 UTC m=+21.239030207 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Sep 30 07:33:55 crc kubenswrapper[4760]: I0930 07:33:55.096157 4760 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Sep 30 07:33:55 crc kubenswrapper[4760]: I0930 07:33:55.096341 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 07:33:55 crc kubenswrapper[4760]: I0930 07:33:55.096547 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" (OuterVolumeSpecName: "kube-api-access-w7l8j") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "kube-api-access-w7l8j". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 07:33:55 crc kubenswrapper[4760]: I0930 07:33:55.096650 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Sep 30 07:33:55 crc kubenswrapper[4760]: I0930 07:33:55.096762 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" (OuterVolumeSpecName: "node-bootstrap-token") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "node-bootstrap-token". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 07:33:55 crc kubenswrapper[4760]: I0930 07:33:55.096850 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Sep 30 07:33:55 crc kubenswrapper[4760]: I0930 07:33:55.096994 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 07:33:55 crc kubenswrapper[4760]: I0930 07:33:55.097079 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 07:33:55 crc kubenswrapper[4760]: I0930 07:33:55.097200 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 07:33:55 crc kubenswrapper[4760]: I0930 07:33:55.097468 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 07:33:55 crc kubenswrapper[4760]: E0930 07:33:55.097548 4760 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Sep 30 07:33:55 crc kubenswrapper[4760]: I0930 07:33:55.097688 4760 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Sep 30 07:33:55 crc kubenswrapper[4760]: I0930 07:33:55.097718 4760 reconciler_common.go:293] "Volume detached for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") on node \"crc\" DevicePath \"\"" Sep 30 07:33:55 crc kubenswrapper[4760]: I0930 07:33:55.097731 4760 reconciler_common.go:293] "Volume detached for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") on node \"crc\" DevicePath \"\"" Sep 30 07:33:55 crc kubenswrapper[4760]: I0930 07:33:55.097744 4760 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") on node \"crc\" DevicePath \"\"" Sep 30 07:33:55 crc kubenswrapper[4760]: I0930 07:33:55.097757 4760 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: 
\"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") on node \"crc\" DevicePath \"\"" Sep 30 07:33:55 crc kubenswrapper[4760]: I0930 07:33:55.097768 4760 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") on node \"crc\" DevicePath \"\"" Sep 30 07:33:55 crc kubenswrapper[4760]: I0930 07:33:55.097781 4760 reconciler_common.go:293] "Volume detached for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") on node \"crc\" DevicePath \"\"" Sep 30 07:33:55 crc kubenswrapper[4760]: I0930 07:33:55.097796 4760 reconciler_common.go:293] "Volume detached for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") on node \"crc\" DevicePath \"\"" Sep 30 07:33:55 crc kubenswrapper[4760]: I0930 07:33:55.097808 4760 reconciler_common.go:293] "Volume detached for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") on node \"crc\" DevicePath \"\"" Sep 30 07:33:55 crc kubenswrapper[4760]: I0930 07:33:55.097822 4760 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") on node \"crc\" DevicePath \"\"" Sep 30 07:33:55 crc kubenswrapper[4760]: I0930 07:33:55.097832 4760 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") on node \"crc\" DevicePath \"\"" Sep 30 07:33:55 crc kubenswrapper[4760]: I0930 07:33:55.097842 4760 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") on 
node \"crc\" DevicePath \"\"" Sep 30 07:33:55 crc kubenswrapper[4760]: I0930 07:33:55.097853 4760 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") on node \"crc\" DevicePath \"\"" Sep 30 07:33:55 crc kubenswrapper[4760]: I0930 07:33:55.097863 4760 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Sep 30 07:33:55 crc kubenswrapper[4760]: I0930 07:33:55.097873 4760 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") on node \"crc\" DevicePath \"\"" Sep 30 07:33:55 crc kubenswrapper[4760]: I0930 07:33:55.097883 4760 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") on node \"crc\" DevicePath \"\"" Sep 30 07:33:55 crc kubenswrapper[4760]: I0930 07:33:55.097894 4760 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") on node \"crc\" DevicePath \"\"" Sep 30 07:33:55 crc kubenswrapper[4760]: I0930 07:33:55.097904 4760 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Sep 30 07:33:55 crc kubenswrapper[4760]: I0930 07:33:55.097914 4760 reconciler_common.go:293] "Volume detached for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") on node \"crc\" DevicePath \"\"" Sep 30 07:33:55 crc kubenswrapper[4760]: I0930 07:33:55.097925 4760 reconciler_common.go:293] "Volume detached 
for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Sep 30 07:33:55 crc kubenswrapper[4760]: I0930 07:33:55.098199 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 07:33:55 crc kubenswrapper[4760]: E0930 07:33:55.098683 4760 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-09-30 07:33:55.59864876 +0000 UTC m=+21.241555172 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Sep 30 07:33:55 crc kubenswrapper[4760]: I0930 07:33:55.101607 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" (OuterVolumeSpecName: "config") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 07:33:55 crc kubenswrapper[4760]: I0930 07:33:55.102411 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Sep 30 07:33:55 crc kubenswrapper[4760]: I0930 07:33:55.103202 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" (OuterVolumeSpecName: "kube-api-access-dbsvg") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "kube-api-access-dbsvg". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 07:33:55 crc kubenswrapper[4760]: I0930 07:33:55.103244 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 07:33:55 crc kubenswrapper[4760]: I0930 07:33:55.103529 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "bound-sa-token". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 07:33:55 crc kubenswrapper[4760]: I0930 07:33:55.103643 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" (OuterVolumeSpecName: "kube-api-access-cfbct") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "kube-api-access-cfbct". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 07:33:55 crc kubenswrapper[4760]: I0930 07:33:55.106533 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" (OuterVolumeSpecName: "machine-approver-tls") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "machine-approver-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 07:33:55 crc kubenswrapper[4760]: I0930 07:33:55.106720 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" (OuterVolumeSpecName: "kube-api-access-htfz6") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "kube-api-access-htfz6". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 07:33:55 crc kubenswrapper[4760]: I0930 07:33:55.107100 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" (OuterVolumeSpecName: "config") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 07:33:55 crc kubenswrapper[4760]: I0930 07:33:55.108120 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 07:33:55 crc kubenswrapper[4760]: I0930 07:33:55.110757 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" (OuterVolumeSpecName: "kube-api-access-x7zkh") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "kube-api-access-x7zkh". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 07:33:55 crc kubenswrapper[4760]: I0930 07:33:55.111040 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" (OuterVolumeSpecName: "config") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 07:33:55 crc kubenswrapper[4760]: I0930 07:33:55.111167 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "ca-trust-extracted". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 07:33:55 crc kubenswrapper[4760]: I0930 07:33:55.112064 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Sep 30 07:33:55 crc kubenswrapper[4760]: I0930 07:33:55.112347 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" (OuterVolumeSpecName: "kube-api-access-jkwtn") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "kube-api-access-jkwtn". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 07:33:55 crc kubenswrapper[4760]: I0930 07:33:55.112667 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" (OuterVolumeSpecName: "kube-api-access-6g6sz") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "kube-api-access-6g6sz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 07:33:55 crc kubenswrapper[4760]: I0930 07:33:55.112686 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"031672b4-4779-4226-97aa-f5c81a729234\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:35Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:35Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b3e26d92b5602604d33e83e83e677084c642df64c9d0b9f9e022b036091fb92e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://53390cb3cc86d0db1019c461bf6cb987190e6fa8812f735844028dfa1ce96648\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://cf87f93832356c96c1c5e90e6459ccdd23700fd12c6d7410305c66f99f6ed18f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e4230b2df5bf87138f5a40f146bef562ae63ec6f5b73d023e4e2b60722854882\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9c416bd4523f4384b4a2283eb2ada263982380925cbc17713726e0790e5fdee\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha
256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:38Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2cd9c18868b27fcfb67186a86f415b484f009c33c07d18f5d508155ed011b5d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2cd9c18868b27fcfb67186a86f415b484f009c33c07d18f5d508155ed011b5d4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T07:33:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T07:33:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T07:33:35Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 30 07:33:55 crc kubenswrapper[4760]: I0930 07:33:55.115595 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") 
pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Sep 30 07:33:55 crc kubenswrapper[4760]: I0930 07:33:55.116504 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" (OuterVolumeSpecName: "config") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 07:33:55 crc kubenswrapper[4760]: I0930 07:33:55.117184 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" (OuterVolumeSpecName: "default-certificate") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "default-certificate". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 07:33:55 crc kubenswrapper[4760]: I0930 07:33:55.117490 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 07:33:55 crc kubenswrapper[4760]: I0930 07:33:55.119043 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "auth-proxy-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 07:33:55 crc kubenswrapper[4760]: I0930 07:33:55.125703 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 07:33:55 crc kubenswrapper[4760]: I0930 07:33:55.126155 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" (OuterVolumeSpecName: "kube-api-access-x2m85") pod "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" (UID: "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d"). InnerVolumeSpecName "kube-api-access-x2m85". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 07:33:55 crc kubenswrapper[4760]: I0930 07:33:55.126278 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 07:33:55 crc kubenswrapper[4760]: I0930 07:33:55.127492 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 07:33:55 crc kubenswrapper[4760]: I0930 07:33:55.127634 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" (OuterVolumeSpecName: "console-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 07:33:55 crc kubenswrapper[4760]: I0930 07:33:55.131599 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 07:33:55 crc kubenswrapper[4760]: I0930 07:33:55.131630 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" (OuterVolumeSpecName: "config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 07:33:55 crc kubenswrapper[4760]: I0930 07:33:55.131842 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" (OuterVolumeSpecName: "kube-api-access-xcphl") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "kube-api-access-xcphl". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 07:33:55 crc kubenswrapper[4760]: I0930 07:33:55.132219 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 07:33:55 crc kubenswrapper[4760]: E0930 07:33:55.133333 4760 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Sep 30 07:33:55 crc kubenswrapper[4760]: E0930 07:33:55.133359 4760 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Sep 30 07:33:55 crc kubenswrapper[4760]: E0930 07:33:55.133374 4760 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Sep 30 07:33:55 crc kubenswrapper[4760]: E0930 07:33:55.133439 4760 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-09-30 07:33:55.633420053 +0000 UTC m=+21.276326465 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Sep 30 07:33:55 crc kubenswrapper[4760]: I0930 07:33:55.134941 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:53Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 30 07:33:55 crc kubenswrapper[4760]: I0930 07:33:55.135242 4760 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="01ab3dd5-8196-46d0-ad33-122e2ca51def" path="/var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes" Sep 30 07:33:55 crc kubenswrapper[4760]: I0930 07:33:55.135878 4760 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" path="/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes" Sep 30 07:33:55 crc kubenswrapper[4760]: I0930 07:33:55.135957 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 07:33:55 crc kubenswrapper[4760]: I0930 07:33:55.146997 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Sep 30 07:33:55 crc kubenswrapper[4760]: I0930 07:33:55.147268 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Sep 30 07:33:55 crc kubenswrapper[4760]: I0930 07:33:55.147407 4760 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09efc573-dbb6-4249-bd59-9b87aba8dd28" path="/var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes" Sep 30 07:33:55 crc kubenswrapper[4760]: I0930 07:33:55.148232 4760 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b574797-001e-440a-8f4e-c0be86edad0f" path="/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes" Sep 30 07:33:55 crc kubenswrapper[4760]: I0930 07:33:55.149482 4760 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b78653f-4ff9-4508-8672-245ed9b561e3" path="/var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes" Sep 30 07:33:55 crc kubenswrapper[4760]: I0930 07:33:55.150461 4760 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1386a44e-36a2-460c-96d0-0359d2b6f0f5" path="/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes" Sep 30 07:33:55 crc kubenswrapper[4760]: E0930 07:33:55.154535 4760 projected.go:288] Couldn't get configMap 
openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Sep 30 07:33:55 crc kubenswrapper[4760]: E0930 07:33:55.154575 4760 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Sep 30 07:33:55 crc kubenswrapper[4760]: E0930 07:33:55.154590 4760 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Sep 30 07:33:55 crc kubenswrapper[4760]: E0930 07:33:55.154663 4760 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-09-30 07:33:55.654638258 +0000 UTC m=+21.297544670 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Sep 30 07:33:55 crc kubenswrapper[4760]: I0930 07:33:55.157621 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Sep 30 07:33:55 crc kubenswrapper[4760]: I0930 07:33:55.157896 4760 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1bf7eb37-55a3-4c65-b768-a94c82151e69" path="/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes" Sep 30 07:33:55 crc kubenswrapper[4760]: I0930 07:33:55.158521 4760 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1d611f23-29be-4491-8495-bee1670e935f" path="/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes" Sep 30 07:33:55 crc kubenswrapper[4760]: I0930 07:33:55.159703 4760 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="20b0d48f-5fd6-431c-a545-e3c800c7b866" path="/var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/volumes" Sep 30 07:33:55 crc kubenswrapper[4760]: I0930 07:33:55.160404 4760 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" path="/var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes" Sep 30 07:33:55 crc kubenswrapper[4760]: I0930 07:33:55.160919 4760 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="22c825df-677d-4ca6-82db-3454ed06e783" 
path="/var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes" Sep 30 07:33:55 crc kubenswrapper[4760]: I0930 07:33:55.162098 4760 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="25e176fe-21b4-4974-b1ed-c8b94f112a7f" path="/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes" Sep 30 07:33:55 crc kubenswrapper[4760]: I0930 07:33:55.162650 4760 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" path="/var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes" Sep 30 07:33:55 crc kubenswrapper[4760]: I0930 07:33:55.163582 4760 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="31d8b7a1-420e-4252-a5b7-eebe8a111292" path="/var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes" Sep 30 07:33:55 crc kubenswrapper[4760]: I0930 07:33:55.164106 4760 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3ab1a177-2de0-46d9-b765-d0d0649bb42e" path="/var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/volumes" Sep 30 07:33:55 crc kubenswrapper[4760]: I0930 07:33:55.164728 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:53Z\\\",\\\"message\\\":\\\"containers with 
unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 30 07:33:55 crc kubenswrapper[4760]: I0930 07:33:55.165337 4760 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" path="/var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes" Sep 30 07:33:55 crc kubenswrapper[4760]: I0930 07:33:55.166412 4760 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="43509403-f426-496e-be36-56cef71462f5" 
path="/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes" Sep 30 07:33:55 crc kubenswrapper[4760]: I0930 07:33:55.167401 4760 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="44663579-783b-4372-86d6-acf235a62d72" path="/var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/volumes" Sep 30 07:33:55 crc kubenswrapper[4760]: I0930 07:33:55.169057 4760 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="496e6271-fb68-4057-954e-a0d97a4afa3f" path="/var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes" Sep 30 07:33:55 crc kubenswrapper[4760]: I0930 07:33:55.170100 4760 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" path="/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes" Sep 30 07:33:55 crc kubenswrapper[4760]: I0930 07:33:55.175964 4760 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49ef4625-1d3a-4a9f-b595-c2433d32326d" path="/var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/volumes" Sep 30 07:33:55 crc kubenswrapper[4760]: I0930 07:33:55.176552 4760 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4bb40260-dbaa-4fb0-84df-5e680505d512" path="/var/lib/kubelet/pods/4bb40260-dbaa-4fb0-84df-5e680505d512/volumes" Sep 30 07:33:55 crc kubenswrapper[4760]: I0930 07:33:55.177418 4760 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5225d0e4-402f-4861-b410-819f433b1803" path="/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes" Sep 30 07:33:55 crc kubenswrapper[4760]: I0930 07:33:55.178054 4760 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5441d097-087c-4d9a-baa8-b210afa90fc9" path="/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes" Sep 30 07:33:55 crc kubenswrapper[4760]: I0930 07:33:55.179121 4760 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="57a731c4-ef35-47a8-b875-bfb08a7f8011" 
path="/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes" Sep 30 07:33:55 crc kubenswrapper[4760]: I0930 07:33:55.180105 4760 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5b88f790-22fa-440e-b583-365168c0b23d" path="/var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/volumes" Sep 30 07:33:55 crc kubenswrapper[4760]: I0930 07:33:55.181595 4760 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5fe579f8-e8a6-4643-bce5-a661393c4dde" path="/var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/volumes" Sep 30 07:33:55 crc kubenswrapper[4760]: I0930 07:33:55.182060 4760 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6402fda4-df10-493c-b4e5-d0569419652d" path="/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes" Sep 30 07:33:55 crc kubenswrapper[4760]: I0930 07:33:55.182633 4760 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6509e943-70c6-444c-bc41-48a544e36fbd" path="/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes" Sep 30 07:33:55 crc kubenswrapper[4760]: I0930 07:33:55.183590 4760 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6731426b-95fe-49ff-bb5f-40441049fde2" path="/var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/volumes" Sep 30 07:33:55 crc kubenswrapper[4760]: I0930 07:33:55.184087 4760 kubelet_volumes.go:152] "Cleaned up orphaned volume subpath from pod" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volume-subpaths/run-systemd/ovnkube-controller/6" Sep 30 07:33:55 crc kubenswrapper[4760]: I0930 07:33:55.184213 4760 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volumes" Sep 30 07:33:55 crc kubenswrapper[4760]: I0930 07:33:55.185220 4760 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"67985d3d-0af6-4a8b-b45a-623aed2e502e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43b97af91f474a3b43f207d377ca205e459b782e552a32a134a8b46e89fc8d30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d748511816054d5f621bd55e203baea2e6e5ac90dec796364b46224308cfd78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c
4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bf1b2ce86bf47ce917a20b48f64b7c4419e3a5c551d006d97171311c3b190f0c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e7c9b2cd9f096d99a8b239c0c2dc771e94e6522a3380733b7452969c9fe2dfd2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\
\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T07:33:35Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 30 07:33:55 crc kubenswrapper[4760]: I0930 07:33:55.187110 4760 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7539238d-5fe0-46ed-884e-1c3b566537ec" path="/var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes" Sep 30 07:33:55 crc kubenswrapper[4760]: I0930 07:33:55.187781 4760 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7583ce53-e0fe-4a16-9e4d-50516596a136" path="/var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes" Sep 30 07:33:55 crc kubenswrapper[4760]: I0930 07:33:55.188212 4760 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7bb08738-c794-4ee8-9972-3a62ca171029" path="/var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes" Sep 30 07:33:55 crc kubenswrapper[4760]: I0930 07:33:55.190282 4760 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="87cf06ed-a83f-41a7-828d-70653580a8cb" path="/var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes" Sep 30 07:33:55 crc kubenswrapper[4760]: I0930 07:33:55.190945 4760 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" path="/var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes" Sep 30 07:33:55 crc kubenswrapper[4760]: I0930 07:33:55.191832 4760 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="925f1c65-6136-48ba-85aa-3a3b50560753" path="/var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes" Sep 30 07:33:55 crc kubenswrapper[4760]: I0930 07:33:55.192488 4760 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" path="/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/volumes" Sep 30 07:33:55 crc kubenswrapper[4760]: I0930 07:33:55.193507 4760 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9d4552c7-cd75-42dd-8880-30dd377c49a4" path="/var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes" Sep 30 07:33:55 crc kubenswrapper[4760]: I0930 07:33:55.193956 4760 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" path="/var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/volumes" Sep 30 07:33:55 crc kubenswrapper[4760]: I0930 07:33:55.194607 4760 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a31745f5-9847-4afe-82a5-3161cc66ca93" path="/var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes" Sep 30 07:33:55 crc kubenswrapper[4760]: I0930 07:33:55.195614 4760 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" path="/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes" Sep 30 07:33:55 crc kubenswrapper[4760]: I0930 07:33:55.196801 4760 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6312bbd-5731-4ea0-a20f-81d5a57df44a" path="/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/volumes" Sep 30 07:33:55 crc kubenswrapper[4760]: I0930 07:33:55.197478 4760 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" path="/var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes" Sep 30 07:33:55 crc kubenswrapper[4760]: I0930 07:33:55.197687 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 30 07:33:55 crc kubenswrapper[4760]: I0930 07:33:55.198602 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Sep 30 07:33:55 crc kubenswrapper[4760]: I0930 07:33:55.198660 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Sep 30 07:33:55 crc kubenswrapper[4760]: I0930 07:33:55.198738 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-etc-kube\" (UniqueName: 
\"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Sep 30 07:33:55 crc kubenswrapper[4760]: I0930 07:33:55.198814 4760 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") on node \"crc\" DevicePath \"\"" Sep 30 07:33:55 crc kubenswrapper[4760]: I0930 07:33:55.198842 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Sep 30 07:33:55 crc kubenswrapper[4760]: I0930 07:33:55.198880 4760 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") on node \"crc\" DevicePath \"\"" Sep 30 07:33:55 crc kubenswrapper[4760]: I0930 07:33:55.198921 4760 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") on node \"crc\" DevicePath \"\"" Sep 30 07:33:55 crc kubenswrapper[4760]: I0930 07:33:55.198939 4760 reconciler_common.go:293] "Volume detached for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") on node \"crc\" DevicePath \"\"" Sep 30 07:33:55 crc kubenswrapper[4760]: I0930 07:33:55.198950 4760 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") on node \"crc\" DevicePath \"\"" Sep 30 07:33:55 crc 
kubenswrapper[4760]: I0930 07:33:55.198963 4760 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") on node \"crc\" DevicePath \"\"" Sep 30 07:33:55 crc kubenswrapper[4760]: I0930 07:33:55.198975 4760 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Sep 30 07:33:55 crc kubenswrapper[4760]: I0930 07:33:55.198991 4760 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") on node \"crc\" DevicePath \"\"" Sep 30 07:33:55 crc kubenswrapper[4760]: I0930 07:33:55.199003 4760 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") on node \"crc\" DevicePath \"\"" Sep 30 07:33:55 crc kubenswrapper[4760]: I0930 07:33:55.199042 4760 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") on node \"crc\" DevicePath \"\"" Sep 30 07:33:55 crc kubenswrapper[4760]: I0930 07:33:55.199051 4760 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") on node \"crc\" DevicePath \"\"" Sep 30 07:33:55 crc kubenswrapper[4760]: I0930 07:33:55.199060 4760 reconciler_common.go:293] "Volume detached for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Sep 30 07:33:55 crc kubenswrapper[4760]: I0930 07:33:55.199064 4760 kubelet_volumes.go:163] "Cleaned 
up orphaned pod volumes dir" podUID="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" path="/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes" Sep 30 07:33:55 crc kubenswrapper[4760]: I0930 07:33:55.199070 4760 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") on node \"crc\" DevicePath \"\"" Sep 30 07:33:55 crc kubenswrapper[4760]: I0930 07:33:55.199163 4760 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") on node \"crc\" DevicePath \"\"" Sep 30 07:33:55 crc kubenswrapper[4760]: I0930 07:33:55.199174 4760 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") on node \"crc\" DevicePath \"\"" Sep 30 07:33:55 crc kubenswrapper[4760]: I0930 07:33:55.199184 4760 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") on node \"crc\" DevicePath \"\"" Sep 30 07:33:55 crc kubenswrapper[4760]: I0930 07:33:55.199193 4760 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") on node \"crc\" DevicePath \"\"" Sep 30 07:33:55 crc kubenswrapper[4760]: I0930 07:33:55.199203 4760 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") on node \"crc\" DevicePath \"\"" Sep 30 07:33:55 crc kubenswrapper[4760]: I0930 07:33:55.199212 4760 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: 
\"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") on node \"crc\" DevicePath \"\"" Sep 30 07:33:55 crc kubenswrapper[4760]: I0930 07:33:55.199221 4760 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") on node \"crc\" DevicePath \"\"" Sep 30 07:33:55 crc kubenswrapper[4760]: I0930 07:33:55.199230 4760 reconciler_common.go:293] "Volume detached for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Sep 30 07:33:55 crc kubenswrapper[4760]: I0930 07:33:55.199239 4760 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") on node \"crc\" DevicePath \"\"" Sep 30 07:33:55 crc kubenswrapper[4760]: I0930 07:33:55.199248 4760 reconciler_common.go:293] "Volume detached for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") on node \"crc\" DevicePath \"\"" Sep 30 07:33:55 crc kubenswrapper[4760]: I0930 07:33:55.199257 4760 reconciler_common.go:293] "Volume detached for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") on node \"crc\" DevicePath \"\"" Sep 30 07:33:55 crc kubenswrapper[4760]: I0930 07:33:55.199266 4760 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") on node \"crc\" DevicePath \"\"" Sep 30 07:33:55 crc kubenswrapper[4760]: I0930 07:33:55.199277 4760 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") on node \"crc\" DevicePath \"\"" 
Sep 30 07:33:55 crc kubenswrapper[4760]: I0930 07:33:55.199286 4760 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") on node \"crc\" DevicePath \"\"" Sep 30 07:33:55 crc kubenswrapper[4760]: I0930 07:33:55.199295 4760 reconciler_common.go:293] "Volume detached for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") on node \"crc\" DevicePath \"\"" Sep 30 07:33:55 crc kubenswrapper[4760]: I0930 07:33:55.199322 4760 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") on node \"crc\" DevicePath \"\"" Sep 30 07:33:55 crc kubenswrapper[4760]: I0930 07:33:55.199331 4760 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") on node \"crc\" DevicePath \"\"" Sep 30 07:33:55 crc kubenswrapper[4760]: I0930 07:33:55.199340 4760 reconciler_common.go:293] "Volume detached for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") on node \"crc\" DevicePath \"\"" Sep 30 07:33:55 crc kubenswrapper[4760]: I0930 07:33:55.199349 4760 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") on node \"crc\" DevicePath \"\"" Sep 30 07:33:55 crc kubenswrapper[4760]: I0930 07:33:55.199364 4760 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") on node \"crc\" DevicePath \"\"" Sep 30 07:33:55 crc kubenswrapper[4760]: I0930 07:33:55.199375 4760 
reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") on node \"crc\" DevicePath \"\"" Sep 30 07:33:55 crc kubenswrapper[4760]: I0930 07:33:55.199386 4760 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Sep 30 07:33:55 crc kubenswrapper[4760]: I0930 07:33:55.199397 4760 reconciler_common.go:293] "Volume detached for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") on node \"crc\" DevicePath \"\"" Sep 30 07:33:55 crc kubenswrapper[4760]: I0930 07:33:55.199407 4760 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") on node \"crc\" DevicePath \"\"" Sep 30 07:33:55 crc kubenswrapper[4760]: I0930 07:33:55.199418 4760 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") on node \"crc\" DevicePath \"\"" Sep 30 07:33:55 crc kubenswrapper[4760]: I0930 07:33:55.199431 4760 reconciler_common.go:293] "Volume detached for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") on node \"crc\" DevicePath \"\"" Sep 30 07:33:55 crc kubenswrapper[4760]: I0930 07:33:55.199444 4760 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") on node \"crc\" DevicePath \"\"" Sep 30 07:33:55 crc kubenswrapper[4760]: I0930 07:33:55.199457 4760 reconciler_common.go:293] "Volume detached for volume 
\"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") on node \"crc\" DevicePath \"\"" Sep 30 07:33:55 crc kubenswrapper[4760]: I0930 07:33:55.199506 4760 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") on node \"crc\" DevicePath \"\"" Sep 30 07:33:55 crc kubenswrapper[4760]: I0930 07:33:55.199516 4760 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") on node \"crc\" DevicePath \"\"" Sep 30 07:33:55 crc kubenswrapper[4760]: I0930 07:33:55.199525 4760 reconciler_common.go:293] "Volume detached for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") on node \"crc\" DevicePath \"\"" Sep 30 07:33:55 crc kubenswrapper[4760]: I0930 07:33:55.199535 4760 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") on node \"crc\" DevicePath \"\"" Sep 30 07:33:55 crc kubenswrapper[4760]: I0930 07:33:55.199546 4760 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") on node \"crc\" DevicePath \"\"" Sep 30 07:33:55 crc kubenswrapper[4760]: I0930 07:33:55.199557 4760 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Sep 30 07:33:55 crc kubenswrapper[4760]: I0930 07:33:55.199568 4760 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d4lsv\" (UniqueName: 
\"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") on node \"crc\" DevicePath \"\"" Sep 30 07:33:55 crc kubenswrapper[4760]: I0930 07:33:55.199578 4760 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") on node \"crc\" DevicePath \"\"" Sep 30 07:33:55 crc kubenswrapper[4760]: I0930 07:33:55.199589 4760 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") on node \"crc\" DevicePath \"\"" Sep 30 07:33:55 crc kubenswrapper[4760]: I0930 07:33:55.199599 4760 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") on node \"crc\" DevicePath \"\"" Sep 30 07:33:55 crc kubenswrapper[4760]: I0930 07:33:55.199612 4760 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") on node \"crc\" DevicePath \"\"" Sep 30 07:33:55 crc kubenswrapper[4760]: I0930 07:33:55.199625 4760 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") on node \"crc\" DevicePath \"\"" Sep 30 07:33:55 crc kubenswrapper[4760]: I0930 07:33:55.199639 4760 reconciler_common.go:293] "Volume detached for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") on node \"crc\" DevicePath \"\"" Sep 30 07:33:55 crc kubenswrapper[4760]: I0930 07:33:55.199651 4760 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") on 
node \"crc\" DevicePath \"\"" Sep 30 07:33:55 crc kubenswrapper[4760]: I0930 07:33:55.199663 4760 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") on node \"crc\" DevicePath \"\"" Sep 30 07:33:55 crc kubenswrapper[4760]: I0930 07:33:55.199672 4760 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Sep 30 07:33:55 crc kubenswrapper[4760]: I0930 07:33:55.199681 4760 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") on node \"crc\" DevicePath \"\"" Sep 30 07:33:55 crc kubenswrapper[4760]: I0930 07:33:55.199692 4760 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") on node \"crc\" DevicePath \"\"" Sep 30 07:33:55 crc kubenswrapper[4760]: I0930 07:33:55.199702 4760 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Sep 30 07:33:55 crc kubenswrapper[4760]: I0930 07:33:55.199714 4760 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") on node \"crc\" DevicePath \"\"" Sep 30 07:33:55 crc kubenswrapper[4760]: I0930 07:33:55.199745 4760 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") on node \"crc\" DevicePath \"\"" Sep 30 07:33:55 
crc kubenswrapper[4760]: I0930 07:33:55.199754 4760 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") on node \"crc\" DevicePath \"\"" Sep 30 07:33:55 crc kubenswrapper[4760]: I0930 07:33:55.199763 4760 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") on node \"crc\" DevicePath \"\"" Sep 30 07:33:55 crc kubenswrapper[4760]: I0930 07:33:55.199774 4760 reconciler_common.go:293] "Volume detached for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") on node \"crc\" DevicePath \"\"" Sep 30 07:33:55 crc kubenswrapper[4760]: I0930 07:33:55.199783 4760 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") on node \"crc\" DevicePath \"\"" Sep 30 07:33:55 crc kubenswrapper[4760]: I0930 07:33:55.199793 4760 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Sep 30 07:33:55 crc kubenswrapper[4760]: I0930 07:33:55.199803 4760 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") on node \"crc\" DevicePath \"\"" Sep 30 07:33:55 crc kubenswrapper[4760]: I0930 07:33:55.199811 4760 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") on node \"crc\" DevicePath \"\"" Sep 30 07:33:55 crc kubenswrapper[4760]: I0930 07:33:55.199822 4760 reconciler_common.go:293] "Volume detached for volume 
\"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") on node \"crc\" DevicePath \"\"" Sep 30 07:33:55 crc kubenswrapper[4760]: I0930 07:33:55.199845 4760 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") on node \"crc\" DevicePath \"\"" Sep 30 07:33:55 crc kubenswrapper[4760]: I0930 07:33:55.199858 4760 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") on node \"crc\" DevicePath \"\"" Sep 30 07:33:55 crc kubenswrapper[4760]: I0930 07:33:55.199866 4760 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Sep 30 07:33:55 crc kubenswrapper[4760]: I0930 07:33:55.199874 4760 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") on node \"crc\" DevicePath \"\"" Sep 30 07:33:55 crc kubenswrapper[4760]: I0930 07:33:55.199883 4760 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") on node \"crc\" DevicePath \"\"" Sep 30 07:33:55 crc kubenswrapper[4760]: I0930 07:33:55.199891 4760 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") on node \"crc\" DevicePath \"\"" Sep 30 07:33:55 crc kubenswrapper[4760]: I0930 07:33:55.199899 4760 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") on node \"crc\" DevicePath \"\"" Sep 30 07:33:55 crc kubenswrapper[4760]: I0930 07:33:55.199909 4760 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") on node \"crc\" DevicePath \"\"" Sep 30 07:33:55 crc kubenswrapper[4760]: I0930 07:33:55.199921 4760 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Sep 30 07:33:55 crc kubenswrapper[4760]: I0930 07:33:55.199933 4760 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") on node \"crc\" DevicePath \"\"" Sep 30 07:33:55 crc kubenswrapper[4760]: I0930 07:33:55.199944 4760 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") on node \"crc\" DevicePath \"\"" Sep 30 07:33:55 crc kubenswrapper[4760]: I0930 07:33:55.199956 4760 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") on node \"crc\" DevicePath \"\"" Sep 30 07:33:55 crc kubenswrapper[4760]: I0930 07:33:55.199968 4760 reconciler_common.go:293] "Volume detached for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") on node \"crc\" DevicePath \"\"" Sep 30 07:33:55 crc kubenswrapper[4760]: I0930 07:33:55.199979 4760 reconciler_common.go:293] "Volume detached for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") on node 
\"crc\" DevicePath \"\"" Sep 30 07:33:55 crc kubenswrapper[4760]: I0930 07:33:55.199990 4760 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") on node \"crc\" DevicePath \"\"" Sep 30 07:33:55 crc kubenswrapper[4760]: I0930 07:33:55.200001 4760 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") on node \"crc\" DevicePath \"\"" Sep 30 07:33:55 crc kubenswrapper[4760]: I0930 07:33:55.200013 4760 reconciler_common.go:293] "Volume detached for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") on node \"crc\" DevicePath \"\"" Sep 30 07:33:55 crc kubenswrapper[4760]: I0930 07:33:55.200024 4760 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") on node \"crc\" DevicePath \"\"" Sep 30 07:33:55 crc kubenswrapper[4760]: I0930 07:33:55.200032 4760 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" path="/var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/volumes" Sep 30 07:33:55 crc kubenswrapper[4760]: I0930 07:33:55.200038 4760 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") on node \"crc\" DevicePath \"\"" Sep 30 07:33:55 crc kubenswrapper[4760]: I0930 07:33:55.200130 4760 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") on node \"crc\" DevicePath \"\"" Sep 30 07:33:55 crc kubenswrapper[4760]: I0930 07:33:55.200144 4760 reconciler_common.go:293] 
"Volume detached for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") on node \"crc\" DevicePath \"\"" Sep 30 07:33:55 crc kubenswrapper[4760]: I0930 07:33:55.200159 4760 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 07:33:55 crc kubenswrapper[4760]: I0930 07:33:55.200171 4760 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Sep 30 07:33:55 crc kubenswrapper[4760]: I0930 07:33:55.200184 4760 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") on node \"crc\" DevicePath \"\"" Sep 30 07:33:55 crc kubenswrapper[4760]: I0930 07:33:55.200196 4760 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") on node \"crc\" DevicePath \"\"" Sep 30 07:33:55 crc kubenswrapper[4760]: I0930 07:33:55.200207 4760 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") on node \"crc\" DevicePath \"\"" Sep 30 07:33:55 crc kubenswrapper[4760]: I0930 07:33:55.200220 4760 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") on node \"crc\" DevicePath \"\"" Sep 30 07:33:55 crc kubenswrapper[4760]: I0930 07:33:55.200232 4760 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: 
\"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Sep 30 07:33:55 crc kubenswrapper[4760]: I0930 07:33:55.200244 4760 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") on node \"crc\" DevicePath \"\"" Sep 30 07:33:55 crc kubenswrapper[4760]: I0930 07:33:55.200257 4760 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Sep 30 07:33:55 crc kubenswrapper[4760]: I0930 07:33:55.200270 4760 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") on node \"crc\" DevicePath \"\"" Sep 30 07:33:55 crc kubenswrapper[4760]: I0930 07:33:55.200280 4760 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Sep 30 07:33:55 crc kubenswrapper[4760]: I0930 07:33:55.200294 4760 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Sep 30 07:33:55 crc kubenswrapper[4760]: I0930 07:33:55.200328 4760 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") on node \"crc\" DevicePath \"\"" Sep 30 07:33:55 crc kubenswrapper[4760]: I0930 07:33:55.201315 4760 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bf126b07-da06-4140-9a57-dfd54fc6b486" path="/var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes" Sep 
30 07:33:55 crc kubenswrapper[4760]: I0930 07:33:55.201815 4760 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c03ee662-fb2f-4fc4-a2c1-af487c19d254" path="/var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes" Sep 30 07:33:55 crc kubenswrapper[4760]: I0930 07:33:55.202679 4760 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" path="/var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/volumes" Sep 30 07:33:55 crc kubenswrapper[4760]: I0930 07:33:55.203193 4760 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e7e6199b-1264-4501-8953-767f51328d08" path="/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes" Sep 30 07:33:55 crc kubenswrapper[4760]: I0930 07:33:55.203774 4760 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="efdd0498-1daa-4136-9a4a-3b948c2293fc" path="/var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/volumes" Sep 30 07:33:55 crc kubenswrapper[4760]: I0930 07:33:55.204757 4760 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" path="/var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/volumes" Sep 30 07:33:55 crc kubenswrapper[4760]: I0930 07:33:55.205292 4760 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fda69060-fa79-4696-b1a6-7980f124bf7c" path="/var/lib/kubelet/pods/fda69060-fa79-4696-b1a6-7980f124bf7c/volumes" Sep 30 07:33:55 crc kubenswrapper[4760]: I0930 07:33:55.211180 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:53Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 30 07:33:55 crc kubenswrapper[4760]: I0930 07:33:55.212717 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Sep 30 07:33:55 crc kubenswrapper[4760]: I0930 07:33:55.223752 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 30 07:33:55 crc kubenswrapper[4760]: I0930 07:33:55.227193 4760 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Sep 30 07:33:55 crc kubenswrapper[4760]: W0930 07:33:55.228159 4760 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod37a5e44f_9a88_4405_be8a_b645485e7312.slice/crio-67d1a9e137c5dd4b9aa873fa3adc3ef3f84e1ffd9a33557ad5b59cb3ce7831f4 WatchSource:0}: Error finding container 67d1a9e137c5dd4b9aa873fa3adc3ef3f84e1ffd9a33557ad5b59cb3ce7831f4: Status 404 returned error can't find the container with id 67d1a9e137c5dd4b9aa873fa3adc3ef3f84e1ffd9a33557ad5b59cb3ce7831f4 Sep 30 07:33:55 crc kubenswrapper[4760]: I0930 07:33:55.239966 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 30 07:33:55 crc kubenswrapper[4760]: I0930 07:33:55.241566 4760 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Sep 30 07:33:55 crc kubenswrapper[4760]: I0930 07:33:55.274642 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"031672b4-4779-4226-97aa-f5c81a729234\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b3e26d92b5602604d33e83e83e677084c642df64c9d0b9f9e022b036091fb92e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log
/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://53390cb3cc86d0db1019c461bf6cb987190e6fa8812f735844028dfa1ce96648\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf87f93832356c96c1c5e90e6459ccdd23700fd12c6d7410305c66f99f6ed18f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e4230b2df5bf87138f5a40f146bef562ae63ec6f5b73d023e4e2b60722854882\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@
sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9c416bd4523f4384b4a2283eb2ada263982380925cbc17713726e0790e5fdee\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:38Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2cd9c18868b27fcfb67186a86f415b484f009c33c07d18f5d508155ed011b5d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2cd9c18868b27fcfb67186a86f415b484f009c33c07d18f5d508155ed011b5d4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T07:33:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\
"startedAt\\\":\\\"2025-09-30T07:33:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T07:33:35Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 30 07:33:55 crc kubenswrapper[4760]: I0930 07:33:55.304343 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 30 07:33:55 crc kubenswrapper[4760]: I0930 07:33:55.323348 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 30 07:33:55 crc kubenswrapper[4760]: I0930 07:33:55.339442 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"031672b4-4779-4226-97aa-f5c81a729234\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b3e26d92b5602604d33e83e83e677084c642df64c9d0b9f9e022b036091fb92e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://53390cb3cc86d0db1019c461bf6cb987190e6fa8812f735844028dfa1ce96648\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf87f93832356c96c1c5e90e6459ccdd23700fd12c6d7410305c66f99f6ed18f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e4230b2df5bf87138f5a40f146bef562ae63ec6f5b73d023e4e2b60722854882\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:3
8Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9c416bd4523f4384b4a2283eb2ada263982380925cbc17713726e0790e5fdee\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:38Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2cd9c18868b27fcfb67186a86f415b484f009c33c07d18f5d508155ed011b5d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2cd9c18868b27fcfb67186a86f415b484f009c33c07d18f5d508155ed011b5d4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T07:33:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T07:33:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startT
ime\\\":\\\"2025-09-30T07:33:35Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 30 07:33:55 crc kubenswrapper[4760]: I0930 07:33:55.360937 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"67985d3d-0af6-4a8b-b45a-623aed2e502e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43b97af91f474a3b43f207d377ca205e459b782e552a32a134a8b46e89fc8d30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2
025-09-30T07:33:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d748511816054d5f621bd55e203baea2e6e5ac90dec796364b46224308cfd78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bf1b2ce86bf47ce917a20b48f64b7c4419e3a5c551d006d97171311c3b190f0c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e7c9b2cd9f096d99a8b239c0c2dc771e94e6522a3380733b7
452969c9fe2dfd2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T07:33:35Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 30 07:33:55 crc kubenswrapper[4760]: I0930 07:33:55.426788 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 30 07:33:55 crc kubenswrapper[4760]: I0930 07:33:55.442421 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 30 07:33:55 crc kubenswrapper[4760]: I0930 07:33:55.454567 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:53Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: 
connect: connection refused" Sep 30 07:33:55 crc kubenswrapper[4760]: I0930 07:33:55.472311 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:53Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 30 07:33:55 crc kubenswrapper[4760]: I0930 07:33:55.486542 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 30 07:33:55 crc kubenswrapper[4760]: I0930 07:33:55.603898 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 07:33:55 crc kubenswrapper[4760]: I0930 07:33:55.603968 4760 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 07:33:55 crc kubenswrapper[4760]: I0930 07:33:55.604002 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 07:33:55 crc kubenswrapper[4760]: E0930 07:33:55.604078 4760 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Sep 30 07:33:55 crc kubenswrapper[4760]: E0930 07:33:55.604126 4760 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-09-30 07:33:56.604114238 +0000 UTC m=+22.247020650 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Sep 30 07:33:55 crc kubenswrapper[4760]: E0930 07:33:55.604197 4760 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Sep 30 07:33:55 crc kubenswrapper[4760]: E0930 07:33:55.604229 4760 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-09-30 07:33:56.604222911 +0000 UTC m=+22.247129323 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Sep 30 07:33:55 crc kubenswrapper[4760]: E0930 07:33:55.604348 4760 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 07:33:56.604287603 +0000 UTC m=+22.247194015 (durationBeforeRetry 1s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 07:33:55 crc kubenswrapper[4760]: I0930 07:33:55.704911 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 07:33:55 crc kubenswrapper[4760]: I0930 07:33:55.705016 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 07:33:55 crc kubenswrapper[4760]: E0930 07:33:55.705147 4760 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Sep 30 07:33:55 crc kubenswrapper[4760]: E0930 07:33:55.705198 4760 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Sep 30 07:33:55 crc kubenswrapper[4760]: E0930 07:33:55.705214 4760 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod 
openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Sep 30 07:33:55 crc kubenswrapper[4760]: E0930 07:33:55.705221 4760 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Sep 30 07:33:55 crc kubenswrapper[4760]: E0930 07:33:55.705244 4760 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Sep 30 07:33:55 crc kubenswrapper[4760]: E0930 07:33:55.705259 4760 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Sep 30 07:33:55 crc kubenswrapper[4760]: E0930 07:33:55.705331 4760 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-09-30 07:33:56.705282438 +0000 UTC m=+22.348188870 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Sep 30 07:33:55 crc kubenswrapper[4760]: E0930 07:33:55.705361 4760 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-09-30 07:33:56.70535018 +0000 UTC m=+22.348256602 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Sep 30 07:33:55 crc kubenswrapper[4760]: I0930 07:33:55.872132 4760 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-etcd/etcd-crc" Sep 30 07:33:55 crc kubenswrapper[4760]: I0930 07:33:55.885601 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:53Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 30 07:33:55 crc kubenswrapper[4760]: I0930 07:33:55.891296 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-etcd/etcd-crc" Sep 30 07:33:55 crc kubenswrapper[4760]: I0930 07:33:55.906287 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:53Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 30 07:33:55 crc kubenswrapper[4760]: I0930 07:33:55.917351 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:53Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: 
connect: connection refused" Sep 30 07:33:55 crc kubenswrapper[4760]: I0930 07:33:55.926833 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 30 07:33:55 crc kubenswrapper[4760]: I0930 07:33:55.938699 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"031672b4-4779-4226-97aa-f5c81a729234\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b3e26d92b5602604d33e83e83e677084c642df64c9d0b9f9e022b036091fb92e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://53390cb3cc86d0db1019c461bf6cb987190e6fa8812f735844028dfa1ce96648\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf87f93832356c96c1c5e90e6459ccdd23700fd12c6d7410305c66f99f6ed18f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e4230b2df5bf87138f5a40f146bef562ae63ec6f5b73d023e4e2b60722854882\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:3
8Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9c416bd4523f4384b4a2283eb2ada263982380925cbc17713726e0790e5fdee\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:38Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2cd9c18868b27fcfb67186a86f415b484f009c33c07d18f5d508155ed011b5d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2cd9c18868b27fcfb67186a86f415b484f009c33c07d18f5d508155ed011b5d4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T07:33:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T07:33:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startT
ime\\\":\\\"2025-09-30T07:33:35Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 30 07:33:55 crc kubenswrapper[4760]: I0930 07:33:55.948579 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd/etcd-crc"] Sep 30 07:33:55 crc kubenswrapper[4760]: I0930 07:33:55.952487 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"67985d3d-0af6-4a8b-b45a-623aed2e502e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43b97af91f474a3b43f207d377ca205e459b782e552a32a134a8b46e89fc8d30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cl
uster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d748511816054d5f621bd55e203baea2e6e5ac90dec796364b46224308cfd78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bf1b2ce86bf47ce917a20b48f64b7c4419e3a5c551d006d97171311c3b190f0c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/k
ubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e7c9b2cd9f096d99a8b239c0c2dc771e94e6522a3380733b7452969c9fe2dfd2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T07:33:35Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 30 07:33:55 crc kubenswrapper[4760]: I0930 07:33:55.965444 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 30 07:33:55 crc kubenswrapper[4760]: I0930 07:33:55.978213 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 30 07:33:55 crc kubenswrapper[4760]: I0930 07:33:55.987995 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:53Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 30 07:33:55 crc kubenswrapper[4760]: I0930 07:33:55.997769 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 30 07:33:56 crc kubenswrapper[4760]: I0930 07:33:56.019639 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9092fdd9-f142-400d-b446-47b8103b9694\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4cc3328dfae8f6d99c152e6a53b86d72f690afbd7bc4847810f745b321e4f3ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5aaa898c76100f011c3ad0ba61c9735a940b46b0e770f3ede7fc4c881a4722a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b718c4a73fd119f66dd346618976f0b3589b789576e53d367588c2772becd66e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d00b9a56c23ba3b5701a31412e038c073b5f16eb4e6ebea42134f7d4c0b0f21\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://00d03c8ff6fa9f490ec0550ac57d20786f1f5c10d91011d678116c2249a79e88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ed4236004989a888f1f864535bedd2dc4bcad4c51c87f12be074abbe5f8fe8bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ed4236004989a888f1f864535bedd2dc4bcad4c51c87f12be074abbe5f8fe8bd\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-09-30T07:33:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T07:33:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://553fa289f27580c1c38da1e9ea12ae7231fc4f60c660a9a3a177391adba9c3f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://553fa289f27580c1c38da1e9ea12ae7231fc4f60c660a9a3a177391adba9c3f2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T07:33:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T07:33:37Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://83b9ee5e77d60387298ccac46ad416a5d56bfd1d81a6ef7f3ea0482520c2ca08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://83b9ee5e77d60387298ccac46ad416a5d56bfd1d81a6ef7f3ea0482520c2ca08\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T07:33:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T07:33:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T07:33:35Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 30 07:33:56 crc kubenswrapper[4760]: I0930 07:33:56.032958 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"031672b4-4779-4226-97aa-f5c81a729234\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b3e26d92b5602604d33e83e83e677084c642df64c9d0b9f9e022b036091fb92e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4
\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://53390cb3cc86d0db1019c461bf6cb987190e6fa8812f735844028dfa1ce96648\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf87f93832356c96c1c5e90e6459ccdd23700fd12c6d7410305c66f99f6ed18f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"
resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e4230b2df5bf87138f5a40f146bef562ae63ec6f5b73d023e4e2b60722854882\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9c416bd4523f4384b4a2283eb2ada263982380925cbc17713726e0790e5fdee\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:38Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2cd9c18868b27fcfb67186a86f415b484f009c33c07d18f5d508155ed011b5d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release
-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2cd9c18868b27fcfb67186a86f415b484f009c33c07d18f5d508155ed011b5d4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T07:33:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T07:33:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T07:33:35Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 30 07:33:56 crc kubenswrapper[4760]: I0930 07:33:56.042923 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"67985d3d-0af6-4a8b-b45a-623aed2e502e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43b97af91f474a3b43f207d377ca205e459b782e552a32a134a8b46e89fc8d30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d748511816054d5f621bd55e203baea2e6e5ac90dec796364b46224308cfd78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bf1b2ce86bf47ce917a20b48f64b7c4419e3a5c551d006d97171311c3b190f0c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e7c9b2cd9f096d99a8b239c0c2dc771e94e6522a3380733b7452969c9fe2dfd2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-09-30T07:33:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T07:33:35Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 30 07:33:56 crc kubenswrapper[4760]: I0930 07:33:56.052898 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 30 07:33:56 crc kubenswrapper[4760]: I0930 07:33:56.066490 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 07:33:56 crc kubenswrapper[4760]: I0930 07:33:56.066594 4760 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 07:33:56 crc kubenswrapper[4760]: I0930 07:33:56.066617 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 07:33:56 crc kubenswrapper[4760]: E0930 07:33:56.066728 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 07:33:56 crc kubenswrapper[4760]: E0930 07:33:56.066840 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 07:33:56 crc kubenswrapper[4760]: E0930 07:33:56.067067 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 07:33:56 crc kubenswrapper[4760]: I0930 07:33:56.068556 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:53Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:33:56Z is after 2025-08-24T17:21:41Z" Sep 30 07:33:56 crc kubenswrapper[4760]: I0930 07:33:56.085869 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:53Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:33:56Z is after 2025-08-24T17:21:41Z" Sep 30 07:33:56 crc kubenswrapper[4760]: I0930 07:33:56.105601 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:53Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:33:56Z is after 2025-08-24T17:21:41Z" Sep 30 07:33:56 crc kubenswrapper[4760]: I0930 07:33:56.159337 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/node-resolver-sv6wk"] Sep 30 07:33:56 crc kubenswrapper[4760]: I0930 07:33:56.159843 
4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-sv6wk" Sep 30 07:33:56 crc kubenswrapper[4760]: I0930 07:33:56.162321 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Sep 30 07:33:56 crc kubenswrapper[4760]: I0930 07:33:56.162607 4760 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Sep 30 07:33:56 crc kubenswrapper[4760]: I0930 07:33:56.163116 4760 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Sep 30 07:33:56 crc kubenswrapper[4760]: I0930 07:33:56.163492 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-daemon-f2lrk"] Sep 30 07:33:56 crc kubenswrapper[4760]: I0930 07:33:56.164208 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-f2lrk" Sep 30 07:33:56 crc kubenswrapper[4760]: I0930 07:33:56.176258 4760 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Sep 30 07:33:56 crc kubenswrapper[4760]: I0930 07:33:56.176260 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Sep 30 07:33:56 crc kubenswrapper[4760]: I0930 07:33:56.179490 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:53Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:33:56Z is after 2025-08-24T17:21:41Z" Sep 30 07:33:56 crc kubenswrapper[4760]: I0930 07:33:56.180023 4760 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Sep 30 07:33:56 crc kubenswrapper[4760]: I0930 07:33:56.180101 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Sep 30 07:33:56 crc kubenswrapper[4760]: I0930 07:33:56.183572 4760 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Sep 30 07:33:56 crc kubenswrapper[4760]: I0930 07:33:56.209959 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:53Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:33:56Z is after 2025-08-24T17:21:41Z" Sep 30 07:33:56 crc kubenswrapper[4760]: I0930 07:33:56.210227 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/4a8035f8-210d-4a09-bca5-274ced93774c-hosts-file\") pod \"node-resolver-sv6wk\" (UID: \"4a8035f8-210d-4a09-bca5-274ced93774c\") " pod="openshift-dns/node-resolver-sv6wk" Sep 30 07:33:56 crc kubenswrapper[4760]: I0930 07:33:56.210260 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6pk8k\" (UniqueName: \"kubernetes.io/projected/4a8035f8-210d-4a09-bca5-274ced93774c-kube-api-access-6pk8k\") pod \"node-resolver-sv6wk\" (UID: \"4a8035f8-210d-4a09-bca5-274ced93774c\") " pod="openshift-dns/node-resolver-sv6wk" Sep 30 07:33:56 crc kubenswrapper[4760]: I0930 07:33:56.213882 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" 
event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"c5ae239e08e07bda9532c0b6b94ef3687cf512cd05965e09123bbb036f915a5b"} Sep 30 07:33:56 crc kubenswrapper[4760]: I0930 07:33:56.213950 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"3abb25619d0c80e84a2095aba45fe6a865125d242e44e9f10820e8c41a3b0db9"} Sep 30 07:33:56 crc kubenswrapper[4760]: I0930 07:33:56.213968 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"8b3ce2b7979751ce97068f314351d59fa535199149f117921551d618baced10e"} Sep 30 07:33:56 crc kubenswrapper[4760]: I0930 07:33:56.215775 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"c5ac421ed3707970914cf165f54a9324134079296d1b6936332c6047b9c23132"} Sep 30 07:33:56 crc kubenswrapper[4760]: I0930 07:33:56.215839 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"67d1a9e137c5dd4b9aa873fa3adc3ef3f84e1ffd9a33557ad5b59cb3ce7831f4"} Sep 30 07:33:56 crc kubenswrapper[4760]: I0930 07:33:56.217075 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"b51d5b2acba14d0cbc9d5db92c4ca7d9c748e4a7dc18986ccdce76abd55378dd"} Sep 30 07:33:56 crc kubenswrapper[4760]: E0930 07:33:56.241805 4760 kubelet.go:1929] "Failed creating a mirror pod for" err="pods \"etcd-crc\" already exists" 
pod="openshift-etcd/etcd-crc" Sep 30 07:33:56 crc kubenswrapper[4760]: I0930 07:33:56.244529 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:53Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify 
certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:33:56Z is after 2025-08-24T17:21:41Z" Sep 30 07:33:56 crc kubenswrapper[4760]: I0930 07:33:56.258018 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:33:56Z is after 2025-08-24T17:21:41Z" Sep 30 07:33:56 crc kubenswrapper[4760]: I0930 07:33:56.280746 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9092fdd9-f142-400d-b446-47b8103b9694\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4cc3328dfae8f6d99c152e6a53b86d72f690afbd7bc4847810f745b321e4f3ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5aaa898c76100f011c3ad0ba61c9735a940b46b0e770f3ede7fc4c881a4722a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b718c4a73fd119f66dd346618976f0b3589b789576e53d367588c2772becd66e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d00b9a56c23ba3b5701a31412e038c073b5f16eb4e6ebea42134f7d4c0b0f21\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://00d03c8ff6fa9f490ec0550ac57d20786f1f5c10d91011d678116c2249a79e88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ed4236004989a888f1f864535bedd2dc4bcad4c51c87f12be074abbe5f8fe8bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ed4236004989a888f1f864535bedd2dc4bcad4c51c87f12be074abbe5f8fe8bd\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-09-30T07:33:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T07:33:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://553fa289f27580c1c38da1e9ea12ae7231fc4f60c660a9a3a177391adba9c3f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://553fa289f27580c1c38da1e9ea12ae7231fc4f60c660a9a3a177391adba9c3f2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T07:33:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T07:33:37Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://83b9ee5e77d60387298ccac46ad416a5d56bfd1d81a6ef7f3ea0482520c2ca08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://83b9ee5e77d60387298ccac46ad416a5d56bfd1d81a6ef7f3ea0482520c2ca08\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T07:33:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T07:33:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T07:33:35Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:33:56Z is after 2025-08-24T17:21:41Z" Sep 30 07:33:56 crc kubenswrapper[4760]: I0930 07:33:56.299558 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"031672b4-4779-4226-97aa-f5c81a729234\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b3e26d92b5602604d33e83e83e677084c642df64c9d0b9f9e022b036091fb92e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-rel
ease-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://53390cb3cc86d0db1019c461bf6cb987190e6fa8812f735844028dfa1ce96648\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf87f93832356c96c1c5e90e6459ccdd23700fd12c6d7410305c66f99f6ed18f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:37Z\\\"}},\\\"vo
lumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e4230b2df5bf87138f5a40f146bef562ae63ec6f5b73d023e4e2b60722854882\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9c416bd4523f4384b4a2283eb2ada263982380925cbc17713726e0790e5fdee\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:38Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2cd9c18868b27fcfb67186a86f415b484f009c33c07d18f5d508155ed011b5d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee12
20d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2cd9c18868b27fcfb67186a86f415b484f009c33c07d18f5d508155ed011b5d4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T07:33:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T07:33:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T07:33:35Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:33:56Z is after 2025-08-24T17:21:41Z" Sep 30 07:33:56 crc kubenswrapper[4760]: I0930 07:33:56.310793 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"67985d3d-0af6-4a8b-b45a-623aed2e502e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43b97af91f474a3b43f207d377ca205e459b782e552a32a134a8b46e89fc8d30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d748511816054d5f621bd55e203baea2e6e5ac90dec796364b46224308cfd78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bf1b2ce86bf47ce917a20b48f64b7c4419e3a5c551d006d97171311c3b190f0c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e7c9b2cd9f096d99a8b239c0c2dc771e94e6522a3380733b7452969c9fe2dfd2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-09-30T07:33:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T07:33:35Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:33:56Z is after 2025-08-24T17:21:41Z" Sep 30 07:33:56 crc kubenswrapper[4760]: I0930 07:33:56.311113 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/7a9c8270-6964-4886-87d0-227b05b76da4-proxy-tls\") pod \"machine-config-daemon-f2lrk\" (UID: \"7a9c8270-6964-4886-87d0-227b05b76da4\") " pod="openshift-machine-config-operator/machine-config-daemon-f2lrk" Sep 30 07:33:56 crc kubenswrapper[4760]: I0930 07:33:56.311171 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/4a8035f8-210d-4a09-bca5-274ced93774c-hosts-file\") pod \"node-resolver-sv6wk\" (UID: \"4a8035f8-210d-4a09-bca5-274ced93774c\") " pod="openshift-dns/node-resolver-sv6wk" Sep 30 07:33:56 crc kubenswrapper[4760]: I0930 07:33:56.311218 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/7a9c8270-6964-4886-87d0-227b05b76da4-rootfs\") pod \"machine-config-daemon-f2lrk\" (UID: 
\"7a9c8270-6964-4886-87d0-227b05b76da4\") " pod="openshift-machine-config-operator/machine-config-daemon-f2lrk" Sep 30 07:33:56 crc kubenswrapper[4760]: I0930 07:33:56.311248 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6pk8k\" (UniqueName: \"kubernetes.io/projected/4a8035f8-210d-4a09-bca5-274ced93774c-kube-api-access-6pk8k\") pod \"node-resolver-sv6wk\" (UID: \"4a8035f8-210d-4a09-bca5-274ced93774c\") " pod="openshift-dns/node-resolver-sv6wk" Sep 30 07:33:56 crc kubenswrapper[4760]: I0930 07:33:56.311310 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/4a8035f8-210d-4a09-bca5-274ced93774c-hosts-file\") pod \"node-resolver-sv6wk\" (UID: \"4a8035f8-210d-4a09-bca5-274ced93774c\") " pod="openshift-dns/node-resolver-sv6wk" Sep 30 07:33:56 crc kubenswrapper[4760]: I0930 07:33:56.311345 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/7a9c8270-6964-4886-87d0-227b05b76da4-mcd-auth-proxy-config\") pod \"machine-config-daemon-f2lrk\" (UID: \"7a9c8270-6964-4886-87d0-227b05b76da4\") " pod="openshift-machine-config-operator/machine-config-daemon-f2lrk" Sep 30 07:33:56 crc kubenswrapper[4760]: I0930 07:33:56.311373 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cctgt\" (UniqueName: \"kubernetes.io/projected/7a9c8270-6964-4886-87d0-227b05b76da4-kube-api-access-cctgt\") pod \"machine-config-daemon-f2lrk\" (UID: \"7a9c8270-6964-4886-87d0-227b05b76da4\") " pod="openshift-machine-config-operator/machine-config-daemon-f2lrk" Sep 30 07:33:56 crc kubenswrapper[4760]: I0930 07:33:56.324589 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:33:56Z is after 2025-08-24T17:21:41Z" Sep 30 07:33:56 crc kubenswrapper[4760]: I0930 07:33:56.330278 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6pk8k\" (UniqueName: \"kubernetes.io/projected/4a8035f8-210d-4a09-bca5-274ced93774c-kube-api-access-6pk8k\") pod \"node-resolver-sv6wk\" (UID: \"4a8035f8-210d-4a09-bca5-274ced93774c\") " pod="openshift-dns/node-resolver-sv6wk" Sep 30 07:33:56 crc kubenswrapper[4760]: I0930 07:33:56.336029 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:53Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:33:56Z is after 2025-08-24T17:21:41Z" Sep 30 07:33:56 crc kubenswrapper[4760]: I0930 07:33:56.347764 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-sv6wk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4a8035f8-210d-4a09-bca5-274ced93774c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:56Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:56Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6pk8k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T07:33:56Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-sv6wk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:33:56Z is after 2025-08-24T17:21:41Z" Sep 30 07:33:56 crc kubenswrapper[4760]: I0930 07:33:56.360810 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c5ae239e08e07bda9532c0b6b94ef3687cf512cd05965e09123bbb036f915a5b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3abb25619d0c80e84a2095aba45fe6a865125d242e44e9f10820e8c41a3b0db9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:33:56Z is after 2025-08-24T17:21:41Z" Sep 30 07:33:56 crc kubenswrapper[4760]: I0930 07:33:56.374036 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:33:56Z is after 2025-08-24T17:21:41Z" Sep 30 07:33:56 crc kubenswrapper[4760]: I0930 07:33:56.395770 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9092fdd9-f142-400d-b446-47b8103b9694\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4cc3328dfae8f6d99c152e6a53b86d72f690afbd7bc4847810f745b321e4f3ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5aaa898c76100f011c3ad0ba61c9735a940b46b0e770f3ede7fc4c881a4722a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b718c4a73fd119f66dd346618976f0b3589b789576e53d367588c2772becd66e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d00b9a56c23ba3b5701a31412e038c073b5f16eb4e6ebea42134f7d4c0b0f21\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://00d03c8ff6fa9f490ec0550ac57d20786f1f5c10d91011d678116c2249a79e88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ed4236004989a888f1f864535bedd2dc4bcad4c51c87f12be074abbe5f8fe8bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ed4236004989a888f1f864535bedd2dc4bcad4c51c87f12be074abbe5f8fe8bd\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-09-30T07:33:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T07:33:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://553fa289f27580c1c38da1e9ea12ae7231fc4f60c660a9a3a177391adba9c3f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://553fa289f27580c1c38da1e9ea12ae7231fc4f60c660a9a3a177391adba9c3f2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T07:33:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T07:33:37Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://83b9ee5e77d60387298ccac46ad416a5d56bfd1d81a6ef7f3ea0482520c2ca08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://83b9ee5e77d60387298ccac46ad416a5d56bfd1d81a6ef7f3ea0482520c2ca08\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T07:33:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T07:33:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T07:33:35Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:33:56Z is after 2025-08-24T17:21:41Z" Sep 30 07:33:56 crc kubenswrapper[4760]: I0930 07:33:56.409749 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"031672b4-4779-4226-97aa-f5c81a729234\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b3e26d92b5602604d33e83e83e677084c642df64c9d0b9f9e022b036091fb92e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-rel
ease-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://53390cb3cc86d0db1019c461bf6cb987190e6fa8812f735844028dfa1ce96648\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf87f93832356c96c1c5e90e6459ccdd23700fd12c6d7410305c66f99f6ed18f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:37Z\\\"}},\\\"vo
lumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e4230b2df5bf87138f5a40f146bef562ae63ec6f5b73d023e4e2b60722854882\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9c416bd4523f4384b4a2283eb2ada263982380925cbc17713726e0790e5fdee\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:38Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2cd9c18868b27fcfb67186a86f415b484f009c33c07d18f5d508155ed011b5d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee12
20d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2cd9c18868b27fcfb67186a86f415b484f009c33c07d18f5d508155ed011b5d4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T07:33:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T07:33:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T07:33:35Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:33:56Z is after 2025-08-24T17:21:41Z" Sep 30 07:33:56 crc kubenswrapper[4760]: I0930 07:33:56.412208 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/7a9c8270-6964-4886-87d0-227b05b76da4-rootfs\") pod \"machine-config-daemon-f2lrk\" (UID: \"7a9c8270-6964-4886-87d0-227b05b76da4\") " pod="openshift-machine-config-operator/machine-config-daemon-f2lrk" Sep 30 07:33:56 crc kubenswrapper[4760]: I0930 07:33:56.412379 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/7a9c8270-6964-4886-87d0-227b05b76da4-rootfs\") pod \"machine-config-daemon-f2lrk\" (UID: \"7a9c8270-6964-4886-87d0-227b05b76da4\") " 
pod="openshift-machine-config-operator/machine-config-daemon-f2lrk" Sep 30 07:33:56 crc kubenswrapper[4760]: I0930 07:33:56.412415 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/7a9c8270-6964-4886-87d0-227b05b76da4-mcd-auth-proxy-config\") pod \"machine-config-daemon-f2lrk\" (UID: \"7a9c8270-6964-4886-87d0-227b05b76da4\") " pod="openshift-machine-config-operator/machine-config-daemon-f2lrk" Sep 30 07:33:56 crc kubenswrapper[4760]: I0930 07:33:56.412486 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cctgt\" (UniqueName: \"kubernetes.io/projected/7a9c8270-6964-4886-87d0-227b05b76da4-kube-api-access-cctgt\") pod \"machine-config-daemon-f2lrk\" (UID: \"7a9c8270-6964-4886-87d0-227b05b76da4\") " pod="openshift-machine-config-operator/machine-config-daemon-f2lrk" Sep 30 07:33:56 crc kubenswrapper[4760]: I0930 07:33:56.412578 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/7a9c8270-6964-4886-87d0-227b05b76da4-proxy-tls\") pod \"machine-config-daemon-f2lrk\" (UID: \"7a9c8270-6964-4886-87d0-227b05b76da4\") " pod="openshift-machine-config-operator/machine-config-daemon-f2lrk" Sep 30 07:33:56 crc kubenswrapper[4760]: I0930 07:33:56.413356 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/7a9c8270-6964-4886-87d0-227b05b76da4-mcd-auth-proxy-config\") pod \"machine-config-daemon-f2lrk\" (UID: \"7a9c8270-6964-4886-87d0-227b05b76da4\") " pod="openshift-machine-config-operator/machine-config-daemon-f2lrk" Sep 30 07:33:56 crc kubenswrapper[4760]: I0930 07:33:56.417124 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/7a9c8270-6964-4886-87d0-227b05b76da4-proxy-tls\") pod 
\"machine-config-daemon-f2lrk\" (UID: \"7a9c8270-6964-4886-87d0-227b05b76da4\") " pod="openshift-machine-config-operator/machine-config-daemon-f2lrk" Sep 30 07:33:56 crc kubenswrapper[4760]: I0930 07:33:56.421514 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"67985d3d-0af6-4a8b-b45a-623aed2e502e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43b97af91f474a3b43f207d377ca205e459b782e552a32a134a8b46e89fc8d30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/
etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d748511816054d5f621bd55e203baea2e6e5ac90dec796364b46224308cfd78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bf1b2ce86bf47ce917a20b48f64b7c4419e3a5c551d006d97171311c3b190f0c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e7c9b2cd9f096d99a8b239c0c2dc771e94e6522a3380733b7452969c9fe2dfd2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf
5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T07:33:35Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:33:56Z is after 2025-08-24T17:21:41Z" Sep 30 07:33:56 crc kubenswrapper[4760]: I0930 07:33:56.435553 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cctgt\" (UniqueName: \"kubernetes.io/projected/7a9c8270-6964-4886-87d0-227b05b76da4-kube-api-access-cctgt\") pod \"machine-config-daemon-f2lrk\" (UID: \"7a9c8270-6964-4886-87d0-227b05b76da4\") " pod="openshift-machine-config-operator/machine-config-daemon-f2lrk" Sep 30 07:33:56 crc kubenswrapper[4760]: I0930 07:33:56.437569 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c5ac421ed3707970914cf165f54a9324134079296d1b6936332c6047b9c23132\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-09-30T07:33:56Z is after 2025-08-24T17:21:41Z" Sep 30 07:33:56 crc kubenswrapper[4760]: I0930 07:33:56.451122 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:53Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:33:56Z is after 2025-08-24T17:21:41Z" Sep 30 07:33:56 crc kubenswrapper[4760]: I0930 07:33:56.466388 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:33:56Z is after 2025-08-24T17:21:41Z" Sep 30 07:33:56 crc kubenswrapper[4760]: I0930 07:33:56.477635 4760 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-sv6wk" Sep 30 07:33:56 crc kubenswrapper[4760]: I0930 07:33:56.477687 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-sv6wk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4a8035f8-210d-4a09-bca5-274ced93774c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:56Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:56Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6pk8k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T07:33:56Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-sv6wk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:33:56Z is after 2025-08-24T17:21:41Z" Sep 30 07:33:56 crc kubenswrapper[4760]: I0930 07:33:56.484689 4760 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-f2lrk" Sep 30 07:33:56 crc kubenswrapper[4760]: W0930 07:33:56.502241 4760 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4a8035f8_210d_4a09_bca5_274ced93774c.slice/crio-42febe0a97b0a75d58a1f977cc6321d3ac5f7ceac83c278688a4f8ae98a42115 WatchSource:0}: Error finding container 42febe0a97b0a75d58a1f977cc6321d3ac5f7ceac83c278688a4f8ae98a42115: Status 404 returned error can't find the container with id 42febe0a97b0a75d58a1f977cc6321d3ac5f7ceac83c278688a4f8ae98a42115 Sep 30 07:33:56 crc kubenswrapper[4760]: I0930 07:33:56.512549 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:33:56Z is after 2025-08-24T17:21:41Z" Sep 30 07:33:56 crc kubenswrapper[4760]: I0930 07:33:56.532032 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-f2lrk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7a9c8270-6964-4886-87d0-227b05b76da4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:56Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:56Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cctgt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cctgt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T07:33:56Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-f2lrk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:33:56Z is after 2025-08-24T17:21:41Z" Sep 30 07:33:56 crc kubenswrapper[4760]: I0930 07:33:56.585484 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-sspvl"] Sep 30 07:33:56 crc kubenswrapper[4760]: I0930 07:33:56.586523 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-additional-cni-plugins-6vfjl"] Sep 30 07:33:56 crc kubenswrapper[4760]: I0930 07:33:56.589163 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-lvdpk"] Sep 30 07:33:56 crc kubenswrapper[4760]: I0930 07:33:56.591332 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-sspvl" Sep 30 07:33:56 crc kubenswrapper[4760]: I0930 07:33:56.592678 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-lvdpk" Sep 30 07:33:56 crc kubenswrapper[4760]: I0930 07:33:56.595468 4760 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-6vfjl" Sep 30 07:33:56 crc kubenswrapper[4760]: I0930 07:33:56.598360 4760 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Sep 30 07:33:56 crc kubenswrapper[4760]: I0930 07:33:56.598642 4760 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Sep 30 07:33:56 crc kubenswrapper[4760]: I0930 07:33:56.598385 4760 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Sep 30 07:33:56 crc kubenswrapper[4760]: I0930 07:33:56.600388 4760 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Sep 30 07:33:56 crc kubenswrapper[4760]: I0930 07:33:56.603823 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Sep 30 07:33:56 crc kubenswrapper[4760]: I0930 07:33:56.604108 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Sep 30 07:33:56 crc kubenswrapper[4760]: I0930 07:33:56.604247 4760 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Sep 30 07:33:56 crc kubenswrapper[4760]: I0930 07:33:56.604444 4760 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Sep 30 07:33:56 crc kubenswrapper[4760]: I0930 07:33:56.604459 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Sep 30 07:33:56 crc kubenswrapper[4760]: I0930 07:33:56.604480 4760 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Sep 30 07:33:56 crc kubenswrapper[4760]: I0930 07:33:56.604670 4760 reflector.go:368] Caches populated for 
*v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Sep 30 07:33:56 crc kubenswrapper[4760]: I0930 07:33:56.604792 4760 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Sep 30 07:33:56 crc kubenswrapper[4760]: I0930 07:33:56.604909 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Sep 30 07:33:56 crc kubenswrapper[4760]: I0930 07:33:56.604966 4760 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Sep 30 07:33:56 crc kubenswrapper[4760]: I0930 07:33:56.613452 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 07:33:56 crc kubenswrapper[4760]: I0930 07:33:56.613577 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 07:33:56 crc kubenswrapper[4760]: I0930 07:33:56.613649 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 07:33:56 crc kubenswrapper[4760]: E0930 07:33:56.613724 4760 nestedpendingoperations.go:348] 
Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 07:33:58.613697719 +0000 UTC m=+24.256604131 (durationBeforeRetry 2s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 07:33:56 crc kubenswrapper[4760]: E0930 07:33:56.613776 4760 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Sep 30 07:33:56 crc kubenswrapper[4760]: E0930 07:33:56.613829 4760 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Sep 30 07:33:56 crc kubenswrapper[4760]: E0930 07:33:56.613859 4760 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-09-30 07:33:58.613836633 +0000 UTC m=+24.256743045 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Sep 30 07:33:56 crc kubenswrapper[4760]: E0930 07:33:56.613878 4760 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-09-30 07:33:58.613867224 +0000 UTC m=+24.256773636 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Sep 30 07:33:56 crc kubenswrapper[4760]: I0930 07:33:56.617233 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"031672b4-4779-4226-97aa-f5c81a729234\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b3e26d92b5602604d33e83e83e677084c642df64c9d0b9f9e022b036091fb92e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://53390cb3cc86d0db1019c461bf6cb987190e6fa8812f735844028dfa1ce96648\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf87f93832356c96c1c5e90e6459ccdd23700fd12c6d7410305c66f99f6ed18f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e4230b2df5bf87138f5a40f146bef562ae63ec6f5b73d023e4e2b60722854882\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:3
8Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9c416bd4523f4384b4a2283eb2ada263982380925cbc17713726e0790e5fdee\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:38Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2cd9c18868b27fcfb67186a86f415b484f009c33c07d18f5d508155ed011b5d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2cd9c18868b27fcfb67186a86f415b484f009c33c07d18f5d508155ed011b5d4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T07:33:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T07:33:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startT
ime\\\":\\\"2025-09-30T07:33:35Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:33:56Z is after 2025-08-24T17:21:41Z" Sep 30 07:33:56 crc kubenswrapper[4760]: I0930 07:33:56.641222 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"67985d3d-0af6-4a8b-b45a-623aed2e502e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43b97af91f474a3b43f207d377ca205e459b782e552a32a134a8b46e89fc8d30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\
\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d748511816054d5f621bd55e203baea2e6e5ac90dec796364b46224308cfd78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bf1b2ce86bf47ce917a20b48f64b7c4419e3a5c551d006d97171311c3b190f0c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"c
ert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e7c9b2cd9f096d99a8b239c0c2dc771e94e6522a3380733b7452969c9fe2dfd2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T07:33:35Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:33:56Z is after 2025-08-24T17:21:41Z" Sep 30 07:33:56 crc kubenswrapper[4760]: I0930 07:33:56.658056 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c5ac421ed3707970914cf165f54a9324134079296d1b6936332c6047b9c23132\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-09-30T07:33:56Z is after 2025-08-24T17:21:41Z" Sep 30 07:33:56 crc kubenswrapper[4760]: I0930 07:33:56.672674 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:53Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:33:56Z is after 2025-08-24T17:21:41Z" Sep 30 07:33:56 crc kubenswrapper[4760]: I0930 07:33:56.686684 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:33:56Z is after 2025-08-24T17:21:41Z" Sep 30 07:33:56 crc kubenswrapper[4760]: I0930 07:33:56.711903 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9092fdd9-f142-400d-b446-47b8103b9694\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4cc3328dfae8f6d99c152e6a53b86d72f690afbd7bc4847810f745b321e4f3ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5aaa898c76100f011c3ad0ba61c9735a940b46b0e770f3ede7fc4c881a4722a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b718c4a73fd119f66dd346618976f0b3589b789576e53d367588c2772becd66e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d00b9a56c23ba3b5701a31412e038c073b5f16eb4e6ebea42134f7d4c0b0f21\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://00d03c8ff6fa9f490ec0550ac57d20786f1f5c10d91011d678116c2249a79e88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ed4236004989a888f1f864535bedd2dc4bcad4c51c87f12be074abbe5f8fe8bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ed4236004989a888f1f864535bedd2dc4bcad4c51c87f12be074abbe5f8fe8bd\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-09-30T07:33:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T07:33:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://553fa289f27580c1c38da1e9ea12ae7231fc4f60c660a9a3a177391adba9c3f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://553fa289f27580c1c38da1e9ea12ae7231fc4f60c660a9a3a177391adba9c3f2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T07:33:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T07:33:37Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://83b9ee5e77d60387298ccac46ad416a5d56bfd1d81a6ef7f3ea0482520c2ca08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://83b9ee5e77d60387298ccac46ad416a5d56bfd1d81a6ef7f3ea0482520c2ca08\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T07:33:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T07:33:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T07:33:35Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:33:56Z is after 2025-08-24T17:21:41Z" Sep 30 07:33:56 crc kubenswrapper[4760]: I0930 07:33:56.715121 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/f50c364e-d22c-4fe5-a0aa-66f4e8d8b21e-cni-binary-copy\") pod \"multus-lvdpk\" (UID: \"f50c364e-d22c-4fe5-a0aa-66f4e8d8b21e\") " pod="openshift-multus/multus-lvdpk" Sep 30 07:33:56 crc kubenswrapper[4760]: I0930 07:33:56.715218 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/f50c364e-d22c-4fe5-a0aa-66f4e8d8b21e-host-var-lib-cni-multus\") pod \"multus-lvdpk\" (UID: \"f50c364e-d22c-4fe5-a0aa-66f4e8d8b21e\") " pod="openshift-multus/multus-lvdpk" Sep 30 07:33:56 crc kubenswrapper[4760]: I0930 07:33:56.715336 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/aade7c8e-aa34-4b19-9000-d724950a70d7-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-6vfjl\" (UID: \"aade7c8e-aa34-4b19-9000-d724950a70d7\") " pod="openshift-multus/multus-additional-cni-plugins-6vfjl" Sep 30 07:33:56 crc kubenswrapper[4760]: I0930 07:33:56.715459 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/2c4ca8ea-a714-40e5-9e10-080aef32237b-var-lib-openvswitch\") pod \"ovnkube-node-sspvl\" (UID: \"2c4ca8ea-a714-40e5-9e10-080aef32237b\") " pod="openshift-ovn-kubernetes/ovnkube-node-sspvl" Sep 30 07:33:56 crc kubenswrapper[4760]: I0930 07:33:56.715525 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/2c4ca8ea-a714-40e5-9e10-080aef32237b-run-openvswitch\") pod \"ovnkube-node-sspvl\" (UID: \"2c4ca8ea-a714-40e5-9e10-080aef32237b\") " pod="openshift-ovn-kubernetes/ovnkube-node-sspvl" Sep 30 07:33:56 crc kubenswrapper[4760]: I0930 07:33:56.715551 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/f50c364e-d22c-4fe5-a0aa-66f4e8d8b21e-multus-cni-dir\") pod \"multus-lvdpk\" (UID: \"f50c364e-d22c-4fe5-a0aa-66f4e8d8b21e\") " pod="openshift-multus/multus-lvdpk" Sep 30 07:33:56 crc kubenswrapper[4760]: I0930 07:33:56.715613 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/2c4ca8ea-a714-40e5-9e10-080aef32237b-host-cni-netd\") pod \"ovnkube-node-sspvl\" (UID: \"2c4ca8ea-a714-40e5-9e10-080aef32237b\") " pod="openshift-ovn-kubernetes/ovnkube-node-sspvl" Sep 30 07:33:56 crc kubenswrapper[4760]: I0930 07:33:56.715699 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/aade7c8e-aa34-4b19-9000-d724950a70d7-cnibin\") pod \"multus-additional-cni-plugins-6vfjl\" (UID: \"aade7c8e-aa34-4b19-9000-d724950a70d7\") " pod="openshift-multus/multus-additional-cni-plugins-6vfjl" Sep 30 07:33:56 crc kubenswrapper[4760]: I0930 07:33:56.715727 4760 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/2c4ca8ea-a714-40e5-9e10-080aef32237b-env-overrides\") pod \"ovnkube-node-sspvl\" (UID: \"2c4ca8ea-a714-40e5-9e10-080aef32237b\") " pod="openshift-ovn-kubernetes/ovnkube-node-sspvl" Sep 30 07:33:56 crc kubenswrapper[4760]: I0930 07:33:56.715785 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/2c4ca8ea-a714-40e5-9e10-080aef32237b-run-systemd\") pod \"ovnkube-node-sspvl\" (UID: \"2c4ca8ea-a714-40e5-9e10-080aef32237b\") " pod="openshift-ovn-kubernetes/ovnkube-node-sspvl" Sep 30 07:33:56 crc kubenswrapper[4760]: I0930 07:33:56.715859 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/2c4ca8ea-a714-40e5-9e10-080aef32237b-host-cni-bin\") pod \"ovnkube-node-sspvl\" (UID: \"2c4ca8ea-a714-40e5-9e10-080aef32237b\") " pod="openshift-ovn-kubernetes/ovnkube-node-sspvl" Sep 30 07:33:56 crc kubenswrapper[4760]: I0930 07:33:56.715904 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 07:33:56 crc kubenswrapper[4760]: I0930 07:33:56.715963 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/2c4ca8ea-a714-40e5-9e10-080aef32237b-host-run-ovn-kubernetes\") pod \"ovnkube-node-sspvl\" (UID: \"2c4ca8ea-a714-40e5-9e10-080aef32237b\") " pod="openshift-ovn-kubernetes/ovnkube-node-sspvl" Sep 30 07:33:56 crc kubenswrapper[4760]: I0930 
07:33:56.716023 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/f50c364e-d22c-4fe5-a0aa-66f4e8d8b21e-system-cni-dir\") pod \"multus-lvdpk\" (UID: \"f50c364e-d22c-4fe5-a0aa-66f4e8d8b21e\") " pod="openshift-multus/multus-lvdpk" Sep 30 07:33:56 crc kubenswrapper[4760]: I0930 07:33:56.716059 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/f50c364e-d22c-4fe5-a0aa-66f4e8d8b21e-cnibin\") pod \"multus-lvdpk\" (UID: \"f50c364e-d22c-4fe5-a0aa-66f4e8d8b21e\") " pod="openshift-multus/multus-lvdpk" Sep 30 07:33:56 crc kubenswrapper[4760]: I0930 07:33:56.716120 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f50c364e-d22c-4fe5-a0aa-66f4e8d8b21e-etc-kubernetes\") pod \"multus-lvdpk\" (UID: \"f50c364e-d22c-4fe5-a0aa-66f4e8d8b21e\") " pod="openshift-multus/multus-lvdpk" Sep 30 07:33:56 crc kubenswrapper[4760]: I0930 07:33:56.716147 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/f50c364e-d22c-4fe5-a0aa-66f4e8d8b21e-host-var-lib-kubelet\") pod \"multus-lvdpk\" (UID: \"f50c364e-d22c-4fe5-a0aa-66f4e8d8b21e\") " pod="openshift-multus/multus-lvdpk" Sep 30 07:33:56 crc kubenswrapper[4760]: E0930 07:33:56.716248 4760 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Sep 30 07:33:56 crc kubenswrapper[4760]: I0930 07:33:56.716263 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/2c4ca8ea-a714-40e5-9e10-080aef32237b-host-run-netns\") pod 
\"ovnkube-node-sspvl\" (UID: \"2c4ca8ea-a714-40e5-9e10-080aef32237b\") " pod="openshift-ovn-kubernetes/ovnkube-node-sspvl" Sep 30 07:33:56 crc kubenswrapper[4760]: I0930 07:33:56.716334 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/2c4ca8ea-a714-40e5-9e10-080aef32237b-ovnkube-config\") pod \"ovnkube-node-sspvl\" (UID: \"2c4ca8ea-a714-40e5-9e10-080aef32237b\") " pod="openshift-ovn-kubernetes/ovnkube-node-sspvl" Sep 30 07:33:56 crc kubenswrapper[4760]: I0930 07:33:56.716364 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/f50c364e-d22c-4fe5-a0aa-66f4e8d8b21e-multus-socket-dir-parent\") pod \"multus-lvdpk\" (UID: \"f50c364e-d22c-4fe5-a0aa-66f4e8d8b21e\") " pod="openshift-multus/multus-lvdpk" Sep 30 07:33:56 crc kubenswrapper[4760]: E0930 07:33:56.716281 4760 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Sep 30 07:33:56 crc kubenswrapper[4760]: E0930 07:33:56.716423 4760 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Sep 30 07:33:56 crc kubenswrapper[4760]: I0930 07:33:56.716435 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/2c4ca8ea-a714-40e5-9e10-080aef32237b-node-log\") pod \"ovnkube-node-sspvl\" (UID: \"2c4ca8ea-a714-40e5-9e10-080aef32237b\") " pod="openshift-ovn-kubernetes/ovnkube-node-sspvl" Sep 30 07:33:56 crc kubenswrapper[4760]: I0930 
07:33:56.716469 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/2c4ca8ea-a714-40e5-9e10-080aef32237b-systemd-units\") pod \"ovnkube-node-sspvl\" (UID: \"2c4ca8ea-a714-40e5-9e10-080aef32237b\") " pod="openshift-ovn-kubernetes/ovnkube-node-sspvl" Sep 30 07:33:56 crc kubenswrapper[4760]: I0930 07:33:56.716498 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/2c4ca8ea-a714-40e5-9e10-080aef32237b-host-slash\") pod \"ovnkube-node-sspvl\" (UID: \"2c4ca8ea-a714-40e5-9e10-080aef32237b\") " pod="openshift-ovn-kubernetes/ovnkube-node-sspvl" Sep 30 07:33:56 crc kubenswrapper[4760]: I0930 07:33:56.716539 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/f50c364e-d22c-4fe5-a0aa-66f4e8d8b21e-os-release\") pod \"multus-lvdpk\" (UID: \"f50c364e-d22c-4fe5-a0aa-66f4e8d8b21e\") " pod="openshift-multus/multus-lvdpk" Sep 30 07:33:56 crc kubenswrapper[4760]: I0930 07:33:56.716566 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/f50c364e-d22c-4fe5-a0aa-66f4e8d8b21e-host-run-k8s-cni-cncf-io\") pod \"multus-lvdpk\" (UID: \"f50c364e-d22c-4fe5-a0aa-66f4e8d8b21e\") " pod="openshift-multus/multus-lvdpk" Sep 30 07:33:56 crc kubenswrapper[4760]: E0930 07:33:56.716586 4760 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-09-30 07:33:58.716563962 +0000 UTC m=+24.359470374 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Sep 30 07:33:56 crc kubenswrapper[4760]: I0930 07:33:56.716637 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 07:33:56 crc kubenswrapper[4760]: I0930 07:33:56.716670 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/aade7c8e-aa34-4b19-9000-d724950a70d7-tuning-conf-dir\") pod \"multus-additional-cni-plugins-6vfjl\" (UID: \"aade7c8e-aa34-4b19-9000-d724950a70d7\") " pod="openshift-multus/multus-additional-cni-plugins-6vfjl" Sep 30 07:33:56 crc kubenswrapper[4760]: I0930 07:33:56.716700 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/2c4ca8ea-a714-40e5-9e10-080aef32237b-run-ovn\") pod \"ovnkube-node-sspvl\" (UID: \"2c4ca8ea-a714-40e5-9e10-080aef32237b\") " pod="openshift-ovn-kubernetes/ovnkube-node-sspvl" Sep 30 07:33:56 crc kubenswrapper[4760]: I0930 07:33:56.716725 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/2c4ca8ea-a714-40e5-9e10-080aef32237b-log-socket\") pod \"ovnkube-node-sspvl\" (UID: 
\"2c4ca8ea-a714-40e5-9e10-080aef32237b\") " pod="openshift-ovn-kubernetes/ovnkube-node-sspvl" Sep 30 07:33:56 crc kubenswrapper[4760]: I0930 07:33:56.716753 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lllfc\" (UniqueName: \"kubernetes.io/projected/2c4ca8ea-a714-40e5-9e10-080aef32237b-kube-api-access-lllfc\") pod \"ovnkube-node-sspvl\" (UID: \"2c4ca8ea-a714-40e5-9e10-080aef32237b\") " pod="openshift-ovn-kubernetes/ovnkube-node-sspvl" Sep 30 07:33:56 crc kubenswrapper[4760]: I0930 07:33:56.716783 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/aade7c8e-aa34-4b19-9000-d724950a70d7-os-release\") pod \"multus-additional-cni-plugins-6vfjl\" (UID: \"aade7c8e-aa34-4b19-9000-d724950a70d7\") " pod="openshift-multus/multus-additional-cni-plugins-6vfjl" Sep 30 07:33:56 crc kubenswrapper[4760]: I0930 07:33:56.716808 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/2c4ca8ea-a714-40e5-9e10-080aef32237b-host-kubelet\") pod \"ovnkube-node-sspvl\" (UID: \"2c4ca8ea-a714-40e5-9e10-080aef32237b\") " pod="openshift-ovn-kubernetes/ovnkube-node-sspvl" Sep 30 07:33:56 crc kubenswrapper[4760]: I0930 07:33:56.716884 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/aade7c8e-aa34-4b19-9000-d724950a70d7-cni-binary-copy\") pod \"multus-additional-cni-plugins-6vfjl\" (UID: \"aade7c8e-aa34-4b19-9000-d724950a70d7\") " pod="openshift-multus/multus-additional-cni-plugins-6vfjl" Sep 30 07:33:56 crc kubenswrapper[4760]: I0930 07:33:56.716954 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: 
\"kubernetes.io/configmap/2c4ca8ea-a714-40e5-9e10-080aef32237b-ovnkube-script-lib\") pod \"ovnkube-node-sspvl\" (UID: \"2c4ca8ea-a714-40e5-9e10-080aef32237b\") " pod="openshift-ovn-kubernetes/ovnkube-node-sspvl" Sep 30 07:33:56 crc kubenswrapper[4760]: E0930 07:33:56.716976 4760 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Sep 30 07:33:56 crc kubenswrapper[4760]: E0930 07:33:56.717016 4760 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Sep 30 07:33:56 crc kubenswrapper[4760]: I0930 07:33:56.717008 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/f50c364e-d22c-4fe5-a0aa-66f4e8d8b21e-hostroot\") pod \"multus-lvdpk\" (UID: \"f50c364e-d22c-4fe5-a0aa-66f4e8d8b21e\") " pod="openshift-multus/multus-lvdpk" Sep 30 07:33:56 crc kubenswrapper[4760]: E0930 07:33:56.717036 4760 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Sep 30 07:33:56 crc kubenswrapper[4760]: I0930 07:33:56.717102 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/aade7c8e-aa34-4b19-9000-d724950a70d7-system-cni-dir\") pod \"multus-additional-cni-plugins-6vfjl\" (UID: \"aade7c8e-aa34-4b19-9000-d724950a70d7\") " pod="openshift-multus/multus-additional-cni-plugins-6vfjl" Sep 30 07:33:56 crc kubenswrapper[4760]: E0930 07:33:56.717151 4760 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-09-30 07:33:58.717125157 +0000 UTC m=+24.360031799 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Sep 30 07:33:56 crc kubenswrapper[4760]: I0930 07:33:56.717189 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/f50c364e-d22c-4fe5-a0aa-66f4e8d8b21e-host-run-multus-certs\") pod \"multus-lvdpk\" (UID: \"f50c364e-d22c-4fe5-a0aa-66f4e8d8b21e\") " pod="openshift-multus/multus-lvdpk" Sep 30 07:33:56 crc kubenswrapper[4760]: I0930 07:33:56.717283 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/2c4ca8ea-a714-40e5-9e10-080aef32237b-ovn-node-metrics-cert\") pod \"ovnkube-node-sspvl\" (UID: \"2c4ca8ea-a714-40e5-9e10-080aef32237b\") " pod="openshift-ovn-kubernetes/ovnkube-node-sspvl" Sep 30 07:33:56 crc kubenswrapper[4760]: I0930 07:33:56.717375 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/f50c364e-d22c-4fe5-a0aa-66f4e8d8b21e-multus-daemon-config\") pod \"multus-lvdpk\" (UID: \"f50c364e-d22c-4fe5-a0aa-66f4e8d8b21e\") " pod="openshift-multus/multus-lvdpk" Sep 30 07:33:56 crc kubenswrapper[4760]: I0930 07:33:56.717436 4760 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/2c4ca8ea-a714-40e5-9e10-080aef32237b-etc-openvswitch\") pod \"ovnkube-node-sspvl\" (UID: \"2c4ca8ea-a714-40e5-9e10-080aef32237b\") " pod="openshift-ovn-kubernetes/ovnkube-node-sspvl" Sep 30 07:33:56 crc kubenswrapper[4760]: I0930 07:33:56.717461 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/f50c364e-d22c-4fe5-a0aa-66f4e8d8b21e-host-run-netns\") pod \"multus-lvdpk\" (UID: \"f50c364e-d22c-4fe5-a0aa-66f4e8d8b21e\") " pod="openshift-multus/multus-lvdpk" Sep 30 07:33:56 crc kubenswrapper[4760]: I0930 07:33:56.717541 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/f50c364e-d22c-4fe5-a0aa-66f4e8d8b21e-multus-conf-dir\") pod \"multus-lvdpk\" (UID: \"f50c364e-d22c-4fe5-a0aa-66f4e8d8b21e\") " pod="openshift-multus/multus-lvdpk" Sep 30 07:33:56 crc kubenswrapper[4760]: I0930 07:33:56.717605 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8g8kp\" (UniqueName: \"kubernetes.io/projected/f50c364e-d22c-4fe5-a0aa-66f4e8d8b21e-kube-api-access-8g8kp\") pod \"multus-lvdpk\" (UID: \"f50c364e-d22c-4fe5-a0aa-66f4e8d8b21e\") " pod="openshift-multus/multus-lvdpk" Sep 30 07:33:56 crc kubenswrapper[4760]: I0930 07:33:56.717634 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-df7r7\" (UniqueName: \"kubernetes.io/projected/aade7c8e-aa34-4b19-9000-d724950a70d7-kube-api-access-df7r7\") pod \"multus-additional-cni-plugins-6vfjl\" (UID: \"aade7c8e-aa34-4b19-9000-d724950a70d7\") " pod="openshift-multus/multus-additional-cni-plugins-6vfjl" Sep 30 07:33:56 crc kubenswrapper[4760]: I0930 07:33:56.717692 4760 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/2c4ca8ea-a714-40e5-9e10-080aef32237b-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-sspvl\" (UID: \"2c4ca8ea-a714-40e5-9e10-080aef32237b\") " pod="openshift-ovn-kubernetes/ovnkube-node-sspvl" Sep 30 07:33:56 crc kubenswrapper[4760]: I0930 07:33:56.717722 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/f50c364e-d22c-4fe5-a0aa-66f4e8d8b21e-host-var-lib-cni-bin\") pod \"multus-lvdpk\" (UID: \"f50c364e-d22c-4fe5-a0aa-66f4e8d8b21e\") " pod="openshift-multus/multus-lvdpk" Sep 30 07:33:56 crc kubenswrapper[4760]: I0930 07:33:56.749169 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-sspvl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2c4ca8ea-a714-40e5-9e10-080aef32237b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:56Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:56Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:56Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lllfc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/v
ar/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lllfc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lllfc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lllfc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},
{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lllfc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lllfc\\\",\\\"readOnly\\\":true,\\\
"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mount
Path\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lllfc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lllfc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name
\\\":\\\"kube-api-access-lllfc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T07:33:56Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-sspvl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:33:56Z is after 2025-08-24T17:21:41Z" Sep 30 07:33:56 crc kubenswrapper[4760]: I0930 07:33:56.765684 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c5ae239e08e07bda9532c0b6b94ef3687cf512cd05965e09123bbb036f915a5b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":
0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3abb25619d0c80e84a2095aba45fe6a865125d242e44e9f10820e8c41a3b0db9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:33:56Z is after 2025-08-24T17:21:41Z" Sep 30 07:33:56 crc kubenswrapper[4760]: I0930 07:33:56.782237 4760 
status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:33:56Z is after 2025-08-24T17:21:41Z" Sep 30 07:33:56 crc kubenswrapper[4760]: I0930 07:33:56.794577 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-sv6wk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4a8035f8-210d-4a09-bca5-274ced93774c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:56Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:56Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6pk8k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T07:33:56Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-sv6wk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:33:56Z is after 2025-08-24T17:21:41Z" Sep 30 07:33:56 crc kubenswrapper[4760]: I0930 07:33:56.807457 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-f2lrk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7a9c8270-6964-4886-87d0-227b05b76da4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:56Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:56Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cctgt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cctgt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T07:33:56Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-f2lrk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:33:56Z is after 2025-08-24T17:21:41Z" Sep 30 07:33:56 crc kubenswrapper[4760]: I0930 07:33:56.819191 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/2c4ca8ea-a714-40e5-9e10-080aef32237b-host-slash\") pod \"ovnkube-node-sspvl\" (UID: \"2c4ca8ea-a714-40e5-9e10-080aef32237b\") " pod="openshift-ovn-kubernetes/ovnkube-node-sspvl" Sep 30 07:33:56 crc kubenswrapper[4760]: I0930 07:33:56.819241 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/f50c364e-d22c-4fe5-a0aa-66f4e8d8b21e-os-release\") pod \"multus-lvdpk\" (UID: \"f50c364e-d22c-4fe5-a0aa-66f4e8d8b21e\") " pod="openshift-multus/multus-lvdpk" Sep 30 07:33:56 crc kubenswrapper[4760]: I0930 07:33:56.819265 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/f50c364e-d22c-4fe5-a0aa-66f4e8d8b21e-host-run-k8s-cni-cncf-io\") pod \"multus-lvdpk\" (UID: \"f50c364e-d22c-4fe5-a0aa-66f4e8d8b21e\") " pod="openshift-multus/multus-lvdpk" Sep 30 07:33:56 crc kubenswrapper[4760]: I0930 07:33:56.819293 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/2c4ca8ea-a714-40e5-9e10-080aef32237b-systemd-units\") pod \"ovnkube-node-sspvl\" (UID: \"2c4ca8ea-a714-40e5-9e10-080aef32237b\") " pod="openshift-ovn-kubernetes/ovnkube-node-sspvl" Sep 30 07:33:56 crc kubenswrapper[4760]: I0930 07:33:56.819331 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/aade7c8e-aa34-4b19-9000-d724950a70d7-tuning-conf-dir\") pod \"multus-additional-cni-plugins-6vfjl\" 
(UID: \"aade7c8e-aa34-4b19-9000-d724950a70d7\") " pod="openshift-multus/multus-additional-cni-plugins-6vfjl" Sep 30 07:33:56 crc kubenswrapper[4760]: I0930 07:33:56.819351 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/2c4ca8ea-a714-40e5-9e10-080aef32237b-run-ovn\") pod \"ovnkube-node-sspvl\" (UID: \"2c4ca8ea-a714-40e5-9e10-080aef32237b\") " pod="openshift-ovn-kubernetes/ovnkube-node-sspvl" Sep 30 07:33:56 crc kubenswrapper[4760]: I0930 07:33:56.819347 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/2c4ca8ea-a714-40e5-9e10-080aef32237b-host-slash\") pod \"ovnkube-node-sspvl\" (UID: \"2c4ca8ea-a714-40e5-9e10-080aef32237b\") " pod="openshift-ovn-kubernetes/ovnkube-node-sspvl" Sep 30 07:33:56 crc kubenswrapper[4760]: I0930 07:33:56.819371 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/2c4ca8ea-a714-40e5-9e10-080aef32237b-log-socket\") pod \"ovnkube-node-sspvl\" (UID: \"2c4ca8ea-a714-40e5-9e10-080aef32237b\") " pod="openshift-ovn-kubernetes/ovnkube-node-sspvl" Sep 30 07:33:56 crc kubenswrapper[4760]: I0930 07:33:56.819407 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/2c4ca8ea-a714-40e5-9e10-080aef32237b-log-socket\") pod \"ovnkube-node-sspvl\" (UID: \"2c4ca8ea-a714-40e5-9e10-080aef32237b\") " pod="openshift-ovn-kubernetes/ovnkube-node-sspvl" Sep 30 07:33:56 crc kubenswrapper[4760]: I0930 07:33:56.819422 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/f50c364e-d22c-4fe5-a0aa-66f4e8d8b21e-os-release\") pod \"multus-lvdpk\" (UID: \"f50c364e-d22c-4fe5-a0aa-66f4e8d8b21e\") " pod="openshift-multus/multus-lvdpk" Sep 30 07:33:56 crc kubenswrapper[4760]: I0930 
07:33:56.819438 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/2c4ca8ea-a714-40e5-9e10-080aef32237b-run-ovn\") pod \"ovnkube-node-sspvl\" (UID: \"2c4ca8ea-a714-40e5-9e10-080aef32237b\") " pod="openshift-ovn-kubernetes/ovnkube-node-sspvl" Sep 30 07:33:56 crc kubenswrapper[4760]: I0930 07:33:56.819390 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/f50c364e-d22c-4fe5-a0aa-66f4e8d8b21e-host-run-k8s-cni-cncf-io\") pod \"multus-lvdpk\" (UID: \"f50c364e-d22c-4fe5-a0aa-66f4e8d8b21e\") " pod="openshift-multus/multus-lvdpk" Sep 30 07:33:56 crc kubenswrapper[4760]: I0930 07:33:56.819460 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/2c4ca8ea-a714-40e5-9e10-080aef32237b-systemd-units\") pod \"ovnkube-node-sspvl\" (UID: \"2c4ca8ea-a714-40e5-9e10-080aef32237b\") " pod="openshift-ovn-kubernetes/ovnkube-node-sspvl" Sep 30 07:33:56 crc kubenswrapper[4760]: I0930 07:33:56.819442 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lllfc\" (UniqueName: \"kubernetes.io/projected/2c4ca8ea-a714-40e5-9e10-080aef32237b-kube-api-access-lllfc\") pod \"ovnkube-node-sspvl\" (UID: \"2c4ca8ea-a714-40e5-9e10-080aef32237b\") " pod="openshift-ovn-kubernetes/ovnkube-node-sspvl" Sep 30 07:33:56 crc kubenswrapper[4760]: I0930 07:33:56.819524 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/2c4ca8ea-a714-40e5-9e10-080aef32237b-host-kubelet\") pod \"ovnkube-node-sspvl\" (UID: \"2c4ca8ea-a714-40e5-9e10-080aef32237b\") " pod="openshift-ovn-kubernetes/ovnkube-node-sspvl" Sep 30 07:33:56 crc kubenswrapper[4760]: I0930 07:33:56.819547 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" 
(UniqueName: \"kubernetes.io/host-path/2c4ca8ea-a714-40e5-9e10-080aef32237b-host-kubelet\") pod \"ovnkube-node-sspvl\" (UID: \"2c4ca8ea-a714-40e5-9e10-080aef32237b\") " pod="openshift-ovn-kubernetes/ovnkube-node-sspvl" Sep 30 07:33:56 crc kubenswrapper[4760]: I0930 07:33:56.819547 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/aade7c8e-aa34-4b19-9000-d724950a70d7-os-release\") pod \"multus-additional-cni-plugins-6vfjl\" (UID: \"aade7c8e-aa34-4b19-9000-d724950a70d7\") " pod="openshift-multus/multus-additional-cni-plugins-6vfjl" Sep 30 07:33:56 crc kubenswrapper[4760]: I0930 07:33:56.819583 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/2c4ca8ea-a714-40e5-9e10-080aef32237b-ovnkube-script-lib\") pod \"ovnkube-node-sspvl\" (UID: \"2c4ca8ea-a714-40e5-9e10-080aef32237b\") " pod="openshift-ovn-kubernetes/ovnkube-node-sspvl" Sep 30 07:33:56 crc kubenswrapper[4760]: I0930 07:33:56.819634 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/f50c364e-d22c-4fe5-a0aa-66f4e8d8b21e-hostroot\") pod \"multus-lvdpk\" (UID: \"f50c364e-d22c-4fe5-a0aa-66f4e8d8b21e\") " pod="openshift-multus/multus-lvdpk" Sep 30 07:33:56 crc kubenswrapper[4760]: I0930 07:33:56.819663 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/aade7c8e-aa34-4b19-9000-d724950a70d7-cni-binary-copy\") pod \"multus-additional-cni-plugins-6vfjl\" (UID: \"aade7c8e-aa34-4b19-9000-d724950a70d7\") " pod="openshift-multus/multus-additional-cni-plugins-6vfjl" Sep 30 07:33:56 crc kubenswrapper[4760]: I0930 07:33:56.819678 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: 
\"kubernetes.io/host-path/f50c364e-d22c-4fe5-a0aa-66f4e8d8b21e-hostroot\") pod \"multus-lvdpk\" (UID: \"f50c364e-d22c-4fe5-a0aa-66f4e8d8b21e\") " pod="openshift-multus/multus-lvdpk" Sep 30 07:33:56 crc kubenswrapper[4760]: I0930 07:33:56.819686 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/aade7c8e-aa34-4b19-9000-d724950a70d7-system-cni-dir\") pod \"multus-additional-cni-plugins-6vfjl\" (UID: \"aade7c8e-aa34-4b19-9000-d724950a70d7\") " pod="openshift-multus/multus-additional-cni-plugins-6vfjl" Sep 30 07:33:56 crc kubenswrapper[4760]: I0930 07:33:56.819585 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/aade7c8e-aa34-4b19-9000-d724950a70d7-os-release\") pod \"multus-additional-cni-plugins-6vfjl\" (UID: \"aade7c8e-aa34-4b19-9000-d724950a70d7\") " pod="openshift-multus/multus-additional-cni-plugins-6vfjl" Sep 30 07:33:56 crc kubenswrapper[4760]: I0930 07:33:56.819731 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/2c4ca8ea-a714-40e5-9e10-080aef32237b-ovn-node-metrics-cert\") pod \"ovnkube-node-sspvl\" (UID: \"2c4ca8ea-a714-40e5-9e10-080aef32237b\") " pod="openshift-ovn-kubernetes/ovnkube-node-sspvl" Sep 30 07:33:56 crc kubenswrapper[4760]: I0930 07:33:56.819756 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/f50c364e-d22c-4fe5-a0aa-66f4e8d8b21e-multus-daemon-config\") pod \"multus-lvdpk\" (UID: \"f50c364e-d22c-4fe5-a0aa-66f4e8d8b21e\") " pod="openshift-multus/multus-lvdpk" Sep 30 07:33:56 crc kubenswrapper[4760]: I0930 07:33:56.819777 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: 
\"kubernetes.io/host-path/f50c364e-d22c-4fe5-a0aa-66f4e8d8b21e-host-run-multus-certs\") pod \"multus-lvdpk\" (UID: \"f50c364e-d22c-4fe5-a0aa-66f4e8d8b21e\") " pod="openshift-multus/multus-lvdpk" Sep 30 07:33:56 crc kubenswrapper[4760]: I0930 07:33:56.819799 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/f50c364e-d22c-4fe5-a0aa-66f4e8d8b21e-multus-conf-dir\") pod \"multus-lvdpk\" (UID: \"f50c364e-d22c-4fe5-a0aa-66f4e8d8b21e\") " pod="openshift-multus/multus-lvdpk" Sep 30 07:33:56 crc kubenswrapper[4760]: I0930 07:33:56.819828 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8g8kp\" (UniqueName: \"kubernetes.io/projected/f50c364e-d22c-4fe5-a0aa-66f4e8d8b21e-kube-api-access-8g8kp\") pod \"multus-lvdpk\" (UID: \"f50c364e-d22c-4fe5-a0aa-66f4e8d8b21e\") " pod="openshift-multus/multus-lvdpk" Sep 30 07:33:56 crc kubenswrapper[4760]: I0930 07:33:56.819858 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/2c4ca8ea-a714-40e5-9e10-080aef32237b-etc-openvswitch\") pod \"ovnkube-node-sspvl\" (UID: \"2c4ca8ea-a714-40e5-9e10-080aef32237b\") " pod="openshift-ovn-kubernetes/ovnkube-node-sspvl" Sep 30 07:33:56 crc kubenswrapper[4760]: I0930 07:33:56.819880 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/f50c364e-d22c-4fe5-a0aa-66f4e8d8b21e-host-run-netns\") pod \"multus-lvdpk\" (UID: \"f50c364e-d22c-4fe5-a0aa-66f4e8d8b21e\") " pod="openshift-multus/multus-lvdpk" Sep 30 07:33:56 crc kubenswrapper[4760]: I0930 07:33:56.819900 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/2c4ca8ea-a714-40e5-9e10-080aef32237b-host-var-lib-cni-networks-ovn-kubernetes\") 
pod \"ovnkube-node-sspvl\" (UID: \"2c4ca8ea-a714-40e5-9e10-080aef32237b\") " pod="openshift-ovn-kubernetes/ovnkube-node-sspvl" Sep 30 07:33:56 crc kubenswrapper[4760]: I0930 07:33:56.819920 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/f50c364e-d22c-4fe5-a0aa-66f4e8d8b21e-host-var-lib-cni-bin\") pod \"multus-lvdpk\" (UID: \"f50c364e-d22c-4fe5-a0aa-66f4e8d8b21e\") " pod="openshift-multus/multus-lvdpk" Sep 30 07:33:56 crc kubenswrapper[4760]: I0930 07:33:56.819939 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-df7r7\" (UniqueName: \"kubernetes.io/projected/aade7c8e-aa34-4b19-9000-d724950a70d7-kube-api-access-df7r7\") pod \"multus-additional-cni-plugins-6vfjl\" (UID: \"aade7c8e-aa34-4b19-9000-d724950a70d7\") " pod="openshift-multus/multus-additional-cni-plugins-6vfjl" Sep 30 07:33:56 crc kubenswrapper[4760]: I0930 07:33:56.819959 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/f50c364e-d22c-4fe5-a0aa-66f4e8d8b21e-cni-binary-copy\") pod \"multus-lvdpk\" (UID: \"f50c364e-d22c-4fe5-a0aa-66f4e8d8b21e\") " pod="openshift-multus/multus-lvdpk" Sep 30 07:33:56 crc kubenswrapper[4760]: I0930 07:33:56.819976 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/f50c364e-d22c-4fe5-a0aa-66f4e8d8b21e-host-var-lib-cni-multus\") pod \"multus-lvdpk\" (UID: \"f50c364e-d22c-4fe5-a0aa-66f4e8d8b21e\") " pod="openshift-multus/multus-lvdpk" Sep 30 07:33:56 crc kubenswrapper[4760]: I0930 07:33:56.819994 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/aade7c8e-aa34-4b19-9000-d724950a70d7-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-6vfjl\" (UID: 
\"aade7c8e-aa34-4b19-9000-d724950a70d7\") " pod="openshift-multus/multus-additional-cni-plugins-6vfjl" Sep 30 07:33:56 crc kubenswrapper[4760]: I0930 07:33:56.820018 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/2c4ca8ea-a714-40e5-9e10-080aef32237b-var-lib-openvswitch\") pod \"ovnkube-node-sspvl\" (UID: \"2c4ca8ea-a714-40e5-9e10-080aef32237b\") " pod="openshift-ovn-kubernetes/ovnkube-node-sspvl" Sep 30 07:33:56 crc kubenswrapper[4760]: I0930 07:33:56.820037 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/2c4ca8ea-a714-40e5-9e10-080aef32237b-run-openvswitch\") pod \"ovnkube-node-sspvl\" (UID: \"2c4ca8ea-a714-40e5-9e10-080aef32237b\") " pod="openshift-ovn-kubernetes/ovnkube-node-sspvl" Sep 30 07:33:56 crc kubenswrapper[4760]: I0930 07:33:56.820057 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/f50c364e-d22c-4fe5-a0aa-66f4e8d8b21e-multus-cni-dir\") pod \"multus-lvdpk\" (UID: \"f50c364e-d22c-4fe5-a0aa-66f4e8d8b21e\") " pod="openshift-multus/multus-lvdpk" Sep 30 07:33:56 crc kubenswrapper[4760]: I0930 07:33:56.820085 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/aade7c8e-aa34-4b19-9000-d724950a70d7-cnibin\") pod \"multus-additional-cni-plugins-6vfjl\" (UID: \"aade7c8e-aa34-4b19-9000-d724950a70d7\") " pod="openshift-multus/multus-additional-cni-plugins-6vfjl" Sep 30 07:33:56 crc kubenswrapper[4760]: I0930 07:33:56.820104 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/2c4ca8ea-a714-40e5-9e10-080aef32237b-host-cni-netd\") pod \"ovnkube-node-sspvl\" (UID: \"2c4ca8ea-a714-40e5-9e10-080aef32237b\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-sspvl" Sep 30 07:33:56 crc kubenswrapper[4760]: I0930 07:33:56.820125 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/2c4ca8ea-a714-40e5-9e10-080aef32237b-env-overrides\") pod \"ovnkube-node-sspvl\" (UID: \"2c4ca8ea-a714-40e5-9e10-080aef32237b\") " pod="openshift-ovn-kubernetes/ovnkube-node-sspvl" Sep 30 07:33:56 crc kubenswrapper[4760]: I0930 07:33:56.820142 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/2c4ca8ea-a714-40e5-9e10-080aef32237b-host-cni-bin\") pod \"ovnkube-node-sspvl\" (UID: \"2c4ca8ea-a714-40e5-9e10-080aef32237b\") " pod="openshift-ovn-kubernetes/ovnkube-node-sspvl" Sep 30 07:33:56 crc kubenswrapper[4760]: I0930 07:33:56.820164 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/2c4ca8ea-a714-40e5-9e10-080aef32237b-run-systemd\") pod \"ovnkube-node-sspvl\" (UID: \"2c4ca8ea-a714-40e5-9e10-080aef32237b\") " pod="openshift-ovn-kubernetes/ovnkube-node-sspvl" Sep 30 07:33:56 crc kubenswrapper[4760]: I0930 07:33:56.820186 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/f50c364e-d22c-4fe5-a0aa-66f4e8d8b21e-system-cni-dir\") pod \"multus-lvdpk\" (UID: \"f50c364e-d22c-4fe5-a0aa-66f4e8d8b21e\") " pod="openshift-multus/multus-lvdpk" Sep 30 07:33:56 crc kubenswrapper[4760]: I0930 07:33:56.820222 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/f50c364e-d22c-4fe5-a0aa-66f4e8d8b21e-cnibin\") pod \"multus-lvdpk\" (UID: \"f50c364e-d22c-4fe5-a0aa-66f4e8d8b21e\") " pod="openshift-multus/multus-lvdpk" Sep 30 07:33:56 crc kubenswrapper[4760]: I0930 07:33:56.820238 4760 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f50c364e-d22c-4fe5-a0aa-66f4e8d8b21e-etc-kubernetes\") pod \"multus-lvdpk\" (UID: \"f50c364e-d22c-4fe5-a0aa-66f4e8d8b21e\") " pod="openshift-multus/multus-lvdpk" Sep 30 07:33:56 crc kubenswrapper[4760]: I0930 07:33:56.820239 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/aade7c8e-aa34-4b19-9000-d724950a70d7-tuning-conf-dir\") pod \"multus-additional-cni-plugins-6vfjl\" (UID: \"aade7c8e-aa34-4b19-9000-d724950a70d7\") " pod="openshift-multus/multus-additional-cni-plugins-6vfjl" Sep 30 07:33:56 crc kubenswrapper[4760]: I0930 07:33:56.820285 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/2c4ca8ea-a714-40e5-9e10-080aef32237b-host-run-ovn-kubernetes\") pod \"ovnkube-node-sspvl\" (UID: \"2c4ca8ea-a714-40e5-9e10-080aef32237b\") " pod="openshift-ovn-kubernetes/ovnkube-node-sspvl" Sep 30 07:33:56 crc kubenswrapper[4760]: I0930 07:33:56.820263 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/2c4ca8ea-a714-40e5-9e10-080aef32237b-host-run-ovn-kubernetes\") pod \"ovnkube-node-sspvl\" (UID: \"2c4ca8ea-a714-40e5-9e10-080aef32237b\") " pod="openshift-ovn-kubernetes/ovnkube-node-sspvl" Sep 30 07:33:56 crc kubenswrapper[4760]: I0930 07:33:56.820330 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/aade7c8e-aa34-4b19-9000-d724950a70d7-system-cni-dir\") pod \"multus-additional-cni-plugins-6vfjl\" (UID: \"aade7c8e-aa34-4b19-9000-d724950a70d7\") " pod="openshift-multus/multus-additional-cni-plugins-6vfjl" Sep 30 07:33:56 crc kubenswrapper[4760]: I0930 07:33:56.820356 4760 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/f50c364e-d22c-4fe5-a0aa-66f4e8d8b21e-host-var-lib-kubelet\") pod \"multus-lvdpk\" (UID: \"f50c364e-d22c-4fe5-a0aa-66f4e8d8b21e\") " pod="openshift-multus/multus-lvdpk" Sep 30 07:33:56 crc kubenswrapper[4760]: I0930 07:33:56.820388 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/f50c364e-d22c-4fe5-a0aa-66f4e8d8b21e-multus-socket-dir-parent\") pod \"multus-lvdpk\" (UID: \"f50c364e-d22c-4fe5-a0aa-66f4e8d8b21e\") " pod="openshift-multus/multus-lvdpk" Sep 30 07:33:56 crc kubenswrapper[4760]: I0930 07:33:56.820412 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/2c4ca8ea-a714-40e5-9e10-080aef32237b-host-run-netns\") pod \"ovnkube-node-sspvl\" (UID: \"2c4ca8ea-a714-40e5-9e10-080aef32237b\") " pod="openshift-ovn-kubernetes/ovnkube-node-sspvl" Sep 30 07:33:56 crc kubenswrapper[4760]: I0930 07:33:56.820429 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/2c4ca8ea-a714-40e5-9e10-080aef32237b-ovnkube-config\") pod \"ovnkube-node-sspvl\" (UID: \"2c4ca8ea-a714-40e5-9e10-080aef32237b\") " pod="openshift-ovn-kubernetes/ovnkube-node-sspvl" Sep 30 07:33:56 crc kubenswrapper[4760]: I0930 07:33:56.820451 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/2c4ca8ea-a714-40e5-9e10-080aef32237b-node-log\") pod \"ovnkube-node-sspvl\" (UID: \"2c4ca8ea-a714-40e5-9e10-080aef32237b\") " pod="openshift-ovn-kubernetes/ovnkube-node-sspvl" Sep 30 07:33:56 crc kubenswrapper[4760]: I0930 07:33:56.820530 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: 
\"kubernetes.io/host-path/2c4ca8ea-a714-40e5-9e10-080aef32237b-node-log\") pod \"ovnkube-node-sspvl\" (UID: \"2c4ca8ea-a714-40e5-9e10-080aef32237b\") " pod="openshift-ovn-kubernetes/ovnkube-node-sspvl" Sep 30 07:33:56 crc kubenswrapper[4760]: I0930 07:33:56.820535 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/2c4ca8ea-a714-40e5-9e10-080aef32237b-ovnkube-script-lib\") pod \"ovnkube-node-sspvl\" (UID: \"2c4ca8ea-a714-40e5-9e10-080aef32237b\") " pod="openshift-ovn-kubernetes/ovnkube-node-sspvl" Sep 30 07:33:56 crc kubenswrapper[4760]: I0930 07:33:56.820555 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/f50c364e-d22c-4fe5-a0aa-66f4e8d8b21e-host-var-lib-kubelet\") pod \"multus-lvdpk\" (UID: \"f50c364e-d22c-4fe5-a0aa-66f4e8d8b21e\") " pod="openshift-multus/multus-lvdpk" Sep 30 07:33:56 crc kubenswrapper[4760]: I0930 07:33:56.820576 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/aade7c8e-aa34-4b19-9000-d724950a70d7-cni-binary-copy\") pod \"multus-additional-cni-plugins-6vfjl\" (UID: \"aade7c8e-aa34-4b19-9000-d724950a70d7\") " pod="openshift-multus/multus-additional-cni-plugins-6vfjl" Sep 30 07:33:56 crc kubenswrapper[4760]: I0930 07:33:56.820603 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/f50c364e-d22c-4fe5-a0aa-66f4e8d8b21e-multus-socket-dir-parent\") pod \"multus-lvdpk\" (UID: \"f50c364e-d22c-4fe5-a0aa-66f4e8d8b21e\") " pod="openshift-multus/multus-lvdpk" Sep 30 07:33:56 crc kubenswrapper[4760]: I0930 07:33:56.820583 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/2c4ca8ea-a714-40e5-9e10-080aef32237b-host-run-netns\") pod 
\"ovnkube-node-sspvl\" (UID: \"2c4ca8ea-a714-40e5-9e10-080aef32237b\") " pod="openshift-ovn-kubernetes/ovnkube-node-sspvl" Sep 30 07:33:56 crc kubenswrapper[4760]: I0930 07:33:56.820653 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/f50c364e-d22c-4fe5-a0aa-66f4e8d8b21e-host-run-netns\") pod \"multus-lvdpk\" (UID: \"f50c364e-d22c-4fe5-a0aa-66f4e8d8b21e\") " pod="openshift-multus/multus-lvdpk" Sep 30 07:33:56 crc kubenswrapper[4760]: I0930 07:33:56.821220 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/aade7c8e-aa34-4b19-9000-d724950a70d7-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-6vfjl\" (UID: \"aade7c8e-aa34-4b19-9000-d724950a70d7\") " pod="openshift-multus/multus-additional-cni-plugins-6vfjl" Sep 30 07:33:56 crc kubenswrapper[4760]: I0930 07:33:56.821278 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/2c4ca8ea-a714-40e5-9e10-080aef32237b-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-sspvl\" (UID: \"2c4ca8ea-a714-40e5-9e10-080aef32237b\") " pod="openshift-ovn-kubernetes/ovnkube-node-sspvl" Sep 30 07:33:56 crc kubenswrapper[4760]: I0930 07:33:56.821288 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/2c4ca8ea-a714-40e5-9e10-080aef32237b-ovnkube-config\") pod \"ovnkube-node-sspvl\" (UID: \"2c4ca8ea-a714-40e5-9e10-080aef32237b\") " pod="openshift-ovn-kubernetes/ovnkube-node-sspvl" Sep 30 07:33:56 crc kubenswrapper[4760]: I0930 07:33:56.821320 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/2c4ca8ea-a714-40e5-9e10-080aef32237b-var-lib-openvswitch\") pod \"ovnkube-node-sspvl\" (UID: 
\"2c4ca8ea-a714-40e5-9e10-080aef32237b\") " pod="openshift-ovn-kubernetes/ovnkube-node-sspvl" Sep 30 07:33:56 crc kubenswrapper[4760]: I0930 07:33:56.821337 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/f50c364e-d22c-4fe5-a0aa-66f4e8d8b21e-multus-daemon-config\") pod \"multus-lvdpk\" (UID: \"f50c364e-d22c-4fe5-a0aa-66f4e8d8b21e\") " pod="openshift-multus/multus-lvdpk" Sep 30 07:33:56 crc kubenswrapper[4760]: I0930 07:33:56.821376 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/f50c364e-d22c-4fe5-a0aa-66f4e8d8b21e-host-var-lib-cni-multus\") pod \"multus-lvdpk\" (UID: \"f50c364e-d22c-4fe5-a0aa-66f4e8d8b21e\") " pod="openshift-multus/multus-lvdpk" Sep 30 07:33:56 crc kubenswrapper[4760]: I0930 07:33:56.821423 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/f50c364e-d22c-4fe5-a0aa-66f4e8d8b21e-host-run-multus-certs\") pod \"multus-lvdpk\" (UID: \"f50c364e-d22c-4fe5-a0aa-66f4e8d8b21e\") " pod="openshift-multus/multus-lvdpk" Sep 30 07:33:56 crc kubenswrapper[4760]: I0930 07:33:56.821474 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/2c4ca8ea-a714-40e5-9e10-080aef32237b-etc-openvswitch\") pod \"ovnkube-node-sspvl\" (UID: \"2c4ca8ea-a714-40e5-9e10-080aef32237b\") " pod="openshift-ovn-kubernetes/ovnkube-node-sspvl" Sep 30 07:33:56 crc kubenswrapper[4760]: I0930 07:33:56.821508 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/f50c364e-d22c-4fe5-a0aa-66f4e8d8b21e-multus-conf-dir\") pod \"multus-lvdpk\" (UID: \"f50c364e-d22c-4fe5-a0aa-66f4e8d8b21e\") " pod="openshift-multus/multus-lvdpk" Sep 30 07:33:56 crc kubenswrapper[4760]: I0930 
07:33:56.821583 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/2c4ca8ea-a714-40e5-9e10-080aef32237b-host-cni-bin\") pod \"ovnkube-node-sspvl\" (UID: \"2c4ca8ea-a714-40e5-9e10-080aef32237b\") " pod="openshift-ovn-kubernetes/ovnkube-node-sspvl" Sep 30 07:33:56 crc kubenswrapper[4760]: I0930 07:33:56.821609 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/aade7c8e-aa34-4b19-9000-d724950a70d7-cnibin\") pod \"multus-additional-cni-plugins-6vfjl\" (UID: \"aade7c8e-aa34-4b19-9000-d724950a70d7\") " pod="openshift-multus/multus-additional-cni-plugins-6vfjl" Sep 30 07:33:56 crc kubenswrapper[4760]: I0930 07:33:56.821633 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/2c4ca8ea-a714-40e5-9e10-080aef32237b-host-cni-netd\") pod \"ovnkube-node-sspvl\" (UID: \"2c4ca8ea-a714-40e5-9e10-080aef32237b\") " pod="openshift-ovn-kubernetes/ovnkube-node-sspvl" Sep 30 07:33:56 crc kubenswrapper[4760]: I0930 07:33:56.821649 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/2c4ca8ea-a714-40e5-9e10-080aef32237b-run-openvswitch\") pod \"ovnkube-node-sspvl\" (UID: \"2c4ca8ea-a714-40e5-9e10-080aef32237b\") " pod="openshift-ovn-kubernetes/ovnkube-node-sspvl" Sep 30 07:33:56 crc kubenswrapper[4760]: I0930 07:33:56.821718 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/2c4ca8ea-a714-40e5-9e10-080aef32237b-run-systemd\") pod \"ovnkube-node-sspvl\" (UID: \"2c4ca8ea-a714-40e5-9e10-080aef32237b\") " pod="openshift-ovn-kubernetes/ovnkube-node-sspvl" Sep 30 07:33:56 crc kubenswrapper[4760]: I0930 07:33:56.821713 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" 
(UniqueName: \"kubernetes.io/host-path/f50c364e-d22c-4fe5-a0aa-66f4e8d8b21e-multus-cni-dir\") pod \"multus-lvdpk\" (UID: \"f50c364e-d22c-4fe5-a0aa-66f4e8d8b21e\") " pod="openshift-multus/multus-lvdpk" Sep 30 07:33:56 crc kubenswrapper[4760]: I0930 07:33:56.821776 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f50c364e-d22c-4fe5-a0aa-66f4e8d8b21e-etc-kubernetes\") pod \"multus-lvdpk\" (UID: \"f50c364e-d22c-4fe5-a0aa-66f4e8d8b21e\") " pod="openshift-multus/multus-lvdpk" Sep 30 07:33:56 crc kubenswrapper[4760]: I0930 07:33:56.821788 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/f50c364e-d22c-4fe5-a0aa-66f4e8d8b21e-cni-binary-copy\") pod \"multus-lvdpk\" (UID: \"f50c364e-d22c-4fe5-a0aa-66f4e8d8b21e\") " pod="openshift-multus/multus-lvdpk" Sep 30 07:33:56 crc kubenswrapper[4760]: I0930 07:33:56.821330 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/f50c364e-d22c-4fe5-a0aa-66f4e8d8b21e-host-var-lib-cni-bin\") pod \"multus-lvdpk\" (UID: \"f50c364e-d22c-4fe5-a0aa-66f4e8d8b21e\") " pod="openshift-multus/multus-lvdpk" Sep 30 07:33:56 crc kubenswrapper[4760]: I0930 07:33:56.821814 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/f50c364e-d22c-4fe5-a0aa-66f4e8d8b21e-cnibin\") pod \"multus-lvdpk\" (UID: \"f50c364e-d22c-4fe5-a0aa-66f4e8d8b21e\") " pod="openshift-multus/multus-lvdpk" Sep 30 07:33:56 crc kubenswrapper[4760]: I0930 07:33:56.821946 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/f50c364e-d22c-4fe5-a0aa-66f4e8d8b21e-system-cni-dir\") pod \"multus-lvdpk\" (UID: \"f50c364e-d22c-4fe5-a0aa-66f4e8d8b21e\") " pod="openshift-multus/multus-lvdpk" Sep 30 07:33:56 crc 
kubenswrapper[4760]: I0930 07:33:56.822101 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/2c4ca8ea-a714-40e5-9e10-080aef32237b-env-overrides\") pod \"ovnkube-node-sspvl\" (UID: \"2c4ca8ea-a714-40e5-9e10-080aef32237b\") " pod="openshift-ovn-kubernetes/ovnkube-node-sspvl" Sep 30 07:33:56 crc kubenswrapper[4760]: I0930 07:33:56.822974 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:53Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:33:56Z is after 2025-08-24T17:21:41Z" Sep 30 07:33:56 crc kubenswrapper[4760]: I0930 07:33:56.824406 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/2c4ca8ea-a714-40e5-9e10-080aef32237b-ovn-node-metrics-cert\") pod \"ovnkube-node-sspvl\" (UID: \"2c4ca8ea-a714-40e5-9e10-080aef32237b\") " pod="openshift-ovn-kubernetes/ovnkube-node-sspvl" Sep 30 07:33:56 crc kubenswrapper[4760]: I0930 07:33:56.837186 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-df7r7\" (UniqueName: \"kubernetes.io/projected/aade7c8e-aa34-4b19-9000-d724950a70d7-kube-api-access-df7r7\") pod \"multus-additional-cni-plugins-6vfjl\" (UID: \"aade7c8e-aa34-4b19-9000-d724950a70d7\") " pod="openshift-multus/multus-additional-cni-plugins-6vfjl" Sep 30 07:33:56 crc kubenswrapper[4760]: I0930 07:33:56.840278 4760 status_manager.go:875] 
"Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"031672b4-4779-4226-97aa-f5c81a729234\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b3e26d92b5602604d33e83e83e677084c642df64c9d0b9f9e022b036091fb92e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://53390cb3cc86d0db1019c461bf6cb987190e6fa8812f735844028dfa1ce96648\\\",\\\"image\\\":\\
\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf87f93832356c96c1c5e90e6459ccdd23700fd12c6d7410305c66f99f6ed18f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e4230b2df5bf87138f5a40f146bef562ae63ec6f5b73d023e4e2b60722854882\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9c416bd4523f4384b4a2283eb2ada263982380925cbc17713726e0790e5fdee\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:38Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2cd9c18868b27fcfb67186a86f415b484f009c33c07d18f5d508155ed011b5d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2cd9c18868b27fcfb67186a86f415b484f009c33c07d18f5d508155ed011b5d4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T07:33:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T07:33:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"
Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T07:33:35Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:33:56Z is after 2025-08-24T17:21:41Z" Sep 30 07:33:56 crc kubenswrapper[4760]: I0930 07:33:56.840597 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8g8kp\" (UniqueName: \"kubernetes.io/projected/f50c364e-d22c-4fe5-a0aa-66f4e8d8b21e-kube-api-access-8g8kp\") pod \"multus-lvdpk\" (UID: \"f50c364e-d22c-4fe5-a0aa-66f4e8d8b21e\") " pod="openshift-multus/multus-lvdpk" Sep 30 07:33:56 crc kubenswrapper[4760]: I0930 07:33:56.845511 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lllfc\" (UniqueName: \"kubernetes.io/projected/2c4ca8ea-a714-40e5-9e10-080aef32237b-kube-api-access-lllfc\") pod \"ovnkube-node-sspvl\" (UID: \"2c4ca8ea-a714-40e5-9e10-080aef32237b\") " pod="openshift-ovn-kubernetes/ovnkube-node-sspvl" Sep 30 07:33:56 crc kubenswrapper[4760]: I0930 07:33:56.854246 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"67985d3d-0af6-4a8b-b45a-623aed2e502e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43b97af91f474a3b43f207d377ca205e459b782e552a32a134a8b46e89fc8d30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d748511816054d5f621bd55e203baea2e6e5ac90dec796364b46224308cfd78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bf1b2ce86bf47ce917a20b48f64b7c4419e3a5c551d006d97171311c3b190f0c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e7c9b2cd9f096d99a8b239c0c2dc771e94e6522a3380733b7452969c9fe2dfd2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-09-30T07:33:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T07:33:35Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:33:56Z is after 2025-08-24T17:21:41Z" Sep 30 07:33:56 crc kubenswrapper[4760]: I0930 07:33:56.873909 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c5ac421ed3707970914cf165f54a9324134079296d1b6936332c6047b9c23132\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-09-30T07:33:56Z is after 2025-08-24T17:21:41Z" Sep 30 07:33:56 crc kubenswrapper[4760]: I0930 07:33:56.886635 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:53Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:33:56Z is after 2025-08-24T17:21:41Z" Sep 30 07:33:56 crc kubenswrapper[4760]: I0930 07:33:56.899207 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:33:56Z is after 2025-08-24T17:21:41Z" Sep 30 07:33:56 crc kubenswrapper[4760]: I0930 07:33:56.916788 4760 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-sspvl" Sep 30 07:33:56 crc kubenswrapper[4760]: I0930 07:33:56.919117 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-6vfjl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"aade7c8e-aa34-4b19-9000-d724950a70d7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:56Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:56Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:56Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-df7r7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-df7r7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\"
:false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-df7r7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-df7r7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\
\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-df7r7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-df7r7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/s
erviceaccount\\\",\\\"name\\\":\\\"kube-api-access-df7r7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T07:33:56Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-6vfjl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:33:56Z is after 2025-08-24T17:21:41Z" Sep 30 07:33:56 crc kubenswrapper[4760]: I0930 07:33:56.927626 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-lvdpk" Sep 30 07:33:56 crc kubenswrapper[4760]: I0930 07:33:56.936657 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-6vfjl" Sep 30 07:33:56 crc kubenswrapper[4760]: W0930 07:33:56.939392 4760 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2c4ca8ea_a714_40e5_9e10_080aef32237b.slice/crio-c846605e3915d29b64cf72abf9f7642ff689799c187453d57ca92d5f01aeea8e WatchSource:0}: Error finding container c846605e3915d29b64cf72abf9f7642ff689799c187453d57ca92d5f01aeea8e: Status 404 returned error can't find the container with id c846605e3915d29b64cf72abf9f7642ff689799c187453d57ca92d5f01aeea8e Sep 30 07:33:56 crc kubenswrapper[4760]: W0930 07:33:56.940083 4760 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf50c364e_d22c_4fe5_a0aa_66f4e8d8b21e.slice/crio-b07bfbb38510633b067afe6e4b862c87fe053aeead93f3700980e9b395c32b28 WatchSource:0}: Error finding container b07bfbb38510633b067afe6e4b862c87fe053aeead93f3700980e9b395c32b28: Status 404 
returned error can't find the container with id b07bfbb38510633b067afe6e4b862c87fe053aeead93f3700980e9b395c32b28 Sep 30 07:33:56 crc kubenswrapper[4760]: I0930 07:33:56.949374 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9092fdd9-f142-400d-b446-47b8103b9694\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4cc3328dfae8f6d99c152e6a53b86d72f690afbd7bc4847810f745b321e4f3ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/ku
bernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5aaa898c76100f011c3ad0ba61c9735a940b46b0e770f3ede7fc4c881a4722a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b718c4a73fd119f66dd346618976f0b3589b789576e53d367588c2772becd66e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d00b9a56c23ba3b5701a31412e038c073b5f16eb4e6ebea42134f7d4c0b0f21\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f584
08f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://00d03c8ff6fa9f490ec0550ac57d20786f1f5c10d91011d678116c2249a79e88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ed4236004989a888f1f864535bedd2dc4bcad4c51c87f12be074abbe5f8fe8bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a6
7314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ed4236004989a888f1f864535bedd2dc4bcad4c51c87f12be074abbe5f8fe8bd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T07:33:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T07:33:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://553fa289f27580c1c38da1e9ea12ae7231fc4f60c660a9a3a177391adba9c3f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://553fa289f27580c1c38da1e9ea12ae7231fc4f60c660a9a3a177391adba9c3f2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T07:33:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T07:33:37Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://83b9ee5e77d60387298ccac46ad416a5d56bfd1d81a6ef7f3ea0482520c2ca08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://83b9ee5e77d60387298ccac46ad416a5d56bfd1d81a6ef7f3ea0482520c2ca08\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-
09-30T07:33:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T07:33:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T07:33:35Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:33:56Z is after 2025-08-24T17:21:41Z" Sep 30 07:33:56 crc kubenswrapper[4760]: W0930 07:33:56.951122 4760 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podaade7c8e_aa34_4b19_9000_d724950a70d7.slice/crio-381f85fa0d186163c8cf2c1ba3e9ba285d89938e71f8dd357f013341a3b08e86 WatchSource:0}: Error finding container 381f85fa0d186163c8cf2c1ba3e9ba285d89938e71f8dd357f013341a3b08e86: Status 404 returned error can't find the container with id 381f85fa0d186163c8cf2c1ba3e9ba285d89938e71f8dd357f013341a3b08e86 Sep 30 07:33:56 crc kubenswrapper[4760]: I0930 07:33:56.975615 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-sspvl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2c4ca8ea-a714-40e5-9e10-080aef32237b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:56Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:56Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:56Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lllfc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lllfc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lllfc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lllfc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lllfc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lllfc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lllfc\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lllfc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lllfc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T07:33:56Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-sspvl\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:33:56Z is after 2025-08-24T17:21:41Z" Sep 30 07:33:57 crc kubenswrapper[4760]: I0930 07:33:57.002529 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c5ae239e08e07bda9532c0b6b94ef3687cf512cd05965e09123bbb036f915a5b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serv
iceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3abb25619d0c80e84a2095aba45fe6a865125d242e44e9f10820e8c41a3b0db9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:33:57Z is after 2025-08-24T17:21:41Z" Sep 30 07:33:57 crc kubenswrapper[4760]: I0930 07:33:57.035351 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:33:57Z is after 2025-08-24T17:21:41Z" Sep 30 07:33:57 crc kubenswrapper[4760]: I0930 07:33:57.077441 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-sv6wk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4a8035f8-210d-4a09-bca5-274ced93774c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:56Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:56Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6pk8k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T07:33:56Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-sv6wk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:33:57Z is after 2025-08-24T17:21:41Z" Sep 30 07:33:57 crc kubenswrapper[4760]: I0930 07:33:57.122916 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-f2lrk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7a9c8270-6964-4886-87d0-227b05b76da4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:56Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:56Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cctgt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cctgt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T07:33:56Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-f2lrk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:33:57Z is after 2025-08-24T17:21:41Z" Sep 30 07:33:57 crc kubenswrapper[4760]: I0930 07:33:57.154111 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-lvdpk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f50c364e-d22c-4fe5-a0aa-66f4e8d8b21e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:56Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:56Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8g8kp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T07:33:56Z\\\"}}\" for pod \"openshift-multus\"/\"multus-lvdpk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:33:57Z is after 2025-08-24T17:21:41Z" Sep 30 07:33:57 crc kubenswrapper[4760]: I0930 07:33:57.191796 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:33:57Z is after 2025-08-24T17:21:41Z" Sep 30 07:33:57 crc kubenswrapper[4760]: I0930 07:33:57.227844 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-lvdpk" event={"ID":"f50c364e-d22c-4fe5-a0aa-66f4e8d8b21e","Type":"ContainerStarted","Data":"96fd613333f4911d54aca94a38724570ef878c3f55ef48caa79eb1c14d0b2014"} Sep 30 07:33:57 crc kubenswrapper[4760]: I0930 07:33:57.228358 4760 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-multus/multus-lvdpk" event={"ID":"f50c364e-d22c-4fe5-a0aa-66f4e8d8b21e","Type":"ContainerStarted","Data":"b07bfbb38510633b067afe6e4b862c87fe053aeead93f3700980e9b395c32b28"} Sep 30 07:33:57 crc kubenswrapper[4760]: I0930 07:33:57.232345 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-f2lrk" event={"ID":"7a9c8270-6964-4886-87d0-227b05b76da4","Type":"ContainerStarted","Data":"e3cf6cd0daf5df3e48c5d44b1a443b1e5646009b9eba42bc3869ee9841e472d9"} Sep 30 07:33:57 crc kubenswrapper[4760]: I0930 07:33:57.232389 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-f2lrk" event={"ID":"7a9c8270-6964-4886-87d0-227b05b76da4","Type":"ContainerStarted","Data":"90d669a96085f6fb9b5c9507ff1d5c960bdeafa1d01d7c115b89788e14e155d7"} Sep 30 07:33:57 crc kubenswrapper[4760]: I0930 07:33:57.232400 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-f2lrk" event={"ID":"7a9c8270-6964-4886-87d0-227b05b76da4","Type":"ContainerStarted","Data":"5b839e2aa795fd72c4393e48fc311d34d65571e4f032e469ec2a24a8c34e3f28"} Sep 30 07:33:57 crc kubenswrapper[4760]: I0930 07:33:57.234601 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-sv6wk" event={"ID":"4a8035f8-210d-4a09-bca5-274ced93774c","Type":"ContainerStarted","Data":"774d38a9cedeb69a2093a9de05db794e856295535f86d73a75c6feea05b61cb3"} Sep 30 07:33:57 crc kubenswrapper[4760]: I0930 07:33:57.234662 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-sv6wk" event={"ID":"4a8035f8-210d-4a09-bca5-274ced93774c","Type":"ContainerStarted","Data":"42febe0a97b0a75d58a1f977cc6321d3ac5f7ceac83c278688a4f8ae98a42115"} Sep 30 07:33:57 crc kubenswrapper[4760]: I0930 07:33:57.235794 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-multus/multus-additional-cni-plugins-6vfjl" event={"ID":"aade7c8e-aa34-4b19-9000-d724950a70d7","Type":"ContainerStarted","Data":"381f85fa0d186163c8cf2c1ba3e9ba285d89938e71f8dd357f013341a3b08e86"} Sep 30 07:33:57 crc kubenswrapper[4760]: I0930 07:33:57.242077 4760 generic.go:334] "Generic (PLEG): container finished" podID="2c4ca8ea-a714-40e5-9e10-080aef32237b" containerID="1198d13e2814e4dc38c4411d751fba40440a5967d7ca04c855e34fe90f9a6535" exitCode=0 Sep 30 07:33:57 crc kubenswrapper[4760]: I0930 07:33:57.242176 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-sspvl" event={"ID":"2c4ca8ea-a714-40e5-9e10-080aef32237b","Type":"ContainerDied","Data":"1198d13e2814e4dc38c4411d751fba40440a5967d7ca04c855e34fe90f9a6535"} Sep 30 07:33:57 crc kubenswrapper[4760]: I0930 07:33:57.242266 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-sspvl" event={"ID":"2c4ca8ea-a714-40e5-9e10-080aef32237b","Type":"ContainerStarted","Data":"c846605e3915d29b64cf72abf9f7642ff689799c187453d57ca92d5f01aeea8e"} Sep 30 07:33:57 crc kubenswrapper[4760]: I0930 07:33:57.251272 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:53Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:33:57Z is after 2025-08-24T17:21:41Z" Sep 30 07:33:57 crc kubenswrapper[4760]: I0930 07:33:57.273262 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:53Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:33:57Z is after 2025-08-24T17:21:41Z" Sep 30 07:33:57 crc kubenswrapper[4760]: I0930 07:33:57.333856 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-6vfjl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"aade7c8e-aa34-4b19-9000-d724950a70d7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:56Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni 
whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:56Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:56Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-df7r7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\
\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-df7r7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-df7r7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,
\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-df7r7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-df7r7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-df7r7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/ope
nshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-df7r7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T07:33:56Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-6vfjl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:33:57Z is after 2025-08-24T17:21:41Z" Sep 30 07:33:57 crc kubenswrapper[4760]: I0930 07:33:57.368862 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9092fdd9-f142-400d-b446-47b8103b9694\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4cc3328dfae8f6d99c152e6a53b86d72f690afbd7bc4847810f745b321e4f3ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5aaa898c76100f011c3ad0ba61c9735a940b46b0e770f3ede7fc4c881a4722a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b718c4a73fd119f66dd346618976f0b3589b789576e53d367588c2772becd66e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d00b9a56c23ba3b5701a31412e038c073b5f16eb4e6ebea42134f7d4c0b0f21\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://00d03c8ff6fa9f490ec0550ac57d20786f1f5c10d91011d678116c2249a79e88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ed4236004989a888f1f864535bedd2dc4bcad4c51c87f12be074abbe5f8fe8bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ed4236004989a888f1f864535bedd2dc4bcad4c51c87f12be074abbe5f8fe8bd\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-09-30T07:33:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T07:33:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://553fa289f27580c1c38da1e9ea12ae7231fc4f60c660a9a3a177391adba9c3f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://553fa289f27580c1c38da1e9ea12ae7231fc4f60c660a9a3a177391adba9c3f2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T07:33:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T07:33:37Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://83b9ee5e77d60387298ccac46ad416a5d56bfd1d81a6ef7f3ea0482520c2ca08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://83b9ee5e77d60387298ccac46ad416a5d56bfd1d81a6ef7f3ea0482520c2ca08\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T07:33:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T07:33:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T07:33:35Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:33:57Z is after 2025-08-24T17:21:41Z" Sep 30 07:33:57 crc kubenswrapper[4760]: I0930 07:33:57.411078 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"031672b4-4779-4226-97aa-f5c81a729234\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b3e26d92b5602604d33e83e83e677084c642df64c9d0b9f9e022b036091fb92e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-rel
ease-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://53390cb3cc86d0db1019c461bf6cb987190e6fa8812f735844028dfa1ce96648\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf87f93832356c96c1c5e90e6459ccdd23700fd12c6d7410305c66f99f6ed18f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:37Z\\\"}},\\\"vo
lumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e4230b2df5bf87138f5a40f146bef562ae63ec6f5b73d023e4e2b60722854882\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9c416bd4523f4384b4a2283eb2ada263982380925cbc17713726e0790e5fdee\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:38Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2cd9c18868b27fcfb67186a86f415b484f009c33c07d18f5d508155ed011b5d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee12
20d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2cd9c18868b27fcfb67186a86f415b484f009c33c07d18f5d508155ed011b5d4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T07:33:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T07:33:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T07:33:35Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:33:57Z is after 2025-08-24T17:21:41Z" Sep 30 07:33:57 crc kubenswrapper[4760]: I0930 07:33:57.442973 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"67985d3d-0af6-4a8b-b45a-623aed2e502e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43b97af91f474a3b43f207d377ca205e459b782e552a32a134a8b46e89fc8d30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d748511816054d5f621bd55e203baea2e6e5ac90dec796364b46224308cfd78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bf1b2ce86bf47ce917a20b48f64b7c4419e3a5c551d006d97171311c3b190f0c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e7c9b2cd9f096d99a8b239c0c2dc771e94e6522a3380733b7452969c9fe2dfd2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-09-30T07:33:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T07:33:35Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:33:57Z is after 2025-08-24T17:21:41Z" Sep 30 07:33:57 crc kubenswrapper[4760]: I0930 07:33:57.479164 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c5ac421ed3707970914cf165f54a9324134079296d1b6936332c6047b9c23132\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-09-30T07:33:57Z is after 2025-08-24T17:21:41Z" Sep 30 07:33:57 crc kubenswrapper[4760]: I0930 07:33:57.528758 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-sspvl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2c4ca8ea-a714-40e5-9e10-080aef32237b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:56Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:56Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:56Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lllfc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lllfc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lllfc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lllfc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lllfc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lllfc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lllfc\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lllfc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lllfc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T07:33:56Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-sspvl\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:33:57Z is after 2025-08-24T17:21:41Z" Sep 30 07:33:57 crc kubenswrapper[4760]: I0930 07:33:57.550417 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c5ae239e08e07bda9532c0b6b94ef3687cf512cd05965e09123bbb036f915a5b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serv
iceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3abb25619d0c80e84a2095aba45fe6a865125d242e44e9f10820e8c41a3b0db9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:33:57Z is after 2025-08-24T17:21:41Z" Sep 30 07:33:57 crc kubenswrapper[4760]: I0930 07:33:57.593058 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:33:57Z is after 2025-08-24T17:21:41Z" Sep 30 07:33:57 crc kubenswrapper[4760]: I0930 07:33:57.629005 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-sv6wk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4a8035f8-210d-4a09-bca5-274ced93774c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:56Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:56Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6pk8k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T07:33:56Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-sv6wk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:33:57Z is after 2025-08-24T17:21:41Z" Sep 30 07:33:57 crc kubenswrapper[4760]: I0930 07:33:57.673042 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:53Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:33:57Z is after 2025-08-24T17:21:41Z" Sep 30 07:33:57 crc kubenswrapper[4760]: I0930 07:33:57.710592 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-f2lrk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7a9c8270-6964-4886-87d0-227b05b76da4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:56Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:56Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cctgt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cctgt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T07:33:56Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-f2lrk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:33:57Z is after 2025-08-24T17:21:41Z" Sep 30 07:33:57 crc kubenswrapper[4760]: I0930 07:33:57.750536 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-lvdpk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f50c364e-d22c-4fe5-a0aa-66f4e8d8b21e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://96fd613333f4911d54aca94a38724570ef878c3f55ef48caa79eb1c14d0b2014\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"na
me\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8g8kp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T07:33:56Z\\\"}}\" for pod \"openshift-multus\"/\"multus-lvdpk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:33:57Z is after 2025-08-24T17:21:41Z" Sep 30 07:33:57 crc kubenswrapper[4760]: I0930 07:33:57.802124 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-sspvl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2c4ca8ea-a714-40e5-9e10-080aef32237b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:56Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:56Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lllfc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lllfc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lllfc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lllfc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lllfc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lllfc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lllfc\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lllfc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1198d13e2814e4dc38c4411d751fba40440a5967d7ca04c855e34fe90f9a6535\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1198d13e2814e4dc38c4411d751fba40440a5967d7ca04c855e34fe90f9a6535\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T07:33:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T07:33:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lllfc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T07:33:56Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-sspvl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:33:57Z is after 2025-08-24T17:21:41Z" Sep 30 07:33:57 crc kubenswrapper[4760]: I0930 07:33:57.844412 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c5ae239e08e07bda9532c0b6b94ef3687cf512cd05965e09123bbb036f915a5b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":
{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3abb25619d0c80e84a2095aba45fe6a865125d242e44e9f10820e8c41a3b0db9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:33:57Z is after 2025-08-24T17:21:41Z" Sep 30 
07:33:57 crc kubenswrapper[4760]: I0930 07:33:57.874993 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:33:57Z is after 2025-08-24T17:21:41Z" Sep 30 07:33:57 crc kubenswrapper[4760]: I0930 07:33:57.913314 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-sv6wk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4a8035f8-210d-4a09-bca5-274ced93774c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://774d38a9cedeb69a2093a9de05db794e856295535f86d73a75c6feea05b61cb3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6pk8k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T07:33:56Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-sv6wk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:33:57Z is after 2025-08-24T17:21:41Z" Sep 30 07:33:57 crc kubenswrapper[4760]: I0930 07:33:57.951710 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:53Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:33:57Z is after 2025-08-24T17:21:41Z" Sep 30 07:33:57 crc kubenswrapper[4760]: I0930 07:33:57.989805 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-f2lrk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7a9c8270-6964-4886-87d0-227b05b76da4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e3cf6cd0daf5df3e48c5d44b1a443b1e5646009b9eba42bc3869ee9841e472d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cctgt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://90d669a96085f6fb9b5c9507ff1d5c960bdeafa1
d01d7c115b89788e14e155d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cctgt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T07:33:56Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-f2lrk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:33:57Z is after 2025-08-24T17:21:41Z" Sep 30 07:33:58 crc kubenswrapper[4760]: I0930 07:33:58.031087 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-lvdpk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f50c364e-d22c-4fe5-a0aa-66f4e8d8b21e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://96fd613333f4911d54aca94a38724570ef878c3f55ef48caa79eb1c14d0b2014\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8g8kp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T07:33:56Z\\\"}}\" for pod \"openshift-multus\"/\"multus-lvdpk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:33:58Z is after 2025-08-24T17:21:41Z" Sep 30 07:33:58 crc kubenswrapper[4760]: I0930 07:33:58.066533 4760 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 07:33:58 crc kubenswrapper[4760]: I0930 07:33:58.066544 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 07:33:58 crc kubenswrapper[4760]: I0930 07:33:58.066652 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 07:33:58 crc kubenswrapper[4760]: E0930 07:33:58.066692 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 07:33:58 crc kubenswrapper[4760]: E0930 07:33:58.066716 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 07:33:58 crc kubenswrapper[4760]: E0930 07:33:58.066906 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 07:33:58 crc kubenswrapper[4760]: I0930 07:33:58.071505 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:53Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:33:58Z is after 2025-08-24T17:21:41Z" Sep 30 07:33:58 crc kubenswrapper[4760]: I0930 07:33:58.112755 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-6vfjl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"aade7c8e-aa34-4b19-9000-d724950a70d7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:56Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni 
whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:56Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:56Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-df7r7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\
\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-df7r7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-df7r7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,
\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-df7r7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-df7r7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-df7r7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/ope
nshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-df7r7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T07:33:56Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-6vfjl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:33:58Z is after 2025-08-24T17:21:41Z" Sep 30 07:33:58 crc kubenswrapper[4760]: I0930 07:33:58.163450 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9092fdd9-f142-400d-b446-47b8103b9694\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4cc3328dfae8f6d99c152e6a53b86d72f690afbd7bc4847810f745b321e4f3ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5aaa898c76100f011c3ad0ba61c9735a940b46b0e770f3ede7fc4c881a4722a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b718c4a73fd119f66dd346618976f0b3589b789576e53d367588c2772becd66e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d00b9a56c23ba3b5701a31412e038c073b5f16eb4e6ebea42134f7d4c0b0f21\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://00d03c8ff6fa9f490ec0550ac57d20786f1f5c10d91011d678116c2249a79e88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ed4236004989a888f1f864535bedd2dc4bcad4c51c87f12be074abbe5f8fe8bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ed4236004989a888f1f864535bedd2dc4bcad4c51c87f12be074abbe5f8fe8bd\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-09-30T07:33:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T07:33:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://553fa289f27580c1c38da1e9ea12ae7231fc4f60c660a9a3a177391adba9c3f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://553fa289f27580c1c38da1e9ea12ae7231fc4f60c660a9a3a177391adba9c3f2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T07:33:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T07:33:37Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://83b9ee5e77d60387298ccac46ad416a5d56bfd1d81a6ef7f3ea0482520c2ca08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://83b9ee5e77d60387298ccac46ad416a5d56bfd1d81a6ef7f3ea0482520c2ca08\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T07:33:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T07:33:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T07:33:35Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:33:58Z is after 2025-08-24T17:21:41Z" Sep 30 07:33:58 crc kubenswrapper[4760]: I0930 07:33:58.195009 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"031672b4-4779-4226-97aa-f5c81a729234\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b3e26d92b5602604d33e83e83e677084c642df64c9d0b9f9e022b036091fb92e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-rel
ease-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://53390cb3cc86d0db1019c461bf6cb987190e6fa8812f735844028dfa1ce96648\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf87f93832356c96c1c5e90e6459ccdd23700fd12c6d7410305c66f99f6ed18f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:37Z\\\"}},\\\"vo
lumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e4230b2df5bf87138f5a40f146bef562ae63ec6f5b73d023e4e2b60722854882\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9c416bd4523f4384b4a2283eb2ada263982380925cbc17713726e0790e5fdee\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:38Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2cd9c18868b27fcfb67186a86f415b484f009c33c07d18f5d508155ed011b5d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee12
20d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2cd9c18868b27fcfb67186a86f415b484f009c33c07d18f5d508155ed011b5d4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T07:33:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T07:33:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T07:33:35Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:33:58Z is after 2025-08-24T17:21:41Z" Sep 30 07:33:58 crc kubenswrapper[4760]: I0930 07:33:58.234837 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"67985d3d-0af6-4a8b-b45a-623aed2e502e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43b97af91f474a3b43f207d377ca205e459b782e552a32a134a8b46e89fc8d30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d748511816054d5f621bd55e203baea2e6e5ac90dec796364b46224308cfd78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bf1b2ce86bf47ce917a20b48f64b7c4419e3a5c551d006d97171311c3b190f0c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e7c9b2cd9f096d99a8b239c0c2dc771e94e6522a3380733b7452969c9fe2dfd2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-09-30T07:33:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T07:33:35Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:33:58Z is after 2025-08-24T17:21:41Z" Sep 30 07:33:58 crc kubenswrapper[4760]: I0930 07:33:58.247633 4760 generic.go:334] "Generic (PLEG): container finished" podID="aade7c8e-aa34-4b19-9000-d724950a70d7" containerID="e9c7c74e636ef9dbade2541863f96b1098f7ae5f1ab02740c8934f8c87c4815e" exitCode=0 Sep 30 07:33:58 crc kubenswrapper[4760]: I0930 07:33:58.247754 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-6vfjl" event={"ID":"aade7c8e-aa34-4b19-9000-d724950a70d7","Type":"ContainerDied","Data":"e9c7c74e636ef9dbade2541863f96b1098f7ae5f1ab02740c8934f8c87c4815e"} Sep 30 07:33:58 crc kubenswrapper[4760]: I0930 07:33:58.257740 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-sspvl" event={"ID":"2c4ca8ea-a714-40e5-9e10-080aef32237b","Type":"ContainerStarted","Data":"9e64e8566880e723a6cc19ad788bd1cbb02eb639397aea4acd6e9920b034f379"} Sep 30 07:33:58 crc kubenswrapper[4760]: I0930 07:33:58.257809 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-sspvl" 
event={"ID":"2c4ca8ea-a714-40e5-9e10-080aef32237b","Type":"ContainerStarted","Data":"04cb9ff2859a579a0bd97d932177b8c19b786666d0177669e93ccec38abf08d5"} Sep 30 07:33:58 crc kubenswrapper[4760]: I0930 07:33:58.257823 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-sspvl" event={"ID":"2c4ca8ea-a714-40e5-9e10-080aef32237b","Type":"ContainerStarted","Data":"45d8c685d7b690ce81553e4d924934cf761087dd535145c0e8b5a933f2d61e45"} Sep 30 07:33:58 crc kubenswrapper[4760]: I0930 07:33:58.257834 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-sspvl" event={"ID":"2c4ca8ea-a714-40e5-9e10-080aef32237b","Type":"ContainerStarted","Data":"ef1f6b6ca1a2bbb2f9a166a7244c6411470be4655bc6b2a330fd604c3449d185"} Sep 30 07:33:58 crc kubenswrapper[4760]: I0930 07:33:58.282024 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c5ac421ed3707970914cf165f54a9324134079296d1b6936332c6047b9c23132\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-09-30T07:33:58Z is after 2025-08-24T17:21:41Z" Sep 30 07:33:58 crc kubenswrapper[4760]: I0930 07:33:58.315029 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:53Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:33:58Z is after 2025-08-24T17:21:41Z" Sep 30 07:33:58 crc kubenswrapper[4760]: I0930 07:33:58.358080 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-lvdpk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f50c364e-d22c-4fe5-a0aa-66f4e8d8b21e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://96fd613333f4911d54aca94a38724570ef878c3f55ef48caa79eb1c14d0b2014\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8g8kp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T07:33:56Z\\\"}}\" for pod \"openshift-multus\"/\"multus-lvdpk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:33:58Z is after 2025-08-24T17:21:41Z" Sep 30 07:33:58 crc kubenswrapper[4760]: I0930 07:33:58.393783 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to 
patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:53Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:33:58Z is after 2025-08-24T17:21:41Z" Sep 30 07:33:58 crc kubenswrapper[4760]: I0930 07:33:58.435655 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-f2lrk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7a9c8270-6964-4886-87d0-227b05b76da4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e3cf6cd0daf5df3e48c5d44b1a443b1e5646009b9eba42bc3869ee9841e472d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cctgt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://90d669a96085f6fb9b5c9507ff1d5c960bdeafa1
d01d7c115b89788e14e155d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cctgt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T07:33:56Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-f2lrk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:33:58Z is after 2025-08-24T17:21:41Z" Sep 30 07:33:58 crc kubenswrapper[4760]: I0930 07:33:58.471471 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"67985d3d-0af6-4a8b-b45a-623aed2e502e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43b97af91f474a3b43f207d377ca205e459b782e552a32a134a8b46e89fc8d30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d748511816054d5f621bd55e203baea2e6e5ac90dec796364b46224308cfd78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bf1b2ce86bf47ce917a20b48f64b7c4419e3a5c551d006d97171311c3b190f0c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e7c9b2cd9f096d99a8b239c0c2dc771e94e6522a3380733b7452969c9fe2dfd2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-09-30T07:33:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T07:33:35Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:33:58Z is after 2025-08-24T17:21:41Z" Sep 30 07:33:58 crc kubenswrapper[4760]: I0930 07:33:58.515922 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c5ac421ed3707970914cf165f54a9324134079296d1b6936332c6047b9c23132\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-09-30T07:33:58Z is after 2025-08-24T17:21:41Z" Sep 30 07:33:58 crc kubenswrapper[4760]: I0930 07:33:58.554233 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:53Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:33:58Z is after 2025-08-24T17:21:41Z" Sep 30 07:33:58 crc kubenswrapper[4760]: I0930 07:33:58.592932 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:33:58Z is after 2025-08-24T17:21:41Z" Sep 30 07:33:58 crc kubenswrapper[4760]: I0930 07:33:58.633841 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-6vfjl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"aade7c8e-aa34-4b19-9000-d724950a70d7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:56Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:56Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:56Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-df7r7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e9c7c74e636ef9dbade2541863f96b1098f7ae5f1ab02740c8934f8c87c4815e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e9c7c74e636ef9dbade2541863f96b1098f7ae5f1ab02740c8934f8c87c4815e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T07:33:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T07:33:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-df7r7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-df7r7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube
-api-access-df7r7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-df7r7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-df7r7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\
\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-df7r7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T07:33:56Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-6vfjl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:33:58Z is after 2025-08-24T17:21:41Z" Sep 30 07:33:58 crc kubenswrapper[4760]: I0930 07:33:58.639182 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 07:33:58 crc kubenswrapper[4760]: I0930 07:33:58.639389 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 07:33:58 crc kubenswrapper[4760]: I0930 
07:33:58.639454 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 07:33:58 crc kubenswrapper[4760]: E0930 07:33:58.639550 4760 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Sep 30 07:33:58 crc kubenswrapper[4760]: E0930 07:33:58.639630 4760 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Sep 30 07:33:58 crc kubenswrapper[4760]: E0930 07:33:58.639648 4760 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 07:34:02.639534835 +0000 UTC m=+28.282441287 (durationBeforeRetry 4s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 07:33:58 crc kubenswrapper[4760]: E0930 07:33:58.639729 4760 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. 
No retries permitted until 2025-09-30 07:34:02.639703849 +0000 UTC m=+28.282610261 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Sep 30 07:33:58 crc kubenswrapper[4760]: E0930 07:33:58.639843 4760 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-09-30 07:34:02.639831232 +0000 UTC m=+28.282737644 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Sep 30 07:33:58 crc kubenswrapper[4760]: I0930 07:33:58.682511 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9092fdd9-f142-400d-b446-47b8103b9694\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4cc3328dfae8f6d99c152e6a53b86d72f690afbd7bc4847810f745b321e4f3ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5aaa898c76100f011c3ad0ba61c9735a940b46b0e770f3ede7fc4c881a4722a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b718c4a73fd119f66dd346618976f0b3589b789576e53d367588c2772becd66e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d00b9a56c23ba3b5701a31412e038c073b5f16eb4e6ebea42134f7d4c0b0f21\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://00d03c8ff6fa9f490ec0550ac57d20786f1f5c10d91011d678116c2249a79e88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ed4236004989a888f1f864535bedd2dc4bcad4c51c87f12be074abbe5f8fe8bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ed4236004989a888f1f864535bedd2dc4bcad4c51c87f12be074abbe5f8fe8bd\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-09-30T07:33:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T07:33:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://553fa289f27580c1c38da1e9ea12ae7231fc4f60c660a9a3a177391adba9c3f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://553fa289f27580c1c38da1e9ea12ae7231fc4f60c660a9a3a177391adba9c3f2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T07:33:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T07:33:37Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://83b9ee5e77d60387298ccac46ad416a5d56bfd1d81a6ef7f3ea0482520c2ca08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://83b9ee5e77d60387298ccac46ad416a5d56bfd1d81a6ef7f3ea0482520c2ca08\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T07:33:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T07:33:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T07:33:35Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:33:58Z is after 2025-08-24T17:21:41Z" Sep 30 07:33:58 crc kubenswrapper[4760]: I0930 07:33:58.713833 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"031672b4-4779-4226-97aa-f5c81a729234\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b3e26d92b5602604d33e83e83e677084c642df64c9d0b9f9e022b036091fb92e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-rel
ease-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://53390cb3cc86d0db1019c461bf6cb987190e6fa8812f735844028dfa1ce96648\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf87f93832356c96c1c5e90e6459ccdd23700fd12c6d7410305c66f99f6ed18f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:37Z\\\"}},\\\"vo
lumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e4230b2df5bf87138f5a40f146bef562ae63ec6f5b73d023e4e2b60722854882\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9c416bd4523f4384b4a2283eb2ada263982380925cbc17713726e0790e5fdee\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:38Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2cd9c18868b27fcfb67186a86f415b484f009c33c07d18f5d508155ed011b5d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee12
20d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2cd9c18868b27fcfb67186a86f415b484f009c33c07d18f5d508155ed011b5d4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T07:33:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T07:33:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T07:33:35Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:33:58Z is after 2025-08-24T17:21:41Z" Sep 30 07:33:58 crc kubenswrapper[4760]: I0930 07:33:58.741182 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 07:33:58 crc kubenswrapper[4760]: I0930 07:33:58.741234 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: 
\"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 07:33:58 crc kubenswrapper[4760]: E0930 07:33:58.741419 4760 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Sep 30 07:33:58 crc kubenswrapper[4760]: E0930 07:33:58.741440 4760 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Sep 30 07:33:58 crc kubenswrapper[4760]: E0930 07:33:58.741454 4760 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Sep 30 07:33:58 crc kubenswrapper[4760]: E0930 07:33:58.741456 4760 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Sep 30 07:33:58 crc kubenswrapper[4760]: E0930 07:33:58.741487 4760 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Sep 30 07:33:58 crc kubenswrapper[4760]: E0930 07:33:58.741501 4760 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Sep 30 07:33:58 crc kubenswrapper[4760]: E0930 07:33:58.741513 4760 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-09-30 07:34:02.741495065 +0000 UTC m=+28.384401477 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Sep 30 07:33:58 crc kubenswrapper[4760]: E0930 07:33:58.741571 4760 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-09-30 07:34:02.741548796 +0000 UTC m=+28.384455208 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Sep 30 07:33:58 crc kubenswrapper[4760]: I0930 07:33:58.760407 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-sspvl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2c4ca8ea-a714-40e5-9e10-080aef32237b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:56Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:56Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lllfc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lllfc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lllfc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lllfc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lllfc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lllfc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lllfc\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lllfc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1198d13e2814e4dc38c4411d751fba40440a5967d7ca04c855e34fe90f9a6535\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1198d13e2814e4dc38c4411d751fba40440a5967d7ca04c855e34fe90f9a6535\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T07:33:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T07:33:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lllfc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T07:33:56Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-sspvl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:33:58Z is after 2025-08-24T17:21:41Z" Sep 30 07:33:58 crc kubenswrapper[4760]: I0930 07:33:58.796084 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:33:58Z is after 2025-08-24T17:21:41Z" Sep 30 07:33:58 crc kubenswrapper[4760]: I0930 07:33:58.834047 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-sv6wk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4a8035f8-210d-4a09-bca5-274ced93774c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://774d38a9cedeb69a2093a9de05db794e856295535f86d73a75c6feea05b61cb3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6pk8k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T07:33:56Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-sv6wk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:33:58Z is after 2025-08-24T17:21:41Z" Sep 30 07:33:58 crc kubenswrapper[4760]: I0930 07:33:58.870862 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c5ae239e08e07bda9532c0b6b94ef3687cf512cd05965e09123bbb036f915a5b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\
\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3abb25619d0c80e84a2095aba45fe6a865125d242e44e9f10820e8c41a3b0db9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:33:58Z is after 2025-08-24T17:21:41Z" Sep 30 07:33:58 crc kubenswrapper[4760]: I0930 07:33:58.890471 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/node-ca-rpvhp"] Sep 30 07:33:58 crc kubenswrapper[4760]: I0930 07:33:58.890978 4760 util.go:30] 
"No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-rpvhp" Sep 30 07:33:58 crc kubenswrapper[4760]: I0930 07:33:58.901882 4760 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Sep 30 07:33:58 crc kubenswrapper[4760]: I0930 07:33:58.921688 4760 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Sep 30 07:33:58 crc kubenswrapper[4760]: I0930 07:33:58.941580 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Sep 30 07:33:58 crc kubenswrapper[4760]: I0930 07:33:58.962012 4760 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Sep 30 07:33:58 crc kubenswrapper[4760]: I0930 07:33:58.989443 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-sv6wk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4a8035f8-210d-4a09-bca5-274ced93774c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://774d38a9cedeb69a2093a9de05db794e856295535f86d73a75c6feea05b61cb3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6pk8k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T07:33:56Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-sv6wk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:33:58Z is after 2025-08-24T17:21:41Z" Sep 30 07:33:59 crc kubenswrapper[4760]: I0930 07:33:59.034765 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c5ae239e08e07bda9532c0b6b94ef3687cf512cd05965e09123bbb036f915a5b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\
\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3abb25619d0c80e84a2095aba45fe6a865125d242e44e9f10820e8c41a3b0db9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:33:59Z is after 2025-08-24T17:21:41Z" Sep 30 07:33:59 crc kubenswrapper[4760]: I0930 07:33:59.042931 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7j7qm\" (UniqueName: 
\"kubernetes.io/projected/a8530550-438d-46a5-aa3f-4b10838396f1-kube-api-access-7j7qm\") pod \"node-ca-rpvhp\" (UID: \"a8530550-438d-46a5-aa3f-4b10838396f1\") " pod="openshift-image-registry/node-ca-rpvhp" Sep 30 07:33:59 crc kubenswrapper[4760]: I0930 07:33:59.043015 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/a8530550-438d-46a5-aa3f-4b10838396f1-host\") pod \"node-ca-rpvhp\" (UID: \"a8530550-438d-46a5-aa3f-4b10838396f1\") " pod="openshift-image-registry/node-ca-rpvhp" Sep 30 07:33:59 crc kubenswrapper[4760]: I0930 07:33:59.043037 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/a8530550-438d-46a5-aa3f-4b10838396f1-serviceca\") pod \"node-ca-rpvhp\" (UID: \"a8530550-438d-46a5-aa3f-4b10838396f1\") " pod="openshift-image-registry/node-ca-rpvhp" Sep 30 07:33:59 crc kubenswrapper[4760]: I0930 07:33:59.073198 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:33:59Z is after 2025-08-24T17:21:41Z" Sep 30 07:33:59 crc kubenswrapper[4760]: I0930 07:33:59.112519 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-rpvhp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a8530550-438d-46a5-aa3f-4b10838396f1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:58Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:58Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7j7qm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T07:33:58Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-rpvhp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:33:59Z is after 2025-08-24T17:21:41Z" Sep 30 07:33:59 crc kubenswrapper[4760]: I0930 07:33:59.144501 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/a8530550-438d-46a5-aa3f-4b10838396f1-host\") pod \"node-ca-rpvhp\" (UID: \"a8530550-438d-46a5-aa3f-4b10838396f1\") " pod="openshift-image-registry/node-ca-rpvhp" Sep 30 07:33:59 crc kubenswrapper[4760]: I0930 07:33:59.144560 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: 
\"kubernetes.io/configmap/a8530550-438d-46a5-aa3f-4b10838396f1-serviceca\") pod \"node-ca-rpvhp\" (UID: \"a8530550-438d-46a5-aa3f-4b10838396f1\") " pod="openshift-image-registry/node-ca-rpvhp" Sep 30 07:33:59 crc kubenswrapper[4760]: I0930 07:33:59.144620 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7j7qm\" (UniqueName: \"kubernetes.io/projected/a8530550-438d-46a5-aa3f-4b10838396f1-kube-api-access-7j7qm\") pod \"node-ca-rpvhp\" (UID: \"a8530550-438d-46a5-aa3f-4b10838396f1\") " pod="openshift-image-registry/node-ca-rpvhp" Sep 30 07:33:59 crc kubenswrapper[4760]: I0930 07:33:59.144652 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/a8530550-438d-46a5-aa3f-4b10838396f1-host\") pod \"node-ca-rpvhp\" (UID: \"a8530550-438d-46a5-aa3f-4b10838396f1\") " pod="openshift-image-registry/node-ca-rpvhp" Sep 30 07:33:59 crc kubenswrapper[4760]: I0930 07:33:59.146202 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/a8530550-438d-46a5-aa3f-4b10838396f1-serviceca\") pod \"node-ca-rpvhp\" (UID: \"a8530550-438d-46a5-aa3f-4b10838396f1\") " pod="openshift-image-registry/node-ca-rpvhp" Sep 30 07:33:59 crc kubenswrapper[4760]: I0930 07:33:59.152900 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers 
with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:53Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:33:59Z is after 2025-08-24T17:21:41Z" Sep 30 07:33:59 crc kubenswrapper[4760]: I0930 07:33:59.184734 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7j7qm\" (UniqueName: 
\"kubernetes.io/projected/a8530550-438d-46a5-aa3f-4b10838396f1-kube-api-access-7j7qm\") pod \"node-ca-rpvhp\" (UID: \"a8530550-438d-46a5-aa3f-4b10838396f1\") " pod="openshift-image-registry/node-ca-rpvhp" Sep 30 07:33:59 crc kubenswrapper[4760]: I0930 07:33:59.203900 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-rpvhp" Sep 30 07:33:59 crc kubenswrapper[4760]: I0930 07:33:59.216910 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-f2lrk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7a9c8270-6964-4886-87d0-227b05b76da4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e3cf6cd0daf5df3e48c5d44b1a443b1e5646009b9eba42bc3869ee9841e472d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"k
ube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cctgt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://90d669a96085f6fb9b5c9507ff1d5c960bdeafa1d01d7c115b89788e14e155d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cctgt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T07:33:56Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-f2lrk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: 
current time 2025-09-30T07:33:59Z is after 2025-08-24T17:21:41Z" Sep 30 07:33:59 crc kubenswrapper[4760]: W0930 07:33:59.228924 4760 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda8530550_438d_46a5_aa3f_4b10838396f1.slice/crio-b1ccb7e7c8df9d68f8b8fcf27b907d923309bc165810bb9d447b5a6bca887b36 WatchSource:0}: Error finding container b1ccb7e7c8df9d68f8b8fcf27b907d923309bc165810bb9d447b5a6bca887b36: Status 404 returned error can't find the container with id b1ccb7e7c8df9d68f8b8fcf27b907d923309bc165810bb9d447b5a6bca887b36 Sep 30 07:33:59 crc kubenswrapper[4760]: I0930 07:33:59.255119 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-lvdpk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f50c364e-d22c-4fe5-a0aa-66f4e8d8b21e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://96fd613333f4911d54aca94a38724570ef878c3f55ef48caa79eb1c14d0b2014\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"
,\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8g8kp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.12
6.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T07:33:56Z\\\"}}\" for pod \"openshift-multus\"/\"multus-lvdpk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:33:59Z is after 2025-08-24T17:21:41Z" Sep 30 07:33:59 crc kubenswrapper[4760]: I0930 07:33:59.269837 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-sspvl" event={"ID":"2c4ca8ea-a714-40e5-9e10-080aef32237b","Type":"ContainerStarted","Data":"0b6c1fe1d21cc0d118e36d7f9d62fb639a19a42cd8c953700bcd5c289088cdf9"} Sep 30 07:33:59 crc kubenswrapper[4760]: I0930 07:33:59.269892 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-sspvl" event={"ID":"2c4ca8ea-a714-40e5-9e10-080aef32237b","Type":"ContainerStarted","Data":"bba81e98fd1a75081b7e6a3eaf28b88b4de6e47e1dc158ca72aa2f1c2107933e"} Sep 30 07:33:59 crc kubenswrapper[4760]: I0930 07:33:59.271144 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"fd70ee9ea129c67bb1a9ffac6e1d2abc903949ccd6d2d549f5d78472813dd222"} Sep 30 07:33:59 crc kubenswrapper[4760]: I0930 07:33:59.274146 4760 generic.go:334] "Generic (PLEG): container finished" podID="aade7c8e-aa34-4b19-9000-d724950a70d7" containerID="7abb316d62bc2246ace40ad59b2926962a1c098c5690adea8c20a5b96459a604" exitCode=0 Sep 30 07:33:59 crc kubenswrapper[4760]: I0930 07:33:59.274186 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-6vfjl" 
event={"ID":"aade7c8e-aa34-4b19-9000-d724950a70d7","Type":"ContainerDied","Data":"7abb316d62bc2246ace40ad59b2926962a1c098c5690adea8c20a5b96459a604"} Sep 30 07:33:59 crc kubenswrapper[4760]: I0930 07:33:59.275657 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-rpvhp" event={"ID":"a8530550-438d-46a5-aa3f-4b10838396f1","Type":"ContainerStarted","Data":"b1ccb7e7c8df9d68f8b8fcf27b907d923309bc165810bb9d447b5a6bca887b36"} Sep 30 07:33:59 crc kubenswrapper[4760]: I0930 07:33:59.300498 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c5ac421ed3707970914cf165f54a9324134079296d1b6936332c6047b9c23132\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"
/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:33:59Z is after 2025-08-24T17:21:41Z" Sep 30 07:33:59 crc kubenswrapper[4760]: I0930 07:33:59.332746 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:33:59Z is after 2025-08-24T17:21:41Z" Sep 30 07:33:59 crc kubenswrapper[4760]: I0930 07:33:59.376555 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:53Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:33:59Z is after 2025-08-24T17:21:41Z" Sep 30 07:33:59 crc kubenswrapper[4760]: I0930 07:33:59.414590 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-6vfjl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"aade7c8e-aa34-4b19-9000-d724950a70d7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:56Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:56Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:56Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-df7r7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e9c7c74e636ef9dbade2541863f96b1098f7ae5f1ab02740c8934f8c87c4815e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"container
ID\\\":\\\"cri-o://e9c7c74e636ef9dbade2541863f96b1098f7ae5f1ab02740c8934f8c87c4815e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T07:33:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T07:33:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-df7r7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-df7r7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"rea
dy\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-df7r7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-df7r7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host
/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-df7r7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-df7r7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T07:33:56Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-6vfjl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:33:59Z is after 2025-08-24T17:21:41Z" Sep 30 07:33:59 crc kubenswrapper[4760]: I0930 07:33:59.462243 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9092fdd9-f142-400d-b446-47b8103b9694\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4cc3328dfae8f6d99c152e6a53b86d72f690afbd7bc4847810f745b321e4f3ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5aaa898c76100f011c3ad0ba61c9735a940b46b0e770f3ede7fc4c881a4722a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b718c4a73fd119f66dd346618976f0b3589b789576e53d367588c2772becd66e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d00b9a56c23ba3b5701a31412e038c073b5f16eb4e6ebea42134f7d4c0b0f21\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://00d03c8ff6fa9f490ec0550ac57d20786f1f5c10d91011d678116c2249a79e88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ed4236004989a888f1f864535bedd2dc4bcad4c51c87f12be074abbe5f8fe8bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ed4236004989a888f1f864535bedd2dc4bcad4c51c87f12be074abbe5f8fe8bd\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-09-30T07:33:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T07:33:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://553fa289f27580c1c38da1e9ea12ae7231fc4f60c660a9a3a177391adba9c3f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://553fa289f27580c1c38da1e9ea12ae7231fc4f60c660a9a3a177391adba9c3f2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T07:33:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T07:33:37Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://83b9ee5e77d60387298ccac46ad416a5d56bfd1d81a6ef7f3ea0482520c2ca08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://83b9ee5e77d60387298ccac46ad416a5d56bfd1d81a6ef7f3ea0482520c2ca08\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T07:33:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T07:33:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T07:33:35Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:33:59Z is after 2025-08-24T17:21:41Z" Sep 30 07:33:59 crc kubenswrapper[4760]: I0930 07:33:59.491939 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"031672b4-4779-4226-97aa-f5c81a729234\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b3e26d92b5602604d33e83e83e677084c642df64c9d0b9f9e022b036091fb92e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-rel
ease-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://53390cb3cc86d0db1019c461bf6cb987190e6fa8812f735844028dfa1ce96648\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf87f93832356c96c1c5e90e6459ccdd23700fd12c6d7410305c66f99f6ed18f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:37Z\\\"}},\\\"vo
lumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e4230b2df5bf87138f5a40f146bef562ae63ec6f5b73d023e4e2b60722854882\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9c416bd4523f4384b4a2283eb2ada263982380925cbc17713726e0790e5fdee\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:38Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2cd9c18868b27fcfb67186a86f415b484f009c33c07d18f5d508155ed011b5d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee12
20d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2cd9c18868b27fcfb67186a86f415b484f009c33c07d18f5d508155ed011b5d4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T07:33:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T07:33:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T07:33:35Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:33:59Z is after 2025-08-24T17:21:41Z" Sep 30 07:33:59 crc kubenswrapper[4760]: I0930 07:33:59.532852 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"67985d3d-0af6-4a8b-b45a-623aed2e502e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43b97af91f474a3b43f207d377ca205e459b782e552a32a134a8b46e89fc8d30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d748511816054d5f621bd55e203baea2e6e5ac90dec796364b46224308cfd78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bf1b2ce86bf47ce917a20b48f64b7c4419e3a5c551d006d97171311c3b190f0c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e7c9b2cd9f096d99a8b239c0c2dc771e94e6522a3380733b7452969c9fe2dfd2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-09-30T07:33:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T07:33:35Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:33:59Z is after 2025-08-24T17:21:41Z" Sep 30 07:33:59 crc kubenswrapper[4760]: I0930 07:33:59.587270 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-sspvl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2c4ca8ea-a714-40e5-9e10-080aef32237b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:56Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:56Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lllfc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/v
ar/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lllfc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lllfc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lllfc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},
{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lllfc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lllfc\\\",\\\"readOnly\\\":true,\\\
"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mount
Path\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lllfc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lllfc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1198d13e2814e4dc38c4411d751fba40440a5967d7ca04c855e34fe90f9a6535\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\"
:{\\\"containerID\\\":\\\"cri-o://1198d13e2814e4dc38c4411d751fba40440a5967d7ca04c855e34fe90f9a6535\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T07:33:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T07:33:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lllfc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T07:33:56Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-sspvl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:33:59Z is after 2025-08-24T17:21:41Z" Sep 30 07:33:59 crc kubenswrapper[4760]: I0930 07:33:59.610856 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:53Z\\\",\\\"message\\\":\\\"containers with unready 
status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:33:59Z is after 2025-08-24T17:21:41Z" Sep 30 07:33:59 crc kubenswrapper[4760]: I0930 07:33:59.654087 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:53Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:33:59Z is after 2025-08-24T17:21:41Z" Sep 30 07:33:59 crc kubenswrapper[4760]: I0930 07:33:59.704059 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-6vfjl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"aade7c8e-aa34-4b19-9000-d724950a70d7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:56Z\\\",\\\"message\\\":\\\"containers with incomplete status: [bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:56Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:56Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-df7r7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e9c7c74e636ef9dbade2541863f96b1098f7ae5f1ab02740c8934f8c87c4815e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"container
ID\\\":\\\"cri-o://e9c7c74e636ef9dbade2541863f96b1098f7ae5f1ab02740c8934f8c87c4815e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T07:33:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T07:33:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-df7r7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7abb316d62bc2246ace40ad59b2926962a1c098c5690adea8c20a5b96459a604\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7abb316d62bc2246ace40ad59b2926962a1c098c5690adea8c20a5b96459a604\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T07:33:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T07:33:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-a
llowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-df7r7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-df7r7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-df7r7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sh
a256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-df7r7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-df7r7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T07:33:56Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-6vfjl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: 
current time 2025-09-30T07:33:59Z is after 2025-08-24T17:21:41Z" Sep 30 07:33:59 crc kubenswrapper[4760]: I0930 07:33:59.739714 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9092fdd9-f142-400d-b446-47b8103b9694\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4cc3328dfae8f6d99c152e6a53b86d72f690afbd7bc4847810f745b321e4f3ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"c
ert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5aaa898c76100f011c3ad0ba61c9735a940b46b0e770f3ede7fc4c881a4722a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b718c4a73fd119f66dd346618976f0b3589b789576e53d367588c2772becd66e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d00b9a56c23ba3b5701a31412e038c073b5f16eb4e6ebea42134f7d4c0b0f21\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.i
o/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://00d03c8ff6fa9f490ec0550ac57d20786f1f5c10d91011d678116c2249a79e88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ed4236004989a888f1f864535bedd2dc4bcad4c51c87f12be074abbe5f8fe8bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":
\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ed4236004989a888f1f864535bedd2dc4bcad4c51c87f12be074abbe5f8fe8bd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T07:33:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T07:33:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://553fa289f27580c1c38da1e9ea12ae7231fc4f60c660a9a3a177391adba9c3f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://553fa289f27580c1c38da1e9ea12ae7231fc4f60c660a9a3a177391adba9c3f2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T07:33:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T07:33:37Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://83b9ee5e77d60387298ccac46ad416a5d56bfd1d81a6ef7f3ea0482520c2ca08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://83b9ee5e77d60387298ccac46ad416a5d56bfd1d81a6ef7f3ea0482520c2ca08\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T07:33:38Z\\\",\\\"reason\\\":\\\"Completed
\\\",\\\"startedAt\\\":\\\"2025-09-30T07:33:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T07:33:35Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:33:59Z is after 2025-08-24T17:21:41Z" Sep 30 07:33:59 crc kubenswrapper[4760]: I0930 07:33:59.781603 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"031672b4-4779-4226-97aa-f5c81a729234\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b3e26d92b5602604d33e83e83e677084c642df64c9d0b9f9e022b036091fb92e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://53390cb3cc86d0db1019c461bf6cb987190e6fa8812f735844028dfa1ce96648\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf87f93832356c96c1c5e90e6459ccdd23700fd12c6d7410305c66f99f6ed18f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e4230b2df5bf87138f5a40f146bef562ae63ec6f5b73d023e4e2b60722854882\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:3
8Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9c416bd4523f4384b4a2283eb2ada263982380925cbc17713726e0790e5fdee\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:38Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2cd9c18868b27fcfb67186a86f415b484f009c33c07d18f5d508155ed011b5d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2cd9c18868b27fcfb67186a86f415b484f009c33c07d18f5d508155ed011b5d4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T07:33:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T07:33:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startT
ime\\\":\\\"2025-09-30T07:33:35Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:33:59Z is after 2025-08-24T17:21:41Z" Sep 30 07:33:59 crc kubenswrapper[4760]: I0930 07:33:59.818647 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"67985d3d-0af6-4a8b-b45a-623aed2e502e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43b97af91f474a3b43f207d377ca205e459b782e552a32a134a8b46e89fc8d30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\
\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d748511816054d5f621bd55e203baea2e6e5ac90dec796364b46224308cfd78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bf1b2ce86bf47ce917a20b48f64b7c4419e3a5c551d006d97171311c3b190f0c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"c
ert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e7c9b2cd9f096d99a8b239c0c2dc771e94e6522a3380733b7452969c9fe2dfd2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T07:33:35Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:33:59Z is after 2025-08-24T17:21:41Z" Sep 30 07:33:59 crc kubenswrapper[4760]: I0930 07:33:59.861607 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c5ac421ed3707970914cf165f54a9324134079296d1b6936332c6047b9c23132\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-09-30T07:33:59Z is after 2025-08-24T17:21:41Z" Sep 30 07:33:59 crc kubenswrapper[4760]: I0930 07:33:59.914698 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-sspvl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2c4ca8ea-a714-40e5-9e10-080aef32237b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:56Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:56Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lllfc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lllfc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lllfc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lllfc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lllfc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lllfc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lllfc\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lllfc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1198d13e2814e4dc38c4411d751fba40440a5967d7ca04c855e34fe90f9a6535\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1198d13e2814e4dc38c4411d751fba40440a5967d7ca04c855e34fe90f9a6535\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T07:33:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T07:33:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lllfc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T07:33:56Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-sspvl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:33:59Z is after 2025-08-24T17:21:41Z" Sep 30 07:33:59 crc kubenswrapper[4760]: I0930 07:33:59.939774 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c5ae239e08e07bda9532c0b6b94ef3687cf512cd05965e09123bbb036f915a5b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":
{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3abb25619d0c80e84a2095aba45fe6a865125d242e44e9f10820e8c41a3b0db9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:33:59Z is after 2025-08-24T17:21:41Z" Sep 30 
07:33:59 crc kubenswrapper[4760]: I0930 07:33:59.972489 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:33:59Z is after 2025-08-24T17:21:41Z" Sep 30 07:34:00 crc kubenswrapper[4760]: I0930 07:34:00.016907 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-sv6wk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4a8035f8-210d-4a09-bca5-274ced93774c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://774d38a9cedeb69a2093a9de05db794e856295535f86d73a75c6feea05b61cb3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6pk8k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T07:33:56Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-sv6wk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:34:00Z is after 2025-08-24T17:21:41Z" Sep 30 07:34:00 crc kubenswrapper[4760]: I0930 07:34:00.065908 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 07:34:00 crc kubenswrapper[4760]: I0930 07:34:00.065984 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 07:34:00 crc kubenswrapper[4760]: I0930 07:34:00.065988 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 07:34:00 crc kubenswrapper[4760]: E0930 07:34:00.066150 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 07:34:00 crc kubenswrapper[4760]: E0930 07:34:00.066349 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 07:34:00 crc kubenswrapper[4760]: E0930 07:34:00.066472 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 07:34:00 crc kubenswrapper[4760]: I0930 07:34:00.069868 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:59Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:59Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fd70ee9ea129c67bb1a9ffac6e1d2abc903949ccd6d2d549f5d78472813dd222\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\
\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:34:00Z is after 2025-08-24T17:21:41Z" Sep 30 07:34:00 crc kubenswrapper[4760]: I0930 07:34:00.094871 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-f2lrk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7a9c8270-6964-4886-87d0-227b05b76da4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e3cf6cd0daf5df3e48c5d44b1a443b1e5646009b9eba42bc3869ee9841e472d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cctgt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://90d669a96085f6fb9b5c9507ff1d5c960bdeafa1
d01d7c115b89788e14e155d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cctgt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T07:33:56Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-f2lrk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:34:00Z is after 2025-08-24T17:21:41Z" Sep 30 07:34:00 crc kubenswrapper[4760]: I0930 07:34:00.137898 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-lvdpk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f50c364e-d22c-4fe5-a0aa-66f4e8d8b21e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://96fd613333f4911d54aca94a38724570ef878c3f55ef48caa79eb1c14d0b2014\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8g8kp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T07:33:56Z\\\"}}\" for pod \"openshift-multus\"/\"multus-lvdpk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:34:00Z is after 2025-08-24T17:21:41Z" Sep 30 07:34:00 crc kubenswrapper[4760]: I0930 07:34:00.175423 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-rpvhp" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a8530550-438d-46a5-aa3f-4b10838396f1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:58Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:58Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7j7qm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T07:33:58Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-rpvhp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:34:00Z is after 2025-08-24T17:21:41Z" Sep 30 07:34:00 crc kubenswrapper[4760]: I0930 07:34:00.280548 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-rpvhp" event={"ID":"a8530550-438d-46a5-aa3f-4b10838396f1","Type":"ContainerStarted","Data":"0434ec52328c8ee0a2b28fb47833cc6cf12a7048f4c4a217518e029e616fb6a8"} Sep 30 07:34:00 crc kubenswrapper[4760]: I0930 07:34:00.284125 4760 generic.go:334] "Generic (PLEG): container finished" podID="aade7c8e-aa34-4b19-9000-d724950a70d7" 
containerID="96c45841896c0c1569ad3071a3114e8c4965cc3968244c91719bcf5df92e5d0f" exitCode=0 Sep 30 07:34:00 crc kubenswrapper[4760]: I0930 07:34:00.284280 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-6vfjl" event={"ID":"aade7c8e-aa34-4b19-9000-d724950a70d7","Type":"ContainerDied","Data":"96c45841896c0c1569ad3071a3114e8c4965cc3968244c91719bcf5df92e5d0f"} Sep 30 07:34:00 crc kubenswrapper[4760]: I0930 07:34:00.301693 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-sv6wk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4a8035f8-210d-4a09-bca5-274ced93774c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://774d38a9cedeb69a2093a9de05db794e856295535f86d73a75c6feea05b61cb3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dn
s-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6pk8k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T07:33:56Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-sv6wk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:34:00Z is after 2025-08-24T17:21:41Z" Sep 30 07:34:00 crc kubenswrapper[4760]: I0930 07:34:00.324510 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c5ae239e08e07bda9532c0b6b94ef3687cf512cd05965e09123bbb036f915a5b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3abb25619d0c80e84a2095aba45fe6a865125d242e44e9f10820e8c41a3b0db9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:34:00Z is after 2025-08-24T17:21:41Z" Sep 30 07:34:00 crc kubenswrapper[4760]: I0930 07:34:00.343816 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:34:00Z is after 2025-08-24T17:21:41Z" Sep 30 07:34:00 crc kubenswrapper[4760]: I0930 07:34:00.368826 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-rpvhp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a8530550-438d-46a5-aa3f-4b10838396f1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:34:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:34:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:34:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0434ec52328c8ee0a2b28fb47833cc6cf12a7048f4c4a217518e029e616fb6a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7j7qm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T07:33:58Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-rpvhp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:34:00Z is after 2025-08-24T17:21:41Z" Sep 30 07:34:00 crc kubenswrapper[4760]: I0930 07:34:00.387950 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:59Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:59Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fd70ee9ea129c67bb1a9ffac6e1d2abc903949ccd6d2d549f5d78472813dd222\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T0
7:33:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:34:00Z is after 2025-08-24T17:21:41Z" Sep 30 07:34:00 crc kubenswrapper[4760]: I0930 07:34:00.415925 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-f2lrk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7a9c8270-6964-4886-87d0-227b05b76da4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e3cf6cd0daf5df3e48c5d44b1a443b1e5646009b9eba42bc3869ee9841e472d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cctgt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://90d669a96085f6fb9b5c9507ff1d5c960bdeafa1
d01d7c115b89788e14e155d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cctgt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T07:33:56Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-f2lrk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:34:00Z is after 2025-08-24T17:21:41Z" Sep 30 07:34:00 crc kubenswrapper[4760]: I0930 07:34:00.463247 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-lvdpk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f50c364e-d22c-4fe5-a0aa-66f4e8d8b21e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://96fd613333f4911d54aca94a38724570ef878c3f55ef48caa79eb1c14d0b2014\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8g8kp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T07:33:56Z\\\"}}\" for pod \"openshift-multus\"/\"multus-lvdpk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:34:00Z is after 2025-08-24T17:21:41Z" Sep 30 07:34:00 crc kubenswrapper[4760]: I0930 07:34:00.498018 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c5ac421ed3707970914cf165f54a9324134079296d1b6936332c6047b9c23132\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed 
to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:34:00Z is after 2025-08-24T17:21:41Z" Sep 30 07:34:00 crc kubenswrapper[4760]: I0930 07:34:00.535183 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:53Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:34:00Z is after 2025-08-24T17:21:41Z" Sep 30 07:34:00 crc kubenswrapper[4760]: I0930 07:34:00.579953 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:34:00Z is after 2025-08-24T17:21:41Z" Sep 30 07:34:00 crc kubenswrapper[4760]: I0930 07:34:00.625562 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-6vfjl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"aade7c8e-aa34-4b19-9000-d724950a70d7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:56Z\\\",\\\"message\\\":\\\"containers with incomplete status: [bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:56Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:56Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-df7r7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e9c7c74e636ef9dbade2541863f96b1098f7ae5f1ab02740c8934f8c87c4815e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e9c7c74e636ef9dbade2541863f96b1098f7ae5f1ab02740c8934f8c87c4815e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T07:33:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T07:33:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-df7r7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7abb316d62bc2246ace40ad59b2926962a1c098c5690adea8c20a5b96459a604\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7abb316d62bc2246ace40ad59b2926962a1c098c5690adea8c20a5b96459a604\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T07:33:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T07:33:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-df7r7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodIn
itializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-df7r7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-df7r7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-
release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-df7r7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-df7r7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T07:33:56Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-6vfjl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:34:00Z is after 2025-08-24T17:21:41Z" Sep 30 07:34:00 crc kubenswrapper[4760]: I0930 07:34:00.661225 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9092fdd9-f142-400d-b446-47b8103b9694\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4cc3328dfae8f6d99c152e6a53b86d72f690afbd7bc4847810f745b321e4f3ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5aaa898c76100f011c3ad0ba61c9735a940b46b0e770f3ede7fc4c881a4722a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b718c4a73fd119f66dd346618976f0b3589b789576e53d367588c2772becd66e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d00b9a56c23ba3b5701a31412e038c073b5f16eb4e6ebea42134f7d4c0b0f21\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://00d03c8ff6fa9f490ec0550ac57d20786f1f5c10d91011d678116c2249a79e88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ed4236004989a888f1f864535bedd2dc4bcad4c51c87f12be074abbe5f8fe8bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ed4236004989a888f1f864535bedd2dc4bcad4c51c87f12be074abbe5f8fe8bd\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-09-30T07:33:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T07:33:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://553fa289f27580c1c38da1e9ea12ae7231fc4f60c660a9a3a177391adba9c3f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://553fa289f27580c1c38da1e9ea12ae7231fc4f60c660a9a3a177391adba9c3f2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T07:33:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T07:33:37Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://83b9ee5e77d60387298ccac46ad416a5d56bfd1d81a6ef7f3ea0482520c2ca08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://83b9ee5e77d60387298ccac46ad416a5d56bfd1d81a6ef7f3ea0482520c2ca08\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T07:33:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T07:33:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T07:33:35Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:34:00Z is after 2025-08-24T17:21:41Z" Sep 30 07:34:00 crc kubenswrapper[4760]: I0930 07:34:00.698522 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"031672b4-4779-4226-97aa-f5c81a729234\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b3e26d92b5602604d33e83e83e677084c642df64c9d0b9f9e022b036091fb92e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-rel
ease-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://53390cb3cc86d0db1019c461bf6cb987190e6fa8812f735844028dfa1ce96648\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf87f93832356c96c1c5e90e6459ccdd23700fd12c6d7410305c66f99f6ed18f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:37Z\\\"}},\\\"vo
lumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e4230b2df5bf87138f5a40f146bef562ae63ec6f5b73d023e4e2b60722854882\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9c416bd4523f4384b4a2283eb2ada263982380925cbc17713726e0790e5fdee\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:38Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2cd9c18868b27fcfb67186a86f415b484f009c33c07d18f5d508155ed011b5d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee12
20d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2cd9c18868b27fcfb67186a86f415b484f009c33c07d18f5d508155ed011b5d4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T07:33:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T07:33:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T07:33:35Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:34:00Z is after 2025-08-24T17:21:41Z" Sep 30 07:34:00 crc kubenswrapper[4760]: I0930 07:34:00.737737 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"67985d3d-0af6-4a8b-b45a-623aed2e502e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43b97af91f474a3b43f207d377ca205e459b782e552a32a134a8b46e89fc8d30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d748511816054d5f621bd55e203baea2e6e5ac90dec796364b46224308cfd78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bf1b2ce86bf47ce917a20b48f64b7c4419e3a5c551d006d97171311c3b190f0c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e7c9b2cd9f096d99a8b239c0c2dc771e94e6522a3380733b7452969c9fe2dfd2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-09-30T07:33:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T07:33:35Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:34:00Z is after 2025-08-24T17:21:41Z" Sep 30 07:34:00 crc kubenswrapper[4760]: I0930 07:34:00.782494 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-sspvl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2c4ca8ea-a714-40e5-9e10-080aef32237b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:56Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:56Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lllfc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/v
ar/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lllfc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lllfc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lllfc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},
{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lllfc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lllfc\\\",\\\"readOnly\\\":true,\\\
"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mount
Path\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lllfc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lllfc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1198d13e2814e4dc38c4411d751fba40440a5967d7ca04c855e34fe90f9a6535\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\"
:{\\\"containerID\\\":\\\"cri-o://1198d13e2814e4dc38c4411d751fba40440a5967d7ca04c855e34fe90f9a6535\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T07:33:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T07:33:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lllfc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T07:33:56Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-sspvl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:34:00Z is after 2025-08-24T17:21:41Z" Sep 30 07:34:00 crc kubenswrapper[4760]: I0930 07:34:00.816319 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c5ae239e08e07bda9532c0b6b94ef3687cf512cd05965e09123bbb036f915a5b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3abb25619d0c80e84a2095aba45fe6a865125d242e44e9f10820e8c41a3b0db9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:34:00Z is after 2025-08-24T17:21:41Z" Sep 30 07:34:00 crc kubenswrapper[4760]: I0930 07:34:00.858171 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:34:00Z is after 2025-08-24T17:21:41Z" Sep 30 07:34:00 crc kubenswrapper[4760]: I0930 07:34:00.894506 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-sv6wk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4a8035f8-210d-4a09-bca5-274ced93774c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://774d38a9cedeb69a2093a9de05db794e856295535f86d73a75c6feea05b61cb3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6pk8k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T07:33:56Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-sv6wk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:34:00Z is after 2025-08-24T17:21:41Z" Sep 30 07:34:00 crc kubenswrapper[4760]: I0930 07:34:00.939952 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:59Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:59Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fd70ee9ea129c67bb1a9ffac6e1d2abc903949ccd6d2d549f5d78472813dd222\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-al
erter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:34:00Z is after 2025-08-24T17:21:41Z" Sep 30 07:34:00 crc kubenswrapper[4760]: I0930 07:34:00.976003 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-f2lrk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7a9c8270-6964-4886-87d0-227b05b76da4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e3cf6cd0daf5df3e48c5d4
4b1a443b1e5646009b9eba42bc3869ee9841e472d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cctgt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://90d669a96085f6fb9b5c9507ff1d5c960bdeafa1d01d7c115b89788e14e155d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cctgt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\
":\\\"2025-09-30T07:33:56Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-f2lrk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:34:00Z is after 2025-08-24T17:21:41Z" Sep 30 07:34:01 crc kubenswrapper[4760]: I0930 07:34:01.015091 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-lvdpk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f50c364e-d22c-4fe5-a0aa-66f4e8d8b21e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://96fd613333f4911d54aca94a38724570ef878c3f55ef48caa79eb1c14d0b2014\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\
\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8g8kp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.
11\\\"}],\\\"startTime\\\":\\\"2025-09-30T07:33:56Z\\\"}}\" for pod \"openshift-multus\"/\"multus-lvdpk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:34:01Z is after 2025-08-24T17:21:41Z" Sep 30 07:34:01 crc kubenswrapper[4760]: I0930 07:34:01.054980 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-rpvhp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a8530550-438d-46a5-aa3f-4b10838396f1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:34:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:34:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:34:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0434ec52328c8ee0a2b28fb47833cc6cf12a7048f4c4a217518e029e616fb6a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"nam
e\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7j7qm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T07:33:58Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-rpvhp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:34:01Z is after 2025-08-24T17:21:41Z" Sep 30 07:34:01 crc kubenswrapper[4760]: I0930 07:34:01.099808 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:53Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:34:01Z is after 2025-08-24T17:21:41Z" Sep 30 07:34:01 crc kubenswrapper[4760]: I0930 07:34:01.134581 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:53Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:34:01Z is after 2025-08-24T17:21:41Z" Sep 30 07:34:01 crc kubenswrapper[4760]: I0930 07:34:01.176428 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-6vfjl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"aade7c8e-aa34-4b19-9000-d724950a70d7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:56Z\\\",\\\"message\\\":\\\"containers with incomplete status: [routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:56Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:56Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-df7r7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e9c7c74e636ef9dbade2541863f96b1098f7ae5f1ab02740c8934f8c87c4815e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"container
ID\\\":\\\"cri-o://e9c7c74e636ef9dbade2541863f96b1098f7ae5f1ab02740c8934f8c87c4815e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T07:33:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T07:33:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-df7r7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7abb316d62bc2246ace40ad59b2926962a1c098c5690adea8c20a5b96459a604\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7abb316d62bc2246ace40ad59b2926962a1c098c5690adea8c20a5b96459a604\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T07:33:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T07:33:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-a
llowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-df7r7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://96c45841896c0c1569ad3071a3114e8c4965cc3968244c91719bcf5df92e5d0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://96c45841896c0c1569ad3071a3114e8c4965cc3968244c91719bcf5df92e5d0f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T07:33:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T07:33:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-df7r7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\
\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-df7r7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-df7r7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-df7r7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\"
,\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T07:33:56Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-6vfjl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:34:01Z is after 2025-08-24T17:21:41Z" Sep 30 07:34:01 crc kubenswrapper[4760]: I0930 07:34:01.222833 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9092fdd9-f142-400d-b446-47b8103b9694\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4cc3328dfae8f6d99c152e6a53b86d72f690afbd7bc4847810f745b321e4f3ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"rea
dy\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5aaa898c76100f011c3ad0ba61c9735a940b46b0e770f3ede7fc4c881a4722a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b718c4a73fd119f66dd346618976f0b3589b789576e53d367588c2772becd66e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mou
ntPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d00b9a56c23ba3b5701a31412e038c073b5f16eb4e6ebea42134f7d4c0b0f21\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://00d03c8ff6fa9f490ec0550ac57d20786f1f5c10d91011d678116c2249a79e88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses
\\\":[{\\\"containerID\\\":\\\"cri-o://ed4236004989a888f1f864535bedd2dc4bcad4c51c87f12be074abbe5f8fe8bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ed4236004989a888f1f864535bedd2dc4bcad4c51c87f12be074abbe5f8fe8bd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T07:33:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T07:33:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://553fa289f27580c1c38da1e9ea12ae7231fc4f60c660a9a3a177391adba9c3f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://553fa289f27580c1c38da1e9ea12ae7231fc4f60c660a9a3a177391adba9c3f2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T07:33:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T07:33:37Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://83b9ee5e77d60387298ccac46ad416a5d56bfd1d81a6ef7f3ea0482520c2ca08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:
07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://83b9ee5e77d60387298ccac46ad416a5d56bfd1d81a6ef7f3ea0482520c2ca08\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T07:33:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T07:33:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T07:33:35Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:34:01Z is after 2025-08-24T17:21:41Z" Sep 30 07:34:01 crc kubenswrapper[4760]: I0930 07:34:01.252159 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"031672b4-4779-4226-97aa-f5c81a729234\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b3e26d92b5602604d33e83e83e677084c642df64c9d0b9f9e022b036091fb92e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://53390cb3cc86d0db1019c461bf6cb987190e6fa8812f735844028dfa1ce96648\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf87f93832356c96c1c5e90e6459ccdd23700fd12c6d7410305c66f99f6ed18f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e4230b2df5bf87138f5a40f146bef562ae63ec6f5b73d023e4e2b60722854882\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:3
8Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9c416bd4523f4384b4a2283eb2ada263982380925cbc17713726e0790e5fdee\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:38Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2cd9c18868b27fcfb67186a86f415b484f009c33c07d18f5d508155ed011b5d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2cd9c18868b27fcfb67186a86f415b484f009c33c07d18f5d508155ed011b5d4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T07:33:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T07:33:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startT
ime\\\":\\\"2025-09-30T07:33:35Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:34:01Z is after 2025-08-24T17:21:41Z" Sep 30 07:34:01 crc kubenswrapper[4760]: I0930 07:34:01.284696 4760 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 30 07:34:01 crc kubenswrapper[4760]: I0930 07:34:01.287116 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:34:01 crc kubenswrapper[4760]: I0930 07:34:01.287161 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:34:01 crc kubenswrapper[4760]: I0930 07:34:01.287178 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:34:01 crc kubenswrapper[4760]: I0930 07:34:01.287387 4760 kubelet_node_status.go:76] "Attempting to register node" node="crc" Sep 30 07:34:01 crc kubenswrapper[4760]: I0930 07:34:01.291254 4760 generic.go:334] "Generic (PLEG): container finished" podID="aade7c8e-aa34-4b19-9000-d724950a70d7" containerID="db2220320618b4dcdbf56bad96dbe01c910fa37948f97792f498b3336212cc7f" exitCode=0 Sep 30 07:34:01 crc kubenswrapper[4760]: I0930 07:34:01.291294 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-6vfjl" event={"ID":"aade7c8e-aa34-4b19-9000-d724950a70d7","Type":"ContainerDied","Data":"db2220320618b4dcdbf56bad96dbe01c910fa37948f97792f498b3336212cc7f"} Sep 30 07:34:01 crc kubenswrapper[4760]: I0930 07:34:01.295818 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-sspvl" 
event={"ID":"2c4ca8ea-a714-40e5-9e10-080aef32237b","Type":"ContainerStarted","Data":"6e72f37942962598d925dfb305311c94c3ca920f93243bb6e97839add4a830a0"} Sep 30 07:34:01 crc kubenswrapper[4760]: I0930 07:34:01.297042 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"67985d3d-0af6-4a8b-b45a-623aed2e502e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43b97af91f474a3b43f207d377ca205e459b782e552a32a134a8b46e89fc8d30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\
"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d748511816054d5f621bd55e203baea2e6e5ac90dec796364b46224308cfd78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bf1b2ce86bf47ce917a20b48f64b7c4419e3a5c551d006d97171311c3b190f0c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e7c9b2cd9f096d99a8b239c0c2dc771e94e6522a3380733b7452969c9fe2dfd2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92ed
af5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T07:33:35Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:34:01Z is after 2025-08-24T17:21:41Z" Sep 30 07:34:01 crc kubenswrapper[4760]: I0930 07:34:01.343672 4760 kubelet_node_status.go:115] "Node was previously registered" node="crc" Sep 30 07:34:01 crc kubenswrapper[4760]: I0930 07:34:01.344015 4760 kubelet_node_status.go:79] "Successfully registered node" node="crc" Sep 30 07:34:01 crc kubenswrapper[4760]: I0930 07:34:01.345265 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:34:01 crc kubenswrapper[4760]: I0930 07:34:01.345327 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:34:01 crc kubenswrapper[4760]: I0930 07:34:01.345341 4760 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:34:01 crc kubenswrapper[4760]: I0930 07:34:01.345361 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:34:01 crc kubenswrapper[4760]: I0930 07:34:01.345371 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:34:01Z","lastTransitionTime":"2025-09-30T07:34:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 07:34:01 crc kubenswrapper[4760]: E0930 07:34:01.357714 4760 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T07:34:01Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T07:34:01Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T07:34:01Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T07:34:01Z\\\",\\\"message\\\":\\\"kubelet has no disk 
pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T07:34:01Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T07:34:01Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T07:34:01Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T07:34:01Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2
ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9810067
4616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.
io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a07
2c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa73
83b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"5c02496b-3bdb-4a64-91fc-57c59208ba25\\\",\\\"systemUUID\\\":\\\"ef0efcac-382e-4544-a77e-6adf149d5981\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:34:01Z is after 2025-08-24T17:21:41Z" Sep 30 07:34:01 crc kubenswrapper[4760]: I0930 07:34:01.367317 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:34:01 crc kubenswrapper[4760]: I0930 07:34:01.367342 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:34:01 crc kubenswrapper[4760]: I0930 07:34:01.367350 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:34:01 crc kubenswrapper[4760]: I0930 07:34:01.367366 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:34:01 crc kubenswrapper[4760]: I0930 07:34:01.367378 4760 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:34:01Z","lastTransitionTime":"2025-09-30T07:34:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 07:34:01 crc kubenswrapper[4760]: I0930 07:34:01.370567 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c5ac421ed3707970914cf165f54a9324134079296d1b6936332c6047b9c23132\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\
\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:34:01Z is after 2025-08-24T17:21:41Z" Sep 30 07:34:01 crc kubenswrapper[4760]: E0930 07:34:01.387426 4760 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T07:34:01Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T07:34:01Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T07:34:01Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T07:34:01Z\\\",\\\"message\\\":\\\"kubelet has no disk 
pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T07:34:01Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T07:34:01Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T07:34:01Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T07:34:01Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2
ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9810067
4616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.
io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a07
2c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa73
83b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"5c02496b-3bdb-4a64-91fc-57c59208ba25\\\",\\\"systemUUID\\\":\\\"ef0efcac-382e-4544-a77e-6adf149d5981\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:34:01Z is after 2025-08-24T17:21:41Z" Sep 30 07:34:01 crc kubenswrapper[4760]: I0930 07:34:01.401587 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:34:01 crc kubenswrapper[4760]: I0930 07:34:01.401635 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:34:01 crc kubenswrapper[4760]: I0930 07:34:01.401646 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:34:01 crc kubenswrapper[4760]: I0930 07:34:01.401678 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:34:01 crc kubenswrapper[4760]: I0930 07:34:01.401692 4760 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:34:01Z","lastTransitionTime":"2025-09-30T07:34:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 07:34:01 crc kubenswrapper[4760]: I0930 07:34:01.417011 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-sspvl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2c4ca8ea-a714-40e5-9e10-080aef32237b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:56Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:56Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lllfc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lllfc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lllfc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lllfc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lllfc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lllfc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lllfc\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lllfc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1198d13e2814e4dc38c4411d751fba40440a5967d7ca04c855e34fe90f9a6535\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1198d13e2814e4dc38c4411d751fba40440a5967d7ca04c855e34fe90f9a6535\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T07:33:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T07:33:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lllfc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T07:33:56Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-sspvl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:34:01Z is after 2025-08-24T17:21:41Z" Sep 30 07:34:01 crc kubenswrapper[4760]: E0930 07:34:01.417904 4760 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T07:34:01Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T07:34:01Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T07:34:01Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T07:34:01Z\\\",\\\"message\\\":\\\"kubelet has no disk 
pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T07:34:01Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T07:34:01Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T07:34:01Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T07:34:01Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2
ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9810067
4616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.
io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a07
2c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa73
83b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"5c02496b-3bdb-4a64-91fc-57c59208ba25\\\",\\\"systemUUID\\\":\\\"ef0efcac-382e-4544-a77e-6adf149d5981\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:34:01Z is after 2025-08-24T17:21:41Z" Sep 30 07:34:01 crc kubenswrapper[4760]: I0930 07:34:01.422032 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:34:01 crc kubenswrapper[4760]: I0930 07:34:01.422072 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:34:01 crc kubenswrapper[4760]: I0930 07:34:01.422081 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:34:01 crc kubenswrapper[4760]: I0930 07:34:01.422135 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:34:01 crc kubenswrapper[4760]: I0930 07:34:01.422145 4760 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:34:01Z","lastTransitionTime":"2025-09-30T07:34:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 07:34:01 crc kubenswrapper[4760]: E0930 07:34:01.433183 4760 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T07:34:01Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T07:34:01Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T07:34:01Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T07:34:01Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T07:34:01Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T07:34:01Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T07:34:01Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T07:34:01Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"5c02496b-3bdb-4a64-91fc-57c59208ba25\\\",\\\"systemUUID\\\":\\\"ef0efcac-382e-4544-a77e-6adf149d5981\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:34:01Z is after 2025-08-24T17:21:41Z" Sep 30 07:34:01 crc kubenswrapper[4760]: I0930 07:34:01.437271 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:34:01 crc kubenswrapper[4760]: I0930 07:34:01.437344 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:34:01 crc kubenswrapper[4760]: I0930 07:34:01.437357 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:34:01 crc kubenswrapper[4760]: I0930 07:34:01.437380 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:34:01 crc kubenswrapper[4760]: I0930 07:34:01.437412 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:34:01Z","lastTransitionTime":"2025-09-30T07:34:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 07:34:01 crc kubenswrapper[4760]: I0930 07:34:01.449260 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c5ac421ed3707970914cf165f54a9324134079296d1b6936332c6047b9c23132\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:34:01Z is after 2025-08-24T17:21:41Z" Sep 30 07:34:01 crc kubenswrapper[4760]: E0930 07:34:01.449410 4760 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T07:34:01Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T07:34:01Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T07:34:01Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T07:34:01Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T07:34:01Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T07:34:01Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T07:34:01Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T07:34:01Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"5c02496b-3bdb-4a64-91fc-57c59208ba25\\\",\\\"systemUUID\\\":\\\"ef0efcac-382e-4544-a77e-6adf149d5981\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:34:01Z is after 2025-08-24T17:21:41Z" Sep 30 07:34:01 crc kubenswrapper[4760]: E0930 07:34:01.449607 4760 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Sep 30 07:34:01 crc kubenswrapper[4760]: I0930 07:34:01.451883 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:34:01 crc kubenswrapper[4760]: I0930 07:34:01.451921 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:34:01 crc kubenswrapper[4760]: I0930 07:34:01.451936 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:34:01 crc kubenswrapper[4760]: I0930 07:34:01.451956 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:34:01 crc kubenswrapper[4760]: I0930 07:34:01.451969 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:34:01Z","lastTransitionTime":"2025-09-30T07:34:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 07:34:01 crc kubenswrapper[4760]: I0930 07:34:01.492721 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:53Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:34:01Z is after 2025-08-24T17:21:41Z" Sep 30 07:34:01 crc kubenswrapper[4760]: I0930 07:34:01.533729 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:34:01Z is after 2025-08-24T17:21:41Z" Sep 30 07:34:01 crc kubenswrapper[4760]: I0930 07:34:01.554411 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:34:01 crc kubenswrapper[4760]: I0930 07:34:01.554452 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:34:01 crc kubenswrapper[4760]: I0930 07:34:01.554463 4760 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:34:01 crc kubenswrapper[4760]: I0930 07:34:01.554483 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:34:01 crc kubenswrapper[4760]: I0930 07:34:01.554496 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:34:01Z","lastTransitionTime":"2025-09-30T07:34:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 07:34:01 crc kubenswrapper[4760]: I0930 07:34:01.571669 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-6vfjl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"aade7c8e-aa34-4b19-9000-d724950a70d7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:56Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:56Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:56Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-df7r7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e9c7c74e636ef9dbade2541863f96b1098f7ae5f1ab02740c8934f8c87c4815e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e9c7c74e636ef9dbade2541863f96b1098f7ae5f1ab02740c8934f8c87c4815e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T07:33:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T07:33:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-df7r7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7abb316d62bc2246ace40ad59b2926962a1c098c5690adea8c20a5b96459a604\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7abb316d62bc2246ace40ad59b2926962a1c098c5690adea8c20a5b96459a604\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T07:33:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T07:33:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-df7r7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://96c45841896c0c1569ad3071a3114e8c4965c
c3968244c91719bcf5df92e5d0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://96c45841896c0c1569ad3071a3114e8c4965cc3968244c91719bcf5df92e5d0f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T07:33:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T07:33:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-df7r7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://db2220320618b4dcdbf56bad96dbe01c910fa37948f97792f498b3336212cc7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://db2220320618b4dcdbf56bad96dbe01c910fa37948f97792f498b3336212cc7f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T07:34:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-0
9-30T07:34:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-df7r7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-df7r7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/servi
ceaccount\\\",\\\"name\\\":\\\"kube-api-access-df7r7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T07:33:56Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-6vfjl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:34:01Z is after 2025-08-24T17:21:41Z" Sep 30 07:34:01 crc kubenswrapper[4760]: I0930 07:34:01.616705 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9092fdd9-f142-400d-b446-47b8103b9694\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4cc3328dfae8f6d99c152e6a53b86d72f690afbd7bc4847810f745b321e4f3ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshi
ft-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5aaa898c76100f011c3ad0ba61c9735a940b46b0e770f3ede7fc4c881a4722a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b718c4a73fd119f66dd346618976f0b3589b789576e53d367588c2772becd66e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\
":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d00b9a56c23ba3b5701a31412e038c073b5f16eb4e6ebea42134f7d4c0b0f21\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://00d03c8ff6fa9f490ec0550ac57d20786f1f5c10d91011d678116c2249a79e88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib
/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ed4236004989a888f1f864535bedd2dc4bcad4c51c87f12be074abbe5f8fe8bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ed4236004989a888f1f864535bedd2dc4bcad4c51c87f12be074abbe5f8fe8bd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T07:33:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T07:33:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://553fa289f27580c1c38da1e9ea12ae7231fc4f60c660a9a3a177391adba9c3f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://553fa289f27580c1c38da1e9ea12ae7231fc4f60c660a9a3a177391adba9c3f2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T07:33:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T07:33:37Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://83b9ee5e77d60387298ccac46ad416a5d56bfd1d81a6ef7f3ea0482520c2ca08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v
4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://83b9ee5e77d60387298ccac46ad416a5d56bfd1d81a6ef7f3ea0482520c2ca08\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T07:33:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T07:33:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T07:33:35Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:34:01Z is after 2025-08-24T17:21:41Z" Sep 30 07:34:01 crc kubenswrapper[4760]: I0930 07:34:01.650956 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"031672b4-4779-4226-97aa-f5c81a729234\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b3e26d92b5602604d33e83e83e677084c642df64c9d0b9f9e022b036091fb92e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://53390cb3cc86d0db1019c461bf6cb987190e6fa8812f735844028dfa1ce96648\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf87f93832356c96c1c5e90e6459ccdd23700fd12c6d7410305c66f99f6ed18f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e4230b2df5bf87138f5a40f146bef562ae63ec6f5b73d023e4e2b60722854882\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:3
8Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9c416bd4523f4384b4a2283eb2ada263982380925cbc17713726e0790e5fdee\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:38Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2cd9c18868b27fcfb67186a86f415b484f009c33c07d18f5d508155ed011b5d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2cd9c18868b27fcfb67186a86f415b484f009c33c07d18f5d508155ed011b5d4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T07:33:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T07:33:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startT
ime\\\":\\\"2025-09-30T07:33:35Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:34:01Z is after 2025-08-24T17:21:41Z" Sep 30 07:34:01 crc kubenswrapper[4760]: I0930 07:34:01.657211 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:34:01 crc kubenswrapper[4760]: I0930 07:34:01.657260 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:34:01 crc kubenswrapper[4760]: I0930 07:34:01.657268 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:34:01 crc kubenswrapper[4760]: I0930 07:34:01.657286 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:34:01 crc kubenswrapper[4760]: I0930 07:34:01.657309 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:34:01Z","lastTransitionTime":"2025-09-30T07:34:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 07:34:01 crc kubenswrapper[4760]: I0930 07:34:01.694222 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"67985d3d-0af6-4a8b-b45a-623aed2e502e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43b97af91f474a3b43f207d377ca205e459b782e552a32a134a8b46e89fc8d30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d748511816
054d5f621bd55e203baea2e6e5ac90dec796364b46224308cfd78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bf1b2ce86bf47ce917a20b48f64b7c4419e3a5c551d006d97171311c3b190f0c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e7c9b2cd9f096d99a8b239c0c2dc771e94e6522a3380733b7452969c9fe2dfd2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:850
6ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T07:33:35Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:34:01Z is after 2025-08-24T17:21:41Z" Sep 30 07:34:01 crc kubenswrapper[4760]: I0930 07:34:01.742643 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-sspvl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2c4ca8ea-a714-40e5-9e10-080aef32237b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:56Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:56Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lllfc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lllfc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lllfc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lllfc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lllfc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lllfc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lllfc\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lllfc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1198d13e2814e4dc38c4411d751fba40440a5967d7ca04c855e34fe90f9a6535\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1198d13e2814e4dc38c4411d751fba40440a5967d7ca04c855e34fe90f9a6535\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T07:33:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T07:33:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lllfc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T07:33:56Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-sspvl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:34:01Z is after 2025-08-24T17:21:41Z" Sep 30 07:34:01 crc kubenswrapper[4760]: I0930 07:34:01.759868 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:34:01 crc kubenswrapper[4760]: I0930 07:34:01.759912 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:34:01 crc kubenswrapper[4760]: I0930 07:34:01.759922 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:34:01 crc kubenswrapper[4760]: I0930 07:34:01.759940 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:34:01 crc kubenswrapper[4760]: I0930 07:34:01.759950 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:34:01Z","lastTransitionTime":"2025-09-30T07:34:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 07:34:01 crc kubenswrapper[4760]: I0930 07:34:01.773389 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-sv6wk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4a8035f8-210d-4a09-bca5-274ced93774c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://774d38a9cedeb69a2093a9de05db794e856295535f86d73a75c6feea05b61cb3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6pk8k\\\",\\\"re
adOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T07:33:56Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-sv6wk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:34:01Z is after 2025-08-24T17:21:41Z" Sep 30 07:34:01 crc kubenswrapper[4760]: I0930 07:34:01.816046 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c5ae239e08e07bda9532c0b6b94ef3687cf512cd05965e09123bbb036f915a5b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{}
,\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3abb25619d0c80e84a2095aba45fe6a865125d242e44e9f10820e8c41a3b0db9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:34:01Z is after 2025-08-24T17:21:41Z" Sep 30 
07:34:01 crc kubenswrapper[4760]: I0930 07:34:01.855684 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:34:01Z is after 2025-08-24T17:21:41Z" Sep 30 07:34:01 crc kubenswrapper[4760]: I0930 07:34:01.863069 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:34:01 crc kubenswrapper[4760]: I0930 07:34:01.863125 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:34:01 crc kubenswrapper[4760]: I0930 07:34:01.863134 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:34:01 crc kubenswrapper[4760]: I0930 07:34:01.863154 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:34:01 crc kubenswrapper[4760]: I0930 07:34:01.863169 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:34:01Z","lastTransitionTime":"2025-09-30T07:34:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 07:34:01 crc kubenswrapper[4760]: I0930 07:34:01.893414 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-rpvhp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a8530550-438d-46a5-aa3f-4b10838396f1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:34:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:34:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:34:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0434ec52328c8ee0a2b28fb47833cc6cf12a7048f4c4a217518e029e616fb6a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/d
ocker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7j7qm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T07:33:58Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-rpvhp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:34:01Z is after 2025-08-24T17:21:41Z" Sep 30 07:34:01 crc kubenswrapper[4760]: I0930 07:34:01.932227 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:59Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:59Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fd70ee9ea129c67bb1a9ffac6e1d2abc903949ccd6d2d549f5d78472813dd222\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-09-30T07:34:01Z is after 2025-08-24T17:21:41Z" Sep 30 07:34:01 crc kubenswrapper[4760]: I0930 07:34:01.967712 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:34:01 crc kubenswrapper[4760]: I0930 07:34:01.967762 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:34:01 crc kubenswrapper[4760]: I0930 07:34:01.967778 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:34:01 crc kubenswrapper[4760]: I0930 07:34:01.967801 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:34:01 crc kubenswrapper[4760]: I0930 07:34:01.967814 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:34:01Z","lastTransitionTime":"2025-09-30T07:34:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 07:34:01 crc kubenswrapper[4760]: I0930 07:34:01.977015 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-f2lrk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7a9c8270-6964-4886-87d0-227b05b76da4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e3cf6cd0daf5df3e48c5d44b1a443b1e5646009b9eba42bc3869ee9841e472d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"}
,{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cctgt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://90d669a96085f6fb9b5c9507ff1d5c960bdeafa1d01d7c115b89788e14e155d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cctgt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T07:33:56Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-f2lrk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:34:01Z is after 2025-08-24T17:21:41Z" Sep 30 07:34:02 crc kubenswrapper[4760]: I0930 07:34:02.014869 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-lvdpk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f50c364e-d22c-4fe5-a0aa-66f4e8d8b21e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://96fd613333f4911d54aca94a38724570ef878c3f55ef48caa79eb1c14d0b2014\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8g8kp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T07:33:56Z\\\"}}\" for pod \"openshift-multus\"/\"multus-lvdpk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:34:02Z is after 2025-08-24T17:21:41Z" Sep 30 07:34:02 crc kubenswrapper[4760]: I0930 07:34:02.066478 4760 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 07:34:02 crc kubenswrapper[4760]: I0930 07:34:02.066537 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 07:34:02 crc kubenswrapper[4760]: I0930 07:34:02.066636 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 07:34:02 crc kubenswrapper[4760]: E0930 07:34:02.066652 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 07:34:02 crc kubenswrapper[4760]: E0930 07:34:02.066744 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 07:34:02 crc kubenswrapper[4760]: E0930 07:34:02.066829 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 07:34:02 crc kubenswrapper[4760]: I0930 07:34:02.070835 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:34:02 crc kubenswrapper[4760]: I0930 07:34:02.070865 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:34:02 crc kubenswrapper[4760]: I0930 07:34:02.070876 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:34:02 crc kubenswrapper[4760]: I0930 07:34:02.070892 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:34:02 crc kubenswrapper[4760]: I0930 07:34:02.070902 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:34:02Z","lastTransitionTime":"2025-09-30T07:34:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 07:34:02 crc kubenswrapper[4760]: I0930 07:34:02.173583 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:34:02 crc kubenswrapper[4760]: I0930 07:34:02.173978 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:34:02 crc kubenswrapper[4760]: I0930 07:34:02.173990 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:34:02 crc kubenswrapper[4760]: I0930 07:34:02.174018 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:34:02 crc kubenswrapper[4760]: I0930 07:34:02.174032 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:34:02Z","lastTransitionTime":"2025-09-30T07:34:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 07:34:02 crc kubenswrapper[4760]: I0930 07:34:02.276943 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:34:02 crc kubenswrapper[4760]: I0930 07:34:02.276986 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:34:02 crc kubenswrapper[4760]: I0930 07:34:02.276997 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:34:02 crc kubenswrapper[4760]: I0930 07:34:02.277015 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:34:02 crc kubenswrapper[4760]: I0930 07:34:02.277027 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:34:02Z","lastTransitionTime":"2025-09-30T07:34:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 07:34:02 crc kubenswrapper[4760]: I0930 07:34:02.304616 4760 generic.go:334] "Generic (PLEG): container finished" podID="aade7c8e-aa34-4b19-9000-d724950a70d7" containerID="5980d5ebcac5351bfa8c26b90968593f3fae3d434e824db5a182449ef64b0a13" exitCode=0 Sep 30 07:34:02 crc kubenswrapper[4760]: I0930 07:34:02.304670 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-6vfjl" event={"ID":"aade7c8e-aa34-4b19-9000-d724950a70d7","Type":"ContainerDied","Data":"5980d5ebcac5351bfa8c26b90968593f3fae3d434e824db5a182449ef64b0a13"} Sep 30 07:34:02 crc kubenswrapper[4760]: I0930 07:34:02.323076 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-sv6wk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4a8035f8-210d-4a09-bca5-274ced93774c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://774d38a9cedeb69a2093a9de05db794e856295535f86d73a75c6feea05b61cb3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf
2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6pk8k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T07:33:56Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-sv6wk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:34:02Z is after 2025-08-24T17:21:41Z" Sep 30 07:34:02 crc kubenswrapper[4760]: I0930 07:34:02.346759 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c5ae239e08e07bda9532c0b6b94ef3687cf512cd05965e09123bbb036f915a5b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3abb25619d0c80e84a2095aba45fe6a865125d242e44e9f10820e8c41a3b0db9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:34:02Z is after 2025-08-24T17:21:41Z" Sep 30 07:34:02 crc kubenswrapper[4760]: I0930 07:34:02.365984 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:34:02Z is after 2025-08-24T17:21:41Z" Sep 30 07:34:02 crc kubenswrapper[4760]: I0930 07:34:02.403723 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-rpvhp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a8530550-438d-46a5-aa3f-4b10838396f1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:34:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:34:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:34:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0434ec52328c8ee0a2b28fb47833cc6cf12a7048f4c4a217518e029e616fb6a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7j7qm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T07:33:58Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-rpvhp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:34:02Z is after 2025-08-24T17:21:41Z" Sep 30 07:34:02 crc kubenswrapper[4760]: I0930 07:34:02.421643 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:34:02 crc kubenswrapper[4760]: I0930 07:34:02.421710 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:34:02 crc kubenswrapper[4760]: I0930 07:34:02.421726 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:34:02 crc kubenswrapper[4760]: I0930 07:34:02.421755 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:34:02 crc kubenswrapper[4760]: I0930 07:34:02.421778 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:34:02Z","lastTransitionTime":"2025-09-30T07:34:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 07:34:02 crc kubenswrapper[4760]: I0930 07:34:02.452596 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:59Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:59Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fd70ee9ea129c67bb1a9ffac6e1d2abc903949ccd6d2d549f5d78472813dd222\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:34:02Z is after 2025-08-24T17:21:41Z" Sep 30 07:34:02 crc kubenswrapper[4760]: I0930 07:34:02.484512 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-f2lrk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7a9c8270-6964-4886-87d0-227b05b76da4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e3cf6cd0daf5df3e48c5d44b1a443b1e5646009b9eba42bc3869ee9841e472d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy
\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cctgt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://90d669a96085f6fb9b5c9507ff1d5c960bdeafa1d01d7c115b89788e14e155d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cctgt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T07:33:56Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-f2lrk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 
2025-09-30T07:34:02Z is after 2025-08-24T17:21:41Z" Sep 30 07:34:02 crc kubenswrapper[4760]: I0930 07:34:02.498797 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-lvdpk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f50c364e-d22c-4fe5-a0aa-66f4e8d8b21e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://96fd613333f4911d54aca94a38724570ef878c3f55ef48caa79eb1c14d0b2014\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/e
tc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8g8kp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T07:33:56Z\\\"}}\" for pod \"openshift-multus\"/\"multus-lvdpk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 
2025-09-30T07:34:02Z is after 2025-08-24T17:21:41Z" Sep 30 07:34:02 crc kubenswrapper[4760]: I0930 07:34:02.514949 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c5ac421ed3707970914cf165f54a9324134079296d1b6936332c6047b9c23132\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:34:02Z is after 2025-08-24T17:21:41Z" Sep 30 07:34:02 crc kubenswrapper[4760]: I0930 07:34:02.524217 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:34:02 crc kubenswrapper[4760]: I0930 07:34:02.524254 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:34:02 crc kubenswrapper[4760]: I0930 07:34:02.524267 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:34:02 crc kubenswrapper[4760]: I0930 07:34:02.524288 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:34:02 crc kubenswrapper[4760]: I0930 07:34:02.524318 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:34:02Z","lastTransitionTime":"2025-09-30T07:34:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 07:34:02 crc kubenswrapper[4760]: I0930 07:34:02.529212 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:53Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:34:02Z is after 2025-08-24T17:21:41Z" Sep 30 07:34:02 crc kubenswrapper[4760]: I0930 07:34:02.545558 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:34:02Z is after 2025-08-24T17:21:41Z" Sep 30 07:34:02 crc kubenswrapper[4760]: I0930 07:34:02.568605 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-6vfjl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"aade7c8e-aa34-4b19-9000-d724950a70d7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:56Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:56Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:56Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-df7r7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e9c7c74e636ef9dbade2541863f96b1098f7ae5f1ab02740c8934f8c87c4815e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e9c7c74e636ef9dbade2541863f96b1098f7ae5f1ab02740c8934f8c87c4815e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T07:33:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T07:33:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-df7r7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7abb316d62bc2246ace40ad59b2926962a1c098c5690adea8c20a5b96459a604\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7abb316d62bc2246ace40ad59b2926962a1c098c5690adea8c20a5b96459a604\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T07:33:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T07:33:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-df7r7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://96c45841896c0c1569ad3071a3114e8c4965cc3968244c91719bcf5df92e5d0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://96c45841896c0c1569ad3071a3114e8c4965cc3968244c91719bcf5df92e5d0f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T07:33:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T07:33:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-df7r7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://db2220320618b4dcdbf56bad96dbe01c910fa37948f97792f498b3336212cc7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://db2220320618b4dcdbf56bad96dbe01c910fa37948f97792f498b3336212cc7f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T07:34:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T07:34:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-df7r7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5980d5ebcac5351bfa8c26b90968593f3fae3d434e824db5a182449ef64b0a13\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5980d5ebcac5351bfa8c26b90968593f3fae3d434e824db5a182449ef64b0a13\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T07:34:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T07:34:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-df7r7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"c
nibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-df7r7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T07:33:56Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-6vfjl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:34:02Z is after 2025-08-24T17:21:41Z" Sep 30 07:34:02 crc kubenswrapper[4760]: I0930 07:34:02.593859 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9092fdd9-f142-400d-b446-47b8103b9694\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4cc3328dfae8f6d99c152e6a53b86d72f690afbd7bc4847810f745b321e4f3ea\\\",\\\"image\\\":\\\"quay
.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5aaa898c76100f011c3ad0ba61c9735a940b46b0e770f3ede7fc4c881a4722a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b718c4a73fd119f66dd346618976f0b3589b789576e53d367588c2772becd66e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v
4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d00b9a56c23ba3b5701a31412e038c073b5f16eb4e6ebea42134f7d4c0b0f21\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://00d03c8ff6fa9f490ec0550ac57d20786f1f5c10d91011d678116c2249a79e88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\
"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ed4236004989a888f1f864535bedd2dc4bcad4c51c87f12be074abbe5f8fe8bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ed4236004989a888f1f864535bedd2dc4bcad4c51c87f12be074abbe5f8fe8bd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T07:33:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T07:33:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://553fa289f27580c1c38da1e9ea12ae7231fc4f60c660a9a3a177391adba9c3f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://553fa289f27580c1c38da1e9ea12ae7231fc4f60c660a9a3a177391adba9c3f2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T07:33:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T07:33:37Z\\\"}}},{
\\\"containerID\\\":\\\"cri-o://83b9ee5e77d60387298ccac46ad416a5d56bfd1d81a6ef7f3ea0482520c2ca08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://83b9ee5e77d60387298ccac46ad416a5d56bfd1d81a6ef7f3ea0482520c2ca08\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T07:33:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T07:33:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T07:33:35Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:34:02Z is after 2025-08-24T17:21:41Z" Sep 30 07:34:02 crc kubenswrapper[4760]: I0930 07:34:02.610912 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"031672b4-4779-4226-97aa-f5c81a729234\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b3e26d92b5602604d33e83e83e677084c642df64c9d0b9f9e022b036091fb92e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://53390cb3cc86d0db1019c461bf6cb987190e6fa8812f735844028dfa1ce96648\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf87f93832356c96c1c5e90e6459ccdd23700fd12c6d7410305c66f99f6ed18f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e4230b2df5bf87138f5a40f146bef562ae63ec6f5b73d023e4e2b60722854882\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:3
8Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9c416bd4523f4384b4a2283eb2ada263982380925cbc17713726e0790e5fdee\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:38Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2cd9c18868b27fcfb67186a86f415b484f009c33c07d18f5d508155ed011b5d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2cd9c18868b27fcfb67186a86f415b484f009c33c07d18f5d508155ed011b5d4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T07:33:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T07:33:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startT
ime\\\":\\\"2025-09-30T07:33:35Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:34:02Z is after 2025-08-24T17:21:41Z" Sep 30 07:34:02 crc kubenswrapper[4760]: I0930 07:34:02.626625 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:34:02 crc kubenswrapper[4760]: I0930 07:34:02.626666 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:34:02 crc kubenswrapper[4760]: I0930 07:34:02.626676 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:34:02 crc kubenswrapper[4760]: I0930 07:34:02.626697 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:34:02 crc kubenswrapper[4760]: I0930 07:34:02.626711 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:34:02Z","lastTransitionTime":"2025-09-30T07:34:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 07:34:02 crc kubenswrapper[4760]: I0930 07:34:02.631616 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"67985d3d-0af6-4a8b-b45a-623aed2e502e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43b97af91f474a3b43f207d377ca205e459b782e552a32a134a8b46e89fc8d30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d748511816
054d5f621bd55e203baea2e6e5ac90dec796364b46224308cfd78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bf1b2ce86bf47ce917a20b48f64b7c4419e3a5c551d006d97171311c3b190f0c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e7c9b2cd9f096d99a8b239c0c2dc771e94e6522a3380733b7452969c9fe2dfd2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:850
6ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T07:33:35Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:34:02Z is after 2025-08-24T17:21:41Z" Sep 30 07:34:02 crc kubenswrapper[4760]: I0930 07:34:02.662862 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-sspvl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2c4ca8ea-a714-40e5-9e10-080aef32237b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:56Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:56Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lllfc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lllfc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lllfc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lllfc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lllfc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lllfc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lllfc\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lllfc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1198d13e2814e4dc38c4411d751fba40440a5967d7ca04c855e34fe90f9a6535\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1198d13e2814e4dc38c4411d751fba40440a5967d7ca04c855e34fe90f9a6535\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T07:33:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T07:33:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lllfc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T07:33:56Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-sspvl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:34:02Z is after 2025-08-24T17:21:41Z" Sep 30 07:34:02 crc kubenswrapper[4760]: I0930 07:34:02.697504 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 07:34:02 crc kubenswrapper[4760]: I0930 07:34:02.697669 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 07:34:02 crc kubenswrapper[4760]: I0930 07:34:02.697703 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 07:34:02 crc 
kubenswrapper[4760]: E0930 07:34:02.697733 4760 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 07:34:10.697703181 +0000 UTC m=+36.340609593 (durationBeforeRetry 8s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 07:34:02 crc kubenswrapper[4760]: E0930 07:34:02.697803 4760 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Sep 30 07:34:02 crc kubenswrapper[4760]: E0930 07:34:02.697884 4760 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-09-30 07:34:10.697862886 +0000 UTC m=+36.340769298 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Sep 30 07:34:02 crc kubenswrapper[4760]: E0930 07:34:02.697885 4760 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Sep 30 07:34:02 crc kubenswrapper[4760]: E0930 07:34:02.697949 4760 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-09-30 07:34:10.697939238 +0000 UTC m=+36.340845650 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Sep 30 07:34:02 crc kubenswrapper[4760]: I0930 07:34:02.729485 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:34:02 crc kubenswrapper[4760]: I0930 07:34:02.729524 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:34:02 crc kubenswrapper[4760]: I0930 07:34:02.729535 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:34:02 crc kubenswrapper[4760]: I0930 07:34:02.729560 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 
30 07:34:02 crc kubenswrapper[4760]: I0930 07:34:02.729577 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:34:02Z","lastTransitionTime":"2025-09-30T07:34:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 07:34:02 crc kubenswrapper[4760]: I0930 07:34:02.799819 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 07:34:02 crc kubenswrapper[4760]: I0930 07:34:02.799876 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 07:34:02 crc kubenswrapper[4760]: E0930 07:34:02.800048 4760 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Sep 30 07:34:02 crc kubenswrapper[4760]: E0930 07:34:02.800050 4760 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Sep 30 07:34:02 crc kubenswrapper[4760]: E0930 07:34:02.800092 4760 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object 
"openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Sep 30 07:34:02 crc kubenswrapper[4760]: E0930 07:34:02.800117 4760 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Sep 30 07:34:02 crc kubenswrapper[4760]: E0930 07:34:02.800239 4760 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-09-30 07:34:10.800174844 +0000 UTC m=+36.443081296 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Sep 30 07:34:02 crc kubenswrapper[4760]: E0930 07:34:02.800070 4760 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Sep 30 07:34:02 crc kubenswrapper[4760]: E0930 07:34:02.801343 4760 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Sep 30 07:34:02 crc kubenswrapper[4760]: E0930 07:34:02.801483 4760 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-09-30 07:34:10.801428507 +0000 UTC m=+36.444334959 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Sep 30 07:34:02 crc kubenswrapper[4760]: I0930 07:34:02.833614 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:34:02 crc kubenswrapper[4760]: I0930 07:34:02.833654 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:34:02 crc kubenswrapper[4760]: I0930 07:34:02.833664 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:34:02 crc kubenswrapper[4760]: I0930 07:34:02.833682 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:34:02 crc kubenswrapper[4760]: I0930 07:34:02.833695 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:34:02Z","lastTransitionTime":"2025-09-30T07:34:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 07:34:02 crc kubenswrapper[4760]: I0930 07:34:02.936080 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:34:02 crc kubenswrapper[4760]: I0930 07:34:02.936123 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:34:02 crc kubenswrapper[4760]: I0930 07:34:02.936137 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:34:02 crc kubenswrapper[4760]: I0930 07:34:02.936154 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:34:02 crc kubenswrapper[4760]: I0930 07:34:02.936164 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:34:02Z","lastTransitionTime":"2025-09-30T07:34:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 07:34:03 crc kubenswrapper[4760]: I0930 07:34:03.038325 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:34:03 crc kubenswrapper[4760]: I0930 07:34:03.038363 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:34:03 crc kubenswrapper[4760]: I0930 07:34:03.038374 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:34:03 crc kubenswrapper[4760]: I0930 07:34:03.038394 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:34:03 crc kubenswrapper[4760]: I0930 07:34:03.038407 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:34:03Z","lastTransitionTime":"2025-09-30T07:34:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 07:34:03 crc kubenswrapper[4760]: I0930 07:34:03.141022 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:34:03 crc kubenswrapper[4760]: I0930 07:34:03.141069 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:34:03 crc kubenswrapper[4760]: I0930 07:34:03.141086 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:34:03 crc kubenswrapper[4760]: I0930 07:34:03.141107 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:34:03 crc kubenswrapper[4760]: I0930 07:34:03.141123 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:34:03Z","lastTransitionTime":"2025-09-30T07:34:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 07:34:03 crc kubenswrapper[4760]: I0930 07:34:03.244511 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:34:03 crc kubenswrapper[4760]: I0930 07:34:03.244578 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:34:03 crc kubenswrapper[4760]: I0930 07:34:03.244592 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:34:03 crc kubenswrapper[4760]: I0930 07:34:03.244625 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:34:03 crc kubenswrapper[4760]: I0930 07:34:03.244638 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:34:03Z","lastTransitionTime":"2025-09-30T07:34:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 07:34:03 crc kubenswrapper[4760]: I0930 07:34:03.314653 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-sspvl" event={"ID":"2c4ca8ea-a714-40e5-9e10-080aef32237b","Type":"ContainerStarted","Data":"15fa2b49d0420903a20e43cba97d14d07b922b6347396ae58206b8a4ed21c995"} Sep 30 07:34:03 crc kubenswrapper[4760]: I0930 07:34:03.315177 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-sspvl" Sep 30 07:34:03 crc kubenswrapper[4760]: I0930 07:34:03.321414 4760 generic.go:334] "Generic (PLEG): container finished" podID="aade7c8e-aa34-4b19-9000-d724950a70d7" containerID="5912656a198607c9831da9564b5507d662eb2df00df104d52cf058442a76be6f" exitCode=0 Sep 30 07:34:03 crc kubenswrapper[4760]: I0930 07:34:03.321490 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-6vfjl" event={"ID":"aade7c8e-aa34-4b19-9000-d724950a70d7","Type":"ContainerDied","Data":"5912656a198607c9831da9564b5507d662eb2df00df104d52cf058442a76be6f"} Sep 30 07:34:03 crc kubenswrapper[4760]: I0930 07:34:03.335732 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:59Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:59Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fd70ee9ea129c67bb1a9ffac6e1d2abc903949ccd6d2d549f5d78472813dd222\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-09-30T07:34:03Z is after 2025-08-24T17:21:41Z" Sep 30 07:34:03 crc kubenswrapper[4760]: I0930 07:34:03.348136 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:34:03 crc kubenswrapper[4760]: I0930 07:34:03.348217 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:34:03 crc kubenswrapper[4760]: I0930 07:34:03.348234 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:34:03 crc kubenswrapper[4760]: I0930 07:34:03.348279 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:34:03 crc kubenswrapper[4760]: I0930 07:34:03.348359 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:34:03Z","lastTransitionTime":"2025-09-30T07:34:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 07:34:03 crc kubenswrapper[4760]: I0930 07:34:03.355094 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-f2lrk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7a9c8270-6964-4886-87d0-227b05b76da4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e3cf6cd0daf5df3e48c5d44b1a443b1e5646009b9eba42bc3869ee9841e472d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"}
,{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cctgt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://90d669a96085f6fb9b5c9507ff1d5c960bdeafa1d01d7c115b89788e14e155d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cctgt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T07:33:56Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-f2lrk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:34:03Z is after 2025-08-24T17:21:41Z" Sep 30 07:34:03 crc kubenswrapper[4760]: I0930 07:34:03.359010 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-sspvl" Sep 30 07:34:03 crc kubenswrapper[4760]: I0930 07:34:03.378526 4760 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-multus/multus-lvdpk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f50c364e-d22c-4fe5-a0aa-66f4e8d8b21e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://96fd613333f4911d54aca94a38724570ef878c3f55ef48caa79eb1c14d0b2014\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"moun
tPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8g8kp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T07:33:56Z\\\"}}\" for pod \"openshift-multus\"/\"multus-lvdpk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:34:03Z is after 2025-08-24T17:21:41Z" Sep 30 07:34:03 crc kubenswrapper[4760]: I0930 07:34:03.394395 4760 status_manager.go:875] "Failed to update 
status for pod" pod="openshift-image-registry/node-ca-rpvhp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a8530550-438d-46a5-aa3f-4b10838396f1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:34:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:34:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:34:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0434ec52328c8ee0a2b28fb47833cc6cf12a7048f4c4a217518e029e616fb6a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7j7qm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\
\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T07:33:58Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-rpvhp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:34:03Z is after 2025-08-24T17:21:41Z" Sep 30 07:34:03 crc kubenswrapper[4760]: I0930 07:34:03.431374 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9092fdd9-f142-400d-b446-47b8103b9694\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4cc3328dfae8f6d99c152e6a53b86d72f690afbd7bc4847810f745b321e4f3ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-d
ev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5aaa898c76100f011c3ad0ba61c9735a940b46b0e770f3ede7fc4c881a4722a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b718c4a73fd119f66dd346618976f0b3589b789576e53d367588c2772becd66e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0
,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d00b9a56c23ba3b5701a31412e038c073b5f16eb4e6ebea42134f7d4c0b0f21\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://00d03c8ff6fa9f490ec0550ac57d20786f1f5c10d91011d678116c2249a79e88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"d
ata-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ed4236004989a888f1f864535bedd2dc4bcad4c51c87f12be074abbe5f8fe8bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ed4236004989a888f1f864535bedd2dc4bcad4c51c87f12be074abbe5f8fe8bd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T07:33:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T07:33:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://553fa289f27580c1c38da1e9ea12ae7231fc4f60c660a9a3a177391adba9c3f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://553fa289f27580c1c38da1e9ea12ae7231fc4f60c660a9a3a177391adba9c3f2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T07:33:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T07:33:37Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://83b9ee5e77d60387298ccac46ad416a5d56bfd1d81a6ef7f3ea0482520c2ca08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c68774
41ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://83b9ee5e77d60387298ccac46ad416a5d56bfd1d81a6ef7f3ea0482520c2ca08\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T07:33:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T07:33:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T07:33:35Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:34:03Z is after 2025-08-24T17:21:41Z" Sep 30 07:34:03 crc kubenswrapper[4760]: I0930 07:34:03.450619 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"031672b4-4779-4226-97aa-f5c81a729234\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b3e26d92b5602604d33e83e83e677084c642df64c9d0b9f9e022b036091fb92e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://53390cb3cc86d0db1019c461bf6cb987190e6fa8812f735844028dfa1ce96648\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf87f93832356c96c1c5e90e6459ccdd23700fd12c6d7410305c66f99f6ed18f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e4230b2df5bf87138f5a40f146bef562ae63ec6f5b73d023e4e2b60722854882\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:3
8Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9c416bd4523f4384b4a2283eb2ada263982380925cbc17713726e0790e5fdee\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:38Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2cd9c18868b27fcfb67186a86f415b484f009c33c07d18f5d508155ed011b5d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2cd9c18868b27fcfb67186a86f415b484f009c33c07d18f5d508155ed011b5d4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T07:33:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T07:33:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startT
ime\\\":\\\"2025-09-30T07:33:35Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:34:03Z is after 2025-08-24T17:21:41Z" Sep 30 07:34:03 crc kubenswrapper[4760]: I0930 07:34:03.452964 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:34:03 crc kubenswrapper[4760]: I0930 07:34:03.453007 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:34:03 crc kubenswrapper[4760]: I0930 07:34:03.453022 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:34:03 crc kubenswrapper[4760]: I0930 07:34:03.453047 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:34:03 crc kubenswrapper[4760]: I0930 07:34:03.453063 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:34:03Z","lastTransitionTime":"2025-09-30T07:34:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 07:34:03 crc kubenswrapper[4760]: I0930 07:34:03.466041 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"67985d3d-0af6-4a8b-b45a-623aed2e502e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43b97af91f474a3b43f207d377ca205e459b782e552a32a134a8b46e89fc8d30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d748511816
054d5f621bd55e203baea2e6e5ac90dec796364b46224308cfd78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bf1b2ce86bf47ce917a20b48f64b7c4419e3a5c551d006d97171311c3b190f0c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e7c9b2cd9f096d99a8b239c0c2dc771e94e6522a3380733b7452969c9fe2dfd2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:850
6ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T07:33:35Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:34:03Z is after 2025-08-24T17:21:41Z" Sep 30 07:34:03 crc kubenswrapper[4760]: I0930 07:34:03.482746 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c5ac421ed3707970914cf165f54a9324134079296d1b6936332c6047b9c23132\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-09-30T07:34:03Z is after 2025-08-24T17:21:41Z" Sep 30 07:34:03 crc kubenswrapper[4760]: I0930 07:34:03.497330 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:53Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:34:03Z is after 2025-08-24T17:21:41Z" Sep 30 07:34:03 crc kubenswrapper[4760]: I0930 07:34:03.513259 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:34:03Z is after 2025-08-24T17:21:41Z" Sep 30 07:34:03 crc kubenswrapper[4760]: I0930 07:34:03.530075 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-6vfjl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"aade7c8e-aa34-4b19-9000-d724950a70d7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:56Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:56Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:56Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-df7r7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e9c7c74e636ef9dbade2541863f96b1098f7ae5f1ab02740c8934f8c87c4815e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e9c7c74e636ef9dbade2541863f96b1098f7ae5f1ab02740c8934f8c87c4815e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T07:33:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T07:33:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-df7r7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7abb316d62bc2246ace40ad59b2926962a1c098c5690adea8c20a5b96459a604\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7abb316d62bc2246ace40ad59b2926962a1c098c5690adea8c20a5b96459a604\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T07:33:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T07:33:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-df7r7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://96c45841896c0c1569ad3071a3114e8c4965cc3968244c91719bcf5df92e5d0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://96c45841896c0c1569ad3071a3114e8c4965cc3968244c91719bcf5df92e5d0f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T07:33:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T07:33:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-df7r7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://db2220320618b4dcdbf56bad96dbe01c910fa37948f97792f498b3336212cc7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://db2220320618b4dcdbf56bad96dbe01c910fa37948f97792f498b3336212cc7f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T07:34:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T07:34:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-df7r7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5980d5ebcac5351bfa8c26b90968593f3fae3d434e824db5a182449ef64b0a13\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5980d5ebcac5351bfa8c26b90968593f3fae3d434e824db5a182449ef64b0a13\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T07:34:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T07:34:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-df7r7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"c
nibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-df7r7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T07:33:56Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-6vfjl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:34:03Z is after 2025-08-24T17:21:41Z" Sep 30 07:34:03 crc kubenswrapper[4760]: I0930 07:34:03.554411 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-sspvl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2c4ca8ea-a714-40e5-9e10-080aef32237b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:56Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:56Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04cb9ff2859a579a0bd97d932177b8c19b786666d0177669e93ccec38abf08d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lllfc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9e64e8566880e723a6cc19ad788bd1cbb02eb639397aea4acd6e9920b034f379\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"stat
e\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lllfc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0b6c1fe1d21cc0d118e36d7f9d62fb639a19a42cd8c953700bcd5c289088cdf9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lllfc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bba81e98fd1a75081b7e6a3eaf28b88b4de6e47e1dc158ca72aa2f1c2107933e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d
1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lllfc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://45d8c685d7b690ce81553e4d924934cf761087dd535145c0e8b5a933f2d61e45\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lllfc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef1f6b6ca1a2bbb2f9a166a7244c6411470be4655bc6b2a330fd604c3449d185\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha25
6:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lllfc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://15fa2b49d0420903a20e43cba97d14d07b922b6347396ae58206b8a4ed21c995\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:34:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"
},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lllfc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e72f37942962598d925dfb305311c94c3ca920f93243bb6e97839add4a8
30a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:34:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lllfc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1198d13e2814e4dc38c4411d751fba40440a5967d7ca04c855e34fe90f9a6535\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1198d13e2814e4dc38c4411d751fba40440a5967d7ca04c855e34fe90f9a6535\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T07:33:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T07:33:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mount
Path\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lllfc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T07:33:56Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-sspvl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:34:03Z is after 2025-08-24T17:21:41Z" Sep 30 07:34:03 crc kubenswrapper[4760]: I0930 07:34:03.556664 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:34:03 crc kubenswrapper[4760]: I0930 07:34:03.556711 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:34:03 crc kubenswrapper[4760]: I0930 07:34:03.556726 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:34:03 crc kubenswrapper[4760]: I0930 07:34:03.556746 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:34:03 crc kubenswrapper[4760]: I0930 07:34:03.556759 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:34:03Z","lastTransitionTime":"2025-09-30T07:34:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 07:34:03 crc kubenswrapper[4760]: I0930 07:34:03.578404 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c5ae239e08e07bda9532c0b6b94ef3687cf512cd05965e09123bbb036f915a5b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3abb25619d0c80e84a2095aba45fe6a865125d242e44e9f10820e8c41a3b0db9\\\",\\\
"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:34:03Z is after 2025-08-24T17:21:41Z" Sep 30 07:34:03 crc kubenswrapper[4760]: I0930 07:34:03.593927 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:34:03Z is after 2025-08-24T17:21:41Z" Sep 30 07:34:03 crc kubenswrapper[4760]: I0930 07:34:03.605267 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-sv6wk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4a8035f8-210d-4a09-bca5-274ced93774c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://774d38a9cedeb69a2093a9de05db794e856295535f86d73a75c6feea05b61cb3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6pk8k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T07:33:56Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-sv6wk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:34:03Z is after 2025-08-24T17:21:41Z" Sep 30 07:34:03 crc kubenswrapper[4760]: I0930 07:34:03.627970 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9092fdd9-f142-400d-b446-47b8103b9694\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4cc3328dfae8f6d99c152e6a53b86d72f690afbd7bc4847810f745b321e4f3ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount
\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5aaa898c76100f011c3ad0ba61c9735a940b46b0e770f3ede7fc4c881a4722a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b718c4a73fd119f66dd346618976f0b3589b789576e53d367588c2772becd66e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd
/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d00b9a56c23ba3b5701a31412e038c073b5f16eb4e6ebea42134f7d4c0b0f21\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://00d03c8ff6fa9f490ec0550ac57d20786f1f5c10d91011d678116c2249a79e88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\
\\"cri-o://ed4236004989a888f1f864535bedd2dc4bcad4c51c87f12be074abbe5f8fe8bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ed4236004989a888f1f864535bedd2dc4bcad4c51c87f12be074abbe5f8fe8bd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T07:33:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T07:33:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://553fa289f27580c1c38da1e9ea12ae7231fc4f60c660a9a3a177391adba9c3f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://553fa289f27580c1c38da1e9ea12ae7231fc4f60c660a9a3a177391adba9c3f2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T07:33:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T07:33:37Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://83b9ee5e77d60387298ccac46ad416a5d56bfd1d81a6ef7f3ea0482520c2ca08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33
e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://83b9ee5e77d60387298ccac46ad416a5d56bfd1d81a6ef7f3ea0482520c2ca08\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T07:33:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T07:33:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T07:33:35Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:34:03Z is after 2025-08-24T17:21:41Z" Sep 30 07:34:03 crc kubenswrapper[4760]: I0930 07:34:03.648969 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"031672b4-4779-4226-97aa-f5c81a729234\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b3e26d92b5602604d33e83e83e677084c642df64c9d0b9f9e022b036091fb92e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://53390cb3cc86d0db1019c461bf6cb987190e6fa8812f735844028dfa1ce96648\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf87f93832356c96c1c5e90e6459ccdd23700fd12c6d7410305c66f99f6ed18f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e4230b2df5bf87138f5a40f146bef562ae63ec6f5b73d023e4e2b60722854882\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:3
8Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9c416bd4523f4384b4a2283eb2ada263982380925cbc17713726e0790e5fdee\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:38Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2cd9c18868b27fcfb67186a86f415b484f009c33c07d18f5d508155ed011b5d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2cd9c18868b27fcfb67186a86f415b484f009c33c07d18f5d508155ed011b5d4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T07:33:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T07:33:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startT
ime\\\":\\\"2025-09-30T07:33:35Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:34:03Z is after 2025-08-24T17:21:41Z" Sep 30 07:34:03 crc kubenswrapper[4760]: I0930 07:34:03.659624 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:34:03 crc kubenswrapper[4760]: I0930 07:34:03.659680 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:34:03 crc kubenswrapper[4760]: I0930 07:34:03.659698 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:34:03 crc kubenswrapper[4760]: I0930 07:34:03.659730 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:34:03 crc kubenswrapper[4760]: I0930 07:34:03.659763 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:34:03Z","lastTransitionTime":"2025-09-30T07:34:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 07:34:03 crc kubenswrapper[4760]: I0930 07:34:03.670725 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"67985d3d-0af6-4a8b-b45a-623aed2e502e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43b97af91f474a3b43f207d377ca205e459b782e552a32a134a8b46e89fc8d30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d748511816
054d5f621bd55e203baea2e6e5ac90dec796364b46224308cfd78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bf1b2ce86bf47ce917a20b48f64b7c4419e3a5c551d006d97171311c3b190f0c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e7c9b2cd9f096d99a8b239c0c2dc771e94e6522a3380733b7452969c9fe2dfd2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:850
6ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T07:33:35Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:34:03Z is after 2025-08-24T17:21:41Z" Sep 30 07:34:03 crc kubenswrapper[4760]: I0930 07:34:03.688118 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c5ac421ed3707970914cf165f54a9324134079296d1b6936332c6047b9c23132\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-09-30T07:34:03Z is after 2025-08-24T17:21:41Z" Sep 30 07:34:03 crc kubenswrapper[4760]: I0930 07:34:03.704905 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:53Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:34:03Z is after 2025-08-24T17:21:41Z" Sep 30 07:34:03 crc kubenswrapper[4760]: I0930 07:34:03.722320 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:34:03Z is after 2025-08-24T17:21:41Z" Sep 30 07:34:03 crc kubenswrapper[4760]: I0930 07:34:03.742712 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-6vfjl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"aade7c8e-aa34-4b19-9000-d724950a70d7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:34:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:56Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:56Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-df7r7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e9c7c74e636ef9dbade2541863f96b1098f7ae5f1ab02740c8934f8c87c4815e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e9c7c74e636ef9dbade2541863f96b1098f7ae5f1ab02740c8934f8c87c4815e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T07:33:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T07:33:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-df7r7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7abb316d62bc2246ace40ad59b2926962a1c098c5690adea8c20a5b96459a604\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7abb316d62bc2246ace40ad59b2926962a1c098c5690adea8c20a5b96459a604\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T07:33:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T07:33:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-df7r7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://96c45841896c0c1569ad3071a3114e8c4965cc3968244c91719bcf5df92e5d0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://96c45841896c0c1569ad3071a3114e8c4965cc3968244c91719bcf5df92e5d0f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T07:33:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T07:33:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-df7r7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://db2220320618b4dcdbf56bad96dbe01c910fa37948f97792f498b3336212cc7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://db2220320618b4dcdbf56bad96dbe01c910fa37948f97792f498b3336212cc7f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T07:34:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T07:34:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-df7r7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5980d5ebcac5351bfa8c26b90968593f3fae3d434e824db5a182449ef64b0a13\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5980d5ebcac5351bfa8c26b90968593f3fae3d434e824db5a182449ef64b0a13\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T07:34:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T07:34:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-df7r7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5912656a198607c9831da9564b5507d662eb2df00df104d52cf058442a76be6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"
ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5912656a198607c9831da9564b5507d662eb2df00df104d52cf058442a76be6f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T07:34:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T07:34:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-df7r7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T07:33:56Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-6vfjl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:34:03Z is after 2025-08-24T17:21:41Z" Sep 30 07:34:03 crc kubenswrapper[4760]: I0930 07:34:03.763265 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:34:03 crc kubenswrapper[4760]: I0930 07:34:03.763324 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:34:03 crc kubenswrapper[4760]: I0930 07:34:03.763335 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:34:03 crc kubenswrapper[4760]: I0930 07:34:03.763354 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:34:03 crc kubenswrapper[4760]: I0930 07:34:03.763365 4760 
setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:34:03Z","lastTransitionTime":"2025-09-30T07:34:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 07:34:03 crc kubenswrapper[4760]: I0930 07:34:03.766810 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-sspvl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2c4ca8ea-a714-40e5-9e10-080aef32237b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:56Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:56Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04cb9ff2859a579a0bd97d932177b8c19b786666d0177669e93ccec38abf08d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lllfc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9e64e8566880e723a6cc19ad788bd1cbb02eb639397aea4acd6e9920b034f379\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\
",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lllfc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0b6c1fe1d21cc0d118e36d7f9d62fb639a19a42cd8c953700bcd5c289088cdf9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lllfc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bba81e98fd1a75081b7e6a3eaf28b88b4de6e47e1dc158ca72aa2f1c2107933e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:58Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lllfc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://45d8c685d7b690ce81553e4d924934cf761087dd535145c0e8b5a933f2d61e45\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lllfc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef1f6b6ca1a2bbb2f9a166a7244c6411470be4655bc6b2a330fd604c3449d185\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lllfc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://15fa2b49d0420903a20e43cba97d14d07b922b6347396ae58206b8a4ed21c995\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:34:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lllfc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e72f37942962598d925dfb305311c94c3ca920f93243bb6e97839add4a830a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:34:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lllfc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1198d13e2814e4dc38c4411d751fba40440a5967d7ca04c855e34fe90f9a6535\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1198d13e2814e4dc38c4411d751fba40440a5967d7ca04c855e34fe90f9a6535\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T07:33:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T07:33:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lllfc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T07:33:56Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-sspvl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:34:03Z is after 2025-08-24T17:21:41Z" Sep 30 07:34:03 crc kubenswrapper[4760]: I0930 07:34:03.781877 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c5ae239e08e07bda9532c0b6b94ef3687cf512cd05965e09123bbb036f915a5b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:5
5Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3abb25619d0c80e84a2095aba45fe6a865125d242e44e9f10820e8c41a3b0db9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:34:03Z is after 2025-08-24T17:21:41Z" Sep 30 07:34:03 crc kubenswrapper[4760]: I0930 07:34:03.795802 4760 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:34:03Z is after 2025-08-24T17:21:41Z" Sep 30 07:34:03 crc kubenswrapper[4760]: I0930 07:34:03.807754 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-sv6wk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4a8035f8-210d-4a09-bca5-274ced93774c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://774d38a9cedeb69a2093a9de05db794e856295535f86d73a75c6feea05b61cb3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6pk8k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T07:33:56Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-sv6wk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:34:03Z is after 2025-08-24T17:21:41Z" Sep 30 07:34:03 crc kubenswrapper[4760]: I0930 07:34:03.822008 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:59Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:59Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fd70ee9ea129c67bb1a9ffac6e1d2abc903949ccd6d2d549f5d78472813dd222\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-al
erter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:34:03Z is after 2025-08-24T17:21:41Z" Sep 30 07:34:03 crc kubenswrapper[4760]: I0930 07:34:03.838931 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-f2lrk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7a9c8270-6964-4886-87d0-227b05b76da4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e3cf6cd0daf5df3e48c5d4
4b1a443b1e5646009b9eba42bc3869ee9841e472d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cctgt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://90d669a96085f6fb9b5c9507ff1d5c960bdeafa1d01d7c115b89788e14e155d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cctgt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\
":\\\"2025-09-30T07:33:56Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-f2lrk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:34:03Z is after 2025-08-24T17:21:41Z" Sep 30 07:34:03 crc kubenswrapper[4760]: I0930 07:34:03.859159 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-lvdpk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f50c364e-d22c-4fe5-a0aa-66f4e8d8b21e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://96fd613333f4911d54aca94a38724570ef878c3f55ef48caa79eb1c14d0b2014\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\
\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8g8kp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.
11\\\"}],\\\"startTime\\\":\\\"2025-09-30T07:33:56Z\\\"}}\" for pod \"openshift-multus\"/\"multus-lvdpk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:34:03Z is after 2025-08-24T17:21:41Z" Sep 30 07:34:03 crc kubenswrapper[4760]: I0930 07:34:03.866096 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:34:03 crc kubenswrapper[4760]: I0930 07:34:03.866156 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:34:03 crc kubenswrapper[4760]: I0930 07:34:03.866174 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:34:03 crc kubenswrapper[4760]: I0930 07:34:03.866202 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:34:03 crc kubenswrapper[4760]: I0930 07:34:03.866220 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:34:03Z","lastTransitionTime":"2025-09-30T07:34:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 07:34:03 crc kubenswrapper[4760]: I0930 07:34:03.876470 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-rpvhp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a8530550-438d-46a5-aa3f-4b10838396f1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:34:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:34:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:34:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0434ec52328c8ee0a2b28fb47833cc6cf12a7048f4c4a217518e029e616fb6a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes
.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7j7qm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T07:33:58Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-rpvhp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:34:03Z is after 2025-08-24T17:21:41Z" Sep 30 07:34:03 crc kubenswrapper[4760]: I0930 07:34:03.969461 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:34:03 crc kubenswrapper[4760]: I0930 07:34:03.969518 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:34:03 crc kubenswrapper[4760]: I0930 07:34:03.969530 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:34:03 crc kubenswrapper[4760]: I0930 07:34:03.969557 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:34:03 crc kubenswrapper[4760]: I0930 07:34:03.969577 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:34:03Z","lastTransitionTime":"2025-09-30T07:34:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 07:34:04 crc kubenswrapper[4760]: I0930 07:34:04.066106 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 07:34:04 crc kubenswrapper[4760]: I0930 07:34:04.066180 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 07:34:04 crc kubenswrapper[4760]: I0930 07:34:04.066208 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 07:34:04 crc kubenswrapper[4760]: E0930 07:34:04.066296 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 07:34:04 crc kubenswrapper[4760]: E0930 07:34:04.066507 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 07:34:04 crc kubenswrapper[4760]: E0930 07:34:04.066783 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 07:34:04 crc kubenswrapper[4760]: I0930 07:34:04.072554 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:34:04 crc kubenswrapper[4760]: I0930 07:34:04.072603 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:34:04 crc kubenswrapper[4760]: I0930 07:34:04.072622 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:34:04 crc kubenswrapper[4760]: I0930 07:34:04.072649 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:34:04 crc kubenswrapper[4760]: I0930 07:34:04.072670 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:34:04Z","lastTransitionTime":"2025-09-30T07:34:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 07:34:04 crc kubenswrapper[4760]: I0930 07:34:04.176931 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:34:04 crc kubenswrapper[4760]: I0930 07:34:04.176990 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:34:04 crc kubenswrapper[4760]: I0930 07:34:04.177004 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:34:04 crc kubenswrapper[4760]: I0930 07:34:04.177027 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:34:04 crc kubenswrapper[4760]: I0930 07:34:04.177043 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:34:04Z","lastTransitionTime":"2025-09-30T07:34:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 07:34:04 crc kubenswrapper[4760]: I0930 07:34:04.280547 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:34:04 crc kubenswrapper[4760]: I0930 07:34:04.280630 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:34:04 crc kubenswrapper[4760]: I0930 07:34:04.280645 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:34:04 crc kubenswrapper[4760]: I0930 07:34:04.280668 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:34:04 crc kubenswrapper[4760]: I0930 07:34:04.280685 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:34:04Z","lastTransitionTime":"2025-09-30T07:34:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 07:34:04 crc kubenswrapper[4760]: I0930 07:34:04.333271 4760 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Sep 30 07:34:04 crc kubenswrapper[4760]: I0930 07:34:04.333373 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-6vfjl" event={"ID":"aade7c8e-aa34-4b19-9000-d724950a70d7","Type":"ContainerStarted","Data":"8c7192e617554ba6b5b9533792f166344fbf9681b63e57c180809e721cc18168"} Sep 30 07:34:04 crc kubenswrapper[4760]: I0930 07:34:04.334154 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-sspvl" Sep 30 07:34:04 crc kubenswrapper[4760]: I0930 07:34:04.362868 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c5ae239e08e07bda9532c0b6b94ef3687cf512cd05965e09123bbb036f915a5b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\
\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3abb25619d0c80e84a2095aba45fe6a865125d242e44e9f10820e8c41a3b0db9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:34:04Z is after 2025-08-24T17:21:41Z" Sep 30 07:34:04 
crc kubenswrapper[4760]: I0930 07:34:04.372883 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-sspvl" Sep 30 07:34:04 crc kubenswrapper[4760]: I0930 07:34:04.384610 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:34:04Z is after 2025-08-24T17:21:41Z" Sep 30 07:34:04 crc kubenswrapper[4760]: I0930 07:34:04.385293 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:34:04 crc kubenswrapper[4760]: I0930 07:34:04.385364 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:34:04 crc kubenswrapper[4760]: I0930 07:34:04.385382 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:34:04 crc kubenswrapper[4760]: I0930 07:34:04.385406 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:34:04 crc kubenswrapper[4760]: I0930 07:34:04.385432 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:34:04Z","lastTransitionTime":"2025-09-30T07:34:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 07:34:04 crc kubenswrapper[4760]: I0930 07:34:04.400622 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-sv6wk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4a8035f8-210d-4a09-bca5-274ced93774c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://774d38a9cedeb69a2093a9de05db794e856295535f86d73a75c6feea05b61cb3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var
/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6pk8k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T07:33:56Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-sv6wk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:34:04Z is after 2025-08-24T17:21:41Z" Sep 30 07:34:04 crc kubenswrapper[4760]: I0930 07:34:04.417329 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:59Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:59Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fd70ee9ea129c67bb1a9ffac6e1d2abc903949ccd6d2d549f5d78472813dd222\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3551
2335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:34:04Z is after 2025-08-24T17:21:41Z" Sep 30 07:34:04 crc kubenswrapper[4760]: I0930 07:34:04.438463 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-f2lrk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7a9c8270-6964-4886-87d0-227b05b76da4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e3cf6cd0daf5df3e48c5d44b1a443b1e5646009b9eba42bc3869ee9841e472d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cctgt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://90d669a96085f6fb9b5c9507ff1d5c960bdeafa1
d01d7c115b89788e14e155d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cctgt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T07:33:56Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-f2lrk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:34:04Z is after 2025-08-24T17:21:41Z" Sep 30 07:34:04 crc kubenswrapper[4760]: I0930 07:34:04.456540 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-lvdpk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f50c364e-d22c-4fe5-a0aa-66f4e8d8b21e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://96fd613333f4911d54aca94a38724570ef878c3f55ef48caa79eb1c14d0b2014\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8g8kp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T07:33:56Z\\\"}}\" for pod \"openshift-multus\"/\"multus-lvdpk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:34:04Z is after 2025-08-24T17:21:41Z" Sep 30 07:34:04 crc kubenswrapper[4760]: I0930 07:34:04.474019 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-rpvhp" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a8530550-438d-46a5-aa3f-4b10838396f1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:34:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:34:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:34:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0434ec52328c8ee0a2b28fb47833cc6cf12a7048f4c4a217518e029e616fb6a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7j7qm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\
\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T07:33:58Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-rpvhp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:34:04Z is after 2025-08-24T17:21:41Z" Sep 30 07:34:04 crc kubenswrapper[4760]: I0930 07:34:04.488251 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:34:04 crc kubenswrapper[4760]: I0930 07:34:04.488353 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:34:04 crc kubenswrapper[4760]: I0930 07:34:04.488372 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:34:04 crc kubenswrapper[4760]: I0930 07:34:04.488400 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:34:04 crc kubenswrapper[4760]: I0930 07:34:04.488419 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:34:04Z","lastTransitionTime":"2025-09-30T07:34:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 07:34:04 crc kubenswrapper[4760]: I0930 07:34:04.497439 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:53Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:34:04Z is after 2025-08-24T17:21:41Z" Sep 30 07:34:04 crc kubenswrapper[4760]: I0930 07:34:04.519891 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:34:04Z is after 2025-08-24T17:21:41Z" Sep 30 07:34:04 crc kubenswrapper[4760]: I0930 07:34:04.547739 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-6vfjl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"aade7c8e-aa34-4b19-9000-d724950a70d7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:34:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:34:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:34:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8c7192e617554ba6b5b9533792f166344fbf9681b63e57c180809e721cc18168\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:34:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-df7r7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e9c7c74e636ef9dbade2541863f96b1098f7ae5f1ab02740c8934f8c87c4815e\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e9c7c74e636ef9dbade2541863f96b1098f7ae5f1ab02740c8934f8c87c4815e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T07:33:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T07:33:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-df7r7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7abb316d62bc2246ace40ad59b2926962a1c098c5690adea8c20a5b96459a604\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7abb316d62bc2246ace40ad59b2926962a1c098c5690adea8c20a5b96459a604\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T07:33:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T07:33:58Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-df7r7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://96c45841896c0c1569ad3071a3114e8c4965cc3968244c91719bcf5df92e5d0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://96c45841896c0c1569ad3071a3114e8c4965cc3968244c91719bcf5df92e5d0f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T07:33:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T07:33:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-df7r7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://db222
0320618b4dcdbf56bad96dbe01c910fa37948f97792f498b3336212cc7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://db2220320618b4dcdbf56bad96dbe01c910fa37948f97792f498b3336212cc7f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T07:34:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T07:34:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-df7r7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5980d5ebcac5351bfa8c26b90968593f3fae3d434e824db5a182449ef64b0a13\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5980d5ebcac5351bfa8c26b90968593f3fae3d434e824db5a182449ef64b0a13\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T07:34:01Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-09-30T07:34:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-df7r7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5912656a198607c9831da9564b5507d662eb2df00df104d52cf058442a76be6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5912656a198607c9831da9564b5507d662eb2df00df104d52cf058442a76be6f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T07:34:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T07:34:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-df7r7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T07:33:56Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-6vfjl\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:34:04Z is after 2025-08-24T17:21:41Z" Sep 30 07:34:04 crc kubenswrapper[4760]: I0930 07:34:04.591324 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:34:04 crc kubenswrapper[4760]: I0930 07:34:04.591364 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:34:04 crc kubenswrapper[4760]: I0930 07:34:04.591375 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:34:04 crc kubenswrapper[4760]: I0930 07:34:04.591392 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:34:04 crc kubenswrapper[4760]: I0930 07:34:04.591405 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:34:04Z","lastTransitionTime":"2025-09-30T07:34:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 07:34:04 crc kubenswrapper[4760]: I0930 07:34:04.592138 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9092fdd9-f142-400d-b446-47b8103b9694\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4cc3328dfae8f6d99c152e6a53b86d72f690afbd7bc4847810f745b321e4f3ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5aaa898c76100f011c3ad0ba61c9735a940b46b0e770f3ede7fc4c881a4722a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b718c4a73fd119f66dd346618976f0b3589b789576e53d367588c2772becd66e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d00b9a56c23ba3b5701a31412e038c073b5f16eb4e6ebea42134f7d4c0b0f21\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://00d03c8ff6fa9f490ec0550ac57d20786f1f5c10d91011d678116c2249a79e88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ed4236004989a888f1f864535bedd2dc4bcad4c51c87f12be074abbe5f8fe8bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ed4236004989a888f1f864535bedd2dc4bcad4c51c87f12be074abbe5f8fe8bd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T07:33:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T07:33:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://553fa289f27580c1c38da1e9ea12ae7231fc4f60c660a9a3a177391adba9c3f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://553fa289f27580c1c38da1e9ea12ae7231fc4f60c660a9a3a177391adba9c3f2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T07:33:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T07:33:37Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://83b9ee5e77d60387298ccac46ad416a5d56bfd1d81a6ef7f3ea0482520c2ca08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://83b9ee5e77d60387298ccac46ad416a5d56bfd1d81a6ef7f3ea0482520c2ca08\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T07:33:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
025-09-30T07:33:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T07:33:35Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:34:04Z is after 2025-08-24T17:21:41Z" Sep 30 07:34:04 crc kubenswrapper[4760]: I0930 07:34:04.608937 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"031672b4-4779-4226-97aa-f5c81a729234\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b3e26d92b5602604d33e83e83e677084c642df64c9d0b9f9e022b036091fb92e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://53390cb3cc86d0db1019c461bf6cb987190e6fa8812f735844028dfa1ce96648\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf87f93832356c96c1c5e90e6459ccdd23700fd12c6d7410305c66f99f6ed18f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e4230b2df5bf87138f5a40f146bef562ae63ec6f5b73d023e4e2b60722854882\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:3
8Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9c416bd4523f4384b4a2283eb2ada263982380925cbc17713726e0790e5fdee\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:38Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2cd9c18868b27fcfb67186a86f415b484f009c33c07d18f5d508155ed011b5d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2cd9c18868b27fcfb67186a86f415b484f009c33c07d18f5d508155ed011b5d4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T07:33:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T07:33:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startT
ime\\\":\\\"2025-09-30T07:33:35Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:34:04Z is after 2025-08-24T17:21:41Z" Sep 30 07:34:04 crc kubenswrapper[4760]: I0930 07:34:04.629090 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"67985d3d-0af6-4a8b-b45a-623aed2e502e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43b97af91f474a3b43f207d377ca205e459b782e552a32a134a8b46e89fc8d30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\
\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d748511816054d5f621bd55e203baea2e6e5ac90dec796364b46224308cfd78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bf1b2ce86bf47ce917a20b48f64b7c4419e3a5c551d006d97171311c3b190f0c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"c
ert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e7c9b2cd9f096d99a8b239c0c2dc771e94e6522a3380733b7452969c9fe2dfd2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T07:33:35Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:34:04Z is after 2025-08-24T17:21:41Z" Sep 30 07:34:04 crc kubenswrapper[4760]: I0930 07:34:04.650929 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c5ac421ed3707970914cf165f54a9324134079296d1b6936332c6047b9c23132\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-09-30T07:34:04Z is after 2025-08-24T17:21:41Z" Sep 30 07:34:04 crc kubenswrapper[4760]: I0930 07:34:04.683589 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-sspvl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2c4ca8ea-a714-40e5-9e10-080aef32237b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:56Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:56Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04cb9ff2859a579a0bd97d932177b8c19b786666d0177669e93ccec38abf08d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lllfc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9e64e8566880e723a6cc19ad788bd1cbb02eb639397aea4acd6e9920b034f379\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\
",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lllfc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0b6c1fe1d21cc0d118e36d7f9d62fb639a19a42cd8c953700bcd5c289088cdf9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lllfc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bba81e98fd1a75081b7e6a3eaf28b88b4de6e47e1dc158ca72aa2f1c2107933e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:58Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lllfc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://45d8c685d7b690ce81553e4d924934cf761087dd535145c0e8b5a933f2d61e45\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lllfc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef1f6b6ca1a2bbb2f9a166a7244c6411470be4655bc6b2a330fd604c3449d185\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lllfc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://15fa2b49d0420903a20e43cba97d14d07b922b6347396ae58206b8a4ed21c995\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:34:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lllfc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e72f37942962598d925dfb305311c94c3ca920f93243bb6e97839add4a830a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:34:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lllfc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1198d13e2814e4dc38c4411d751fba40440a5967d7ca04c855e34fe90f9a6535\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1198d13e2814e4dc38c4411d751fba40440a5967d7ca04c855e34fe90f9a6535\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T07:33:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T07:33:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lllfc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T07:33:56Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-sspvl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:34:04Z is after 2025-08-24T17:21:41Z" Sep 30 07:34:04 crc kubenswrapper[4760]: I0930 07:34:04.695421 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:34:04 crc kubenswrapper[4760]: I0930 07:34:04.695488 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:34:04 crc kubenswrapper[4760]: I0930 07:34:04.695511 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:34:04 crc kubenswrapper[4760]: I0930 07:34:04.695534 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:34:04 crc kubenswrapper[4760]: I0930 07:34:04.695552 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:34:04Z","lastTransitionTime":"2025-09-30T07:34:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 07:34:04 crc kubenswrapper[4760]: I0930 07:34:04.703198 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c5ae239e08e07bda9532c0b6b94ef3687cf512cd05965e09123bbb036f915a5b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3abb25619d0c80e84a2095aba45fe6a865125d242e44e9f10820e8c41a3b0db9\\\",\\\
"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:34:04Z is after 2025-08-24T17:21:41Z" Sep 30 07:34:04 crc kubenswrapper[4760]: I0930 07:34:04.721529 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:34:04Z is after 2025-08-24T17:21:41Z" Sep 30 07:34:04 crc kubenswrapper[4760]: I0930 07:34:04.735455 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-sv6wk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4a8035f8-210d-4a09-bca5-274ced93774c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://774d38a9cedeb69a2093a9de05db794e856295535f86d73a75c6feea05b61cb3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6pk8k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T07:33:56Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-sv6wk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:34:04Z is after 2025-08-24T17:21:41Z" Sep 30 07:34:04 crc kubenswrapper[4760]: I0930 07:34:04.752193 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:59Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:59Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fd70ee9ea129c67bb1a9ffac6e1d2abc903949ccd6d2d549f5d78472813dd222\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-al
erter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:34:04Z is after 2025-08-24T17:21:41Z" Sep 30 07:34:04 crc kubenswrapper[4760]: I0930 07:34:04.766833 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-f2lrk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7a9c8270-6964-4886-87d0-227b05b76da4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e3cf6cd0daf5df3e48c5d4
4b1a443b1e5646009b9eba42bc3869ee9841e472d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cctgt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://90d669a96085f6fb9b5c9507ff1d5c960bdeafa1d01d7c115b89788e14e155d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cctgt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\
":\\\"2025-09-30T07:33:56Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-f2lrk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:34:04Z is after 2025-08-24T17:21:41Z" Sep 30 07:34:04 crc kubenswrapper[4760]: I0930 07:34:04.789678 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-lvdpk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f50c364e-d22c-4fe5-a0aa-66f4e8d8b21e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://96fd613333f4911d54aca94a38724570ef878c3f55ef48caa79eb1c14d0b2014\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\
\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8g8kp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.
11\\\"}],\\\"startTime\\\":\\\"2025-09-30T07:33:56Z\\\"}}\" for pod \"openshift-multus\"/\"multus-lvdpk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:34:04Z is after 2025-08-24T17:21:41Z" Sep 30 07:34:04 crc kubenswrapper[4760]: I0930 07:34:04.799675 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:34:04 crc kubenswrapper[4760]: I0930 07:34:04.799751 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:34:04 crc kubenswrapper[4760]: I0930 07:34:04.799774 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:34:04 crc kubenswrapper[4760]: I0930 07:34:04.799805 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:34:04 crc kubenswrapper[4760]: I0930 07:34:04.799826 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:34:04Z","lastTransitionTime":"2025-09-30T07:34:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 07:34:04 crc kubenswrapper[4760]: I0930 07:34:04.808125 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-rpvhp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a8530550-438d-46a5-aa3f-4b10838396f1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:34:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:34:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:34:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0434ec52328c8ee0a2b28fb47833cc6cf12a7048f4c4a217518e029e616fb6a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes
.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7j7qm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T07:33:58Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-rpvhp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:34:04Z is after 2025-08-24T17:21:41Z" Sep 30 07:34:04 crc kubenswrapper[4760]: I0930 07:34:04.835341 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9092fdd9-f142-400d-b446-47b8103b9694\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4cc3328dfae8f6d99c152e6a53b86d72f690afbd7bc4847810f745b321e4f3ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5aaa898c76100f011c3ad0ba61c9735a940b46b0e770f3ede7fc4c881a4722a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b718c4a73fd119f66dd346618976f0b3589b789576e53d367588c2772becd66e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1
847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d00b9a56c23ba3b5701a31412e038c073b5f16eb4e6ebea42134f7d4c0b0f21\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://00d03c8ff6fa9f490ec0550ac57d20786f1f5c10d91011d678116c2249a79e88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"
mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ed4236004989a888f1f864535bedd2dc4bcad4c51c87f12be074abbe5f8fe8bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ed4236004989a888f1f864535bedd2dc4bcad4c51c87f12be074abbe5f8fe8bd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T07:33:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T07:33:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://553fa289f27580c1c38da1e9ea12ae7231fc4f60c660a9a3a177391adba9c3f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://553fa289f27580c1c38da1e9ea12ae7231fc4f60c660a9a3a177391adba9c3f2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T07:33:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T07:33:37Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://83b9e
e5e77d60387298ccac46ad416a5d56bfd1d81a6ef7f3ea0482520c2ca08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://83b9ee5e77d60387298ccac46ad416a5d56bfd1d81a6ef7f3ea0482520c2ca08\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T07:33:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T07:33:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T07:33:35Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:34:04Z is after 2025-08-24T17:21:41Z" Sep 30 07:34:04 crc kubenswrapper[4760]: I0930 07:34:04.859542 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"031672b4-4779-4226-97aa-f5c81a729234\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b3e26d92b5602604d33e83e83e677084c642df64c9d0b9f9e022b036091fb92e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://53390cb3cc86d0db1019c461bf6cb987190e6fa8812f735844028dfa1ce96648\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf87f93832356c96c1c5e90e6459ccdd23700fd12c6d7410305c66f99f6ed18f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e4230b2df5bf87138f5a40f146bef562ae63ec6f5b73d023e4e2b60722854882\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:3
8Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9c416bd4523f4384b4a2283eb2ada263982380925cbc17713726e0790e5fdee\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:38Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2cd9c18868b27fcfb67186a86f415b484f009c33c07d18f5d508155ed011b5d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2cd9c18868b27fcfb67186a86f415b484f009c33c07d18f5d508155ed011b5d4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T07:33:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T07:33:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startT
ime\\\":\\\"2025-09-30T07:33:35Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:34:04Z is after 2025-08-24T17:21:41Z" Sep 30 07:34:04 crc kubenswrapper[4760]: I0930 07:34:04.884766 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"67985d3d-0af6-4a8b-b45a-623aed2e502e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43b97af91f474a3b43f207d377ca205e459b782e552a32a134a8b46e89fc8d30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\
\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d748511816054d5f621bd55e203baea2e6e5ac90dec796364b46224308cfd78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bf1b2ce86bf47ce917a20b48f64b7c4419e3a5c551d006d97171311c3b190f0c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"c
ert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e7c9b2cd9f096d99a8b239c0c2dc771e94e6522a3380733b7452969c9fe2dfd2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T07:33:35Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:34:04Z is after 2025-08-24T17:21:41Z" Sep 30 07:34:04 crc kubenswrapper[4760]: I0930 07:34:04.902402 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:34:04 crc kubenswrapper[4760]: I0930 07:34:04.902458 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:34:04 crc kubenswrapper[4760]: I0930 07:34:04.902473 4760 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:34:04 crc kubenswrapper[4760]: I0930 07:34:04.902495 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:34:04 crc kubenswrapper[4760]: I0930 07:34:04.902512 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:34:04Z","lastTransitionTime":"2025-09-30T07:34:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 07:34:04 crc kubenswrapper[4760]: I0930 07:34:04.906573 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c5ac421ed3707970914cf165f54a9324134079296d1b6936332c6047b9c23132\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d
608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:34:04Z is after 2025-08-24T17:21:41Z" Sep 30 07:34:04 crc kubenswrapper[4760]: I0930 07:34:04.926759 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:53Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:34:04Z is after 2025-08-24T17:21:41Z" Sep 30 07:34:04 crc kubenswrapper[4760]: I0930 07:34:04.951158 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:53Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:34:04Z is after 2025-08-24T17:21:41Z" Sep 30 07:34:04 crc kubenswrapper[4760]: I0930 07:34:04.979159 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-6vfjl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"aade7c8e-aa34-4b19-9000-d724950a70d7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:34:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:34:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:34:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8c7192e617554ba6b5b9533792f166344fbf9681b63e57c180809e721cc18168\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:34:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-df7r7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e9c7c74e636ef9dbade2541863f96b1098f7ae5f1ab02740c8934f8c87c4815e\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e9c7c74e636ef9dbade2541863f96b1098f7ae5f1ab02740c8934f8c87c4815e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T07:33:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T07:33:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-df7r7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7abb316d62bc2246ace40ad59b2926962a1c098c5690adea8c20a5b96459a604\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7abb316d62bc2246ace40ad59b2926962a1c098c5690adea8c20a5b96459a604\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T07:33:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T07:33:58Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-df7r7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://96c45841896c0c1569ad3071a3114e8c4965cc3968244c91719bcf5df92e5d0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://96c45841896c0c1569ad3071a3114e8c4965cc3968244c91719bcf5df92e5d0f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T07:33:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T07:33:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-df7r7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://db222
0320618b4dcdbf56bad96dbe01c910fa37948f97792f498b3336212cc7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://db2220320618b4dcdbf56bad96dbe01c910fa37948f97792f498b3336212cc7f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T07:34:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T07:34:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-df7r7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5980d5ebcac5351bfa8c26b90968593f3fae3d434e824db5a182449ef64b0a13\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5980d5ebcac5351bfa8c26b90968593f3fae3d434e824db5a182449ef64b0a13\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T07:34:01Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-09-30T07:34:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-df7r7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5912656a198607c9831da9564b5507d662eb2df00df104d52cf058442a76be6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5912656a198607c9831da9564b5507d662eb2df00df104d52cf058442a76be6f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T07:34:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T07:34:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-df7r7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T07:33:56Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-6vfjl\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:34:04Z is after 2025-08-24T17:21:41Z" Sep 30 07:34:05 crc kubenswrapper[4760]: I0930 07:34:05.005338 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:34:05 crc kubenswrapper[4760]: I0930 07:34:05.005374 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:34:05 crc kubenswrapper[4760]: I0930 07:34:05.005386 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:34:05 crc kubenswrapper[4760]: I0930 07:34:05.005406 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:34:05 crc kubenswrapper[4760]: I0930 07:34:05.005417 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:34:05Z","lastTransitionTime":"2025-09-30T07:34:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 07:34:05 crc kubenswrapper[4760]: I0930 07:34:05.027662 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-sspvl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2c4ca8ea-a714-40e5-9e10-080aef32237b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:56Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:56Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04cb9ff2859a579a0bd97d932177b8c19b786666d0177669e93ccec38abf08d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lllfc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9e64e8566880e723a6cc19ad788bd1cbb02eb639397aea4acd6e9920b034f379\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lllfc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0b6c1fe1d21cc0d118e36d7f9d62fb639a19a42cd8c953700bcd5c289088cdf9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lllfc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bba81e98fd1a75081b7e6a3eaf28b88b4de6e47e1dc158ca72aa2f1c2107933e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:58Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lllfc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://45d8c685d7b690ce81553e4d924934cf761087dd535145c0e8b5a933f2d61e45\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lllfc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef1f6b6ca1a2bbb2f9a166a7244c6411470be4655bc6b2a330fd604c3449d185\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lllfc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://15fa2b49d0420903a20e43cba97d14d07b922b6347396ae58206b8a4ed21c995\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:34:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lllfc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e72f37942962598d925dfb305311c94c3ca920f93243bb6e97839add4a830a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:34:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lllfc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1198d13e2814e4dc38c4411d751fba40440a5967d7ca04c855e34fe90f9a6535\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1198d13e2814e4dc38c4411d751fba40440a5967d7ca04c855e34fe90f9a6535\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T07:33:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T07:33:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lllfc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T07:33:56Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-sspvl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:34:05Z is after 2025-08-24T17:21:41Z" Sep 30 07:34:05 crc kubenswrapper[4760]: I0930 07:34:05.109376 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:34:05 crc kubenswrapper[4760]: I0930 07:34:05.109550 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:34:05 crc kubenswrapper[4760]: I0930 07:34:05.109579 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:34:05 crc kubenswrapper[4760]: I0930 07:34:05.109648 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:34:05 crc kubenswrapper[4760]: I0930 07:34:05.109672 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:34:05Z","lastTransitionTime":"2025-09-30T07:34:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 07:34:05 crc kubenswrapper[4760]: I0930 07:34:05.111644 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-6vfjl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"aade7c8e-aa34-4b19-9000-d724950a70d7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:34:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:34:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:34:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8c7192e617554ba6b5b9533792f166344fbf9681b63e57c180809e721cc18168\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:34:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-df7r7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e9c7c74e636ef9dbade2541863f96b1098f7ae5f1ab02740c8934f8c87c4815e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e9c7c74e636ef9dbade2541863f96b1098f7ae5f1ab02740c8934f8c87c4815e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T07:33:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T07:33:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-df7r7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7abb316d62bc2246ace40ad59b2926962a1c098c5690adea8c20a5b96459a604\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"
containerID\\\":\\\"cri-o://7abb316d62bc2246ace40ad59b2926962a1c098c5690adea8c20a5b96459a604\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T07:33:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T07:33:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-df7r7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://96c45841896c0c1569ad3071a3114e8c4965cc3968244c91719bcf5df92e5d0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://96c45841896c0c1569ad3071a3114e8c4965cc3968244c91719bcf5df92e5d0f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T07:33:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T07:33:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveRead
Only\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-df7r7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://db2220320618b4dcdbf56bad96dbe01c910fa37948f97792f498b3336212cc7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://db2220320618b4dcdbf56bad96dbe01c910fa37948f97792f498b3336212cc7f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T07:34:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T07:34:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-df7r7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5980d5ebcac5351bfa8c26b90968593f3fae3d434e824db5a182449ef64b0a13\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\"
:0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5980d5ebcac5351bfa8c26b90968593f3fae3d434e824db5a182449ef64b0a13\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T07:34:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T07:34:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-df7r7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5912656a198607c9831da9564b5507d662eb2df00df104d52cf058442a76be6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5912656a198607c9831da9564b5507d662eb2df00df104d52cf058442a76be6f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T07:34:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T07:34:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-df7r7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\"
,\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T07:33:56Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-6vfjl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:34:05Z is after 2025-08-24T17:21:41Z" Sep 30 07:34:05 crc kubenswrapper[4760]: I0930 07:34:05.144613 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9092fdd9-f142-400d-b446-47b8103b9694\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4cc3328dfae8f6d99c152e6a53b86d72f690afbd7bc4847810f745b321e4f3ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\"
:{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5aaa898c76100f011c3ad0ba61c9735a940b46b0e770f3ede7fc4c881a4722a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b718c4a73fd119f66dd346618976f0b3589b789576e53d367588c2772becd66e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:40Z
\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d00b9a56c23ba3b5701a31412e038c073b5f16eb4e6ebea42134f7d4c0b0f21\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://00d03c8ff6fa9f490ec0550ac57d20786f1f5c10d91011d678116c2249a79e88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.1
26.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ed4236004989a888f1f864535bedd2dc4bcad4c51c87f12be074abbe5f8fe8bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ed4236004989a888f1f864535bedd2dc4bcad4c51c87f12be074abbe5f8fe8bd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T07:33:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T07:33:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://553fa289f27580c1c38da1e9ea12ae7231fc4f60c660a9a3a177391adba9c3f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://553fa289f27580c1c38da1e9ea12ae7231fc4f60c660a9a3a177391adba9c3f2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T07:33:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T07:33:37Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://83b9ee5e77d60387298ccac46ad416a5d56bfd1d81a6ef7f3ea0482520c2ca08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift
-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://83b9ee5e77d60387298ccac46ad416a5d56bfd1d81a6ef7f3ea0482520c2ca08\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T07:33:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T07:33:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T07:33:35Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:34:05Z is after 2025-08-24T17:21:41Z" Sep 30 07:34:05 crc kubenswrapper[4760]: I0930 07:34:05.180160 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"031672b4-4779-4226-97aa-f5c81a729234\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b3e26d92b5602604d33e83e83e677084c642df64c9d0b9f9e022b036091fb92e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://53390cb3cc86d0db1019c461bf6cb987190e6fa8812f735844028dfa1ce96648\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf87f93832356c96c1c5e90e6459ccdd23700fd12c6d7410305c66f99f6ed18f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e4230b2df5bf87138f5a40f146bef562ae63ec6f5b73d023e4e2b60722854882\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:3
8Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9c416bd4523f4384b4a2283eb2ada263982380925cbc17713726e0790e5fdee\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:38Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2cd9c18868b27fcfb67186a86f415b484f009c33c07d18f5d508155ed011b5d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2cd9c18868b27fcfb67186a86f415b484f009c33c07d18f5d508155ed011b5d4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T07:33:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T07:33:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startT
ime\\\":\\\"2025-09-30T07:33:35Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:34:05Z is after 2025-08-24T17:21:41Z" Sep 30 07:34:05 crc kubenswrapper[4760]: I0930 07:34:05.199863 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"67985d3d-0af6-4a8b-b45a-623aed2e502e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43b97af91f474a3b43f207d377ca205e459b782e552a32a134a8b46e89fc8d30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\
\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d748511816054d5f621bd55e203baea2e6e5ac90dec796364b46224308cfd78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bf1b2ce86bf47ce917a20b48f64b7c4419e3a5c551d006d97171311c3b190f0c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"c
ert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e7c9b2cd9f096d99a8b239c0c2dc771e94e6522a3380733b7452969c9fe2dfd2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T07:33:35Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:34:05Z is after 2025-08-24T17:21:41Z" Sep 30 07:34:05 crc kubenswrapper[4760]: I0930 07:34:05.212537 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:34:05 crc kubenswrapper[4760]: I0930 07:34:05.212845 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:34:05 crc kubenswrapper[4760]: I0930 07:34:05.212916 4760 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:34:05 crc kubenswrapper[4760]: I0930 07:34:05.212986 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:34:05 crc kubenswrapper[4760]: I0930 07:34:05.213055 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:34:05Z","lastTransitionTime":"2025-09-30T07:34:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 07:34:05 crc kubenswrapper[4760]: I0930 07:34:05.214782 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c5ac421ed3707970914cf165f54a9324134079296d1b6936332c6047b9c23132\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d
608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:34:05Z is after 2025-08-24T17:21:41Z" Sep 30 07:34:05 crc kubenswrapper[4760]: I0930 07:34:05.252221 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:53Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:34:05Z is after 2025-08-24T17:21:41Z" Sep 30 07:34:05 crc kubenswrapper[4760]: I0930 07:34:05.292624 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:53Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:34:05Z is after 2025-08-24T17:21:41Z" Sep 30 07:34:05 crc kubenswrapper[4760]: I0930 07:34:05.315811 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:34:05 crc kubenswrapper[4760]: I0930 07:34:05.315852 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:34:05 crc kubenswrapper[4760]: I0930 07:34:05.315865 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:34:05 crc kubenswrapper[4760]: I0930 07:34:05.315887 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:34:05 crc kubenswrapper[4760]: I0930 07:34:05.315901 4760 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:34:05Z","lastTransitionTime":"2025-09-30T07:34:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 07:34:05 crc kubenswrapper[4760]: I0930 07:34:05.336461 4760 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Sep 30 07:34:05 crc kubenswrapper[4760]: I0930 07:34:05.336636 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-sspvl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2c4ca8ea-a714-40e5-9e10-080aef32237b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:56Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:56Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04cb9ff2859a579a0bd97d932177b8c19b786666d0177669e93ccec38abf08d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lllfc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9e64e8566880e723a6cc19ad788bd1cbb02eb639397aea4acd6e9920b034f379\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lllfc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0b6c1fe1d21cc0d118e36d7f9d62fb639a19a42cd8c953700bcd5c289088cdf9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lllfc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bba81e98fd1a75081b7e6a3eaf28b88b4de6e47e1dc158ca72aa2f1c2107933e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:58Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lllfc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://45d8c685d7b690ce81553e4d924934cf761087dd535145c0e8b5a933f2d61e45\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lllfc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef1f6b6ca1a2bbb2f9a166a7244c6411470be4655bc6b2a330fd604c3449d185\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lllfc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://15fa2b49d0420903a20e43cba97d14d07b922b6347396ae58206b8a4ed21c995\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:34:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lllfc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e72f37942962598d925dfb305311c94c3ca920f93243bb6e97839add4a830a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:34:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lllfc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1198d13e2814e4dc38c4411d751fba40440a5967d7ca04c855e34fe90f9a6535\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1198d13e2814e4dc38c4411d751fba40440a5967d7ca04c855e34fe90f9a6535\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T07:33:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T07:33:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lllfc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T07:33:56Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-sspvl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:34:05Z is after 2025-08-24T17:21:41Z" Sep 30 07:34:05 crc kubenswrapper[4760]: I0930 07:34:05.374656 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c5ae239e08e07bda9532c0b6b94ef3687cf512cd05965e09123bbb036f915a5b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:5
5Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3abb25619d0c80e84a2095aba45fe6a865125d242e44e9f10820e8c41a3b0db9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:34:05Z is after 2025-08-24T17:21:41Z" Sep 30 07:34:05 crc kubenswrapper[4760]: I0930 07:34:05.414210 4760 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:34:05Z is after 2025-08-24T17:21:41Z" Sep 30 07:34:05 crc kubenswrapper[4760]: I0930 07:34:05.417819 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:34:05 crc kubenswrapper[4760]: I0930 07:34:05.417885 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:34:05 crc kubenswrapper[4760]: I0930 07:34:05.417901 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:34:05 crc kubenswrapper[4760]: I0930 07:34:05.417925 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:34:05 crc kubenswrapper[4760]: I0930 07:34:05.417946 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:34:05Z","lastTransitionTime":"2025-09-30T07:34:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 07:34:05 crc kubenswrapper[4760]: I0930 07:34:05.453995 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-sv6wk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4a8035f8-210d-4a09-bca5-274ced93774c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://774d38a9cedeb69a2093a9de05db794e856295535f86d73a75c6feea05b61cb3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var
/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6pk8k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T07:33:56Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-sv6wk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:34:05Z is after 2025-08-24T17:21:41Z" Sep 30 07:34:05 crc kubenswrapper[4760]: I0930 07:34:05.493180 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:59Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:59Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fd70ee9ea129c67bb1a9ffac6e1d2abc903949ccd6d2d549f5d78472813dd222\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3551
2335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:34:05Z is after 2025-08-24T17:21:41Z" Sep 30 07:34:05 crc kubenswrapper[4760]: I0930 07:34:05.520440 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:34:05 crc kubenswrapper[4760]: I0930 07:34:05.520485 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:34:05 crc kubenswrapper[4760]: I0930 07:34:05.520497 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:34:05 crc kubenswrapper[4760]: I0930 07:34:05.520520 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:34:05 crc kubenswrapper[4760]: I0930 07:34:05.520532 4760 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:34:05Z","lastTransitionTime":"2025-09-30T07:34:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 07:34:05 crc kubenswrapper[4760]: I0930 07:34:05.534245 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-f2lrk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7a9c8270-6964-4886-87d0-227b05b76da4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e3cf6cd0daf5df3e48c5d44b1a443b1e5646009b9eba42bc3869ee9841e472d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-
rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cctgt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://90d669a96085f6fb9b5c9507ff1d5c960bdeafa1d01d7c115b89788e14e155d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cctgt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T07:33:56Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-f2lrk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current 
time 2025-09-30T07:34:05Z is after 2025-08-24T17:21:41Z" Sep 30 07:34:05 crc kubenswrapper[4760]: I0930 07:34:05.575322 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-lvdpk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f50c364e-d22c-4fe5-a0aa-66f4e8d8b21e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://96fd613333f4911d54aca94a38724570ef878c3f55ef48caa79eb1c14d0b2014\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/h
ost/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8g8kp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T07:33:56Z\\\"}}\" for pod \"openshift-multus\"/\"multus-lvdpk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 
2025-09-30T07:34:05Z is after 2025-08-24T17:21:41Z" Sep 30 07:34:05 crc kubenswrapper[4760]: I0930 07:34:05.612370 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-rpvhp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a8530550-438d-46a5-aa3f-4b10838396f1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:34:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:34:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:34:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0434ec52328c8ee0a2b28fb47833cc6cf12a7048f4c4a217518e029e616fb6a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/sec
rets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7j7qm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T07:33:58Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-rpvhp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:34:05Z is after 2025-08-24T17:21:41Z" Sep 30 07:34:05 crc kubenswrapper[4760]: I0930 07:34:05.623677 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:34:05 crc kubenswrapper[4760]: I0930 07:34:05.623717 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:34:05 crc kubenswrapper[4760]: I0930 07:34:05.623729 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:34:05 crc kubenswrapper[4760]: I0930 07:34:05.623750 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:34:05 crc kubenswrapper[4760]: I0930 07:34:05.623766 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:34:05Z","lastTransitionTime":"2025-09-30T07:34:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 07:34:05 crc kubenswrapper[4760]: I0930 07:34:05.726727 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:34:05 crc kubenswrapper[4760]: I0930 07:34:05.726781 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:34:05 crc kubenswrapper[4760]: I0930 07:34:05.726793 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:34:05 crc kubenswrapper[4760]: I0930 07:34:05.726818 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:34:05 crc kubenswrapper[4760]: I0930 07:34:05.726833 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:34:05Z","lastTransitionTime":"2025-09-30T07:34:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 07:34:05 crc kubenswrapper[4760]: I0930 07:34:05.830723 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:34:05 crc kubenswrapper[4760]: I0930 07:34:05.830792 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:34:05 crc kubenswrapper[4760]: I0930 07:34:05.830830 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:34:05 crc kubenswrapper[4760]: I0930 07:34:05.830860 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:34:05 crc kubenswrapper[4760]: I0930 07:34:05.830890 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:34:05Z","lastTransitionTime":"2025-09-30T07:34:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 07:34:05 crc kubenswrapper[4760]: I0930 07:34:05.934668 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:34:05 crc kubenswrapper[4760]: I0930 07:34:05.934727 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:34:05 crc kubenswrapper[4760]: I0930 07:34:05.934747 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:34:05 crc kubenswrapper[4760]: I0930 07:34:05.934778 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:34:05 crc kubenswrapper[4760]: I0930 07:34:05.934799 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:34:05Z","lastTransitionTime":"2025-09-30T07:34:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 07:34:06 crc kubenswrapper[4760]: I0930 07:34:06.037948 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:34:06 crc kubenswrapper[4760]: I0930 07:34:06.038006 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:34:06 crc kubenswrapper[4760]: I0930 07:34:06.038027 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:34:06 crc kubenswrapper[4760]: I0930 07:34:06.038055 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:34:06 crc kubenswrapper[4760]: I0930 07:34:06.038077 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:34:06Z","lastTransitionTime":"2025-09-30T07:34:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 07:34:06 crc kubenswrapper[4760]: I0930 07:34:06.066959 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 07:34:06 crc kubenswrapper[4760]: I0930 07:34:06.066998 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 07:34:06 crc kubenswrapper[4760]: I0930 07:34:06.067112 4760 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 07:34:06 crc kubenswrapper[4760]: E0930 07:34:06.067205 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 07:34:06 crc kubenswrapper[4760]: E0930 07:34:06.067358 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 07:34:06 crc kubenswrapper[4760]: E0930 07:34:06.067515 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 07:34:06 crc kubenswrapper[4760]: I0930 07:34:06.141656 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:34:06 crc kubenswrapper[4760]: I0930 07:34:06.141723 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:34:06 crc kubenswrapper[4760]: I0930 07:34:06.141747 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:34:06 crc kubenswrapper[4760]: I0930 07:34:06.141782 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:34:06 crc kubenswrapper[4760]: I0930 07:34:06.141807 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:34:06Z","lastTransitionTime":"2025-09-30T07:34:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 07:34:06 crc kubenswrapper[4760]: I0930 07:34:06.245342 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:34:06 crc kubenswrapper[4760]: I0930 07:34:06.245413 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:34:06 crc kubenswrapper[4760]: I0930 07:34:06.245433 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:34:06 crc kubenswrapper[4760]: I0930 07:34:06.245465 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:34:06 crc kubenswrapper[4760]: I0930 07:34:06.245484 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:34:06Z","lastTransitionTime":"2025-09-30T07:34:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 07:34:06 crc kubenswrapper[4760]: I0930 07:34:06.343888 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-sspvl_2c4ca8ea-a714-40e5-9e10-080aef32237b/ovnkube-controller/0.log" Sep 30 07:34:06 crc kubenswrapper[4760]: I0930 07:34:06.347545 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:34:06 crc kubenswrapper[4760]: I0930 07:34:06.347606 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:34:06 crc kubenswrapper[4760]: I0930 07:34:06.347632 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:34:06 crc kubenswrapper[4760]: I0930 07:34:06.347664 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:34:06 crc kubenswrapper[4760]: I0930 07:34:06.347685 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:34:06Z","lastTransitionTime":"2025-09-30T07:34:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 07:34:06 crc kubenswrapper[4760]: I0930 07:34:06.348621 4760 generic.go:334] "Generic (PLEG): container finished" podID="2c4ca8ea-a714-40e5-9e10-080aef32237b" containerID="15fa2b49d0420903a20e43cba97d14d07b922b6347396ae58206b8a4ed21c995" exitCode=1 Sep 30 07:34:06 crc kubenswrapper[4760]: I0930 07:34:06.348677 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-sspvl" event={"ID":"2c4ca8ea-a714-40e5-9e10-080aef32237b","Type":"ContainerDied","Data":"15fa2b49d0420903a20e43cba97d14d07b922b6347396ae58206b8a4ed21c995"} Sep 30 07:34:06 crc kubenswrapper[4760]: I0930 07:34:06.350445 4760 scope.go:117] "RemoveContainer" containerID="15fa2b49d0420903a20e43cba97d14d07b922b6347396ae58206b8a4ed21c995" Sep 30 07:34:06 crc kubenswrapper[4760]: I0930 07:34:06.365878 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-f2lrk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7a9c8270-6964-4886-87d0-227b05b76da4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e3cf6cd0daf5df3e48c5d44b1a443b1e5646009b9eba42bc3869ee9841e472d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cctgt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://90d669a96085f6fb9b5c9507ff1d5c960bdeafa1
d01d7c115b89788e14e155d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cctgt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T07:33:56Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-f2lrk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:34:06Z is after 2025-08-24T17:21:41Z" Sep 30 07:34:06 crc kubenswrapper[4760]: I0930 07:34:06.392381 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-lvdpk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f50c364e-d22c-4fe5-a0aa-66f4e8d8b21e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://96fd613333f4911d54aca94a38724570ef878c3f55ef48caa79eb1c14d0b2014\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8g8kp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T07:33:56Z\\\"}}\" for pod \"openshift-multus\"/\"multus-lvdpk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:34:06Z is after 2025-08-24T17:21:41Z" Sep 30 07:34:06 crc kubenswrapper[4760]: I0930 07:34:06.407693 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-rpvhp" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a8530550-438d-46a5-aa3f-4b10838396f1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:34:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:34:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:34:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0434ec52328c8ee0a2b28fb47833cc6cf12a7048f4c4a217518e029e616fb6a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7j7qm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\
\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T07:33:58Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-rpvhp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:34:06Z is after 2025-08-24T17:21:41Z" Sep 30 07:34:06 crc kubenswrapper[4760]: I0930 07:34:06.430251 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:59Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:59Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fd70ee9ea129c67bb1a9ffac6e1d2abc903949ccd6d2d549f5d78472813dd222\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-
09-30T07:33:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:34:06Z is after 2025-08-24T17:21:41Z" Sep 30 07:34:06 crc kubenswrapper[4760]: I0930 07:34:06.451485 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:34:06 crc kubenswrapper[4760]: I0930 07:34:06.451556 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:34:06 crc kubenswrapper[4760]: I0930 07:34:06.451581 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:34:06 crc kubenswrapper[4760]: I0930 07:34:06.451617 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:34:06 crc kubenswrapper[4760]: I0930 07:34:06.451645 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:34:06Z","lastTransitionTime":"2025-09-30T07:34:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 07:34:06 crc kubenswrapper[4760]: I0930 07:34:06.458173 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"031672b4-4779-4226-97aa-f5c81a729234\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b3e26d92b5602604d33e83e83e677084c642df64c9d0b9f9e022b036091fb92e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-d
ir\\\"}]},{\\\"containerID\\\":\\\"cri-o://53390cb3cc86d0db1019c461bf6cb987190e6fa8812f735844028dfa1ce96648\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf87f93832356c96c1c5e90e6459ccdd23700fd12c6d7410305c66f99f6ed18f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e4230b2df5bf87138f5a40f146bef562ae63ec6f5b73d023e4e2b60722854882\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945
c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9c416bd4523f4384b4a2283eb2ada263982380925cbc17713726e0790e5fdee\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:38Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2cd9c18868b27fcfb67186a86f415b484f009c33c07d18f5d508155ed011b5d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2cd9c18868b27fcfb67186a86f415b484f009c33c07d18f5d508155ed011b5d4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T07:33:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T07:33:36Z\\\"}
},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T07:33:35Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:34:06Z is after 2025-08-24T17:21:41Z" Sep 30 07:34:06 crc kubenswrapper[4760]: I0930 07:34:06.479191 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"67985d3d-0af6-4a8b-b45a-623aed2e502e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43b97af91f474a3b43f207d377ca205e459b782e552a32a134a8b46e89fc8d30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85ae
d21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d748511816054d5f621bd55e203baea2e6e5ac90dec796364b46224308cfd78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bf1b2ce86bf47ce917a20b48f64b7c4419e3a5c551d006d97171311c3b190f0c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"star
tedAt\\\":\\\"2025-09-30T07:33:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e7c9b2cd9f096d99a8b239c0c2dc771e94e6522a3380733b7452969c9fe2dfd2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T07:33:35Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:34:06Z is after 2025-08-24T17:21:41Z" Sep 30 07:34:06 crc kubenswrapper[4760]: I0930 07:34:06.501882 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to 
patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c5ac421ed3707970914cf165f54a9324134079296d1b6936332c6047b9c23132\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify 
certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:34:06Z is after 2025-08-24T17:21:41Z" Sep 30 07:34:06 crc kubenswrapper[4760]: I0930 07:34:06.524753 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:53Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:34:06Z is after 2025-08-24T17:21:41Z" Sep 30 07:34:06 crc kubenswrapper[4760]: I0930 07:34:06.548983 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:34:06Z is after 2025-08-24T17:21:41Z" Sep 30 07:34:06 crc kubenswrapper[4760]: I0930 07:34:06.554946 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:34:06 crc kubenswrapper[4760]: I0930 07:34:06.555004 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:34:06 crc kubenswrapper[4760]: I0930 07:34:06.555023 4760 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:34:06 crc kubenswrapper[4760]: I0930 07:34:06.555050 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:34:06 crc kubenswrapper[4760]: I0930 07:34:06.555072 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:34:06Z","lastTransitionTime":"2025-09-30T07:34:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 07:34:06 crc kubenswrapper[4760]: I0930 07:34:06.579067 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-6vfjl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"aade7c8e-aa34-4b19-9000-d724950a70d7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:34:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:34:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:34:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8c7192e617554ba6b5b9
533792f166344fbf9681b63e57c180809e721cc18168\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:34:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-df7r7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e9c7c74e636ef9dbade2541863f96b1098f7ae5f1ab02740c8934f8c87c4815e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e9c7c74e636ef9dbade2541863f96b1098f7ae5f1ab02740c8934f8c87c4815e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T07:33:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T07:33:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled
\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-df7r7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7abb316d62bc2246ace40ad59b2926962a1c098c5690adea8c20a5b96459a604\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7abb316d62bc2246ace40ad59b2926962a1c098c5690adea8c20a5b96459a604\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T07:33:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T07:33:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-df7r7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://96c45841896c0c1569ad3071a3114e8c4965cc3968244c91719bcf5df92e5d0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64
d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://96c45841896c0c1569ad3071a3114e8c4965cc3968244c91719bcf5df92e5d0f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T07:33:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T07:33:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-df7r7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://db2220320618b4dcdbf56bad96dbe01c910fa37948f97792f498b3336212cc7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://db2220320618b4dcdbf56bad96dbe01c910fa37948f97792f498b3336212cc7f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T07:34:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T07:34:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-df7r7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5980d5ebcac5351bfa8c26b90968593f3fae3d434e824db5a182449ef64b0a13\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5980d5ebcac5351bfa8c26b90968593f3fae3d434e824db5a182449ef64b0a13\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T07:34:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T07:34:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-df7r7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5912656a198607c9831da9564b5507d662eb2df00df104d52cf058442a76be6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\
\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5912656a198607c9831da9564b5507d662eb2df00df104d52cf058442a76be6f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T07:34:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T07:34:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-df7r7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T07:33:56Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-6vfjl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:34:06Z is after 2025-08-24T17:21:41Z" Sep 30 07:34:06 crc kubenswrapper[4760]: I0930 07:34:06.617678 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9092fdd9-f142-400d-b446-47b8103b9694\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4cc3328dfae8f6d99c152e6a53b86d72f690afbd7bc4847810f745b321e4f3ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5aaa898c76100f011c3ad0ba61c9735a940b46b0e770f3ede7fc4c881a4722a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b718c4a73fd119f66dd346618976f0b3589b789576e53d367588c2772becd66e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d00b9a56c23ba3b5701a31412e038c073b5f16eb4e6ebea42134f7d4c0b0f21\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://00d03c8ff6fa9f490ec0550ac57d20786f1f5c10d91011d678116c2249a79e88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ed4236004989a888f1f864535bedd2dc4bcad4c51c87f12be074abbe5f8fe8bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ed4236004989a888f1f864535bedd2dc4bcad4c51c87f12be074abbe5f8fe8bd\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-09-30T07:33:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T07:33:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://553fa289f27580c1c38da1e9ea12ae7231fc4f60c660a9a3a177391adba9c3f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://553fa289f27580c1c38da1e9ea12ae7231fc4f60c660a9a3a177391adba9c3f2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T07:33:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T07:33:37Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://83b9ee5e77d60387298ccac46ad416a5d56bfd1d81a6ef7f3ea0482520c2ca08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://83b9ee5e77d60387298ccac46ad416a5d56bfd1d81a6ef7f3ea0482520c2ca08\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T07:33:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T07:33:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T07:33:35Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:34:06Z is after 2025-08-24T17:21:41Z" Sep 30 07:34:06 crc kubenswrapper[4760]: I0930 07:34:06.655440 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-sspvl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2c4ca8ea-a714-40e5-9e10-080aef32237b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:56Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:56Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04cb9ff2859a579a0bd97d932177b8c19b786666d0177669e93ccec38abf08d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lllfc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9e64e8566880e723a6cc19ad788bd1cbb02eb639397aea4acd6e9920b034f379\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lllfc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0b6c1fe1d21cc0d118e36d7f9d62fb639a19a42cd8c953700bcd5c289088cdf9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lllfc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bba81e98fd1a75081b7e6a3eaf28b88b4de6e47e1dc158ca72aa2f1c2107933e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:58Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lllfc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://45d8c685d7b690ce81553e4d924934cf761087dd535145c0e8b5a933f2d61e45\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lllfc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef1f6b6ca1a2bbb2f9a166a7244c6411470be4655bc6b2a330fd604c3449d185\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lllfc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://15fa2b49d0420903a20e43cba97d14d07b922b6347396ae58206b8a4ed21c995\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://15fa2b49d0420903a20e43cba97d14d07b922b6347396ae58206b8a4ed21c995\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-09-30T07:34:05Z\\\",\\\"message\\\":\\\"/pkg/client/informers/externalversions/factory.go:141\\\\nI0930 07:34:05.894520 6019 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0930 07:34:05.894613 6019 handler.go:190] Sending 
*v1.NetworkPolicy event handler 4 for removal\\\\nI0930 07:34:05.894652 6019 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0930 07:34:05.894682 6019 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0930 07:34:05.894695 6019 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0930 07:34:05.894729 6019 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0930 07:34:05.894747 6019 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0930 07:34:05.894747 6019 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0930 07:34:05.894745 6019 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0930 07:34:05.894787 6019 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0930 07:34:05.894795 6019 factory.go:656] Stopping watch factory\\\\nI0930 07:34:05.894819 6019 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0930 07:34:05.894819 6019 handler.go:208] Removed *v1.Node event handler 7\\\\nI0930 07:34:05.894837 6019 handler.go:208] Removed *v1.Node event handler 2\\\\nI0930 07:34:05.894846 6019 ovnkube.go:599] Stopped ovnkube\\\\nI0930 
07:34:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T07:34:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\
\\"kube-api-access-lllfc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e72f37942962598d925dfb305311c94c3ca920f93243bb6e97839add4a830a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:34:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lllfc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1198d13e2814e4dc38c4411d751fba40440a5967d7ca04c855e34fe90f9a6535\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1198d13e2814e4dc38c4411d751fba40440a5967d7ca04c855e34fe90f9a6
535\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T07:33:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T07:33:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lllfc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T07:33:56Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-sspvl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:34:06Z is after 2025-08-24T17:21:41Z" Sep 30 07:34:06 crc kubenswrapper[4760]: I0930 07:34:06.657898 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:34:06 crc kubenswrapper[4760]: I0930 07:34:06.657937 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:34:06 crc kubenswrapper[4760]: I0930 07:34:06.657949 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:34:06 crc kubenswrapper[4760]: I0930 07:34:06.657970 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:34:06 crc kubenswrapper[4760]: I0930 07:34:06.657980 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:34:06Z","lastTransitionTime":"2025-09-30T07:34:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 07:34:06 crc kubenswrapper[4760]: I0930 07:34:06.675482 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c5ae239e08e07bda9532c0b6b94ef3687cf512cd05965e09123bbb036f915a5b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recu
rsiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3abb25619d0c80e84a2095aba45fe6a865125d242e44e9f10820e8c41a3b0db9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:34:06Z is after 2025-08-24T17:21:41Z" Sep 30 07:34:06 crc kubenswrapper[4760]: I0930 07:34:06.699116 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:34:06Z is after 2025-08-24T17:21:41Z" Sep 30 07:34:06 crc kubenswrapper[4760]: I0930 07:34:06.721747 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-sv6wk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4a8035f8-210d-4a09-bca5-274ced93774c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://774d38a9cedeb69a2093a9de05db794e856295535f86d73a75c6feea05b61cb3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6pk8k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T07:33:56Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-sv6wk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:34:06Z is after 2025-08-24T17:21:41Z" Sep 30 07:34:06 crc kubenswrapper[4760]: I0930 07:34:06.761324 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:34:06 crc kubenswrapper[4760]: I0930 07:34:06.761392 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:34:06 crc kubenswrapper[4760]: I0930 07:34:06.761412 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:34:06 crc kubenswrapper[4760]: I0930 07:34:06.761444 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:34:06 crc kubenswrapper[4760]: I0930 07:34:06.761466 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:34:06Z","lastTransitionTime":"2025-09-30T07:34:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 07:34:06 crc kubenswrapper[4760]: I0930 07:34:06.864238 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:34:06 crc kubenswrapper[4760]: I0930 07:34:06.864280 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:34:06 crc kubenswrapper[4760]: I0930 07:34:06.864289 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:34:06 crc kubenswrapper[4760]: I0930 07:34:06.864319 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:34:06 crc kubenswrapper[4760]: I0930 07:34:06.864331 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:34:06Z","lastTransitionTime":"2025-09-30T07:34:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 07:34:06 crc kubenswrapper[4760]: I0930 07:34:06.967057 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:34:06 crc kubenswrapper[4760]: I0930 07:34:06.967116 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:34:06 crc kubenswrapper[4760]: I0930 07:34:06.967138 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:34:06 crc kubenswrapper[4760]: I0930 07:34:06.967163 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:34:06 crc kubenswrapper[4760]: I0930 07:34:06.967178 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:34:06Z","lastTransitionTime":"2025-09-30T07:34:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 07:34:07 crc kubenswrapper[4760]: I0930 07:34:07.073191 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:34:07 crc kubenswrapper[4760]: I0930 07:34:07.073243 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:34:07 crc kubenswrapper[4760]: I0930 07:34:07.073254 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:34:07 crc kubenswrapper[4760]: I0930 07:34:07.073276 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:34:07 crc kubenswrapper[4760]: I0930 07:34:07.073290 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:34:07Z","lastTransitionTime":"2025-09-30T07:34:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 07:34:07 crc kubenswrapper[4760]: I0930 07:34:07.176637 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:34:07 crc kubenswrapper[4760]: I0930 07:34:07.176684 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:34:07 crc kubenswrapper[4760]: I0930 07:34:07.176696 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:34:07 crc kubenswrapper[4760]: I0930 07:34:07.176717 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:34:07 crc kubenswrapper[4760]: I0930 07:34:07.176731 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:34:07Z","lastTransitionTime":"2025-09-30T07:34:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 07:34:07 crc kubenswrapper[4760]: I0930 07:34:07.279518 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:34:07 crc kubenswrapper[4760]: I0930 07:34:07.279558 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:34:07 crc kubenswrapper[4760]: I0930 07:34:07.279570 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:34:07 crc kubenswrapper[4760]: I0930 07:34:07.279590 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:34:07 crc kubenswrapper[4760]: I0930 07:34:07.279604 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:34:07Z","lastTransitionTime":"2025-09-30T07:34:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 07:34:07 crc kubenswrapper[4760]: I0930 07:34:07.354847 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-sspvl_2c4ca8ea-a714-40e5-9e10-080aef32237b/ovnkube-controller/1.log" Sep 30 07:34:07 crc kubenswrapper[4760]: I0930 07:34:07.355737 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-sspvl_2c4ca8ea-a714-40e5-9e10-080aef32237b/ovnkube-controller/0.log" Sep 30 07:34:07 crc kubenswrapper[4760]: I0930 07:34:07.359007 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-sspvl" event={"ID":"2c4ca8ea-a714-40e5-9e10-080aef32237b","Type":"ContainerStarted","Data":"e2eb770a63f2bb8de1ec68ee527d04f64bb32a66016581c3acfc665d9bb60784"} Sep 30 07:34:07 crc kubenswrapper[4760]: I0930 07:34:07.359862 4760 scope.go:117] "RemoveContainer" containerID="e2eb770a63f2bb8de1ec68ee527d04f64bb32a66016581c3acfc665d9bb60784" Sep 30 07:34:07 crc kubenswrapper[4760]: E0930 07:34:07.360060 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-sspvl_openshift-ovn-kubernetes(2c4ca8ea-a714-40e5-9e10-080aef32237b)\"" pod="openshift-ovn-kubernetes/ovnkube-node-sspvl" podUID="2c4ca8ea-a714-40e5-9e10-080aef32237b" Sep 30 07:34:07 crc kubenswrapper[4760]: I0930 07:34:07.375014 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:59Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:59Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fd70ee9ea129c67bb1a9ffac6e1d2abc903949ccd6d2d549f5d78472813dd222\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-09-30T07:34:07Z is after 2025-08-24T17:21:41Z" Sep 30 07:34:07 crc kubenswrapper[4760]: I0930 07:34:07.382291 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:34:07 crc kubenswrapper[4760]: I0930 07:34:07.382340 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:34:07 crc kubenswrapper[4760]: I0930 07:34:07.382351 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:34:07 crc kubenswrapper[4760]: I0930 07:34:07.382370 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:34:07 crc kubenswrapper[4760]: I0930 07:34:07.382381 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:34:07Z","lastTransitionTime":"2025-09-30T07:34:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 07:34:07 crc kubenswrapper[4760]: I0930 07:34:07.392702 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-f2lrk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7a9c8270-6964-4886-87d0-227b05b76da4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e3cf6cd0daf5df3e48c5d44b1a443b1e5646009b9eba42bc3869ee9841e472d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"}
,{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cctgt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://90d669a96085f6fb9b5c9507ff1d5c960bdeafa1d01d7c115b89788e14e155d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cctgt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T07:33:56Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-f2lrk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:34:07Z is after 2025-08-24T17:21:41Z" Sep 30 07:34:07 crc kubenswrapper[4760]: I0930 07:34:07.411725 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-lvdpk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f50c364e-d22c-4fe5-a0aa-66f4e8d8b21e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://96fd613333f4911d54aca94a38724570ef878c3f55ef48caa79eb1c14d0b2014\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8g8kp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T07:33:56Z\\\"}}\" for pod \"openshift-multus\"/\"multus-lvdpk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:34:07Z is after 2025-08-24T17:21:41Z" Sep 30 07:34:07 crc kubenswrapper[4760]: I0930 07:34:07.428210 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-rpvhp" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a8530550-438d-46a5-aa3f-4b10838396f1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:34:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:34:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:34:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0434ec52328c8ee0a2b28fb47833cc6cf12a7048f4c4a217518e029e616fb6a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7j7qm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\
\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T07:33:58Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-rpvhp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:34:07Z is after 2025-08-24T17:21:41Z" Sep 30 07:34:07 crc kubenswrapper[4760]: I0930 07:34:07.448834 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-6vfjl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"aade7c8e-aa34-4b19-9000-d724950a70d7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:34:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:34:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:34:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8c7192e617554ba6b5b9533792f166344fbf9681b63e57c180809e721cc18168\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-
release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:34:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-df7r7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e9c7c74e636ef9dbade2541863f96b1098f7ae5f1ab02740c8934f8c87c4815e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e9c7c74e636ef9dbade2541863f96b1098f7ae5f1ab02740c8934f8c87c4815e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T07:33:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T07:33:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-df7r7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7abb3
16d62bc2246ace40ad59b2926962a1c098c5690adea8c20a5b96459a604\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7abb316d62bc2246ace40ad59b2926962a1c098c5690adea8c20a5b96459a604\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T07:33:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T07:33:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-df7r7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://96c45841896c0c1569ad3071a3114e8c4965cc3968244c91719bcf5df92e5d0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://96c45
841896c0c1569ad3071a3114e8c4965cc3968244c91719bcf5df92e5d0f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T07:33:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T07:33:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-df7r7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://db2220320618b4dcdbf56bad96dbe01c910fa37948f97792f498b3336212cc7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://db2220320618b4dcdbf56bad96dbe01c910fa37948f97792f498b3336212cc7f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T07:34:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T07:34:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-df7r7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5980d5ebcac5351bfa8c26b90968593f3fae3d434e824db5a182449ef64b0a13\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5980d5ebcac5351bfa8c26b90968593f3fae3d434e824db5a182449ef64b0a13\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T07:34:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T07:34:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-df7r7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5912656a198607c9831da9564b5507d662eb2df00df104d52cf058442a76be6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5912656a198607c9831da9564b5507d662eb2df00df104d52cf058442a76be6f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":
\\\"2025-09-30T07:34:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T07:34:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-df7r7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T07:33:56Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-6vfjl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:34:07Z is after 2025-08-24T17:21:41Z" Sep 30 07:34:07 crc kubenswrapper[4760]: I0930 07:34:07.476411 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9092fdd9-f142-400d-b446-47b8103b9694\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4cc3328dfae8f6d99c152e6a53b86d72f690afbd7bc4847810f745b321e4f3ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5aaa898c76100f011c3ad0ba61c9735a940b46b0e770f3ede7fc4c881a4722a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b718c4a73fd119f66dd346618976f0b3589b789576e53d367588c2772becd66e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d00b9a56c23ba3b5701a31412e038c073b5f16eb4e6ebea42134f7d4c0b0f21\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://00d03c8ff6fa9f490ec0550ac57d20786f1f5c10d91011d678116c2249a79e88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ed4236004989a888f1f864535bedd2dc4bcad4c51c87f12be074abbe5f8fe8bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ed4236004989a888f1f864535bedd2dc4bcad4c51c87f12be074abbe5f8fe8bd\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-09-30T07:33:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T07:33:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://553fa289f27580c1c38da1e9ea12ae7231fc4f60c660a9a3a177391adba9c3f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://553fa289f27580c1c38da1e9ea12ae7231fc4f60c660a9a3a177391adba9c3f2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T07:33:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T07:33:37Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://83b9ee5e77d60387298ccac46ad416a5d56bfd1d81a6ef7f3ea0482520c2ca08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://83b9ee5e77d60387298ccac46ad416a5d56bfd1d81a6ef7f3ea0482520c2ca08\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T07:33:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T07:33:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T07:33:35Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:34:07Z is after 2025-08-24T17:21:41Z" Sep 30 07:34:07 crc kubenswrapper[4760]: I0930 07:34:07.485418 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:34:07 crc kubenswrapper[4760]: I0930 07:34:07.485471 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:34:07 crc kubenswrapper[4760]: I0930 07:34:07.485486 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:34:07 crc kubenswrapper[4760]: I0930 07:34:07.485507 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:34:07 crc kubenswrapper[4760]: I0930 07:34:07.485522 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:34:07Z","lastTransitionTime":"2025-09-30T07:34:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 07:34:07 crc kubenswrapper[4760]: I0930 07:34:07.494986 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"031672b4-4779-4226-97aa-f5c81a729234\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b3e26d92b5602604d33e83e83e677084c642df64c9d0b9f9e022b036091fb92e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-d
ir\\\"}]},{\\\"containerID\\\":\\\"cri-o://53390cb3cc86d0db1019c461bf6cb987190e6fa8812f735844028dfa1ce96648\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf87f93832356c96c1c5e90e6459ccdd23700fd12c6d7410305c66f99f6ed18f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e4230b2df5bf87138f5a40f146bef562ae63ec6f5b73d023e4e2b60722854882\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945
c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9c416bd4523f4384b4a2283eb2ada263982380925cbc17713726e0790e5fdee\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:38Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2cd9c18868b27fcfb67186a86f415b484f009c33c07d18f5d508155ed011b5d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2cd9c18868b27fcfb67186a86f415b484f009c33c07d18f5d508155ed011b5d4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T07:33:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T07:33:36Z\\\"}
},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T07:33:35Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:34:07Z is after 2025-08-24T17:21:41Z" Sep 30 07:34:07 crc kubenswrapper[4760]: I0930 07:34:07.512155 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"67985d3d-0af6-4a8b-b45a-623aed2e502e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43b97af91f474a3b43f207d377ca205e459b782e552a32a134a8b46e89fc8d30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85ae
d21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d748511816054d5f621bd55e203baea2e6e5ac90dec796364b46224308cfd78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bf1b2ce86bf47ce917a20b48f64b7c4419e3a5c551d006d97171311c3b190f0c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"star
tedAt\\\":\\\"2025-09-30T07:33:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e7c9b2cd9f096d99a8b239c0c2dc771e94e6522a3380733b7452969c9fe2dfd2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T07:33:35Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:34:07Z is after 2025-08-24T17:21:41Z" Sep 30 07:34:07 crc kubenswrapper[4760]: I0930 07:34:07.529144 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to 
patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c5ac421ed3707970914cf165f54a9324134079296d1b6936332c6047b9c23132\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify 
certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:34:07Z is after 2025-08-24T17:21:41Z" Sep 30 07:34:07 crc kubenswrapper[4760]: I0930 07:34:07.545655 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:53Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:34:07Z is after 2025-08-24T17:21:41Z" Sep 30 07:34:07 crc kubenswrapper[4760]: I0930 07:34:07.588389 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:34:07 crc kubenswrapper[4760]: I0930 07:34:07.588446 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:34:07 crc kubenswrapper[4760]: I0930 07:34:07.588462 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:34:07 crc kubenswrapper[4760]: I0930 07:34:07.588485 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:34:07 crc kubenswrapper[4760]: I0930 07:34:07.588500 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:34:07Z","lastTransitionTime":"2025-09-30T07:34:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 07:34:07 crc kubenswrapper[4760]: I0930 07:34:07.596958 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:53Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:34:07Z is after 2025-08-24T17:21:41Z" Sep 30 07:34:07 crc kubenswrapper[4760]: I0930 07:34:07.632953 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-sspvl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2c4ca8ea-a714-40e5-9e10-080aef32237b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:56Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:56Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04cb9ff2859a579a0bd97d932177b8c19b786666d0177669e93ccec38abf08d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lllfc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9e64e8566880e723a6cc19ad788bd1cbb02eb639397aea4acd6e9920b034f379\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lllfc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0b6c1fe1d21cc0d118e36d7f9d62fb639a19a42cd8c953700bcd5c289088cdf9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lllfc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bba81e98fd1a75081b7e6a3eaf28b88b4de6e47e1dc158ca72aa2f1c2107933e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:58Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lllfc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://45d8c685d7b690ce81553e4d924934cf761087dd535145c0e8b5a933f2d61e45\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lllfc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef1f6b6ca1a2bbb2f9a166a7244c6411470be4655bc6b2a330fd604c3449d185\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lllfc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e2eb770a63f2bb8de1ec68ee527d04f64bb32a66016581c3acfc665d9bb60784\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://15fa2b49d0420903a20e43cba97d14d07b922b6347396ae58206b8a4ed21c995\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-09-30T07:34:05Z\\\",\\\"message\\\":\\\"/pkg/client/informers/externalversions/factory.go:141\\\\nI0930 07:34:05.894520 6019 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0930 07:34:05.894613 6019 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0930 07:34:05.894652 6019 handler.go:190] Sending *v1.EgressIP event handler 8 for 
removal\\\\nI0930 07:34:05.894682 6019 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0930 07:34:05.894695 6019 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0930 07:34:05.894729 6019 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0930 07:34:05.894747 6019 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0930 07:34:05.894747 6019 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0930 07:34:05.894745 6019 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0930 07:34:05.894787 6019 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0930 07:34:05.894795 6019 factory.go:656] Stopping watch factory\\\\nI0930 07:34:05.894819 6019 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0930 07:34:05.894819 6019 handler.go:208] Removed *v1.Node event handler 7\\\\nI0930 07:34:05.894837 6019 handler.go:208] Removed *v1.Node event handler 2\\\\nI0930 07:34:05.894846 6019 ovnkube.go:599] Stopped ovnkube\\\\nI0930 07:34:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T07:34:02Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e2eb770a63f2bb8de1ec68ee527d04f64bb32a66016581c3acfc665d9bb60784\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-09-30T07:34:07Z\\\",\\\"message\\\":\\\"vices.LBOpts{Reject:false, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{}, Templates:services.TemplateMap{}, Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}, built lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-ingress/router-internal-default_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", 
\\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-ingress/router-internal-default\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.4.176\\\\\\\", Port:80, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}, services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.4.176\\\\\\\", Port:443, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}, services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.4.176\\\\\\\", Port:1936, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI0930 07:34:07.323120 6145 obj_retry.go:365] Adding new object: *v1.Pod openshift-kube-controller-manager/kube-controller-manager-crc\\\\nI0930 07:34:07.323753 6145 ovn.go:134] Ensuring zone 
\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T07:34:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kub
e-api-access-lllfc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e72f37942962598d925dfb305311c94c3ca920f93243bb6e97839add4a830a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:34:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lllfc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1198d13e2814e4dc38c4411d751fba40440a5967d7ca04c855e34fe90f9a6535\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1198d13e2814e4dc38c4411d751fba40440a5967d7ca04c855e34fe90f9a6535\\\
",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T07:33:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T07:33:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lllfc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T07:33:56Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-sspvl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:34:07Z is after 2025-08-24T17:21:41Z" Sep 30 07:34:07 crc kubenswrapper[4760]: I0930 07:34:07.647694 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c5ae239e08e07bda9532c0b6b94ef3687cf512cd05965e09123bbb036f915a5b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3abb25619d0c80e84a2095aba45fe6a865125d242e44e9f10820e8c41a3b0db9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:34:07Z is after 2025-08-24T17:21:41Z" Sep 30 07:34:07 crc kubenswrapper[4760]: I0930 07:34:07.659382 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:34:07Z is after 2025-08-24T17:21:41Z" Sep 30 07:34:07 crc kubenswrapper[4760]: I0930 07:34:07.670981 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-sv6wk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4a8035f8-210d-4a09-bca5-274ced93774c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://774d38a9cedeb69a2093a9de05db794e856295535f86d73a75c6feea05b61cb3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6pk8k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T07:33:56Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-sv6wk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:34:07Z is after 2025-08-24T17:21:41Z" Sep 30 07:34:07 crc kubenswrapper[4760]: I0930 07:34:07.691622 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:34:07 crc kubenswrapper[4760]: I0930 07:34:07.691667 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:34:07 crc kubenswrapper[4760]: I0930 07:34:07.691677 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:34:07 crc kubenswrapper[4760]: I0930 07:34:07.691693 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:34:07 crc kubenswrapper[4760]: I0930 07:34:07.691704 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:34:07Z","lastTransitionTime":"2025-09-30T07:34:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 07:34:07 crc kubenswrapper[4760]: I0930 07:34:07.794667 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:34:07 crc kubenswrapper[4760]: I0930 07:34:07.794747 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:34:07 crc kubenswrapper[4760]: I0930 07:34:07.794774 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:34:07 crc kubenswrapper[4760]: I0930 07:34:07.794810 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:34:07 crc kubenswrapper[4760]: I0930 07:34:07.794837 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:34:07Z","lastTransitionTime":"2025-09-30T07:34:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 07:34:07 crc kubenswrapper[4760]: I0930 07:34:07.898150 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:34:07 crc kubenswrapper[4760]: I0930 07:34:07.898189 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:34:07 crc kubenswrapper[4760]: I0930 07:34:07.898200 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:34:07 crc kubenswrapper[4760]: I0930 07:34:07.898219 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:34:07 crc kubenswrapper[4760]: I0930 07:34:07.898230 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:34:07Z","lastTransitionTime":"2025-09-30T07:34:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 07:34:08 crc kubenswrapper[4760]: I0930 07:34:08.000708 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:34:08 crc kubenswrapper[4760]: I0930 07:34:08.000774 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:34:08 crc kubenswrapper[4760]: I0930 07:34:08.000796 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:34:08 crc kubenswrapper[4760]: I0930 07:34:08.000823 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:34:08 crc kubenswrapper[4760]: I0930 07:34:08.000843 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:34:08Z","lastTransitionTime":"2025-09-30T07:34:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 07:34:08 crc kubenswrapper[4760]: I0930 07:34:08.066802 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 07:34:08 crc kubenswrapper[4760]: I0930 07:34:08.066846 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 07:34:08 crc kubenswrapper[4760]: I0930 07:34:08.066920 4760 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 07:34:08 crc kubenswrapper[4760]: E0930 07:34:08.066979 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 07:34:08 crc kubenswrapper[4760]: E0930 07:34:08.067099 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 07:34:08 crc kubenswrapper[4760]: E0930 07:34:08.067241 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 07:34:08 crc kubenswrapper[4760]: I0930 07:34:08.104037 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:34:08 crc kubenswrapper[4760]: I0930 07:34:08.104078 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:34:08 crc kubenswrapper[4760]: I0930 07:34:08.104088 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:34:08 crc kubenswrapper[4760]: I0930 07:34:08.104107 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:34:08 crc kubenswrapper[4760]: I0930 07:34:08.104120 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:34:08Z","lastTransitionTime":"2025-09-30T07:34:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 07:34:08 crc kubenswrapper[4760]: I0930 07:34:08.207775 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:34:08 crc kubenswrapper[4760]: I0930 07:34:08.207818 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:34:08 crc kubenswrapper[4760]: I0930 07:34:08.207832 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:34:08 crc kubenswrapper[4760]: I0930 07:34:08.207856 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:34:08 crc kubenswrapper[4760]: I0930 07:34:08.207870 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:34:08Z","lastTransitionTime":"2025-09-30T07:34:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 07:34:08 crc kubenswrapper[4760]: I0930 07:34:08.310560 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:34:08 crc kubenswrapper[4760]: I0930 07:34:08.310631 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:34:08 crc kubenswrapper[4760]: I0930 07:34:08.310650 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:34:08 crc kubenswrapper[4760]: I0930 07:34:08.310676 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:34:08 crc kubenswrapper[4760]: I0930 07:34:08.310697 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:34:08Z","lastTransitionTime":"2025-09-30T07:34:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 07:34:08 crc kubenswrapper[4760]: I0930 07:34:08.365540 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-sspvl_2c4ca8ea-a714-40e5-9e10-080aef32237b/ovnkube-controller/1.log" Sep 30 07:34:08 crc kubenswrapper[4760]: I0930 07:34:08.366527 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-sspvl_2c4ca8ea-a714-40e5-9e10-080aef32237b/ovnkube-controller/0.log" Sep 30 07:34:08 crc kubenswrapper[4760]: I0930 07:34:08.370276 4760 generic.go:334] "Generic (PLEG): container finished" podID="2c4ca8ea-a714-40e5-9e10-080aef32237b" containerID="e2eb770a63f2bb8de1ec68ee527d04f64bb32a66016581c3acfc665d9bb60784" exitCode=1 Sep 30 07:34:08 crc kubenswrapper[4760]: I0930 07:34:08.370353 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-sspvl" event={"ID":"2c4ca8ea-a714-40e5-9e10-080aef32237b","Type":"ContainerDied","Data":"e2eb770a63f2bb8de1ec68ee527d04f64bb32a66016581c3acfc665d9bb60784"} Sep 30 07:34:08 crc kubenswrapper[4760]: I0930 07:34:08.370414 4760 scope.go:117] "RemoveContainer" containerID="15fa2b49d0420903a20e43cba97d14d07b922b6347396ae58206b8a4ed21c995" Sep 30 07:34:08 crc kubenswrapper[4760]: I0930 07:34:08.371816 4760 scope.go:117] "RemoveContainer" containerID="e2eb770a63f2bb8de1ec68ee527d04f64bb32a66016581c3acfc665d9bb60784" Sep 30 07:34:08 crc kubenswrapper[4760]: E0930 07:34:08.372150 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-sspvl_openshift-ovn-kubernetes(2c4ca8ea-a714-40e5-9e10-080aef32237b)\"" pod="openshift-ovn-kubernetes/ovnkube-node-sspvl" podUID="2c4ca8ea-a714-40e5-9e10-080aef32237b" Sep 30 07:34:08 crc kubenswrapper[4760]: I0930 07:34:08.394847 4760 status_manager.go:875] "Failed to update 
status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:34:08Z is after 2025-08-24T17:21:41Z" Sep 30 07:34:08 crc kubenswrapper[4760]: I0930 07:34:08.413841 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:34:08 crc kubenswrapper[4760]: I0930 07:34:08.413891 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:34:08 crc kubenswrapper[4760]: I0930 07:34:08.413901 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:34:08 crc kubenswrapper[4760]: I0930 07:34:08.413922 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:34:08 crc kubenswrapper[4760]: I0930 07:34:08.413934 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:34:08Z","lastTransitionTime":"2025-09-30T07:34:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 07:34:08 crc kubenswrapper[4760]: I0930 07:34:08.417865 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-sv6wk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4a8035f8-210d-4a09-bca5-274ced93774c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://774d38a9cedeb69a2093a9de05db794e856295535f86d73a75c6feea05b61cb3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var
/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6pk8k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T07:33:56Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-sv6wk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:34:08Z is after 2025-08-24T17:21:41Z" Sep 30 07:34:08 crc kubenswrapper[4760]: I0930 07:34:08.439812 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c5ae239e08e07bda9532c0b6b94ef3687cf512cd05965e09123bbb036f915a5b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3abb25619d0c80e84a2095aba45fe6a865125d242e44e9f10820e8c41a3b0db9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:34:08Z is after 2025-08-24T17:21:41Z" Sep 30 07:34:08 crc kubenswrapper[4760]: I0930 07:34:08.455667 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-lvdpk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f50c364e-d22c-4fe5-a0aa-66f4e8d8b21e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://96fd613333f4911d54aca94a38724570ef878c3f55ef48caa79eb1c14d0b2014\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8g8kp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T07:33:56Z\\\"}}\" for pod \"openshift-multus\"/\"multus-lvdpk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:34:08Z is after 2025-08-24T17:21:41Z" Sep 30 07:34:08 crc kubenswrapper[4760]: I0930 07:34:08.473930 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-rpvhp" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a8530550-438d-46a5-aa3f-4b10838396f1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:34:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:34:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:34:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0434ec52328c8ee0a2b28fb47833cc6cf12a7048f4c4a217518e029e616fb6a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7j7qm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\
\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T07:33:58Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-rpvhp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:34:08Z is after 2025-08-24T17:21:41Z" Sep 30 07:34:08 crc kubenswrapper[4760]: I0930 07:34:08.495593 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:59Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:59Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fd70ee9ea129c67bb1a9ffac6e1d2abc903949ccd6d2d549f5d78472813dd222\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-
09-30T07:33:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:34:08Z is after 2025-08-24T17:21:41Z" Sep 30 07:34:08 crc kubenswrapper[4760]: I0930 07:34:08.515047 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-f2lrk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7a9c8270-6964-4886-87d0-227b05b76da4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e3cf6cd0daf5df3e48c5d44b1a443b1e5646009b9eba42bc3869ee9841e472d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cctgt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://90d669a96085f6fb9b5c9507ff1d5c960bdeafa1
d01d7c115b89788e14e155d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cctgt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T07:33:56Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-f2lrk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:34:08Z is after 2025-08-24T17:21:41Z" Sep 30 07:34:08 crc kubenswrapper[4760]: I0930 07:34:08.517841 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:34:08 crc kubenswrapper[4760]: I0930 07:34:08.517889 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:34:08 crc kubenswrapper[4760]: I0930 07:34:08.517907 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:34:08 crc 
kubenswrapper[4760]: I0930 07:34:08.517936 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:34:08 crc kubenswrapper[4760]: I0930 07:34:08.517955 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:34:08Z","lastTransitionTime":"2025-09-30T07:34:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 07:34:08 crc kubenswrapper[4760]: I0930 07:34:08.538495 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"67985d3d-0af6-4a8b-b45a-623aed2e502e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43b97af91f474a3b43f207d377ca205e459b782e552a32a134a8b46e89fc8d30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\
"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d748511816054d5f621bd55e203baea2e6e5ac90dec796364b46224308cfd78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bf1b2ce86bf47ce917a20b48f64b7c4419e3a5c551d006d97171311c3b190f0c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\
\\"2025-09-30T07:33:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e7c9b2cd9f096d99a8b239c0c2dc771e94e6522a3380733b7452969c9fe2dfd2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T07:33:35Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:34:08Z is after 2025-08-24T17:21:41Z" Sep 30 07:34:08 crc kubenswrapper[4760]: I0930 07:34:08.561071 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c5ac421ed3707970914cf165f54a9324134079296d1b6936332c6047b9c23132\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-09-30T07:34:08Z is after 2025-08-24T17:21:41Z" Sep 30 07:34:08 crc kubenswrapper[4760]: I0930 07:34:08.582434 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:53Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:34:08Z is after 2025-08-24T17:21:41Z" Sep 30 07:34:08 crc kubenswrapper[4760]: I0930 07:34:08.604549 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:34:08Z is after 2025-08-24T17:21:41Z" Sep 30 07:34:08 crc kubenswrapper[4760]: I0930 07:34:08.621581 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:34:08 crc kubenswrapper[4760]: I0930 07:34:08.621667 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:34:08 crc kubenswrapper[4760]: I0930 07:34:08.621692 4760 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:34:08 crc kubenswrapper[4760]: I0930 07:34:08.621725 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:34:08 crc kubenswrapper[4760]: I0930 07:34:08.621754 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:34:08Z","lastTransitionTime":"2025-09-30T07:34:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 07:34:08 crc kubenswrapper[4760]: I0930 07:34:08.623106 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-lqpj4"] Sep 30 07:34:08 crc kubenswrapper[4760]: I0930 07:34:08.624024 4760 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-lqpj4" Sep 30 07:34:08 crc kubenswrapper[4760]: I0930 07:34:08.627219 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Sep 30 07:34:08 crc kubenswrapper[4760]: I0930 07:34:08.627490 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Sep 30 07:34:08 crc kubenswrapper[4760]: I0930 07:34:08.632086 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-6vfjl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"aade7c8e-aa34-4b19-9000-d724950a70d7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:34:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:34:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:34:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8c7192e617554ba6b5b9533792f166344fbf9681b63e57c180809e721cc18168\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819
eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:34:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-df7r7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e9c7c74e636ef9dbade2541863f96b1098f7ae5f1ab02740c8934f8c87c4815e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e9c7c74e636ef9dbade2541863f96b1098f7ae5f1ab02740c8934f8c87c4815e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T07:33:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T07:33:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-df7r7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7abb316d62bc2246ace40ad59b2926962a1c098c5690adea8c20a5b964
59a604\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7abb316d62bc2246ace40ad59b2926962a1c098c5690adea8c20a5b96459a604\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T07:33:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T07:33:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-df7r7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://96c45841896c0c1569ad3071a3114e8c4965cc3968244c91719bcf5df92e5d0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://96c45841896c0c1569ad3071a3114e8c4965cc3968244c91719bcf5df9
2e5d0f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T07:33:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T07:33:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-df7r7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://db2220320618b4dcdbf56bad96dbe01c910fa37948f97792f498b3336212cc7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://db2220320618b4dcdbf56bad96dbe01c910fa37948f97792f498b3336212cc7f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T07:34:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T07:34:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-df7r7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5980d5
ebcac5351bfa8c26b90968593f3fae3d434e824db5a182449ef64b0a13\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5980d5ebcac5351bfa8c26b90968593f3fae3d434e824db5a182449ef64b0a13\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T07:34:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T07:34:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-df7r7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5912656a198607c9831da9564b5507d662eb2df00df104d52cf058442a76be6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5912656a198607c9831da9564b5507d662eb2df00df104d52cf058442a76be6f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T07:34:02Z\\\",\\\"reason\\\":\\\"Compl
eted\\\",\\\"startedAt\\\":\\\"2025-09-30T07:34:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-df7r7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T07:33:56Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-6vfjl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:34:08Z is after 2025-08-24T17:21:41Z" Sep 30 07:34:08 crc kubenswrapper[4760]: I0930 07:34:08.672459 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9092fdd9-f142-400d-b446-47b8103b9694\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4cc3328dfae8f6d99c152e6a53b86d72f690afbd7bc4847810f745b321e4f3ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5aaa898c76100f011c3ad0ba61c9735a940b46b0e770f3ede7fc4c881a4722a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b718c4a73fd119f66dd346618976f0b3589b789576e53d367588c2772becd66e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d00b9a56c23ba3b5701a31412e038c073b5f16eb4e6ebea42134f7d4c0b0f21\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://00d03c8ff6fa9f490ec0550ac57d20786f1f5c10d91011d678116c2249a79e88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ed4236004989a888f1f864535bedd2dc4bcad4c51c87f12be074abbe5f8fe8bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ed4236004989a888f1f864535bedd2dc4bcad4c51c87f12be074abbe5f8fe8bd\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-09-30T07:33:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T07:33:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://553fa289f27580c1c38da1e9ea12ae7231fc4f60c660a9a3a177391adba9c3f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://553fa289f27580c1c38da1e9ea12ae7231fc4f60c660a9a3a177391adba9c3f2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T07:33:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T07:33:37Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://83b9ee5e77d60387298ccac46ad416a5d56bfd1d81a6ef7f3ea0482520c2ca08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://83b9ee5e77d60387298ccac46ad416a5d56bfd1d81a6ef7f3ea0482520c2ca08\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T07:33:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T07:33:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T07:33:35Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:34:08Z is after 2025-08-24T17:21:41Z" Sep 30 07:34:08 crc kubenswrapper[4760]: I0930 07:34:08.697242 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"031672b4-4779-4226-97aa-f5c81a729234\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b3e26d92b5602604d33e83e83e677084c642df64c9d0b9f9e022b036091fb92e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-rel
ease-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://53390cb3cc86d0db1019c461bf6cb987190e6fa8812f735844028dfa1ce96648\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf87f93832356c96c1c5e90e6459ccdd23700fd12c6d7410305c66f99f6ed18f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:37Z\\\"}},\\\"vo
lumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e4230b2df5bf87138f5a40f146bef562ae63ec6f5b73d023e4e2b60722854882\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9c416bd4523f4384b4a2283eb2ada263982380925cbc17713726e0790e5fdee\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:38Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2cd9c18868b27fcfb67186a86f415b484f009c33c07d18f5d508155ed011b5d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee12
20d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2cd9c18868b27fcfb67186a86f415b484f009c33c07d18f5d508155ed011b5d4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T07:33:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T07:33:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T07:33:35Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:34:08Z is after 2025-08-24T17:21:41Z" Sep 30 07:34:08 crc kubenswrapper[4760]: I0930 07:34:08.724603 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:34:08 crc kubenswrapper[4760]: I0930 07:34:08.724710 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:34:08 crc kubenswrapper[4760]: I0930 07:34:08.724737 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:34:08 crc kubenswrapper[4760]: I0930 07:34:08.724769 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:34:08 crc kubenswrapper[4760]: I0930 07:34:08.724796 4760 
setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:34:08Z","lastTransitionTime":"2025-09-30T07:34:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 07:34:08 crc kubenswrapper[4760]: I0930 07:34:08.731824 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-sspvl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2c4ca8ea-a714-40e5-9e10-080aef32237b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:56Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:56Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04cb9ff2859a579a0bd97d932177b8c19b786666d0177669e93ccec38abf08d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lllfc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9e64e8566880e723a6cc19ad788bd1cbb02eb639397aea4acd6e9920b034f379\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lllfc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0b6c1fe1d21cc0d118e36d7f9d62fb639a19a42cd8c953700bcd5c289088cdf9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lllfc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bba81e98fd1a75081b7e6a3eaf28b88b4de6e47e1dc158ca72aa2f1c2107933e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:58Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lllfc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://45d8c685d7b690ce81553e4d924934cf761087dd535145c0e8b5a933f2d61e45\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lllfc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef1f6b6ca1a2bbb2f9a166a7244c6411470be4655bc6b2a330fd604c3449d185\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lllfc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e2eb770a63f2bb8de1ec68ee527d04f64bb32a66016581c3acfc665d9bb60784\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e2eb770a63f2bb8de1ec68ee527d04f64bb32a66016581c3acfc665d9bb60784\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-09-30T07:34:07Z\\\",\\\"message\\\":\\\"vices.LBOpts{Reject:false, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{}, Templates:services.TemplateMap{}, Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}, built lbs: 
[]services.LB{services.LB{Name:\\\\\\\"Service_openshift-ingress/router-internal-default_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-ingress/router-internal-default\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.4.176\\\\\\\", Port:80, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}, services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.4.176\\\\\\\", Port:443, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}, services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.4.176\\\\\\\", Port:1936, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI0930 07:34:07.323120 6145 obj_retry.go:365] Adding new object: *v1.Pod openshift-kube-controller-manager/kube-controller-manager-crc\\\\nI0930 07:34:07.323753 6145 ovn.go:134] Ensuring zone \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T07:34:06Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-sspvl_openshift-ovn-kubernetes(2c4ca8ea-a714-40e5-9e10-080aef32237b)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lllfc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e72f37942962598d925dfb305311c94c3ca920f93243bb6e97839add4a830a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:34:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lllfc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1198d13e2814e4dc38c4411d751fba40440a5967d7ca04c855e34fe90f9a6535\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1198d13e2814e4dc38
c4411d751fba40440a5967d7ca04c855e34fe90f9a6535\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T07:33:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T07:33:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lllfc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T07:33:56Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-sspvl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:34:08Z is after 2025-08-24T17:21:41Z" Sep 30 07:34:08 crc kubenswrapper[4760]: I0930 07:34:08.763207 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-sspvl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2c4ca8ea-a714-40e5-9e10-080aef32237b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:56Z\\\",\\\"message\\\":\\\"containers with 
unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:56Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04cb9ff2859a579a0bd97d932177b8c19b786666d0177669e93ccec38abf08d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lllfc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9e64e8566880e723a6cc19ad788bd1cbb02eb639397aea4acd6e9920b034f379\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\
\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lllfc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0b6c1fe1d21cc0d118e36d7f9d62fb639a19a42cd8c953700bcd5c289088cdf9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lllfc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bba81e98fd1a75081b7e6a3eaf28b88b4de6e47e1dc158ca72aa2f1c2107933e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd
47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lllfc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://45d8c685d7b690ce81553e4d924934cf761087dd535145c0e8b5a933f2d61e45\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lllfc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef1f6b6ca1a2bbb2f9a166a7244c6411470be4655bc6b2a330fd604c3449d185\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev
@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lllfc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e2eb770a63f2bb8de1ec68ee527d04f64bb32a66016581c3acfc665d9bb60784\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e2eb770a63f2bb8de1ec68ee527d04f64bb32a66016581c3acfc665d9bb60784\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-09-30T07:34:07Z\\\",\\\"message\\\":\\\"vices.LBOpts{Reject:false, EmptyLBEvents:false, AffinityTimeOut:0, 
SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{}, Templates:services.TemplateMap{}, Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}, built lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-ingress/router-internal-default_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-ingress/router-internal-default\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.4.176\\\\\\\", Port:80, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}, services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.4.176\\\\\\\", Port:443, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}, services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.4.176\\\\\\\", Port:1936, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI0930 07:34:07.323120 6145 obj_retry.go:365] Adding new object: *v1.Pod openshift-kube-controller-manager/kube-controller-manager-crc\\\\nI0930 07:34:07.323753 6145 ovn.go:134] Ensuring zone \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T07:34:06Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-sspvl_openshift-ovn-kubernetes(2c4ca8ea-a714-40e5-9e10-080aef32237b)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lllfc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e72f37942962598d925dfb305311c94c3ca920f93243bb6e97839add4a830a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:34:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lllfc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1198d13e2814e4dc38c4411d751fba40440a5967d7ca04c855e34fe90f9a6535\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1198d13e2814e4dc38
c4411d751fba40440a5967d7ca04c855e34fe90f9a6535\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T07:33:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T07:33:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lllfc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T07:33:56Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-sspvl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:34:08Z is after 2025-08-24T17:21:41Z" Sep 30 07:34:08 crc kubenswrapper[4760]: I0930 07:34:08.776780 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/2b6201d8-80fd-4701-a4a8-f7ebca1f34ad-env-overrides\") pod \"ovnkube-control-plane-749d76644c-lqpj4\" (UID: \"2b6201d8-80fd-4701-a4a8-f7ebca1f34ad\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-lqpj4" Sep 30 07:34:08 crc kubenswrapper[4760]: I0930 07:34:08.777043 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/2b6201d8-80fd-4701-a4a8-f7ebca1f34ad-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-lqpj4\" (UID: \"2b6201d8-80fd-4701-a4a8-f7ebca1f34ad\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-lqpj4" Sep 30 07:34:08 crc kubenswrapper[4760]: I0930 
07:34:08.777334 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/2b6201d8-80fd-4701-a4a8-f7ebca1f34ad-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-lqpj4\" (UID: \"2b6201d8-80fd-4701-a4a8-f7ebca1f34ad\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-lqpj4" Sep 30 07:34:08 crc kubenswrapper[4760]: I0930 07:34:08.777477 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2dc5j\" (UniqueName: \"kubernetes.io/projected/2b6201d8-80fd-4701-a4a8-f7ebca1f34ad-kube-api-access-2dc5j\") pod \"ovnkube-control-plane-749d76644c-lqpj4\" (UID: \"2b6201d8-80fd-4701-a4a8-f7ebca1f34ad\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-lqpj4" Sep 30 07:34:08 crc kubenswrapper[4760]: I0930 07:34:08.782968 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-lqpj4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2b6201d8-80fd-4701-a4a8-f7ebca1f34ad\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:34:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:34:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:34:08Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:34:08Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dc5j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dc5j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168
.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T07:34:08Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-lqpj4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:34:08Z is after 2025-08-24T17:21:41Z" Sep 30 07:34:08 crc kubenswrapper[4760]: I0930 07:34:08.803791 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c5ae239e08e07bda9532c0b6b94ef3687cf512cd05965e09123bbb036f915a5b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\
\":\\\"2025-09-30T07:33:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3abb25619d0c80e84a2095aba45fe6a865125d242e44e9f10820e8c41a3b0db9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:34:08Z is after 2025-08-24T17:21:41Z" Sep 30 07:34:08 crc kubenswrapper[4760]: I0930 07:34:08.822646 4760 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:34:08Z is after 2025-08-24T17:21:41Z" Sep 30 07:34:08 crc kubenswrapper[4760]: I0930 07:34:08.828280 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:34:08 crc kubenswrapper[4760]: I0930 07:34:08.828396 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:34:08 crc kubenswrapper[4760]: I0930 07:34:08.828418 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:34:08 crc kubenswrapper[4760]: I0930 07:34:08.828447 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:34:08 crc kubenswrapper[4760]: I0930 07:34:08.828468 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:34:08Z","lastTransitionTime":"2025-09-30T07:34:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 07:34:08 crc kubenswrapper[4760]: I0930 07:34:08.841495 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-sv6wk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4a8035f8-210d-4a09-bca5-274ced93774c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://774d38a9cedeb69a2093a9de05db794e856295535f86d73a75c6feea05b61cb3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var
/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6pk8k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T07:33:56Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-sv6wk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:34:08Z is after 2025-08-24T17:21:41Z" Sep 30 07:34:08 crc kubenswrapper[4760]: I0930 07:34:08.856388 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:59Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:59Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fd70ee9ea129c67bb1a9ffac6e1d2abc903949ccd6d2d549f5d78472813dd222\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3551
2335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:34:08Z is after 2025-08-24T17:21:41Z" Sep 30 07:34:08 crc kubenswrapper[4760]: I0930 07:34:08.870932 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-f2lrk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7a9c8270-6964-4886-87d0-227b05b76da4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e3cf6cd0daf5df3e48c5d44b1a443b1e5646009b9eba42bc3869ee9841e472d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cctgt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://90d669a96085f6fb9b5c9507ff1d5c960bdeafa1
d01d7c115b89788e14e155d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cctgt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T07:33:56Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-f2lrk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:34:08Z is after 2025-08-24T17:21:41Z" Sep 30 07:34:08 crc kubenswrapper[4760]: I0930 07:34:08.878426 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/2b6201d8-80fd-4701-a4a8-f7ebca1f34ad-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-lqpj4\" (UID: \"2b6201d8-80fd-4701-a4a8-f7ebca1f34ad\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-lqpj4" Sep 30 07:34:08 crc kubenswrapper[4760]: I0930 07:34:08.878458 4760 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"kube-api-access-2dc5j\" (UniqueName: \"kubernetes.io/projected/2b6201d8-80fd-4701-a4a8-f7ebca1f34ad-kube-api-access-2dc5j\") pod \"ovnkube-control-plane-749d76644c-lqpj4\" (UID: \"2b6201d8-80fd-4701-a4a8-f7ebca1f34ad\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-lqpj4" Sep 30 07:34:08 crc kubenswrapper[4760]: I0930 07:34:08.878509 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/2b6201d8-80fd-4701-a4a8-f7ebca1f34ad-env-overrides\") pod \"ovnkube-control-plane-749d76644c-lqpj4\" (UID: \"2b6201d8-80fd-4701-a4a8-f7ebca1f34ad\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-lqpj4" Sep 30 07:34:08 crc kubenswrapper[4760]: I0930 07:34:08.878527 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/2b6201d8-80fd-4701-a4a8-f7ebca1f34ad-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-lqpj4\" (UID: \"2b6201d8-80fd-4701-a4a8-f7ebca1f34ad\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-lqpj4" Sep 30 07:34:08 crc kubenswrapper[4760]: I0930 07:34:08.879987 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/2b6201d8-80fd-4701-a4a8-f7ebca1f34ad-env-overrides\") pod \"ovnkube-control-plane-749d76644c-lqpj4\" (UID: \"2b6201d8-80fd-4701-a4a8-f7ebca1f34ad\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-lqpj4" Sep 30 07:34:08 crc kubenswrapper[4760]: I0930 07:34:08.880024 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/2b6201d8-80fd-4701-a4a8-f7ebca1f34ad-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-lqpj4\" (UID: \"2b6201d8-80fd-4701-a4a8-f7ebca1f34ad\") " 
pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-lqpj4" Sep 30 07:34:08 crc kubenswrapper[4760]: I0930 07:34:08.887684 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/2b6201d8-80fd-4701-a4a8-f7ebca1f34ad-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-lqpj4\" (UID: \"2b6201d8-80fd-4701-a4a8-f7ebca1f34ad\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-lqpj4" Sep 30 07:34:08 crc kubenswrapper[4760]: I0930 07:34:08.890910 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-lvdpk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f50c364e-d22c-4fe5-a0aa-66f4e8d8b21e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://96fd613333f4911d54aca94a38724570ef878c3f55ef48caa79eb1c14d0b2014\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b
819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8g8kp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\
\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T07:33:56Z\\\"}}\" for pod \"openshift-multus\"/\"multus-lvdpk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:34:08Z is after 2025-08-24T17:21:41Z" Sep 30 07:34:08 crc kubenswrapper[4760]: I0930 07:34:08.898720 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2dc5j\" (UniqueName: \"kubernetes.io/projected/2b6201d8-80fd-4701-a4a8-f7ebca1f34ad-kube-api-access-2dc5j\") pod \"ovnkube-control-plane-749d76644c-lqpj4\" (UID: \"2b6201d8-80fd-4701-a4a8-f7ebca1f34ad\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-lqpj4" Sep 30 07:34:08 crc kubenswrapper[4760]: I0930 07:34:08.907935 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-rpvhp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a8530550-438d-46a5-aa3f-4b10838396f1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:34:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:34:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:34:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0434ec52328c8ee0a2b28fb47833cc6cf12a7048f4c4a217518e029e616fb6a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7j7qm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T07:33:58Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-rpvhp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:34:08Z is after 2025-08-24T17:21:41Z" Sep 30 07:34:08 crc kubenswrapper[4760]: I0930 07:34:08.926015 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:53Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:34:08Z is after 2025-08-24T17:21:41Z" Sep 30 07:34:08 crc kubenswrapper[4760]: I0930 07:34:08.930931 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:34:08 crc kubenswrapper[4760]: I0930 07:34:08.931609 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:34:08 crc kubenswrapper[4760]: I0930 07:34:08.931650 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:34:08 crc kubenswrapper[4760]: I0930 07:34:08.931679 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:34:08 crc kubenswrapper[4760]: I0930 07:34:08.931699 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:34:08Z","lastTransitionTime":"2025-09-30T07:34:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 07:34:08 crc kubenswrapper[4760]: I0930 07:34:08.944525 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-lqpj4" Sep 30 07:34:08 crc kubenswrapper[4760]: I0930 07:34:08.945632 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:53Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:34:08Z is after 2025-08-24T17:21:41Z" Sep 30 07:34:08 crc kubenswrapper[4760]: I0930 07:34:08.978004 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-6vfjl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"aade7c8e-aa34-4b19-9000-d724950a70d7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:34:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:34:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:34:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8c7192e617554ba6b5b9533792f166344fbf9681b63e57c180809e721cc18168\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:34:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-df7r7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e9c7c74e636ef9dbade2541863f96b1098f7ae5f1ab02740c8934f8c87c4815e\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e9c7c74e636ef9dbade2541863f96b1098f7ae5f1ab02740c8934f8c87c4815e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T07:33:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T07:33:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-df7r7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7abb316d62bc2246ace40ad59b2926962a1c098c5690adea8c20a5b96459a604\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7abb316d62bc2246ace40ad59b2926962a1c098c5690adea8c20a5b96459a604\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T07:33:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T07:33:58Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-df7r7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://96c45841896c0c1569ad3071a3114e8c4965cc3968244c91719bcf5df92e5d0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://96c45841896c0c1569ad3071a3114e8c4965cc3968244c91719bcf5df92e5d0f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T07:33:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T07:33:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-df7r7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://db222
0320618b4dcdbf56bad96dbe01c910fa37948f97792f498b3336212cc7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://db2220320618b4dcdbf56bad96dbe01c910fa37948f97792f498b3336212cc7f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T07:34:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T07:34:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-df7r7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5980d5ebcac5351bfa8c26b90968593f3fae3d434e824db5a182449ef64b0a13\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5980d5ebcac5351bfa8c26b90968593f3fae3d434e824db5a182449ef64b0a13\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T07:34:01Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-09-30T07:34:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-df7r7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5912656a198607c9831da9564b5507d662eb2df00df104d52cf058442a76be6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5912656a198607c9831da9564b5507d662eb2df00df104d52cf058442a76be6f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T07:34:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T07:34:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-df7r7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T07:33:56Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-6vfjl\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:34:08Z is after 2025-08-24T17:21:41Z" Sep 30 07:34:09 crc kubenswrapper[4760]: I0930 07:34:09.007419 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9092fdd9-f142-400d-b446-47b8103b9694\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4cc3328dfae8f6d99c152e6a53b86d72f690afbd7bc4847810f745b321e4f3ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests
\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5aaa898c76100f011c3ad0ba61c9735a940b46b0e770f3ede7fc4c881a4722a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b718c4a73fd119f66dd346618976f0b3589b789576e53d367588c2772becd66e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d00b9a56c23ba3b5
701a31412e038c073b5f16eb4e6ebea42134f7d4c0b0f21\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://00d03c8ff6fa9f490ec0550ac57d20786f1f5c10d91011d678116c2249a79e88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ed4236004989a888f1f864535bedd2dc4bcad4c51c87f12be074abbe5f8fe8bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e3
3e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ed4236004989a888f1f864535bedd2dc4bcad4c51c87f12be074abbe5f8fe8bd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T07:33:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T07:33:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://553fa289f27580c1c38da1e9ea12ae7231fc4f60c660a9a3a177391adba9c3f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://553fa289f27580c1c38da1e9ea12ae7231fc4f60c660a9a3a177391adba9c3f2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T07:33:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T07:33:37Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://83b9ee5e77d60387298ccac46ad416a5d56bfd1d81a6ef7f3ea0482520c2ca08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\
":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://83b9ee5e77d60387298ccac46ad416a5d56bfd1d81a6ef7f3ea0482520c2ca08\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T07:33:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T07:33:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T07:33:35Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:34:09Z is after 2025-08-24T17:21:41Z" Sep 30 07:34:09 crc kubenswrapper[4760]: I0930 07:34:09.030508 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"031672b4-4779-4226-97aa-f5c81a729234\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b3e26d92b5602604d33e83e83e677084c642df64c9d0b9f9e022b036091fb92e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://53390cb3cc86d0db1019c461bf6cb987190e6fa8812f735844028dfa1ce96648\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf87f93832356c96c1c5e90e6459ccdd23700fd12c6d7410305c66f99f6ed18f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e4230b2df5bf87138f5a40f146bef562ae63ec6f5b73d023e4e2b60722854882\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:3
8Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9c416bd4523f4384b4a2283eb2ada263982380925cbc17713726e0790e5fdee\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:38Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2cd9c18868b27fcfb67186a86f415b484f009c33c07d18f5d508155ed011b5d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2cd9c18868b27fcfb67186a86f415b484f009c33c07d18f5d508155ed011b5d4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T07:33:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T07:33:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startT
ime\\\":\\\"2025-09-30T07:33:35Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:34:09Z is after 2025-08-24T17:21:41Z" Sep 30 07:34:09 crc kubenswrapper[4760]: I0930 07:34:09.035177 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:34:09 crc kubenswrapper[4760]: I0930 07:34:09.035203 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:34:09 crc kubenswrapper[4760]: I0930 07:34:09.035212 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:34:09 crc kubenswrapper[4760]: I0930 07:34:09.035229 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:34:09 crc kubenswrapper[4760]: I0930 07:34:09.035240 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:34:09Z","lastTransitionTime":"2025-09-30T07:34:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 07:34:09 crc kubenswrapper[4760]: I0930 07:34:09.051512 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"67985d3d-0af6-4a8b-b45a-623aed2e502e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43b97af91f474a3b43f207d377ca205e459b782e552a32a134a8b46e89fc8d30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d748511816
054d5f621bd55e203baea2e6e5ac90dec796364b46224308cfd78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bf1b2ce86bf47ce917a20b48f64b7c4419e3a5c551d006d97171311c3b190f0c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e7c9b2cd9f096d99a8b239c0c2dc771e94e6522a3380733b7452969c9fe2dfd2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:850
6ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T07:33:35Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:34:09Z is after 2025-08-24T17:21:41Z" Sep 30 07:34:09 crc kubenswrapper[4760]: I0930 07:34:09.072372 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c5ac421ed3707970914cf165f54a9324134079296d1b6936332c6047b9c23132\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-09-30T07:34:09Z is after 2025-08-24T17:21:41Z" Sep 30 07:34:09 crc kubenswrapper[4760]: I0930 07:34:09.138120 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:34:09 crc kubenswrapper[4760]: I0930 07:34:09.138188 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:34:09 crc kubenswrapper[4760]: I0930 07:34:09.138208 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:34:09 crc kubenswrapper[4760]: I0930 07:34:09.138256 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:34:09 crc kubenswrapper[4760]: I0930 07:34:09.138276 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:34:09Z","lastTransitionTime":"2025-09-30T07:34:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 07:34:09 crc kubenswrapper[4760]: I0930 07:34:09.241583 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:34:09 crc kubenswrapper[4760]: I0930 07:34:09.241637 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:34:09 crc kubenswrapper[4760]: I0930 07:34:09.241656 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:34:09 crc kubenswrapper[4760]: I0930 07:34:09.241685 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:34:09 crc kubenswrapper[4760]: I0930 07:34:09.241710 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:34:09Z","lastTransitionTime":"2025-09-30T07:34:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 07:34:09 crc kubenswrapper[4760]: I0930 07:34:09.345010 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:34:09 crc kubenswrapper[4760]: I0930 07:34:09.345049 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:34:09 crc kubenswrapper[4760]: I0930 07:34:09.345060 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:34:09 crc kubenswrapper[4760]: I0930 07:34:09.345084 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:34:09 crc kubenswrapper[4760]: I0930 07:34:09.345097 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:34:09Z","lastTransitionTime":"2025-09-30T07:34:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 07:34:09 crc kubenswrapper[4760]: I0930 07:34:09.380534 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-lqpj4" event={"ID":"2b6201d8-80fd-4701-a4a8-f7ebca1f34ad","Type":"ContainerStarted","Data":"722e623d70449c90fb388b89511f1e75ed55015ec9caa45d9aaa9ac3d9649778"} Sep 30 07:34:09 crc kubenswrapper[4760]: I0930 07:34:09.380609 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-lqpj4" event={"ID":"2b6201d8-80fd-4701-a4a8-f7ebca1f34ad","Type":"ContainerStarted","Data":"09a33e356ac5e20ea894804e56d62f6220b8b9a5123d1b0acc9bbe33a3083792"} Sep 30 07:34:09 crc kubenswrapper[4760]: I0930 07:34:09.380627 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-lqpj4" event={"ID":"2b6201d8-80fd-4701-a4a8-f7ebca1f34ad","Type":"ContainerStarted","Data":"ab7d4b34f18afd8d2e0db84b2255a794a18483192a01162698fab5ed51ad7f43"} Sep 30 07:34:09 crc kubenswrapper[4760]: I0930 07:34:09.383866 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-sspvl_2c4ca8ea-a714-40e5-9e10-080aef32237b/ovnkube-controller/1.log" Sep 30 07:34:09 crc kubenswrapper[4760]: I0930 07:34:09.391509 4760 scope.go:117] "RemoveContainer" containerID="e2eb770a63f2bb8de1ec68ee527d04f64bb32a66016581c3acfc665d9bb60784" Sep 30 07:34:09 crc kubenswrapper[4760]: E0930 07:34:09.391814 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-sspvl_openshift-ovn-kubernetes(2c4ca8ea-a714-40e5-9e10-080aef32237b)\"" pod="openshift-ovn-kubernetes/ovnkube-node-sspvl" podUID="2c4ca8ea-a714-40e5-9e10-080aef32237b" Sep 30 07:34:09 crc kubenswrapper[4760]: I0930 07:34:09.400625 4760 
status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:59Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:59Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fd70ee9ea129c67bb1a9ffac6e1d2abc903949ccd6d2d549f5d78472813dd222\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:34:09Z is after 2025-08-24T17:21:41Z" Sep 30 07:34:09 crc kubenswrapper[4760]: I0930 07:34:09.415589 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-f2lrk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7a9c8270-6964-4886-87d0-227b05b76da4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e3cf6cd0daf5df3e48c5d44b1a443b1e5646009b9eba42bc3869ee9841e472d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"sta
rtedAt\\\":\\\"2025-09-30T07:33:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cctgt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://90d669a96085f6fb9b5c9507ff1d5c960bdeafa1d01d7c115b89788e14e155d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cctgt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T07:33:56Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-f2lrk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:34:09Z is after 2025-08-24T17:21:41Z" Sep 30 07:34:09 crc kubenswrapper[4760]: I0930 
07:34:09.435133 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-lvdpk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f50c364e-d22c-4fe5-a0aa-66f4e8d8b21e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://96fd613333f4911d54aca94a38724570ef878c3f55ef48caa79eb1c14d0b2014\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/n
et.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8g8kp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T07:33:56Z\\\"}}\" for pod \"openshift-multus\"/\"multus-lvdpk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:34:09Z is after 2025-08-24T17:21:41Z" Sep 30 07:34:09 crc kubenswrapper[4760]: I0930 
07:34:09.448187 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:34:09 crc kubenswrapper[4760]: I0930 07:34:09.448254 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:34:09 crc kubenswrapper[4760]: I0930 07:34:09.448275 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:34:09 crc kubenswrapper[4760]: I0930 07:34:09.448329 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:34:09 crc kubenswrapper[4760]: I0930 07:34:09.448350 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:34:09Z","lastTransitionTime":"2025-09-30T07:34:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 07:34:09 crc kubenswrapper[4760]: I0930 07:34:09.451596 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-rpvhp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a8530550-438d-46a5-aa3f-4b10838396f1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:34:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:34:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:34:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0434ec52328c8ee0a2b28fb47833cc6cf12a7048f4c4a217518e029e616fb6a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes
.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7j7qm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T07:33:58Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-rpvhp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:34:09Z is after 2025-08-24T17:21:41Z" Sep 30 07:34:09 crc kubenswrapper[4760]: I0930 07:34:09.493113 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9092fdd9-f142-400d-b446-47b8103b9694\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4cc3328dfae8f6d99c152e6a53b86d72f690afbd7bc4847810f745b321e4f3ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5aaa898c76100f011c3ad0ba61c9735a940b46b0e770f3ede7fc4c881a4722a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b718c4a73fd119f66dd346618976f0b3589b789576e53d367588c2772becd66e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1
847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d00b9a56c23ba3b5701a31412e038c073b5f16eb4e6ebea42134f7d4c0b0f21\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://00d03c8ff6fa9f490ec0550ac57d20786f1f5c10d91011d678116c2249a79e88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"
mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ed4236004989a888f1f864535bedd2dc4bcad4c51c87f12be074abbe5f8fe8bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ed4236004989a888f1f864535bedd2dc4bcad4c51c87f12be074abbe5f8fe8bd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T07:33:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T07:33:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://553fa289f27580c1c38da1e9ea12ae7231fc4f60c660a9a3a177391adba9c3f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://553fa289f27580c1c38da1e9ea12ae7231fc4f60c660a9a3a177391adba9c3f2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T07:33:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T07:33:37Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://83b9e
e5e77d60387298ccac46ad416a5d56bfd1d81a6ef7f3ea0482520c2ca08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://83b9ee5e77d60387298ccac46ad416a5d56bfd1d81a6ef7f3ea0482520c2ca08\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T07:33:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T07:33:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T07:33:35Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:34:09Z is after 2025-08-24T17:21:41Z" Sep 30 07:34:09 crc kubenswrapper[4760]: I0930 07:34:09.525749 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"031672b4-4779-4226-97aa-f5c81a729234\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b3e26d92b5602604d33e83e83e677084c642df64c9d0b9f9e022b036091fb92e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://53390cb3cc86d0db1019c461bf6cb987190e6fa8812f735844028dfa1ce96648\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf87f93832356c96c1c5e90e6459ccdd23700fd12c6d7410305c66f99f6ed18f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e4230b2df5bf87138f5a40f146bef562ae63ec6f5b73d023e4e2b60722854882\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:3
8Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9c416bd4523f4384b4a2283eb2ada263982380925cbc17713726e0790e5fdee\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:38Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2cd9c18868b27fcfb67186a86f415b484f009c33c07d18f5d508155ed011b5d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2cd9c18868b27fcfb67186a86f415b484f009c33c07d18f5d508155ed011b5d4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T07:33:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T07:33:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startT
ime\\\":\\\"2025-09-30T07:33:35Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:34:09Z is after 2025-08-24T17:21:41Z" Sep 30 07:34:09 crc kubenswrapper[4760]: I0930 07:34:09.547635 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"67985d3d-0af6-4a8b-b45a-623aed2e502e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43b97af91f474a3b43f207d377ca205e459b782e552a32a134a8b46e89fc8d30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\
\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d748511816054d5f621bd55e203baea2e6e5ac90dec796364b46224308cfd78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bf1b2ce86bf47ce917a20b48f64b7c4419e3a5c551d006d97171311c3b190f0c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"c
ert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e7c9b2cd9f096d99a8b239c0c2dc771e94e6522a3380733b7452969c9fe2dfd2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T07:33:35Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:34:09Z is after 2025-08-24T17:21:41Z" Sep 30 07:34:09 crc kubenswrapper[4760]: I0930 07:34:09.550267 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:34:09 crc kubenswrapper[4760]: I0930 07:34:09.550363 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:34:09 crc kubenswrapper[4760]: I0930 07:34:09.550383 4760 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:34:09 crc kubenswrapper[4760]: I0930 07:34:09.550409 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:34:09 crc kubenswrapper[4760]: I0930 07:34:09.550429 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:34:09Z","lastTransitionTime":"2025-09-30T07:34:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 07:34:09 crc kubenswrapper[4760]: I0930 07:34:09.565755 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c5ac421ed3707970914cf165f54a9324134079296d1b6936332c6047b9c23132\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d
608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:34:09Z is after 2025-08-24T17:21:41Z" Sep 30 07:34:09 crc kubenswrapper[4760]: I0930 07:34:09.585559 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:53Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:34:09Z is after 2025-08-24T17:21:41Z" Sep 30 07:34:09 crc kubenswrapper[4760]: I0930 07:34:09.605094 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:53Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:34:09Z is after 2025-08-24T17:21:41Z" Sep 30 07:34:09 crc kubenswrapper[4760]: I0930 07:34:09.625186 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-6vfjl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"aade7c8e-aa34-4b19-9000-d724950a70d7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:34:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:34:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:34:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8c7192e617554ba6b5b9533792f166344fbf9681b63e57c180809e721cc18168\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:34:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-df7r7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e9c7c74e636ef9dbade2541863f96b1098f7ae5f1ab02740c8934f8c87c4815e\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e9c7c74e636ef9dbade2541863f96b1098f7ae5f1ab02740c8934f8c87c4815e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T07:33:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T07:33:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-df7r7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7abb316d62bc2246ace40ad59b2926962a1c098c5690adea8c20a5b96459a604\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7abb316d62bc2246ace40ad59b2926962a1c098c5690adea8c20a5b96459a604\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T07:33:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T07:33:58Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-df7r7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://96c45841896c0c1569ad3071a3114e8c4965cc3968244c91719bcf5df92e5d0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://96c45841896c0c1569ad3071a3114e8c4965cc3968244c91719bcf5df92e5d0f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T07:33:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T07:33:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-df7r7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://db222
0320618b4dcdbf56bad96dbe01c910fa37948f97792f498b3336212cc7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://db2220320618b4dcdbf56bad96dbe01c910fa37948f97792f498b3336212cc7f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T07:34:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T07:34:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-df7r7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5980d5ebcac5351bfa8c26b90968593f3fae3d434e824db5a182449ef64b0a13\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5980d5ebcac5351bfa8c26b90968593f3fae3d434e824db5a182449ef64b0a13\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T07:34:01Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-09-30T07:34:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-df7r7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5912656a198607c9831da9564b5507d662eb2df00df104d52cf058442a76be6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5912656a198607c9831da9564b5507d662eb2df00df104d52cf058442a76be6f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T07:34:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T07:34:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-df7r7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T07:33:56Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-6vfjl\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:34:09Z is after 2025-08-24T17:21:41Z" Sep 30 07:34:09 crc kubenswrapper[4760]: I0930 07:34:09.653467 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:34:09 crc kubenswrapper[4760]: I0930 07:34:09.653556 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:34:09 crc kubenswrapper[4760]: I0930 07:34:09.653578 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:34:09 crc kubenswrapper[4760]: I0930 07:34:09.653608 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:34:09 crc kubenswrapper[4760]: I0930 07:34:09.653629 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:34:09Z","lastTransitionTime":"2025-09-30T07:34:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 07:34:09 crc kubenswrapper[4760]: I0930 07:34:09.661175 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-sspvl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2c4ca8ea-a714-40e5-9e10-080aef32237b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:56Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:56Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04cb9ff2859a579a0bd97d932177b8c19b786666d0177669e93ccec38abf08d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lllfc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9e64e8566880e723a6cc19ad788bd1cbb02eb639397aea4acd6e9920b034f379\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lllfc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0b6c1fe1d21cc0d118e36d7f9d62fb639a19a42cd8c953700bcd5c289088cdf9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lllfc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bba81e98fd1a75081b7e6a3eaf28b88b4de6e47e1dc158ca72aa2f1c2107933e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:58Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lllfc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://45d8c685d7b690ce81553e4d924934cf761087dd535145c0e8b5a933f2d61e45\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lllfc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef1f6b6ca1a2bbb2f9a166a7244c6411470be4655bc6b2a330fd604c3449d185\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lllfc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e2eb770a63f2bb8de1ec68ee527d04f64bb32a66016581c3acfc665d9bb60784\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e2eb770a63f2bb8de1ec68ee527d04f64bb32a66016581c3acfc665d9bb60784\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-09-30T07:34:07Z\\\",\\\"message\\\":\\\"vices.LBOpts{Reject:false, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{}, Templates:services.TemplateMap{}, Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}, built lbs: 
[]services.LB{services.LB{Name:\\\\\\\"Service_openshift-ingress/router-internal-default_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-ingress/router-internal-default\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.4.176\\\\\\\", Port:80, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}, services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.4.176\\\\\\\", Port:443, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}, services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.4.176\\\\\\\", Port:1936, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI0930 07:34:07.323120 6145 obj_retry.go:365] Adding new object: *v1.Pod openshift-kube-controller-manager/kube-controller-manager-crc\\\\nI0930 07:34:07.323753 6145 ovn.go:134] Ensuring zone \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T07:34:06Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-sspvl_openshift-ovn-kubernetes(2c4ca8ea-a714-40e5-9e10-080aef32237b)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lllfc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e72f37942962598d925dfb305311c94c3ca920f93243bb6e97839add4a830a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:34:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lllfc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1198d13e2814e4dc38c4411d751fba40440a5967d7ca04c855e34fe90f9a6535\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1198d13e2814e4dc38
c4411d751fba40440a5967d7ca04c855e34fe90f9a6535\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T07:33:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T07:33:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lllfc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T07:33:56Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-sspvl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:34:09Z is after 2025-08-24T17:21:41Z" Sep 30 07:34:09 crc kubenswrapper[4760]: I0930 07:34:09.684628 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c5ae239e08e07bda9532c0b6b94ef3687cf512cd05965e09123bbb036f915a5b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3abb25619d0c80e84a2095aba45fe6a865125d242e44e9f10820e8c41a3b0db9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:34:09Z is after 2025-08-24T17:21:41Z" Sep 30 07:34:09 crc kubenswrapper[4760]: I0930 07:34:09.703076 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:34:09Z is after 2025-08-24T17:21:41Z" Sep 30 07:34:09 crc kubenswrapper[4760]: I0930 07:34:09.719609 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-sv6wk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4a8035f8-210d-4a09-bca5-274ced93774c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://774d38a9cedeb69a2093a9de05db794e856295535f86d73a75c6feea05b61cb3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6pk8k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T07:33:56Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-sv6wk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:34:09Z is after 2025-08-24T17:21:41Z" Sep 30 07:34:09 crc kubenswrapper[4760]: I0930 07:34:09.738959 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-lqpj4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2b6201d8-80fd-4701-a4a8-f7ebca1f34ad\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:34:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:34:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:34:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:34:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://09a33e356ac5e20ea894804e56d62f6220b8b9a5123d1b0acc9bbe33a3083792\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d7
93426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:34:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dc5j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://722e623d70449c90fb388b89511f1e75ed55015ec9caa45d9aaa9ac3d9649778\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:34:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dc5j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T07:34:08Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-lqpj4\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:34:09Z is after 2025-08-24T17:21:41Z" Sep 30 07:34:09 crc kubenswrapper[4760]: I0930 07:34:09.757580 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:34:09 crc kubenswrapper[4760]: I0930 07:34:09.757637 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:34:09 crc kubenswrapper[4760]: I0930 07:34:09.757649 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:34:09 crc kubenswrapper[4760]: I0930 07:34:09.757672 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:34:09 crc kubenswrapper[4760]: I0930 07:34:09.757685 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:34:09Z","lastTransitionTime":"2025-09-30T07:34:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 07:34:09 crc kubenswrapper[4760]: I0930 07:34:09.861777 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:34:09 crc kubenswrapper[4760]: I0930 07:34:09.861847 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:34:09 crc kubenswrapper[4760]: I0930 07:34:09.861866 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:34:09 crc kubenswrapper[4760]: I0930 07:34:09.861896 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:34:09 crc kubenswrapper[4760]: I0930 07:34:09.861927 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:34:09Z","lastTransitionTime":"2025-09-30T07:34:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 07:34:09 crc kubenswrapper[4760]: I0930 07:34:09.965053 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:34:09 crc kubenswrapper[4760]: I0930 07:34:09.965090 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:34:09 crc kubenswrapper[4760]: I0930 07:34:09.965101 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:34:09 crc kubenswrapper[4760]: I0930 07:34:09.965117 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:34:09 crc kubenswrapper[4760]: I0930 07:34:09.965127 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:34:09Z","lastTransitionTime":"2025-09-30T07:34:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 07:34:10 crc kubenswrapper[4760]: I0930 07:34:10.066070 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 07:34:10 crc kubenswrapper[4760]: I0930 07:34:10.066083 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 07:34:10 crc kubenswrapper[4760]: I0930 07:34:10.066084 4760 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 07:34:10 crc kubenswrapper[4760]: E0930 07:34:10.066357 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 07:34:10 crc kubenswrapper[4760]: E0930 07:34:10.066439 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 07:34:10 crc kubenswrapper[4760]: E0930 07:34:10.066583 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 07:34:10 crc kubenswrapper[4760]: I0930 07:34:10.067703 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:34:10 crc kubenswrapper[4760]: I0930 07:34:10.067746 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:34:10 crc kubenswrapper[4760]: I0930 07:34:10.067764 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:34:10 crc kubenswrapper[4760]: I0930 07:34:10.067787 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:34:10 crc kubenswrapper[4760]: I0930 07:34:10.067807 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:34:10Z","lastTransitionTime":"2025-09-30T07:34:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 07:34:10 crc kubenswrapper[4760]: I0930 07:34:10.171658 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:34:10 crc kubenswrapper[4760]: I0930 07:34:10.171757 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:34:10 crc kubenswrapper[4760]: I0930 07:34:10.171778 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:34:10 crc kubenswrapper[4760]: I0930 07:34:10.171806 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:34:10 crc kubenswrapper[4760]: I0930 07:34:10.171828 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:34:10Z","lastTransitionTime":"2025-09-30T07:34:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 07:34:10 crc kubenswrapper[4760]: I0930 07:34:10.275353 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:34:10 crc kubenswrapper[4760]: I0930 07:34:10.275426 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:34:10 crc kubenswrapper[4760]: I0930 07:34:10.275448 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:34:10 crc kubenswrapper[4760]: I0930 07:34:10.275650 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:34:10 crc kubenswrapper[4760]: I0930 07:34:10.275668 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:34:10Z","lastTransitionTime":"2025-09-30T07:34:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 07:34:10 crc kubenswrapper[4760]: I0930 07:34:10.378889 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:34:10 crc kubenswrapper[4760]: I0930 07:34:10.378956 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:34:10 crc kubenswrapper[4760]: I0930 07:34:10.378974 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:34:10 crc kubenswrapper[4760]: I0930 07:34:10.378997 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:34:10 crc kubenswrapper[4760]: I0930 07:34:10.379015 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:34:10Z","lastTransitionTime":"2025-09-30T07:34:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 07:34:10 crc kubenswrapper[4760]: I0930 07:34:10.481864 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:34:10 crc kubenswrapper[4760]: I0930 07:34:10.481923 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:34:10 crc kubenswrapper[4760]: I0930 07:34:10.481963 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:34:10 crc kubenswrapper[4760]: I0930 07:34:10.481990 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:34:10 crc kubenswrapper[4760]: I0930 07:34:10.482007 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:34:10Z","lastTransitionTime":"2025-09-30T07:34:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 07:34:10 crc kubenswrapper[4760]: I0930 07:34:10.586181 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:34:10 crc kubenswrapper[4760]: I0930 07:34:10.586661 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:34:10 crc kubenswrapper[4760]: I0930 07:34:10.586688 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:34:10 crc kubenswrapper[4760]: I0930 07:34:10.586720 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:34:10 crc kubenswrapper[4760]: I0930 07:34:10.586742 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:34:10Z","lastTransitionTime":"2025-09-30T07:34:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 07:34:10 crc kubenswrapper[4760]: I0930 07:34:10.690328 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:34:10 crc kubenswrapper[4760]: I0930 07:34:10.690403 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:34:10 crc kubenswrapper[4760]: I0930 07:34:10.690427 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:34:10 crc kubenswrapper[4760]: I0930 07:34:10.690466 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:34:10 crc kubenswrapper[4760]: I0930 07:34:10.690490 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:34:10Z","lastTransitionTime":"2025-09-30T07:34:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 07:34:10 crc kubenswrapper[4760]: I0930 07:34:10.706910 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 07:34:10 crc kubenswrapper[4760]: E0930 07:34:10.707084 4760 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2025-09-30 07:34:26.707049325 +0000 UTC m=+52.349955777 (durationBeforeRetry 16s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 07:34:10 crc kubenswrapper[4760]: I0930 07:34:10.707207 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 07:34:10 crc kubenswrapper[4760]: I0930 07:34:10.707287 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 07:34:10 crc kubenswrapper[4760]: E0930 07:34:10.707464 4760 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Sep 30 07:34:10 crc kubenswrapper[4760]: E0930 07:34:10.707493 4760 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Sep 30 07:34:10 crc kubenswrapper[4760]: E0930 07:34:10.707557 4760 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-09-30 07:34:26.707531918 +0000 UTC m=+52.350438370 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Sep 30 07:34:10 crc kubenswrapper[4760]: E0930 07:34:10.707614 4760 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-09-30 07:34:26.707586059 +0000 UTC m=+52.350492511 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Sep 30 07:34:10 crc kubenswrapper[4760]: I0930 07:34:10.793794 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:34:10 crc kubenswrapper[4760]: I0930 07:34:10.793862 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:34:10 crc kubenswrapper[4760]: I0930 07:34:10.793881 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:34:10 crc kubenswrapper[4760]: I0930 07:34:10.793908 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:34:10 crc kubenswrapper[4760]: I0930 07:34:10.793929 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:34:10Z","lastTransitionTime":"2025-09-30T07:34:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 07:34:10 crc kubenswrapper[4760]: I0930 07:34:10.808568 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 07:34:10 crc kubenswrapper[4760]: I0930 07:34:10.808646 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 07:34:10 crc kubenswrapper[4760]: E0930 07:34:10.808780 4760 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Sep 30 07:34:10 crc kubenswrapper[4760]: E0930 07:34:10.808816 4760 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Sep 30 07:34:10 crc kubenswrapper[4760]: E0930 07:34:10.808837 4760 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Sep 30 07:34:10 crc kubenswrapper[4760]: E0930 07:34:10.808836 4760 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Sep 30 07:34:10 crc 
kubenswrapper[4760]: E0930 07:34:10.808872 4760 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Sep 30 07:34:10 crc kubenswrapper[4760]: E0930 07:34:10.808894 4760 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Sep 30 07:34:10 crc kubenswrapper[4760]: E0930 07:34:10.808922 4760 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-09-30 07:34:26.808896942 +0000 UTC m=+52.451803394 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Sep 30 07:34:10 crc kubenswrapper[4760]: E0930 07:34:10.808965 4760 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-09-30 07:34:26.808942273 +0000 UTC m=+52.451848775 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Sep 30 07:34:10 crc kubenswrapper[4760]: I0930 07:34:10.898370 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:34:10 crc kubenswrapper[4760]: I0930 07:34:10.898420 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:34:10 crc kubenswrapper[4760]: I0930 07:34:10.898438 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:34:10 crc kubenswrapper[4760]: I0930 07:34:10.898466 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:34:10 crc kubenswrapper[4760]: I0930 07:34:10.898484 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:34:10Z","lastTransitionTime":"2025-09-30T07:34:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 07:34:10 crc kubenswrapper[4760]: I0930 07:34:10.973032 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/network-metrics-daemon-wv8fz"] Sep 30 07:34:10 crc kubenswrapper[4760]: I0930 07:34:10.973788 4760 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-wv8fz" Sep 30 07:34:10 crc kubenswrapper[4760]: E0930 07:34:10.973885 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wv8fz" podUID="ce6dcf25-c8ea-450b-9fc6-9f8aeafde757" Sep 30 07:34:10 crc kubenswrapper[4760]: I0930 07:34:10.995939 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-lvdpk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f50c364e-d22c-4fe5-a0aa-66f4e8d8b21e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://96fd613333f4911d54aca94a38724570ef878c3f55ef48caa79eb1c14d0b2014\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev
@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8g8kp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"
phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T07:33:56Z\\\"}}\" for pod \"openshift-multus\"/\"multus-lvdpk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:34:10Z is after 2025-08-24T17:21:41Z" Sep 30 07:34:11 crc kubenswrapper[4760]: I0930 07:34:11.001293 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:34:11 crc kubenswrapper[4760]: I0930 07:34:11.001387 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:34:11 crc kubenswrapper[4760]: I0930 07:34:11.001406 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:34:11 crc kubenswrapper[4760]: I0930 07:34:11.001434 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:34:11 crc kubenswrapper[4760]: I0930 07:34:11.001453 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:34:11Z","lastTransitionTime":"2025-09-30T07:34:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 07:34:11 crc kubenswrapper[4760]: I0930 07:34:11.011757 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-rpvhp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a8530550-438d-46a5-aa3f-4b10838396f1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:34:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:34:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:34:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0434ec52328c8ee0a2b28fb47833cc6cf12a7048f4c4a217518e029e616fb6a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes
.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7j7qm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T07:33:58Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-rpvhp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:34:11Z is after 2025-08-24T17:21:41Z" Sep 30 07:34:11 crc kubenswrapper[4760]: I0930 07:34:11.028685 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:59Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:59Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fd70ee9ea129c67bb1a9ffac6e1d2abc903949ccd6d2d549f5d78472813dd222\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f
799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:34:11Z is after 2025-08-24T17:21:41Z" Sep 30 07:34:11 crc kubenswrapper[4760]: I0930 07:34:11.044533 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-f2lrk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7a9c8270-6964-4886-87d0-227b05b76da4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e3cf6cd0daf5df3e48c5d44b1a443b1e5646009b9eba42bc3869ee9841e472d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cctgt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://90d669a96085f6fb9b5c9507ff1d5c960bdeafa1
d01d7c115b89788e14e155d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cctgt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T07:33:56Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-f2lrk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:34:11Z is after 2025-08-24T17:21:41Z" Sep 30 07:34:11 crc kubenswrapper[4760]: I0930 07:34:11.064584 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"67985d3d-0af6-4a8b-b45a-623aed2e502e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43b97af91f474a3b43f207d377ca205e459b782e552a32a134a8b46e89fc8d30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d748511816054d5f621bd55e203baea2e6e5ac90dec796364b46224308cfd78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bf1b2ce86bf47ce917a20b48f64b7c4419e3a5c551d006d97171311c3b190f0c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e7c9b2cd9f096d99a8b239c0c2dc771e94e6522a3380733b7452969c9fe2dfd2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-09-30T07:33:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T07:33:35Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:34:11Z is after 2025-08-24T17:21:41Z" Sep 30 07:34:11 crc kubenswrapper[4760]: I0930 07:34:11.090614 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c5ac421ed3707970914cf165f54a9324134079296d1b6936332c6047b9c23132\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-09-30T07:34:11Z is after 2025-08-24T17:21:41Z" Sep 30 07:34:11 crc kubenswrapper[4760]: I0930 07:34:11.104713 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:34:11 crc kubenswrapper[4760]: I0930 07:34:11.104776 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:34:11 crc kubenswrapper[4760]: I0930 07:34:11.104791 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:34:11 crc kubenswrapper[4760]: I0930 07:34:11.104813 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:34:11 crc kubenswrapper[4760]: I0930 07:34:11.104827 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:34:11Z","lastTransitionTime":"2025-09-30T07:34:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 07:34:11 crc kubenswrapper[4760]: I0930 07:34:11.111334 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ce6dcf25-c8ea-450b-9fc6-9f8aeafde757-metrics-certs\") pod \"network-metrics-daemon-wv8fz\" (UID: \"ce6dcf25-c8ea-450b-9fc6-9f8aeafde757\") " pod="openshift-multus/network-metrics-daemon-wv8fz" Sep 30 07:34:11 crc kubenswrapper[4760]: I0930 07:34:11.111416 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xcwbs\" (UniqueName: \"kubernetes.io/projected/ce6dcf25-c8ea-450b-9fc6-9f8aeafde757-kube-api-access-xcwbs\") pod \"network-metrics-daemon-wv8fz\" (UID: \"ce6dcf25-c8ea-450b-9fc6-9f8aeafde757\") " pod="openshift-multus/network-metrics-daemon-wv8fz" Sep 30 07:34:11 crc kubenswrapper[4760]: I0930 07:34:11.111668 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:34:11Z is after 2025-08-24T17:21:41Z" Sep 30 07:34:11 crc kubenswrapper[4760]: I0930 07:34:11.130906 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:53Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:34:11Z is after 2025-08-24T17:21:41Z" Sep 30 07:34:11 crc kubenswrapper[4760]: I0930 07:34:11.158893 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-6vfjl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"aade7c8e-aa34-4b19-9000-d724950a70d7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:34:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:34:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:34:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8c7192e617554ba6b5b9533792f166344fbf9681b63e57c180809e721cc18168\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:34:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-df7r7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e9c7c74e636ef9dbade2541863f96b1098f7ae5f1ab02740c8934f8c87c4815e\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e9c7c74e636ef9dbade2541863f96b1098f7ae5f1ab02740c8934f8c87c4815e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T07:33:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T07:33:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-df7r7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7abb316d62bc2246ace40ad59b2926962a1c098c5690adea8c20a5b96459a604\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7abb316d62bc2246ace40ad59b2926962a1c098c5690adea8c20a5b96459a604\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T07:33:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T07:33:58Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-df7r7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://96c45841896c0c1569ad3071a3114e8c4965cc3968244c91719bcf5df92e5d0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://96c45841896c0c1569ad3071a3114e8c4965cc3968244c91719bcf5df92e5d0f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T07:33:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T07:33:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-df7r7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://db222
0320618b4dcdbf56bad96dbe01c910fa37948f97792f498b3336212cc7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://db2220320618b4dcdbf56bad96dbe01c910fa37948f97792f498b3336212cc7f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T07:34:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T07:34:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-df7r7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5980d5ebcac5351bfa8c26b90968593f3fae3d434e824db5a182449ef64b0a13\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5980d5ebcac5351bfa8c26b90968593f3fae3d434e824db5a182449ef64b0a13\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T07:34:01Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-09-30T07:34:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-df7r7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5912656a198607c9831da9564b5507d662eb2df00df104d52cf058442a76be6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5912656a198607c9831da9564b5507d662eb2df00df104d52cf058442a76be6f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T07:34:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T07:34:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-df7r7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T07:33:56Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-6vfjl\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:34:11Z is after 2025-08-24T17:21:41Z" Sep 30 07:34:11 crc kubenswrapper[4760]: I0930 07:34:11.195636 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9092fdd9-f142-400d-b446-47b8103b9694\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4cc3328dfae8f6d99c152e6a53b86d72f690afbd7bc4847810f745b321e4f3ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests
\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5aaa898c76100f011c3ad0ba61c9735a940b46b0e770f3ede7fc4c881a4722a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b718c4a73fd119f66dd346618976f0b3589b789576e53d367588c2772becd66e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d00b9a56c23ba3b5
701a31412e038c073b5f16eb4e6ebea42134f7d4c0b0f21\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://00d03c8ff6fa9f490ec0550ac57d20786f1f5c10d91011d678116c2249a79e88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ed4236004989a888f1f864535bedd2dc4bcad4c51c87f12be074abbe5f8fe8bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e3
3e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ed4236004989a888f1f864535bedd2dc4bcad4c51c87f12be074abbe5f8fe8bd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T07:33:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T07:33:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://553fa289f27580c1c38da1e9ea12ae7231fc4f60c660a9a3a177391adba9c3f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://553fa289f27580c1c38da1e9ea12ae7231fc4f60c660a9a3a177391adba9c3f2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T07:33:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T07:33:37Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://83b9ee5e77d60387298ccac46ad416a5d56bfd1d81a6ef7f3ea0482520c2ca08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\
":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://83b9ee5e77d60387298ccac46ad416a5d56bfd1d81a6ef7f3ea0482520c2ca08\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T07:33:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T07:33:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T07:33:35Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:34:11Z is after 2025-08-24T17:21:41Z" Sep 30 07:34:11 crc kubenswrapper[4760]: I0930 07:34:11.208172 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:34:11 crc kubenswrapper[4760]: I0930 07:34:11.208244 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:34:11 crc kubenswrapper[4760]: I0930 07:34:11.208263 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:34:11 crc kubenswrapper[4760]: I0930 07:34:11.208291 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:34:11 crc kubenswrapper[4760]: I0930 07:34:11.208365 4760 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:34:11Z","lastTransitionTime":"2025-09-30T07:34:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 07:34:11 crc kubenswrapper[4760]: I0930 07:34:11.212898 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ce6dcf25-c8ea-450b-9fc6-9f8aeafde757-metrics-certs\") pod \"network-metrics-daemon-wv8fz\" (UID: \"ce6dcf25-c8ea-450b-9fc6-9f8aeafde757\") " pod="openshift-multus/network-metrics-daemon-wv8fz" Sep 30 07:34:11 crc kubenswrapper[4760]: I0930 07:34:11.212992 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xcwbs\" (UniqueName: \"kubernetes.io/projected/ce6dcf25-c8ea-450b-9fc6-9f8aeafde757-kube-api-access-xcwbs\") pod \"network-metrics-daemon-wv8fz\" (UID: \"ce6dcf25-c8ea-450b-9fc6-9f8aeafde757\") " pod="openshift-multus/network-metrics-daemon-wv8fz" Sep 30 07:34:11 crc kubenswrapper[4760]: E0930 07:34:11.213067 4760 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Sep 30 07:34:11 crc kubenswrapper[4760]: E0930 07:34:11.213162 4760 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ce6dcf25-c8ea-450b-9fc6-9f8aeafde757-metrics-certs podName:ce6dcf25-c8ea-450b-9fc6-9f8aeafde757 nodeName:}" failed. No retries permitted until 2025-09-30 07:34:11.713135179 +0000 UTC m=+37.356041621 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/ce6dcf25-c8ea-450b-9fc6-9f8aeafde757-metrics-certs") pod "network-metrics-daemon-wv8fz" (UID: "ce6dcf25-c8ea-450b-9fc6-9f8aeafde757") : object "openshift-multus"/"metrics-daemon-secret" not registered Sep 30 07:34:11 crc kubenswrapper[4760]: I0930 07:34:11.218046 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"031672b4-4779-4226-97aa-f5c81a729234\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b3e26d92b5602604d33e83e83e677084c642df64c9d0b9f9e022b036091fb92e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mo
untPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://53390cb3cc86d0db1019c461bf6cb987190e6fa8812f735844028dfa1ce96648\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf87f93832356c96c1c5e90e6459ccdd23700fd12c6d7410305c66f99f6ed18f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e4230b2df5bf87138f5a40f146bef562ae63ec6f5b73d023e4e2b60722854882\\\",\\\"image\\\":\\\"quay
.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9c416bd4523f4384b4a2283eb2ada263982380925cbc17713726e0790e5fdee\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:38Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2cd9c18868b27fcfb67186a86f415b484f009c33c07d18f5d508155ed011b5d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"termin
ated\\\":{\\\"containerID\\\":\\\"cri-o://2cd9c18868b27fcfb67186a86f415b484f009c33c07d18f5d508155ed011b5d4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T07:33:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T07:33:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T07:33:35Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:34:11Z is after 2025-08-24T17:21:41Z" Sep 30 07:34:11 crc kubenswrapper[4760]: I0930 07:34:11.241857 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xcwbs\" (UniqueName: \"kubernetes.io/projected/ce6dcf25-c8ea-450b-9fc6-9f8aeafde757-kube-api-access-xcwbs\") pod \"network-metrics-daemon-wv8fz\" (UID: \"ce6dcf25-c8ea-450b-9fc6-9f8aeafde757\") " pod="openshift-multus/network-metrics-daemon-wv8fz" Sep 30 07:34:11 crc kubenswrapper[4760]: I0930 07:34:11.256029 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-sspvl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2c4ca8ea-a714-40e5-9e10-080aef32237b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:56Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:56Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04cb9ff2859a579a0bd97d932177b8c19b786666d0177669e93ccec38abf08d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lllfc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9e64e8566880e723a6cc19ad788bd1cbb02eb639397aea4acd6e9920b034f379\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lllfc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0b6c1fe1d21cc0d118e36d7f9d62fb639a19a42cd8c953700bcd5c289088cdf9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lllfc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bba81e98fd1a75081b7e6a3eaf28b88b4de6e47e1dc158ca72aa2f1c2107933e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:58Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lllfc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://45d8c685d7b690ce81553e4d924934cf761087dd535145c0e8b5a933f2d61e45\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lllfc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef1f6b6ca1a2bbb2f9a166a7244c6411470be4655bc6b2a330fd604c3449d185\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lllfc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e2eb770a63f2bb8de1ec68ee527d04f64bb32a66016581c3acfc665d9bb60784\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e2eb770a63f2bb8de1ec68ee527d04f64bb32a66016581c3acfc665d9bb60784\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-09-30T07:34:07Z\\\",\\\"message\\\":\\\"vices.LBOpts{Reject:false, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{}, Templates:services.TemplateMap{}, Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}, built lbs: 
[]services.LB{services.LB{Name:\\\\\\\"Service_openshift-ingress/router-internal-default_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-ingress/router-internal-default\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.4.176\\\\\\\", Port:80, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}, services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.4.176\\\\\\\", Port:443, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}, services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.4.176\\\\\\\", Port:1936, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI0930 07:34:07.323120 6145 obj_retry.go:365] Adding new object: *v1.Pod openshift-kube-controller-manager/kube-controller-manager-crc\\\\nI0930 07:34:07.323753 6145 ovn.go:134] Ensuring zone \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T07:34:06Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-sspvl_openshift-ovn-kubernetes(2c4ca8ea-a714-40e5-9e10-080aef32237b)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lllfc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e72f37942962598d925dfb305311c94c3ca920f93243bb6e97839add4a830a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:34:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lllfc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1198d13e2814e4dc38c4411d751fba40440a5967d7ca04c855e34fe90f9a6535\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1198d13e2814e4dc38
c4411d751fba40440a5967d7ca04c855e34fe90f9a6535\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T07:33:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T07:33:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lllfc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T07:33:56Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-sspvl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:34:11Z is after 2025-08-24T17:21:41Z" Sep 30 07:34:11 crc kubenswrapper[4760]: I0930 07:34:11.277986 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:34:11Z is after 2025-08-24T17:21:41Z" Sep 30 07:34:11 crc kubenswrapper[4760]: I0930 07:34:11.294781 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-sv6wk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4a8035f8-210d-4a09-bca5-274ced93774c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://774d38a9cedeb69a2093a9de05db794e856295535f86d73a75c6feea05b61cb3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6pk8k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T07:33:56Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-sv6wk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:34:11Z is after 2025-08-24T17:21:41Z" Sep 30 07:34:11 crc kubenswrapper[4760]: I0930 07:34:11.312427 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:34:11 crc kubenswrapper[4760]: I0930 07:34:11.312531 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:34:11 crc kubenswrapper[4760]: I0930 07:34:11.312602 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:34:11 crc kubenswrapper[4760]: I0930 07:34:11.312642 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:34:11 crc kubenswrapper[4760]: I0930 07:34:11.312667 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:34:11Z","lastTransitionTime":"2025-09-30T07:34:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 07:34:11 crc kubenswrapper[4760]: I0930 07:34:11.314525 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-lqpj4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2b6201d8-80fd-4701-a4a8-f7ebca1f34ad\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:34:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:34:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:34:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:34:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://09a33e356ac5e20ea894804e56d62f6220b8b9a5123d1b0acc9bbe33a3083792\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:34:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled
\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dc5j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://722e623d70449c90fb388b89511f1e75ed55015ec9caa45d9aaa9ac3d9649778\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:34:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dc5j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T07:34:08Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-lqpj4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:34:11Z is after 2025-08-24T17:21:41Z" Sep 30 07:34:11 crc kubenswrapper[4760]: I0930 07:34:11.332752 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-wv8fz" err="failed to 
patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ce6dcf25-c8ea-450b-9fc6-9f8aeafde757\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:34:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:34:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:34:10Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:34:10Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcwbs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcwbs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T07:34:10Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-wv8fz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:34:11Z is after 2025-08-24T17:21:41Z" Sep 30 07:34:11 crc 
kubenswrapper[4760]: I0930 07:34:11.352835 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c5ae239e08e07bda9532c0b6b94ef3687cf512cd05965e09123bbb036f915a5b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3abb25619d0c80e84a2095aba45fe6a865125d242e44e9f10820e8c41a3b0db9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art
-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:34:11Z is after 2025-08-24T17:21:41Z" Sep 30 07:34:11 crc kubenswrapper[4760]: I0930 07:34:11.416485 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:34:11 crc kubenswrapper[4760]: I0930 07:34:11.416578 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:34:11 crc kubenswrapper[4760]: I0930 07:34:11.416633 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:34:11 crc kubenswrapper[4760]: I0930 07:34:11.416657 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:34:11 crc kubenswrapper[4760]: I0930 
07:34:11.416691 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:34:11Z","lastTransitionTime":"2025-09-30T07:34:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 07:34:11 crc kubenswrapper[4760]: I0930 07:34:11.519473 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:34:11 crc kubenswrapper[4760]: I0930 07:34:11.519512 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:34:11 crc kubenswrapper[4760]: I0930 07:34:11.519524 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:34:11 crc kubenswrapper[4760]: I0930 07:34:11.519541 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:34:11 crc kubenswrapper[4760]: I0930 07:34:11.519552 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:34:11Z","lastTransitionTime":"2025-09-30T07:34:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 07:34:11 crc kubenswrapper[4760]: I0930 07:34:11.617461 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:34:11 crc kubenswrapper[4760]: I0930 07:34:11.617532 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:34:11 crc kubenswrapper[4760]: I0930 07:34:11.617544 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:34:11 crc kubenswrapper[4760]: I0930 07:34:11.617581 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:34:11 crc kubenswrapper[4760]: I0930 07:34:11.617598 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:34:11Z","lastTransitionTime":"2025-09-30T07:34:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 07:34:11 crc kubenswrapper[4760]: E0930 07:34:11.634215 4760 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T07:34:11Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T07:34:11Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T07:34:11Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T07:34:11Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T07:34:11Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T07:34:11Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T07:34:11Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T07:34:11Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"5c02496b-3bdb-4a64-91fc-57c59208ba25\\\",\\\"systemUUID\\\":\\\"ef0efcac-382e-4544-a77e-6adf149d5981\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:34:11Z is after 2025-08-24T17:21:41Z" Sep 30 07:34:11 crc kubenswrapper[4760]: I0930 07:34:11.639808 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:34:11 crc kubenswrapper[4760]: I0930 07:34:11.639856 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:34:11 crc kubenswrapper[4760]: I0930 07:34:11.639868 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:34:11 crc kubenswrapper[4760]: I0930 07:34:11.639892 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:34:11 crc kubenswrapper[4760]: I0930 07:34:11.639906 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:34:11Z","lastTransitionTime":"2025-09-30T07:34:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 07:34:11 crc kubenswrapper[4760]: E0930 07:34:11.655460 4760 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T07:34:11Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T07:34:11Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T07:34:11Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T07:34:11Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T07:34:11Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T07:34:11Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T07:34:11Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T07:34:11Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"5c02496b-3bdb-4a64-91fc-57c59208ba25\\\",\\\"systemUUID\\\":\\\"ef0efcac-382e-4544-a77e-6adf149d5981\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:34:11Z is after 2025-08-24T17:21:41Z" Sep 30 07:34:11 crc kubenswrapper[4760]: I0930 07:34:11.660077 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:34:11 crc kubenswrapper[4760]: I0930 07:34:11.660133 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:34:11 crc kubenswrapper[4760]: I0930 07:34:11.660146 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:34:11 crc kubenswrapper[4760]: I0930 07:34:11.660166 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:34:11 crc kubenswrapper[4760]: I0930 07:34:11.660179 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:34:11Z","lastTransitionTime":"2025-09-30T07:34:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 07:34:11 crc kubenswrapper[4760]: E0930 07:34:11.682392 4760 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:34:11Z is after 2025-08-24T17:21:41Z" Sep 30 07:34:11 crc kubenswrapper[4760]: I0930 07:34:11.687644 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:34:11 crc kubenswrapper[4760]: I0930 07:34:11.687700 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:34:11 crc kubenswrapper[4760]: I0930 07:34:11.687721 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:34:11 crc kubenswrapper[4760]: I0930 07:34:11.687746 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:34:11 crc kubenswrapper[4760]: I0930 07:34:11.687763 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:34:11Z","lastTransitionTime":"2025-09-30T07:34:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 07:34:11 crc kubenswrapper[4760]: E0930 07:34:11.707433 4760 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:34:11Z is after 2025-08-24T17:21:41Z" Sep 30 07:34:11 crc kubenswrapper[4760]: I0930 07:34:11.712343 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:34:11 crc kubenswrapper[4760]: I0930 07:34:11.712398 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:34:11 crc kubenswrapper[4760]: I0930 07:34:11.712411 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:34:11 crc kubenswrapper[4760]: I0930 07:34:11.712433 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:34:11 crc kubenswrapper[4760]: I0930 07:34:11.712451 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:34:11Z","lastTransitionTime":"2025-09-30T07:34:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 07:34:11 crc kubenswrapper[4760]: I0930 07:34:11.718525 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ce6dcf25-c8ea-450b-9fc6-9f8aeafde757-metrics-certs\") pod \"network-metrics-daemon-wv8fz\" (UID: \"ce6dcf25-c8ea-450b-9fc6-9f8aeafde757\") " pod="openshift-multus/network-metrics-daemon-wv8fz" Sep 30 07:34:11 crc kubenswrapper[4760]: E0930 07:34:11.718719 4760 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Sep 30 07:34:11 crc kubenswrapper[4760]: E0930 07:34:11.718849 4760 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ce6dcf25-c8ea-450b-9fc6-9f8aeafde757-metrics-certs podName:ce6dcf25-c8ea-450b-9fc6-9f8aeafde757 nodeName:}" failed. No retries permitted until 2025-09-30 07:34:12.718818663 +0000 UTC m=+38.361725245 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/ce6dcf25-c8ea-450b-9fc6-9f8aeafde757-metrics-certs") pod "network-metrics-daemon-wv8fz" (UID: "ce6dcf25-c8ea-450b-9fc6-9f8aeafde757") : object "openshift-multus"/"metrics-daemon-secret" not registered Sep 30 07:34:11 crc kubenswrapper[4760]: E0930 07:34:11.733089 4760 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T07:34:11Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T07:34:11Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T07:34:11Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T07:34:11Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T07:34:11Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T07:34:11Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T07:34:11Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T07:34:11Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb4
9c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\"
:[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d4
6c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\
\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"5c02496b-3bdb-4a64-91fc-5
7c59208ba25\\\",\\\"systemUUID\\\":\\\"ef0efcac-382e-4544-a77e-6adf149d5981\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:34:11Z is after 2025-08-24T17:21:41Z" Sep 30 07:34:11 crc kubenswrapper[4760]: E0930 07:34:11.733263 4760 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Sep 30 07:34:11 crc kubenswrapper[4760]: I0930 07:34:11.736138 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:34:11 crc kubenswrapper[4760]: I0930 07:34:11.736219 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:34:11 crc kubenswrapper[4760]: I0930 07:34:11.736237 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:34:11 crc kubenswrapper[4760]: I0930 07:34:11.736257 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:34:11 crc kubenswrapper[4760]: I0930 07:34:11.736271 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:34:11Z","lastTransitionTime":"2025-09-30T07:34:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 07:34:11 crc kubenswrapper[4760]: I0930 07:34:11.839413 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:34:11 crc kubenswrapper[4760]: I0930 07:34:11.839508 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:34:11 crc kubenswrapper[4760]: I0930 07:34:11.839527 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:34:11 crc kubenswrapper[4760]: I0930 07:34:11.839553 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:34:11 crc kubenswrapper[4760]: I0930 07:34:11.839575 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:34:11Z","lastTransitionTime":"2025-09-30T07:34:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 07:34:11 crc kubenswrapper[4760]: I0930 07:34:11.942668 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:34:11 crc kubenswrapper[4760]: I0930 07:34:11.942746 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:34:11 crc kubenswrapper[4760]: I0930 07:34:11.942764 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:34:11 crc kubenswrapper[4760]: I0930 07:34:11.942793 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:34:11 crc kubenswrapper[4760]: I0930 07:34:11.942811 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:34:11Z","lastTransitionTime":"2025-09-30T07:34:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 07:34:12 crc kubenswrapper[4760]: I0930 07:34:12.048677 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:34:12 crc kubenswrapper[4760]: I0930 07:34:12.048759 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:34:12 crc kubenswrapper[4760]: I0930 07:34:12.048779 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:34:12 crc kubenswrapper[4760]: I0930 07:34:12.048810 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:34:12 crc kubenswrapper[4760]: I0930 07:34:12.048865 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:34:12Z","lastTransitionTime":"2025-09-30T07:34:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 07:34:12 crc kubenswrapper[4760]: I0930 07:34:12.066141 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 07:34:12 crc kubenswrapper[4760]: I0930 07:34:12.066246 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 07:34:12 crc kubenswrapper[4760]: I0930 07:34:12.066264 4760 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 07:34:12 crc kubenswrapper[4760]: E0930 07:34:12.066403 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 07:34:12 crc kubenswrapper[4760]: E0930 07:34:12.066498 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 07:34:12 crc kubenswrapper[4760]: E0930 07:34:12.066622 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 07:34:12 crc kubenswrapper[4760]: I0930 07:34:12.152860 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:34:12 crc kubenswrapper[4760]: I0930 07:34:12.152934 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:34:12 crc kubenswrapper[4760]: I0930 07:34:12.152953 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:34:12 crc kubenswrapper[4760]: I0930 07:34:12.152978 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:34:12 crc kubenswrapper[4760]: I0930 07:34:12.152997 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:34:12Z","lastTransitionTime":"2025-09-30T07:34:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 07:34:12 crc kubenswrapper[4760]: I0930 07:34:12.256532 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:34:12 crc kubenswrapper[4760]: I0930 07:34:12.256595 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:34:12 crc kubenswrapper[4760]: I0930 07:34:12.256608 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:34:12 crc kubenswrapper[4760]: I0930 07:34:12.256631 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:34:12 crc kubenswrapper[4760]: I0930 07:34:12.256649 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:34:12Z","lastTransitionTime":"2025-09-30T07:34:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 07:34:12 crc kubenswrapper[4760]: I0930 07:34:12.359963 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:34:12 crc kubenswrapper[4760]: I0930 07:34:12.360010 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:34:12 crc kubenswrapper[4760]: I0930 07:34:12.360022 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:34:12 crc kubenswrapper[4760]: I0930 07:34:12.360041 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:34:12 crc kubenswrapper[4760]: I0930 07:34:12.360056 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:34:12Z","lastTransitionTime":"2025-09-30T07:34:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 07:34:12 crc kubenswrapper[4760]: I0930 07:34:12.463696 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:34:12 crc kubenswrapper[4760]: I0930 07:34:12.463753 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:34:12 crc kubenswrapper[4760]: I0930 07:34:12.463771 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:34:12 crc kubenswrapper[4760]: I0930 07:34:12.463796 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:34:12 crc kubenswrapper[4760]: I0930 07:34:12.463815 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:34:12Z","lastTransitionTime":"2025-09-30T07:34:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 07:34:12 crc kubenswrapper[4760]: I0930 07:34:12.566964 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:34:12 crc kubenswrapper[4760]: I0930 07:34:12.567049 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:34:12 crc kubenswrapper[4760]: I0930 07:34:12.567069 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:34:12 crc kubenswrapper[4760]: I0930 07:34:12.567096 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:34:12 crc kubenswrapper[4760]: I0930 07:34:12.567116 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:34:12Z","lastTransitionTime":"2025-09-30T07:34:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 07:34:12 crc kubenswrapper[4760]: I0930 07:34:12.670610 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:34:12 crc kubenswrapper[4760]: I0930 07:34:12.670675 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:34:12 crc kubenswrapper[4760]: I0930 07:34:12.670697 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:34:12 crc kubenswrapper[4760]: I0930 07:34:12.670747 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:34:12 crc kubenswrapper[4760]: I0930 07:34:12.670771 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:34:12Z","lastTransitionTime":"2025-09-30T07:34:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 07:34:12 crc kubenswrapper[4760]: I0930 07:34:12.727917 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ce6dcf25-c8ea-450b-9fc6-9f8aeafde757-metrics-certs\") pod \"network-metrics-daemon-wv8fz\" (UID: \"ce6dcf25-c8ea-450b-9fc6-9f8aeafde757\") " pod="openshift-multus/network-metrics-daemon-wv8fz" Sep 30 07:34:12 crc kubenswrapper[4760]: E0930 07:34:12.728152 4760 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Sep 30 07:34:12 crc kubenswrapper[4760]: E0930 07:34:12.728270 4760 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ce6dcf25-c8ea-450b-9fc6-9f8aeafde757-metrics-certs podName:ce6dcf25-c8ea-450b-9fc6-9f8aeafde757 nodeName:}" failed. No retries permitted until 2025-09-30 07:34:14.728238971 +0000 UTC m=+40.371145433 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/ce6dcf25-c8ea-450b-9fc6-9f8aeafde757-metrics-certs") pod "network-metrics-daemon-wv8fz" (UID: "ce6dcf25-c8ea-450b-9fc6-9f8aeafde757") : object "openshift-multus"/"metrics-daemon-secret" not registered Sep 30 07:34:12 crc kubenswrapper[4760]: I0930 07:34:12.774529 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:34:12 crc kubenswrapper[4760]: I0930 07:34:12.774591 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:34:12 crc kubenswrapper[4760]: I0930 07:34:12.774609 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:34:12 crc kubenswrapper[4760]: I0930 07:34:12.774657 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:34:12 crc kubenswrapper[4760]: I0930 07:34:12.774677 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:34:12Z","lastTransitionTime":"2025-09-30T07:34:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 07:34:12 crc kubenswrapper[4760]: I0930 07:34:12.878074 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:34:12 crc kubenswrapper[4760]: I0930 07:34:12.878138 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:34:12 crc kubenswrapper[4760]: I0930 07:34:12.878160 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:34:12 crc kubenswrapper[4760]: I0930 07:34:12.878187 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:34:12 crc kubenswrapper[4760]: I0930 07:34:12.878207 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:34:12Z","lastTransitionTime":"2025-09-30T07:34:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 07:34:12 crc kubenswrapper[4760]: I0930 07:34:12.981852 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:34:12 crc kubenswrapper[4760]: I0930 07:34:12.981929 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:34:12 crc kubenswrapper[4760]: I0930 07:34:12.981950 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:34:12 crc kubenswrapper[4760]: I0930 07:34:12.981976 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:34:12 crc kubenswrapper[4760]: I0930 07:34:12.981993 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:34:12Z","lastTransitionTime":"2025-09-30T07:34:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 07:34:13 crc kubenswrapper[4760]: I0930 07:34:13.067082 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wv8fz" Sep 30 07:34:13 crc kubenswrapper[4760]: E0930 07:34:13.067272 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-wv8fz" podUID="ce6dcf25-c8ea-450b-9fc6-9f8aeafde757" Sep 30 07:34:13 crc kubenswrapper[4760]: I0930 07:34:13.086522 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:34:13 crc kubenswrapper[4760]: I0930 07:34:13.086604 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:34:13 crc kubenswrapper[4760]: I0930 07:34:13.086623 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:34:13 crc kubenswrapper[4760]: I0930 07:34:13.086655 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:34:13 crc kubenswrapper[4760]: I0930 07:34:13.086676 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:34:13Z","lastTransitionTime":"2025-09-30T07:34:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 07:34:13 crc kubenswrapper[4760]: I0930 07:34:13.190392 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:34:13 crc kubenswrapper[4760]: I0930 07:34:13.190451 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:34:13 crc kubenswrapper[4760]: I0930 07:34:13.190464 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:34:13 crc kubenswrapper[4760]: I0930 07:34:13.190487 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:34:13 crc kubenswrapper[4760]: I0930 07:34:13.190502 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:34:13Z","lastTransitionTime":"2025-09-30T07:34:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 07:34:13 crc kubenswrapper[4760]: I0930 07:34:13.294506 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:34:13 crc kubenswrapper[4760]: I0930 07:34:13.294910 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:34:13 crc kubenswrapper[4760]: I0930 07:34:13.295119 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:34:13 crc kubenswrapper[4760]: I0930 07:34:13.295368 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:34:13 crc kubenswrapper[4760]: I0930 07:34:13.295543 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:34:13Z","lastTransitionTime":"2025-09-30T07:34:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 07:34:13 crc kubenswrapper[4760]: I0930 07:34:13.398826 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:34:13 crc kubenswrapper[4760]: I0930 07:34:13.398868 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:34:13 crc kubenswrapper[4760]: I0930 07:34:13.398880 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:34:13 crc kubenswrapper[4760]: I0930 07:34:13.398897 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:34:13 crc kubenswrapper[4760]: I0930 07:34:13.398912 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:34:13Z","lastTransitionTime":"2025-09-30T07:34:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 07:34:13 crc kubenswrapper[4760]: I0930 07:34:13.501160 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:34:13 crc kubenswrapper[4760]: I0930 07:34:13.501539 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:34:13 crc kubenswrapper[4760]: I0930 07:34:13.501629 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:34:13 crc kubenswrapper[4760]: I0930 07:34:13.501750 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:34:13 crc kubenswrapper[4760]: I0930 07:34:13.501829 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:34:13Z","lastTransitionTime":"2025-09-30T07:34:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 07:34:13 crc kubenswrapper[4760]: I0930 07:34:13.605071 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:34:13 crc kubenswrapper[4760]: I0930 07:34:13.605133 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:34:13 crc kubenswrapper[4760]: I0930 07:34:13.605151 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:34:13 crc kubenswrapper[4760]: I0930 07:34:13.605180 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:34:13 crc kubenswrapper[4760]: I0930 07:34:13.605200 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:34:13Z","lastTransitionTime":"2025-09-30T07:34:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 07:34:13 crc kubenswrapper[4760]: I0930 07:34:13.707441 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:34:13 crc kubenswrapper[4760]: I0930 07:34:13.707858 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:34:13 crc kubenswrapper[4760]: I0930 07:34:13.708105 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:34:13 crc kubenswrapper[4760]: I0930 07:34:13.708358 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:34:13 crc kubenswrapper[4760]: I0930 07:34:13.708609 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:34:13Z","lastTransitionTime":"2025-09-30T07:34:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 07:34:13 crc kubenswrapper[4760]: I0930 07:34:13.811382 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:34:13 crc kubenswrapper[4760]: I0930 07:34:13.811846 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:34:13 crc kubenswrapper[4760]: I0930 07:34:13.811983 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:34:13 crc kubenswrapper[4760]: I0930 07:34:13.812132 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:34:13 crc kubenswrapper[4760]: I0930 07:34:13.812265 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:34:13Z","lastTransitionTime":"2025-09-30T07:34:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 07:34:13 crc kubenswrapper[4760]: I0930 07:34:13.915557 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:34:13 crc kubenswrapper[4760]: I0930 07:34:13.915644 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:34:13 crc kubenswrapper[4760]: I0930 07:34:13.915680 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:34:13 crc kubenswrapper[4760]: I0930 07:34:13.915707 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:34:13 crc kubenswrapper[4760]: I0930 07:34:13.915725 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:34:13Z","lastTransitionTime":"2025-09-30T07:34:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 07:34:14 crc kubenswrapper[4760]: I0930 07:34:14.019487 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:34:14 crc kubenswrapper[4760]: I0930 07:34:14.019958 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:34:14 crc kubenswrapper[4760]: I0930 07:34:14.020115 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:34:14 crc kubenswrapper[4760]: I0930 07:34:14.020280 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:34:14 crc kubenswrapper[4760]: I0930 07:34:14.020444 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:34:14Z","lastTransitionTime":"2025-09-30T07:34:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 07:34:14 crc kubenswrapper[4760]: I0930 07:34:14.066378 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 07:34:14 crc kubenswrapper[4760]: E0930 07:34:14.066576 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 07:34:14 crc kubenswrapper[4760]: I0930 07:34:14.066417 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 07:34:14 crc kubenswrapper[4760]: E0930 07:34:14.066690 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 07:34:14 crc kubenswrapper[4760]: I0930 07:34:14.066378 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 07:34:14 crc kubenswrapper[4760]: E0930 07:34:14.066779 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 07:34:14 crc kubenswrapper[4760]: I0930 07:34:14.123064 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:34:14 crc kubenswrapper[4760]: I0930 07:34:14.123500 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:34:14 crc kubenswrapper[4760]: I0930 07:34:14.123666 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:34:14 crc kubenswrapper[4760]: I0930 07:34:14.123808 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:34:14 crc kubenswrapper[4760]: I0930 07:34:14.123950 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:34:14Z","lastTransitionTime":"2025-09-30T07:34:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 07:34:14 crc kubenswrapper[4760]: I0930 07:34:14.227343 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:34:14 crc kubenswrapper[4760]: I0930 07:34:14.227439 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:34:14 crc kubenswrapper[4760]: I0930 07:34:14.227465 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:34:14 crc kubenswrapper[4760]: I0930 07:34:14.227497 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:34:14 crc kubenswrapper[4760]: I0930 07:34:14.227518 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:34:14Z","lastTransitionTime":"2025-09-30T07:34:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 07:34:14 crc kubenswrapper[4760]: I0930 07:34:14.331284 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:34:14 crc kubenswrapper[4760]: I0930 07:34:14.331379 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:34:14 crc kubenswrapper[4760]: I0930 07:34:14.331399 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:34:14 crc kubenswrapper[4760]: I0930 07:34:14.331610 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:34:14 crc kubenswrapper[4760]: I0930 07:34:14.331629 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:34:14Z","lastTransitionTime":"2025-09-30T07:34:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 07:34:14 crc kubenswrapper[4760]: I0930 07:34:14.435568 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:34:14 crc kubenswrapper[4760]: I0930 07:34:14.435659 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:34:14 crc kubenswrapper[4760]: I0930 07:34:14.435689 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:34:14 crc kubenswrapper[4760]: I0930 07:34:14.435724 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:34:14 crc kubenswrapper[4760]: I0930 07:34:14.435746 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:34:14Z","lastTransitionTime":"2025-09-30T07:34:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 07:34:14 crc kubenswrapper[4760]: I0930 07:34:14.538608 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:34:14 crc kubenswrapper[4760]: I0930 07:34:14.538672 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:34:14 crc kubenswrapper[4760]: I0930 07:34:14.538690 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:34:14 crc kubenswrapper[4760]: I0930 07:34:14.538713 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:34:14 crc kubenswrapper[4760]: I0930 07:34:14.538729 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:34:14Z","lastTransitionTime":"2025-09-30T07:34:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 07:34:14 crc kubenswrapper[4760]: I0930 07:34:14.642658 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:34:14 crc kubenswrapper[4760]: I0930 07:34:14.642730 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:34:14 crc kubenswrapper[4760]: I0930 07:34:14.642753 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:34:14 crc kubenswrapper[4760]: I0930 07:34:14.642783 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:34:14 crc kubenswrapper[4760]: I0930 07:34:14.642804 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:34:14Z","lastTransitionTime":"2025-09-30T07:34:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 07:34:14 crc kubenswrapper[4760]: I0930 07:34:14.746541 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:34:14 crc kubenswrapper[4760]: I0930 07:34:14.746600 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:34:14 crc kubenswrapper[4760]: I0930 07:34:14.746614 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:34:14 crc kubenswrapper[4760]: I0930 07:34:14.746632 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:34:14 crc kubenswrapper[4760]: I0930 07:34:14.746643 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:34:14Z","lastTransitionTime":"2025-09-30T07:34:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 07:34:14 crc kubenswrapper[4760]: I0930 07:34:14.746859 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ce6dcf25-c8ea-450b-9fc6-9f8aeafde757-metrics-certs\") pod \"network-metrics-daemon-wv8fz\" (UID: \"ce6dcf25-c8ea-450b-9fc6-9f8aeafde757\") " pod="openshift-multus/network-metrics-daemon-wv8fz" Sep 30 07:34:14 crc kubenswrapper[4760]: E0930 07:34:14.747057 4760 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Sep 30 07:34:14 crc kubenswrapper[4760]: E0930 07:34:14.747125 4760 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ce6dcf25-c8ea-450b-9fc6-9f8aeafde757-metrics-certs podName:ce6dcf25-c8ea-450b-9fc6-9f8aeafde757 nodeName:}" failed. No retries permitted until 2025-09-30 07:34:18.747102556 +0000 UTC m=+44.390009018 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/ce6dcf25-c8ea-450b-9fc6-9f8aeafde757-metrics-certs") pod "network-metrics-daemon-wv8fz" (UID: "ce6dcf25-c8ea-450b-9fc6-9f8aeafde757") : object "openshift-multus"/"metrics-daemon-secret" not registered Sep 30 07:34:14 crc kubenswrapper[4760]: I0930 07:34:14.850436 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:34:14 crc kubenswrapper[4760]: I0930 07:34:14.850516 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:34:14 crc kubenswrapper[4760]: I0930 07:34:14.850541 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:34:14 crc kubenswrapper[4760]: I0930 07:34:14.850574 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:34:14 crc kubenswrapper[4760]: I0930 07:34:14.850598 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:34:14Z","lastTransitionTime":"2025-09-30T07:34:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 07:34:14 crc kubenswrapper[4760]: I0930 07:34:14.953844 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:34:14 crc kubenswrapper[4760]: I0930 07:34:14.953897 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:34:14 crc kubenswrapper[4760]: I0930 07:34:14.953914 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:34:14 crc kubenswrapper[4760]: I0930 07:34:14.953937 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:34:14 crc kubenswrapper[4760]: I0930 07:34:14.953955 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:34:14Z","lastTransitionTime":"2025-09-30T07:34:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 07:34:15 crc kubenswrapper[4760]: I0930 07:34:15.056785 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:34:15 crc kubenswrapper[4760]: I0930 07:34:15.056860 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:34:15 crc kubenswrapper[4760]: I0930 07:34:15.056883 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:34:15 crc kubenswrapper[4760]: I0930 07:34:15.056912 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:34:15 crc kubenswrapper[4760]: I0930 07:34:15.056936 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:34:15Z","lastTransitionTime":"2025-09-30T07:34:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 07:34:15 crc kubenswrapper[4760]: I0930 07:34:15.066800 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wv8fz" Sep 30 07:34:15 crc kubenswrapper[4760]: E0930 07:34:15.067049 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-wv8fz" podUID="ce6dcf25-c8ea-450b-9fc6-9f8aeafde757" Sep 30 07:34:15 crc kubenswrapper[4760]: I0930 07:34:15.082550 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-sv6wk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4a8035f8-210d-4a09-bca5-274ced93774c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://774d38a9cedeb69a2093a9de05db794e856295535f86d73a75c6feea05b61cb3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/service
account\\\",\\\"name\\\":\\\"kube-api-access-6pk8k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T07:33:56Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-sv6wk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:34:15Z is after 2025-08-24T17:21:41Z" Sep 30 07:34:15 crc kubenswrapper[4760]: I0930 07:34:15.101202 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-lqpj4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2b6201d8-80fd-4701-a4a8-f7ebca1f34ad\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:34:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:34:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:34:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:34:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://09a33e356ac5e20ea894804e5
6d62f6220b8b9a5123d1b0acc9bbe33a3083792\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:34:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dc5j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://722e623d70449c90fb388b89511f1e75ed55015ec9caa45d9aaa9ac3d9649778\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:34:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dc5j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\
\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T07:34:08Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-lqpj4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:34:15Z is after 2025-08-24T17:21:41Z" Sep 30 07:34:15 crc kubenswrapper[4760]: I0930 07:34:15.118411 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-wv8fz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ce6dcf25-c8ea-450b-9fc6-9f8aeafde757\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:34:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:34:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:34:10Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:34:10Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcwbs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcwbs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T07:34:10Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-wv8fz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:34:15Z is after 2025-08-24T17:21:41Z" Sep 30 07:34:15 crc 
kubenswrapper[4760]: I0930 07:34:15.138816 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c5ae239e08e07bda9532c0b6b94ef3687cf512cd05965e09123bbb036f915a5b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3abb25619d0c80e84a2095aba45fe6a865125d242e44e9f10820e8c41a3b0db9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art
-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:34:15Z is after 2025-08-24T17:21:41Z" Sep 30 07:34:15 crc kubenswrapper[4760]: I0930 07:34:15.154677 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers 
with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:34:15Z is after 2025-08-24T17:21:41Z" Sep 30 07:34:15 crc kubenswrapper[4760]: I0930 07:34:15.159408 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:34:15 crc kubenswrapper[4760]: I0930 07:34:15.159473 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Sep 30 07:34:15 crc kubenswrapper[4760]: I0930 07:34:15.159493 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:34:15 crc kubenswrapper[4760]: I0930 07:34:15.159518 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:34:15 crc kubenswrapper[4760]: I0930 07:34:15.159535 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:34:15Z","lastTransitionTime":"2025-09-30T07:34:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 07:34:15 crc kubenswrapper[4760]: I0930 07:34:15.171295 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-rpvhp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a8530550-438d-46a5-aa3f-4b10838396f1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:34:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:34:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:34:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0434ec52328c8ee0a2b28fb47833cc6cf12a7048f4c4a217518e029e616fb6a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7j7qm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T07:33:58Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-rpvhp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:34:15Z is after 2025-08-24T17:21:41Z" Sep 30 07:34:15 crc kubenswrapper[4760]: I0930 07:34:15.192282 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:59Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:59Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fd70ee9ea129c67bb1a9ffac6e1d2abc903949ccd6d2d549f5d78472813dd222\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T0
7:33:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:34:15Z is after 2025-08-24T17:21:41Z" Sep 30 07:34:15 crc kubenswrapper[4760]: I0930 07:34:15.208109 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-f2lrk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7a9c8270-6964-4886-87d0-227b05b76da4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e3cf6cd0daf5df3e48c5d44b1a443b1e5646009b9eba42bc3869ee9841e472d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cctgt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://90d669a96085f6fb9b5c9507ff1d5c960bdeafa1
d01d7c115b89788e14e155d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cctgt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T07:33:56Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-f2lrk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:34:15Z is after 2025-08-24T17:21:41Z" Sep 30 07:34:15 crc kubenswrapper[4760]: I0930 07:34:15.229335 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-lvdpk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f50c364e-d22c-4fe5-a0aa-66f4e8d8b21e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://96fd613333f4911d54aca94a38724570ef878c3f55ef48caa79eb1c14d0b2014\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8g8kp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T07:33:56Z\\\"}}\" for pod \"openshift-multus\"/\"multus-lvdpk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:34:15Z is after 2025-08-24T17:21:41Z" Sep 30 07:34:15 crc kubenswrapper[4760]: I0930 07:34:15.250906 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c5ac421ed3707970914cf165f54a9324134079296d1b6936332c6047b9c23132\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed 
to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:34:15Z is after 2025-08-24T17:21:41Z" Sep 30 07:34:15 crc kubenswrapper[4760]: I0930 07:34:15.262520 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:34:15 crc kubenswrapper[4760]: I0930 07:34:15.262589 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:34:15 crc kubenswrapper[4760]: I0930 07:34:15.262610 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:34:15 crc kubenswrapper[4760]: I0930 07:34:15.262642 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:34:15 crc kubenswrapper[4760]: I0930 07:34:15.262662 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:34:15Z","lastTransitionTime":"2025-09-30T07:34:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 07:34:15 crc kubenswrapper[4760]: I0930 07:34:15.273040 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:53Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:34:15Z is after 2025-08-24T17:21:41Z" Sep 30 07:34:15 crc kubenswrapper[4760]: I0930 07:34:15.294646 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:34:15Z is after 2025-08-24T17:21:41Z" Sep 30 07:34:15 crc kubenswrapper[4760]: I0930 07:34:15.321091 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-6vfjl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"aade7c8e-aa34-4b19-9000-d724950a70d7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:34:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:34:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:34:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8c7192e617554ba6b5b9533792f166344fbf9681b63e57c180809e721cc18168\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:34:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-df7r7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e9c7c74e636ef9dbade2541863f96b1098f7ae5f1ab02740c8934f8c87c4815e\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e9c7c74e636ef9dbade2541863f96b1098f7ae5f1ab02740c8934f8c87c4815e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T07:33:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T07:33:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-df7r7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7abb316d62bc2246ace40ad59b2926962a1c098c5690adea8c20a5b96459a604\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7abb316d62bc2246ace40ad59b2926962a1c098c5690adea8c20a5b96459a604\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T07:33:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T07:33:58Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-df7r7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://96c45841896c0c1569ad3071a3114e8c4965cc3968244c91719bcf5df92e5d0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://96c45841896c0c1569ad3071a3114e8c4965cc3968244c91719bcf5df92e5d0f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T07:33:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T07:33:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-df7r7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://db222
0320618b4dcdbf56bad96dbe01c910fa37948f97792f498b3336212cc7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://db2220320618b4dcdbf56bad96dbe01c910fa37948f97792f498b3336212cc7f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T07:34:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T07:34:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-df7r7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5980d5ebcac5351bfa8c26b90968593f3fae3d434e824db5a182449ef64b0a13\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5980d5ebcac5351bfa8c26b90968593f3fae3d434e824db5a182449ef64b0a13\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T07:34:01Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-09-30T07:34:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-df7r7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5912656a198607c9831da9564b5507d662eb2df00df104d52cf058442a76be6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5912656a198607c9831da9564b5507d662eb2df00df104d52cf058442a76be6f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T07:34:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T07:34:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-df7r7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T07:33:56Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-6vfjl\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:34:15Z is after 2025-08-24T17:21:41Z" Sep 30 07:34:15 crc kubenswrapper[4760]: I0930 07:34:15.359933 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9092fdd9-f142-400d-b446-47b8103b9694\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4cc3328dfae8f6d99c152e6a53b86d72f690afbd7bc4847810f745b321e4f3ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests
\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5aaa898c76100f011c3ad0ba61c9735a940b46b0e770f3ede7fc4c881a4722a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b718c4a73fd119f66dd346618976f0b3589b789576e53d367588c2772becd66e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d00b9a56c23ba3b5
701a31412e038c073b5f16eb4e6ebea42134f7d4c0b0f21\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://00d03c8ff6fa9f490ec0550ac57d20786f1f5c10d91011d678116c2249a79e88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ed4236004989a888f1f864535bedd2dc4bcad4c51c87f12be074abbe5f8fe8bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e3
3e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ed4236004989a888f1f864535bedd2dc4bcad4c51c87f12be074abbe5f8fe8bd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T07:33:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T07:33:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://553fa289f27580c1c38da1e9ea12ae7231fc4f60c660a9a3a177391adba9c3f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://553fa289f27580c1c38da1e9ea12ae7231fc4f60c660a9a3a177391adba9c3f2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T07:33:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T07:33:37Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://83b9ee5e77d60387298ccac46ad416a5d56bfd1d81a6ef7f3ea0482520c2ca08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\
":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://83b9ee5e77d60387298ccac46ad416a5d56bfd1d81a6ef7f3ea0482520c2ca08\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T07:33:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T07:33:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T07:33:35Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:34:15Z is after 2025-08-24T17:21:41Z" Sep 30 07:34:15 crc kubenswrapper[4760]: I0930 07:34:15.366543 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:34:15 crc kubenswrapper[4760]: I0930 07:34:15.366602 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:34:15 crc kubenswrapper[4760]: I0930 07:34:15.366622 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:34:15 crc kubenswrapper[4760]: I0930 07:34:15.366648 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:34:15 crc kubenswrapper[4760]: I0930 07:34:15.366666 4760 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:34:15Z","lastTransitionTime":"2025-09-30T07:34:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 07:34:15 crc kubenswrapper[4760]: I0930 07:34:15.382616 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"031672b4-4779-4226-97aa-f5c81a729234\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b3e26d92b5602604d33e83e83e677084c642df64c9d0b9f9e022b036091fb92e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"r
unning\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://53390cb3cc86d0db1019c461bf6cb987190e6fa8812f735844028dfa1ce96648\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf87f93832356c96c1c5e90e6459ccdd23700fd12c6d7410305c66f99f6ed18f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e
4230b2df5bf87138f5a40f146bef562ae63ec6f5b73d023e4e2b60722854882\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9c416bd4523f4384b4a2283eb2ada263982380925cbc17713726e0790e5fdee\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:38Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2cd9c18868b27fcfb67186a86f415b484f009c33c07d18f5d508155ed011b5d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2cd9c18868b27fcfb67186a86f415b484f009c33c07d18f5d508155ed011b5d4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T07:33:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T07:33:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T07:33:35Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:34:15Z is after 2025-08-24T17:21:41Z" Sep 30 07:34:15 crc kubenswrapper[4760]: I0930 07:34:15.400999 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"67985d3d-0af6-4a8b-b45a-623aed2e502e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43b97af91f474a3b43f207d377ca205e459b782e552a32a134a8b46e89fc8d30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d748511816054d5f621bd55e203baea2e6e5ac90dec796364b46224308cfd78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bf1b2ce86bf47ce917a20b48f64b7c4419e3a5c551d006d97171311c3b190f0c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e7c9b2cd9f096d99a8b239c0c2dc771e94e6522a3380733b7452969c9fe2dfd2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-09-30T07:33:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T07:33:35Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:34:15Z is after 2025-08-24T17:21:41Z" Sep 30 07:34:15 crc kubenswrapper[4760]: I0930 07:34:15.434665 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-sspvl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2c4ca8ea-a714-40e5-9e10-080aef32237b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:56Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:56Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04cb9ff2859a579a0bd97d932177b8c19b786666d0177669e93ccec38abf08d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lllfc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9e64e8566880e723a6cc19ad788bd1cbb02eb639397aea4acd6e9920b034f379\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lllfc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0b6c1fe1d21cc0d118e36d7f9d62fb639a19a42cd8c953700bcd5c289088cdf9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lllfc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bba81e98fd1a75081b7e6a3eaf28b88b4de6e47e1dc158ca72aa2f1c2107933e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lllfc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://45d8c685d7b690ce81553e4d924934cf761087dd535145c0e8b5a933f2d61e45\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lllfc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef1f6b6ca1a2bbb2f9a166a7244c6411470be4655bc6b2a330fd604c3449d185\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lllfc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e2eb770a63f2bb8de1ec68ee527d04f64bb32a66016581c3acfc665d9bb60784\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e2eb770a63f2bb8de1ec68ee527d04f64bb32a66016581c3acfc665d9bb60784\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-09-30T07:34:07Z\\\",\\\"message\\\":\\\"vices.LBOpts{Reject:false, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, 
Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{}, Templates:services.TemplateMap{}, Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}, built lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-ingress/router-internal-default_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-ingress/router-internal-default\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.4.176\\\\\\\", Port:80, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}, services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.4.176\\\\\\\", Port:443, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}, services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.4.176\\\\\\\", Port:1936, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI0930 07:34:07.323120 6145 obj_retry.go:365] Adding new object: *v1.Pod openshift-kube-controller-manager/kube-controller-manager-crc\\\\nI0930 07:34:07.323753 6145 ovn.go:134] Ensuring zone \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T07:34:06Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-sspvl_openshift-ovn-kubernetes(2c4ca8ea-a714-40e5-9e10-080aef32237b)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lllfc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e72f37942962598d925dfb305311c94c3ca920f93243bb6e97839add4a830a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:34:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lllfc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1198d13e2814e4dc38c4411d751fba40440a5967d7ca04c855e34fe90f9a6535\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1198d13e2814e4dc38
c4411d751fba40440a5967d7ca04c855e34fe90f9a6535\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T07:33:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T07:33:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lllfc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T07:33:56Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-sspvl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:34:15Z is after 2025-08-24T17:21:41Z" Sep 30 07:34:15 crc kubenswrapper[4760]: I0930 07:34:15.470935 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:34:15 crc kubenswrapper[4760]: I0930 07:34:15.471007 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:34:15 crc kubenswrapper[4760]: I0930 07:34:15.471027 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:34:15 crc kubenswrapper[4760]: I0930 07:34:15.471055 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:34:15 crc kubenswrapper[4760]: I0930 07:34:15.471074 4760 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:34:15Z","lastTransitionTime":"2025-09-30T07:34:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 07:34:15 crc kubenswrapper[4760]: I0930 07:34:15.574602 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:34:15 crc kubenswrapper[4760]: I0930 07:34:15.574655 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:34:15 crc kubenswrapper[4760]: I0930 07:34:15.574672 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:34:15 crc kubenswrapper[4760]: I0930 07:34:15.574700 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:34:15 crc kubenswrapper[4760]: I0930 07:34:15.574719 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:34:15Z","lastTransitionTime":"2025-09-30T07:34:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 07:34:15 crc kubenswrapper[4760]: I0930 07:34:15.678481 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:34:15 crc kubenswrapper[4760]: I0930 07:34:15.678554 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:34:15 crc kubenswrapper[4760]: I0930 07:34:15.678579 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:34:15 crc kubenswrapper[4760]: I0930 07:34:15.678610 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:34:15 crc kubenswrapper[4760]: I0930 07:34:15.678631 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:34:15Z","lastTransitionTime":"2025-09-30T07:34:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 07:34:15 crc kubenswrapper[4760]: I0930 07:34:15.781850 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:34:15 crc kubenswrapper[4760]: I0930 07:34:15.781920 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:34:15 crc kubenswrapper[4760]: I0930 07:34:15.781939 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:34:15 crc kubenswrapper[4760]: I0930 07:34:15.781967 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:34:15 crc kubenswrapper[4760]: I0930 07:34:15.781985 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:34:15Z","lastTransitionTime":"2025-09-30T07:34:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 07:34:15 crc kubenswrapper[4760]: I0930 07:34:15.885776 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:34:15 crc kubenswrapper[4760]: I0930 07:34:15.886186 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:34:15 crc kubenswrapper[4760]: I0930 07:34:15.886517 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:34:15 crc kubenswrapper[4760]: I0930 07:34:15.886710 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:34:15 crc kubenswrapper[4760]: I0930 07:34:15.886861 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:34:15Z","lastTransitionTime":"2025-09-30T07:34:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 07:34:15 crc kubenswrapper[4760]: I0930 07:34:15.990244 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:34:15 crc kubenswrapper[4760]: I0930 07:34:15.990350 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:34:15 crc kubenswrapper[4760]: I0930 07:34:15.990375 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:34:15 crc kubenswrapper[4760]: I0930 07:34:15.990403 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:34:15 crc kubenswrapper[4760]: I0930 07:34:15.990425 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:34:15Z","lastTransitionTime":"2025-09-30T07:34:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 07:34:16 crc kubenswrapper[4760]: I0930 07:34:16.065884 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 07:34:16 crc kubenswrapper[4760]: I0930 07:34:16.065957 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 07:34:16 crc kubenswrapper[4760]: I0930 07:34:16.066040 4760 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 07:34:16 crc kubenswrapper[4760]: E0930 07:34:16.066072 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 07:34:16 crc kubenswrapper[4760]: E0930 07:34:16.066192 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 07:34:16 crc kubenswrapper[4760]: E0930 07:34:16.066374 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 07:34:16 crc kubenswrapper[4760]: I0930 07:34:16.093790 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:34:16 crc kubenswrapper[4760]: I0930 07:34:16.093835 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:34:16 crc kubenswrapper[4760]: I0930 07:34:16.093853 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:34:16 crc kubenswrapper[4760]: I0930 07:34:16.093875 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:34:16 crc kubenswrapper[4760]: I0930 07:34:16.093893 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:34:16Z","lastTransitionTime":"2025-09-30T07:34:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 07:34:16 crc kubenswrapper[4760]: I0930 07:34:16.197500 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:34:16 crc kubenswrapper[4760]: I0930 07:34:16.197547 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:34:16 crc kubenswrapper[4760]: I0930 07:34:16.197564 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:34:16 crc kubenswrapper[4760]: I0930 07:34:16.197586 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:34:16 crc kubenswrapper[4760]: I0930 07:34:16.197603 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:34:16Z","lastTransitionTime":"2025-09-30T07:34:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 07:34:16 crc kubenswrapper[4760]: I0930 07:34:16.301521 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:34:16 crc kubenswrapper[4760]: I0930 07:34:16.301583 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:34:16 crc kubenswrapper[4760]: I0930 07:34:16.301602 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:34:16 crc kubenswrapper[4760]: I0930 07:34:16.301632 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:34:16 crc kubenswrapper[4760]: I0930 07:34:16.301659 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:34:16Z","lastTransitionTime":"2025-09-30T07:34:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 07:34:16 crc kubenswrapper[4760]: I0930 07:34:16.405446 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:34:16 crc kubenswrapper[4760]: I0930 07:34:16.405522 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:34:16 crc kubenswrapper[4760]: I0930 07:34:16.405542 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:34:16 crc kubenswrapper[4760]: I0930 07:34:16.405569 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:34:16 crc kubenswrapper[4760]: I0930 07:34:16.405587 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:34:16Z","lastTransitionTime":"2025-09-30T07:34:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 07:34:16 crc kubenswrapper[4760]: I0930 07:34:16.508144 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:34:16 crc kubenswrapper[4760]: I0930 07:34:16.508183 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:34:16 crc kubenswrapper[4760]: I0930 07:34:16.508193 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:34:16 crc kubenswrapper[4760]: I0930 07:34:16.508207 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:34:16 crc kubenswrapper[4760]: I0930 07:34:16.508215 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:34:16Z","lastTransitionTime":"2025-09-30T07:34:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 07:34:16 crc kubenswrapper[4760]: I0930 07:34:16.611699 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:34:16 crc kubenswrapper[4760]: I0930 07:34:16.611740 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:34:16 crc kubenswrapper[4760]: I0930 07:34:16.611749 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:34:16 crc kubenswrapper[4760]: I0930 07:34:16.611765 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:34:16 crc kubenswrapper[4760]: I0930 07:34:16.611774 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:34:16Z","lastTransitionTime":"2025-09-30T07:34:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 07:34:16 crc kubenswrapper[4760]: I0930 07:34:16.714780 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:34:16 crc kubenswrapper[4760]: I0930 07:34:16.714838 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:34:16 crc kubenswrapper[4760]: I0930 07:34:16.714856 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:34:16 crc kubenswrapper[4760]: I0930 07:34:16.714881 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:34:16 crc kubenswrapper[4760]: I0930 07:34:16.714900 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:34:16Z","lastTransitionTime":"2025-09-30T07:34:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 07:34:16 crc kubenswrapper[4760]: I0930 07:34:16.817836 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:34:16 crc kubenswrapper[4760]: I0930 07:34:16.817907 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:34:16 crc kubenswrapper[4760]: I0930 07:34:16.817919 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:34:16 crc kubenswrapper[4760]: I0930 07:34:16.817936 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:34:16 crc kubenswrapper[4760]: I0930 07:34:16.817950 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:34:16Z","lastTransitionTime":"2025-09-30T07:34:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 07:34:16 crc kubenswrapper[4760]: I0930 07:34:16.920654 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:34:16 crc kubenswrapper[4760]: I0930 07:34:16.920714 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:34:16 crc kubenswrapper[4760]: I0930 07:34:16.920732 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:34:16 crc kubenswrapper[4760]: I0930 07:34:16.920755 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:34:16 crc kubenswrapper[4760]: I0930 07:34:16.920772 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:34:16Z","lastTransitionTime":"2025-09-30T07:34:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 07:34:17 crc kubenswrapper[4760]: I0930 07:34:17.023932 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:34:17 crc kubenswrapper[4760]: I0930 07:34:17.023972 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:34:17 crc kubenswrapper[4760]: I0930 07:34:17.023984 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:34:17 crc kubenswrapper[4760]: I0930 07:34:17.024023 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:34:17 crc kubenswrapper[4760]: I0930 07:34:17.024034 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:34:17Z","lastTransitionTime":"2025-09-30T07:34:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 07:34:17 crc kubenswrapper[4760]: I0930 07:34:17.067207 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wv8fz" Sep 30 07:34:17 crc kubenswrapper[4760]: E0930 07:34:17.067439 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-wv8fz" podUID="ce6dcf25-c8ea-450b-9fc6-9f8aeafde757" Sep 30 07:34:17 crc kubenswrapper[4760]: I0930 07:34:17.127526 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:34:17 crc kubenswrapper[4760]: I0930 07:34:17.127596 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:34:17 crc kubenswrapper[4760]: I0930 07:34:17.127616 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:34:17 crc kubenswrapper[4760]: I0930 07:34:17.127643 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:34:17 crc kubenswrapper[4760]: I0930 07:34:17.127660 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:34:17Z","lastTransitionTime":"2025-09-30T07:34:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 07:34:17 crc kubenswrapper[4760]: I0930 07:34:17.231178 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:34:17 crc kubenswrapper[4760]: I0930 07:34:17.231255 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:34:17 crc kubenswrapper[4760]: I0930 07:34:17.231274 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:34:17 crc kubenswrapper[4760]: I0930 07:34:17.231327 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:34:17 crc kubenswrapper[4760]: I0930 07:34:17.231346 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:34:17Z","lastTransitionTime":"2025-09-30T07:34:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 07:34:17 crc kubenswrapper[4760]: I0930 07:34:17.334830 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:34:17 crc kubenswrapper[4760]: I0930 07:34:17.334897 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:34:17 crc kubenswrapper[4760]: I0930 07:34:17.334915 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:34:17 crc kubenswrapper[4760]: I0930 07:34:17.334943 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:34:17 crc kubenswrapper[4760]: I0930 07:34:17.334963 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:34:17Z","lastTransitionTime":"2025-09-30T07:34:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 07:34:17 crc kubenswrapper[4760]: I0930 07:34:17.437895 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:34:17 crc kubenswrapper[4760]: I0930 07:34:17.437957 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:34:17 crc kubenswrapper[4760]: I0930 07:34:17.437975 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:34:17 crc kubenswrapper[4760]: I0930 07:34:17.438002 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:34:17 crc kubenswrapper[4760]: I0930 07:34:17.438021 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:34:17Z","lastTransitionTime":"2025-09-30T07:34:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 07:34:17 crc kubenswrapper[4760]: I0930 07:34:17.541066 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:34:17 crc kubenswrapper[4760]: I0930 07:34:17.541131 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:34:17 crc kubenswrapper[4760]: I0930 07:34:17.541160 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:34:17 crc kubenswrapper[4760]: I0930 07:34:17.541187 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:34:17 crc kubenswrapper[4760]: I0930 07:34:17.541207 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:34:17Z","lastTransitionTime":"2025-09-30T07:34:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 07:34:17 crc kubenswrapper[4760]: I0930 07:34:17.644547 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:34:17 crc kubenswrapper[4760]: I0930 07:34:17.644628 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:34:17 crc kubenswrapper[4760]: I0930 07:34:17.644647 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:34:17 crc kubenswrapper[4760]: I0930 07:34:17.644672 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:34:17 crc kubenswrapper[4760]: I0930 07:34:17.644691 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:34:17Z","lastTransitionTime":"2025-09-30T07:34:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 07:34:17 crc kubenswrapper[4760]: I0930 07:34:17.748107 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:34:17 crc kubenswrapper[4760]: I0930 07:34:17.748176 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:34:17 crc kubenswrapper[4760]: I0930 07:34:17.748194 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:34:17 crc kubenswrapper[4760]: I0930 07:34:17.748218 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:34:17 crc kubenswrapper[4760]: I0930 07:34:17.748236 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:34:17Z","lastTransitionTime":"2025-09-30T07:34:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 07:34:17 crc kubenswrapper[4760]: I0930 07:34:17.858817 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:34:17 crc kubenswrapper[4760]: I0930 07:34:17.858890 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:34:17 crc kubenswrapper[4760]: I0930 07:34:17.858914 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:34:17 crc kubenswrapper[4760]: I0930 07:34:17.858951 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:34:17 crc kubenswrapper[4760]: I0930 07:34:17.858970 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:34:17Z","lastTransitionTime":"2025-09-30T07:34:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 07:34:17 crc kubenswrapper[4760]: I0930 07:34:17.963745 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:34:17 crc kubenswrapper[4760]: I0930 07:34:17.963816 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:34:17 crc kubenswrapper[4760]: I0930 07:34:17.963838 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:34:17 crc kubenswrapper[4760]: I0930 07:34:17.963870 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:34:17 crc kubenswrapper[4760]: I0930 07:34:17.963892 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:34:17Z","lastTransitionTime":"2025-09-30T07:34:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 07:34:18 crc kubenswrapper[4760]: I0930 07:34:18.065978 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 07:34:18 crc kubenswrapper[4760]: I0930 07:34:18.066009 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 07:34:18 crc kubenswrapper[4760]: I0930 07:34:18.066047 4760 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 07:34:18 crc kubenswrapper[4760]: E0930 07:34:18.066289 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 07:34:18 crc kubenswrapper[4760]: E0930 07:34:18.066553 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 07:34:18 crc kubenswrapper[4760]: E0930 07:34:18.066728 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 07:34:18 crc kubenswrapper[4760]: I0930 07:34:18.067220 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:34:18 crc kubenswrapper[4760]: I0930 07:34:18.067260 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:34:18 crc kubenswrapper[4760]: I0930 07:34:18.067278 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:34:18 crc kubenswrapper[4760]: I0930 07:34:18.067331 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:34:18 crc kubenswrapper[4760]: I0930 07:34:18.067349 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:34:18Z","lastTransitionTime":"2025-09-30T07:34:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 07:34:18 crc kubenswrapper[4760]: I0930 07:34:18.169984 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:34:18 crc kubenswrapper[4760]: I0930 07:34:18.170041 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:34:18 crc kubenswrapper[4760]: I0930 07:34:18.170060 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:34:18 crc kubenswrapper[4760]: I0930 07:34:18.170086 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:34:18 crc kubenswrapper[4760]: I0930 07:34:18.170103 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:34:18Z","lastTransitionTime":"2025-09-30T07:34:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 07:34:18 crc kubenswrapper[4760]: I0930 07:34:18.273648 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:34:18 crc kubenswrapper[4760]: I0930 07:34:18.273713 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:34:18 crc kubenswrapper[4760]: I0930 07:34:18.273730 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:34:18 crc kubenswrapper[4760]: I0930 07:34:18.273756 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:34:18 crc kubenswrapper[4760]: I0930 07:34:18.273776 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:34:18Z","lastTransitionTime":"2025-09-30T07:34:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 07:34:18 crc kubenswrapper[4760]: I0930 07:34:18.377069 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:34:18 crc kubenswrapper[4760]: I0930 07:34:18.377123 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:34:18 crc kubenswrapper[4760]: I0930 07:34:18.377160 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:34:18 crc kubenswrapper[4760]: I0930 07:34:18.377197 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:34:18 crc kubenswrapper[4760]: I0930 07:34:18.377220 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:34:18Z","lastTransitionTime":"2025-09-30T07:34:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 07:34:18 crc kubenswrapper[4760]: I0930 07:34:18.479417 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:34:18 crc kubenswrapper[4760]: I0930 07:34:18.479454 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:34:18 crc kubenswrapper[4760]: I0930 07:34:18.479466 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:34:18 crc kubenswrapper[4760]: I0930 07:34:18.479482 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:34:18 crc kubenswrapper[4760]: I0930 07:34:18.479494 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:34:18Z","lastTransitionTime":"2025-09-30T07:34:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 07:34:18 crc kubenswrapper[4760]: I0930 07:34:18.582079 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:34:18 crc kubenswrapper[4760]: I0930 07:34:18.582148 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:34:18 crc kubenswrapper[4760]: I0930 07:34:18.582171 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:34:18 crc kubenswrapper[4760]: I0930 07:34:18.582262 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:34:18 crc kubenswrapper[4760]: I0930 07:34:18.582290 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:34:18Z","lastTransitionTime":"2025-09-30T07:34:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 07:34:18 crc kubenswrapper[4760]: I0930 07:34:18.685615 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:34:18 crc kubenswrapper[4760]: I0930 07:34:18.685687 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:34:18 crc kubenswrapper[4760]: I0930 07:34:18.685713 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:34:18 crc kubenswrapper[4760]: I0930 07:34:18.685742 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:34:18 crc kubenswrapper[4760]: I0930 07:34:18.685762 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:34:18Z","lastTransitionTime":"2025-09-30T07:34:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 07:34:18 crc kubenswrapper[4760]: I0930 07:34:18.788526 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:34:18 crc kubenswrapper[4760]: I0930 07:34:18.788623 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:34:18 crc kubenswrapper[4760]: I0930 07:34:18.788643 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:34:18 crc kubenswrapper[4760]: I0930 07:34:18.788670 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:34:18 crc kubenswrapper[4760]: I0930 07:34:18.788689 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:34:18Z","lastTransitionTime":"2025-09-30T07:34:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 07:34:18 crc kubenswrapper[4760]: I0930 07:34:18.801120 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ce6dcf25-c8ea-450b-9fc6-9f8aeafde757-metrics-certs\") pod \"network-metrics-daemon-wv8fz\" (UID: \"ce6dcf25-c8ea-450b-9fc6-9f8aeafde757\") " pod="openshift-multus/network-metrics-daemon-wv8fz" Sep 30 07:34:18 crc kubenswrapper[4760]: E0930 07:34:18.801389 4760 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Sep 30 07:34:18 crc kubenswrapper[4760]: E0930 07:34:18.801485 4760 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ce6dcf25-c8ea-450b-9fc6-9f8aeafde757-metrics-certs podName:ce6dcf25-c8ea-450b-9fc6-9f8aeafde757 nodeName:}" failed. No retries permitted until 2025-09-30 07:34:26.801458006 +0000 UTC m=+52.444364448 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/ce6dcf25-c8ea-450b-9fc6-9f8aeafde757-metrics-certs") pod "network-metrics-daemon-wv8fz" (UID: "ce6dcf25-c8ea-450b-9fc6-9f8aeafde757") : object "openshift-multus"/"metrics-daemon-secret" not registered Sep 30 07:34:18 crc kubenswrapper[4760]: I0930 07:34:18.891861 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:34:18 crc kubenswrapper[4760]: I0930 07:34:18.891922 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:34:18 crc kubenswrapper[4760]: I0930 07:34:18.891941 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:34:18 crc kubenswrapper[4760]: I0930 07:34:18.891964 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:34:18 crc kubenswrapper[4760]: I0930 07:34:18.891983 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:34:18Z","lastTransitionTime":"2025-09-30T07:34:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 07:34:18 crc kubenswrapper[4760]: I0930 07:34:18.994629 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:34:18 crc kubenswrapper[4760]: I0930 07:34:18.994699 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:34:18 crc kubenswrapper[4760]: I0930 07:34:18.994721 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:34:18 crc kubenswrapper[4760]: I0930 07:34:18.994745 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:34:18 crc kubenswrapper[4760]: I0930 07:34:18.994765 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:34:18Z","lastTransitionTime":"2025-09-30T07:34:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 07:34:19 crc kubenswrapper[4760]: I0930 07:34:19.066710 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wv8fz" Sep 30 07:34:19 crc kubenswrapper[4760]: E0930 07:34:19.066939 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-wv8fz" podUID="ce6dcf25-c8ea-450b-9fc6-9f8aeafde757" Sep 30 07:34:19 crc kubenswrapper[4760]: I0930 07:34:19.107839 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:34:19 crc kubenswrapper[4760]: I0930 07:34:19.107891 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:34:19 crc kubenswrapper[4760]: I0930 07:34:19.107908 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:34:19 crc kubenswrapper[4760]: I0930 07:34:19.107933 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:34:19 crc kubenswrapper[4760]: I0930 07:34:19.107949 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:34:19Z","lastTransitionTime":"2025-09-30T07:34:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 07:34:19 crc kubenswrapper[4760]: I0930 07:34:19.211213 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:34:19 crc kubenswrapper[4760]: I0930 07:34:19.211269 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:34:19 crc kubenswrapper[4760]: I0930 07:34:19.211287 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:34:19 crc kubenswrapper[4760]: I0930 07:34:19.211334 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:34:19 crc kubenswrapper[4760]: I0930 07:34:19.211351 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:34:19Z","lastTransitionTime":"2025-09-30T07:34:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 07:34:19 crc kubenswrapper[4760]: I0930 07:34:19.314386 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:34:19 crc kubenswrapper[4760]: I0930 07:34:19.314437 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:34:19 crc kubenswrapper[4760]: I0930 07:34:19.314455 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:34:19 crc kubenswrapper[4760]: I0930 07:34:19.314480 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:34:19 crc kubenswrapper[4760]: I0930 07:34:19.314497 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:34:19Z","lastTransitionTime":"2025-09-30T07:34:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 07:34:19 crc kubenswrapper[4760]: I0930 07:34:19.417722 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:34:19 crc kubenswrapper[4760]: I0930 07:34:19.417808 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:34:19 crc kubenswrapper[4760]: I0930 07:34:19.417827 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:34:19 crc kubenswrapper[4760]: I0930 07:34:19.417851 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:34:19 crc kubenswrapper[4760]: I0930 07:34:19.417867 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:34:19Z","lastTransitionTime":"2025-09-30T07:34:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 07:34:19 crc kubenswrapper[4760]: I0930 07:34:19.521035 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:34:19 crc kubenswrapper[4760]: I0930 07:34:19.521097 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:34:19 crc kubenswrapper[4760]: I0930 07:34:19.521117 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:34:19 crc kubenswrapper[4760]: I0930 07:34:19.521142 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:34:19 crc kubenswrapper[4760]: I0930 07:34:19.521159 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:34:19Z","lastTransitionTime":"2025-09-30T07:34:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 07:34:19 crc kubenswrapper[4760]: I0930 07:34:19.624843 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:34:19 crc kubenswrapper[4760]: I0930 07:34:19.624912 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:34:19 crc kubenswrapper[4760]: I0930 07:34:19.624930 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:34:19 crc kubenswrapper[4760]: I0930 07:34:19.624956 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:34:19 crc kubenswrapper[4760]: I0930 07:34:19.624976 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:34:19Z","lastTransitionTime":"2025-09-30T07:34:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 07:34:19 crc kubenswrapper[4760]: I0930 07:34:19.728144 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:34:19 crc kubenswrapper[4760]: I0930 07:34:19.728203 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:34:19 crc kubenswrapper[4760]: I0930 07:34:19.728222 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:34:19 crc kubenswrapper[4760]: I0930 07:34:19.728245 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:34:19 crc kubenswrapper[4760]: I0930 07:34:19.728262 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:34:19Z","lastTransitionTime":"2025-09-30T07:34:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 07:34:19 crc kubenswrapper[4760]: I0930 07:34:19.832079 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:34:19 crc kubenswrapper[4760]: I0930 07:34:19.832185 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:34:19 crc kubenswrapper[4760]: I0930 07:34:19.832207 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:34:19 crc kubenswrapper[4760]: I0930 07:34:19.832239 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:34:19 crc kubenswrapper[4760]: I0930 07:34:19.832259 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:34:19Z","lastTransitionTime":"2025-09-30T07:34:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 07:34:19 crc kubenswrapper[4760]: I0930 07:34:19.936359 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:34:19 crc kubenswrapper[4760]: I0930 07:34:19.936467 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:34:19 crc kubenswrapper[4760]: I0930 07:34:19.936486 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:34:19 crc kubenswrapper[4760]: I0930 07:34:19.936513 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:34:19 crc kubenswrapper[4760]: I0930 07:34:19.936531 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:34:19Z","lastTransitionTime":"2025-09-30T07:34:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 07:34:20 crc kubenswrapper[4760]: I0930 07:34:20.040831 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:34:20 crc kubenswrapper[4760]: I0930 07:34:20.040893 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:34:20 crc kubenswrapper[4760]: I0930 07:34:20.040911 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:34:20 crc kubenswrapper[4760]: I0930 07:34:20.040935 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:34:20 crc kubenswrapper[4760]: I0930 07:34:20.040949 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:34:20Z","lastTransitionTime":"2025-09-30T07:34:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 07:34:20 crc kubenswrapper[4760]: I0930 07:34:20.066068 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 07:34:20 crc kubenswrapper[4760]: I0930 07:34:20.066160 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 07:34:20 crc kubenswrapper[4760]: I0930 07:34:20.066164 4760 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 07:34:20 crc kubenswrapper[4760]: E0930 07:34:20.066412 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 07:34:20 crc kubenswrapper[4760]: E0930 07:34:20.066547 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 07:34:20 crc kubenswrapper[4760]: E0930 07:34:20.066643 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 07:34:20 crc kubenswrapper[4760]: I0930 07:34:20.144159 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:34:20 crc kubenswrapper[4760]: I0930 07:34:20.144227 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:34:20 crc kubenswrapper[4760]: I0930 07:34:20.144250 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:34:20 crc kubenswrapper[4760]: I0930 07:34:20.144280 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:34:20 crc kubenswrapper[4760]: I0930 07:34:20.144344 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:34:20Z","lastTransitionTime":"2025-09-30T07:34:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 07:34:20 crc kubenswrapper[4760]: I0930 07:34:20.247988 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:34:20 crc kubenswrapper[4760]: I0930 07:34:20.248082 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:34:20 crc kubenswrapper[4760]: I0930 07:34:20.248107 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:34:20 crc kubenswrapper[4760]: I0930 07:34:20.248138 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:34:20 crc kubenswrapper[4760]: I0930 07:34:20.248162 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:34:20Z","lastTransitionTime":"2025-09-30T07:34:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 07:34:20 crc kubenswrapper[4760]: I0930 07:34:20.351452 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:34:20 crc kubenswrapper[4760]: I0930 07:34:20.351520 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:34:20 crc kubenswrapper[4760]: I0930 07:34:20.351537 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:34:20 crc kubenswrapper[4760]: I0930 07:34:20.351562 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:34:20 crc kubenswrapper[4760]: I0930 07:34:20.351582 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:34:20Z","lastTransitionTime":"2025-09-30T07:34:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 07:34:20 crc kubenswrapper[4760]: I0930 07:34:20.455339 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:34:20 crc kubenswrapper[4760]: I0930 07:34:20.455411 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:34:20 crc kubenswrapper[4760]: I0930 07:34:20.455434 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:34:20 crc kubenswrapper[4760]: I0930 07:34:20.455458 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:34:20 crc kubenswrapper[4760]: I0930 07:34:20.455475 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:34:20Z","lastTransitionTime":"2025-09-30T07:34:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 07:34:20 crc kubenswrapper[4760]: I0930 07:34:20.559334 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:34:20 crc kubenswrapper[4760]: I0930 07:34:20.559410 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:34:20 crc kubenswrapper[4760]: I0930 07:34:20.559429 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:34:20 crc kubenswrapper[4760]: I0930 07:34:20.559457 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:34:20 crc kubenswrapper[4760]: I0930 07:34:20.559478 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:34:20Z","lastTransitionTime":"2025-09-30T07:34:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 07:34:20 crc kubenswrapper[4760]: I0930 07:34:20.667617 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:34:20 crc kubenswrapper[4760]: I0930 07:34:20.667683 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:34:20 crc kubenswrapper[4760]: I0930 07:34:20.667704 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:34:20 crc kubenswrapper[4760]: I0930 07:34:20.667728 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:34:20 crc kubenswrapper[4760]: I0930 07:34:20.667747 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:34:20Z","lastTransitionTime":"2025-09-30T07:34:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 07:34:20 crc kubenswrapper[4760]: I0930 07:34:20.770637 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:34:20 crc kubenswrapper[4760]: I0930 07:34:20.770695 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:34:20 crc kubenswrapper[4760]: I0930 07:34:20.770714 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:34:20 crc kubenswrapper[4760]: I0930 07:34:20.770739 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:34:20 crc kubenswrapper[4760]: I0930 07:34:20.770757 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:34:20Z","lastTransitionTime":"2025-09-30T07:34:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 07:34:20 crc kubenswrapper[4760]: I0930 07:34:20.874440 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:34:20 crc kubenswrapper[4760]: I0930 07:34:20.874521 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:34:20 crc kubenswrapper[4760]: I0930 07:34:20.874540 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:34:20 crc kubenswrapper[4760]: I0930 07:34:20.874942 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:34:20 crc kubenswrapper[4760]: I0930 07:34:20.874973 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:34:20Z","lastTransitionTime":"2025-09-30T07:34:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 07:34:20 crc kubenswrapper[4760]: I0930 07:34:20.978645 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:34:20 crc kubenswrapper[4760]: I0930 07:34:20.978695 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:34:20 crc kubenswrapper[4760]: I0930 07:34:20.978712 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:34:20 crc kubenswrapper[4760]: I0930 07:34:20.978737 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:34:20 crc kubenswrapper[4760]: I0930 07:34:20.978754 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:34:20Z","lastTransitionTime":"2025-09-30T07:34:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 07:34:21 crc kubenswrapper[4760]: I0930 07:34:21.067082 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wv8fz" Sep 30 07:34:21 crc kubenswrapper[4760]: E0930 07:34:21.067455 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-wv8fz" podUID="ce6dcf25-c8ea-450b-9fc6-9f8aeafde757" Sep 30 07:34:21 crc kubenswrapper[4760]: I0930 07:34:21.068571 4760 scope.go:117] "RemoveContainer" containerID="e2eb770a63f2bb8de1ec68ee527d04f64bb32a66016581c3acfc665d9bb60784" Sep 30 07:34:21 crc kubenswrapper[4760]: I0930 07:34:21.081451 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:34:21 crc kubenswrapper[4760]: I0930 07:34:21.081505 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:34:21 crc kubenswrapper[4760]: I0930 07:34:21.081524 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:34:21 crc kubenswrapper[4760]: I0930 07:34:21.081549 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:34:21 crc kubenswrapper[4760]: I0930 07:34:21.081566 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:34:21Z","lastTransitionTime":"2025-09-30T07:34:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 07:34:21 crc kubenswrapper[4760]: I0930 07:34:21.184994 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:34:21 crc kubenswrapper[4760]: I0930 07:34:21.185420 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:34:21 crc kubenswrapper[4760]: I0930 07:34:21.185440 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:34:21 crc kubenswrapper[4760]: I0930 07:34:21.185470 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:34:21 crc kubenswrapper[4760]: I0930 07:34:21.185489 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:34:21Z","lastTransitionTime":"2025-09-30T07:34:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 07:34:21 crc kubenswrapper[4760]: I0930 07:34:21.290253 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:34:21 crc kubenswrapper[4760]: I0930 07:34:21.290350 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:34:21 crc kubenswrapper[4760]: I0930 07:34:21.290375 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:34:21 crc kubenswrapper[4760]: I0930 07:34:21.290402 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:34:21 crc kubenswrapper[4760]: I0930 07:34:21.290421 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:34:21Z","lastTransitionTime":"2025-09-30T07:34:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 07:34:21 crc kubenswrapper[4760]: I0930 07:34:21.394410 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:34:21 crc kubenswrapper[4760]: I0930 07:34:21.394465 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:34:21 crc kubenswrapper[4760]: I0930 07:34:21.394484 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:34:21 crc kubenswrapper[4760]: I0930 07:34:21.394509 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:34:21 crc kubenswrapper[4760]: I0930 07:34:21.394530 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:34:21Z","lastTransitionTime":"2025-09-30T07:34:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 07:34:21 crc kubenswrapper[4760]: I0930 07:34:21.437577 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-sspvl_2c4ca8ea-a714-40e5-9e10-080aef32237b/ovnkube-controller/1.log" Sep 30 07:34:21 crc kubenswrapper[4760]: I0930 07:34:21.440648 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-sspvl" event={"ID":"2c4ca8ea-a714-40e5-9e10-080aef32237b","Type":"ContainerStarted","Data":"b84d165d7a40638496b6759c4c2c265fdf742b2c9c968c503352f468f776b7d7"} Sep 30 07:34:21 crc kubenswrapper[4760]: I0930 07:34:21.440815 4760 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Sep 30 07:34:21 crc kubenswrapper[4760]: I0930 07:34:21.472242 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-sspvl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2c4ca8ea-a714-40e5-9e10-080aef32237b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:56Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:56Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04cb9ff2859a579a0bd97d932177b8c19b786666d0177669e93ccec38abf08d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lllfc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9e64e8566880e723a6cc19ad788bd1cbb02eb639397aea4acd6e9920b034f379\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lllfc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0b6c1fe1d21cc0d118e36d7f9d62fb639a19a42cd8c953700bcd5c289088cdf9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lllfc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bba81e98fd1a75081b7e6a3eaf28b88b4de6e47e1dc158ca72aa2f1c2107933e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:58Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lllfc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://45d8c685d7b690ce81553e4d924934cf761087dd535145c0e8b5a933f2d61e45\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lllfc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef1f6b6ca1a2bbb2f9a166a7244c6411470be4655bc6b2a330fd604c3449d185\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lllfc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b84d165d7a40638496b6759c4c2c265fdf742b2c9c968c503352f468f776b7d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e2eb770a63f2bb8de1ec68ee527d04f64bb32a66016581c3acfc665d9bb60784\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-09-30T07:34:07Z\\\",\\\"message\\\":\\\"vices.LBOpts{Reject:false, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{}, Templates:services.TemplateMap{}, Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}, built lbs: 
[]services.LB{services.LB{Name:\\\\\\\"Service_openshift-ingress/router-internal-default_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-ingress/router-internal-default\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.4.176\\\\\\\", Port:80, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}, services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.4.176\\\\\\\", Port:443, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}, services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.4.176\\\\\\\", Port:1936, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI0930 07:34:07.323120 6145 obj_retry.go:365] Adding new object: *v1.Pod openshift-kube-controller-manager/kube-controller-manager-crc\\\\nI0930 07:34:07.323753 6145 ovn.go:134] Ensuring zone 
\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T07:34:06Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:34:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":
\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lllfc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e72f37942962598d925dfb305311c94c3ca920f93243bb6e97839add4a830a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:34:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lllfc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1198d13e2814e4dc38c4411d751fba40440a5967d7ca04c855e34fe90f9a6535\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\"
:true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1198d13e2814e4dc38c4411d751fba40440a5967d7ca04c855e34fe90f9a6535\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T07:33:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T07:33:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lllfc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T07:33:56Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-sspvl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:34:21Z is after 2025-08-24T17:21:41Z" Sep 30 07:34:21 crc kubenswrapper[4760]: I0930 07:34:21.491079 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-wv8fz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ce6dcf25-c8ea-450b-9fc6-9f8aeafde757\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:34:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:34:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:34:10Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:34:10Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcwbs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcwbs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T07:34:10Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-wv8fz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:34:21Z is after 2025-08-24T17:21:41Z" Sep 30 07:34:21 crc 
kubenswrapper[4760]: I0930 07:34:21.497523 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:34:21 crc kubenswrapper[4760]: I0930 07:34:21.497553 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:34:21 crc kubenswrapper[4760]: I0930 07:34:21.497562 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:34:21 crc kubenswrapper[4760]: I0930 07:34:21.497578 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:34:21 crc kubenswrapper[4760]: I0930 07:34:21.497610 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:34:21Z","lastTransitionTime":"2025-09-30T07:34:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 07:34:21 crc kubenswrapper[4760]: I0930 07:34:21.517334 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c5ae239e08e07bda9532c0b6b94ef3687cf512cd05965e09123bbb036f915a5b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3abb25619d0c80e84a2095aba45fe6a865125d242e44e9f10820e8c41a3b0db9\\\",\\\
"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:34:21Z is after 2025-08-24T17:21:41Z" Sep 30 07:34:21 crc kubenswrapper[4760]: I0930 07:34:21.536705 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:34:21Z is after 2025-08-24T17:21:41Z" Sep 30 07:34:21 crc kubenswrapper[4760]: I0930 07:34:21.552420 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-sv6wk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4a8035f8-210d-4a09-bca5-274ced93774c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://774d38a9cedeb69a2093a9de05db794e856295535f86d73a75c6feea05b61cb3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6pk8k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T07:33:56Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-sv6wk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:34:21Z is after 2025-08-24T17:21:41Z" Sep 30 07:34:21 crc kubenswrapper[4760]: I0930 07:34:21.565924 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-lqpj4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2b6201d8-80fd-4701-a4a8-f7ebca1f34ad\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:34:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:34:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:34:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:34:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://09a33e356ac5e20ea894804e56d62f6220b8b9a5123d1b0acc9bbe33a3083792\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d7
93426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:34:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dc5j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://722e623d70449c90fb388b89511f1e75ed55015ec9caa45d9aaa9ac3d9649778\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:34:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dc5j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T07:34:08Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-lqpj4\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:34:21Z is after 2025-08-24T17:21:41Z" Sep 30 07:34:21 crc kubenswrapper[4760]: I0930 07:34:21.586484 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:59Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:59Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fd70ee9ea129c67bb1a9ffac6e1d2abc903949ccd6d2d549f5d78472813dd222\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:34:21Z is after 2025-08-24T17:21:41Z" Sep 30 07:34:21 crc kubenswrapper[4760]: I0930 07:34:21.600715 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:34:21 crc kubenswrapper[4760]: I0930 07:34:21.600770 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:34:21 crc kubenswrapper[4760]: I0930 07:34:21.600782 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:34:21 crc kubenswrapper[4760]: I0930 07:34:21.600802 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:34:21 crc kubenswrapper[4760]: I0930 07:34:21.600816 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:34:21Z","lastTransitionTime":"2025-09-30T07:34:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 07:34:21 crc kubenswrapper[4760]: I0930 07:34:21.607354 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-f2lrk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7a9c8270-6964-4886-87d0-227b05b76da4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e3cf6cd0daf5df3e48c5d44b1a443b1e5646009b9eba42bc3869ee9841e472d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"}
,{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cctgt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://90d669a96085f6fb9b5c9507ff1d5c960bdeafa1d01d7c115b89788e14e155d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cctgt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T07:33:56Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-f2lrk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:34:21Z is after 2025-08-24T17:21:41Z" Sep 30 07:34:21 crc kubenswrapper[4760]: I0930 07:34:21.634004 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-lvdpk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f50c364e-d22c-4fe5-a0aa-66f4e8d8b21e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://96fd613333f4911d54aca94a38724570ef878c3f55ef48caa79eb1c14d0b2014\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8g8kp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T07:33:56Z\\\"}}\" for pod \"openshift-multus\"/\"multus-lvdpk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:34:21Z is after 2025-08-24T17:21:41Z" Sep 30 07:34:21 crc kubenswrapper[4760]: I0930 07:34:21.645569 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-rpvhp" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a8530550-438d-46a5-aa3f-4b10838396f1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:34:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:34:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:34:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0434ec52328c8ee0a2b28fb47833cc6cf12a7048f4c4a217518e029e616fb6a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7j7qm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\
\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T07:33:58Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-rpvhp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:34:21Z is after 2025-08-24T17:21:41Z" Sep 30 07:34:21 crc kubenswrapper[4760]: I0930 07:34:21.664617 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:53Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod 
was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:34:21Z is after 2025-08-24T17:21:41Z" Sep 30 07:34:21 crc kubenswrapper[4760]: I0930 07:34:21.695168 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-6vfjl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"aade7c8e-aa34-4b19-9000-d724950a70d7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:34:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:34:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:34:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8c7192e617554ba6b5b9533792f166344fbf9681b63e57c180809e721cc18168\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:34:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-df7r7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e9c7c74e636ef9dbade2541863f96b1098f7ae5f1ab02740c8934f8c87c4815e\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e9c7c74e636ef9dbade2541863f96b1098f7ae5f1ab02740c8934f8c87c4815e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T07:33:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T07:33:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-df7r7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7abb316d62bc2246ace40ad59b2926962a1c098c5690adea8c20a5b96459a604\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7abb316d62bc2246ace40ad59b2926962a1c098c5690adea8c20a5b96459a604\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T07:33:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T07:33:58Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-df7r7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://96c45841896c0c1569ad3071a3114e8c4965cc3968244c91719bcf5df92e5d0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://96c45841896c0c1569ad3071a3114e8c4965cc3968244c91719bcf5df92e5d0f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T07:33:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T07:33:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-df7r7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://db222
0320618b4dcdbf56bad96dbe01c910fa37948f97792f498b3336212cc7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://db2220320618b4dcdbf56bad96dbe01c910fa37948f97792f498b3336212cc7f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T07:34:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T07:34:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-df7r7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5980d5ebcac5351bfa8c26b90968593f3fae3d434e824db5a182449ef64b0a13\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5980d5ebcac5351bfa8c26b90968593f3fae3d434e824db5a182449ef64b0a13\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T07:34:01Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-09-30T07:34:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-df7r7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5912656a198607c9831da9564b5507d662eb2df00df104d52cf058442a76be6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5912656a198607c9831da9564b5507d662eb2df00df104d52cf058442a76be6f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T07:34:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T07:34:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-df7r7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T07:33:56Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-6vfjl\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:34:21Z is after 2025-08-24T17:21:41Z" Sep 30 07:34:21 crc kubenswrapper[4760]: I0930 07:34:21.704272 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:34:21 crc kubenswrapper[4760]: I0930 07:34:21.704521 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:34:21 crc kubenswrapper[4760]: I0930 07:34:21.704617 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:34:21 crc kubenswrapper[4760]: I0930 07:34:21.704707 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:34:21 crc kubenswrapper[4760]: I0930 07:34:21.704783 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:34:21Z","lastTransitionTime":"2025-09-30T07:34:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 07:34:21 crc kubenswrapper[4760]: I0930 07:34:21.722024 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9092fdd9-f142-400d-b446-47b8103b9694\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4cc3328dfae8f6d99c152e6a53b86d72f690afbd7bc4847810f745b321e4f3ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5aaa898c76100f011c3ad0ba61c9735a940b46b0e770f3ede7fc4c881a4722a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b718c4a73fd119f66dd346618976f0b3589b789576e53d367588c2772becd66e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d00b9a56c23ba3b5701a31412e038c073b5f16eb4e6ebea42134f7d4c0b0f21\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://00d03c8ff6fa9f490ec0550ac57d20786f1f5c10d91011d678116c2249a79e88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ed4236004989a888f1f864535bedd2dc4bcad4c51c87f12be074abbe5f8fe8bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ed4236004989a888f1f864535bedd2dc4bcad4c51c87f12be074abbe5f8fe8bd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T07:33:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T07:33:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://553fa289f27580c1c38da1e9ea12ae7231fc4f60c660a9a3a177391adba9c3f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://553fa289f27580c1c38da1e9ea12ae7231fc4f60c660a9a3a177391adba9c3f2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T07:33:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T07:33:37Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://83b9ee5e77d60387298ccac46ad416a5d56bfd1d81a6ef7f3ea0482520c2ca08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://83b9ee5e77d60387298ccac46ad416a5d56bfd1d81a6ef7f3ea0482520c2ca08\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T07:33:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
025-09-30T07:33:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T07:33:35Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:34:21Z is after 2025-08-24T17:21:41Z" Sep 30 07:34:21 crc kubenswrapper[4760]: I0930 07:34:21.739586 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"031672b4-4779-4226-97aa-f5c81a729234\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b3e26d92b5602604d33e83e83e677084c642df64c9d0b9f9e022b036091fb92e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://53390cb3cc86d0db1019c461bf6cb987190e6fa8812f735844028dfa1ce96648\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf87f93832356c96c1c5e90e6459ccdd23700fd12c6d7410305c66f99f6ed18f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e4230b2df5bf87138f5a40f146bef562ae63ec6f5b73d023e4e2b60722854882\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:3
8Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9c416bd4523f4384b4a2283eb2ada263982380925cbc17713726e0790e5fdee\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:38Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2cd9c18868b27fcfb67186a86f415b484f009c33c07d18f5d508155ed011b5d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2cd9c18868b27fcfb67186a86f415b484f009c33c07d18f5d508155ed011b5d4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T07:33:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T07:33:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startT
ime\\\":\\\"2025-09-30T07:33:35Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:34:21Z is after 2025-08-24T17:21:41Z" Sep 30 07:34:21 crc kubenswrapper[4760]: I0930 07:34:21.756202 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"67985d3d-0af6-4a8b-b45a-623aed2e502e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43b97af91f474a3b43f207d377ca205e459b782e552a32a134a8b46e89fc8d30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\
\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d748511816054d5f621bd55e203baea2e6e5ac90dec796364b46224308cfd78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bf1b2ce86bf47ce917a20b48f64b7c4419e3a5c551d006d97171311c3b190f0c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"c
ert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e7c9b2cd9f096d99a8b239c0c2dc771e94e6522a3380733b7452969c9fe2dfd2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T07:33:35Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:34:21Z is after 2025-08-24T17:21:41Z" Sep 30 07:34:21 crc kubenswrapper[4760]: I0930 07:34:21.772756 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c5ac421ed3707970914cf165f54a9324134079296d1b6936332c6047b9c23132\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-09-30T07:34:21Z is after 2025-08-24T17:21:41Z" Sep 30 07:34:21 crc kubenswrapper[4760]: I0930 07:34:21.786135 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:53Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:34:21Z is after 2025-08-24T17:21:41Z" Sep 30 07:34:21 crc kubenswrapper[4760]: I0930 07:34:21.807776 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:34:21 crc kubenswrapper[4760]: I0930 07:34:21.807842 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:34:21 crc kubenswrapper[4760]: I0930 07:34:21.807857 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:34:21 crc kubenswrapper[4760]: I0930 07:34:21.808448 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:34:21 crc kubenswrapper[4760]: I0930 07:34:21.808546 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:34:21Z","lastTransitionTime":"2025-09-30T07:34:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 07:34:21 crc kubenswrapper[4760]: I0930 07:34:21.911961 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:34:21 crc kubenswrapper[4760]: I0930 07:34:21.912036 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:34:21 crc kubenswrapper[4760]: I0930 07:34:21.912057 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:34:21 crc kubenswrapper[4760]: I0930 07:34:21.912085 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:34:21 crc kubenswrapper[4760]: I0930 07:34:21.912105 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:34:21Z","lastTransitionTime":"2025-09-30T07:34:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 07:34:21 crc kubenswrapper[4760]: I0930 07:34:21.966453 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:34:21 crc kubenswrapper[4760]: I0930 07:34:21.966519 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:34:21 crc kubenswrapper[4760]: I0930 07:34:21.966543 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:34:21 crc kubenswrapper[4760]: I0930 07:34:21.966576 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:34:21 crc kubenswrapper[4760]: I0930 07:34:21.966600 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:34:21Z","lastTransitionTime":"2025-09-30T07:34:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 07:34:21 crc kubenswrapper[4760]: E0930 07:34:21.986949 4760 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T07:34:21Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T07:34:21Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T07:34:21Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T07:34:21Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T07:34:21Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T07:34:21Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T07:34:21Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T07:34:21Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"5c02496b-3bdb-4a64-91fc-57c59208ba25\\\",\\\"systemUUID\\\":\\\"ef0efcac-382e-4544-a77e-6adf149d5981\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:34:21Z is after 2025-08-24T17:21:41Z" Sep 30 07:34:21 crc kubenswrapper[4760]: I0930 07:34:21.992570 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:34:21 crc kubenswrapper[4760]: I0930 07:34:21.992612 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:34:21 crc kubenswrapper[4760]: I0930 07:34:21.992622 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:34:21 crc kubenswrapper[4760]: I0930 07:34:21.992639 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:34:21 crc kubenswrapper[4760]: I0930 07:34:21.992650 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:34:21Z","lastTransitionTime":"2025-09-30T07:34:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 07:34:22 crc kubenswrapper[4760]: E0930 07:34:22.012533 4760 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T07:34:21Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T07:34:21Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T07:34:21Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T07:34:21Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T07:34:21Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T07:34:21Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T07:34:21Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T07:34:21Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"5c02496b-3bdb-4a64-91fc-57c59208ba25\\\",\\\"systemUUID\\\":\\\"ef0efcac-382e-4544-a77e-6adf149d5981\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:34:22Z is after 2025-08-24T17:21:41Z" Sep 30 07:34:22 crc kubenswrapper[4760]: I0930 07:34:22.018143 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:34:22 crc kubenswrapper[4760]: I0930 07:34:22.018208 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:34:22 crc kubenswrapper[4760]: I0930 07:34:22.018234 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:34:22 crc kubenswrapper[4760]: I0930 07:34:22.018264 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:34:22 crc kubenswrapper[4760]: I0930 07:34:22.018288 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:34:22Z","lastTransitionTime":"2025-09-30T07:34:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 07:34:22 crc kubenswrapper[4760]: E0930 07:34:22.041789 4760 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T07:34:22Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T07:34:22Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T07:34:22Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T07:34:22Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T07:34:22Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T07:34:22Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T07:34:22Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T07:34:22Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"5c02496b-3bdb-4a64-91fc-57c59208ba25\\\",\\\"systemUUID\\\":\\\"ef0efcac-382e-4544-a77e-6adf149d5981\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:34:22Z is after 2025-08-24T17:21:41Z" Sep 30 07:34:22 crc kubenswrapper[4760]: I0930 07:34:22.047123 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:34:22 crc kubenswrapper[4760]: I0930 07:34:22.047170 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:34:22 crc kubenswrapper[4760]: I0930 07:34:22.047188 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:34:22 crc kubenswrapper[4760]: I0930 07:34:22.047214 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:34:22 crc kubenswrapper[4760]: I0930 07:34:22.047232 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:34:22Z","lastTransitionTime":"2025-09-30T07:34:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 07:34:22 crc kubenswrapper[4760]: E0930 07:34:22.065414 4760 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T07:34:22Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T07:34:22Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T07:34:22Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T07:34:22Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T07:34:22Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T07:34:22Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T07:34:22Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T07:34:22Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"5c02496b-3bdb-4a64-91fc-57c59208ba25\\\",\\\"systemUUID\\\":\\\"ef0efcac-382e-4544-a77e-6adf149d5981\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:34:22Z is after 2025-08-24T17:21:41Z" Sep 30 07:34:22 crc kubenswrapper[4760]: I0930 07:34:22.065957 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 07:34:22 crc kubenswrapper[4760]: I0930 07:34:22.065968 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 07:34:22 crc kubenswrapper[4760]: I0930 07:34:22.066094 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 07:34:22 crc kubenswrapper[4760]: E0930 07:34:22.066283 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 07:34:22 crc kubenswrapper[4760]: E0930 07:34:22.066513 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 07:34:22 crc kubenswrapper[4760]: E0930 07:34:22.066657 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 07:34:22 crc kubenswrapper[4760]: I0930 07:34:22.071498 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:34:22 crc kubenswrapper[4760]: I0930 07:34:22.071557 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:34:22 crc kubenswrapper[4760]: I0930 07:34:22.071578 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:34:22 crc kubenswrapper[4760]: I0930 07:34:22.071607 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:34:22 crc kubenswrapper[4760]: I0930 07:34:22.071625 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:34:22Z","lastTransitionTime":"2025-09-30T07:34:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 07:34:22 crc kubenswrapper[4760]: E0930 07:34:22.129726 4760 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T07:34:22Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T07:34:22Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T07:34:22Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T07:34:22Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T07:34:22Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T07:34:22Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T07:34:22Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T07:34:22Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"5c02496b-3bdb-4a64-91fc-57c59208ba25\\\",\\\"systemUUID\\\":\\\"ef0efcac-382e-4544-a77e-6adf149d5981\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:34:22Z is after 2025-08-24T17:21:41Z" Sep 30 07:34:22 crc kubenswrapper[4760]: E0930 07:34:22.130113 4760 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Sep 30 07:34:22 crc kubenswrapper[4760]: I0930 07:34:22.132520 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:34:22 crc kubenswrapper[4760]: I0930 07:34:22.132563 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:34:22 crc kubenswrapper[4760]: I0930 07:34:22.132582 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:34:22 crc kubenswrapper[4760]: I0930 07:34:22.132609 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:34:22 crc kubenswrapper[4760]: I0930 07:34:22.132628 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:34:22Z","lastTransitionTime":"2025-09-30T07:34:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 07:34:22 crc kubenswrapper[4760]: I0930 07:34:22.235778 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:34:22 crc kubenswrapper[4760]: I0930 07:34:22.235850 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:34:22 crc kubenswrapper[4760]: I0930 07:34:22.235870 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:34:22 crc kubenswrapper[4760]: I0930 07:34:22.235896 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:34:22 crc kubenswrapper[4760]: I0930 07:34:22.235913 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:34:22Z","lastTransitionTime":"2025-09-30T07:34:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 07:34:22 crc kubenswrapper[4760]: I0930 07:34:22.339843 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:34:22 crc kubenswrapper[4760]: I0930 07:34:22.339883 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:34:22 crc kubenswrapper[4760]: I0930 07:34:22.339892 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:34:22 crc kubenswrapper[4760]: I0930 07:34:22.339910 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:34:22 crc kubenswrapper[4760]: I0930 07:34:22.339920 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:34:22Z","lastTransitionTime":"2025-09-30T07:34:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 07:34:22 crc kubenswrapper[4760]: I0930 07:34:22.443497 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:34:22 crc kubenswrapper[4760]: I0930 07:34:22.443543 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:34:22 crc kubenswrapper[4760]: I0930 07:34:22.443560 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:34:22 crc kubenswrapper[4760]: I0930 07:34:22.443585 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:34:22 crc kubenswrapper[4760]: I0930 07:34:22.443601 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:34:22Z","lastTransitionTime":"2025-09-30T07:34:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 07:34:22 crc kubenswrapper[4760]: I0930 07:34:22.448230 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-sspvl_2c4ca8ea-a714-40e5-9e10-080aef32237b/ovnkube-controller/2.log" Sep 30 07:34:22 crc kubenswrapper[4760]: I0930 07:34:22.449433 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-sspvl_2c4ca8ea-a714-40e5-9e10-080aef32237b/ovnkube-controller/1.log" Sep 30 07:34:22 crc kubenswrapper[4760]: I0930 07:34:22.453902 4760 generic.go:334] "Generic (PLEG): container finished" podID="2c4ca8ea-a714-40e5-9e10-080aef32237b" containerID="b84d165d7a40638496b6759c4c2c265fdf742b2c9c968c503352f468f776b7d7" exitCode=1 Sep 30 07:34:22 crc kubenswrapper[4760]: I0930 07:34:22.453961 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-sspvl" event={"ID":"2c4ca8ea-a714-40e5-9e10-080aef32237b","Type":"ContainerDied","Data":"b84d165d7a40638496b6759c4c2c265fdf742b2c9c968c503352f468f776b7d7"} Sep 30 07:34:22 crc kubenswrapper[4760]: I0930 07:34:22.454022 4760 scope.go:117] "RemoveContainer" containerID="e2eb770a63f2bb8de1ec68ee527d04f64bb32a66016581c3acfc665d9bb60784" Sep 30 07:34:22 crc kubenswrapper[4760]: I0930 07:34:22.456509 4760 scope.go:117] "RemoveContainer" containerID="b84d165d7a40638496b6759c4c2c265fdf742b2c9c968c503352f468f776b7d7" Sep 30 07:34:22 crc kubenswrapper[4760]: E0930 07:34:22.457206 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-sspvl_openshift-ovn-kubernetes(2c4ca8ea-a714-40e5-9e10-080aef32237b)\"" pod="openshift-ovn-kubernetes/ovnkube-node-sspvl" podUID="2c4ca8ea-a714-40e5-9e10-080aef32237b" Sep 30 07:34:22 crc kubenswrapper[4760]: I0930 07:34:22.475659 4760 status_manager.go:875] "Failed to update 
status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:53Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:34:22Z is after 2025-08-24T17:21:41Z" Sep 30 07:34:22 crc kubenswrapper[4760]: I0930 07:34:22.496628 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-6vfjl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"aade7c8e-aa34-4b19-9000-d724950a70d7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:34:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:34:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:34:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8c7192e617554ba6b5b9533792f166344fbf9681b63e57c180809e721cc18168\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:34:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-df7r7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e9c7c74e636ef9dbade2541863f96b1098f7ae5f1ab02740c8934f8c87c4815e\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e9c7c74e636ef9dbade2541863f96b1098f7ae5f1ab02740c8934f8c87c4815e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T07:33:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T07:33:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-df7r7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7abb316d62bc2246ace40ad59b2926962a1c098c5690adea8c20a5b96459a604\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7abb316d62bc2246ace40ad59b2926962a1c098c5690adea8c20a5b96459a604\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T07:33:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T07:33:58Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-df7r7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://96c45841896c0c1569ad3071a3114e8c4965cc3968244c91719bcf5df92e5d0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://96c45841896c0c1569ad3071a3114e8c4965cc3968244c91719bcf5df92e5d0f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T07:33:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T07:33:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-df7r7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://db222
0320618b4dcdbf56bad96dbe01c910fa37948f97792f498b3336212cc7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://db2220320618b4dcdbf56bad96dbe01c910fa37948f97792f498b3336212cc7f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T07:34:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T07:34:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-df7r7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5980d5ebcac5351bfa8c26b90968593f3fae3d434e824db5a182449ef64b0a13\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5980d5ebcac5351bfa8c26b90968593f3fae3d434e824db5a182449ef64b0a13\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T07:34:01Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-09-30T07:34:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-df7r7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5912656a198607c9831da9564b5507d662eb2df00df104d52cf058442a76be6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5912656a198607c9831da9564b5507d662eb2df00df104d52cf058442a76be6f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T07:34:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T07:34:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-df7r7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T07:33:56Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-6vfjl\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:34:22Z is after 2025-08-24T17:21:41Z" Sep 30 07:34:22 crc kubenswrapper[4760]: I0930 07:34:22.527291 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9092fdd9-f142-400d-b446-47b8103b9694\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4cc3328dfae8f6d99c152e6a53b86d72f690afbd7bc4847810f745b321e4f3ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests
\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5aaa898c76100f011c3ad0ba61c9735a940b46b0e770f3ede7fc4c881a4722a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b718c4a73fd119f66dd346618976f0b3589b789576e53d367588c2772becd66e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d00b9a56c23ba3b5
701a31412e038c073b5f16eb4e6ebea42134f7d4c0b0f21\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://00d03c8ff6fa9f490ec0550ac57d20786f1f5c10d91011d678116c2249a79e88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ed4236004989a888f1f864535bedd2dc4bcad4c51c87f12be074abbe5f8fe8bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e3
3e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ed4236004989a888f1f864535bedd2dc4bcad4c51c87f12be074abbe5f8fe8bd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T07:33:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T07:33:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://553fa289f27580c1c38da1e9ea12ae7231fc4f60c660a9a3a177391adba9c3f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://553fa289f27580c1c38da1e9ea12ae7231fc4f60c660a9a3a177391adba9c3f2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T07:33:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T07:33:37Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://83b9ee5e77d60387298ccac46ad416a5d56bfd1d81a6ef7f3ea0482520c2ca08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\
":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://83b9ee5e77d60387298ccac46ad416a5d56bfd1d81a6ef7f3ea0482520c2ca08\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T07:33:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T07:33:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T07:33:35Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:34:22Z is after 2025-08-24T17:21:41Z" Sep 30 07:34:22 crc kubenswrapper[4760]: I0930 07:34:22.545943 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:34:22 crc kubenswrapper[4760]: I0930 07:34:22.546014 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:34:22 crc kubenswrapper[4760]: I0930 07:34:22.546040 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:34:22 crc kubenswrapper[4760]: I0930 07:34:22.546073 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:34:22 crc kubenswrapper[4760]: I0930 07:34:22.546099 4760 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:34:22Z","lastTransitionTime":"2025-09-30T07:34:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 07:34:22 crc kubenswrapper[4760]: I0930 07:34:22.551551 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"031672b4-4779-4226-97aa-f5c81a729234\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b3e26d92b5602604d33e83e83e677084c642df64c9d0b9f9e022b036091fb92e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"r
unning\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://53390cb3cc86d0db1019c461bf6cb987190e6fa8812f735844028dfa1ce96648\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf87f93832356c96c1c5e90e6459ccdd23700fd12c6d7410305c66f99f6ed18f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e
4230b2df5bf87138f5a40f146bef562ae63ec6f5b73d023e4e2b60722854882\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9c416bd4523f4384b4a2283eb2ada263982380925cbc17713726e0790e5fdee\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:38Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2cd9c18868b27fcfb67186a86f415b484f009c33c07d18f5d508155ed011b5d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2cd9c18868b27fcfb67186a86f415b484f009c33c07d18f5d508155ed011b5d4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T07:33:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T07:33:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T07:33:35Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:34:22Z is after 2025-08-24T17:21:41Z" Sep 30 07:34:22 crc kubenswrapper[4760]: I0930 07:34:22.572052 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"67985d3d-0af6-4a8b-b45a-623aed2e502e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43b97af91f474a3b43f207d377ca205e459b782e552a32a134a8b46e89fc8d30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d748511816054d5f621bd55e203baea2e6e5ac90dec796364b46224308cfd78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bf1b2ce86bf47ce917a20b48f64b7c4419e3a5c551d006d97171311c3b190f0c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e7c9b2cd9f096d99a8b239c0c2dc771e94e6522a3380733b7452969c9fe2dfd2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-09-30T07:33:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T07:33:35Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:34:22Z is after 2025-08-24T17:21:41Z" Sep 30 07:34:22 crc kubenswrapper[4760]: I0930 07:34:22.590264 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c5ac421ed3707970914cf165f54a9324134079296d1b6936332c6047b9c23132\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-09-30T07:34:22Z is after 2025-08-24T17:21:41Z" Sep 30 07:34:22 crc kubenswrapper[4760]: I0930 07:34:22.610073 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:53Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:34:22Z is after 2025-08-24T17:21:41Z" Sep 30 07:34:22 crc kubenswrapper[4760]: I0930 07:34:22.641135 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-sspvl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2c4ca8ea-a714-40e5-9e10-080aef32237b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:56Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:56Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04cb9ff2859a579a0bd97d932177b8c19b786666d0177669e93ccec38abf08d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lllfc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9e64e8566880e723a6cc19ad788bd1cbb02eb639397aea4acd6e9920b034f379\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lllfc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0b6c1fe1d21cc0d118e36d7f9d62fb639a19a42cd8c953700bcd5c289088cdf9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lllfc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bba81e98fd1a75081b7e6a3eaf28b88b4de6e47e1dc158ca72aa2f1c2107933e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lllfc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://45d8c685d7b690ce81553e4d924934cf761087dd535145c0e8b5a933f2d61e45\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lllfc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef1f6b6ca1a2bbb2f9a166a7244c6411470be4655bc6b2a330fd604c3449d185\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lllfc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b84d165d7a40638496b6759c4c2c265fdf742b2c9c968c503352f468f776b7d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e2eb770a63f2bb8de1ec68ee527d04f64bb32a66016581c3acfc665d9bb60784\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-09-30T07:34:07Z\\\",\\\"message\\\":\\\"vices.LBOpts{Reject:false, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, 
Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{}, Templates:services.TemplateMap{}, Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}, built lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-ingress/router-internal-default_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-ingress/router-internal-default\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.4.176\\\\\\\", Port:80, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}, services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.4.176\\\\\\\", Port:443, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}, services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.4.176\\\\\\\", Port:1936, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI0930 07:34:07.323120 6145 obj_retry.go:365] Adding new object: *v1.Pod openshift-kube-controller-manager/kube-controller-manager-crc\\\\nI0930 07:34:07.323753 6145 ovn.go:134] Ensuring zone \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T07:34:06Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b84d165d7a40638496b6759c4c2c265fdf742b2c9c968c503352f468f776b7d7\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-09-30T07:34:22Z\\\",\\\"message\\\":\\\"ctory.go:160\\\\nI0930 07:34:22.231404 6358 reflector.go:311] Stopping reflector *v1.EgressFirewall (0s) 
from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI0930 07:34:22.232586 6358 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0930 07:34:22.232624 6358 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0930 07:34:22.232630 6358 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0930 07:34:22.232651 6358 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0930 07:34:22.232660 6358 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0930 07:34:22.232662 6358 handler.go:208] Removed *v1.Node event handler 2\\\\nI0930 07:34:22.232664 6358 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0930 07:34:22.232685 6358 handler.go:208] Removed *v1.Node event handler 7\\\\nI0930 07:34:22.232682 6358 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0930 07:34:22.232709 6358 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0930 07:34:22.232740 6358 factory.go:656] Stopping watch factory\\\\nI0930 07:34:22.232758 6358 ovnkube.go:599] Stopped ovnkube\\\\nI0930 07:34:22.232760 6358 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0930 
07:34:2\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T07:34:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":
\\\"kube-api-access-lllfc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e72f37942962598d925dfb305311c94c3ca920f93243bb6e97839add4a830a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:34:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lllfc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1198d13e2814e4dc38c4411d751fba40440a5967d7ca04c855e34fe90f9a6535\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1198d13e2814e4dc38c4411d751fba40440a5967d7ca04c855e34fe90f9a
6535\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T07:33:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T07:33:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lllfc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T07:33:56Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-sspvl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:34:22Z is after 2025-08-24T17:21:41Z" Sep 30 07:34:22 crc kubenswrapper[4760]: I0930 07:34:22.649097 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:34:22 crc kubenswrapper[4760]: I0930 07:34:22.649170 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:34:22 crc kubenswrapper[4760]: I0930 07:34:22.649188 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:34:22 crc kubenswrapper[4760]: I0930 07:34:22.649213 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:34:22 crc kubenswrapper[4760]: I0930 07:34:22.649232 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:34:22Z","lastTransitionTime":"2025-09-30T07:34:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 07:34:22 crc kubenswrapper[4760]: I0930 07:34:22.663939 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-wv8fz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ce6dcf25-c8ea-450b-9fc6-9f8aeafde757\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:34:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:34:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:34:10Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:34:10Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcwbs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcwbs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T07:34:10Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-wv8fz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:34:22Z is after 2025-08-24T17:21:41Z" Sep 30 07:34:22 crc 
kubenswrapper[4760]: I0930 07:34:22.684052 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c5ae239e08e07bda9532c0b6b94ef3687cf512cd05965e09123bbb036f915a5b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3abb25619d0c80e84a2095aba45fe6a865125d242e44e9f10820e8c41a3b0db9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art
-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:34:22Z is after 2025-08-24T17:21:41Z" Sep 30 07:34:22 crc kubenswrapper[4760]: I0930 07:34:22.707600 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers 
with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:34:22Z is after 2025-08-24T17:21:41Z" Sep 30 07:34:22 crc kubenswrapper[4760]: I0930 07:34:22.719070 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-sv6wk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4a8035f8-210d-4a09-bca5-274ced93774c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://774d38a9cedeb69a2093a9de05db794e856295535f86d73a75c6feea05b61cb3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6pk8k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T07:33:56Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-sv6wk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:34:22Z is after 2025-08-24T17:21:41Z" Sep 30 07:34:22 crc kubenswrapper[4760]: I0930 07:34:22.735011 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-lqpj4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2b6201d8-80fd-4701-a4a8-f7ebca1f34ad\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:34:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:34:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:34:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:34:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://09a33e356ac5e20ea894804e56d62f6220b8b9a5123d1b0acc9bbe33a3083792\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d7
93426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:34:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dc5j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://722e623d70449c90fb388b89511f1e75ed55015ec9caa45d9aaa9ac3d9649778\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:34:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dc5j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T07:34:08Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-lqpj4\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:34:22Z is after 2025-08-24T17:21:41Z" Sep 30 07:34:22 crc kubenswrapper[4760]: I0930 07:34:22.750964 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:59Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:59Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fd70ee9ea129c67bb1a9ffac6e1d2abc903949ccd6d2d549f5d78472813dd222\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:34:22Z is after 2025-08-24T17:21:41Z" Sep 30 07:34:22 crc kubenswrapper[4760]: I0930 07:34:22.752212 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:34:22 crc kubenswrapper[4760]: I0930 07:34:22.752289 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:34:22 crc kubenswrapper[4760]: I0930 07:34:22.752347 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:34:22 crc kubenswrapper[4760]: I0930 07:34:22.752380 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:34:22 crc kubenswrapper[4760]: I0930 07:34:22.752404 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:34:22Z","lastTransitionTime":"2025-09-30T07:34:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 07:34:22 crc kubenswrapper[4760]: I0930 07:34:22.765530 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-f2lrk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7a9c8270-6964-4886-87d0-227b05b76da4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e3cf6cd0daf5df3e48c5d44b1a443b1e5646009b9eba42bc3869ee9841e472d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"}
,{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cctgt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://90d669a96085f6fb9b5c9507ff1d5c960bdeafa1d01d7c115b89788e14e155d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cctgt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T07:33:56Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-f2lrk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:34:22Z is after 2025-08-24T17:21:41Z" Sep 30 07:34:22 crc kubenswrapper[4760]: I0930 07:34:22.782490 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-lvdpk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f50c364e-d22c-4fe5-a0aa-66f4e8d8b21e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://96fd613333f4911d54aca94a38724570ef878c3f55ef48caa79eb1c14d0b2014\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8g8kp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T07:33:56Z\\\"}}\" for pod \"openshift-multus\"/\"multus-lvdpk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:34:22Z is after 2025-08-24T17:21:41Z" Sep 30 07:34:22 crc kubenswrapper[4760]: I0930 07:34:22.797743 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-rpvhp" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a8530550-438d-46a5-aa3f-4b10838396f1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:34:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:34:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:34:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0434ec52328c8ee0a2b28fb47833cc6cf12a7048f4c4a217518e029e616fb6a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7j7qm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\
\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T07:33:58Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-rpvhp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:34:22Z is after 2025-08-24T17:21:41Z" Sep 30 07:34:22 crc kubenswrapper[4760]: I0930 07:34:22.855583 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:34:22 crc kubenswrapper[4760]: I0930 07:34:22.855646 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:34:22 crc kubenswrapper[4760]: I0930 07:34:22.855678 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:34:22 crc kubenswrapper[4760]: I0930 07:34:22.855707 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:34:22 crc kubenswrapper[4760]: I0930 07:34:22.855729 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:34:22Z","lastTransitionTime":"2025-09-30T07:34:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 07:34:22 crc kubenswrapper[4760]: I0930 07:34:22.959058 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:34:22 crc kubenswrapper[4760]: I0930 07:34:22.959121 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:34:22 crc kubenswrapper[4760]: I0930 07:34:22.959139 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:34:22 crc kubenswrapper[4760]: I0930 07:34:22.959164 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:34:22 crc kubenswrapper[4760]: I0930 07:34:22.959182 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:34:22Z","lastTransitionTime":"2025-09-30T07:34:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 07:34:23 crc kubenswrapper[4760]: I0930 07:34:23.062600 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:34:23 crc kubenswrapper[4760]: I0930 07:34:23.062664 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:34:23 crc kubenswrapper[4760]: I0930 07:34:23.062682 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:34:23 crc kubenswrapper[4760]: I0930 07:34:23.062713 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:34:23 crc kubenswrapper[4760]: I0930 07:34:23.062750 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:34:23Z","lastTransitionTime":"2025-09-30T07:34:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 07:34:23 crc kubenswrapper[4760]: I0930 07:34:23.066175 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wv8fz" Sep 30 07:34:23 crc kubenswrapper[4760]: E0930 07:34:23.066379 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-wv8fz" podUID="ce6dcf25-c8ea-450b-9fc6-9f8aeafde757" Sep 30 07:34:23 crc kubenswrapper[4760]: I0930 07:34:23.165969 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:34:23 crc kubenswrapper[4760]: I0930 07:34:23.166005 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:34:23 crc kubenswrapper[4760]: I0930 07:34:23.166015 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:34:23 crc kubenswrapper[4760]: I0930 07:34:23.166027 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:34:23 crc kubenswrapper[4760]: I0930 07:34:23.166036 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:34:23Z","lastTransitionTime":"2025-09-30T07:34:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 07:34:23 crc kubenswrapper[4760]: I0930 07:34:23.228516 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-sspvl" Sep 30 07:34:23 crc kubenswrapper[4760]: I0930 07:34:23.269166 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:34:23 crc kubenswrapper[4760]: I0930 07:34:23.269213 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:34:23 crc kubenswrapper[4760]: I0930 07:34:23.269227 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:34:23 crc kubenswrapper[4760]: I0930 07:34:23.269246 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:34:23 crc kubenswrapper[4760]: I0930 07:34:23.269258 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:34:23Z","lastTransitionTime":"2025-09-30T07:34:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 07:34:23 crc kubenswrapper[4760]: I0930 07:34:23.372734 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:34:23 crc kubenswrapper[4760]: I0930 07:34:23.372802 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:34:23 crc kubenswrapper[4760]: I0930 07:34:23.372821 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:34:23 crc kubenswrapper[4760]: I0930 07:34:23.372847 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:34:23 crc kubenswrapper[4760]: I0930 07:34:23.372868 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:34:23Z","lastTransitionTime":"2025-09-30T07:34:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 07:34:23 crc kubenswrapper[4760]: I0930 07:34:23.460616 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-sspvl_2c4ca8ea-a714-40e5-9e10-080aef32237b/ovnkube-controller/2.log" Sep 30 07:34:23 crc kubenswrapper[4760]: I0930 07:34:23.467525 4760 scope.go:117] "RemoveContainer" containerID="b84d165d7a40638496b6759c4c2c265fdf742b2c9c968c503352f468f776b7d7" Sep 30 07:34:23 crc kubenswrapper[4760]: E0930 07:34:23.467804 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-sspvl_openshift-ovn-kubernetes(2c4ca8ea-a714-40e5-9e10-080aef32237b)\"" pod="openshift-ovn-kubernetes/ovnkube-node-sspvl" podUID="2c4ca8ea-a714-40e5-9e10-080aef32237b" Sep 30 07:34:23 crc kubenswrapper[4760]: I0930 07:34:23.475376 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:34:23 crc kubenswrapper[4760]: I0930 07:34:23.475452 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:34:23 crc kubenswrapper[4760]: I0930 07:34:23.475471 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:34:23 crc kubenswrapper[4760]: I0930 07:34:23.475496 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:34:23 crc kubenswrapper[4760]: I0930 07:34:23.475513 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:34:23Z","lastTransitionTime":"2025-09-30T07:34:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI 
configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 07:34:23 crc kubenswrapper[4760]: I0930 07:34:23.485099 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-wv8fz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ce6dcf25-c8ea-450b-9fc6-9f8aeafde757\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:34:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:34:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:34:10Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:34:10Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcwbs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcwbs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T07:34:10Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-wv8fz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:34:23Z is after 2025-08-24T17:21:41Z" Sep 30 07:34:23 crc 
kubenswrapper[4760]: I0930 07:34:23.503714 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c5ae239e08e07bda9532c0b6b94ef3687cf512cd05965e09123bbb036f915a5b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3abb25619d0c80e84a2095aba45fe6a865125d242e44e9f10820e8c41a3b0db9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art
-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:34:23Z is after 2025-08-24T17:21:41Z" Sep 30 07:34:23 crc kubenswrapper[4760]: I0930 07:34:23.525550 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers 
with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:34:23Z is after 2025-08-24T17:21:41Z" Sep 30 07:34:23 crc kubenswrapper[4760]: I0930 07:34:23.541590 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-sv6wk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4a8035f8-210d-4a09-bca5-274ced93774c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://774d38a9cedeb69a2093a9de05db794e856295535f86d73a75c6feea05b61cb3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6pk8k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T07:33:56Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-sv6wk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:34:23Z is after 2025-08-24T17:21:41Z" Sep 30 07:34:23 crc kubenswrapper[4760]: I0930 07:34:23.560586 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-lqpj4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2b6201d8-80fd-4701-a4a8-f7ebca1f34ad\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:34:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:34:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:34:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:34:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://09a33e356ac5e20ea894804e56d62f6220b8b9a5123d1b0acc9bbe33a3083792\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d7
93426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:34:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dc5j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://722e623d70449c90fb388b89511f1e75ed55015ec9caa45d9aaa9ac3d9649778\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:34:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dc5j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T07:34:08Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-lqpj4\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:34:23Z is after 2025-08-24T17:21:41Z" Sep 30 07:34:23 crc kubenswrapper[4760]: I0930 07:34:23.577399 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:59Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:59Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fd70ee9ea129c67bb1a9ffac6e1d2abc903949ccd6d2d549f5d78472813dd222\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:34:23Z is after 2025-08-24T17:21:41Z" Sep 30 07:34:23 crc kubenswrapper[4760]: I0930 07:34:23.578961 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:34:23 crc kubenswrapper[4760]: I0930 07:34:23.579015 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:34:23 crc kubenswrapper[4760]: I0930 07:34:23.579035 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:34:23 crc kubenswrapper[4760]: I0930 07:34:23.579062 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:34:23 crc kubenswrapper[4760]: I0930 07:34:23.579088 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:34:23Z","lastTransitionTime":"2025-09-30T07:34:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 07:34:23 crc kubenswrapper[4760]: I0930 07:34:23.595243 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-f2lrk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7a9c8270-6964-4886-87d0-227b05b76da4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e3cf6cd0daf5df3e48c5d44b1a443b1e5646009b9eba42bc3869ee9841e472d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"}
,{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cctgt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://90d669a96085f6fb9b5c9507ff1d5c960bdeafa1d01d7c115b89788e14e155d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cctgt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T07:33:56Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-f2lrk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:34:23Z is after 2025-08-24T17:21:41Z" Sep 30 07:34:23 crc kubenswrapper[4760]: I0930 07:34:23.617546 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-lvdpk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f50c364e-d22c-4fe5-a0aa-66f4e8d8b21e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://96fd613333f4911d54aca94a38724570ef878c3f55ef48caa79eb1c14d0b2014\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8g8kp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T07:33:56Z\\\"}}\" for pod \"openshift-multus\"/\"multus-lvdpk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:34:23Z is after 2025-08-24T17:21:41Z" Sep 30 07:34:23 crc kubenswrapper[4760]: I0930 07:34:23.634697 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-rpvhp" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a8530550-438d-46a5-aa3f-4b10838396f1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:34:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:34:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:34:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0434ec52328c8ee0a2b28fb47833cc6cf12a7048f4c4a217518e029e616fb6a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7j7qm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\
\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T07:33:58Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-rpvhp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:34:23Z is after 2025-08-24T17:21:41Z" Sep 30 07:34:23 crc kubenswrapper[4760]: I0930 07:34:23.650772 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:53Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod 
was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:34:23Z is after 2025-08-24T17:21:41Z" Sep 30 07:34:23 crc kubenswrapper[4760]: I0930 07:34:23.669814 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-6vfjl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"aade7c8e-aa34-4b19-9000-d724950a70d7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:34:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:34:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:34:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8c7192e617554ba6b5b9533792f166344fbf9681b63e57c180809e721cc18168\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:34:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-df7r7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e9c7c74e636ef9dbade2541863f96b1098f7ae5f1ab02740c8934f8c87c4815e\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e9c7c74e636ef9dbade2541863f96b1098f7ae5f1ab02740c8934f8c87c4815e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T07:33:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T07:33:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-df7r7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7abb316d62bc2246ace40ad59b2926962a1c098c5690adea8c20a5b96459a604\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7abb316d62bc2246ace40ad59b2926962a1c098c5690adea8c20a5b96459a604\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T07:33:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T07:33:58Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-df7r7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://96c45841896c0c1569ad3071a3114e8c4965cc3968244c91719bcf5df92e5d0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://96c45841896c0c1569ad3071a3114e8c4965cc3968244c91719bcf5df92e5d0f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T07:33:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T07:33:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-df7r7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://db222
0320618b4dcdbf56bad96dbe01c910fa37948f97792f498b3336212cc7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://db2220320618b4dcdbf56bad96dbe01c910fa37948f97792f498b3336212cc7f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T07:34:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T07:34:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-df7r7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5980d5ebcac5351bfa8c26b90968593f3fae3d434e824db5a182449ef64b0a13\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5980d5ebcac5351bfa8c26b90968593f3fae3d434e824db5a182449ef64b0a13\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T07:34:01Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-09-30T07:34:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-df7r7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5912656a198607c9831da9564b5507d662eb2df00df104d52cf058442a76be6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5912656a198607c9831da9564b5507d662eb2df00df104d52cf058442a76be6f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T07:34:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T07:34:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-df7r7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T07:33:56Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-6vfjl\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:34:23Z is after 2025-08-24T17:21:41Z" Sep 30 07:34:23 crc kubenswrapper[4760]: I0930 07:34:23.682118 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:34:23 crc kubenswrapper[4760]: I0930 07:34:23.682200 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:34:23 crc kubenswrapper[4760]: I0930 07:34:23.682225 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:34:23 crc kubenswrapper[4760]: I0930 07:34:23.682257 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:34:23 crc kubenswrapper[4760]: I0930 07:34:23.682278 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:34:23Z","lastTransitionTime":"2025-09-30T07:34:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 07:34:23 crc kubenswrapper[4760]: I0930 07:34:23.704690 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9092fdd9-f142-400d-b446-47b8103b9694\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4cc3328dfae8f6d99c152e6a53b86d72f690afbd7bc4847810f745b321e4f3ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5aaa898c76100f011c3ad0ba61c9735a940b46b0e770f3ede7fc4c881a4722a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b718c4a73fd119f66dd346618976f0b3589b789576e53d367588c2772becd66e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d00b9a56c23ba3b5701a31412e038c073b5f16eb4e6ebea42134f7d4c0b0f21\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://00d03c8ff6fa9f490ec0550ac57d20786f1f5c10d91011d678116c2249a79e88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ed4236004989a888f1f864535bedd2dc4bcad4c51c87f12be074abbe5f8fe8bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ed4236004989a888f1f864535bedd2dc4bcad4c51c87f12be074abbe5f8fe8bd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T07:33:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T07:33:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://553fa289f27580c1c38da1e9ea12ae7231fc4f60c660a9a3a177391adba9c3f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://553fa289f27580c1c38da1e9ea12ae7231fc4f60c660a9a3a177391adba9c3f2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T07:33:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T07:33:37Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://83b9ee5e77d60387298ccac46ad416a5d56bfd1d81a6ef7f3ea0482520c2ca08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://83b9ee5e77d60387298ccac46ad416a5d56bfd1d81a6ef7f3ea0482520c2ca08\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T07:33:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
025-09-30T07:33:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T07:33:35Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:34:23Z is after 2025-08-24T17:21:41Z" Sep 30 07:34:23 crc kubenswrapper[4760]: I0930 07:34:23.727794 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"031672b4-4779-4226-97aa-f5c81a729234\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b3e26d92b5602604d33e83e83e677084c642df64c9d0b9f9e022b036091fb92e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://53390cb3cc86d0db1019c461bf6cb987190e6fa8812f735844028dfa1ce96648\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf87f93832356c96c1c5e90e6459ccdd23700fd12c6d7410305c66f99f6ed18f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e4230b2df5bf87138f5a40f146bef562ae63ec6f5b73d023e4e2b60722854882\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:3
8Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9c416bd4523f4384b4a2283eb2ada263982380925cbc17713726e0790e5fdee\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:38Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2cd9c18868b27fcfb67186a86f415b484f009c33c07d18f5d508155ed011b5d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2cd9c18868b27fcfb67186a86f415b484f009c33c07d18f5d508155ed011b5d4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T07:33:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T07:33:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startT
ime\\\":\\\"2025-09-30T07:33:35Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:34:23Z is after 2025-08-24T17:21:41Z" Sep 30 07:34:23 crc kubenswrapper[4760]: I0930 07:34:23.747031 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"67985d3d-0af6-4a8b-b45a-623aed2e502e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43b97af91f474a3b43f207d377ca205e459b782e552a32a134a8b46e89fc8d30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\
\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d748511816054d5f621bd55e203baea2e6e5ac90dec796364b46224308cfd78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bf1b2ce86bf47ce917a20b48f64b7c4419e3a5c551d006d97171311c3b190f0c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"c
ert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e7c9b2cd9f096d99a8b239c0c2dc771e94e6522a3380733b7452969c9fe2dfd2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T07:33:35Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:34:23Z is after 2025-08-24T17:21:41Z" Sep 30 07:34:23 crc kubenswrapper[4760]: I0930 07:34:23.762695 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c5ac421ed3707970914cf165f54a9324134079296d1b6936332c6047b9c23132\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-09-30T07:34:23Z is after 2025-08-24T17:21:41Z" Sep 30 07:34:23 crc kubenswrapper[4760]: I0930 07:34:23.781051 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:53Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:34:23Z is after 2025-08-24T17:21:41Z" Sep 30 07:34:23 crc kubenswrapper[4760]: I0930 07:34:23.785337 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:34:23 crc kubenswrapper[4760]: I0930 07:34:23.785392 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:34:23 crc kubenswrapper[4760]: I0930 07:34:23.785420 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:34:23 crc kubenswrapper[4760]: I0930 07:34:23.785438 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:34:23 crc kubenswrapper[4760]: I0930 07:34:23.785450 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:34:23Z","lastTransitionTime":"2025-09-30T07:34:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 07:34:23 crc kubenswrapper[4760]: I0930 07:34:23.812255 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-sspvl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2c4ca8ea-a714-40e5-9e10-080aef32237b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:56Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:56Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04cb9ff2859a579a0bd97d932177b8c19b786666d0177669e93ccec38abf08d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lllfc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9e64e8566880e723a6cc19ad788bd1cbb02eb639397aea4acd6e9920b034f379\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lllfc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0b6c1fe1d21cc0d118e36d7f9d62fb639a19a42cd8c953700bcd5c289088cdf9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lllfc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bba81e98fd1a75081b7e6a3eaf28b88b4de6e47e1dc158ca72aa2f1c2107933e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:58Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lllfc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://45d8c685d7b690ce81553e4d924934cf761087dd535145c0e8b5a933f2d61e45\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lllfc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef1f6b6ca1a2bbb2f9a166a7244c6411470be4655bc6b2a330fd604c3449d185\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lllfc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b84d165d7a40638496b6759c4c2c265fdf742b2c9c968c503352f468f776b7d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b84d165d7a40638496b6759c4c2c265fdf742b2c9c968c503352f468f776b7d7\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-09-30T07:34:22Z\\\",\\\"message\\\":\\\"ctory.go:160\\\\nI0930 07:34:22.231404 6358 reflector.go:311] Stopping reflector *v1.EgressFirewall (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI0930 07:34:22.232586 6358 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0930 
07:34:22.232624 6358 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0930 07:34:22.232630 6358 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0930 07:34:22.232651 6358 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0930 07:34:22.232660 6358 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0930 07:34:22.232662 6358 handler.go:208] Removed *v1.Node event handler 2\\\\nI0930 07:34:22.232664 6358 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0930 07:34:22.232685 6358 handler.go:208] Removed *v1.Node event handler 7\\\\nI0930 07:34:22.232682 6358 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0930 07:34:22.232709 6358 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0930 07:34:22.232740 6358 factory.go:656] Stopping watch factory\\\\nI0930 07:34:22.232758 6358 ovnkube.go:599] Stopped ovnkube\\\\nI0930 07:34:22.232760 6358 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0930 07:34:2\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T07:34:21Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-sspvl_openshift-ovn-kubernetes(2c4ca8ea-a714-40e5-9e10-080aef32237b)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lllfc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e72f37942962598d925dfb305311c94c3ca920f93243bb6e97839add4a830a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:34:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lllfc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1198d13e2814e4dc38c4411d751fba40440a5967d7ca04c855e34fe90f9a6535\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1198d13e2814e4dc38
c4411d751fba40440a5967d7ca04c855e34fe90f9a6535\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T07:33:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T07:33:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lllfc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T07:33:56Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-sspvl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:34:23Z is after 2025-08-24T17:21:41Z" Sep 30 07:34:23 crc kubenswrapper[4760]: I0930 07:34:23.887924 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:34:23 crc kubenswrapper[4760]: I0930 07:34:23.888006 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:34:23 crc kubenswrapper[4760]: I0930 07:34:23.888030 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:34:23 crc kubenswrapper[4760]: I0930 07:34:23.888061 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:34:23 crc kubenswrapper[4760]: I0930 07:34:23.888084 4760 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:34:23Z","lastTransitionTime":"2025-09-30T07:34:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 07:34:23 crc kubenswrapper[4760]: I0930 07:34:23.991737 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:34:23 crc kubenswrapper[4760]: I0930 07:34:23.991808 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:34:23 crc kubenswrapper[4760]: I0930 07:34:23.991829 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:34:23 crc kubenswrapper[4760]: I0930 07:34:23.991858 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:34:23 crc kubenswrapper[4760]: I0930 07:34:23.991879 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:34:23Z","lastTransitionTime":"2025-09-30T07:34:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 07:34:24 crc kubenswrapper[4760]: I0930 07:34:24.066066 4760 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 07:34:24 crc kubenswrapper[4760]: E0930 07:34:24.066239 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 07:34:24 crc kubenswrapper[4760]: I0930 07:34:24.066090 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 07:34:24 crc kubenswrapper[4760]: I0930 07:34:24.066340 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 07:34:24 crc kubenswrapper[4760]: E0930 07:34:24.066401 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 07:34:24 crc kubenswrapper[4760]: E0930 07:34:24.066547 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 07:34:24 crc kubenswrapper[4760]: I0930 07:34:24.095069 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:34:24 crc kubenswrapper[4760]: I0930 07:34:24.095125 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:34:24 crc kubenswrapper[4760]: I0930 07:34:24.095147 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:34:24 crc kubenswrapper[4760]: I0930 07:34:24.095176 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:34:24 crc kubenswrapper[4760]: I0930 07:34:24.095202 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:34:24Z","lastTransitionTime":"2025-09-30T07:34:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 07:34:24 crc kubenswrapper[4760]: I0930 07:34:24.198161 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:34:24 crc kubenswrapper[4760]: I0930 07:34:24.198227 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:34:24 crc kubenswrapper[4760]: I0930 07:34:24.198245 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:34:24 crc kubenswrapper[4760]: I0930 07:34:24.198270 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:34:24 crc kubenswrapper[4760]: I0930 07:34:24.198287 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:34:24Z","lastTransitionTime":"2025-09-30T07:34:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 07:34:24 crc kubenswrapper[4760]: I0930 07:34:24.301060 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:34:24 crc kubenswrapper[4760]: I0930 07:34:24.301120 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:34:24 crc kubenswrapper[4760]: I0930 07:34:24.301139 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:34:24 crc kubenswrapper[4760]: I0930 07:34:24.301165 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:34:24 crc kubenswrapper[4760]: I0930 07:34:24.301182 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:34:24Z","lastTransitionTime":"2025-09-30T07:34:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 07:34:24 crc kubenswrapper[4760]: I0930 07:34:24.404358 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:34:24 crc kubenswrapper[4760]: I0930 07:34:24.404454 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:34:24 crc kubenswrapper[4760]: I0930 07:34:24.404473 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:34:24 crc kubenswrapper[4760]: I0930 07:34:24.404498 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:34:24 crc kubenswrapper[4760]: I0930 07:34:24.404515 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:34:24Z","lastTransitionTime":"2025-09-30T07:34:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 07:34:24 crc kubenswrapper[4760]: I0930 07:34:24.506981 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:34:24 crc kubenswrapper[4760]: I0930 07:34:24.507023 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:34:24 crc kubenswrapper[4760]: I0930 07:34:24.507035 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:34:24 crc kubenswrapper[4760]: I0930 07:34:24.507052 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:34:24 crc kubenswrapper[4760]: I0930 07:34:24.507064 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:34:24Z","lastTransitionTime":"2025-09-30T07:34:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 07:34:24 crc kubenswrapper[4760]: I0930 07:34:24.609361 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:34:24 crc kubenswrapper[4760]: I0930 07:34:24.609393 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:34:24 crc kubenswrapper[4760]: I0930 07:34:24.609404 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:34:24 crc kubenswrapper[4760]: I0930 07:34:24.609419 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:34:24 crc kubenswrapper[4760]: I0930 07:34:24.609431 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:34:24Z","lastTransitionTime":"2025-09-30T07:34:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 07:34:24 crc kubenswrapper[4760]: I0930 07:34:24.712617 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:34:24 crc kubenswrapper[4760]: I0930 07:34:24.712694 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:34:24 crc kubenswrapper[4760]: I0930 07:34:24.712719 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:34:24 crc kubenswrapper[4760]: I0930 07:34:24.712785 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:34:24 crc kubenswrapper[4760]: I0930 07:34:24.712810 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:34:24Z","lastTransitionTime":"2025-09-30T07:34:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 07:34:24 crc kubenswrapper[4760]: I0930 07:34:24.815931 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:34:24 crc kubenswrapper[4760]: I0930 07:34:24.815985 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:34:24 crc kubenswrapper[4760]: I0930 07:34:24.816003 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:34:24 crc kubenswrapper[4760]: I0930 07:34:24.816064 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:34:24 crc kubenswrapper[4760]: I0930 07:34:24.816083 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:34:24Z","lastTransitionTime":"2025-09-30T07:34:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 07:34:24 crc kubenswrapper[4760]: I0930 07:34:24.919779 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:34:24 crc kubenswrapper[4760]: I0930 07:34:24.919838 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:34:24 crc kubenswrapper[4760]: I0930 07:34:24.919858 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:34:24 crc kubenswrapper[4760]: I0930 07:34:24.919888 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:34:24 crc kubenswrapper[4760]: I0930 07:34:24.919940 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:34:24Z","lastTransitionTime":"2025-09-30T07:34:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 07:34:25 crc kubenswrapper[4760]: I0930 07:34:25.023468 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:34:25 crc kubenswrapper[4760]: I0930 07:34:25.023528 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:34:25 crc kubenswrapper[4760]: I0930 07:34:25.023546 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:34:25 crc kubenswrapper[4760]: I0930 07:34:25.023569 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:34:25 crc kubenswrapper[4760]: I0930 07:34:25.023588 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:34:25Z","lastTransitionTime":"2025-09-30T07:34:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 07:34:25 crc kubenswrapper[4760]: I0930 07:34:25.066007 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wv8fz" Sep 30 07:34:25 crc kubenswrapper[4760]: E0930 07:34:25.066260 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-wv8fz" podUID="ce6dcf25-c8ea-450b-9fc6-9f8aeafde757" Sep 30 07:34:25 crc kubenswrapper[4760]: I0930 07:34:25.087393 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Sep 30 07:34:25 crc kubenswrapper[4760]: I0930 07:34:25.099375 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler/openshift-kube-scheduler-crc"] Sep 30 07:34:25 crc kubenswrapper[4760]: I0930 07:34:25.105376 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-sspvl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2c4ca8ea-a714-40e5-9e10-080aef32237b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:56Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:56Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04cb9ff2859a579a0bd97d932177b8c19b786666d0177669e93ccec38abf08d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lllfc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9e64e8566880e723a6cc19ad788bd1cbb02eb639397aea4acd6e9920b034f379\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lllfc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0b6c1fe1d21cc0d118e36d7f9d62fb639a19a42cd8c953700bcd5c289088cdf9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lllfc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bba81e98fd1a75081b7e6a3eaf28b88b4de6e47e1dc158ca72aa2f1c2107933e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:58Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lllfc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://45d8c685d7b690ce81553e4d924934cf761087dd535145c0e8b5a933f2d61e45\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lllfc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef1f6b6ca1a2bbb2f9a166a7244c6411470be4655bc6b2a330fd604c3449d185\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lllfc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b84d165d7a40638496b6759c4c2c265fdf742b2c9c968c503352f468f776b7d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b84d165d7a40638496b6759c4c2c265fdf742b2c9c968c503352f468f776b7d7\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-09-30T07:34:22Z\\\",\\\"message\\\":\\\"ctory.go:160\\\\nI0930 07:34:22.231404 6358 reflector.go:311] Stopping reflector *v1.EgressFirewall (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI0930 07:34:22.232586 6358 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0930 
07:34:22.232624 6358 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0930 07:34:22.232630 6358 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0930 07:34:22.232651 6358 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0930 07:34:22.232660 6358 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0930 07:34:22.232662 6358 handler.go:208] Removed *v1.Node event handler 2\\\\nI0930 07:34:22.232664 6358 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0930 07:34:22.232685 6358 handler.go:208] Removed *v1.Node event handler 7\\\\nI0930 07:34:22.232682 6358 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0930 07:34:22.232709 6358 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0930 07:34:22.232740 6358 factory.go:656] Stopping watch factory\\\\nI0930 07:34:22.232758 6358 ovnkube.go:599] Stopped ovnkube\\\\nI0930 07:34:22.232760 6358 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0930 07:34:2\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T07:34:21Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-sspvl_openshift-ovn-kubernetes(2c4ca8ea-a714-40e5-9e10-080aef32237b)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lllfc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e72f37942962598d925dfb305311c94c3ca920f93243bb6e97839add4a830a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:34:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lllfc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1198d13e2814e4dc38c4411d751fba40440a5967d7ca04c855e34fe90f9a6535\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1198d13e2814e4dc38
c4411d751fba40440a5967d7ca04c855e34fe90f9a6535\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T07:33:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T07:33:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lllfc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T07:33:56Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-sspvl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:34:25Z is after 2025-08-24T17:21:41Z" Sep 30 07:34:25 crc kubenswrapper[4760]: I0930 07:34:25.121558 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-wv8fz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ce6dcf25-c8ea-450b-9fc6-9f8aeafde757\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:34:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:34:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:34:10Z\\\",\\\"message\\\":\\\"containers 
with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:34:10Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcwbs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcwbs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T07:34:10Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-wv8fz\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:34:25Z is after 2025-08-24T17:21:41Z" Sep 30 07:34:25 crc kubenswrapper[4760]: I0930 07:34:25.127571 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:34:25 crc kubenswrapper[4760]: I0930 07:34:25.127609 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:34:25 crc kubenswrapper[4760]: I0930 07:34:25.127621 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:34:25 crc kubenswrapper[4760]: I0930 07:34:25.127637 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:34:25 crc kubenswrapper[4760]: I0930 07:34:25.127649 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:34:25Z","lastTransitionTime":"2025-09-30T07:34:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 07:34:25 crc kubenswrapper[4760]: I0930 07:34:25.140109 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c5ae239e08e07bda9532c0b6b94ef3687cf512cd05965e09123bbb036f915a5b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3abb25619d0c80e84a2095aba45fe6a865125d242e44e9f10820e8c41a3b0db9\\\",\\\
"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:34:25Z is after 2025-08-24T17:21:41Z" Sep 30 07:34:25 crc kubenswrapper[4760]: I0930 07:34:25.161394 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:34:25Z is after 2025-08-24T17:21:41Z" Sep 30 07:34:25 crc kubenswrapper[4760]: I0930 07:34:25.179918 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-sv6wk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4a8035f8-210d-4a09-bca5-274ced93774c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://774d38a9cedeb69a2093a9de05db794e856295535f86d73a75c6feea05b61cb3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6pk8k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T07:33:56Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-sv6wk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:34:25Z is after 2025-08-24T17:21:41Z" Sep 30 07:34:25 crc kubenswrapper[4760]: I0930 07:34:25.202831 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-lqpj4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2b6201d8-80fd-4701-a4a8-f7ebca1f34ad\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:34:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:34:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:34:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:34:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://09a33e356ac5e20ea894804e56d62f6220b8b9a5123d1b0acc9bbe33a3083792\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d7
93426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:34:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dc5j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://722e623d70449c90fb388b89511f1e75ed55015ec9caa45d9aaa9ac3d9649778\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:34:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dc5j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T07:34:08Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-lqpj4\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:34:25Z is after 2025-08-24T17:21:41Z" Sep 30 07:34:25 crc kubenswrapper[4760]: I0930 07:34:25.222447 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:59Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:59Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fd70ee9ea129c67bb1a9ffac6e1d2abc903949ccd6d2d549f5d78472813dd222\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:34:25Z is after 2025-08-24T17:21:41Z" Sep 30 07:34:25 crc kubenswrapper[4760]: I0930 07:34:25.229966 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:34:25 crc kubenswrapper[4760]: I0930 07:34:25.230008 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:34:25 crc kubenswrapper[4760]: I0930 07:34:25.230019 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:34:25 crc kubenswrapper[4760]: I0930 07:34:25.230034 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:34:25 crc kubenswrapper[4760]: I0930 07:34:25.230047 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:34:25Z","lastTransitionTime":"2025-09-30T07:34:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 07:34:25 crc kubenswrapper[4760]: I0930 07:34:25.238548 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-f2lrk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7a9c8270-6964-4886-87d0-227b05b76da4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e3cf6cd0daf5df3e48c5d44b1a443b1e5646009b9eba42bc3869ee9841e472d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"}
,{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cctgt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://90d669a96085f6fb9b5c9507ff1d5c960bdeafa1d01d7c115b89788e14e155d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cctgt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T07:33:56Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-f2lrk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:34:25Z is after 2025-08-24T17:21:41Z" Sep 30 07:34:25 crc kubenswrapper[4760]: I0930 07:34:25.258768 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-lvdpk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f50c364e-d22c-4fe5-a0aa-66f4e8d8b21e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://96fd613333f4911d54aca94a38724570ef878c3f55ef48caa79eb1c14d0b2014\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8g8kp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T07:33:56Z\\\"}}\" for pod \"openshift-multus\"/\"multus-lvdpk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:34:25Z is after 2025-08-24T17:21:41Z" Sep 30 07:34:25 crc kubenswrapper[4760]: I0930 07:34:25.270250 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-rpvhp" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a8530550-438d-46a5-aa3f-4b10838396f1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:34:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:34:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:34:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0434ec52328c8ee0a2b28fb47833cc6cf12a7048f4c4a217518e029e616fb6a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7j7qm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\
\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T07:33:58Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-rpvhp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:34:25Z is after 2025-08-24T17:21:41Z" Sep 30 07:34:25 crc kubenswrapper[4760]: I0930 07:34:25.285542 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:53Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod 
was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:34:25Z is after 2025-08-24T17:21:41Z" Sep 30 07:34:25 crc kubenswrapper[4760]: I0930 07:34:25.303640 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-6vfjl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"aade7c8e-aa34-4b19-9000-d724950a70d7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:34:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:34:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:34:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8c7192e617554ba6b5b9533792f166344fbf9681b63e57c180809e721cc18168\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:34:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-df7r7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e9c7c74e636ef9dbade2541863f96b1098f7ae5f1ab02740c8934f8c87c4815e\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e9c7c74e636ef9dbade2541863f96b1098f7ae5f1ab02740c8934f8c87c4815e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T07:33:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T07:33:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-df7r7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7abb316d62bc2246ace40ad59b2926962a1c098c5690adea8c20a5b96459a604\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7abb316d62bc2246ace40ad59b2926962a1c098c5690adea8c20a5b96459a604\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T07:33:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T07:33:58Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-df7r7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://96c45841896c0c1569ad3071a3114e8c4965cc3968244c91719bcf5df92e5d0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://96c45841896c0c1569ad3071a3114e8c4965cc3968244c91719bcf5df92e5d0f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T07:33:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T07:33:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-df7r7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://db222
0320618b4dcdbf56bad96dbe01c910fa37948f97792f498b3336212cc7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://db2220320618b4dcdbf56bad96dbe01c910fa37948f97792f498b3336212cc7f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T07:34:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T07:34:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-df7r7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5980d5ebcac5351bfa8c26b90968593f3fae3d434e824db5a182449ef64b0a13\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5980d5ebcac5351bfa8c26b90968593f3fae3d434e824db5a182449ef64b0a13\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T07:34:01Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-09-30T07:34:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-df7r7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5912656a198607c9831da9564b5507d662eb2df00df104d52cf058442a76be6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5912656a198607c9831da9564b5507d662eb2df00df104d52cf058442a76be6f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T07:34:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T07:34:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-df7r7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T07:33:56Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-6vfjl\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:34:25Z is after 2025-08-24T17:21:41Z" Sep 30 07:34:25 crc kubenswrapper[4760]: I0930 07:34:25.329568 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9092fdd9-f142-400d-b446-47b8103b9694\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4cc3328dfae8f6d99c152e6a53b86d72f690afbd7bc4847810f745b321e4f3ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests
\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5aaa898c76100f011c3ad0ba61c9735a940b46b0e770f3ede7fc4c881a4722a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b718c4a73fd119f66dd346618976f0b3589b789576e53d367588c2772becd66e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d00b9a56c23ba3b5
701a31412e038c073b5f16eb4e6ebea42134f7d4c0b0f21\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://00d03c8ff6fa9f490ec0550ac57d20786f1f5c10d91011d678116c2249a79e88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ed4236004989a888f1f864535bedd2dc4bcad4c51c87f12be074abbe5f8fe8bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e3
3e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ed4236004989a888f1f864535bedd2dc4bcad4c51c87f12be074abbe5f8fe8bd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T07:33:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T07:33:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://553fa289f27580c1c38da1e9ea12ae7231fc4f60c660a9a3a177391adba9c3f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://553fa289f27580c1c38da1e9ea12ae7231fc4f60c660a9a3a177391adba9c3f2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T07:33:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T07:33:37Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://83b9ee5e77d60387298ccac46ad416a5d56bfd1d81a6ef7f3ea0482520c2ca08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\
":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://83b9ee5e77d60387298ccac46ad416a5d56bfd1d81a6ef7f3ea0482520c2ca08\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T07:33:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T07:33:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T07:33:35Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:34:25Z is after 2025-08-24T17:21:41Z" Sep 30 07:34:25 crc kubenswrapper[4760]: I0930 07:34:25.332625 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:34:25 crc kubenswrapper[4760]: I0930 07:34:25.332715 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:34:25 crc kubenswrapper[4760]: I0930 07:34:25.332742 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:34:25 crc kubenswrapper[4760]: I0930 07:34:25.332778 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:34:25 crc kubenswrapper[4760]: I0930 07:34:25.332806 4760 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:34:25Z","lastTransitionTime":"2025-09-30T07:34:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 07:34:25 crc kubenswrapper[4760]: I0930 07:34:25.350930 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"031672b4-4779-4226-97aa-f5c81a729234\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b3e26d92b5602604d33e83e83e677084c642df64c9d0b9f9e022b036091fb92e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"r
unning\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://53390cb3cc86d0db1019c461bf6cb987190e6fa8812f735844028dfa1ce96648\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf87f93832356c96c1c5e90e6459ccdd23700fd12c6d7410305c66f99f6ed18f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e
4230b2df5bf87138f5a40f146bef562ae63ec6f5b73d023e4e2b60722854882\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9c416bd4523f4384b4a2283eb2ada263982380925cbc17713726e0790e5fdee\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:38Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2cd9c18868b27fcfb67186a86f415b484f009c33c07d18f5d508155ed011b5d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2cd9c18868b27fcfb67186a86f415b484f009c33c07d18f5d508155ed011b5d4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T07:33:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T07:33:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T07:33:35Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:34:25Z is after 2025-08-24T17:21:41Z" Sep 30 07:34:25 crc kubenswrapper[4760]: I0930 07:34:25.371374 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"67985d3d-0af6-4a8b-b45a-623aed2e502e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43b97af91f474a3b43f207d377ca205e459b782e552a32a134a8b46e89fc8d30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d748511816054d5f621bd55e203baea2e6e5ac90dec796364b46224308cfd78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bf1b2ce86bf47ce917a20b48f64b7c4419e3a5c551d006d97171311c3b190f0c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e7c9b2cd9f096d99a8b239c0c2dc771e94e6522a3380733b7452969c9fe2dfd2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-09-30T07:33:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T07:33:35Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:34:25Z is after 2025-08-24T17:21:41Z" Sep 30 07:34:25 crc kubenswrapper[4760]: I0930 07:34:25.388389 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c5ac421ed3707970914cf165f54a9324134079296d1b6936332c6047b9c23132\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-09-30T07:34:25Z is after 2025-08-24T17:21:41Z" Sep 30 07:34:25 crc kubenswrapper[4760]: I0930 07:34:25.409380 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:53Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:34:25Z is after 2025-08-24T17:21:41Z" Sep 30 07:34:25 crc kubenswrapper[4760]: I0930 07:34:25.429602 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c5ae239e08e07bda9532c0b6b94ef3687cf512cd05965e09123bbb036f915a5b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3abb25619d0c80e84a2095aba45fe6a865125d242e44e9f10820e8c41a3b0db9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:34:25Z is after 2025-08-24T17:21:41Z" Sep 30 07:34:25 crc kubenswrapper[4760]: I0930 07:34:25.436054 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:34:25 crc kubenswrapper[4760]: I0930 07:34:25.436103 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:34:25 crc kubenswrapper[4760]: I0930 07:34:25.436122 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:34:25 crc kubenswrapper[4760]: I0930 07:34:25.436146 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:34:25 crc kubenswrapper[4760]: I0930 07:34:25.436164 4760 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:34:25Z","lastTransitionTime":"2025-09-30T07:34:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 07:34:25 crc kubenswrapper[4760]: I0930 07:34:25.451618 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:34:25Z is after 2025-08-24T17:21:41Z" Sep 30 07:34:25 crc kubenswrapper[4760]: I0930 07:34:25.468869 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-sv6wk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4a8035f8-210d-4a09-bca5-274ced93774c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://774d38a9cedeb69a2093a9de05db794e856295535f86d73a75c6feea05b61cb3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6pk8k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T07:33:56Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-sv6wk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:34:25Z is after 2025-08-24T17:21:41Z" Sep 30 07:34:25 crc kubenswrapper[4760]: I0930 07:34:25.485757 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-lqpj4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2b6201d8-80fd-4701-a4a8-f7ebca1f34ad\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:34:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:34:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:34:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:34:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://09a33e356ac5e20ea894804e56d62f6220b8b9a5123d1b0acc9bbe33a3083792\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d7
93426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:34:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dc5j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://722e623d70449c90fb388b89511f1e75ed55015ec9caa45d9aaa9ac3d9649778\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:34:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dc5j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T07:34:08Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-lqpj4\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:34:25Z is after 2025-08-24T17:21:41Z" Sep 30 07:34:25 crc kubenswrapper[4760]: I0930 07:34:25.502683 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-wv8fz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ce6dcf25-c8ea-450b-9fc6-9f8aeafde757\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:34:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:34:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:34:10Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:34:10Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcwbs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcwbs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T07:34:10Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-wv8fz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:34:25Z is after 2025-08-24T17:21:41Z" Sep 30 07:34:25 crc 
kubenswrapper[4760]: I0930 07:34:25.522222 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:59Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:59Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fd70ee9ea129c67bb1a9ffac6e1d2abc903949ccd6d2d549f5d78472813dd222\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:34:25Z is after 2025-08-24T17:21:41Z" Sep 30 07:34:25 crc kubenswrapper[4760]: I0930 07:34:25.538718 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:34:25 crc kubenswrapper[4760]: I0930 07:34:25.538783 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:34:25 crc kubenswrapper[4760]: I0930 07:34:25.538803 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:34:25 crc kubenswrapper[4760]: I0930 07:34:25.538830 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:34:25 crc kubenswrapper[4760]: I0930 07:34:25.538849 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:34:25Z","lastTransitionTime":"2025-09-30T07:34:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 07:34:25 crc kubenswrapper[4760]: I0930 07:34:25.543755 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-f2lrk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7a9c8270-6964-4886-87d0-227b05b76da4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e3cf6cd0daf5df3e48c5d44b1a443b1e5646009b9eba42bc3869ee9841e472d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"}
,{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cctgt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://90d669a96085f6fb9b5c9507ff1d5c960bdeafa1d01d7c115b89788e14e155d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cctgt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T07:33:56Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-f2lrk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:34:25Z is after 2025-08-24T17:21:41Z" Sep 30 07:34:25 crc kubenswrapper[4760]: I0930 07:34:25.561024 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-lvdpk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f50c364e-d22c-4fe5-a0aa-66f4e8d8b21e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://96fd613333f4911d54aca94a38724570ef878c3f55ef48caa79eb1c14d0b2014\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8g8kp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T07:33:56Z\\\"}}\" for pod \"openshift-multus\"/\"multus-lvdpk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:34:25Z is after 2025-08-24T17:21:41Z" Sep 30 07:34:25 crc kubenswrapper[4760]: I0930 07:34:25.575188 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-rpvhp" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a8530550-438d-46a5-aa3f-4b10838396f1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:34:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:34:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:34:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0434ec52328c8ee0a2b28fb47833cc6cf12a7048f4c4a217518e029e616fb6a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7j7qm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\
\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T07:33:58Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-rpvhp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:34:25Z is after 2025-08-24T17:21:41Z" Sep 30 07:34:25 crc kubenswrapper[4760]: I0930 07:34:25.600563 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9092fdd9-f142-400d-b446-47b8103b9694\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4cc3328dfae8f6d99c152e6a53b86d72f690afbd7bc4847810f745b321e4f3ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\
"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5aaa898c76100f011c3ad0ba61c9735a940b46b0e770f3ede7fc4c881a4722a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b718c4a73fd119f66dd346618976f0b3589b789576e53d367588c2772becd66e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-0
9-30T07:33:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d00b9a56c23ba3b5701a31412e038c073b5f16eb4e6ebea42134f7d4c0b0f21\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://00d03c8ff6fa9f490ec0550ac57d20786f1f5c10d91011d678116c2249a79e88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\"
:\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ed4236004989a888f1f864535bedd2dc4bcad4c51c87f12be074abbe5f8fe8bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ed4236004989a888f1f864535bedd2dc4bcad4c51c87f12be074abbe5f8fe8bd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T07:33:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T07:33:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://553fa289f27580c1c38da1e9ea12ae7231fc4f60c660a9a3a177391adba9c3f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://553fa289f27580c1c38da1e9ea12ae7231fc4f60c660a9a3a177391adba9c3f2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T07:33:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T07:33:37Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://83b9ee5e77d60387298ccac46ad416a5d56bfd1d81a6ef7f3ea0482520c2ca08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"qua
y.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://83b9ee5e77d60387298ccac46ad416a5d56bfd1d81a6ef7f3ea0482520c2ca08\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T07:33:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T07:33:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T07:33:35Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:34:25Z is after 2025-08-24T17:21:41Z" Sep 30 07:34:25 crc kubenswrapper[4760]: I0930 07:34:25.622789 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"031672b4-4779-4226-97aa-f5c81a729234\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b3e26d92b5602604d33e83e83e677084c642df64c9d0b9f9e022b036091fb92e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://53390cb3cc86d0db1019c461bf6cb987190e6fa8812f735844028dfa1ce96648\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf87f93832356c96c1c5e90e6459ccdd23700fd12c6d7410305c66f99f6ed18f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e4230b2df5bf87138f5a40f146bef562ae63ec6f5b73d023e4e2b60722854882\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:3
8Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9c416bd4523f4384b4a2283eb2ada263982380925cbc17713726e0790e5fdee\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:38Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2cd9c18868b27fcfb67186a86f415b484f009c33c07d18f5d508155ed011b5d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2cd9c18868b27fcfb67186a86f415b484f009c33c07d18f5d508155ed011b5d4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T07:33:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T07:33:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startT
ime\\\":\\\"2025-09-30T07:33:35Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:34:25Z is after 2025-08-24T17:21:41Z" Sep 30 07:34:25 crc kubenswrapper[4760]: I0930 07:34:25.642541 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:34:25 crc kubenswrapper[4760]: I0930 07:34:25.642622 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:34:25 crc kubenswrapper[4760]: I0930 07:34:25.642642 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:34:25 crc kubenswrapper[4760]: I0930 07:34:25.642670 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:34:25 crc kubenswrapper[4760]: I0930 07:34:25.642689 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:34:25Z","lastTransitionTime":"2025-09-30T07:34:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 07:34:25 crc kubenswrapper[4760]: I0930 07:34:25.643609 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"67985d3d-0af6-4a8b-b45a-623aed2e502e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43b97af91f474a3b43f207d377ca205e459b782e552a32a134a8b46e89fc8d30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d748511816
054d5f621bd55e203baea2e6e5ac90dec796364b46224308cfd78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bf1b2ce86bf47ce917a20b48f64b7c4419e3a5c551d006d97171311c3b190f0c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e7c9b2cd9f096d99a8b239c0c2dc771e94e6522a3380733b7452969c9fe2dfd2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:850
6ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T07:33:35Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:34:25Z is after 2025-08-24T17:21:41Z" Sep 30 07:34:25 crc kubenswrapper[4760]: I0930 07:34:25.664969 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c5ac421ed3707970914cf165f54a9324134079296d1b6936332c6047b9c23132\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-09-30T07:34:25Z is after 2025-08-24T17:21:41Z" Sep 30 07:34:25 crc kubenswrapper[4760]: I0930 07:34:25.680582 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:53Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:34:25Z is after 2025-08-24T17:21:41Z" Sep 30 07:34:25 crc kubenswrapper[4760]: I0930 07:34:25.698578 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:34:25Z is after 2025-08-24T17:21:41Z" Sep 30 07:34:25 crc kubenswrapper[4760]: I0930 07:34:25.716106 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-6vfjl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"aade7c8e-aa34-4b19-9000-d724950a70d7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:34:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:34:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:34:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8c7192e617554ba6b5b9533792f166344fbf9681b63e57c180809e721cc18168\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:34:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-df7r7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e9c7c74e636ef9dbade2541863f96b1098f7ae5f1ab02740c8934f8c87c4815e\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e9c7c74e636ef9dbade2541863f96b1098f7ae5f1ab02740c8934f8c87c4815e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T07:33:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T07:33:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-df7r7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7abb316d62bc2246ace40ad59b2926962a1c098c5690adea8c20a5b96459a604\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7abb316d62bc2246ace40ad59b2926962a1c098c5690adea8c20a5b96459a604\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T07:33:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T07:33:58Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-df7r7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://96c45841896c0c1569ad3071a3114e8c4965cc3968244c91719bcf5df92e5d0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://96c45841896c0c1569ad3071a3114e8c4965cc3968244c91719bcf5df92e5d0f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T07:33:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T07:33:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-df7r7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://db222
0320618b4dcdbf56bad96dbe01c910fa37948f97792f498b3336212cc7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://db2220320618b4dcdbf56bad96dbe01c910fa37948f97792f498b3336212cc7f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T07:34:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T07:34:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-df7r7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5980d5ebcac5351bfa8c26b90968593f3fae3d434e824db5a182449ef64b0a13\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5980d5ebcac5351bfa8c26b90968593f3fae3d434e824db5a182449ef64b0a13\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T07:34:01Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-09-30T07:34:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-df7r7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5912656a198607c9831da9564b5507d662eb2df00df104d52cf058442a76be6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5912656a198607c9831da9564b5507d662eb2df00df104d52cf058442a76be6f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T07:34:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T07:34:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-df7r7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T07:33:56Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-6vfjl\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:34:25Z is after 2025-08-24T17:21:41Z" Sep 30 07:34:25 crc kubenswrapper[4760]: I0930 07:34:25.731909 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"943219e9-5457-4767-b29c-cdd155ca3cb1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:34:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:34:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d9171c7e60156e3ffabb5954ff097de7c4cb0629967cdddb44f16bf86ae38c0a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mou
ntPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f75ced5ba38de06feef6d8addd42a76b6b14cab6a67a351432fdb580eb0ca3c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5ead42377ef36b0c38401509b29432dd9b6fdf73cc39f2dbea0930415492a4ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2f5de75ed9124ec7a6cd0d208
62001c06327d05876de38fdbb1270a9e9e8df54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2f5de75ed9124ec7a6cd0d20862001c06327d05876de38fdbb1270a9e9e8df54\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T07:33:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T07:33:36Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T07:33:35Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:34:25Z is after 2025-08-24T17:21:41Z" Sep 30 07:34:25 crc kubenswrapper[4760]: I0930 07:34:25.745998 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:34:25 crc kubenswrapper[4760]: I0930 07:34:25.746056 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:34:25 crc kubenswrapper[4760]: I0930 07:34:25.746075 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:34:25 crc kubenswrapper[4760]: I0930 07:34:25.746100 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:34:25 crc 
kubenswrapper[4760]: I0930 07:34:25.746122 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:34:25Z","lastTransitionTime":"2025-09-30T07:34:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 07:34:25 crc kubenswrapper[4760]: I0930 07:34:25.755587 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-sspvl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2c4ca8ea-a714-40e5-9e10-080aef32237b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:56Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:56Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04cb9ff2859a579a0bd97d932177b8c19b786666d0177669e93ccec38abf08d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lllfc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9e64e8566880e723a6cc19ad788bd1cbb02eb639397aea4acd6e9920b034f379\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lllfc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0b6c1fe1d21cc0d118e36d7f9d62fb639a19a42cd8c953700bcd5c289088cdf9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lllfc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bba81e98fd1a75081b7e6a3eaf28b88b4de6e47e1dc158ca72aa2f1c2107933e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:58Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lllfc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://45d8c685d7b690ce81553e4d924934cf761087dd535145c0e8b5a933f2d61e45\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lllfc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef1f6b6ca1a2bbb2f9a166a7244c6411470be4655bc6b2a330fd604c3449d185\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lllfc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b84d165d7a40638496b6759c4c2c265fdf742b2c9c968c503352f468f776b7d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b84d165d7a40638496b6759c4c2c265fdf742b2c9c968c503352f468f776b7d7\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-09-30T07:34:22Z\\\",\\\"message\\\":\\\"ctory.go:160\\\\nI0930 07:34:22.231404 6358 reflector.go:311] Stopping reflector *v1.EgressFirewall (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI0930 07:34:22.232586 6358 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0930 
07:34:22.232624 6358 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0930 07:34:22.232630 6358 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0930 07:34:22.232651 6358 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0930 07:34:22.232660 6358 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0930 07:34:22.232662 6358 handler.go:208] Removed *v1.Node event handler 2\\\\nI0930 07:34:22.232664 6358 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0930 07:34:22.232685 6358 handler.go:208] Removed *v1.Node event handler 7\\\\nI0930 07:34:22.232682 6358 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0930 07:34:22.232709 6358 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0930 07:34:22.232740 6358 factory.go:656] Stopping watch factory\\\\nI0930 07:34:22.232758 6358 ovnkube.go:599] Stopped ovnkube\\\\nI0930 07:34:22.232760 6358 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0930 07:34:2\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T07:34:21Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-sspvl_openshift-ovn-kubernetes(2c4ca8ea-a714-40e5-9e10-080aef32237b)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lllfc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e72f37942962598d925dfb305311c94c3ca920f93243bb6e97839add4a830a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:34:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lllfc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1198d13e2814e4dc38c4411d751fba40440a5967d7ca04c855e34fe90f9a6535\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1198d13e2814e4dc38
c4411d751fba40440a5967d7ca04c855e34fe90f9a6535\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T07:33:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T07:33:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lllfc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T07:33:56Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-sspvl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:34:25Z is after 2025-08-24T17:21:41Z" Sep 30 07:34:25 crc kubenswrapper[4760]: I0930 07:34:25.849908 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:34:25 crc kubenswrapper[4760]: I0930 07:34:25.849979 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:34:25 crc kubenswrapper[4760]: I0930 07:34:25.849997 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:34:25 crc kubenswrapper[4760]: I0930 07:34:25.850020 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:34:25 crc kubenswrapper[4760]: I0930 07:34:25.850040 4760 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:34:25Z","lastTransitionTime":"2025-09-30T07:34:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 07:34:25 crc kubenswrapper[4760]: I0930 07:34:25.953342 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:34:25 crc kubenswrapper[4760]: I0930 07:34:25.953415 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:34:25 crc kubenswrapper[4760]: I0930 07:34:25.953433 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:34:25 crc kubenswrapper[4760]: I0930 07:34:25.953459 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:34:25 crc kubenswrapper[4760]: I0930 07:34:25.953480 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:34:25Z","lastTransitionTime":"2025-09-30T07:34:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 07:34:26 crc kubenswrapper[4760]: I0930 07:34:26.055637 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:34:26 crc kubenswrapper[4760]: I0930 07:34:26.055718 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:34:26 crc kubenswrapper[4760]: I0930 07:34:26.055737 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:34:26 crc kubenswrapper[4760]: I0930 07:34:26.055793 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:34:26 crc kubenswrapper[4760]: I0930 07:34:26.055871 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:34:26Z","lastTransitionTime":"2025-09-30T07:34:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 07:34:26 crc kubenswrapper[4760]: I0930 07:34:26.066032 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 07:34:26 crc kubenswrapper[4760]: I0930 07:34:26.066153 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 07:34:26 crc kubenswrapper[4760]: I0930 07:34:26.066194 4760 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 07:34:26 crc kubenswrapper[4760]: E0930 07:34:26.066386 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 07:34:26 crc kubenswrapper[4760]: E0930 07:34:26.066575 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 07:34:26 crc kubenswrapper[4760]: E0930 07:34:26.066753 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 07:34:26 crc kubenswrapper[4760]: I0930 07:34:26.158579 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:34:26 crc kubenswrapper[4760]: I0930 07:34:26.158630 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:34:26 crc kubenswrapper[4760]: I0930 07:34:26.158649 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:34:26 crc kubenswrapper[4760]: I0930 07:34:26.158672 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:34:26 crc kubenswrapper[4760]: I0930 07:34:26.158689 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:34:26Z","lastTransitionTime":"2025-09-30T07:34:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 07:34:26 crc kubenswrapper[4760]: I0930 07:34:26.262238 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:34:26 crc kubenswrapper[4760]: I0930 07:34:26.262341 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:34:26 crc kubenswrapper[4760]: I0930 07:34:26.262360 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:34:26 crc kubenswrapper[4760]: I0930 07:34:26.262382 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:34:26 crc kubenswrapper[4760]: I0930 07:34:26.262399 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:34:26Z","lastTransitionTime":"2025-09-30T07:34:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 07:34:26 crc kubenswrapper[4760]: I0930 07:34:26.365492 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:34:26 crc kubenswrapper[4760]: I0930 07:34:26.365563 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:34:26 crc kubenswrapper[4760]: I0930 07:34:26.365583 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:34:26 crc kubenswrapper[4760]: I0930 07:34:26.365609 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:34:26 crc kubenswrapper[4760]: I0930 07:34:26.365628 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:34:26Z","lastTransitionTime":"2025-09-30T07:34:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 07:34:26 crc kubenswrapper[4760]: I0930 07:34:26.475000 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:34:26 crc kubenswrapper[4760]: I0930 07:34:26.475075 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:34:26 crc kubenswrapper[4760]: I0930 07:34:26.475094 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:34:26 crc kubenswrapper[4760]: I0930 07:34:26.475123 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:34:26 crc kubenswrapper[4760]: I0930 07:34:26.475151 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:34:26Z","lastTransitionTime":"2025-09-30T07:34:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 07:34:26 crc kubenswrapper[4760]: I0930 07:34:26.578971 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:34:26 crc kubenswrapper[4760]: I0930 07:34:26.579065 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:34:26 crc kubenswrapper[4760]: I0930 07:34:26.579093 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:34:26 crc kubenswrapper[4760]: I0930 07:34:26.579124 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:34:26 crc kubenswrapper[4760]: I0930 07:34:26.579145 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:34:26Z","lastTransitionTime":"2025-09-30T07:34:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 07:34:26 crc kubenswrapper[4760]: I0930 07:34:26.681568 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:34:26 crc kubenswrapper[4760]: I0930 07:34:26.681638 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:34:26 crc kubenswrapper[4760]: I0930 07:34:26.681661 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:34:26 crc kubenswrapper[4760]: I0930 07:34:26.681693 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:34:26 crc kubenswrapper[4760]: I0930 07:34:26.681713 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:34:26Z","lastTransitionTime":"2025-09-30T07:34:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 07:34:26 crc kubenswrapper[4760]: I0930 07:34:26.711350 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 07:34:26 crc kubenswrapper[4760]: I0930 07:34:26.711510 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 07:34:26 crc kubenswrapper[4760]: E0930 07:34:26.711593 4760 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 07:34:58.711558631 +0000 UTC m=+84.354465093 (durationBeforeRetry 32s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 07:34:26 crc kubenswrapper[4760]: E0930 07:34:26.711721 4760 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Sep 30 07:34:26 crc kubenswrapper[4760]: E0930 07:34:26.711822 4760 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-09-30 07:34:58.711795187 +0000 UTC m=+84.354701639 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Sep 30 07:34:26 crc kubenswrapper[4760]: E0930 07:34:26.711838 4760 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Sep 30 07:34:26 crc kubenswrapper[4760]: E0930 07:34:26.711898 4760 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. 
No retries permitted until 2025-09-30 07:34:58.711882899 +0000 UTC m=+84.354789341 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Sep 30 07:34:26 crc kubenswrapper[4760]: I0930 07:34:26.711733 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 07:34:26 crc kubenswrapper[4760]: I0930 07:34:26.784548 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:34:26 crc kubenswrapper[4760]: I0930 07:34:26.784616 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:34:26 crc kubenswrapper[4760]: I0930 07:34:26.784641 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:34:26 crc kubenswrapper[4760]: I0930 07:34:26.784675 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:34:26 crc kubenswrapper[4760]: I0930 07:34:26.784699 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:34:26Z","lastTransitionTime":"2025-09-30T07:34:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 07:34:26 crc kubenswrapper[4760]: I0930 07:34:26.812599 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ce6dcf25-c8ea-450b-9fc6-9f8aeafde757-metrics-certs\") pod \"network-metrics-daemon-wv8fz\" (UID: \"ce6dcf25-c8ea-450b-9fc6-9f8aeafde757\") " pod="openshift-multus/network-metrics-daemon-wv8fz" Sep 30 07:34:26 crc kubenswrapper[4760]: I0930 07:34:26.812677 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 07:34:26 crc kubenswrapper[4760]: I0930 07:34:26.812769 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 07:34:26 crc kubenswrapper[4760]: E0930 07:34:26.812909 4760 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Sep 30 07:34:26 crc kubenswrapper[4760]: E0930 07:34:26.812958 4760 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Sep 30 07:34:26 crc kubenswrapper[4760]: E0930 07:34:26.812986 4760 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object 
"openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Sep 30 07:34:26 crc kubenswrapper[4760]: E0930 07:34:26.813005 4760 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Sep 30 07:34:26 crc kubenswrapper[4760]: E0930 07:34:26.813030 4760 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Sep 30 07:34:26 crc kubenswrapper[4760]: E0930 07:34:26.813088 4760 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Sep 30 07:34:26 crc kubenswrapper[4760]: E0930 07:34:26.813112 4760 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Sep 30 07:34:26 crc kubenswrapper[4760]: E0930 07:34:26.813052 4760 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ce6dcf25-c8ea-450b-9fc6-9f8aeafde757-metrics-certs podName:ce6dcf25-c8ea-450b-9fc6-9f8aeafde757 nodeName:}" failed. No retries permitted until 2025-09-30 07:34:42.813011448 +0000 UTC m=+68.455917910 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/ce6dcf25-c8ea-450b-9fc6-9f8aeafde757-metrics-certs") pod "network-metrics-daemon-wv8fz" (UID: "ce6dcf25-c8ea-450b-9fc6-9f8aeafde757") : object "openshift-multus"/"metrics-daemon-secret" not registered Sep 30 07:34:26 crc kubenswrapper[4760]: E0930 07:34:26.813206 4760 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-09-30 07:34:58.813182862 +0000 UTC m=+84.456089304 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Sep 30 07:34:26 crc kubenswrapper[4760]: E0930 07:34:26.813234 4760 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-09-30 07:34:58.813219333 +0000 UTC m=+84.456125775 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Sep 30 07:34:26 crc kubenswrapper[4760]: I0930 07:34:26.887721 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:34:26 crc kubenswrapper[4760]: I0930 07:34:26.887767 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:34:26 crc kubenswrapper[4760]: I0930 07:34:26.887779 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:34:26 crc kubenswrapper[4760]: I0930 07:34:26.887798 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:34:26 crc kubenswrapper[4760]: I0930 07:34:26.887810 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:34:26Z","lastTransitionTime":"2025-09-30T07:34:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 07:34:26 crc kubenswrapper[4760]: I0930 07:34:26.991650 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:34:26 crc kubenswrapper[4760]: I0930 07:34:26.991723 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:34:26 crc kubenswrapper[4760]: I0930 07:34:26.991751 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:34:26 crc kubenswrapper[4760]: I0930 07:34:26.991782 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:34:26 crc kubenswrapper[4760]: I0930 07:34:26.991805 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:34:26Z","lastTransitionTime":"2025-09-30T07:34:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 07:34:27 crc kubenswrapper[4760]: I0930 07:34:27.066795 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wv8fz" Sep 30 07:34:27 crc kubenswrapper[4760]: E0930 07:34:27.067037 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-wv8fz" podUID="ce6dcf25-c8ea-450b-9fc6-9f8aeafde757" Sep 30 07:34:27 crc kubenswrapper[4760]: I0930 07:34:27.095107 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:34:27 crc kubenswrapper[4760]: I0930 07:34:27.095194 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:34:27 crc kubenswrapper[4760]: I0930 07:34:27.095213 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:34:27 crc kubenswrapper[4760]: I0930 07:34:27.095241 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:34:27 crc kubenswrapper[4760]: I0930 07:34:27.095262 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:34:27Z","lastTransitionTime":"2025-09-30T07:34:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 07:34:27 crc kubenswrapper[4760]: I0930 07:34:27.199062 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:34:27 crc kubenswrapper[4760]: I0930 07:34:27.199146 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:34:27 crc kubenswrapper[4760]: I0930 07:34:27.199170 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:34:27 crc kubenswrapper[4760]: I0930 07:34:27.199203 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:34:27 crc kubenswrapper[4760]: I0930 07:34:27.199227 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:34:27Z","lastTransitionTime":"2025-09-30T07:34:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 07:34:27 crc kubenswrapper[4760]: I0930 07:34:27.302340 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:34:27 crc kubenswrapper[4760]: I0930 07:34:27.302421 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:34:27 crc kubenswrapper[4760]: I0930 07:34:27.302436 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:34:27 crc kubenswrapper[4760]: I0930 07:34:27.302460 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:34:27 crc kubenswrapper[4760]: I0930 07:34:27.302478 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:34:27Z","lastTransitionTime":"2025-09-30T07:34:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 07:34:27 crc kubenswrapper[4760]: I0930 07:34:27.405545 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:34:27 crc kubenswrapper[4760]: I0930 07:34:27.405627 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:34:27 crc kubenswrapper[4760]: I0930 07:34:27.405646 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:34:27 crc kubenswrapper[4760]: I0930 07:34:27.405673 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:34:27 crc kubenswrapper[4760]: I0930 07:34:27.405728 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:34:27Z","lastTransitionTime":"2025-09-30T07:34:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 07:34:27 crc kubenswrapper[4760]: I0930 07:34:27.508746 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:34:27 crc kubenswrapper[4760]: I0930 07:34:27.508830 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:34:27 crc kubenswrapper[4760]: I0930 07:34:27.508856 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:34:27 crc kubenswrapper[4760]: I0930 07:34:27.508887 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:34:27 crc kubenswrapper[4760]: I0930 07:34:27.508912 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:34:27Z","lastTransitionTime":"2025-09-30T07:34:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 07:34:27 crc kubenswrapper[4760]: I0930 07:34:27.611964 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:34:27 crc kubenswrapper[4760]: I0930 07:34:27.612031 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:34:27 crc kubenswrapper[4760]: I0930 07:34:27.612050 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:34:27 crc kubenswrapper[4760]: I0930 07:34:27.612078 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:34:27 crc kubenswrapper[4760]: I0930 07:34:27.612096 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:34:27Z","lastTransitionTime":"2025-09-30T07:34:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 07:34:27 crc kubenswrapper[4760]: I0930 07:34:27.715354 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:34:27 crc kubenswrapper[4760]: I0930 07:34:27.715428 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:34:27 crc kubenswrapper[4760]: I0930 07:34:27.715452 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:34:27 crc kubenswrapper[4760]: I0930 07:34:27.715490 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:34:27 crc kubenswrapper[4760]: I0930 07:34:27.715517 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:34:27Z","lastTransitionTime":"2025-09-30T07:34:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 07:34:27 crc kubenswrapper[4760]: I0930 07:34:27.818404 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:34:27 crc kubenswrapper[4760]: I0930 07:34:27.818474 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:34:27 crc kubenswrapper[4760]: I0930 07:34:27.818498 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:34:27 crc kubenswrapper[4760]: I0930 07:34:27.818524 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:34:27 crc kubenswrapper[4760]: I0930 07:34:27.818543 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:34:27Z","lastTransitionTime":"2025-09-30T07:34:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 07:34:27 crc kubenswrapper[4760]: I0930 07:34:27.922503 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:34:27 crc kubenswrapper[4760]: I0930 07:34:27.922853 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:34:27 crc kubenswrapper[4760]: I0930 07:34:27.923060 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:34:27 crc kubenswrapper[4760]: I0930 07:34:27.923240 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:34:27 crc kubenswrapper[4760]: I0930 07:34:27.923433 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:34:27Z","lastTransitionTime":"2025-09-30T07:34:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 07:34:28 crc kubenswrapper[4760]: I0930 07:34:28.026742 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:34:28 crc kubenswrapper[4760]: I0930 07:34:28.026815 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:34:28 crc kubenswrapper[4760]: I0930 07:34:28.026833 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:34:28 crc kubenswrapper[4760]: I0930 07:34:28.026859 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:34:28 crc kubenswrapper[4760]: I0930 07:34:28.026878 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:34:28Z","lastTransitionTime":"2025-09-30T07:34:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 07:34:28 crc kubenswrapper[4760]: I0930 07:34:28.066361 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 07:34:28 crc kubenswrapper[4760]: I0930 07:34:28.066414 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 07:34:28 crc kubenswrapper[4760]: I0930 07:34:28.066513 4760 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 07:34:28 crc kubenswrapper[4760]: E0930 07:34:28.066778 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 07:34:28 crc kubenswrapper[4760]: E0930 07:34:28.066785 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 07:34:28 crc kubenswrapper[4760]: E0930 07:34:28.067257 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 07:34:28 crc kubenswrapper[4760]: I0930 07:34:28.129527 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:34:28 crc kubenswrapper[4760]: I0930 07:34:28.129593 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:34:28 crc kubenswrapper[4760]: I0930 07:34:28.129611 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:34:28 crc kubenswrapper[4760]: I0930 07:34:28.129636 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:34:28 crc kubenswrapper[4760]: I0930 07:34:28.129655 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:34:28Z","lastTransitionTime":"2025-09-30T07:34:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 07:34:28 crc kubenswrapper[4760]: I0930 07:34:28.232284 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:34:28 crc kubenswrapper[4760]: I0930 07:34:28.232372 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:34:28 crc kubenswrapper[4760]: I0930 07:34:28.232386 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:34:28 crc kubenswrapper[4760]: I0930 07:34:28.232405 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:34:28 crc kubenswrapper[4760]: I0930 07:34:28.232418 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:34:28Z","lastTransitionTime":"2025-09-30T07:34:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 07:34:28 crc kubenswrapper[4760]: I0930 07:34:28.335825 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:34:28 crc kubenswrapper[4760]: I0930 07:34:28.335891 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:34:28 crc kubenswrapper[4760]: I0930 07:34:28.335909 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:34:28 crc kubenswrapper[4760]: I0930 07:34:28.335933 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:34:28 crc kubenswrapper[4760]: I0930 07:34:28.335951 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:34:28Z","lastTransitionTime":"2025-09-30T07:34:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 07:34:28 crc kubenswrapper[4760]: I0930 07:34:28.439749 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:34:28 crc kubenswrapper[4760]: I0930 07:34:28.439822 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:34:28 crc kubenswrapper[4760]: I0930 07:34:28.439842 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:34:28 crc kubenswrapper[4760]: I0930 07:34:28.439870 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:34:28 crc kubenswrapper[4760]: I0930 07:34:28.439887 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:34:28Z","lastTransitionTime":"2025-09-30T07:34:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 07:34:28 crc kubenswrapper[4760]: I0930 07:34:28.544562 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:34:28 crc kubenswrapper[4760]: I0930 07:34:28.544668 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:34:28 crc kubenswrapper[4760]: I0930 07:34:28.544692 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:34:28 crc kubenswrapper[4760]: I0930 07:34:28.544726 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:34:28 crc kubenswrapper[4760]: I0930 07:34:28.544750 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:34:28Z","lastTransitionTime":"2025-09-30T07:34:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 07:34:28 crc kubenswrapper[4760]: I0930 07:34:28.648328 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:34:28 crc kubenswrapper[4760]: I0930 07:34:28.648395 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:34:28 crc kubenswrapper[4760]: I0930 07:34:28.648415 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:34:28 crc kubenswrapper[4760]: I0930 07:34:28.648439 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:34:28 crc kubenswrapper[4760]: I0930 07:34:28.648457 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:34:28Z","lastTransitionTime":"2025-09-30T07:34:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 07:34:28 crc kubenswrapper[4760]: I0930 07:34:28.752905 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:34:28 crc kubenswrapper[4760]: I0930 07:34:28.752966 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:34:28 crc kubenswrapper[4760]: I0930 07:34:28.753001 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:34:28 crc kubenswrapper[4760]: I0930 07:34:28.753029 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:34:28 crc kubenswrapper[4760]: I0930 07:34:28.753050 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:34:28Z","lastTransitionTime":"2025-09-30T07:34:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 07:34:28 crc kubenswrapper[4760]: I0930 07:34:28.856678 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:34:28 crc kubenswrapper[4760]: I0930 07:34:28.856736 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:34:28 crc kubenswrapper[4760]: I0930 07:34:28.856755 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:34:28 crc kubenswrapper[4760]: I0930 07:34:28.856781 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:34:28 crc kubenswrapper[4760]: I0930 07:34:28.856805 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:34:28Z","lastTransitionTime":"2025-09-30T07:34:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 07:34:28 crc kubenswrapper[4760]: I0930 07:34:28.959192 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:34:28 crc kubenswrapper[4760]: I0930 07:34:28.959256 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:34:28 crc kubenswrapper[4760]: I0930 07:34:28.959282 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:34:28 crc kubenswrapper[4760]: I0930 07:34:28.959351 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:34:28 crc kubenswrapper[4760]: I0930 07:34:28.959378 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:34:28Z","lastTransitionTime":"2025-09-30T07:34:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 07:34:29 crc kubenswrapper[4760]: I0930 07:34:29.062258 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:34:29 crc kubenswrapper[4760]: I0930 07:34:29.062346 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:34:29 crc kubenswrapper[4760]: I0930 07:34:29.062365 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:34:29 crc kubenswrapper[4760]: I0930 07:34:29.062393 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:34:29 crc kubenswrapper[4760]: I0930 07:34:29.062410 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:34:29Z","lastTransitionTime":"2025-09-30T07:34:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 07:34:29 crc kubenswrapper[4760]: I0930 07:34:29.065989 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wv8fz" Sep 30 07:34:29 crc kubenswrapper[4760]: E0930 07:34:29.066173 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-wv8fz" podUID="ce6dcf25-c8ea-450b-9fc6-9f8aeafde757" Sep 30 07:34:29 crc kubenswrapper[4760]: I0930 07:34:29.165788 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:34:29 crc kubenswrapper[4760]: I0930 07:34:29.165849 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:34:29 crc kubenswrapper[4760]: I0930 07:34:29.165872 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:34:29 crc kubenswrapper[4760]: I0930 07:34:29.165903 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:34:29 crc kubenswrapper[4760]: I0930 07:34:29.165928 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:34:29Z","lastTransitionTime":"2025-09-30T07:34:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 07:34:29 crc kubenswrapper[4760]: I0930 07:34:29.269393 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:34:29 crc kubenswrapper[4760]: I0930 07:34:29.269471 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:34:29 crc kubenswrapper[4760]: I0930 07:34:29.269490 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:34:29 crc kubenswrapper[4760]: I0930 07:34:29.269516 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:34:29 crc kubenswrapper[4760]: I0930 07:34:29.269535 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:34:29Z","lastTransitionTime":"2025-09-30T07:34:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 07:34:29 crc kubenswrapper[4760]: I0930 07:34:29.372731 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:34:29 crc kubenswrapper[4760]: I0930 07:34:29.373178 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:34:29 crc kubenswrapper[4760]: I0930 07:34:29.373355 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:34:29 crc kubenswrapper[4760]: I0930 07:34:29.373523 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:34:29 crc kubenswrapper[4760]: I0930 07:34:29.373693 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:34:29Z","lastTransitionTime":"2025-09-30T07:34:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 07:34:29 crc kubenswrapper[4760]: I0930 07:34:29.477399 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:34:29 crc kubenswrapper[4760]: I0930 07:34:29.477449 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:34:29 crc kubenswrapper[4760]: I0930 07:34:29.477466 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:34:29 crc kubenswrapper[4760]: I0930 07:34:29.477489 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:34:29 crc kubenswrapper[4760]: I0930 07:34:29.477512 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:34:29Z","lastTransitionTime":"2025-09-30T07:34:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 07:34:29 crc kubenswrapper[4760]: I0930 07:34:29.580208 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:34:29 crc kubenswrapper[4760]: I0930 07:34:29.580642 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:34:29 crc kubenswrapper[4760]: I0930 07:34:29.580794 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:34:29 crc kubenswrapper[4760]: I0930 07:34:29.580953 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:34:29 crc kubenswrapper[4760]: I0930 07:34:29.581100 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:34:29Z","lastTransitionTime":"2025-09-30T07:34:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 07:34:29 crc kubenswrapper[4760]: I0930 07:34:29.684666 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:34:29 crc kubenswrapper[4760]: I0930 07:34:29.684752 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:34:29 crc kubenswrapper[4760]: I0930 07:34:29.684773 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:34:29 crc kubenswrapper[4760]: I0930 07:34:29.684797 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:34:29 crc kubenswrapper[4760]: I0930 07:34:29.684814 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:34:29Z","lastTransitionTime":"2025-09-30T07:34:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 07:34:29 crc kubenswrapper[4760]: I0930 07:34:29.787991 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:34:29 crc kubenswrapper[4760]: I0930 07:34:29.788059 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:34:29 crc kubenswrapper[4760]: I0930 07:34:29.788076 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:34:29 crc kubenswrapper[4760]: I0930 07:34:29.788100 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:34:29 crc kubenswrapper[4760]: I0930 07:34:29.788119 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:34:29Z","lastTransitionTime":"2025-09-30T07:34:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 07:34:29 crc kubenswrapper[4760]: I0930 07:34:29.891695 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:34:29 crc kubenswrapper[4760]: I0930 07:34:29.891784 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:34:29 crc kubenswrapper[4760]: I0930 07:34:29.891803 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:34:29 crc kubenswrapper[4760]: I0930 07:34:29.891827 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:34:29 crc kubenswrapper[4760]: I0930 07:34:29.891846 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:34:29Z","lastTransitionTime":"2025-09-30T07:34:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 07:34:29 crc kubenswrapper[4760]: I0930 07:34:29.995239 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:34:29 crc kubenswrapper[4760]: I0930 07:34:29.995334 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:34:29 crc kubenswrapper[4760]: I0930 07:34:29.995354 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:34:29 crc kubenswrapper[4760]: I0930 07:34:29.995378 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:34:29 crc kubenswrapper[4760]: I0930 07:34:29.995396 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:34:29Z","lastTransitionTime":"2025-09-30T07:34:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 07:34:30 crc kubenswrapper[4760]: I0930 07:34:30.066703 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 07:34:30 crc kubenswrapper[4760]: E0930 07:34:30.066924 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 07:34:30 crc kubenswrapper[4760]: I0930 07:34:30.066742 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 07:34:30 crc kubenswrapper[4760]: E0930 07:34:30.067183 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 07:34:30 crc kubenswrapper[4760]: I0930 07:34:30.066717 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 07:34:30 crc kubenswrapper[4760]: E0930 07:34:30.067371 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 07:34:30 crc kubenswrapper[4760]: I0930 07:34:30.098598 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:34:30 crc kubenswrapper[4760]: I0930 07:34:30.098652 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:34:30 crc kubenswrapper[4760]: I0930 07:34:30.098676 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:34:30 crc kubenswrapper[4760]: I0930 07:34:30.098705 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:34:30 crc kubenswrapper[4760]: I0930 07:34:30.098728 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:34:30Z","lastTransitionTime":"2025-09-30T07:34:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 07:34:30 crc kubenswrapper[4760]: I0930 07:34:30.201856 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:34:30 crc kubenswrapper[4760]: I0930 07:34:30.201919 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:34:30 crc kubenswrapper[4760]: I0930 07:34:30.201938 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:34:30 crc kubenswrapper[4760]: I0930 07:34:30.201961 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:34:30 crc kubenswrapper[4760]: I0930 07:34:30.201980 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:34:30Z","lastTransitionTime":"2025-09-30T07:34:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 07:34:30 crc kubenswrapper[4760]: I0930 07:34:30.305827 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:34:30 crc kubenswrapper[4760]: I0930 07:34:30.305906 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:34:30 crc kubenswrapper[4760]: I0930 07:34:30.305926 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:34:30 crc kubenswrapper[4760]: I0930 07:34:30.305953 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:34:30 crc kubenswrapper[4760]: I0930 07:34:30.305973 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:34:30Z","lastTransitionTime":"2025-09-30T07:34:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 07:34:30 crc kubenswrapper[4760]: I0930 07:34:30.409284 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:34:30 crc kubenswrapper[4760]: I0930 07:34:30.409381 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:34:30 crc kubenswrapper[4760]: I0930 07:34:30.409399 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:34:30 crc kubenswrapper[4760]: I0930 07:34:30.409425 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:34:30 crc kubenswrapper[4760]: I0930 07:34:30.409445 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:34:30Z","lastTransitionTime":"2025-09-30T07:34:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 07:34:30 crc kubenswrapper[4760]: I0930 07:34:30.512071 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:34:30 crc kubenswrapper[4760]: I0930 07:34:30.512160 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:34:30 crc kubenswrapper[4760]: I0930 07:34:30.512183 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:34:30 crc kubenswrapper[4760]: I0930 07:34:30.512217 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:34:30 crc kubenswrapper[4760]: I0930 07:34:30.512243 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:34:30Z","lastTransitionTime":"2025-09-30T07:34:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 07:34:30 crc kubenswrapper[4760]: I0930 07:34:30.615435 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:34:30 crc kubenswrapper[4760]: I0930 07:34:30.615496 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:34:30 crc kubenswrapper[4760]: I0930 07:34:30.615515 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:34:30 crc kubenswrapper[4760]: I0930 07:34:30.615539 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:34:30 crc kubenswrapper[4760]: I0930 07:34:30.615557 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:34:30Z","lastTransitionTime":"2025-09-30T07:34:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 07:34:30 crc kubenswrapper[4760]: I0930 07:34:30.718542 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:34:30 crc kubenswrapper[4760]: I0930 07:34:30.718593 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:34:30 crc kubenswrapper[4760]: I0930 07:34:30.718610 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:34:30 crc kubenswrapper[4760]: I0930 07:34:30.718632 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:34:30 crc kubenswrapper[4760]: I0930 07:34:30.718649 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:34:30Z","lastTransitionTime":"2025-09-30T07:34:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 07:34:30 crc kubenswrapper[4760]: I0930 07:34:30.822418 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:34:30 crc kubenswrapper[4760]: I0930 07:34:30.822499 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:34:30 crc kubenswrapper[4760]: I0930 07:34:30.822524 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:34:30 crc kubenswrapper[4760]: I0930 07:34:30.822554 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:34:30 crc kubenswrapper[4760]: I0930 07:34:30.822581 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:34:30Z","lastTransitionTime":"2025-09-30T07:34:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 07:34:30 crc kubenswrapper[4760]: I0930 07:34:30.925292 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:34:30 crc kubenswrapper[4760]: I0930 07:34:30.925383 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:34:30 crc kubenswrapper[4760]: I0930 07:34:30.925402 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:34:30 crc kubenswrapper[4760]: I0930 07:34:30.925427 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:34:30 crc kubenswrapper[4760]: I0930 07:34:30.925445 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:34:30Z","lastTransitionTime":"2025-09-30T07:34:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 07:34:31 crc kubenswrapper[4760]: I0930 07:34:31.028982 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:34:31 crc kubenswrapper[4760]: I0930 07:34:31.029066 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:34:31 crc kubenswrapper[4760]: I0930 07:34:31.029091 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:34:31 crc kubenswrapper[4760]: I0930 07:34:31.029123 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:34:31 crc kubenswrapper[4760]: I0930 07:34:31.029145 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:34:31Z","lastTransitionTime":"2025-09-30T07:34:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 07:34:31 crc kubenswrapper[4760]: I0930 07:34:31.092771 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wv8fz" Sep 30 07:34:31 crc kubenswrapper[4760]: I0930 07:34:31.092839 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 07:34:31 crc kubenswrapper[4760]: E0930 07:34:31.093053 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-wv8fz" podUID="ce6dcf25-c8ea-450b-9fc6-9f8aeafde757" Sep 30 07:34:31 crc kubenswrapper[4760]: E0930 07:34:31.093156 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 07:34:31 crc kubenswrapper[4760]: I0930 07:34:31.132594 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:34:31 crc kubenswrapper[4760]: I0930 07:34:31.132625 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:34:31 crc kubenswrapper[4760]: I0930 07:34:31.132641 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:34:31 crc kubenswrapper[4760]: I0930 07:34:31.132661 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:34:31 crc kubenswrapper[4760]: I0930 07:34:31.132675 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:34:31Z","lastTransitionTime":"2025-09-30T07:34:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 07:34:31 crc kubenswrapper[4760]: I0930 07:34:31.236446 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:34:31 crc kubenswrapper[4760]: I0930 07:34:31.236511 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:34:31 crc kubenswrapper[4760]: I0930 07:34:31.236535 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:34:31 crc kubenswrapper[4760]: I0930 07:34:31.236568 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:34:31 crc kubenswrapper[4760]: I0930 07:34:31.236590 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:34:31Z","lastTransitionTime":"2025-09-30T07:34:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 07:34:31 crc kubenswrapper[4760]: I0930 07:34:31.340910 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:34:31 crc kubenswrapper[4760]: I0930 07:34:31.340973 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:34:31 crc kubenswrapper[4760]: I0930 07:34:31.340990 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:34:31 crc kubenswrapper[4760]: I0930 07:34:31.341014 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:34:31 crc kubenswrapper[4760]: I0930 07:34:31.341033 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:34:31Z","lastTransitionTime":"2025-09-30T07:34:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 07:34:31 crc kubenswrapper[4760]: I0930 07:34:31.444450 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:34:31 crc kubenswrapper[4760]: I0930 07:34:31.444509 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:34:31 crc kubenswrapper[4760]: I0930 07:34:31.444530 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:34:31 crc kubenswrapper[4760]: I0930 07:34:31.444561 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:34:31 crc kubenswrapper[4760]: I0930 07:34:31.444583 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:34:31Z","lastTransitionTime":"2025-09-30T07:34:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 07:34:31 crc kubenswrapper[4760]: I0930 07:34:31.547533 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:34:31 crc kubenswrapper[4760]: I0930 07:34:31.547616 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:34:31 crc kubenswrapper[4760]: I0930 07:34:31.547639 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:34:31 crc kubenswrapper[4760]: I0930 07:34:31.547666 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:34:31 crc kubenswrapper[4760]: I0930 07:34:31.547687 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:34:31Z","lastTransitionTime":"2025-09-30T07:34:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 07:34:31 crc kubenswrapper[4760]: I0930 07:34:31.650791 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:34:31 crc kubenswrapper[4760]: I0930 07:34:31.650862 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:34:31 crc kubenswrapper[4760]: I0930 07:34:31.650882 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:34:31 crc kubenswrapper[4760]: I0930 07:34:31.650912 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:34:31 crc kubenswrapper[4760]: I0930 07:34:31.650933 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:34:31Z","lastTransitionTime":"2025-09-30T07:34:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 07:34:31 crc kubenswrapper[4760]: I0930 07:34:31.754146 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:34:31 crc kubenswrapper[4760]: I0930 07:34:31.754265 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:34:31 crc kubenswrapper[4760]: I0930 07:34:31.754391 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:34:31 crc kubenswrapper[4760]: I0930 07:34:31.755777 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:34:31 crc kubenswrapper[4760]: I0930 07:34:31.755832 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:34:31Z","lastTransitionTime":"2025-09-30T07:34:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 07:34:31 crc kubenswrapper[4760]: I0930 07:34:31.858954 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:34:31 crc kubenswrapper[4760]: I0930 07:34:31.859073 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:34:31 crc kubenswrapper[4760]: I0930 07:34:31.859095 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:34:31 crc kubenswrapper[4760]: I0930 07:34:31.859120 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:34:31 crc kubenswrapper[4760]: I0930 07:34:31.859137 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:34:31Z","lastTransitionTime":"2025-09-30T07:34:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 07:34:31 crc kubenswrapper[4760]: I0930 07:34:31.962220 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:34:31 crc kubenswrapper[4760]: I0930 07:34:31.962261 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:34:31 crc kubenswrapper[4760]: I0930 07:34:31.962275 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:34:31 crc kubenswrapper[4760]: I0930 07:34:31.962338 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:34:31 crc kubenswrapper[4760]: I0930 07:34:31.962357 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:34:31Z","lastTransitionTime":"2025-09-30T07:34:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 07:34:32 crc kubenswrapper[4760]: I0930 07:34:32.065888 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:34:32 crc kubenswrapper[4760]: I0930 07:34:32.065922 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 07:34:32 crc kubenswrapper[4760]: I0930 07:34:32.065952 4760 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 07:34:32 crc kubenswrapper[4760]: I0930 07:34:32.065935 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:34:32 crc kubenswrapper[4760]: E0930 07:34:32.066082 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 07:34:32 crc kubenswrapper[4760]: I0930 07:34:32.066114 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:34:32 crc kubenswrapper[4760]: I0930 07:34:32.066134 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:34:32 crc kubenswrapper[4760]: I0930 07:34:32.066147 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:34:32Z","lastTransitionTime":"2025-09-30T07:34:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 07:34:32 crc kubenswrapper[4760]: E0930 07:34:32.066301 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 07:34:32 crc kubenswrapper[4760]: I0930 07:34:32.169216 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:34:32 crc kubenswrapper[4760]: I0930 07:34:32.169349 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:34:32 crc kubenswrapper[4760]: I0930 07:34:32.169368 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:34:32 crc kubenswrapper[4760]: I0930 07:34:32.169392 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:34:32 crc kubenswrapper[4760]: I0930 07:34:32.169410 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:34:32Z","lastTransitionTime":"2025-09-30T07:34:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 07:34:32 crc kubenswrapper[4760]: I0930 07:34:32.272926 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:34:32 crc kubenswrapper[4760]: I0930 07:34:32.272993 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:34:32 crc kubenswrapper[4760]: I0930 07:34:32.273012 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:34:32 crc kubenswrapper[4760]: I0930 07:34:32.273036 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:34:32 crc kubenswrapper[4760]: I0930 07:34:32.273054 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:34:32Z","lastTransitionTime":"2025-09-30T07:34:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 07:34:32 crc kubenswrapper[4760]: I0930 07:34:32.375475 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:34:32 crc kubenswrapper[4760]: I0930 07:34:32.375540 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:34:32 crc kubenswrapper[4760]: I0930 07:34:32.375559 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:34:32 crc kubenswrapper[4760]: I0930 07:34:32.375585 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:34:32 crc kubenswrapper[4760]: I0930 07:34:32.375603 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:34:32Z","lastTransitionTime":"2025-09-30T07:34:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 07:34:32 crc kubenswrapper[4760]: I0930 07:34:32.478915 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:34:32 crc kubenswrapper[4760]: I0930 07:34:32.478974 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:34:32 crc kubenswrapper[4760]: I0930 07:34:32.478990 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:34:32 crc kubenswrapper[4760]: I0930 07:34:32.479013 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:34:32 crc kubenswrapper[4760]: I0930 07:34:32.479030 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:34:32Z","lastTransitionTime":"2025-09-30T07:34:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 07:34:32 crc kubenswrapper[4760]: I0930 07:34:32.493901 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:34:32 crc kubenswrapper[4760]: I0930 07:34:32.493981 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:34:32 crc kubenswrapper[4760]: I0930 07:34:32.494254 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:34:32 crc kubenswrapper[4760]: I0930 07:34:32.494295 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:34:32 crc kubenswrapper[4760]: I0930 07:34:32.494363 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:34:32Z","lastTransitionTime":"2025-09-30T07:34:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 07:34:32 crc kubenswrapper[4760]: E0930 07:34:32.518865 4760 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T07:34:32Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T07:34:32Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T07:34:32Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T07:34:32Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T07:34:32Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T07:34:32Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T07:34:32Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T07:34:32Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"5c02496b-3bdb-4a64-91fc-57c59208ba25\\\",\\\"systemUUID\\\":\\\"ef0efcac-382e-4544-a77e-6adf149d5981\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:34:32Z is after 2025-08-24T17:21:41Z" Sep 30 07:34:32 crc kubenswrapper[4760]: I0930 07:34:32.525559 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:34:32 crc kubenswrapper[4760]: I0930 07:34:32.525650 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:34:32 crc kubenswrapper[4760]: I0930 07:34:32.525669 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:34:32 crc kubenswrapper[4760]: I0930 07:34:32.525723 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:34:32 crc kubenswrapper[4760]: I0930 07:34:32.525742 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:34:32Z","lastTransitionTime":"2025-09-30T07:34:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 07:34:32 crc kubenswrapper[4760]: E0930 07:34:32.547238 4760 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T07:34:32Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T07:34:32Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T07:34:32Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T07:34:32Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T07:34:32Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T07:34:32Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T07:34:32Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T07:34:32Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"5c02496b-3bdb-4a64-91fc-57c59208ba25\\\",\\\"systemUUID\\\":\\\"ef0efcac-382e-4544-a77e-6adf149d5981\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:34:32Z is after 2025-08-24T17:21:41Z" Sep 30 07:34:32 crc kubenswrapper[4760]: I0930 07:34:32.552707 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:34:32 crc kubenswrapper[4760]: I0930 07:34:32.552769 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:34:32 crc kubenswrapper[4760]: I0930 07:34:32.552791 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:34:32 crc kubenswrapper[4760]: I0930 07:34:32.552821 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:34:32 crc kubenswrapper[4760]: I0930 07:34:32.552838 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:34:32Z","lastTransitionTime":"2025-09-30T07:34:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 07:34:32 crc kubenswrapper[4760]: E0930 07:34:32.574628 4760 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T07:34:32Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T07:34:32Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T07:34:32Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T07:34:32Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T07:34:32Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T07:34:32Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T07:34:32Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T07:34:32Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"5c02496b-3bdb-4a64-91fc-57c59208ba25\\\",\\\"systemUUID\\\":\\\"ef0efcac-382e-4544-a77e-6adf149d5981\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:34:32Z is after 2025-08-24T17:21:41Z" Sep 30 07:34:32 crc kubenswrapper[4760]: I0930 07:34:32.580283 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:34:32 crc kubenswrapper[4760]: I0930 07:34:32.580401 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:34:32 crc kubenswrapper[4760]: I0930 07:34:32.580427 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:34:32 crc kubenswrapper[4760]: I0930 07:34:32.580460 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:34:32 crc kubenswrapper[4760]: I0930 07:34:32.580482 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:34:32Z","lastTransitionTime":"2025-09-30T07:34:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 07:34:32 crc kubenswrapper[4760]: E0930 07:34:32.604657 4760 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T07:34:32Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T07:34:32Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T07:34:32Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T07:34:32Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T07:34:32Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T07:34:32Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T07:34:32Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T07:34:32Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"5c02496b-3bdb-4a64-91fc-57c59208ba25\\\",\\\"systemUUID\\\":\\\"ef0efcac-382e-4544-a77e-6adf149d5981\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:34:32Z is after 2025-08-24T17:21:41Z" Sep 30 07:34:32 crc kubenswrapper[4760]: I0930 07:34:32.610115 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:34:32 crc kubenswrapper[4760]: I0930 07:34:32.610175 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:34:32 crc kubenswrapper[4760]: I0930 07:34:32.610195 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:34:32 crc kubenswrapper[4760]: I0930 07:34:32.610219 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:34:32 crc kubenswrapper[4760]: I0930 07:34:32.610237 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:34:32Z","lastTransitionTime":"2025-09-30T07:34:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 07:34:32 crc kubenswrapper[4760]: E0930 07:34:32.631110 4760 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T07:34:32Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T07:34:32Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T07:34:32Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T07:34:32Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T07:34:32Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T07:34:32Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T07:34:32Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T07:34:32Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"5c02496b-3bdb-4a64-91fc-57c59208ba25\\\",\\\"systemUUID\\\":\\\"ef0efcac-382e-4544-a77e-6adf149d5981\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:34:32Z is after 2025-08-24T17:21:41Z" Sep 30 07:34:32 crc kubenswrapper[4760]: E0930 07:34:32.631376 4760 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Sep 30 07:34:32 crc kubenswrapper[4760]: I0930 07:34:32.633297 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:34:32 crc kubenswrapper[4760]: I0930 07:34:32.633387 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:34:32 crc kubenswrapper[4760]: I0930 07:34:32.633411 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:34:32 crc kubenswrapper[4760]: I0930 07:34:32.633440 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:34:32 crc kubenswrapper[4760]: I0930 07:34:32.633461 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:34:32Z","lastTransitionTime":"2025-09-30T07:34:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 07:34:32 crc kubenswrapper[4760]: I0930 07:34:32.736782 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:34:32 crc kubenswrapper[4760]: I0930 07:34:32.736858 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:34:32 crc kubenswrapper[4760]: I0930 07:34:32.736884 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:34:32 crc kubenswrapper[4760]: I0930 07:34:32.736912 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:34:32 crc kubenswrapper[4760]: I0930 07:34:32.736933 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:34:32Z","lastTransitionTime":"2025-09-30T07:34:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 07:34:32 crc kubenswrapper[4760]: I0930 07:34:32.840551 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:34:32 crc kubenswrapper[4760]: I0930 07:34:32.840622 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:34:32 crc kubenswrapper[4760]: I0930 07:34:32.840639 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:34:32 crc kubenswrapper[4760]: I0930 07:34:32.840665 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:34:32 crc kubenswrapper[4760]: I0930 07:34:32.840686 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:34:32Z","lastTransitionTime":"2025-09-30T07:34:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 07:34:32 crc kubenswrapper[4760]: I0930 07:34:32.944003 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:34:32 crc kubenswrapper[4760]: I0930 07:34:32.944081 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:34:32 crc kubenswrapper[4760]: I0930 07:34:32.944105 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:34:32 crc kubenswrapper[4760]: I0930 07:34:32.944135 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:34:32 crc kubenswrapper[4760]: I0930 07:34:32.944156 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:34:32Z","lastTransitionTime":"2025-09-30T07:34:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 07:34:33 crc kubenswrapper[4760]: I0930 07:34:33.052473 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:34:33 crc kubenswrapper[4760]: I0930 07:34:33.052566 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:34:33 crc kubenswrapper[4760]: I0930 07:34:33.052595 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:34:33 crc kubenswrapper[4760]: I0930 07:34:33.052628 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:34:33 crc kubenswrapper[4760]: I0930 07:34:33.052648 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:34:33Z","lastTransitionTime":"2025-09-30T07:34:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 07:34:33 crc kubenswrapper[4760]: I0930 07:34:33.066060 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 07:34:33 crc kubenswrapper[4760]: I0930 07:34:33.066119 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wv8fz" Sep 30 07:34:33 crc kubenswrapper[4760]: E0930 07:34:33.066270 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 07:34:33 crc kubenswrapper[4760]: E0930 07:34:33.066449 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wv8fz" podUID="ce6dcf25-c8ea-450b-9fc6-9f8aeafde757" Sep 30 07:34:33 crc kubenswrapper[4760]: I0930 07:34:33.156001 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:34:33 crc kubenswrapper[4760]: I0930 07:34:33.156058 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:34:33 crc kubenswrapper[4760]: I0930 07:34:33.156076 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:34:33 crc kubenswrapper[4760]: I0930 07:34:33.156099 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:34:33 crc kubenswrapper[4760]: I0930 07:34:33.156117 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:34:33Z","lastTransitionTime":"2025-09-30T07:34:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 07:34:33 crc kubenswrapper[4760]: I0930 07:34:33.259225 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:34:33 crc kubenswrapper[4760]: I0930 07:34:33.259275 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:34:33 crc kubenswrapper[4760]: I0930 07:34:33.259290 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:34:33 crc kubenswrapper[4760]: I0930 07:34:33.259354 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:34:33 crc kubenswrapper[4760]: I0930 07:34:33.259379 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:34:33Z","lastTransitionTime":"2025-09-30T07:34:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 07:34:33 crc kubenswrapper[4760]: I0930 07:34:33.362424 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:34:33 crc kubenswrapper[4760]: I0930 07:34:33.362491 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:34:33 crc kubenswrapper[4760]: I0930 07:34:33.362514 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:34:33 crc kubenswrapper[4760]: I0930 07:34:33.362548 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:34:33 crc kubenswrapper[4760]: I0930 07:34:33.362572 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:34:33Z","lastTransitionTime":"2025-09-30T07:34:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 07:34:33 crc kubenswrapper[4760]: I0930 07:34:33.465140 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:34:33 crc kubenswrapper[4760]: I0930 07:34:33.465198 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:34:33 crc kubenswrapper[4760]: I0930 07:34:33.465221 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:34:33 crc kubenswrapper[4760]: I0930 07:34:33.465246 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:34:33 crc kubenswrapper[4760]: I0930 07:34:33.465264 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:34:33Z","lastTransitionTime":"2025-09-30T07:34:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 07:34:33 crc kubenswrapper[4760]: I0930 07:34:33.568505 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:34:33 crc kubenswrapper[4760]: I0930 07:34:33.568544 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:34:33 crc kubenswrapper[4760]: I0930 07:34:33.568555 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:34:33 crc kubenswrapper[4760]: I0930 07:34:33.568571 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:34:33 crc kubenswrapper[4760]: I0930 07:34:33.568583 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:34:33Z","lastTransitionTime":"2025-09-30T07:34:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 07:34:33 crc kubenswrapper[4760]: I0930 07:34:33.672023 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:34:33 crc kubenswrapper[4760]: I0930 07:34:33.672079 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:34:33 crc kubenswrapper[4760]: I0930 07:34:33.672098 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:34:33 crc kubenswrapper[4760]: I0930 07:34:33.672122 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:34:33 crc kubenswrapper[4760]: I0930 07:34:33.672140 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:34:33Z","lastTransitionTime":"2025-09-30T07:34:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 07:34:33 crc kubenswrapper[4760]: I0930 07:34:33.775907 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:34:33 crc kubenswrapper[4760]: I0930 07:34:33.775964 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:34:33 crc kubenswrapper[4760]: I0930 07:34:33.775975 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:34:33 crc kubenswrapper[4760]: I0930 07:34:33.775992 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:34:33 crc kubenswrapper[4760]: I0930 07:34:33.776005 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:34:33Z","lastTransitionTime":"2025-09-30T07:34:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 07:34:33 crc kubenswrapper[4760]: I0930 07:34:33.880560 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:34:33 crc kubenswrapper[4760]: I0930 07:34:33.880625 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:34:33 crc kubenswrapper[4760]: I0930 07:34:33.880644 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:34:33 crc kubenswrapper[4760]: I0930 07:34:33.880672 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:34:33 crc kubenswrapper[4760]: I0930 07:34:33.880692 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:34:33Z","lastTransitionTime":"2025-09-30T07:34:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 07:34:33 crc kubenswrapper[4760]: I0930 07:34:33.984207 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:34:33 crc kubenswrapper[4760]: I0930 07:34:33.984296 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:34:33 crc kubenswrapper[4760]: I0930 07:34:33.984358 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:34:33 crc kubenswrapper[4760]: I0930 07:34:33.984393 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:34:33 crc kubenswrapper[4760]: I0930 07:34:33.984419 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:34:33Z","lastTransitionTime":"2025-09-30T07:34:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 07:34:34 crc kubenswrapper[4760]: I0930 07:34:34.066110 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 07:34:34 crc kubenswrapper[4760]: I0930 07:34:34.066190 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 07:34:34 crc kubenswrapper[4760]: E0930 07:34:34.066358 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 07:34:34 crc kubenswrapper[4760]: E0930 07:34:34.066549 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 07:34:34 crc kubenswrapper[4760]: I0930 07:34:34.088054 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:34:34 crc kubenswrapper[4760]: I0930 07:34:34.088125 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:34:34 crc kubenswrapper[4760]: I0930 07:34:34.088215 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:34:34 crc kubenswrapper[4760]: I0930 07:34:34.088338 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:34:34 crc kubenswrapper[4760]: I0930 07:34:34.088380 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:34:34Z","lastTransitionTime":"2025-09-30T07:34:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 07:34:34 crc kubenswrapper[4760]: I0930 07:34:34.191604 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:34:34 crc kubenswrapper[4760]: I0930 07:34:34.191659 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:34:34 crc kubenswrapper[4760]: I0930 07:34:34.191679 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:34:34 crc kubenswrapper[4760]: I0930 07:34:34.191706 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:34:34 crc kubenswrapper[4760]: I0930 07:34:34.191726 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:34:34Z","lastTransitionTime":"2025-09-30T07:34:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 07:34:34 crc kubenswrapper[4760]: I0930 07:34:34.294768 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:34:34 crc kubenswrapper[4760]: I0930 07:34:34.294847 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:34:34 crc kubenswrapper[4760]: I0930 07:34:34.294871 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:34:34 crc kubenswrapper[4760]: I0930 07:34:34.294903 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:34:34 crc kubenswrapper[4760]: I0930 07:34:34.294929 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:34:34Z","lastTransitionTime":"2025-09-30T07:34:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 07:34:34 crc kubenswrapper[4760]: I0930 07:34:34.399008 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:34:34 crc kubenswrapper[4760]: I0930 07:34:34.399084 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:34:34 crc kubenswrapper[4760]: I0930 07:34:34.399105 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:34:34 crc kubenswrapper[4760]: I0930 07:34:34.399138 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:34:34 crc kubenswrapper[4760]: I0930 07:34:34.399162 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:34:34Z","lastTransitionTime":"2025-09-30T07:34:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 07:34:34 crc kubenswrapper[4760]: I0930 07:34:34.502212 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:34:34 crc kubenswrapper[4760]: I0930 07:34:34.502259 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:34:34 crc kubenswrapper[4760]: I0930 07:34:34.502276 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:34:34 crc kubenswrapper[4760]: I0930 07:34:34.502332 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:34:34 crc kubenswrapper[4760]: I0930 07:34:34.502351 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:34:34Z","lastTransitionTime":"2025-09-30T07:34:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 07:34:34 crc kubenswrapper[4760]: I0930 07:34:34.605518 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:34:34 crc kubenswrapper[4760]: I0930 07:34:34.605578 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:34:34 crc kubenswrapper[4760]: I0930 07:34:34.605597 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:34:34 crc kubenswrapper[4760]: I0930 07:34:34.605624 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:34:34 crc kubenswrapper[4760]: I0930 07:34:34.605642 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:34:34Z","lastTransitionTime":"2025-09-30T07:34:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 07:34:34 crc kubenswrapper[4760]: I0930 07:34:34.709066 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:34:34 crc kubenswrapper[4760]: I0930 07:34:34.709140 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:34:34 crc kubenswrapper[4760]: I0930 07:34:34.709160 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:34:34 crc kubenswrapper[4760]: I0930 07:34:34.709189 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:34:34 crc kubenswrapper[4760]: I0930 07:34:34.709212 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:34:34Z","lastTransitionTime":"2025-09-30T07:34:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 07:34:34 crc kubenswrapper[4760]: I0930 07:34:34.812677 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:34:34 crc kubenswrapper[4760]: I0930 07:34:34.812750 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:34:34 crc kubenswrapper[4760]: I0930 07:34:34.812768 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:34:34 crc kubenswrapper[4760]: I0930 07:34:34.812801 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:34:34 crc kubenswrapper[4760]: I0930 07:34:34.812822 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:34:34Z","lastTransitionTime":"2025-09-30T07:34:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 07:34:34 crc kubenswrapper[4760]: I0930 07:34:34.916693 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:34:34 crc kubenswrapper[4760]: I0930 07:34:34.916762 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:34:34 crc kubenswrapper[4760]: I0930 07:34:34.916779 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:34:34 crc kubenswrapper[4760]: I0930 07:34:34.916806 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:34:34 crc kubenswrapper[4760]: I0930 07:34:34.916823 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:34:34Z","lastTransitionTime":"2025-09-30T07:34:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 07:34:35 crc kubenswrapper[4760]: I0930 07:34:35.026519 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:34:35 crc kubenswrapper[4760]: I0930 07:34:35.027192 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:34:35 crc kubenswrapper[4760]: I0930 07:34:35.027213 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:34:35 crc kubenswrapper[4760]: I0930 07:34:35.027247 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:34:35 crc kubenswrapper[4760]: I0930 07:34:35.027284 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:34:35Z","lastTransitionTime":"2025-09-30T07:34:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 07:34:35 crc kubenswrapper[4760]: I0930 07:34:35.066258 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 07:34:35 crc kubenswrapper[4760]: I0930 07:34:35.066388 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wv8fz" Sep 30 07:34:35 crc kubenswrapper[4760]: E0930 07:34:35.066692 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 07:34:35 crc kubenswrapper[4760]: E0930 07:34:35.067048 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wv8fz" podUID="ce6dcf25-c8ea-450b-9fc6-9f8aeafde757" Sep 30 07:34:35 crc kubenswrapper[4760]: I0930 07:34:35.068561 4760 scope.go:117] "RemoveContainer" containerID="b84d165d7a40638496b6759c4c2c265fdf742b2c9c968c503352f468f776b7d7" Sep 30 07:34:35 crc kubenswrapper[4760]: E0930 07:34:35.069895 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-sspvl_openshift-ovn-kubernetes(2c4ca8ea-a714-40e5-9e10-080aef32237b)\"" pod="openshift-ovn-kubernetes/ovnkube-node-sspvl" podUID="2c4ca8ea-a714-40e5-9e10-080aef32237b" Sep 30 07:34:35 crc kubenswrapper[4760]: I0930 07:34:35.089437 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-f2lrk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7a9c8270-6964-4886-87d0-227b05b76da4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e3cf6cd0daf5df3e48c5d44b1a443b1e5646009b9eba42bc3869ee9841e472d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cctgt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://90d669a96085f6fb9b5c9507ff1d5c960bdeafa1
d01d7c115b89788e14e155d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cctgt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T07:33:56Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-f2lrk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:34:35Z is after 2025-08-24T17:21:41Z" Sep 30 07:34:35 crc kubenswrapper[4760]: I0930 07:34:35.114809 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-lvdpk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f50c364e-d22c-4fe5-a0aa-66f4e8d8b21e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://96fd613333f4911d54aca94a38724570ef878c3f55ef48caa79eb1c14d0b2014\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8g8kp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T07:33:56Z\\\"}}\" for pod \"openshift-multus\"/\"multus-lvdpk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:34:35Z is after 2025-08-24T17:21:41Z" Sep 30 07:34:35 crc kubenswrapper[4760]: I0930 07:34:35.130294 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:34:35 crc 
kubenswrapper[4760]: I0930 07:34:35.130353 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:34:35 crc kubenswrapper[4760]: I0930 07:34:35.130366 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:34:35 crc kubenswrapper[4760]: I0930 07:34:35.130385 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:34:35 crc kubenswrapper[4760]: I0930 07:34:35.130397 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:34:35Z","lastTransitionTime":"2025-09-30T07:34:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 07:34:35 crc kubenswrapper[4760]: I0930 07:34:35.133680 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-rpvhp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a8530550-438d-46a5-aa3f-4b10838396f1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:34:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:34:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:34:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0434ec52328c8ee0a2b28fb47833cc6cf12a7048f4c4a217518e029e616fb6a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7j7qm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T07:33:58Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-rpvhp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:34:35Z is after 2025-08-24T17:21:41Z" Sep 30 07:34:35 crc kubenswrapper[4760]: I0930 07:34:35.154611 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:59Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:59Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fd70ee9ea129c67bb1a9ffac6e1d2abc903949ccd6d2d549f5d78472813dd222\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T0
7:33:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:34:35Z is after 2025-08-24T17:21:41Z" Sep 30 07:34:35 crc kubenswrapper[4760]: I0930 07:34:35.179379 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"031672b4-4779-4226-97aa-f5c81a729234\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b3e26d92b5602604d33e83e83e677084c642df64c9d0b9f9e022b036091fb92e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://53390cb3cc86d0db1019c461bf6cb987190e6fa8812f735844028dfa1ce96648\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf87f93832356c96c1c5e90e6459ccdd23700fd12c6d7410305c66f99f6ed18f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e4230b2df5bf87138f5a40f146bef562ae63ec6f5b73d023e4e2b60722854882\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:3
8Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9c416bd4523f4384b4a2283eb2ada263982380925cbc17713726e0790e5fdee\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:38Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2cd9c18868b27fcfb67186a86f415b484f009c33c07d18f5d508155ed011b5d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2cd9c18868b27fcfb67186a86f415b484f009c33c07d18f5d508155ed011b5d4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T07:33:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T07:33:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startT
ime\\\":\\\"2025-09-30T07:33:35Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:34:35Z is after 2025-08-24T17:21:41Z" Sep 30 07:34:35 crc kubenswrapper[4760]: I0930 07:34:35.203676 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"67985d3d-0af6-4a8b-b45a-623aed2e502e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43b97af91f474a3b43f207d377ca205e459b782e552a32a134a8b46e89fc8d30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\
\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d748511816054d5f621bd55e203baea2e6e5ac90dec796364b46224308cfd78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bf1b2ce86bf47ce917a20b48f64b7c4419e3a5c551d006d97171311c3b190f0c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"c
ert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e7c9b2cd9f096d99a8b239c0c2dc771e94e6522a3380733b7452969c9fe2dfd2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T07:33:35Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:34:35Z is after 2025-08-24T17:21:41Z" Sep 30 07:34:35 crc kubenswrapper[4760]: I0930 07:34:35.226848 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c5ac421ed3707970914cf165f54a9324134079296d1b6936332c6047b9c23132\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-09-30T07:34:35Z is after 2025-08-24T17:21:41Z" Sep 30 07:34:35 crc kubenswrapper[4760]: I0930 07:34:35.233091 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:34:35 crc kubenswrapper[4760]: I0930 07:34:35.233129 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:34:35 crc kubenswrapper[4760]: I0930 07:34:35.233141 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:34:35 crc kubenswrapper[4760]: I0930 07:34:35.233158 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:34:35 crc kubenswrapper[4760]: I0930 07:34:35.233170 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:34:35Z","lastTransitionTime":"2025-09-30T07:34:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 07:34:35 crc kubenswrapper[4760]: I0930 07:34:35.246865 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:53Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:34:35Z is after 2025-08-24T17:21:41Z" Sep 30 07:34:35 crc kubenswrapper[4760]: I0930 07:34:35.266792 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:34:35Z is after 2025-08-24T17:21:41Z" Sep 30 07:34:35 crc kubenswrapper[4760]: I0930 07:34:35.283840 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-6vfjl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"aade7c8e-aa34-4b19-9000-d724950a70d7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:34:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:34:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:34:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8c7192e617554ba6b5b9533792f166344fbf9681b63e57c180809e721cc18168\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:34:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-df7r7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e9c7c74e636ef9dbade2541863f96b1098f7ae5f1ab02740c8934f8c87c4815e\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e9c7c74e636ef9dbade2541863f96b1098f7ae5f1ab02740c8934f8c87c4815e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T07:33:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T07:33:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-df7r7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7abb316d62bc2246ace40ad59b2926962a1c098c5690adea8c20a5b96459a604\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7abb316d62bc2246ace40ad59b2926962a1c098c5690adea8c20a5b96459a604\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T07:33:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T07:33:58Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-df7r7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://96c45841896c0c1569ad3071a3114e8c4965cc3968244c91719bcf5df92e5d0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://96c45841896c0c1569ad3071a3114e8c4965cc3968244c91719bcf5df92e5d0f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T07:33:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T07:33:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-df7r7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://db222
0320618b4dcdbf56bad96dbe01c910fa37948f97792f498b3336212cc7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://db2220320618b4dcdbf56bad96dbe01c910fa37948f97792f498b3336212cc7f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T07:34:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T07:34:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-df7r7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5980d5ebcac5351bfa8c26b90968593f3fae3d434e824db5a182449ef64b0a13\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5980d5ebcac5351bfa8c26b90968593f3fae3d434e824db5a182449ef64b0a13\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T07:34:01Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-09-30T07:34:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-df7r7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5912656a198607c9831da9564b5507d662eb2df00df104d52cf058442a76be6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5912656a198607c9831da9564b5507d662eb2df00df104d52cf058442a76be6f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T07:34:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T07:34:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-df7r7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T07:33:56Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-6vfjl\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:34:35Z is after 2025-08-24T17:21:41Z" Sep 30 07:34:35 crc kubenswrapper[4760]: I0930 07:34:35.321611 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9092fdd9-f142-400d-b446-47b8103b9694\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4cc3328dfae8f6d99c152e6a53b86d72f690afbd7bc4847810f745b321e4f3ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests
\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5aaa898c76100f011c3ad0ba61c9735a940b46b0e770f3ede7fc4c881a4722a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b718c4a73fd119f66dd346618976f0b3589b789576e53d367588c2772becd66e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d00b9a56c23ba3b5
701a31412e038c073b5f16eb4e6ebea42134f7d4c0b0f21\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://00d03c8ff6fa9f490ec0550ac57d20786f1f5c10d91011d678116c2249a79e88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ed4236004989a888f1f864535bedd2dc4bcad4c51c87f12be074abbe5f8fe8bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e3
3e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ed4236004989a888f1f864535bedd2dc4bcad4c51c87f12be074abbe5f8fe8bd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T07:33:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T07:33:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://553fa289f27580c1c38da1e9ea12ae7231fc4f60c660a9a3a177391adba9c3f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://553fa289f27580c1c38da1e9ea12ae7231fc4f60c660a9a3a177391adba9c3f2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T07:33:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T07:33:37Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://83b9ee5e77d60387298ccac46ad416a5d56bfd1d81a6ef7f3ea0482520c2ca08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\
":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://83b9ee5e77d60387298ccac46ad416a5d56bfd1d81a6ef7f3ea0482520c2ca08\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T07:33:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T07:33:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T07:33:35Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:34:35Z is after 2025-08-24T17:21:41Z" Sep 30 07:34:35 crc kubenswrapper[4760]: I0930 07:34:35.335927 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:34:35 crc kubenswrapper[4760]: I0930 07:34:35.336057 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:34:35 crc kubenswrapper[4760]: I0930 07:34:35.336077 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:34:35 crc kubenswrapper[4760]: I0930 07:34:35.336104 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:34:35 crc kubenswrapper[4760]: I0930 07:34:35.336122 4760 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:34:35Z","lastTransitionTime":"2025-09-30T07:34:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 07:34:35 crc kubenswrapper[4760]: I0930 07:34:35.347469 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-sspvl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2c4ca8ea-a714-40e5-9e10-080aef32237b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:56Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:56Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04cb9ff2859a579a0bd97d932177b8c19b786666d0177669e93ccec38abf08d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lllfc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9e64e8566880e723a6cc19ad788bd1cbb02eb639397aea4acd6e9920b034f379\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lllfc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0b6c1fe1d21cc0d118e36d7f9d62fb639a19a42cd8c953700bcd5c289088cdf9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lllfc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bba81e98fd1a75081b7e6a3eaf28b88b4de6e47e1dc158ca72aa2f1c2107933e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:58Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lllfc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://45d8c685d7b690ce81553e4d924934cf761087dd535145c0e8b5a933f2d61e45\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lllfc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef1f6b6ca1a2bbb2f9a166a7244c6411470be4655bc6b2a330fd604c3449d185\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lllfc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b84d165d7a40638496b6759c4c2c265fdf742b2c9c968c503352f468f776b7d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b84d165d7a40638496b6759c4c2c265fdf742b2c9c968c503352f468f776b7d7\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-09-30T07:34:22Z\\\",\\\"message\\\":\\\"ctory.go:160\\\\nI0930 07:34:22.231404 6358 reflector.go:311] Stopping reflector *v1.EgressFirewall (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI0930 07:34:22.232586 6358 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0930 
07:34:22.232624 6358 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0930 07:34:22.232630 6358 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0930 07:34:22.232651 6358 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0930 07:34:22.232660 6358 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0930 07:34:22.232662 6358 handler.go:208] Removed *v1.Node event handler 2\\\\nI0930 07:34:22.232664 6358 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0930 07:34:22.232685 6358 handler.go:208] Removed *v1.Node event handler 7\\\\nI0930 07:34:22.232682 6358 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0930 07:34:22.232709 6358 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0930 07:34:22.232740 6358 factory.go:656] Stopping watch factory\\\\nI0930 07:34:22.232758 6358 ovnkube.go:599] Stopped ovnkube\\\\nI0930 07:34:22.232760 6358 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0930 07:34:2\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T07:34:21Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-sspvl_openshift-ovn-kubernetes(2c4ca8ea-a714-40e5-9e10-080aef32237b)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lllfc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e72f37942962598d925dfb305311c94c3ca920f93243bb6e97839add4a830a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:34:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lllfc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1198d13e2814e4dc38c4411d751fba40440a5967d7ca04c855e34fe90f9a6535\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1198d13e2814e4dc38
c4411d751fba40440a5967d7ca04c855e34fe90f9a6535\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T07:33:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T07:33:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lllfc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T07:33:56Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-sspvl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:34:35Z is after 2025-08-24T17:21:41Z" Sep 30 07:34:35 crc kubenswrapper[4760]: I0930 07:34:35.364734 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"943219e9-5457-4767-b29c-cdd155ca3cb1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:34:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:34:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d9171c7e60156e3ffabb5954ff097de7c4cb0629967cdddb44f16bf86ae38c0a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f75ced5ba38de06feef6d8addd42a76b6b14cab6a67a351432fdb580eb0ca3c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5ead42377ef36b0c38401509b29432dd9b6fdf73cc39f2dbea0930415492a4ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2f5de75ed9124ec7a6cd0d20862001c06327d05876de38fdbb1270a9e9e8df54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://2f5de75ed9124ec7a6cd0d20862001c06327d05876de38fdbb1270a9e9e8df54\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T07:33:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T07:33:36Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T07:33:35Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:34:35Z is after 2025-08-24T17:21:41Z" Sep 30 07:34:35 crc kubenswrapper[4760]: I0930 07:34:35.386716 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c5ae239e08e07bda9532c0b6b94ef3687cf512cd05965e09123bbb036f915a5b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3abb25619d0c80e84a2095aba45fe6a865125d242e44e9f10820e8c41a3b0db9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:34:35Z is after 2025-08-24T17:21:41Z" Sep 30 07:34:35 crc kubenswrapper[4760]: I0930 07:34:35.406055 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:34:35Z is after 2025-08-24T17:21:41Z" Sep 30 07:34:35 crc kubenswrapper[4760]: I0930 07:34:35.422657 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-sv6wk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4a8035f8-210d-4a09-bca5-274ced93774c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://774d38a9cedeb69a2093a9de05db794e856295535f86d73a75c6feea05b61cb3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6pk8k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T07:33:56Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-sv6wk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:34:35Z is after 2025-08-24T17:21:41Z" Sep 30 07:34:35 crc kubenswrapper[4760]: I0930 07:34:35.439372 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:34:35 crc kubenswrapper[4760]: I0930 07:34:35.439442 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:34:35 crc kubenswrapper[4760]: I0930 07:34:35.439461 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:34:35 crc kubenswrapper[4760]: I0930 07:34:35.439489 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:34:35 crc kubenswrapper[4760]: I0930 07:34:35.439508 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:34:35Z","lastTransitionTime":"2025-09-30T07:34:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 07:34:35 crc kubenswrapper[4760]: I0930 07:34:35.441779 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-lqpj4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2b6201d8-80fd-4701-a4a8-f7ebca1f34ad\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:34:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:34:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:34:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:34:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://09a33e356ac5e20ea894804e56d62f6220b8b9a5123d1b0acc9bbe33a3083792\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:34:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled
\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dc5j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://722e623d70449c90fb388b89511f1e75ed55015ec9caa45d9aaa9ac3d9649778\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:34:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dc5j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T07:34:08Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-lqpj4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:34:35Z is after 2025-08-24T17:21:41Z" Sep 30 07:34:35 crc kubenswrapper[4760]: I0930 07:34:35.464905 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-wv8fz" err="failed to 
patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ce6dcf25-c8ea-450b-9fc6-9f8aeafde757\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:34:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:34:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:34:10Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:34:10Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcwbs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcwbs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T07:34:10Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-wv8fz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:34:35Z is after 2025-08-24T17:21:41Z" Sep 30 07:34:35 crc 
kubenswrapper[4760]: I0930 07:34:35.543236 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:34:35 crc kubenswrapper[4760]: I0930 07:34:35.543534 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:34:35 crc kubenswrapper[4760]: I0930 07:34:35.543688 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:34:35 crc kubenswrapper[4760]: I0930 07:34:35.543849 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:34:35 crc kubenswrapper[4760]: I0930 07:34:35.543987 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:34:35Z","lastTransitionTime":"2025-09-30T07:34:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 07:34:35 crc kubenswrapper[4760]: I0930 07:34:35.647547 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:34:35 crc kubenswrapper[4760]: I0930 07:34:35.647622 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:34:35 crc kubenswrapper[4760]: I0930 07:34:35.647642 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:34:35 crc kubenswrapper[4760]: I0930 07:34:35.647672 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:34:35 crc kubenswrapper[4760]: I0930 07:34:35.647696 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:34:35Z","lastTransitionTime":"2025-09-30T07:34:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 07:34:35 crc kubenswrapper[4760]: I0930 07:34:35.751255 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:34:35 crc kubenswrapper[4760]: I0930 07:34:35.751710 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:34:35 crc kubenswrapper[4760]: I0930 07:34:35.751817 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:34:35 crc kubenswrapper[4760]: I0930 07:34:35.751928 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:34:35 crc kubenswrapper[4760]: I0930 07:34:35.752020 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:34:35Z","lastTransitionTime":"2025-09-30T07:34:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 07:34:35 crc kubenswrapper[4760]: I0930 07:34:35.855764 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:34:35 crc kubenswrapper[4760]: I0930 07:34:35.855817 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:34:35 crc kubenswrapper[4760]: I0930 07:34:35.855829 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:34:35 crc kubenswrapper[4760]: I0930 07:34:35.855847 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:34:35 crc kubenswrapper[4760]: I0930 07:34:35.855857 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:34:35Z","lastTransitionTime":"2025-09-30T07:34:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 07:34:35 crc kubenswrapper[4760]: I0930 07:34:35.958275 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:34:35 crc kubenswrapper[4760]: I0930 07:34:35.958363 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:34:35 crc kubenswrapper[4760]: I0930 07:34:35.958381 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:34:35 crc kubenswrapper[4760]: I0930 07:34:35.958406 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:34:35 crc kubenswrapper[4760]: I0930 07:34:35.958425 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:34:35Z","lastTransitionTime":"2025-09-30T07:34:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 07:34:36 crc kubenswrapper[4760]: I0930 07:34:36.060878 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:34:36 crc kubenswrapper[4760]: I0930 07:34:36.060944 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:34:36 crc kubenswrapper[4760]: I0930 07:34:36.060965 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:34:36 crc kubenswrapper[4760]: I0930 07:34:36.060990 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:34:36 crc kubenswrapper[4760]: I0930 07:34:36.061008 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:34:36Z","lastTransitionTime":"2025-09-30T07:34:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 07:34:36 crc kubenswrapper[4760]: I0930 07:34:36.066471 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 07:34:36 crc kubenswrapper[4760]: I0930 07:34:36.066529 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 07:34:36 crc kubenswrapper[4760]: E0930 07:34:36.066634 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 07:34:36 crc kubenswrapper[4760]: E0930 07:34:36.066808 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 07:34:36 crc kubenswrapper[4760]: I0930 07:34:36.163516 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:34:36 crc kubenswrapper[4760]: I0930 07:34:36.163554 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:34:36 crc kubenswrapper[4760]: I0930 07:34:36.163563 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:34:36 crc kubenswrapper[4760]: I0930 07:34:36.163578 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:34:36 crc kubenswrapper[4760]: I0930 07:34:36.163588 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:34:36Z","lastTransitionTime":"2025-09-30T07:34:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 07:34:36 crc kubenswrapper[4760]: I0930 07:34:36.268004 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:34:36 crc kubenswrapper[4760]: I0930 07:34:36.268063 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:34:36 crc kubenswrapper[4760]: I0930 07:34:36.268084 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:34:36 crc kubenswrapper[4760]: I0930 07:34:36.268110 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:34:36 crc kubenswrapper[4760]: I0930 07:34:36.268183 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:34:36Z","lastTransitionTime":"2025-09-30T07:34:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 07:34:36 crc kubenswrapper[4760]: I0930 07:34:36.371730 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:34:36 crc kubenswrapper[4760]: I0930 07:34:36.371798 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:34:36 crc kubenswrapper[4760]: I0930 07:34:36.371819 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:34:36 crc kubenswrapper[4760]: I0930 07:34:36.371847 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:34:36 crc kubenswrapper[4760]: I0930 07:34:36.371868 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:34:36Z","lastTransitionTime":"2025-09-30T07:34:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 07:34:36 crc kubenswrapper[4760]: I0930 07:34:36.475161 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:34:36 crc kubenswrapper[4760]: I0930 07:34:36.475229 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:34:36 crc kubenswrapper[4760]: I0930 07:34:36.475246 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:34:36 crc kubenswrapper[4760]: I0930 07:34:36.475271 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:34:36 crc kubenswrapper[4760]: I0930 07:34:36.475292 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:34:36Z","lastTransitionTime":"2025-09-30T07:34:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 07:34:36 crc kubenswrapper[4760]: I0930 07:34:36.578579 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:34:36 crc kubenswrapper[4760]: I0930 07:34:36.578647 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:34:36 crc kubenswrapper[4760]: I0930 07:34:36.578667 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:34:36 crc kubenswrapper[4760]: I0930 07:34:36.578693 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:34:36 crc kubenswrapper[4760]: I0930 07:34:36.578712 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:34:36Z","lastTransitionTime":"2025-09-30T07:34:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 07:34:36 crc kubenswrapper[4760]: I0930 07:34:36.682449 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:34:36 crc kubenswrapper[4760]: I0930 07:34:36.682513 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:34:36 crc kubenswrapper[4760]: I0930 07:34:36.682530 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:34:36 crc kubenswrapper[4760]: I0930 07:34:36.682556 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:34:36 crc kubenswrapper[4760]: I0930 07:34:36.682574 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:34:36Z","lastTransitionTime":"2025-09-30T07:34:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 07:34:36 crc kubenswrapper[4760]: I0930 07:34:36.786260 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:34:36 crc kubenswrapper[4760]: I0930 07:34:36.786366 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:34:36 crc kubenswrapper[4760]: I0930 07:34:36.786390 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:34:36 crc kubenswrapper[4760]: I0930 07:34:36.786424 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:34:36 crc kubenswrapper[4760]: I0930 07:34:36.786447 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:34:36Z","lastTransitionTime":"2025-09-30T07:34:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 07:34:36 crc kubenswrapper[4760]: I0930 07:34:36.890178 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:34:36 crc kubenswrapper[4760]: I0930 07:34:36.890660 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:34:36 crc kubenswrapper[4760]: I0930 07:34:36.890867 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:34:36 crc kubenswrapper[4760]: I0930 07:34:36.891010 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:34:36 crc kubenswrapper[4760]: I0930 07:34:36.891154 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:34:36Z","lastTransitionTime":"2025-09-30T07:34:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 07:34:36 crc kubenswrapper[4760]: I0930 07:34:36.994504 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:34:36 crc kubenswrapper[4760]: I0930 07:34:36.994570 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:34:36 crc kubenswrapper[4760]: I0930 07:34:36.994589 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:34:36 crc kubenswrapper[4760]: I0930 07:34:36.994613 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:34:36 crc kubenswrapper[4760]: I0930 07:34:36.994632 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:34:36Z","lastTransitionTime":"2025-09-30T07:34:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 07:34:37 crc kubenswrapper[4760]: I0930 07:34:37.066988 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wv8fz" Sep 30 07:34:37 crc kubenswrapper[4760]: I0930 07:34:37.067065 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 07:34:37 crc kubenswrapper[4760]: E0930 07:34:37.067172 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-wv8fz" podUID="ce6dcf25-c8ea-450b-9fc6-9f8aeafde757" Sep 30 07:34:37 crc kubenswrapper[4760]: E0930 07:34:37.067294 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 07:34:37 crc kubenswrapper[4760]: I0930 07:34:37.097989 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:34:37 crc kubenswrapper[4760]: I0930 07:34:37.098052 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:34:37 crc kubenswrapper[4760]: I0930 07:34:37.098070 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:34:37 crc kubenswrapper[4760]: I0930 07:34:37.098121 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:34:37 crc kubenswrapper[4760]: I0930 07:34:37.098141 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:34:37Z","lastTransitionTime":"2025-09-30T07:34:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 07:34:37 crc kubenswrapper[4760]: I0930 07:34:37.201940 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:34:37 crc kubenswrapper[4760]: I0930 07:34:37.201994 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:34:37 crc kubenswrapper[4760]: I0930 07:34:37.202015 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:34:37 crc kubenswrapper[4760]: I0930 07:34:37.202038 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:34:37 crc kubenswrapper[4760]: I0930 07:34:37.202056 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:34:37Z","lastTransitionTime":"2025-09-30T07:34:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 07:34:37 crc kubenswrapper[4760]: I0930 07:34:37.305434 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:34:37 crc kubenswrapper[4760]: I0930 07:34:37.305483 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:34:37 crc kubenswrapper[4760]: I0930 07:34:37.305500 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:34:37 crc kubenswrapper[4760]: I0930 07:34:37.305525 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:34:37 crc kubenswrapper[4760]: I0930 07:34:37.305543 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:34:37Z","lastTransitionTime":"2025-09-30T07:34:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 07:34:37 crc kubenswrapper[4760]: I0930 07:34:37.409080 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:34:37 crc kubenswrapper[4760]: I0930 07:34:37.409152 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:34:37 crc kubenswrapper[4760]: I0930 07:34:37.409171 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:34:37 crc kubenswrapper[4760]: I0930 07:34:37.409197 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:34:37 crc kubenswrapper[4760]: I0930 07:34:37.409217 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:34:37Z","lastTransitionTime":"2025-09-30T07:34:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 07:34:37 crc kubenswrapper[4760]: I0930 07:34:37.513330 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:34:37 crc kubenswrapper[4760]: I0930 07:34:37.513403 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:34:37 crc kubenswrapper[4760]: I0930 07:34:37.513430 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:34:37 crc kubenswrapper[4760]: I0930 07:34:37.513460 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:34:37 crc kubenswrapper[4760]: I0930 07:34:37.513482 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:34:37Z","lastTransitionTime":"2025-09-30T07:34:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 07:34:37 crc kubenswrapper[4760]: I0930 07:34:37.616198 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:34:37 crc kubenswrapper[4760]: I0930 07:34:37.616235 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:34:37 crc kubenswrapper[4760]: I0930 07:34:37.616248 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:34:37 crc kubenswrapper[4760]: I0930 07:34:37.616265 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:34:37 crc kubenswrapper[4760]: I0930 07:34:37.616276 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:34:37Z","lastTransitionTime":"2025-09-30T07:34:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 07:34:37 crc kubenswrapper[4760]: I0930 07:34:37.719657 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:34:37 crc kubenswrapper[4760]: I0930 07:34:37.719717 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:34:37 crc kubenswrapper[4760]: I0930 07:34:37.719735 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:34:37 crc kubenswrapper[4760]: I0930 07:34:37.719760 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:34:37 crc kubenswrapper[4760]: I0930 07:34:37.719779 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:34:37Z","lastTransitionTime":"2025-09-30T07:34:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 07:34:37 crc kubenswrapper[4760]: I0930 07:34:37.823416 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:34:37 crc kubenswrapper[4760]: I0930 07:34:37.823472 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:34:37 crc kubenswrapper[4760]: I0930 07:34:37.823484 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:34:37 crc kubenswrapper[4760]: I0930 07:34:37.823515 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:34:37 crc kubenswrapper[4760]: I0930 07:34:37.823531 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:34:37Z","lastTransitionTime":"2025-09-30T07:34:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 07:34:37 crc kubenswrapper[4760]: I0930 07:34:37.926067 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:34:37 crc kubenswrapper[4760]: I0930 07:34:37.926132 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:34:37 crc kubenswrapper[4760]: I0930 07:34:37.926149 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:34:37 crc kubenswrapper[4760]: I0930 07:34:37.926174 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:34:37 crc kubenswrapper[4760]: I0930 07:34:37.926191 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:34:37Z","lastTransitionTime":"2025-09-30T07:34:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 07:34:38 crc kubenswrapper[4760]: I0930 07:34:38.028915 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:34:38 crc kubenswrapper[4760]: I0930 07:34:38.028966 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:34:38 crc kubenswrapper[4760]: I0930 07:34:38.028983 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:34:38 crc kubenswrapper[4760]: I0930 07:34:38.029005 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:34:38 crc kubenswrapper[4760]: I0930 07:34:38.029021 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:34:38Z","lastTransitionTime":"2025-09-30T07:34:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 07:34:38 crc kubenswrapper[4760]: I0930 07:34:38.066804 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 07:34:38 crc kubenswrapper[4760]: E0930 07:34:38.066961 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 07:34:38 crc kubenswrapper[4760]: I0930 07:34:38.067036 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 07:34:38 crc kubenswrapper[4760]: E0930 07:34:38.067117 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 07:34:38 crc kubenswrapper[4760]: I0930 07:34:38.132547 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:34:38 crc kubenswrapper[4760]: I0930 07:34:38.132591 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:34:38 crc kubenswrapper[4760]: I0930 07:34:38.132607 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:34:38 crc kubenswrapper[4760]: I0930 07:34:38.132628 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:34:38 crc kubenswrapper[4760]: I0930 07:34:38.132645 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:34:38Z","lastTransitionTime":"2025-09-30T07:34:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 07:34:38 crc kubenswrapper[4760]: I0930 07:34:38.235199 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:34:38 crc kubenswrapper[4760]: I0930 07:34:38.235245 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:34:38 crc kubenswrapper[4760]: I0930 07:34:38.235262 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:34:38 crc kubenswrapper[4760]: I0930 07:34:38.235284 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:34:38 crc kubenswrapper[4760]: I0930 07:34:38.235356 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:34:38Z","lastTransitionTime":"2025-09-30T07:34:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 07:34:38 crc kubenswrapper[4760]: I0930 07:34:38.338289 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:34:38 crc kubenswrapper[4760]: I0930 07:34:38.338380 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:34:38 crc kubenswrapper[4760]: I0930 07:34:38.338399 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:34:38 crc kubenswrapper[4760]: I0930 07:34:38.338423 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:34:38 crc kubenswrapper[4760]: I0930 07:34:38.338440 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:34:38Z","lastTransitionTime":"2025-09-30T07:34:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 07:34:38 crc kubenswrapper[4760]: I0930 07:34:38.441494 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:34:38 crc kubenswrapper[4760]: I0930 07:34:38.441560 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:34:38 crc kubenswrapper[4760]: I0930 07:34:38.441578 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:34:38 crc kubenswrapper[4760]: I0930 07:34:38.441601 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:34:38 crc kubenswrapper[4760]: I0930 07:34:38.441635 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:34:38Z","lastTransitionTime":"2025-09-30T07:34:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 07:34:38 crc kubenswrapper[4760]: I0930 07:34:38.544977 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:34:38 crc kubenswrapper[4760]: I0930 07:34:38.545066 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:34:38 crc kubenswrapper[4760]: I0930 07:34:38.545093 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:34:38 crc kubenswrapper[4760]: I0930 07:34:38.545129 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:34:38 crc kubenswrapper[4760]: I0930 07:34:38.545152 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:34:38Z","lastTransitionTime":"2025-09-30T07:34:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 07:34:38 crc kubenswrapper[4760]: I0930 07:34:38.648527 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:34:38 crc kubenswrapper[4760]: I0930 07:34:38.648577 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:34:38 crc kubenswrapper[4760]: I0930 07:34:38.648594 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:34:38 crc kubenswrapper[4760]: I0930 07:34:38.648619 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:34:38 crc kubenswrapper[4760]: I0930 07:34:38.648636 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:34:38Z","lastTransitionTime":"2025-09-30T07:34:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 07:34:38 crc kubenswrapper[4760]: I0930 07:34:38.751729 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:34:38 crc kubenswrapper[4760]: I0930 07:34:38.751781 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:34:38 crc kubenswrapper[4760]: I0930 07:34:38.751794 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:34:38 crc kubenswrapper[4760]: I0930 07:34:38.751816 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:34:38 crc kubenswrapper[4760]: I0930 07:34:38.751828 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:34:38Z","lastTransitionTime":"2025-09-30T07:34:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 07:34:38 crc kubenswrapper[4760]: I0930 07:34:38.854828 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:34:38 crc kubenswrapper[4760]: I0930 07:34:38.854888 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:34:38 crc kubenswrapper[4760]: I0930 07:34:38.854908 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:34:38 crc kubenswrapper[4760]: I0930 07:34:38.854933 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:34:38 crc kubenswrapper[4760]: I0930 07:34:38.854950 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:34:38Z","lastTransitionTime":"2025-09-30T07:34:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 07:34:38 crc kubenswrapper[4760]: I0930 07:34:38.957883 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:34:38 crc kubenswrapper[4760]: I0930 07:34:38.957915 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:34:38 crc kubenswrapper[4760]: I0930 07:34:38.957926 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:34:38 crc kubenswrapper[4760]: I0930 07:34:38.957939 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:34:38 crc kubenswrapper[4760]: I0930 07:34:38.957950 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:34:38Z","lastTransitionTime":"2025-09-30T07:34:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 07:34:39 crc kubenswrapper[4760]: I0930 07:34:39.061452 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:34:39 crc kubenswrapper[4760]: I0930 07:34:39.061526 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:34:39 crc kubenswrapper[4760]: I0930 07:34:39.061546 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:34:39 crc kubenswrapper[4760]: I0930 07:34:39.061574 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:34:39 crc kubenswrapper[4760]: I0930 07:34:39.061592 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:34:39Z","lastTransitionTime":"2025-09-30T07:34:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 07:34:39 crc kubenswrapper[4760]: I0930 07:34:39.066847 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 07:34:39 crc kubenswrapper[4760]: I0930 07:34:39.066878 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wv8fz" Sep 30 07:34:39 crc kubenswrapper[4760]: E0930 07:34:39.067049 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 07:34:39 crc kubenswrapper[4760]: E0930 07:34:39.067219 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wv8fz" podUID="ce6dcf25-c8ea-450b-9fc6-9f8aeafde757" Sep 30 07:34:39 crc kubenswrapper[4760]: I0930 07:34:39.164204 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:34:39 crc kubenswrapper[4760]: I0930 07:34:39.164252 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:34:39 crc kubenswrapper[4760]: I0930 07:34:39.164262 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:34:39 crc kubenswrapper[4760]: I0930 07:34:39.164279 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:34:39 crc kubenswrapper[4760]: I0930 07:34:39.164291 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:34:39Z","lastTransitionTime":"2025-09-30T07:34:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 07:34:39 crc kubenswrapper[4760]: I0930 07:34:39.267757 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:34:39 crc kubenswrapper[4760]: I0930 07:34:39.267793 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:34:39 crc kubenswrapper[4760]: I0930 07:34:39.267802 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:34:39 crc kubenswrapper[4760]: I0930 07:34:39.267817 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:34:39 crc kubenswrapper[4760]: I0930 07:34:39.267829 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:34:39Z","lastTransitionTime":"2025-09-30T07:34:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 07:34:39 crc kubenswrapper[4760]: I0930 07:34:39.371116 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:34:39 crc kubenswrapper[4760]: I0930 07:34:39.371153 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:34:39 crc kubenswrapper[4760]: I0930 07:34:39.371162 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:34:39 crc kubenswrapper[4760]: I0930 07:34:39.371177 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:34:39 crc kubenswrapper[4760]: I0930 07:34:39.371187 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:34:39Z","lastTransitionTime":"2025-09-30T07:34:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 07:34:39 crc kubenswrapper[4760]: I0930 07:34:39.473421 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:34:39 crc kubenswrapper[4760]: I0930 07:34:39.473504 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:34:39 crc kubenswrapper[4760]: I0930 07:34:39.473529 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:34:39 crc kubenswrapper[4760]: I0930 07:34:39.473559 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:34:39 crc kubenswrapper[4760]: I0930 07:34:39.473583 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:34:39Z","lastTransitionTime":"2025-09-30T07:34:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 07:34:39 crc kubenswrapper[4760]: I0930 07:34:39.577048 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:34:39 crc kubenswrapper[4760]: I0930 07:34:39.577116 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:34:39 crc kubenswrapper[4760]: I0930 07:34:39.577134 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:34:39 crc kubenswrapper[4760]: I0930 07:34:39.577161 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:34:39 crc kubenswrapper[4760]: I0930 07:34:39.577178 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:34:39Z","lastTransitionTime":"2025-09-30T07:34:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 07:34:39 crc kubenswrapper[4760]: I0930 07:34:39.680363 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:34:39 crc kubenswrapper[4760]: I0930 07:34:39.680418 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:34:39 crc kubenswrapper[4760]: I0930 07:34:39.680429 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:34:39 crc kubenswrapper[4760]: I0930 07:34:39.680445 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:34:39 crc kubenswrapper[4760]: I0930 07:34:39.680457 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:34:39Z","lastTransitionTime":"2025-09-30T07:34:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 07:34:39 crc kubenswrapper[4760]: I0930 07:34:39.782692 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:34:39 crc kubenswrapper[4760]: I0930 07:34:39.782748 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:34:39 crc kubenswrapper[4760]: I0930 07:34:39.782758 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:34:39 crc kubenswrapper[4760]: I0930 07:34:39.782772 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:34:39 crc kubenswrapper[4760]: I0930 07:34:39.782781 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:34:39Z","lastTransitionTime":"2025-09-30T07:34:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 07:34:39 crc kubenswrapper[4760]: I0930 07:34:39.884602 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:34:39 crc kubenswrapper[4760]: I0930 07:34:39.884649 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:34:39 crc kubenswrapper[4760]: I0930 07:34:39.884658 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:34:39 crc kubenswrapper[4760]: I0930 07:34:39.884697 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:34:39 crc kubenswrapper[4760]: I0930 07:34:39.884712 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:34:39Z","lastTransitionTime":"2025-09-30T07:34:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 07:34:39 crc kubenswrapper[4760]: I0930 07:34:39.987780 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:34:39 crc kubenswrapper[4760]: I0930 07:34:39.987819 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:34:39 crc kubenswrapper[4760]: I0930 07:34:39.987831 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:34:39 crc kubenswrapper[4760]: I0930 07:34:39.987846 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:34:39 crc kubenswrapper[4760]: I0930 07:34:39.987881 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:34:39Z","lastTransitionTime":"2025-09-30T07:34:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 07:34:40 crc kubenswrapper[4760]: I0930 07:34:40.066805 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 07:34:40 crc kubenswrapper[4760]: I0930 07:34:40.066856 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 07:34:40 crc kubenswrapper[4760]: E0930 07:34:40.067016 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 07:34:40 crc kubenswrapper[4760]: E0930 07:34:40.067192 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 07:34:40 crc kubenswrapper[4760]: I0930 07:34:40.090330 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:34:40 crc kubenswrapper[4760]: I0930 07:34:40.090364 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:34:40 crc kubenswrapper[4760]: I0930 07:34:40.090372 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:34:40 crc kubenswrapper[4760]: I0930 07:34:40.090386 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:34:40 crc kubenswrapper[4760]: I0930 07:34:40.090396 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:34:40Z","lastTransitionTime":"2025-09-30T07:34:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 07:34:40 crc kubenswrapper[4760]: I0930 07:34:40.192896 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:34:40 crc kubenswrapper[4760]: I0930 07:34:40.192963 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:34:40 crc kubenswrapper[4760]: I0930 07:34:40.192980 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:34:40 crc kubenswrapper[4760]: I0930 07:34:40.193007 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:34:40 crc kubenswrapper[4760]: I0930 07:34:40.193019 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:34:40Z","lastTransitionTime":"2025-09-30T07:34:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 07:34:40 crc kubenswrapper[4760]: I0930 07:34:40.296090 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:34:40 crc kubenswrapper[4760]: I0930 07:34:40.296140 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:34:40 crc kubenswrapper[4760]: I0930 07:34:40.296153 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:34:40 crc kubenswrapper[4760]: I0930 07:34:40.296173 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:34:40 crc kubenswrapper[4760]: I0930 07:34:40.296185 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:34:40Z","lastTransitionTime":"2025-09-30T07:34:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 07:34:40 crc kubenswrapper[4760]: I0930 07:34:40.399176 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:34:40 crc kubenswrapper[4760]: I0930 07:34:40.399387 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:34:40 crc kubenswrapper[4760]: I0930 07:34:40.399409 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:34:40 crc kubenswrapper[4760]: I0930 07:34:40.399436 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:34:40 crc kubenswrapper[4760]: I0930 07:34:40.399453 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:34:40Z","lastTransitionTime":"2025-09-30T07:34:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 07:34:40 crc kubenswrapper[4760]: I0930 07:34:40.501893 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:34:40 crc kubenswrapper[4760]: I0930 07:34:40.501971 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:34:40 crc kubenswrapper[4760]: I0930 07:34:40.501988 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:34:40 crc kubenswrapper[4760]: I0930 07:34:40.502461 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:34:40 crc kubenswrapper[4760]: I0930 07:34:40.502521 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:34:40Z","lastTransitionTime":"2025-09-30T07:34:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 07:34:40 crc kubenswrapper[4760]: I0930 07:34:40.605563 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:34:40 crc kubenswrapper[4760]: I0930 07:34:40.605609 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:34:40 crc kubenswrapper[4760]: I0930 07:34:40.605626 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:34:40 crc kubenswrapper[4760]: I0930 07:34:40.605648 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:34:40 crc kubenswrapper[4760]: I0930 07:34:40.605665 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:34:40Z","lastTransitionTime":"2025-09-30T07:34:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 07:34:40 crc kubenswrapper[4760]: I0930 07:34:40.708133 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:34:40 crc kubenswrapper[4760]: I0930 07:34:40.708176 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:34:40 crc kubenswrapper[4760]: I0930 07:34:40.708190 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:34:40 crc kubenswrapper[4760]: I0930 07:34:40.708210 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:34:40 crc kubenswrapper[4760]: I0930 07:34:40.708222 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:34:40Z","lastTransitionTime":"2025-09-30T07:34:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 07:34:40 crc kubenswrapper[4760]: I0930 07:34:40.811488 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:34:40 crc kubenswrapper[4760]: I0930 07:34:40.811542 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:34:40 crc kubenswrapper[4760]: I0930 07:34:40.811558 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:34:40 crc kubenswrapper[4760]: I0930 07:34:40.811778 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:34:40 crc kubenswrapper[4760]: I0930 07:34:40.811805 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:34:40Z","lastTransitionTime":"2025-09-30T07:34:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 07:34:40 crc kubenswrapper[4760]: I0930 07:34:40.914941 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:34:40 crc kubenswrapper[4760]: I0930 07:34:40.914987 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:34:40 crc kubenswrapper[4760]: I0930 07:34:40.915003 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:34:40 crc kubenswrapper[4760]: I0930 07:34:40.915028 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:34:40 crc kubenswrapper[4760]: I0930 07:34:40.915045 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:34:40Z","lastTransitionTime":"2025-09-30T07:34:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 07:34:41 crc kubenswrapper[4760]: I0930 07:34:41.017366 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:34:41 crc kubenswrapper[4760]: I0930 07:34:41.017403 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:34:41 crc kubenswrapper[4760]: I0930 07:34:41.017419 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:34:41 crc kubenswrapper[4760]: I0930 07:34:41.017440 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:34:41 crc kubenswrapper[4760]: I0930 07:34:41.017455 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:34:41Z","lastTransitionTime":"2025-09-30T07:34:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 07:34:41 crc kubenswrapper[4760]: I0930 07:34:41.066948 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wv8fz" Sep 30 07:34:41 crc kubenswrapper[4760]: I0930 07:34:41.066922 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 07:34:41 crc kubenswrapper[4760]: E0930 07:34:41.067183 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-wv8fz" podUID="ce6dcf25-c8ea-450b-9fc6-9f8aeafde757" Sep 30 07:34:41 crc kubenswrapper[4760]: E0930 07:34:41.067476 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 07:34:41 crc kubenswrapper[4760]: I0930 07:34:41.120582 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:34:41 crc kubenswrapper[4760]: I0930 07:34:41.120629 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:34:41 crc kubenswrapper[4760]: I0930 07:34:41.120640 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:34:41 crc kubenswrapper[4760]: I0930 07:34:41.120658 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:34:41 crc kubenswrapper[4760]: I0930 07:34:41.120883 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:34:41Z","lastTransitionTime":"2025-09-30T07:34:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 07:34:41 crc kubenswrapper[4760]: I0930 07:34:41.223829 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:34:41 crc kubenswrapper[4760]: I0930 07:34:41.223885 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:34:41 crc kubenswrapper[4760]: I0930 07:34:41.223904 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:34:41 crc kubenswrapper[4760]: I0930 07:34:41.223926 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:34:41 crc kubenswrapper[4760]: I0930 07:34:41.223943 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:34:41Z","lastTransitionTime":"2025-09-30T07:34:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 07:34:41 crc kubenswrapper[4760]: I0930 07:34:41.327257 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:34:41 crc kubenswrapper[4760]: I0930 07:34:41.327382 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:34:41 crc kubenswrapper[4760]: I0930 07:34:41.327407 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:34:41 crc kubenswrapper[4760]: I0930 07:34:41.327443 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:34:41 crc kubenswrapper[4760]: I0930 07:34:41.327467 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:34:41Z","lastTransitionTime":"2025-09-30T07:34:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 07:34:41 crc kubenswrapper[4760]: I0930 07:34:41.431037 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:34:41 crc kubenswrapper[4760]: I0930 07:34:41.431092 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:34:41 crc kubenswrapper[4760]: I0930 07:34:41.431108 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:34:41 crc kubenswrapper[4760]: I0930 07:34:41.431130 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:34:41 crc kubenswrapper[4760]: I0930 07:34:41.431146 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:34:41Z","lastTransitionTime":"2025-09-30T07:34:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 07:34:41 crc kubenswrapper[4760]: I0930 07:34:41.534061 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:34:41 crc kubenswrapper[4760]: I0930 07:34:41.534132 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:34:41 crc kubenswrapper[4760]: I0930 07:34:41.534150 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:34:41 crc kubenswrapper[4760]: I0930 07:34:41.534176 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:34:41 crc kubenswrapper[4760]: I0930 07:34:41.534194 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:34:41Z","lastTransitionTime":"2025-09-30T07:34:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 07:34:41 crc kubenswrapper[4760]: I0930 07:34:41.637154 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:34:41 crc kubenswrapper[4760]: I0930 07:34:41.637203 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:34:41 crc kubenswrapper[4760]: I0930 07:34:41.637263 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:34:41 crc kubenswrapper[4760]: I0930 07:34:41.637291 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:34:41 crc kubenswrapper[4760]: I0930 07:34:41.637331 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:34:41Z","lastTransitionTime":"2025-09-30T07:34:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 07:34:41 crc kubenswrapper[4760]: I0930 07:34:41.740939 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:34:41 crc kubenswrapper[4760]: I0930 07:34:41.741046 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:34:41 crc kubenswrapper[4760]: I0930 07:34:41.741066 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:34:41 crc kubenswrapper[4760]: I0930 07:34:41.741097 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:34:41 crc kubenswrapper[4760]: I0930 07:34:41.741117 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:34:41Z","lastTransitionTime":"2025-09-30T07:34:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 07:34:41 crc kubenswrapper[4760]: I0930 07:34:41.844376 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:34:41 crc kubenswrapper[4760]: I0930 07:34:41.844451 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:34:41 crc kubenswrapper[4760]: I0930 07:34:41.844473 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:34:41 crc kubenswrapper[4760]: I0930 07:34:41.844507 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:34:41 crc kubenswrapper[4760]: I0930 07:34:41.844527 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:34:41Z","lastTransitionTime":"2025-09-30T07:34:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 07:34:41 crc kubenswrapper[4760]: I0930 07:34:41.947637 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:34:41 crc kubenswrapper[4760]: I0930 07:34:41.947689 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:34:41 crc kubenswrapper[4760]: I0930 07:34:41.947699 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:34:41 crc kubenswrapper[4760]: I0930 07:34:41.947715 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:34:41 crc kubenswrapper[4760]: I0930 07:34:41.947726 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:34:41Z","lastTransitionTime":"2025-09-30T07:34:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 07:34:42 crc kubenswrapper[4760]: I0930 07:34:42.050399 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:34:42 crc kubenswrapper[4760]: I0930 07:34:42.050441 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:34:42 crc kubenswrapper[4760]: I0930 07:34:42.050453 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:34:42 crc kubenswrapper[4760]: I0930 07:34:42.050469 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:34:42 crc kubenswrapper[4760]: I0930 07:34:42.050479 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:34:42Z","lastTransitionTime":"2025-09-30T07:34:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 07:34:42 crc kubenswrapper[4760]: I0930 07:34:42.065863 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 07:34:42 crc kubenswrapper[4760]: I0930 07:34:42.065894 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 07:34:42 crc kubenswrapper[4760]: E0930 07:34:42.065980 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 07:34:42 crc kubenswrapper[4760]: E0930 07:34:42.066079 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 07:34:42 crc kubenswrapper[4760]: I0930 07:34:42.153409 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:34:42 crc kubenswrapper[4760]: I0930 07:34:42.153434 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:34:42 crc kubenswrapper[4760]: I0930 07:34:42.153452 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:34:42 crc kubenswrapper[4760]: I0930 07:34:42.153464 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:34:42 crc kubenswrapper[4760]: I0930 07:34:42.153473 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:34:42Z","lastTransitionTime":"2025-09-30T07:34:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 07:34:42 crc kubenswrapper[4760]: I0930 07:34:42.279119 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:34:42 crc kubenswrapper[4760]: I0930 07:34:42.279142 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:34:42 crc kubenswrapper[4760]: I0930 07:34:42.279151 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:34:42 crc kubenswrapper[4760]: I0930 07:34:42.279162 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:34:42 crc kubenswrapper[4760]: I0930 07:34:42.279171 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:34:42Z","lastTransitionTime":"2025-09-30T07:34:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 07:34:42 crc kubenswrapper[4760]: I0930 07:34:42.381655 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:34:42 crc kubenswrapper[4760]: I0930 07:34:42.381688 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:34:42 crc kubenswrapper[4760]: I0930 07:34:42.381697 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:34:42 crc kubenswrapper[4760]: I0930 07:34:42.381710 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:34:42 crc kubenswrapper[4760]: I0930 07:34:42.381719 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:34:42Z","lastTransitionTime":"2025-09-30T07:34:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 07:34:42 crc kubenswrapper[4760]: I0930 07:34:42.484314 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:34:42 crc kubenswrapper[4760]: I0930 07:34:42.484341 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:34:42 crc kubenswrapper[4760]: I0930 07:34:42.484351 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:34:42 crc kubenswrapper[4760]: I0930 07:34:42.484362 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:34:42 crc kubenswrapper[4760]: I0930 07:34:42.484375 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:34:42Z","lastTransitionTime":"2025-09-30T07:34:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 07:34:42 crc kubenswrapper[4760]: I0930 07:34:42.586842 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:34:42 crc kubenswrapper[4760]: I0930 07:34:42.586915 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:34:42 crc kubenswrapper[4760]: I0930 07:34:42.586936 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:34:42 crc kubenswrapper[4760]: I0930 07:34:42.586971 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:34:42 crc kubenswrapper[4760]: I0930 07:34:42.586991 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:34:42Z","lastTransitionTime":"2025-09-30T07:34:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 07:34:42 crc kubenswrapper[4760]: I0930 07:34:42.643688 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:34:42 crc kubenswrapper[4760]: I0930 07:34:42.643749 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:34:42 crc kubenswrapper[4760]: I0930 07:34:42.643768 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:34:42 crc kubenswrapper[4760]: I0930 07:34:42.643793 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:34:42 crc kubenswrapper[4760]: I0930 07:34:42.643811 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:34:42Z","lastTransitionTime":"2025-09-30T07:34:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 07:34:42 crc kubenswrapper[4760]: E0930 07:34:42.663003 4760 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T07:34:42Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T07:34:42Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T07:34:42Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T07:34:42Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T07:34:42Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T07:34:42Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T07:34:42Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T07:34:42Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"5c02496b-3bdb-4a64-91fc-57c59208ba25\\\",\\\"systemUUID\\\":\\\"ef0efcac-382e-4544-a77e-6adf149d5981\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:34:42Z is after 2025-08-24T17:21:41Z" Sep 30 07:34:42 crc kubenswrapper[4760]: I0930 07:34:42.670799 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:34:42 crc kubenswrapper[4760]: I0930 07:34:42.670860 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:34:42 crc kubenswrapper[4760]: I0930 07:34:42.670889 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:34:42 crc kubenswrapper[4760]: I0930 07:34:42.670918 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:34:42 crc kubenswrapper[4760]: I0930 07:34:42.670939 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:34:42Z","lastTransitionTime":"2025-09-30T07:34:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 07:34:42 crc kubenswrapper[4760]: E0930 07:34:42.691170 4760 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T07:34:42Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T07:34:42Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T07:34:42Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T07:34:42Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T07:34:42Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T07:34:42Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T07:34:42Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T07:34:42Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"5c02496b-3bdb-4a64-91fc-57c59208ba25\\\",\\\"systemUUID\\\":\\\"ef0efcac-382e-4544-a77e-6adf149d5981\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:34:42Z is after 2025-08-24T17:21:41Z" Sep 30 07:34:42 crc kubenswrapper[4760]: I0930 07:34:42.696013 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:34:42 crc kubenswrapper[4760]: I0930 07:34:42.696066 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:34:42 crc kubenswrapper[4760]: I0930 07:34:42.696085 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:34:42 crc kubenswrapper[4760]: I0930 07:34:42.696110 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:34:42 crc kubenswrapper[4760]: I0930 07:34:42.696139 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:34:42Z","lastTransitionTime":"2025-09-30T07:34:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 07:34:42 crc kubenswrapper[4760]: E0930 07:34:42.713983 4760 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T07:34:42Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T07:34:42Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T07:34:42Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T07:34:42Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T07:34:42Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T07:34:42Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T07:34:42Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T07:34:42Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"5c02496b-3bdb-4a64-91fc-57c59208ba25\\\",\\\"systemUUID\\\":\\\"ef0efcac-382e-4544-a77e-6adf149d5981\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:34:42Z is after 2025-08-24T17:21:41Z" Sep 30 07:34:42 crc kubenswrapper[4760]: I0930 07:34:42.718524 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:34:42 crc kubenswrapper[4760]: I0930 07:34:42.718568 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:34:42 crc kubenswrapper[4760]: I0930 07:34:42.718577 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:34:42 crc kubenswrapper[4760]: I0930 07:34:42.718594 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:34:42 crc kubenswrapper[4760]: I0930 07:34:42.718604 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:34:42Z","lastTransitionTime":"2025-09-30T07:34:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 07:34:42 crc kubenswrapper[4760]: E0930 07:34:42.734984 4760 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T07:34:42Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T07:34:42Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T07:34:42Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T07:34:42Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T07:34:42Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T07:34:42Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T07:34:42Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T07:34:42Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"5c02496b-3bdb-4a64-91fc-57c59208ba25\\\",\\\"systemUUID\\\":\\\"ef0efcac-382e-4544-a77e-6adf149d5981\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:34:42Z is after 2025-08-24T17:21:41Z" Sep 30 07:34:42 crc kubenswrapper[4760]: I0930 07:34:42.739043 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:34:42 crc kubenswrapper[4760]: I0930 07:34:42.739096 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:34:42 crc kubenswrapper[4760]: I0930 07:34:42.739114 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:34:42 crc kubenswrapper[4760]: I0930 07:34:42.739139 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:34:42 crc kubenswrapper[4760]: I0930 07:34:42.739155 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:34:42Z","lastTransitionTime":"2025-09-30T07:34:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 07:34:42 crc kubenswrapper[4760]: E0930 07:34:42.758194 4760 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T07:34:42Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T07:34:42Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T07:34:42Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T07:34:42Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T07:34:42Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T07:34:42Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T07:34:42Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T07:34:42Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"5c02496b-3bdb-4a64-91fc-57c59208ba25\\\",\\\"systemUUID\\\":\\\"ef0efcac-382e-4544-a77e-6adf149d5981\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:34:42Z is after 2025-08-24T17:21:41Z" Sep 30 07:34:42 crc kubenswrapper[4760]: E0930 07:34:42.758440 4760 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Sep 30 07:34:42 crc kubenswrapper[4760]: I0930 07:34:42.760843 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:34:42 crc kubenswrapper[4760]: I0930 07:34:42.761073 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:34:42 crc kubenswrapper[4760]: I0930 07:34:42.761236 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:34:42 crc kubenswrapper[4760]: I0930 07:34:42.761450 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:34:42 crc kubenswrapper[4760]: I0930 07:34:42.761590 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:34:42Z","lastTransitionTime":"2025-09-30T07:34:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 07:34:42 crc kubenswrapper[4760]: I0930 07:34:42.833670 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ce6dcf25-c8ea-450b-9fc6-9f8aeafde757-metrics-certs\") pod \"network-metrics-daemon-wv8fz\" (UID: \"ce6dcf25-c8ea-450b-9fc6-9f8aeafde757\") " pod="openshift-multus/network-metrics-daemon-wv8fz" Sep 30 07:34:42 crc kubenswrapper[4760]: E0930 07:34:42.833842 4760 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Sep 30 07:34:42 crc kubenswrapper[4760]: E0930 07:34:42.834261 4760 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ce6dcf25-c8ea-450b-9fc6-9f8aeafde757-metrics-certs podName:ce6dcf25-c8ea-450b-9fc6-9f8aeafde757 nodeName:}" failed. No retries permitted until 2025-09-30 07:35:14.834227922 +0000 UTC m=+100.477134364 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/ce6dcf25-c8ea-450b-9fc6-9f8aeafde757-metrics-certs") pod "network-metrics-daemon-wv8fz" (UID: "ce6dcf25-c8ea-450b-9fc6-9f8aeafde757") : object "openshift-multus"/"metrics-daemon-secret" not registered Sep 30 07:34:42 crc kubenswrapper[4760]: I0930 07:34:42.864170 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:34:42 crc kubenswrapper[4760]: I0930 07:34:42.864207 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:34:42 crc kubenswrapper[4760]: I0930 07:34:42.864217 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:34:42 crc kubenswrapper[4760]: I0930 07:34:42.864235 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:34:42 crc kubenswrapper[4760]: I0930 07:34:42.864247 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:34:42Z","lastTransitionTime":"2025-09-30T07:34:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 07:34:42 crc kubenswrapper[4760]: I0930 07:34:42.967356 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:34:42 crc kubenswrapper[4760]: I0930 07:34:42.967402 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:34:42 crc kubenswrapper[4760]: I0930 07:34:42.967411 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:34:42 crc kubenswrapper[4760]: I0930 07:34:42.967427 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:34:42 crc kubenswrapper[4760]: I0930 07:34:42.967438 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:34:42Z","lastTransitionTime":"2025-09-30T07:34:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 07:34:43 crc kubenswrapper[4760]: I0930 07:34:43.066583 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 07:34:43 crc kubenswrapper[4760]: I0930 07:34:43.066636 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wv8fz" Sep 30 07:34:43 crc kubenswrapper[4760]: E0930 07:34:43.067374 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 07:34:43 crc kubenswrapper[4760]: E0930 07:34:43.067426 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wv8fz" podUID="ce6dcf25-c8ea-450b-9fc6-9f8aeafde757" Sep 30 07:34:43 crc kubenswrapper[4760]: I0930 07:34:43.069842 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:34:43 crc kubenswrapper[4760]: I0930 07:34:43.069875 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:34:43 crc kubenswrapper[4760]: I0930 07:34:43.069885 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:34:43 crc kubenswrapper[4760]: I0930 07:34:43.069898 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:34:43 crc kubenswrapper[4760]: I0930 07:34:43.069907 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:34:43Z","lastTransitionTime":"2025-09-30T07:34:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 07:34:43 crc kubenswrapper[4760]: I0930 07:34:43.173449 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:34:43 crc kubenswrapper[4760]: I0930 07:34:43.173550 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:34:43 crc kubenswrapper[4760]: I0930 07:34:43.173569 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:34:43 crc kubenswrapper[4760]: I0930 07:34:43.173630 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:34:43 crc kubenswrapper[4760]: I0930 07:34:43.173651 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:34:43Z","lastTransitionTime":"2025-09-30T07:34:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 07:34:43 crc kubenswrapper[4760]: I0930 07:34:43.276505 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:34:43 crc kubenswrapper[4760]: I0930 07:34:43.276567 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:34:43 crc kubenswrapper[4760]: I0930 07:34:43.276586 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:34:43 crc kubenswrapper[4760]: I0930 07:34:43.276611 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:34:43 crc kubenswrapper[4760]: I0930 07:34:43.276629 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:34:43Z","lastTransitionTime":"2025-09-30T07:34:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 07:34:43 crc kubenswrapper[4760]: I0930 07:34:43.379681 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:34:43 crc kubenswrapper[4760]: I0930 07:34:43.379726 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:34:43 crc kubenswrapper[4760]: I0930 07:34:43.379735 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:34:43 crc kubenswrapper[4760]: I0930 07:34:43.379749 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:34:43 crc kubenswrapper[4760]: I0930 07:34:43.379761 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:34:43Z","lastTransitionTime":"2025-09-30T07:34:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 07:34:43 crc kubenswrapper[4760]: I0930 07:34:43.482590 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:34:43 crc kubenswrapper[4760]: I0930 07:34:43.482646 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:34:43 crc kubenswrapper[4760]: I0930 07:34:43.482663 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:34:43 crc kubenswrapper[4760]: I0930 07:34:43.482689 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:34:43 crc kubenswrapper[4760]: I0930 07:34:43.482709 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:34:43Z","lastTransitionTime":"2025-09-30T07:34:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 07:34:43 crc kubenswrapper[4760]: I0930 07:34:43.586158 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:34:43 crc kubenswrapper[4760]: I0930 07:34:43.586216 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:34:43 crc kubenswrapper[4760]: I0930 07:34:43.586240 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:34:43 crc kubenswrapper[4760]: I0930 07:34:43.586268 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:34:43 crc kubenswrapper[4760]: I0930 07:34:43.586290 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:34:43Z","lastTransitionTime":"2025-09-30T07:34:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 07:34:43 crc kubenswrapper[4760]: I0930 07:34:43.689775 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:34:43 crc kubenswrapper[4760]: I0930 07:34:43.689870 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:34:43 crc kubenswrapper[4760]: I0930 07:34:43.689889 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:34:43 crc kubenswrapper[4760]: I0930 07:34:43.689911 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:34:43 crc kubenswrapper[4760]: I0930 07:34:43.689933 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:34:43Z","lastTransitionTime":"2025-09-30T07:34:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 07:34:43 crc kubenswrapper[4760]: I0930 07:34:43.792482 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:34:43 crc kubenswrapper[4760]: I0930 07:34:43.792547 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:34:43 crc kubenswrapper[4760]: I0930 07:34:43.792563 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:34:43 crc kubenswrapper[4760]: I0930 07:34:43.792588 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:34:43 crc kubenswrapper[4760]: I0930 07:34:43.792615 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:34:43Z","lastTransitionTime":"2025-09-30T07:34:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 07:34:43 crc kubenswrapper[4760]: I0930 07:34:43.895151 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:34:43 crc kubenswrapper[4760]: I0930 07:34:43.895189 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:34:43 crc kubenswrapper[4760]: I0930 07:34:43.895199 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:34:43 crc kubenswrapper[4760]: I0930 07:34:43.895213 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:34:43 crc kubenswrapper[4760]: I0930 07:34:43.895223 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:34:43Z","lastTransitionTime":"2025-09-30T07:34:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 07:34:43 crc kubenswrapper[4760]: I0930 07:34:43.998231 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:34:43 crc kubenswrapper[4760]: I0930 07:34:43.998589 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:34:43 crc kubenswrapper[4760]: I0930 07:34:43.998601 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:34:43 crc kubenswrapper[4760]: I0930 07:34:43.998617 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:34:43 crc kubenswrapper[4760]: I0930 07:34:43.998626 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:34:43Z","lastTransitionTime":"2025-09-30T07:34:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 07:34:44 crc kubenswrapper[4760]: I0930 07:34:44.066920 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 07:34:44 crc kubenswrapper[4760]: E0930 07:34:44.067045 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 07:34:44 crc kubenswrapper[4760]: I0930 07:34:44.066919 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 07:34:44 crc kubenswrapper[4760]: E0930 07:34:44.067268 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 07:34:44 crc kubenswrapper[4760]: I0930 07:34:44.100888 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:34:44 crc kubenswrapper[4760]: I0930 07:34:44.100939 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:34:44 crc kubenswrapper[4760]: I0930 07:34:44.100958 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:34:44 crc kubenswrapper[4760]: I0930 07:34:44.100984 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:34:44 crc kubenswrapper[4760]: I0930 07:34:44.101018 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:34:44Z","lastTransitionTime":"2025-09-30T07:34:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 07:34:44 crc kubenswrapper[4760]: I0930 07:34:44.203874 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:34:44 crc kubenswrapper[4760]: I0930 07:34:44.203918 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:34:44 crc kubenswrapper[4760]: I0930 07:34:44.203927 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:34:44 crc kubenswrapper[4760]: I0930 07:34:44.203945 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:34:44 crc kubenswrapper[4760]: I0930 07:34:44.203955 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:34:44Z","lastTransitionTime":"2025-09-30T07:34:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 07:34:44 crc kubenswrapper[4760]: I0930 07:34:44.306513 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:34:44 crc kubenswrapper[4760]: I0930 07:34:44.306562 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:34:44 crc kubenswrapper[4760]: I0930 07:34:44.306573 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:34:44 crc kubenswrapper[4760]: I0930 07:34:44.306594 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:34:44 crc kubenswrapper[4760]: I0930 07:34:44.306607 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:34:44Z","lastTransitionTime":"2025-09-30T07:34:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 07:34:44 crc kubenswrapper[4760]: I0930 07:34:44.408832 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:34:44 crc kubenswrapper[4760]: I0930 07:34:44.408868 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:34:44 crc kubenswrapper[4760]: I0930 07:34:44.408880 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:34:44 crc kubenswrapper[4760]: I0930 07:34:44.408895 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:34:44 crc kubenswrapper[4760]: I0930 07:34:44.408907 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:34:44Z","lastTransitionTime":"2025-09-30T07:34:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 07:34:44 crc kubenswrapper[4760]: I0930 07:34:44.511755 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:34:44 crc kubenswrapper[4760]: I0930 07:34:44.511820 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:34:44 crc kubenswrapper[4760]: I0930 07:34:44.511838 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:34:44 crc kubenswrapper[4760]: I0930 07:34:44.511863 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:34:44 crc kubenswrapper[4760]: I0930 07:34:44.511880 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:34:44Z","lastTransitionTime":"2025-09-30T07:34:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 07:34:44 crc kubenswrapper[4760]: I0930 07:34:44.541713 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-lvdpk_f50c364e-d22c-4fe5-a0aa-66f4e8d8b21e/kube-multus/0.log" Sep 30 07:34:44 crc kubenswrapper[4760]: I0930 07:34:44.541796 4760 generic.go:334] "Generic (PLEG): container finished" podID="f50c364e-d22c-4fe5-a0aa-66f4e8d8b21e" containerID="96fd613333f4911d54aca94a38724570ef878c3f55ef48caa79eb1c14d0b2014" exitCode=1 Sep 30 07:34:44 crc kubenswrapper[4760]: I0930 07:34:44.541839 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-lvdpk" event={"ID":"f50c364e-d22c-4fe5-a0aa-66f4e8d8b21e","Type":"ContainerDied","Data":"96fd613333f4911d54aca94a38724570ef878c3f55ef48caa79eb1c14d0b2014"} Sep 30 07:34:44 crc kubenswrapper[4760]: I0930 07:34:44.542509 4760 scope.go:117] "RemoveContainer" containerID="96fd613333f4911d54aca94a38724570ef878c3f55ef48caa79eb1c14d0b2014" Sep 30 07:34:44 crc kubenswrapper[4760]: I0930 07:34:44.559983 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:59Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:59Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fd70ee9ea129c67bb1a9ffac6e1d2abc903949ccd6d2d549f5d78472813dd222\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-09-30T07:34:44Z is after 2025-08-24T17:21:41Z" Sep 30 07:34:44 crc kubenswrapper[4760]: I0930 07:34:44.582653 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-f2lrk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7a9c8270-6964-4886-87d0-227b05b76da4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e3cf6cd0daf5df3e48c5d44b1a443b1e5646009b9eba42bc3869ee9841e472d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-r
bac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cctgt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://90d669a96085f6fb9b5c9507ff1d5c960bdeafa1d01d7c115b89788e14e155d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cctgt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T07:33:56Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-f2lrk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:34:44Z is after 2025-08-24T17:21:41Z" Sep 30 07:34:44 crc kubenswrapper[4760]: I0930 07:34:44.598773 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-lvdpk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f50c364e-d22c-4fe5-a0aa-66f4e8d8b21e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:34:44Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:34:44Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://96fd613333f4911d54aca94a38724570ef878c3f55ef48caa79eb1c14d0b2014\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://96fd613333f4911d54aca94a38724570ef878c3f55ef48caa79eb1c14d0b2014\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-09-30T07:34:43Z\\\",\\\"message\\\":\\\"2025-09-30T07:33:58+00:00 [cnibincopy] Successfully copied files in 
/usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_f7972461-b57d-4fae-b02a-2e738164e9c7\\\\n2025-09-30T07:33:58+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_f7972461-b57d-4fae-b02a-2e738164e9c7 to /host/opt/cni/bin/\\\\n2025-09-30T07:33:58Z [verbose] multus-daemon started\\\\n2025-09-30T07:33:58Z [verbose] Readiness Indicator file check\\\\n2025-09-30T07:34:43Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T07:33:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/c
ni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8g8kp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T07:33:56Z\\\"}}\" for pod \"openshift-multus\"/\"multus-lvdpk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:34:44Z is after 2025-08-24T17:21:41Z" Sep 30 07:34:44 crc kubenswrapper[4760]: I0930 07:34:44.611433 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-rpvhp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a8530550-438d-46a5-aa3f-4b10838396f1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:34:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:34:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:34:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0434ec52328c8ee0a2b28fb47833cc6cf12a7048f4c4a217518e029e616fb6a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7j7qm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T07:33:58Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-rpvhp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:34:44Z is after 2025-08-24T17:21:41Z" Sep 30 07:34:44 crc kubenswrapper[4760]: I0930 07:34:44.615431 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:34:44 crc kubenswrapper[4760]: I0930 07:34:44.615493 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:34:44 crc kubenswrapper[4760]: I0930 07:34:44.615508 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:34:44 crc kubenswrapper[4760]: I0930 07:34:44.615531 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:34:44 crc kubenswrapper[4760]: I0930 07:34:44.615547 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:34:44Z","lastTransitionTime":"2025-09-30T07:34:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 07:34:44 crc kubenswrapper[4760]: I0930 07:34:44.634390 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9092fdd9-f142-400d-b446-47b8103b9694\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4cc3328dfae8f6d99c152e6a53b86d72f690afbd7bc4847810f745b321e4f3ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5aaa898c76100f011c3ad0ba61c9735a940b46b0e770f3ede7fc4c881a4722a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b718c4a73fd119f66dd346618976f0b3589b789576e53d367588c2772becd66e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d00b9a56c23ba3b5701a31412e038c073b5f16eb4e6ebea42134f7d4c0b0f21\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://00d03c8ff6fa9f490ec0550ac57d20786f1f5c10d91011d678116c2249a79e88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ed4236004989a888f1f864535bedd2dc4bcad4c51c87f12be074abbe5f8fe8bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ed4236004989a888f1f864535bedd2dc4bcad4c51c87f12be074abbe5f8fe8bd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T07:33:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T07:33:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://553fa289f27580c1c38da1e9ea12ae7231fc4f60c660a9a3a177391adba9c3f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://553fa289f27580c1c38da1e9ea12ae7231fc4f60c660a9a3a177391adba9c3f2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T07:33:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T07:33:37Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://83b9ee5e77d60387298ccac46ad416a5d56bfd1d81a6ef7f3ea0482520c2ca08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://83b9ee5e77d60387298ccac46ad416a5d56bfd1d81a6ef7f3ea0482520c2ca08\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T07:33:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
025-09-30T07:33:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T07:33:35Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:34:44Z is after 2025-08-24T17:21:41Z" Sep 30 07:34:44 crc kubenswrapper[4760]: I0930 07:34:44.656791 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"031672b4-4779-4226-97aa-f5c81a729234\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b3e26d92b5602604d33e83e83e677084c642df64c9d0b9f9e022b036091fb92e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://53390cb3cc86d0db1019c461bf6cb987190e6fa8812f735844028dfa1ce96648\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf87f93832356c96c1c5e90e6459ccdd23700fd12c6d7410305c66f99f6ed18f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e4230b2df5bf87138f5a40f146bef562ae63ec6f5b73d023e4e2b60722854882\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:3
8Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9c416bd4523f4384b4a2283eb2ada263982380925cbc17713726e0790e5fdee\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:38Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2cd9c18868b27fcfb67186a86f415b484f009c33c07d18f5d508155ed011b5d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2cd9c18868b27fcfb67186a86f415b484f009c33c07d18f5d508155ed011b5d4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T07:33:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T07:33:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startT
ime\\\":\\\"2025-09-30T07:33:35Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:34:44Z is after 2025-08-24T17:21:41Z" Sep 30 07:34:44 crc kubenswrapper[4760]: I0930 07:34:44.671819 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"67985d3d-0af6-4a8b-b45a-623aed2e502e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43b97af91f474a3b43f207d377ca205e459b782e552a32a134a8b46e89fc8d30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\
\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d748511816054d5f621bd55e203baea2e6e5ac90dec796364b46224308cfd78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bf1b2ce86bf47ce917a20b48f64b7c4419e3a5c551d006d97171311c3b190f0c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"c
ert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e7c9b2cd9f096d99a8b239c0c2dc771e94e6522a3380733b7452969c9fe2dfd2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T07:33:35Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:34:44Z is after 2025-08-24T17:21:41Z" Sep 30 07:34:44 crc kubenswrapper[4760]: I0930 07:34:44.686095 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c5ac421ed3707970914cf165f54a9324134079296d1b6936332c6047b9c23132\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-09-30T07:34:44Z is after 2025-08-24T17:21:41Z" Sep 30 07:34:44 crc kubenswrapper[4760]: I0930 07:34:44.699157 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:53Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:34:44Z is after 2025-08-24T17:21:41Z" Sep 30 07:34:44 crc kubenswrapper[4760]: I0930 07:34:44.713094 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:34:44Z is after 2025-08-24T17:21:41Z" Sep 30 07:34:44 crc kubenswrapper[4760]: I0930 07:34:44.717332 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:34:44 crc kubenswrapper[4760]: I0930 07:34:44.717363 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:34:44 crc kubenswrapper[4760]: I0930 07:34:44.717371 4760 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:34:44 crc kubenswrapper[4760]: I0930 07:34:44.717384 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:34:44 crc kubenswrapper[4760]: I0930 07:34:44.717395 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:34:44Z","lastTransitionTime":"2025-09-30T07:34:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 07:34:44 crc kubenswrapper[4760]: I0930 07:34:44.734753 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-6vfjl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"aade7c8e-aa34-4b19-9000-d724950a70d7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:34:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:34:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:34:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8c7192e617554ba6b5b9
533792f166344fbf9681b63e57c180809e721cc18168\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:34:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-df7r7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e9c7c74e636ef9dbade2541863f96b1098f7ae5f1ab02740c8934f8c87c4815e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e9c7c74e636ef9dbade2541863f96b1098f7ae5f1ab02740c8934f8c87c4815e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T07:33:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T07:33:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled
\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-df7r7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7abb316d62bc2246ace40ad59b2926962a1c098c5690adea8c20a5b96459a604\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7abb316d62bc2246ace40ad59b2926962a1c098c5690adea8c20a5b96459a604\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T07:33:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T07:33:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-df7r7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://96c45841896c0c1569ad3071a3114e8c4965cc3968244c91719bcf5df92e5d0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64
d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://96c45841896c0c1569ad3071a3114e8c4965cc3968244c91719bcf5df92e5d0f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T07:33:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T07:33:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-df7r7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://db2220320618b4dcdbf56bad96dbe01c910fa37948f97792f498b3336212cc7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://db2220320618b4dcdbf56bad96dbe01c910fa37948f97792f498b3336212cc7f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T07:34:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T07:34:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-df7r7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5980d5ebcac5351bfa8c26b90968593f3fae3d434e824db5a182449ef64b0a13\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5980d5ebcac5351bfa8c26b90968593f3fae3d434e824db5a182449ef64b0a13\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T07:34:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T07:34:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-df7r7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5912656a198607c9831da9564b5507d662eb2df00df104d52cf058442a76be6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\
\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5912656a198607c9831da9564b5507d662eb2df00df104d52cf058442a76be6f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T07:34:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T07:34:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-df7r7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T07:33:56Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-6vfjl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:34:44Z is after 2025-08-24T17:21:41Z" Sep 30 07:34:44 crc kubenswrapper[4760]: I0930 07:34:44.751414 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"943219e9-5457-4767-b29c-cdd155ca3cb1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:34:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:34:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d9171c7e60156e3ffabb5954ff097de7c4cb0629967cdddb44f16bf86ae38c0a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f75ced5ba38de06feef6d8addd42a76b6b14cab6a67a351432fdb580eb0ca3c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5ead42377ef36b0c38401509b29432dd9b6fdf73cc39f2dbea0930415492a4ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2f5de75ed9124ec7a6cd0d20862001c06327d05876de38fdbb1270a9e9e8df54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://2f5de75ed9124ec7a6cd0d20862001c06327d05876de38fdbb1270a9e9e8df54\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T07:33:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T07:33:36Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T07:33:35Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:34:44Z is after 2025-08-24T17:21:41Z" Sep 30 07:34:44 crc kubenswrapper[4760]: I0930 07:34:44.778811 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-sspvl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2c4ca8ea-a714-40e5-9e10-080aef32237b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:56Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:56Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04cb9ff2859a579a0bd97d932177b8c19b786666d0177669e93ccec38abf08d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lllfc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9e64e8566880e723a6cc19ad788bd1cbb02eb639397aea4acd6e9920b034f379\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lllfc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0b6c1fe1d21cc0d118e36d7f9d62fb639a19a42cd8c953700bcd5c289088cdf9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lllfc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bba81e98fd1a75081b7e6a3eaf28b88b4de6e47e1dc158ca72aa2f1c2107933e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lllfc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://45d8c685d7b690ce81553e4d924934cf761087dd535145c0e8b5a933f2d61e45\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lllfc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef1f6b6ca1a2bbb2f9a166a7244c6411470be4655bc6b2a330fd604c3449d185\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lllfc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b84d165d7a40638496b6759c4c2c265fdf742b2c9c968c503352f468f776b7d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b84d165d7a40638496b6759c4c2c265fdf742b2c9c968c503352f468f776b7d7\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-09-30T07:34:22Z\\\",\\\"message\\\":\\\"ctory.go:160\\\\nI0930 07:34:22.231404 6358 reflector.go:311] Stopping reflector 
*v1.EgressFirewall (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI0930 07:34:22.232586 6358 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0930 07:34:22.232624 6358 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0930 07:34:22.232630 6358 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0930 07:34:22.232651 6358 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0930 07:34:22.232660 6358 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0930 07:34:22.232662 6358 handler.go:208] Removed *v1.Node event handler 2\\\\nI0930 07:34:22.232664 6358 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0930 07:34:22.232685 6358 handler.go:208] Removed *v1.Node event handler 7\\\\nI0930 07:34:22.232682 6358 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0930 07:34:22.232709 6358 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0930 07:34:22.232740 6358 factory.go:656] Stopping watch factory\\\\nI0930 07:34:22.232758 6358 ovnkube.go:599] Stopped ovnkube\\\\nI0930 07:34:22.232760 6358 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0930 07:34:2\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T07:34:21Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-sspvl_openshift-ovn-kubernetes(2c4ca8ea-a714-40e5-9e10-080aef32237b)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lllfc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e72f37942962598d925dfb305311c94c3ca920f93243bb6e97839add4a830a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:34:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lllfc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1198d13e2814e4dc38c4411d751fba40440a5967d7ca04c855e34fe90f9a6535\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1198d13e2814e4dc38
c4411d751fba40440a5967d7ca04c855e34fe90f9a6535\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T07:33:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T07:33:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lllfc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T07:33:56Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-sspvl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:34:44Z is after 2025-08-24T17:21:41Z" Sep 30 07:34:44 crc kubenswrapper[4760]: I0930 07:34:44.800028 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c5ae239e08e07bda9532c0b6b94ef3687cf512cd05965e09123bbb036f915a5b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3abb25619d0c80e84a2095aba45fe6a865125d242e44e9f10820e8c41a3b0db9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:34:44Z is after 2025-08-24T17:21:41Z" Sep 30 07:34:44 crc kubenswrapper[4760]: I0930 07:34:44.817882 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:34:44Z is after 2025-08-24T17:21:41Z" Sep 30 07:34:44 crc kubenswrapper[4760]: I0930 07:34:44.819574 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:34:44 crc kubenswrapper[4760]: I0930 07:34:44.819624 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Sep 30 07:34:44 crc kubenswrapper[4760]: I0930 07:34:44.819636 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:34:44 crc kubenswrapper[4760]: I0930 07:34:44.819652 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:34:44 crc kubenswrapper[4760]: I0930 07:34:44.819662 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:34:44Z","lastTransitionTime":"2025-09-30T07:34:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 07:34:44 crc kubenswrapper[4760]: I0930 07:34:44.831110 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-sv6wk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4a8035f8-210d-4a09-bca5-274ced93774c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://774d38a9cedeb69a2093a9de05db794e856295535f86d73a75c6feea05b61cb3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6pk8k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T07:33:56Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-sv6wk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:34:44Z is after 2025-08-24T17:21:41Z" Sep 30 07:34:44 crc kubenswrapper[4760]: I0930 07:34:44.844533 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-lqpj4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2b6201d8-80fd-4701-a4a8-f7ebca1f34ad\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:34:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:34:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:34:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:34:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://09a33e356ac5e20ea894804e56d62f6220b8b9a5123d1b0acc9bbe33a3083792\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d7
93426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:34:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dc5j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://722e623d70449c90fb388b89511f1e75ed55015ec9caa45d9aaa9ac3d9649778\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:34:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dc5j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T07:34:08Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-lqpj4\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:34:44Z is after 2025-08-24T17:21:41Z" Sep 30 07:34:44 crc kubenswrapper[4760]: I0930 07:34:44.857564 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-wv8fz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ce6dcf25-c8ea-450b-9fc6-9f8aeafde757\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:34:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:34:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:34:10Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:34:10Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcwbs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcwbs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T07:34:10Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-wv8fz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:34:44Z is after 2025-08-24T17:21:41Z" Sep 30 07:34:44 crc 
kubenswrapper[4760]: I0930 07:34:44.922407 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:34:44 crc kubenswrapper[4760]: I0930 07:34:44.922469 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:34:44 crc kubenswrapper[4760]: I0930 07:34:44.922480 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:34:44 crc kubenswrapper[4760]: I0930 07:34:44.922498 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:34:44 crc kubenswrapper[4760]: I0930 07:34:44.922511 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:34:44Z","lastTransitionTime":"2025-09-30T07:34:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 07:34:45 crc kubenswrapper[4760]: I0930 07:34:45.024780 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:34:45 crc kubenswrapper[4760]: I0930 07:34:45.024817 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:34:45 crc kubenswrapper[4760]: I0930 07:34:45.024827 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:34:45 crc kubenswrapper[4760]: I0930 07:34:45.024841 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:34:45 crc kubenswrapper[4760]: I0930 07:34:45.024852 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:34:45Z","lastTransitionTime":"2025-09-30T07:34:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 07:34:45 crc kubenswrapper[4760]: I0930 07:34:45.065863 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wv8fz" Sep 30 07:34:45 crc kubenswrapper[4760]: I0930 07:34:45.065900 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 07:34:45 crc kubenswrapper[4760]: E0930 07:34:45.065989 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-wv8fz" podUID="ce6dcf25-c8ea-450b-9fc6-9f8aeafde757" Sep 30 07:34:45 crc kubenswrapper[4760]: E0930 07:34:45.066083 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 07:34:45 crc kubenswrapper[4760]: I0930 07:34:45.085291 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"943219e9-5457-4767-b29c-cdd155ca3cb1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:34:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:34:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d9171c7e60156e3ffabb5954ff097de7c4cb0629967cdddb44f16bf86ae38c0a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06b
c35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f75ced5ba38de06feef6d8addd42a76b6b14cab6a67a351432fdb580eb0ca3c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5ead42377ef36b0c38401509b29432dd9b6fdf73cc39f2dbea0930415492a4ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":
\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2f5de75ed9124ec7a6cd0d20862001c06327d05876de38fdbb1270a9e9e8df54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2f5de75ed9124ec7a6cd0d20862001c06327d05876de38fdbb1270a9e9e8df54\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T07:33:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T07:33:36Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T07:33:35Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:34:45Z is after 2025-08-24T17:21:41Z" Sep 30 07:34:45 crc kubenswrapper[4760]: I0930 07:34:45.115695 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-sspvl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2c4ca8ea-a714-40e5-9e10-080aef32237b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:56Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:56Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04cb9ff2859a579a0bd97d932177b8c19b786666d0177669e93ccec38abf08d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lllfc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9e64e8566880e723a6cc19ad788bd1cbb02eb639397aea4acd6e9920b034f379\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lllfc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0b6c1fe1d21cc0d118e36d7f9d62fb639a19a42cd8c953700bcd5c289088cdf9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lllfc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bba81e98fd1a75081b7e6a3eaf28b88b4de6e47e1dc158ca72aa2f1c2107933e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:58Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lllfc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://45d8c685d7b690ce81553e4d924934cf761087dd535145c0e8b5a933f2d61e45\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lllfc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef1f6b6ca1a2bbb2f9a166a7244c6411470be4655bc6b2a330fd604c3449d185\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lllfc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b84d165d7a40638496b6759c4c2c265fdf742b2c9c968c503352f468f776b7d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b84d165d7a40638496b6759c4c2c265fdf742b2c9c968c503352f468f776b7d7\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-09-30T07:34:22Z\\\",\\\"message\\\":\\\"ctory.go:160\\\\nI0930 07:34:22.231404 6358 reflector.go:311] Stopping reflector *v1.EgressFirewall (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI0930 07:34:22.232586 6358 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0930 
07:34:22.232624 6358 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0930 07:34:22.232630 6358 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0930 07:34:22.232651 6358 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0930 07:34:22.232660 6358 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0930 07:34:22.232662 6358 handler.go:208] Removed *v1.Node event handler 2\\\\nI0930 07:34:22.232664 6358 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0930 07:34:22.232685 6358 handler.go:208] Removed *v1.Node event handler 7\\\\nI0930 07:34:22.232682 6358 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0930 07:34:22.232709 6358 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0930 07:34:22.232740 6358 factory.go:656] Stopping watch factory\\\\nI0930 07:34:22.232758 6358 ovnkube.go:599] Stopped ovnkube\\\\nI0930 07:34:22.232760 6358 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0930 07:34:2\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T07:34:21Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-sspvl_openshift-ovn-kubernetes(2c4ca8ea-a714-40e5-9e10-080aef32237b)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lllfc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e72f37942962598d925dfb305311c94c3ca920f93243bb6e97839add4a830a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:34:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lllfc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1198d13e2814e4dc38c4411d751fba40440a5967d7ca04c855e34fe90f9a6535\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1198d13e2814e4dc38
c4411d751fba40440a5967d7ca04c855e34fe90f9a6535\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T07:33:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T07:33:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lllfc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T07:33:56Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-sspvl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:34:45Z is after 2025-08-24T17:21:41Z" Sep 30 07:34:45 crc kubenswrapper[4760]: I0930 07:34:45.128510 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:34:45 crc kubenswrapper[4760]: I0930 07:34:45.128580 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:34:45 crc kubenswrapper[4760]: I0930 07:34:45.128598 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:34:45 crc kubenswrapper[4760]: I0930 07:34:45.128624 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:34:45 crc kubenswrapper[4760]: I0930 07:34:45.128643 4760 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:34:45Z","lastTransitionTime":"2025-09-30T07:34:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 07:34:45 crc kubenswrapper[4760]: I0930 07:34:45.140330 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c5ae239e08e07bda9532c0b6b94ef3687cf512cd05965e09123bbb036f915a5b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-ide
ntity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3abb25619d0c80e84a2095aba45fe6a865125d242e44e9f10820e8c41a3b0db9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:34:45Z is after 2025-08-24T17:21:41Z" Sep 30 07:34:45 crc kubenswrapper[4760]: I0930 07:34:45.155920 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:34:45Z is after 2025-08-24T17:21:41Z" Sep 30 07:34:45 crc kubenswrapper[4760]: I0930 07:34:45.172957 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-sv6wk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4a8035f8-210d-4a09-bca5-274ced93774c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://774d38a9cedeb69a2093a9de05db794e856295535f86d73a75c6feea05b61cb3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6pk8k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T07:33:56Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-sv6wk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:34:45Z is after 2025-08-24T17:21:41Z" Sep 30 07:34:45 crc kubenswrapper[4760]: I0930 07:34:45.192835 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-lqpj4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2b6201d8-80fd-4701-a4a8-f7ebca1f34ad\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:34:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:34:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:34:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:34:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://09a33e356ac5e20ea894804e56d62f6220b8b9a5123d1b0acc9bbe33a3083792\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d7
93426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:34:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dc5j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://722e623d70449c90fb388b89511f1e75ed55015ec9caa45d9aaa9ac3d9649778\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:34:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dc5j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T07:34:08Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-lqpj4\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:34:45Z is after 2025-08-24T17:21:41Z" Sep 30 07:34:45 crc kubenswrapper[4760]: I0930 07:34:45.207142 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-wv8fz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ce6dcf25-c8ea-450b-9fc6-9f8aeafde757\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:34:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:34:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:34:10Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:34:10Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcwbs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcwbs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T07:34:10Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-wv8fz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:34:45Z is after 2025-08-24T17:21:41Z" Sep 30 07:34:45 crc 
kubenswrapper[4760]: I0930 07:34:45.223646 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:59Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:59Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fd70ee9ea129c67bb1a9ffac6e1d2abc903949ccd6d2d549f5d78472813dd222\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:34:45Z is after 2025-08-24T17:21:41Z" Sep 30 07:34:45 crc kubenswrapper[4760]: I0930 07:34:45.231130 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:34:45 crc kubenswrapper[4760]: I0930 07:34:45.231176 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:34:45 crc kubenswrapper[4760]: I0930 07:34:45.231213 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:34:45 crc kubenswrapper[4760]: I0930 07:34:45.231231 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:34:45 crc kubenswrapper[4760]: I0930 07:34:45.231244 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:34:45Z","lastTransitionTime":"2025-09-30T07:34:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 07:34:45 crc kubenswrapper[4760]: I0930 07:34:45.238467 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-f2lrk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7a9c8270-6964-4886-87d0-227b05b76da4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e3cf6cd0daf5df3e48c5d44b1a443b1e5646009b9eba42bc3869ee9841e472d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"}
,{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cctgt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://90d669a96085f6fb9b5c9507ff1d5c960bdeafa1d01d7c115b89788e14e155d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cctgt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T07:33:56Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-f2lrk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:34:45Z is after 2025-08-24T17:21:41Z" Sep 30 07:34:45 crc kubenswrapper[4760]: I0930 07:34:45.256578 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-lvdpk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f50c364e-d22c-4fe5-a0aa-66f4e8d8b21e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:34:44Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:34:44Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://96fd613333f4911d54aca94a38724570ef878c3f55ef48caa79eb1c14d0b2014\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://96fd613333f4911d54aca94a38724570ef878c3f55ef48caa79eb1c14d0b2014\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-09-30T07:34:43Z\\\",\\\"message\\\":\\\"2025-09-30T07:33:58+00:00 [cnibincopy] Successfully copied files in 
/usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_f7972461-b57d-4fae-b02a-2e738164e9c7\\\\n2025-09-30T07:33:58+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_f7972461-b57d-4fae-b02a-2e738164e9c7 to /host/opt/cni/bin/\\\\n2025-09-30T07:33:58Z [verbose] multus-daemon started\\\\n2025-09-30T07:33:58Z [verbose] Readiness Indicator file check\\\\n2025-09-30T07:34:43Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T07:33:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/c
ni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8g8kp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T07:33:56Z\\\"}}\" for pod \"openshift-multus\"/\"multus-lvdpk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:34:45Z is after 2025-08-24T17:21:41Z" Sep 30 07:34:45 crc kubenswrapper[4760]: I0930 07:34:45.269657 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-rpvhp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a8530550-438d-46a5-aa3f-4b10838396f1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:34:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:34:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:34:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0434ec52328c8ee0a2b28fb47833cc6cf12a7048f4c4a217518e029e616fb6a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7j7qm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T07:33:58Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-rpvhp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:34:45Z is after 2025-08-24T17:21:41Z" Sep 30 07:34:45 crc kubenswrapper[4760]: I0930 07:34:45.299139 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9092fdd9-f142-400d-b446-47b8103b9694\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4cc3328dfae8f6d99c152e6a53b86d72f690afbd7bc4847810f745b321e4f3ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastSt
ate\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5aaa898c76100f011c3ad0ba61c9735a940b46b0e770f3ede7fc4c881a4722a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b718c4a73fd119f66dd346618976f0b3589b789576e53d367588c2772becd66e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07
:33:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d00b9a56c23ba3b5701a31412e038c073b5f16eb4e6ebea42134f7d4c0b0f21\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://00d03c8ff6fa9f490ec0550ac57d20786f1f5c10d91011d678116c2249a79e88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"19
2.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ed4236004989a888f1f864535bedd2dc4bcad4c51c87f12be074abbe5f8fe8bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ed4236004989a888f1f864535bedd2dc4bcad4c51c87f12be074abbe5f8fe8bd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T07:33:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T07:33:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://553fa289f27580c1c38da1e9ea12ae7231fc4f60c660a9a3a177391adba9c3f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://553fa289f27580c1c38da1e9ea12ae7231fc4f60c660a9a3a177391adba9c3f2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T07:33:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T07:33:37Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://83b9ee5e77d60387298ccac46ad416a5d56bfd1d81a6ef7f3ea0482520c2ca08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/op
enshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://83b9ee5e77d60387298ccac46ad416a5d56bfd1d81a6ef7f3ea0482520c2ca08\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T07:33:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T07:33:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T07:33:35Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:34:45Z is after 2025-08-24T17:21:41Z" Sep 30 07:34:45 crc kubenswrapper[4760]: I0930 07:34:45.313837 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"031672b4-4779-4226-97aa-f5c81a729234\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b3e26d92b5602604d33e83e83e677084c642df64c9d0b9f9e022b036091fb92e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://53390cb3cc86d0db1019c461bf6cb987190e6fa8812f735844028dfa1ce96648\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf87f93832356c96c1c5e90e6459ccdd23700fd12c6d7410305c66f99f6ed18f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e4230b2df5bf87138f5a40f146bef562ae63ec6f5b73d023e4e2b60722854882\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:3
8Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9c416bd4523f4384b4a2283eb2ada263982380925cbc17713726e0790e5fdee\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:38Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2cd9c18868b27fcfb67186a86f415b484f009c33c07d18f5d508155ed011b5d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2cd9c18868b27fcfb67186a86f415b484f009c33c07d18f5d508155ed011b5d4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T07:33:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T07:33:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startT
ime\\\":\\\"2025-09-30T07:33:35Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:34:45Z is after 2025-08-24T17:21:41Z" Sep 30 07:34:45 crc kubenswrapper[4760]: I0930 07:34:45.332387 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"67985d3d-0af6-4a8b-b45a-623aed2e502e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43b97af91f474a3b43f207d377ca205e459b782e552a32a134a8b46e89fc8d30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\
\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d748511816054d5f621bd55e203baea2e6e5ac90dec796364b46224308cfd78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bf1b2ce86bf47ce917a20b48f64b7c4419e3a5c551d006d97171311c3b190f0c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"c
ert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e7c9b2cd9f096d99a8b239c0c2dc771e94e6522a3380733b7452969c9fe2dfd2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T07:33:35Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:34:45Z is after 2025-08-24T17:21:41Z" Sep 30 07:34:45 crc kubenswrapper[4760]: I0930 07:34:45.334569 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:34:45 crc kubenswrapper[4760]: I0930 07:34:45.334623 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:34:45 crc kubenswrapper[4760]: I0930 07:34:45.334641 4760 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:34:45 crc kubenswrapper[4760]: I0930 07:34:45.334664 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:34:45 crc kubenswrapper[4760]: I0930 07:34:45.334681 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:34:45Z","lastTransitionTime":"2025-09-30T07:34:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 07:34:45 crc kubenswrapper[4760]: I0930 07:34:45.354586 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c5ac421ed3707970914cf165f54a9324134079296d1b6936332c6047b9c23132\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d
608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:34:45Z is after 2025-08-24T17:21:41Z" Sep 30 07:34:45 crc kubenswrapper[4760]: I0930 07:34:45.372462 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:53Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:34:45Z is after 2025-08-24T17:21:41Z" Sep 30 07:34:45 crc kubenswrapper[4760]: I0930 07:34:45.388248 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:53Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:34:45Z is after 2025-08-24T17:21:41Z" Sep 30 07:34:45 crc kubenswrapper[4760]: I0930 07:34:45.411942 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-6vfjl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"aade7c8e-aa34-4b19-9000-d724950a70d7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:34:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:34:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:34:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8c7192e617554ba6b5b9533792f166344fbf9681b63e57c180809e721cc18168\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:34:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-df7r7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e9c7c74e636ef9dbade2541863f96b1098f7ae5f1ab02740c8934f8c87c4815e\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e9c7c74e636ef9dbade2541863f96b1098f7ae5f1ab02740c8934f8c87c4815e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T07:33:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T07:33:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-df7r7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7abb316d62bc2246ace40ad59b2926962a1c098c5690adea8c20a5b96459a604\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7abb316d62bc2246ace40ad59b2926962a1c098c5690adea8c20a5b96459a604\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T07:33:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T07:33:58Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-df7r7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://96c45841896c0c1569ad3071a3114e8c4965cc3968244c91719bcf5df92e5d0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://96c45841896c0c1569ad3071a3114e8c4965cc3968244c91719bcf5df92e5d0f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T07:33:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T07:33:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-df7r7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://db222
0320618b4dcdbf56bad96dbe01c910fa37948f97792f498b3336212cc7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://db2220320618b4dcdbf56bad96dbe01c910fa37948f97792f498b3336212cc7f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T07:34:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T07:34:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-df7r7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5980d5ebcac5351bfa8c26b90968593f3fae3d434e824db5a182449ef64b0a13\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5980d5ebcac5351bfa8c26b90968593f3fae3d434e824db5a182449ef64b0a13\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T07:34:01Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-09-30T07:34:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-df7r7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5912656a198607c9831da9564b5507d662eb2df00df104d52cf058442a76be6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5912656a198607c9831da9564b5507d662eb2df00df104d52cf058442a76be6f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T07:34:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T07:34:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-df7r7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T07:33:56Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-6vfjl\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:34:45Z is after 2025-08-24T17:21:41Z" Sep 30 07:34:45 crc kubenswrapper[4760]: I0930 07:34:45.437923 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:34:45 crc kubenswrapper[4760]: I0930 07:34:45.437968 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:34:45 crc kubenswrapper[4760]: I0930 07:34:45.437986 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:34:45 crc kubenswrapper[4760]: I0930 07:34:45.438012 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:34:45 crc kubenswrapper[4760]: I0930 07:34:45.438031 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:34:45Z","lastTransitionTime":"2025-09-30T07:34:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 07:34:45 crc kubenswrapper[4760]: I0930 07:34:45.539813 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:34:45 crc kubenswrapper[4760]: I0930 07:34:45.539865 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:34:45 crc kubenswrapper[4760]: I0930 07:34:45.539874 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:34:45 crc kubenswrapper[4760]: I0930 07:34:45.539890 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:34:45 crc kubenswrapper[4760]: I0930 07:34:45.539903 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:34:45Z","lastTransitionTime":"2025-09-30T07:34:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 07:34:45 crc kubenswrapper[4760]: I0930 07:34:45.546098 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-lvdpk_f50c364e-d22c-4fe5-a0aa-66f4e8d8b21e/kube-multus/0.log" Sep 30 07:34:45 crc kubenswrapper[4760]: I0930 07:34:45.546143 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-lvdpk" event={"ID":"f50c364e-d22c-4fe5-a0aa-66f4e8d8b21e","Type":"ContainerStarted","Data":"0b2f3cfbeb8083c685469a0d988253e1fd9c2403954dda3cc742b87225c82927"} Sep 30 07:34:45 crc kubenswrapper[4760]: I0930 07:34:45.563218 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:59Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:59Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fd70ee9ea129c67bb1a9ffac6e1d2abc903949ccd6d2d549f5d78472813dd222\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt
\\\":\\\"2025-09-30T07:33:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:34:45Z is after 2025-08-24T17:21:41Z" Sep 30 07:34:45 crc kubenswrapper[4760]: I0930 07:34:45.580859 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-f2lrk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7a9c8270-6964-4886-87d0-227b05b76da4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e3cf6cd0daf5df3e48c5d44b1a443b1e5646009b9eba42bc3869ee9841e472d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cctgt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://90d669a96085f6fb9b5c9507ff1d5c960bdeafa1
d01d7c115b89788e14e155d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cctgt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T07:33:56Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-f2lrk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:34:45Z is after 2025-08-24T17:21:41Z" Sep 30 07:34:45 crc kubenswrapper[4760]: I0930 07:34:45.602340 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-lvdpk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f50c364e-d22c-4fe5-a0aa-66f4e8d8b21e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:34:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:34:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0b2f3cfbeb8083c685469a0d988253e1fd9c2403954dda3cc742b87225c82927\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://96fd613333f4911d54aca94a38724570ef878c3f55ef48caa79eb1c14d0b2014\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-09-30T07:34:43Z\\\",\\\"message\\\":\\\"2025-09-30T07:33:58+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_f7972461-b57d-4fae-b02a-2e738164e9c7\\\\n2025-09-30T07:33:58+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_f7972461-b57d-4fae-b02a-2e738164e9c7 to /host/opt/cni/bin/\\\\n2025-09-30T07:33:58Z [verbose] multus-daemon started\\\\n2025-09-30T07:33:58Z [verbose] 
Readiness Indicator file check\\\\n2025-09-30T07:34:43Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T07:33:57Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:34:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8g8kp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T07:33:56Z\\\"}}\" for pod \"openshift-multus\"/\"multus-lvdpk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:34:45Z is after 2025-08-24T17:21:41Z" Sep 30 07:34:45 crc kubenswrapper[4760]: I0930 07:34:45.619394 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-rpvhp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a8530550-438d-46a5-aa3f-4b10838396f1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:34:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:34:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:34:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0434ec52328c8ee0a
2b28fb47833cc6cf12a7048f4c4a217518e029e616fb6a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7j7qm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T07:33:58Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-rpvhp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:34:45Z is after 2025-08-24T17:21:41Z" Sep 30 07:34:45 crc kubenswrapper[4760]: I0930 07:34:45.643737 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:34:45 crc kubenswrapper[4760]: I0930 07:34:45.643838 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:34:45 crc kubenswrapper[4760]: I0930 07:34:45.643863 4760 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientPID" Sep 30 07:34:45 crc kubenswrapper[4760]: I0930 07:34:45.643898 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:34:45 crc kubenswrapper[4760]: I0930 07:34:45.643923 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:34:45Z","lastTransitionTime":"2025-09-30T07:34:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 07:34:45 crc kubenswrapper[4760]: I0930 07:34:45.652409 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9092fdd9-f142-400d-b446-47b8103b9694\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4cc3328dfae8f6d99c152e6a53b86d72f690afbd7bc4847810f745b321e4f3ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b5
4b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5aaa898c76100f011c3ad0ba61c9735a940b46b0e770f3ede7fc4c881a4722a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b718c4a73fd119f66dd346618976f0b3589b789576e53d367588c2772becd66e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastStat
e\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d00b9a56c23ba3b5701a31412e038c073b5f16eb4e6ebea42134f7d4c0b0f21\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://00d03c8ff6fa9f490ec0550ac57d20786f1f5c10d91011d678116c2249a79e88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name
\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ed4236004989a888f1f864535bedd2dc4bcad4c51c87f12be074abbe5f8fe8bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ed4236004989a888f1f864535bedd2dc4bcad4c51c87f12be074abbe5f8fe8bd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T07:33:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T07:33:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://553fa289f27580c1c38da1e9ea12ae7231fc4f60c660a9a3a177391adba9c3f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://553fa289f27580c1c38da1e9ea12ae7231fc4f60c660a9a3a177391adba9c3f2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T07:33:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T07:33:37Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://83b9ee5e77d60387298ccac46ad416a5d56bfd1d81a6ef7f3ea0482520c2ca08\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://83b9ee5e77d60387298ccac46ad416a5d56bfd1d81a6ef7f3ea0482520c2ca08\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T07:33:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T07:33:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T07:33:35Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:34:45Z is after 2025-08-24T17:21:41Z" Sep 30 07:34:45 crc kubenswrapper[4760]: I0930 07:34:45.673970 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"031672b4-4779-4226-97aa-f5c81a729234\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b3e26d92b5602604d33e83e83e677084c642df64c9d0b9f9e022b036091fb92e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://53390cb3cc86d0db1019c461bf6cb987190e6fa8812f735844028dfa1ce96648\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf87f93832356c96c1c5e90e6459ccdd23700fd12c6d7410305c66f99f6ed18f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e4230b2df5bf87138f5a40f146bef562ae63ec6f5b73d023e4e2b60722854882\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:3
8Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9c416bd4523f4384b4a2283eb2ada263982380925cbc17713726e0790e5fdee\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:38Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2cd9c18868b27fcfb67186a86f415b484f009c33c07d18f5d508155ed011b5d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2cd9c18868b27fcfb67186a86f415b484f009c33c07d18f5d508155ed011b5d4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T07:33:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T07:33:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startT
ime\\\":\\\"2025-09-30T07:33:35Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:34:45Z is after 2025-08-24T17:21:41Z" Sep 30 07:34:45 crc kubenswrapper[4760]: I0930 07:34:45.701004 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"67985d3d-0af6-4a8b-b45a-623aed2e502e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43b97af91f474a3b43f207d377ca205e459b782e552a32a134a8b46e89fc8d30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\
\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d748511816054d5f621bd55e203baea2e6e5ac90dec796364b46224308cfd78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bf1b2ce86bf47ce917a20b48f64b7c4419e3a5c551d006d97171311c3b190f0c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"c
ert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e7c9b2cd9f096d99a8b239c0c2dc771e94e6522a3380733b7452969c9fe2dfd2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T07:33:35Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:34:45Z is after 2025-08-24T17:21:41Z" Sep 30 07:34:45 crc kubenswrapper[4760]: I0930 07:34:45.721476 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c5ac421ed3707970914cf165f54a9324134079296d1b6936332c6047b9c23132\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-09-30T07:34:45Z is after 2025-08-24T17:21:41Z" Sep 30 07:34:45 crc kubenswrapper[4760]: I0930 07:34:45.736817 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:53Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:34:45Z is after 2025-08-24T17:21:41Z" Sep 30 07:34:45 crc kubenswrapper[4760]: I0930 07:34:45.750130 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:34:45 crc kubenswrapper[4760]: I0930 07:34:45.750200 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:34:45 crc kubenswrapper[4760]: I0930 07:34:45.750224 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:34:45 crc kubenswrapper[4760]: I0930 07:34:45.750253 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:34:45 crc kubenswrapper[4760]: I0930 07:34:45.750275 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:34:45Z","lastTransitionTime":"2025-09-30T07:34:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 07:34:45 crc kubenswrapper[4760]: I0930 07:34:45.756695 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:53Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:34:45Z is after 2025-08-24T17:21:41Z" Sep 30 07:34:45 crc kubenswrapper[4760]: I0930 07:34:45.780665 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-6vfjl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"aade7c8e-aa34-4b19-9000-d724950a70d7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:34:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:34:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:34:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8c7192e617554ba6b5b9533792f166344fbf9681b63e57c180809e721cc18168\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:34:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-df7r7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e9c7c74e636ef9dbade2541863f96b1098f7ae5f1ab02740c8934f8c87c4815e\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e9c7c74e636ef9dbade2541863f96b1098f7ae5f1ab02740c8934f8c87c4815e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T07:33:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T07:33:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-df7r7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7abb316d62bc2246ace40ad59b2926962a1c098c5690adea8c20a5b96459a604\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7abb316d62bc2246ace40ad59b2926962a1c098c5690adea8c20a5b96459a604\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T07:33:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T07:33:58Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-df7r7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://96c45841896c0c1569ad3071a3114e8c4965cc3968244c91719bcf5df92e5d0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://96c45841896c0c1569ad3071a3114e8c4965cc3968244c91719bcf5df92e5d0f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T07:33:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T07:33:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-df7r7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://db222
0320618b4dcdbf56bad96dbe01c910fa37948f97792f498b3336212cc7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://db2220320618b4dcdbf56bad96dbe01c910fa37948f97792f498b3336212cc7f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T07:34:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T07:34:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-df7r7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5980d5ebcac5351bfa8c26b90968593f3fae3d434e824db5a182449ef64b0a13\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5980d5ebcac5351bfa8c26b90968593f3fae3d434e824db5a182449ef64b0a13\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T07:34:01Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-09-30T07:34:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-df7r7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5912656a198607c9831da9564b5507d662eb2df00df104d52cf058442a76be6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5912656a198607c9831da9564b5507d662eb2df00df104d52cf058442a76be6f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T07:34:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T07:34:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-df7r7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T07:33:56Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-6vfjl\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:34:45Z is after 2025-08-24T17:21:41Z" Sep 30 07:34:45 crc kubenswrapper[4760]: I0930 07:34:45.798189 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"943219e9-5457-4767-b29c-cdd155ca3cb1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:34:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:34:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d9171c7e60156e3ffabb5954ff097de7c4cb0629967cdddb44f16bf86ae38c0a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mou
ntPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f75ced5ba38de06feef6d8addd42a76b6b14cab6a67a351432fdb580eb0ca3c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5ead42377ef36b0c38401509b29432dd9b6fdf73cc39f2dbea0930415492a4ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2f5de75ed9124ec7a6cd0d208
62001c06327d05876de38fdbb1270a9e9e8df54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2f5de75ed9124ec7a6cd0d20862001c06327d05876de38fdbb1270a9e9e8df54\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T07:33:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T07:33:36Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T07:33:35Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:34:45Z is after 2025-08-24T17:21:41Z" Sep 30 07:34:45 crc kubenswrapper[4760]: I0930 07:34:45.827461 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-sspvl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2c4ca8ea-a714-40e5-9e10-080aef32237b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:56Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:56Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04cb9ff2859a579a0bd97d932177b8c19b786666d0177669e93ccec38abf08d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lllfc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9e64e8566880e723a6cc19ad788bd1cbb02eb639397aea4acd6e9920b034f379\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lllfc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0b6c1fe1d21cc0d118e36d7f9d62fb639a19a42cd8c953700bcd5c289088cdf9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lllfc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bba81e98fd1a75081b7e6a3eaf28b88b4de6e47e1dc158ca72aa2f1c2107933e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:58Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lllfc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://45d8c685d7b690ce81553e4d924934cf761087dd535145c0e8b5a933f2d61e45\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lllfc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef1f6b6ca1a2bbb2f9a166a7244c6411470be4655bc6b2a330fd604c3449d185\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lllfc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b84d165d7a40638496b6759c4c2c265fdf742b2c9c968c503352f468f776b7d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b84d165d7a40638496b6759c4c2c265fdf742b2c9c968c503352f468f776b7d7\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-09-30T07:34:22Z\\\",\\\"message\\\":\\\"ctory.go:160\\\\nI0930 07:34:22.231404 6358 reflector.go:311] Stopping reflector *v1.EgressFirewall (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI0930 07:34:22.232586 6358 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0930 
07:34:22.232624 6358 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0930 07:34:22.232630 6358 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0930 07:34:22.232651 6358 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0930 07:34:22.232660 6358 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0930 07:34:22.232662 6358 handler.go:208] Removed *v1.Node event handler 2\\\\nI0930 07:34:22.232664 6358 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0930 07:34:22.232685 6358 handler.go:208] Removed *v1.Node event handler 7\\\\nI0930 07:34:22.232682 6358 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0930 07:34:22.232709 6358 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0930 07:34:22.232740 6358 factory.go:656] Stopping watch factory\\\\nI0930 07:34:22.232758 6358 ovnkube.go:599] Stopped ovnkube\\\\nI0930 07:34:22.232760 6358 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0930 07:34:2\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T07:34:21Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-sspvl_openshift-ovn-kubernetes(2c4ca8ea-a714-40e5-9e10-080aef32237b)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lllfc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e72f37942962598d925dfb305311c94c3ca920f93243bb6e97839add4a830a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:34:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lllfc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1198d13e2814e4dc38c4411d751fba40440a5967d7ca04c855e34fe90f9a6535\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1198d13e2814e4dc38
c4411d751fba40440a5967d7ca04c855e34fe90f9a6535\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T07:33:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T07:33:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lllfc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T07:33:56Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-sspvl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:34:45Z is after 2025-08-24T17:21:41Z" Sep 30 07:34:45 crc kubenswrapper[4760]: I0930 07:34:45.848536 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c5ae239e08e07bda9532c0b6b94ef3687cf512cd05965e09123bbb036f915a5b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3abb25619d0c80e84a2095aba45fe6a865125d242e44e9f10820e8c41a3b0db9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:34:45Z is after 2025-08-24T17:21:41Z" Sep 30 07:34:45 crc kubenswrapper[4760]: I0930 07:34:45.852748 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:34:45 crc kubenswrapper[4760]: I0930 07:34:45.852783 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:34:45 crc kubenswrapper[4760]: I0930 07:34:45.852795 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:34:45 crc kubenswrapper[4760]: I0930 07:34:45.852812 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:34:45 crc kubenswrapper[4760]: I0930 07:34:45.852824 4760 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:34:45Z","lastTransitionTime":"2025-09-30T07:34:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 07:34:45 crc kubenswrapper[4760]: I0930 07:34:45.864257 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:34:45Z is after 2025-08-24T17:21:41Z" Sep 30 07:34:45 crc kubenswrapper[4760]: I0930 07:34:45.877187 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-sv6wk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4a8035f8-210d-4a09-bca5-274ced93774c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://774d38a9cedeb69a2093a9de05db794e856295535f86d73a75c6feea05b61cb3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6pk8k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T07:33:56Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-sv6wk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:34:45Z is after 2025-08-24T17:21:41Z" Sep 30 07:34:45 crc kubenswrapper[4760]: I0930 07:34:45.894028 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-lqpj4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2b6201d8-80fd-4701-a4a8-f7ebca1f34ad\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:34:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:34:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:34:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:34:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://09a33e356ac5e20ea894804e56d62f6220b8b9a5123d1b0acc9bbe33a3083792\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d7
93426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:34:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dc5j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://722e623d70449c90fb388b89511f1e75ed55015ec9caa45d9aaa9ac3d9649778\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:34:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dc5j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T07:34:08Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-lqpj4\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:34:45Z is after 2025-08-24T17:21:41Z" Sep 30 07:34:45 crc kubenswrapper[4760]: I0930 07:34:45.907703 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-wv8fz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ce6dcf25-c8ea-450b-9fc6-9f8aeafde757\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:34:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:34:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:34:10Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:34:10Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcwbs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcwbs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T07:34:10Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-wv8fz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:34:45Z is after 2025-08-24T17:21:41Z" Sep 30 07:34:45 crc 
kubenswrapper[4760]: I0930 07:34:45.956152 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:34:45 crc kubenswrapper[4760]: I0930 07:34:45.956238 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:34:45 crc kubenswrapper[4760]: I0930 07:34:45.956251 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:34:45 crc kubenswrapper[4760]: I0930 07:34:45.956269 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:34:45 crc kubenswrapper[4760]: I0930 07:34:45.956281 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:34:45Z","lastTransitionTime":"2025-09-30T07:34:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 07:34:46 crc kubenswrapper[4760]: I0930 07:34:46.058959 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:34:46 crc kubenswrapper[4760]: I0930 07:34:46.059027 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:34:46 crc kubenswrapper[4760]: I0930 07:34:46.059041 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:34:46 crc kubenswrapper[4760]: I0930 07:34:46.059068 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:34:46 crc kubenswrapper[4760]: I0930 07:34:46.059085 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:34:46Z","lastTransitionTime":"2025-09-30T07:34:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 07:34:46 crc kubenswrapper[4760]: I0930 07:34:46.066185 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 07:34:46 crc kubenswrapper[4760]: I0930 07:34:46.066252 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 07:34:46 crc kubenswrapper[4760]: E0930 07:34:46.066340 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 07:34:46 crc kubenswrapper[4760]: E0930 07:34:46.066446 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 07:34:46 crc kubenswrapper[4760]: I0930 07:34:46.161818 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:34:46 crc kubenswrapper[4760]: I0930 07:34:46.161891 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:34:46 crc kubenswrapper[4760]: I0930 07:34:46.161912 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:34:46 crc kubenswrapper[4760]: I0930 07:34:46.161942 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:34:46 crc kubenswrapper[4760]: I0930 07:34:46.161961 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:34:46Z","lastTransitionTime":"2025-09-30T07:34:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 07:34:46 crc kubenswrapper[4760]: I0930 07:34:46.265390 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:34:46 crc kubenswrapper[4760]: I0930 07:34:46.265476 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:34:46 crc kubenswrapper[4760]: I0930 07:34:46.265505 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:34:46 crc kubenswrapper[4760]: I0930 07:34:46.265538 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:34:46 crc kubenswrapper[4760]: I0930 07:34:46.265563 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:34:46Z","lastTransitionTime":"2025-09-30T07:34:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 07:34:46 crc kubenswrapper[4760]: I0930 07:34:46.368396 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:34:46 crc kubenswrapper[4760]: I0930 07:34:46.368452 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:34:46 crc kubenswrapper[4760]: I0930 07:34:46.368470 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:34:46 crc kubenswrapper[4760]: I0930 07:34:46.368496 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:34:46 crc kubenswrapper[4760]: I0930 07:34:46.368512 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:34:46Z","lastTransitionTime":"2025-09-30T07:34:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 07:34:46 crc kubenswrapper[4760]: I0930 07:34:46.471281 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:34:46 crc kubenswrapper[4760]: I0930 07:34:46.471362 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:34:46 crc kubenswrapper[4760]: I0930 07:34:46.471381 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:34:46 crc kubenswrapper[4760]: I0930 07:34:46.471429 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:34:46 crc kubenswrapper[4760]: I0930 07:34:46.471449 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:34:46Z","lastTransitionTime":"2025-09-30T07:34:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 07:34:46 crc kubenswrapper[4760]: I0930 07:34:46.573840 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:34:46 crc kubenswrapper[4760]: I0930 07:34:46.573889 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:34:46 crc kubenswrapper[4760]: I0930 07:34:46.573898 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:34:46 crc kubenswrapper[4760]: I0930 07:34:46.573917 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:34:46 crc kubenswrapper[4760]: I0930 07:34:46.573927 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:34:46Z","lastTransitionTime":"2025-09-30T07:34:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 07:34:46 crc kubenswrapper[4760]: I0930 07:34:46.676402 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:34:46 crc kubenswrapper[4760]: I0930 07:34:46.676456 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:34:46 crc kubenswrapper[4760]: I0930 07:34:46.676471 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:34:46 crc kubenswrapper[4760]: I0930 07:34:46.676492 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:34:46 crc kubenswrapper[4760]: I0930 07:34:46.676505 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:34:46Z","lastTransitionTime":"2025-09-30T07:34:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 07:34:46 crc kubenswrapper[4760]: I0930 07:34:46.778804 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:34:46 crc kubenswrapper[4760]: I0930 07:34:46.778850 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:34:46 crc kubenswrapper[4760]: I0930 07:34:46.778860 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:34:46 crc kubenswrapper[4760]: I0930 07:34:46.778879 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:34:46 crc kubenswrapper[4760]: I0930 07:34:46.778890 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:34:46Z","lastTransitionTime":"2025-09-30T07:34:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 07:34:46 crc kubenswrapper[4760]: I0930 07:34:46.881897 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:34:46 crc kubenswrapper[4760]: I0930 07:34:46.881957 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:34:46 crc kubenswrapper[4760]: I0930 07:34:46.881971 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:34:46 crc kubenswrapper[4760]: I0930 07:34:46.881996 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:34:46 crc kubenswrapper[4760]: I0930 07:34:46.882009 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:34:46Z","lastTransitionTime":"2025-09-30T07:34:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 07:34:46 crc kubenswrapper[4760]: I0930 07:34:46.984712 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:34:46 crc kubenswrapper[4760]: I0930 07:34:46.984791 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:34:46 crc kubenswrapper[4760]: I0930 07:34:46.984816 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:34:46 crc kubenswrapper[4760]: I0930 07:34:46.984844 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:34:46 crc kubenswrapper[4760]: I0930 07:34:46.984866 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:34:46Z","lastTransitionTime":"2025-09-30T07:34:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 07:34:47 crc kubenswrapper[4760]: I0930 07:34:47.066355 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 07:34:47 crc kubenswrapper[4760]: I0930 07:34:47.066390 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wv8fz" Sep 30 07:34:47 crc kubenswrapper[4760]: E0930 07:34:47.066508 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 07:34:47 crc kubenswrapper[4760]: E0930 07:34:47.066654 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wv8fz" podUID="ce6dcf25-c8ea-450b-9fc6-9f8aeafde757" Sep 30 07:34:47 crc kubenswrapper[4760]: I0930 07:34:47.088392 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:34:47 crc kubenswrapper[4760]: I0930 07:34:47.088439 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:34:47 crc kubenswrapper[4760]: I0930 07:34:47.088453 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:34:47 crc kubenswrapper[4760]: I0930 07:34:47.088472 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:34:47 crc kubenswrapper[4760]: I0930 07:34:47.088485 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:34:47Z","lastTransitionTime":"2025-09-30T07:34:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 07:34:47 crc kubenswrapper[4760]: I0930 07:34:47.191866 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:34:47 crc kubenswrapper[4760]: I0930 07:34:47.192525 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:34:47 crc kubenswrapper[4760]: I0930 07:34:47.192570 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:34:47 crc kubenswrapper[4760]: I0930 07:34:47.192601 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:34:47 crc kubenswrapper[4760]: I0930 07:34:47.192620 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:34:47Z","lastTransitionTime":"2025-09-30T07:34:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 07:34:47 crc kubenswrapper[4760]: I0930 07:34:47.295764 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:34:47 crc kubenswrapper[4760]: I0930 07:34:47.295841 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:34:47 crc kubenswrapper[4760]: I0930 07:34:47.295859 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:34:47 crc kubenswrapper[4760]: I0930 07:34:47.295892 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:34:47 crc kubenswrapper[4760]: I0930 07:34:47.295911 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:34:47Z","lastTransitionTime":"2025-09-30T07:34:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 07:34:47 crc kubenswrapper[4760]: I0930 07:34:47.399448 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:34:47 crc kubenswrapper[4760]: I0930 07:34:47.399808 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:34:47 crc kubenswrapper[4760]: I0930 07:34:47.400001 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:34:47 crc kubenswrapper[4760]: I0930 07:34:47.400136 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:34:47 crc kubenswrapper[4760]: I0930 07:34:47.400295 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:34:47Z","lastTransitionTime":"2025-09-30T07:34:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 07:34:47 crc kubenswrapper[4760]: I0930 07:34:47.504198 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:34:47 crc kubenswrapper[4760]: I0930 07:34:47.504281 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:34:47 crc kubenswrapper[4760]: I0930 07:34:47.504297 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:34:47 crc kubenswrapper[4760]: I0930 07:34:47.504356 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:34:47 crc kubenswrapper[4760]: I0930 07:34:47.504372 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:34:47Z","lastTransitionTime":"2025-09-30T07:34:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 07:34:47 crc kubenswrapper[4760]: I0930 07:34:47.607530 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:34:47 crc kubenswrapper[4760]: I0930 07:34:47.607584 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:34:47 crc kubenswrapper[4760]: I0930 07:34:47.607602 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:34:47 crc kubenswrapper[4760]: I0930 07:34:47.607623 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:34:47 crc kubenswrapper[4760]: I0930 07:34:47.607640 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:34:47Z","lastTransitionTime":"2025-09-30T07:34:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 07:34:47 crc kubenswrapper[4760]: I0930 07:34:47.709875 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:34:47 crc kubenswrapper[4760]: I0930 07:34:47.709947 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:34:47 crc kubenswrapper[4760]: I0930 07:34:47.709964 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:34:47 crc kubenswrapper[4760]: I0930 07:34:47.709993 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:34:47 crc kubenswrapper[4760]: I0930 07:34:47.710010 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:34:47Z","lastTransitionTime":"2025-09-30T07:34:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 07:34:47 crc kubenswrapper[4760]: I0930 07:34:47.812821 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:34:47 crc kubenswrapper[4760]: I0930 07:34:47.812881 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:34:47 crc kubenswrapper[4760]: I0930 07:34:47.812902 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:34:47 crc kubenswrapper[4760]: I0930 07:34:47.812925 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:34:47 crc kubenswrapper[4760]: I0930 07:34:47.812946 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:34:47Z","lastTransitionTime":"2025-09-30T07:34:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 07:34:47 crc kubenswrapper[4760]: I0930 07:34:47.916289 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:34:47 crc kubenswrapper[4760]: I0930 07:34:47.916510 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:34:47 crc kubenswrapper[4760]: I0930 07:34:47.916532 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:34:47 crc kubenswrapper[4760]: I0930 07:34:47.916573 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:34:47 crc kubenswrapper[4760]: I0930 07:34:47.916592 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:34:47Z","lastTransitionTime":"2025-09-30T07:34:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 07:34:48 crc kubenswrapper[4760]: I0930 07:34:48.019719 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:34:48 crc kubenswrapper[4760]: I0930 07:34:48.019788 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:34:48 crc kubenswrapper[4760]: I0930 07:34:48.019812 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:34:48 crc kubenswrapper[4760]: I0930 07:34:48.019843 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:34:48 crc kubenswrapper[4760]: I0930 07:34:48.019866 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:34:48Z","lastTransitionTime":"2025-09-30T07:34:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 07:34:48 crc kubenswrapper[4760]: I0930 07:34:48.066230 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 07:34:48 crc kubenswrapper[4760]: I0930 07:34:48.066247 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 07:34:48 crc kubenswrapper[4760]: E0930 07:34:48.066662 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 07:34:48 crc kubenswrapper[4760]: E0930 07:34:48.067435 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 07:34:48 crc kubenswrapper[4760]: I0930 07:34:48.068040 4760 scope.go:117] "RemoveContainer" containerID="b84d165d7a40638496b6759c4c2c265fdf742b2c9c968c503352f468f776b7d7" Sep 30 07:34:48 crc kubenswrapper[4760]: I0930 07:34:48.122577 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:34:48 crc kubenswrapper[4760]: I0930 07:34:48.122696 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:34:48 crc kubenswrapper[4760]: I0930 07:34:48.122714 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:34:48 crc kubenswrapper[4760]: I0930 07:34:48.122739 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:34:48 crc kubenswrapper[4760]: I0930 07:34:48.122788 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:34:48Z","lastTransitionTime":"2025-09-30T07:34:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 07:34:48 crc kubenswrapper[4760]: I0930 07:34:48.225020 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:34:48 crc kubenswrapper[4760]: I0930 07:34:48.225050 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:34:48 crc kubenswrapper[4760]: I0930 07:34:48.225062 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:34:48 crc kubenswrapper[4760]: I0930 07:34:48.225079 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:34:48 crc kubenswrapper[4760]: I0930 07:34:48.225094 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:34:48Z","lastTransitionTime":"2025-09-30T07:34:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 07:34:48 crc kubenswrapper[4760]: I0930 07:34:48.335084 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:34:48 crc kubenswrapper[4760]: I0930 07:34:48.335146 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:34:48 crc kubenswrapper[4760]: I0930 07:34:48.335170 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:34:48 crc kubenswrapper[4760]: I0930 07:34:48.335199 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:34:48 crc kubenswrapper[4760]: I0930 07:34:48.335222 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:34:48Z","lastTransitionTime":"2025-09-30T07:34:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 07:34:48 crc kubenswrapper[4760]: I0930 07:34:48.438139 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:34:48 crc kubenswrapper[4760]: I0930 07:34:48.438195 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:34:48 crc kubenswrapper[4760]: I0930 07:34:48.438213 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:34:48 crc kubenswrapper[4760]: I0930 07:34:48.438235 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:34:48 crc kubenswrapper[4760]: I0930 07:34:48.438253 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:34:48Z","lastTransitionTime":"2025-09-30T07:34:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 07:34:48 crc kubenswrapper[4760]: I0930 07:34:48.541620 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:34:48 crc kubenswrapper[4760]: I0930 07:34:48.541683 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:34:48 crc kubenswrapper[4760]: I0930 07:34:48.541708 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:34:48 crc kubenswrapper[4760]: I0930 07:34:48.541737 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:34:48 crc kubenswrapper[4760]: I0930 07:34:48.541759 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:34:48Z","lastTransitionTime":"2025-09-30T07:34:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 07:34:48 crc kubenswrapper[4760]: I0930 07:34:48.558612 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-sspvl_2c4ca8ea-a714-40e5-9e10-080aef32237b/ovnkube-controller/2.log" Sep 30 07:34:48 crc kubenswrapper[4760]: I0930 07:34:48.562668 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-sspvl" event={"ID":"2c4ca8ea-a714-40e5-9e10-080aef32237b","Type":"ContainerStarted","Data":"9003b7a7d50bae8b7464fd7cbf7f18dcc7a60af1c5297a18e2f9f8482d161912"} Sep 30 07:34:48 crc kubenswrapper[4760]: I0930 07:34:48.564717 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-sspvl" Sep 30 07:34:48 crc kubenswrapper[4760]: I0930 07:34:48.590861 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"943219e9-5457-4767-b29c-cdd155ca3cb1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:34:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:34:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d9171c7e60156e3ffabb5954ff097de7c4cb0629967cdddb44f16bf86ae38c0a\\\",\\\"image\\\":\\\"quay.io/
openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f75ced5ba38de06feef6d8addd42a76b6b14cab6a67a351432fdb580eb0ca3c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5ead42377ef36b0c38401509b29432dd9b6fdf73cc39f2dbea0930415492a4ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"start
ed\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2f5de75ed9124ec7a6cd0d20862001c06327d05876de38fdbb1270a9e9e8df54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2f5de75ed9124ec7a6cd0d20862001c06327d05876de38fdbb1270a9e9e8df54\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T07:33:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T07:33:36Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T07:33:35Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:34:48Z is after 2025-08-24T17:21:41Z" Sep 30 07:34:48 crc kubenswrapper[4760]: I0930 07:34:48.624578 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-sspvl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2c4ca8ea-a714-40e5-9e10-080aef32237b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:56Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:56Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04cb9ff2859a579a0bd97d932177b8c19b786666d0177669e93ccec38abf08d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lllfc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9e64e8566880e723a6cc19ad788bd1cbb02eb639397aea4acd6e9920b034f379\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lllfc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0b6c1fe1d21cc0d118e36d7f9d62fb639a19a42cd8c953700bcd5c289088cdf9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lllfc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bba81e98fd1a75081b7e6a3eaf28b88b4de6e47e1dc158ca72aa2f1c2107933e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:58Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lllfc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://45d8c685d7b690ce81553e4d924934cf761087dd535145c0e8b5a933f2d61e45\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lllfc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef1f6b6ca1a2bbb2f9a166a7244c6411470be4655bc6b2a330fd604c3449d185\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lllfc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9003b7a7d50bae8b7464fd7cbf7f18dcc7a60af1c5297a18e2f9f8482d161912\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b84d165d7a40638496b6759c4c2c265fdf742b2c9c968c503352f468f776b7d7\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-09-30T07:34:22Z\\\",\\\"message\\\":\\\"ctory.go:160\\\\nI0930 07:34:22.231404 6358 reflector.go:311] Stopping reflector *v1.EgressFirewall (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI0930 07:34:22.232586 6358 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0930 
07:34:22.232624 6358 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0930 07:34:22.232630 6358 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0930 07:34:22.232651 6358 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0930 07:34:22.232660 6358 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0930 07:34:22.232662 6358 handler.go:208] Removed *v1.Node event handler 2\\\\nI0930 07:34:22.232664 6358 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0930 07:34:22.232685 6358 handler.go:208] Removed *v1.Node event handler 7\\\\nI0930 07:34:22.232682 6358 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0930 07:34:22.232709 6358 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0930 07:34:22.232740 6358 factory.go:656] Stopping watch factory\\\\nI0930 07:34:22.232758 6358 ovnkube.go:599] Stopped ovnkube\\\\nI0930 07:34:22.232760 6358 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0930 
07:34:2\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T07:34:21Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:34:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"na
me\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lllfc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e72f37942962598d925dfb305311c94c3ca920f93243bb6e97839add4a830a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:34:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lllfc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1198d13e2814e4dc38c4411d751fba40440a5967d7ca04c855e34fe90f9a6535\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"re
ady\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1198d13e2814e4dc38c4411d751fba40440a5967d7ca04c855e34fe90f9a6535\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T07:33:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T07:33:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lllfc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T07:33:56Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-sspvl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:34:48Z is after 2025-08-24T17:21:41Z" Sep 30 07:34:48 crc kubenswrapper[4760]: I0930 07:34:48.645291 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:34:48 crc kubenswrapper[4760]: I0930 07:34:48.645362 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:34:48 crc kubenswrapper[4760]: I0930 07:34:48.645376 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:34:48 crc kubenswrapper[4760]: I0930 07:34:48.645430 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:34:48 crc kubenswrapper[4760]: I0930 07:34:48.645447 4760 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:34:48Z","lastTransitionTime":"2025-09-30T07:34:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 07:34:48 crc kubenswrapper[4760]: I0930 07:34:48.645600 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-wv8fz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ce6dcf25-c8ea-450b-9fc6-9f8aeafde757\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:34:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:34:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:34:10Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:34:10Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcwbs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcwbs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T07:34:10Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-wv8fz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:34:48Z is after 2025-08-24T17:21:41Z" Sep 30 07:34:48 crc 
kubenswrapper[4760]: I0930 07:34:48.670248 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c5ae239e08e07bda9532c0b6b94ef3687cf512cd05965e09123bbb036f915a5b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3abb25619d0c80e84a2095aba45fe6a865125d242e44e9f10820e8c41a3b0db9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art
-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:34:48Z is after 2025-08-24T17:21:41Z" Sep 30 07:34:48 crc kubenswrapper[4760]: I0930 07:34:48.686036 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers 
with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:34:48Z is after 2025-08-24T17:21:41Z" Sep 30 07:34:48 crc kubenswrapper[4760]: I0930 07:34:48.698466 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-sv6wk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4a8035f8-210d-4a09-bca5-274ced93774c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://774d38a9cedeb69a2093a9de05db794e856295535f86d73a75c6feea05b61cb3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6pk8k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T07:33:56Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-sv6wk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:34:48Z is after 2025-08-24T17:21:41Z" Sep 30 07:34:48 crc kubenswrapper[4760]: I0930 07:34:48.711144 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-lqpj4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2b6201d8-80fd-4701-a4a8-f7ebca1f34ad\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:34:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:34:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:34:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:34:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://09a33e356ac5e20ea894804e56d62f6220b8b9a5123d1b0acc9bbe33a3083792\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d7
93426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:34:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dc5j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://722e623d70449c90fb388b89511f1e75ed55015ec9caa45d9aaa9ac3d9649778\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:34:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dc5j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T07:34:08Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-lqpj4\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:34:48Z is after 2025-08-24T17:21:41Z" Sep 30 07:34:48 crc kubenswrapper[4760]: I0930 07:34:48.726252 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:59Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:59Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fd70ee9ea129c67bb1a9ffac6e1d2abc903949ccd6d2d549f5d78472813dd222\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:34:48Z is after 2025-08-24T17:21:41Z" Sep 30 07:34:48 crc kubenswrapper[4760]: I0930 07:34:48.740243 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-f2lrk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7a9c8270-6964-4886-87d0-227b05b76da4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e3cf6cd0daf5df3e48c5d44b1a443b1e5646009b9eba42bc3869ee9841e472d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"ima
geID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cctgt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://90d669a96085f6fb9b5c9507ff1d5c960bdeafa1d01d7c115b89788e14e155d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cctgt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T07:33:56Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-f2lrk\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:34:48Z is after 2025-08-24T17:21:41Z" Sep 30 07:34:48 crc kubenswrapper[4760]: I0930 07:34:48.754438 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:34:48 crc kubenswrapper[4760]: I0930 07:34:48.754477 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:34:48 crc kubenswrapper[4760]: I0930 07:34:48.754487 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:34:48 crc kubenswrapper[4760]: I0930 07:34:48.754501 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:34:48 crc kubenswrapper[4760]: I0930 07:34:48.754510 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:34:48Z","lastTransitionTime":"2025-09-30T07:34:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 07:34:48 crc kubenswrapper[4760]: I0930 07:34:48.760060 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-lvdpk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f50c364e-d22c-4fe5-a0aa-66f4e8d8b21e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:34:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:34:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0b2f3cfbeb8083c685469a0d988253e1fd9c2403954dda3cc742b87225c82927\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://96fd613333f4911d54aca94a38724570ef878c3f55ef48caa79eb1c14d0b2014\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-09-30T07:34:43Z\\\",\\\"message\\\":\\\"2025-09-30T07:33:58+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_f7972461-b57d-4fae-b02a-2e738164e9c7\\\\n2025-09-30T07:33:58+00:00 
[cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_f7972461-b57d-4fae-b02a-2e738164e9c7 to /host/opt/cni/bin/\\\\n2025-09-30T07:33:58Z [verbose] multus-daemon started\\\\n2025-09-30T07:33:58Z [verbose] Readiness Indicator file check\\\\n2025-09-30T07:34:43Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T07:33:57Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:34:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\
\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8g8kp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T07:33:56Z\\\"}}\" for pod \"openshift-multus\"/\"multus-lvdpk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:34:48Z is after 2025-08-24T17:21:41Z" Sep 30 07:34:48 crc kubenswrapper[4760]: I0930 07:34:48.770972 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-rpvhp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a8530550-438d-46a5-aa3f-4b10838396f1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:34:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:34:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:34:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0434ec52328c8ee0a2b28fb47833cc6cf12a7048f4c4a217518e029e616fb6a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7j7qm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T07:33:58Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-rpvhp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:34:48Z is after 2025-08-24T17:21:41Z" Sep 30 07:34:48 crc kubenswrapper[4760]: I0930 07:34:48.784621 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:53Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was 
deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:34:48Z is after 2025-08-24T17:21:41Z" Sep 30 07:34:48 crc kubenswrapper[4760]: I0930 07:34:48.802792 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-6vfjl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"aade7c8e-aa34-4b19-9000-d724950a70d7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:34:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:34:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:34:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8c7192e617554ba6b5b9533792f166344fbf9681b63e57c180809e721cc18168\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:34:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-df7r7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e9c7c74e636ef9dbade2541863f96b1098f7ae5f1ab02740c8934f8c87c4815e\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e9c7c74e636ef9dbade2541863f96b1098f7ae5f1ab02740c8934f8c87c4815e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T07:33:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T07:33:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-df7r7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7abb316d62bc2246ace40ad59b2926962a1c098c5690adea8c20a5b96459a604\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7abb316d62bc2246ace40ad59b2926962a1c098c5690adea8c20a5b96459a604\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T07:33:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T07:33:58Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-df7r7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://96c45841896c0c1569ad3071a3114e8c4965cc3968244c91719bcf5df92e5d0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://96c45841896c0c1569ad3071a3114e8c4965cc3968244c91719bcf5df92e5d0f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T07:33:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T07:33:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-df7r7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://db222
0320618b4dcdbf56bad96dbe01c910fa37948f97792f498b3336212cc7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://db2220320618b4dcdbf56bad96dbe01c910fa37948f97792f498b3336212cc7f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T07:34:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T07:34:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-df7r7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5980d5ebcac5351bfa8c26b90968593f3fae3d434e824db5a182449ef64b0a13\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5980d5ebcac5351bfa8c26b90968593f3fae3d434e824db5a182449ef64b0a13\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T07:34:01Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-09-30T07:34:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-df7r7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5912656a198607c9831da9564b5507d662eb2df00df104d52cf058442a76be6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5912656a198607c9831da9564b5507d662eb2df00df104d52cf058442a76be6f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T07:34:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T07:34:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-df7r7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T07:33:56Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-6vfjl\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:34:48Z is after 2025-08-24T17:21:41Z" Sep 30 07:34:48 crc kubenswrapper[4760]: I0930 07:34:48.823813 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9092fdd9-f142-400d-b446-47b8103b9694\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4cc3328dfae8f6d99c152e6a53b86d72f690afbd7bc4847810f745b321e4f3ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests
\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5aaa898c76100f011c3ad0ba61c9735a940b46b0e770f3ede7fc4c881a4722a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b718c4a73fd119f66dd346618976f0b3589b789576e53d367588c2772becd66e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d00b9a56c23ba3b5
701a31412e038c073b5f16eb4e6ebea42134f7d4c0b0f21\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://00d03c8ff6fa9f490ec0550ac57d20786f1f5c10d91011d678116c2249a79e88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ed4236004989a888f1f864535bedd2dc4bcad4c51c87f12be074abbe5f8fe8bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e3
3e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ed4236004989a888f1f864535bedd2dc4bcad4c51c87f12be074abbe5f8fe8bd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T07:33:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T07:33:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://553fa289f27580c1c38da1e9ea12ae7231fc4f60c660a9a3a177391adba9c3f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://553fa289f27580c1c38da1e9ea12ae7231fc4f60c660a9a3a177391adba9c3f2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T07:33:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T07:33:37Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://83b9ee5e77d60387298ccac46ad416a5d56bfd1d81a6ef7f3ea0482520c2ca08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\
":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://83b9ee5e77d60387298ccac46ad416a5d56bfd1d81a6ef7f3ea0482520c2ca08\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T07:33:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T07:33:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T07:33:35Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:34:48Z is after 2025-08-24T17:21:41Z" Sep 30 07:34:48 crc kubenswrapper[4760]: I0930 07:34:48.842985 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"031672b4-4779-4226-97aa-f5c81a729234\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b3e26d92b5602604d33e83e83e677084c642df64c9d0b9f9e022b036091fb92e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://53390cb3cc86d0db1019c461bf6cb987190e6fa8812f735844028dfa1ce96648\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf87f93832356c96c1c5e90e6459ccdd23700fd12c6d7410305c66f99f6ed18f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e4230b2df5bf87138f5a40f146bef562ae63ec6f5b73d023e4e2b60722854882\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:3
8Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9c416bd4523f4384b4a2283eb2ada263982380925cbc17713726e0790e5fdee\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:38Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2cd9c18868b27fcfb67186a86f415b484f009c33c07d18f5d508155ed011b5d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2cd9c18868b27fcfb67186a86f415b484f009c33c07d18f5d508155ed011b5d4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T07:33:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T07:33:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startT
ime\\\":\\\"2025-09-30T07:33:35Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:34:48Z is after 2025-08-24T17:21:41Z" Sep 30 07:34:48 crc kubenswrapper[4760]: I0930 07:34:48.856813 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"67985d3d-0af6-4a8b-b45a-623aed2e502e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43b97af91f474a3b43f207d377ca205e459b782e552a32a134a8b46e89fc8d30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\
\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d748511816054d5f621bd55e203baea2e6e5ac90dec796364b46224308cfd78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bf1b2ce86bf47ce917a20b48f64b7c4419e3a5c551d006d97171311c3b190f0c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"c
ert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e7c9b2cd9f096d99a8b239c0c2dc771e94e6522a3380733b7452969c9fe2dfd2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T07:33:35Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:34:48Z is after 2025-08-24T17:21:41Z" Sep 30 07:34:48 crc kubenswrapper[4760]: I0930 07:34:48.858167 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:34:48 crc kubenswrapper[4760]: I0930 07:34:48.858226 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:34:48 crc kubenswrapper[4760]: I0930 07:34:48.858242 4760 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:34:48 crc kubenswrapper[4760]: I0930 07:34:48.858289 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:34:48 crc kubenswrapper[4760]: I0930 07:34:48.858334 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:34:48Z","lastTransitionTime":"2025-09-30T07:34:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 07:34:48 crc kubenswrapper[4760]: I0930 07:34:48.874471 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c5ac421ed3707970914cf165f54a9324134079296d1b6936332c6047b9c23132\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d
608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:34:48Z is after 2025-08-24T17:21:41Z" Sep 30 07:34:48 crc kubenswrapper[4760]: I0930 07:34:48.892060 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:53Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:34:48Z is after 2025-08-24T17:21:41Z" Sep 30 07:34:48 crc kubenswrapper[4760]: I0930 07:34:48.960583 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:34:48 crc kubenswrapper[4760]: I0930 07:34:48.960644 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:34:48 crc 
kubenswrapper[4760]: I0930 07:34:48.960661 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:34:48 crc kubenswrapper[4760]: I0930 07:34:48.960686 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:34:48 crc kubenswrapper[4760]: I0930 07:34:48.960704 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:34:48Z","lastTransitionTime":"2025-09-30T07:34:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 07:34:49 crc kubenswrapper[4760]: I0930 07:34:49.064702 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:34:49 crc kubenswrapper[4760]: I0930 07:34:49.064753 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:34:49 crc kubenswrapper[4760]: I0930 07:34:49.064765 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:34:49 crc kubenswrapper[4760]: I0930 07:34:49.064783 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:34:49 crc kubenswrapper[4760]: I0930 07:34:49.064798 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:34:49Z","lastTransitionTime":"2025-09-30T07:34:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 07:34:49 crc kubenswrapper[4760]: I0930 07:34:49.065938 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 07:34:49 crc kubenswrapper[4760]: I0930 07:34:49.065979 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wv8fz" Sep 30 07:34:49 crc kubenswrapper[4760]: E0930 07:34:49.066067 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 07:34:49 crc kubenswrapper[4760]: E0930 07:34:49.066144 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-wv8fz" podUID="ce6dcf25-c8ea-450b-9fc6-9f8aeafde757" Sep 30 07:34:49 crc kubenswrapper[4760]: I0930 07:34:49.167743 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:34:49 crc kubenswrapper[4760]: I0930 07:34:49.167808 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:34:49 crc kubenswrapper[4760]: I0930 07:34:49.167825 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:34:49 crc kubenswrapper[4760]: I0930 07:34:49.167849 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:34:49 crc kubenswrapper[4760]: I0930 07:34:49.167865 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:34:49Z","lastTransitionTime":"2025-09-30T07:34:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 07:34:49 crc kubenswrapper[4760]: I0930 07:34:49.271036 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:34:49 crc kubenswrapper[4760]: I0930 07:34:49.271088 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:34:49 crc kubenswrapper[4760]: I0930 07:34:49.271103 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:34:49 crc kubenswrapper[4760]: I0930 07:34:49.271121 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:34:49 crc kubenswrapper[4760]: I0930 07:34:49.271136 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:34:49Z","lastTransitionTime":"2025-09-30T07:34:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 07:34:49 crc kubenswrapper[4760]: I0930 07:34:49.373452 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:34:49 crc kubenswrapper[4760]: I0930 07:34:49.373945 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:34:49 crc kubenswrapper[4760]: I0930 07:34:49.374182 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:34:49 crc kubenswrapper[4760]: I0930 07:34:49.374385 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:34:49 crc kubenswrapper[4760]: I0930 07:34:49.374589 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:34:49Z","lastTransitionTime":"2025-09-30T07:34:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 07:34:49 crc kubenswrapper[4760]: I0930 07:34:49.478309 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:34:49 crc kubenswrapper[4760]: I0930 07:34:49.478350 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:34:49 crc kubenswrapper[4760]: I0930 07:34:49.478359 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:34:49 crc kubenswrapper[4760]: I0930 07:34:49.478373 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:34:49 crc kubenswrapper[4760]: I0930 07:34:49.478385 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:34:49Z","lastTransitionTime":"2025-09-30T07:34:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 07:34:49 crc kubenswrapper[4760]: I0930 07:34:49.571430 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-sspvl_2c4ca8ea-a714-40e5-9e10-080aef32237b/ovnkube-controller/3.log" Sep 30 07:34:49 crc kubenswrapper[4760]: I0930 07:34:49.572765 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-sspvl_2c4ca8ea-a714-40e5-9e10-080aef32237b/ovnkube-controller/2.log" Sep 30 07:34:49 crc kubenswrapper[4760]: I0930 07:34:49.577541 4760 generic.go:334] "Generic (PLEG): container finished" podID="2c4ca8ea-a714-40e5-9e10-080aef32237b" containerID="9003b7a7d50bae8b7464fd7cbf7f18dcc7a60af1c5297a18e2f9f8482d161912" exitCode=1 Sep 30 07:34:49 crc kubenswrapper[4760]: I0930 07:34:49.577589 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-sspvl" event={"ID":"2c4ca8ea-a714-40e5-9e10-080aef32237b","Type":"ContainerDied","Data":"9003b7a7d50bae8b7464fd7cbf7f18dcc7a60af1c5297a18e2f9f8482d161912"} Sep 30 07:34:49 crc kubenswrapper[4760]: I0930 07:34:49.577643 4760 scope.go:117] "RemoveContainer" containerID="b84d165d7a40638496b6759c4c2c265fdf742b2c9c968c503352f468f776b7d7" Sep 30 07:34:49 crc kubenswrapper[4760]: I0930 07:34:49.579078 4760 scope.go:117] "RemoveContainer" containerID="9003b7a7d50bae8b7464fd7cbf7f18dcc7a60af1c5297a18e2f9f8482d161912" Sep 30 07:34:49 crc kubenswrapper[4760]: E0930 07:34:49.579459 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-sspvl_openshift-ovn-kubernetes(2c4ca8ea-a714-40e5-9e10-080aef32237b)\"" pod="openshift-ovn-kubernetes/ovnkube-node-sspvl" podUID="2c4ca8ea-a714-40e5-9e10-080aef32237b" Sep 30 07:34:49 crc kubenswrapper[4760]: I0930 07:34:49.585725 4760 kubelet_node_status.go:724] "Recording event 
message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:34:49 crc kubenswrapper[4760]: I0930 07:34:49.585797 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:34:49 crc kubenswrapper[4760]: I0930 07:34:49.585814 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:34:49 crc kubenswrapper[4760]: I0930 07:34:49.585837 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:34:49 crc kubenswrapper[4760]: I0930 07:34:49.585938 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:34:49Z","lastTransitionTime":"2025-09-30T07:34:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 07:34:49 crc kubenswrapper[4760]: I0930 07:34:49.601412 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:34:49Z is after 2025-08-24T17:21:41Z" Sep 30 07:34:49 crc kubenswrapper[4760]: I0930 07:34:49.617989 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-sv6wk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4a8035f8-210d-4a09-bca5-274ced93774c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://774d38a9cedeb69a2093a9de05db794e856295535f86d73a75c6feea05b61cb3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6pk8k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T07:33:56Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-sv6wk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:34:49Z is after 2025-08-24T17:21:41Z" Sep 30 07:34:49 crc kubenswrapper[4760]: I0930 07:34:49.635424 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-lqpj4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2b6201d8-80fd-4701-a4a8-f7ebca1f34ad\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:34:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:34:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:34:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:34:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://09a33e356ac5e20ea894804e56d62f6220b8b9a5123d1b0acc9bbe33a3083792\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d7
93426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:34:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dc5j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://722e623d70449c90fb388b89511f1e75ed55015ec9caa45d9aaa9ac3d9649778\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:34:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dc5j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T07:34:08Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-lqpj4\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:34:49Z is after 2025-08-24T17:21:41Z" Sep 30 07:34:49 crc kubenswrapper[4760]: I0930 07:34:49.653574 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-wv8fz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ce6dcf25-c8ea-450b-9fc6-9f8aeafde757\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:34:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:34:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:34:10Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:34:10Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcwbs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcwbs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T07:34:10Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-wv8fz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:34:49Z is after 2025-08-24T17:21:41Z" Sep 30 07:34:49 crc 
kubenswrapper[4760]: I0930 07:34:49.675513 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c5ae239e08e07bda9532c0b6b94ef3687cf512cd05965e09123bbb036f915a5b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3abb25619d0c80e84a2095aba45fe6a865125d242e44e9f10820e8c41a3b0db9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art
-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:34:49Z is after 2025-08-24T17:21:41Z" Sep 30 07:34:49 crc kubenswrapper[4760]: I0930 07:34:49.691387 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:34:49 crc kubenswrapper[4760]: I0930 07:34:49.691469 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:34:49 crc kubenswrapper[4760]: I0930 07:34:49.691495 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:34:49 crc kubenswrapper[4760]: I0930 07:34:49.691527 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:34:49 crc kubenswrapper[4760]: I0930 
07:34:49.691550 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:34:49Z","lastTransitionTime":"2025-09-30T07:34:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 07:34:49 crc kubenswrapper[4760]: I0930 07:34:49.698406 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-lvdpk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f50c364e-d22c-4fe5-a0aa-66f4e8d8b21e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:34:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:34:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0b2f3cfbeb8083c685469a0d988253e1fd9c2403954dda3cc742b87225c82927\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\
"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://96fd613333f4911d54aca94a38724570ef878c3f55ef48caa79eb1c14d0b2014\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-09-30T07:34:43Z\\\",\\\"message\\\":\\\"2025-09-30T07:33:58+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_f7972461-b57d-4fae-b02a-2e738164e9c7\\\\n2025-09-30T07:33:58+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_f7972461-b57d-4fae-b02a-2e738164e9c7 to /host/opt/cni/bin/\\\\n2025-09-30T07:33:58Z [verbose] multus-daemon started\\\\n2025-09-30T07:33:58Z [verbose] Readiness Indicator file check\\\\n2025-09-30T07:34:43Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T07:33:57Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:34:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"nam
e\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8g8kp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T07:33:56Z\\\"}}\" for pod \"openshift-multus\"/\"multus-lvdpk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:34:49Z is after 2025-08-24T17:21:41Z" Sep 30 07:34:49 crc kubenswrapper[4760]: I0930 07:34:49.714936 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-rpvhp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a8530550-438d-46a5-aa3f-4b10838396f1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:34:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:34:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:34:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0434ec52328c8ee0a2b28fb47833cc6cf12a7048f4c4a217518e029e616fb6a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7j7qm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T07:33:58Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-rpvhp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:34:49Z is after 2025-08-24T17:21:41Z" Sep 30 07:34:49 crc kubenswrapper[4760]: I0930 07:34:49.733884 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:59Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:59Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fd70ee9ea129c67bb1a9ffac6e1d2abc903949ccd6d2d549f5d78472813dd222\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T0
7:33:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:34:49Z is after 2025-08-24T17:21:41Z" Sep 30 07:34:49 crc kubenswrapper[4760]: I0930 07:34:49.756147 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-f2lrk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7a9c8270-6964-4886-87d0-227b05b76da4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e3cf6cd0daf5df3e48c5d44b1a443b1e5646009b9eba42bc3869ee9841e472d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cctgt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://90d669a96085f6fb9b5c9507ff1d5c960bdeafa1
d01d7c115b89788e14e155d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cctgt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T07:33:56Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-f2lrk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:34:49Z is after 2025-08-24T17:21:41Z" Sep 30 07:34:49 crc kubenswrapper[4760]: I0930 07:34:49.779406 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"67985d3d-0af6-4a8b-b45a-623aed2e502e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43b97af91f474a3b43f207d377ca205e459b782e552a32a134a8b46e89fc8d30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d748511816054d5f621bd55e203baea2e6e5ac90dec796364b46224308cfd78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bf1b2ce86bf47ce917a20b48f64b7c4419e3a5c551d006d97171311c3b190f0c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e7c9b2cd9f096d99a8b239c0c2dc771e94e6522a3380733b7452969c9fe2dfd2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-09-30T07:33:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T07:33:35Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:34:49Z is after 2025-08-24T17:21:41Z" Sep 30 07:34:49 crc kubenswrapper[4760]: I0930 07:34:49.794498 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:34:49 crc kubenswrapper[4760]: I0930 07:34:49.794578 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:34:49 crc kubenswrapper[4760]: I0930 07:34:49.794604 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:34:49 crc kubenswrapper[4760]: I0930 07:34:49.794636 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:34:49 crc kubenswrapper[4760]: I0930 07:34:49.794657 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:34:49Z","lastTransitionTime":"2025-09-30T07:34:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns 
error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 07:34:49 crc kubenswrapper[4760]: I0930 07:34:49.804289 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c5ac421ed3707970914cf165f54a9324134079296d1b6936332c6047b9c23132\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursive
ReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:34:49Z is after 2025-08-24T17:21:41Z" Sep 30 07:34:49 crc kubenswrapper[4760]: I0930 07:34:49.829727 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:53Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:34:49Z is after 2025-08-24T17:21:41Z" Sep 30 07:34:49 crc kubenswrapper[4760]: I0930 07:34:49.852532 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:34:49Z is after 2025-08-24T17:21:41Z" Sep 30 07:34:49 crc kubenswrapper[4760]: I0930 07:34:49.881114 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-6vfjl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"aade7c8e-aa34-4b19-9000-d724950a70d7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:34:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:34:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:34:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8c7192e617554ba6b5b9533792f166344fbf9681b63e57c180809e721cc18168\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:34:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-df7r7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e9c7c74e636ef9dbade2541863f96b1098f7ae5f1ab02740c8934f8c87c4815e\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e9c7c74e636ef9dbade2541863f96b1098f7ae5f1ab02740c8934f8c87c4815e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T07:33:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T07:33:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-df7r7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7abb316d62bc2246ace40ad59b2926962a1c098c5690adea8c20a5b96459a604\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7abb316d62bc2246ace40ad59b2926962a1c098c5690adea8c20a5b96459a604\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T07:33:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T07:33:58Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-df7r7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://96c45841896c0c1569ad3071a3114e8c4965cc3968244c91719bcf5df92e5d0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://96c45841896c0c1569ad3071a3114e8c4965cc3968244c91719bcf5df92e5d0f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T07:33:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T07:33:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-df7r7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://db222
0320618b4dcdbf56bad96dbe01c910fa37948f97792f498b3336212cc7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://db2220320618b4dcdbf56bad96dbe01c910fa37948f97792f498b3336212cc7f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T07:34:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T07:34:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-df7r7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5980d5ebcac5351bfa8c26b90968593f3fae3d434e824db5a182449ef64b0a13\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5980d5ebcac5351bfa8c26b90968593f3fae3d434e824db5a182449ef64b0a13\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T07:34:01Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-09-30T07:34:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-df7r7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5912656a198607c9831da9564b5507d662eb2df00df104d52cf058442a76be6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5912656a198607c9831da9564b5507d662eb2df00df104d52cf058442a76be6f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T07:34:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T07:34:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-df7r7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T07:33:56Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-6vfjl\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:34:49Z is after 2025-08-24T17:21:41Z" Sep 30 07:34:49 crc kubenswrapper[4760]: I0930 07:34:49.897545 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:34:49 crc kubenswrapper[4760]: I0930 07:34:49.897679 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:34:49 crc kubenswrapper[4760]: I0930 07:34:49.897702 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:34:49 crc kubenswrapper[4760]: I0930 07:34:49.897732 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:34:49 crc kubenswrapper[4760]: I0930 07:34:49.897752 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:34:49Z","lastTransitionTime":"2025-09-30T07:34:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 07:34:49 crc kubenswrapper[4760]: I0930 07:34:49.914498 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9092fdd9-f142-400d-b446-47b8103b9694\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4cc3328dfae8f6d99c152e6a53b86d72f690afbd7bc4847810f745b321e4f3ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5aaa898c76100f011c3ad0ba61c9735a940b46b0e770f3ede7fc4c881a4722a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b718c4a73fd119f66dd346618976f0b3589b789576e53d367588c2772becd66e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d00b9a56c23ba3b5701a31412e038c073b5f16eb4e6ebea42134f7d4c0b0f21\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://00d03c8ff6fa9f490ec0550ac57d20786f1f5c10d91011d678116c2249a79e88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ed4236004989a888f1f864535bedd2dc4bcad4c51c87f12be074abbe5f8fe8bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ed4236004989a888f1f864535bedd2dc4bcad4c51c87f12be074abbe5f8fe8bd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T07:33:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T07:33:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://553fa289f27580c1c38da1e9ea12ae7231fc4f60c660a9a3a177391adba9c3f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://553fa289f27580c1c38da1e9ea12ae7231fc4f60c660a9a3a177391adba9c3f2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T07:33:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T07:33:37Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://83b9ee5e77d60387298ccac46ad416a5d56bfd1d81a6ef7f3ea0482520c2ca08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://83b9ee5e77d60387298ccac46ad416a5d56bfd1d81a6ef7f3ea0482520c2ca08\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T07:33:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
025-09-30T07:33:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T07:33:35Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:34:49Z is after 2025-08-24T17:21:41Z" Sep 30 07:34:49 crc kubenswrapper[4760]: I0930 07:34:49.937866 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"031672b4-4779-4226-97aa-f5c81a729234\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b3e26d92b5602604d33e83e83e677084c642df64c9d0b9f9e022b036091fb92e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://53390cb3cc86d0db1019c461bf6cb987190e6fa8812f735844028dfa1ce96648\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf87f93832356c96c1c5e90e6459ccdd23700fd12c6d7410305c66f99f6ed18f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e4230b2df5bf87138f5a40f146bef562ae63ec6f5b73d023e4e2b60722854882\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:3
8Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9c416bd4523f4384b4a2283eb2ada263982380925cbc17713726e0790e5fdee\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:38Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2cd9c18868b27fcfb67186a86f415b484f009c33c07d18f5d508155ed011b5d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2cd9c18868b27fcfb67186a86f415b484f009c33c07d18f5d508155ed011b5d4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T07:33:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T07:33:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startT
ime\\\":\\\"2025-09-30T07:33:35Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:34:49Z is after 2025-08-24T17:21:41Z" Sep 30 07:34:49 crc kubenswrapper[4760]: I0930 07:34:49.958330 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"943219e9-5457-4767-b29c-cdd155ca3cb1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:34:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:34:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d9171c7e60156e3ffabb5954ff097de7c4cb0629967cdddb44f16bf86ae38c0a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":
0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f75ced5ba38de06feef6d8addd42a76b6b14cab6a67a351432fdb580eb0ca3c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5ead42377ef36b0c38401509b29432dd9b6fdf73cc39f2dbea0930415492a4ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2f5de75ed9124ec7a6cd0d20862001c06327d05876de38fdbb1270a9e9e8df54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2f5de75ed9124ec7a6cd0d20862001c06327d05876de38fdbb1270a9e9e8df54\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T07:33:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T07:33:36Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T07:33:35Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:34:49Z is after 2025-08-24T17:21:41Z" Sep 30 07:34:49 crc kubenswrapper[4760]: I0930 07:34:49.988657 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-sspvl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2c4ca8ea-a714-40e5-9e10-080aef32237b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:56Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:56Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04cb9ff2859a579a0bd97d932177b8c19b786666d0177669e93ccec38abf08d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lllfc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9e64e8566880e723a6cc19ad788bd1cbb02eb639397aea4acd6e9920b034f379\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lllfc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0b6c1fe1d21cc0d118e36d7f9d62fb639a19a42cd8c953700bcd5c289088cdf9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lllfc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bba81e98fd1a75081b7e6a3eaf28b88b4de6e47e1dc158ca72aa2f1c2107933e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:58Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lllfc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://45d8c685d7b690ce81553e4d924934cf761087dd535145c0e8b5a933f2d61e45\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lllfc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef1f6b6ca1a2bbb2f9a166a7244c6411470be4655bc6b2a330fd604c3449d185\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lllfc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9003b7a7d50bae8b7464fd7cbf7f18dcc7a60af1c5297a18e2f9f8482d161912\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b84d165d7a40638496b6759c4c2c265fdf742b2c9c968c503352f468f776b7d7\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-09-30T07:34:22Z\\\",\\\"message\\\":\\\"ctory.go:160\\\\nI0930 07:34:22.231404 6358 reflector.go:311] Stopping reflector *v1.EgressFirewall (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI0930 07:34:22.232586 6358 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0930 
07:34:22.232624 6358 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0930 07:34:22.232630 6358 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0930 07:34:22.232651 6358 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0930 07:34:22.232660 6358 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0930 07:34:22.232662 6358 handler.go:208] Removed *v1.Node event handler 2\\\\nI0930 07:34:22.232664 6358 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0930 07:34:22.232685 6358 handler.go:208] Removed *v1.Node event handler 7\\\\nI0930 07:34:22.232682 6358 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0930 07:34:22.232709 6358 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0930 07:34:22.232740 6358 factory.go:656] Stopping watch factory\\\\nI0930 07:34:22.232758 6358 ovnkube.go:599] Stopped ovnkube\\\\nI0930 07:34:22.232760 6358 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0930 07:34:2\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T07:34:21Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9003b7a7d50bae8b7464fd7cbf7f18dcc7a60af1c5297a18e2f9f8482d161912\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-09-30T07:34:49Z\\\",\\\"message\\\":\\\"rvice.beta.openshift.io/serving-cert-signed-by:openshift-service-serving-signer@1740288168] [{config.openshift.io/v1 ClusterVersion version 9101b518-476b-4eea-8fa6-69b0534e5caa 0xc0074eb62b \\\\u003cnil\\\\u003e}] [] []},Spec:ServiceSpec{Ports:[]ServicePort{ServicePort{Name:metrics,Protocol:TCP,Port:8443,TargetPort:{1 0 metrics},NodePort:0,AppProtocol:nil,},},Selector:map[string]string{app: 
package-server-manager,},ClusterIP:10.217.4.110,Type:ClusterIP,ExternalIPs:[],SessionAffinity:None,LoadBalancerIP:,LoadBalancerSourceRanges:[],ExternalName:,ExternalTrafficPolicy:,HealthCheckNodePort:0,PublishNotReadyAddresses:false,SessionAffinityConfig:nil,IPFamilyPolicy:*SingleStack,ClusterIPs:[10.217.4.110],IPFamilies:[IPv4],AllocateLoadBalancerNodePorts:nil,LoadBalancerClass:nil,InternalTrafficPolicy:*Cluster,TrafficDistribution:nil,},Status:ServiceStatus{LoadBalancer:LoadBalancerStatus{Ingress:[]LoadBalancerIngress{},},Conditions:[]Condition{},},}\\\\nI0930 07:34:49.085075 6715 services_controller.go:454] Service openshift-marketplace/marketplace-operator-metrics for network=default has 2 cluster-wide, 0 per-node configs, 0 template configs, making 1 (cluster) 0 (per node) and 0 (template) load balancers\\\\nF0930 07:34:49.085082 6715 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T07:34:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\
\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lllfc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e72f37942962598d925dfb305311c94c3ca920f93243bb6e97839add4a830a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:34:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env
\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lllfc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1198d13e2814e4dc38c4411d751fba40440a5967d7ca04c855e34fe90f9a6535\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1198d13e2814e4dc38c4411d751fba40440a5967d7ca04c855e34fe90f9a6535\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T07:33:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T07:33:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lllfc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T07:33:56Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-sspvl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:34:49Z is after 2025-08-24T17:21:41Z" Sep 30 07:34:50 crc 
kubenswrapper[4760]: I0930 07:34:50.001617 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:34:50 crc kubenswrapper[4760]: I0930 07:34:50.001712 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:34:50 crc kubenswrapper[4760]: I0930 07:34:50.001768 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:34:50 crc kubenswrapper[4760]: I0930 07:34:50.001801 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:34:50 crc kubenswrapper[4760]: I0930 07:34:50.001822 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:34:50Z","lastTransitionTime":"2025-09-30T07:34:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 07:34:50 crc kubenswrapper[4760]: I0930 07:34:50.066573 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 07:34:50 crc kubenswrapper[4760]: I0930 07:34:50.066670 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 07:34:50 crc kubenswrapper[4760]: E0930 07:34:50.066790 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 07:34:50 crc kubenswrapper[4760]: E0930 07:34:50.066868 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 07:34:50 crc kubenswrapper[4760]: I0930 07:34:50.105743 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:34:50 crc kubenswrapper[4760]: I0930 07:34:50.105814 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:34:50 crc kubenswrapper[4760]: I0930 07:34:50.105832 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:34:50 crc kubenswrapper[4760]: I0930 07:34:50.105858 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:34:50 crc kubenswrapper[4760]: I0930 07:34:50.105878 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:34:50Z","lastTransitionTime":"2025-09-30T07:34:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 07:34:50 crc kubenswrapper[4760]: I0930 07:34:50.208887 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:34:50 crc kubenswrapper[4760]: I0930 07:34:50.208949 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:34:50 crc kubenswrapper[4760]: I0930 07:34:50.208966 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:34:50 crc kubenswrapper[4760]: I0930 07:34:50.208990 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:34:50 crc kubenswrapper[4760]: I0930 07:34:50.209006 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:34:50Z","lastTransitionTime":"2025-09-30T07:34:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 07:34:50 crc kubenswrapper[4760]: I0930 07:34:50.312695 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:34:50 crc kubenswrapper[4760]: I0930 07:34:50.312767 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:34:50 crc kubenswrapper[4760]: I0930 07:34:50.312789 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:34:50 crc kubenswrapper[4760]: I0930 07:34:50.312822 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:34:50 crc kubenswrapper[4760]: I0930 07:34:50.312844 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:34:50Z","lastTransitionTime":"2025-09-30T07:34:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 07:34:50 crc kubenswrapper[4760]: I0930 07:34:50.415677 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:34:50 crc kubenswrapper[4760]: I0930 07:34:50.415792 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:34:50 crc kubenswrapper[4760]: I0930 07:34:50.415825 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:34:50 crc kubenswrapper[4760]: I0930 07:34:50.415857 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:34:50 crc kubenswrapper[4760]: I0930 07:34:50.415933 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:34:50Z","lastTransitionTime":"2025-09-30T07:34:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 07:34:50 crc kubenswrapper[4760]: I0930 07:34:50.519272 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:34:50 crc kubenswrapper[4760]: I0930 07:34:50.519361 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:34:50 crc kubenswrapper[4760]: I0930 07:34:50.519381 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:34:50 crc kubenswrapper[4760]: I0930 07:34:50.519791 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:34:50 crc kubenswrapper[4760]: I0930 07:34:50.519837 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:34:50Z","lastTransitionTime":"2025-09-30T07:34:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 07:34:50 crc kubenswrapper[4760]: I0930 07:34:50.584667 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-sspvl_2c4ca8ea-a714-40e5-9e10-080aef32237b/ovnkube-controller/3.log" Sep 30 07:34:50 crc kubenswrapper[4760]: I0930 07:34:50.590967 4760 scope.go:117] "RemoveContainer" containerID="9003b7a7d50bae8b7464fd7cbf7f18dcc7a60af1c5297a18e2f9f8482d161912" Sep 30 07:34:50 crc kubenswrapper[4760]: E0930 07:34:50.591412 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-sspvl_openshift-ovn-kubernetes(2c4ca8ea-a714-40e5-9e10-080aef32237b)\"" pod="openshift-ovn-kubernetes/ovnkube-node-sspvl" podUID="2c4ca8ea-a714-40e5-9e10-080aef32237b" Sep 30 07:34:50 crc kubenswrapper[4760]: I0930 07:34:50.609678 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"943219e9-5457-4767-b29c-cdd155ca3cb1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:34:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:34:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d9171c7e60156e3ffabb5954ff097de7c4cb0629967cdddb44f16bf86ae38c0a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f75ced5ba38de06feef6d8addd42a76b6b14cab6a67a351432fdb580eb0ca3c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5ead42377ef36b0c38401509b29432dd9b6fdf73cc39f2dbea0930415492a4ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2f5de75ed9124ec7a6cd0d20862001c06327d05876de38fdbb1270a9e9e8df54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://2f5de75ed9124ec7a6cd0d20862001c06327d05876de38fdbb1270a9e9e8df54\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T07:33:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T07:33:36Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T07:33:35Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:34:50Z is after 2025-08-24T17:21:41Z" Sep 30 07:34:50 crc kubenswrapper[4760]: I0930 07:34:50.623491 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:34:50 crc kubenswrapper[4760]: I0930 07:34:50.623593 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:34:50 crc kubenswrapper[4760]: I0930 07:34:50.623612 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:34:50 crc kubenswrapper[4760]: I0930 07:34:50.623640 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:34:50 crc kubenswrapper[4760]: I0930 07:34:50.623659 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:34:50Z","lastTransitionTime":"2025-09-30T07:34:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 07:34:50 crc kubenswrapper[4760]: I0930 07:34:50.641213 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-sspvl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2c4ca8ea-a714-40e5-9e10-080aef32237b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:56Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:56Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04cb9ff2859a579a0bd97d932177b8c19b786666d0177669e93ccec38abf08d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lllfc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9e64e8566880e723a6cc19ad788bd1cbb02eb639397aea4acd6e9920b034f379\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lllfc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0b6c1fe1d21cc0d118e36d7f9d62fb639a19a42cd8c953700bcd5c289088cdf9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lllfc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bba81e98fd1a75081b7e6a3eaf28b88b4de6e47e1dc158ca72aa2f1c2107933e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:58Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lllfc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://45d8c685d7b690ce81553e4d924934cf761087dd535145c0e8b5a933f2d61e45\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lllfc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef1f6b6ca1a2bbb2f9a166a7244c6411470be4655bc6b2a330fd604c3449d185\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lllfc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9003b7a7d50bae8b7464fd7cbf7f18dcc7a60af1c5297a18e2f9f8482d161912\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9003b7a7d50bae8b7464fd7cbf7f18dcc7a60af1c5297a18e2f9f8482d161912\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-09-30T07:34:49Z\\\",\\\"message\\\":\\\"rvice.beta.openshift.io/serving-cert-signed-by:openshift-service-serving-signer@1740288168] [{config.openshift.io/v1 ClusterVersion version 9101b518-476b-4eea-8fa6-69b0534e5caa 0xc0074eb62b \\\\u003cnil\\\\u003e}] [] []},Spec:ServiceSpec{Ports:[]ServicePort{ServicePort{Name:metrics,Protocol:TCP,Port:8443,TargetPort:{1 0 
metrics},NodePort:0,AppProtocol:nil,},},Selector:map[string]string{app: package-server-manager,},ClusterIP:10.217.4.110,Type:ClusterIP,ExternalIPs:[],SessionAffinity:None,LoadBalancerIP:,LoadBalancerSourceRanges:[],ExternalName:,ExternalTrafficPolicy:,HealthCheckNodePort:0,PublishNotReadyAddresses:false,SessionAffinityConfig:nil,IPFamilyPolicy:*SingleStack,ClusterIPs:[10.217.4.110],IPFamilies:[IPv4],AllocateLoadBalancerNodePorts:nil,LoadBalancerClass:nil,InternalTrafficPolicy:*Cluster,TrafficDistribution:nil,},Status:ServiceStatus{LoadBalancer:LoadBalancerStatus{Ingress:[]LoadBalancerIngress{},},Conditions:[]Condition{},},}\\\\nI0930 07:34:49.085075 6715 services_controller.go:454] Service openshift-marketplace/marketplace-operator-metrics for network=default has 2 cluster-wide, 0 per-node configs, 0 template configs, making 1 (cluster) 0 (per node) and 0 (template) load balancers\\\\nF0930 07:34:49.085082 6715 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T07:34:48Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-sspvl_openshift-ovn-kubernetes(2c4ca8ea-a714-40e5-9e10-080aef32237b)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lllfc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e72f37942962598d925dfb305311c94c3ca920f93243bb6e97839add4a830a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:34:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lllfc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1198d13e2814e4dc38c4411d751fba40440a5967d7ca04c855e34fe90f9a6535\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1198d13e2814e4dc38
c4411d751fba40440a5967d7ca04c855e34fe90f9a6535\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T07:33:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T07:33:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lllfc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T07:33:56Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-sspvl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:34:50Z is after 2025-08-24T17:21:41Z" Sep 30 07:34:50 crc kubenswrapper[4760]: I0930 07:34:50.662262 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c5ae239e08e07bda9532c0b6b94ef3687cf512cd05965e09123bbb036f915a5b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3abb25619d0c80e84a2095aba45fe6a865125d242e44e9f10820e8c41a3b0db9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:34:50Z is after 2025-08-24T17:21:41Z" Sep 30 07:34:50 crc kubenswrapper[4760]: I0930 07:34:50.678394 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:34:50Z is after 2025-08-24T17:21:41Z" Sep 30 07:34:50 crc kubenswrapper[4760]: I0930 07:34:50.693019 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-sv6wk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4a8035f8-210d-4a09-bca5-274ced93774c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://774d38a9cedeb69a2093a9de05db794e856295535f86d73a75c6feea05b61cb3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6pk8k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T07:33:56Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-sv6wk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:34:50Z is after 2025-08-24T17:21:41Z" Sep 30 07:34:50 crc kubenswrapper[4760]: I0930 07:34:50.710231 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-lqpj4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2b6201d8-80fd-4701-a4a8-f7ebca1f34ad\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:34:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:34:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:34:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:34:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://09a33e356ac5e20ea894804e56d62f6220b8b9a5123d1b0acc9bbe33a3083792\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d7
93426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:34:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dc5j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://722e623d70449c90fb388b89511f1e75ed55015ec9caa45d9aaa9ac3d9649778\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:34:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dc5j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T07:34:08Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-lqpj4\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:34:50Z is after 2025-08-24T17:21:41Z" Sep 30 07:34:50 crc kubenswrapper[4760]: I0930 07:34:50.727608 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:34:50 crc kubenswrapper[4760]: I0930 07:34:50.727678 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:34:50 crc kubenswrapper[4760]: I0930 07:34:50.727702 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:34:50 crc kubenswrapper[4760]: I0930 07:34:50.727732 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:34:50 crc kubenswrapper[4760]: I0930 07:34:50.727759 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:34:50Z","lastTransitionTime":"2025-09-30T07:34:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 07:34:50 crc kubenswrapper[4760]: I0930 07:34:50.727926 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-wv8fz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ce6dcf25-c8ea-450b-9fc6-9f8aeafde757\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:34:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:34:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:34:10Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:34:10Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcwbs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcwbs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T07:34:10Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-wv8fz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:34:50Z is after 2025-08-24T17:21:41Z" Sep 30 07:34:50 crc 
kubenswrapper[4760]: I0930 07:34:50.747151 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:59Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:59Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fd70ee9ea129c67bb1a9ffac6e1d2abc903949ccd6d2d549f5d78472813dd222\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:34:50Z is after 2025-08-24T17:21:41Z" Sep 30 07:34:50 crc kubenswrapper[4760]: I0930 07:34:50.767091 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-f2lrk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7a9c8270-6964-4886-87d0-227b05b76da4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e3cf6cd0daf5df3e48c5d44b1a443b1e5646009b9eba42bc3869ee9841e472d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":
{\\\"startedAt\\\":\\\"2025-09-30T07:33:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cctgt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://90d669a96085f6fb9b5c9507ff1d5c960bdeafa1d01d7c115b89788e14e155d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cctgt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T07:33:56Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-f2lrk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:34:50Z is after 2025-08-24T17:21:41Z" Sep 30 07:34:50 crc kubenswrapper[4760]: I0930 
07:34:50.782691 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-lvdpk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f50c364e-d22c-4fe5-a0aa-66f4e8d8b21e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:34:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:34:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0b2f3cfbeb8083c685469a0d988253e1fd9c2403954dda3cc742b87225c82927\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://96fd613333f4911d54aca94a38724570ef878c3f55ef48caa79eb1c14d0b2014\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-09-30T07:34:43Z\\\",\\\"message\\\":\\\"2025-09-30T07:33:58+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_f7972461-b57d-4fae-b02a-2e738164e9c7\\\\n2025-09-30T07:33:58+00:00 [cnibincopy] Successfully moved files in 
/host/opt/cni/bin/upgrade_f7972461-b57d-4fae-b02a-2e738164e9c7 to /host/opt/cni/bin/\\\\n2025-09-30T07:33:58Z [verbose] multus-daemon started\\\\n2025-09-30T07:33:58Z [verbose] Readiness Indicator file check\\\\n2025-09-30T07:34:43Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T07:33:57Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:34:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\
\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8g8kp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T07:33:56Z\\\"}}\" for pod \"openshift-multus\"/\"multus-lvdpk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:34:50Z is after 2025-08-24T17:21:41Z" Sep 30 07:34:50 crc kubenswrapper[4760]: I0930 07:34:50.793038 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-rpvhp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a8530550-438d-46a5-aa3f-4b10838396f1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:34:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:34:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:34:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0434ec52328c8ee0a2b28fb47833cc6cf12a7048f4c4a217518e029e616fb6a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7j7qm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T07:33:58Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-rpvhp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:34:50Z is after 2025-08-24T17:21:41Z" Sep 30 07:34:50 crc kubenswrapper[4760]: I0930 07:34:50.811924 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-6vfjl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"aade7c8e-aa34-4b19-9000-d724950a70d7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:34:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:34:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:34:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8c7192e617554ba6b5b9533792f166344fbf9681b63e57c180809e721cc18168\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release
-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:34:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-df7r7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e9c7c74e636ef9dbade2541863f96b1098f7ae5f1ab02740c8934f8c87c4815e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e9c7c74e636ef9dbade2541863f96b1098f7ae5f1ab02740c8934f8c87c4815e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T07:33:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T07:33:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-df7r7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7abb316d62bc
2246ace40ad59b2926962a1c098c5690adea8c20a5b96459a604\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7abb316d62bc2246ace40ad59b2926962a1c098c5690adea8c20a5b96459a604\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T07:33:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T07:33:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-df7r7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://96c45841896c0c1569ad3071a3114e8c4965cc3968244c91719bcf5df92e5d0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://96c45841896c
0c1569ad3071a3114e8c4965cc3968244c91719bcf5df92e5d0f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T07:33:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T07:33:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-df7r7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://db2220320618b4dcdbf56bad96dbe01c910fa37948f97792f498b3336212cc7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://db2220320618b4dcdbf56bad96dbe01c910fa37948f97792f498b3336212cc7f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T07:34:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T07:34:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-df7r7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\
\\"}]},{\\\"containerID\\\":\\\"cri-o://5980d5ebcac5351bfa8c26b90968593f3fae3d434e824db5a182449ef64b0a13\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5980d5ebcac5351bfa8c26b90968593f3fae3d434e824db5a182449ef64b0a13\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T07:34:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T07:34:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-df7r7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5912656a198607c9831da9564b5507d662eb2df00df104d52cf058442a76be6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5912656a198607c9831da9564b5507d662eb2df00df104d52cf058442a76be6f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"202
5-09-30T07:34:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T07:34:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-df7r7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T07:33:56Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-6vfjl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:34:50Z is after 2025-08-24T17:21:41Z" Sep 30 07:34:50 crc kubenswrapper[4760]: I0930 07:34:50.830618 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:34:50 crc kubenswrapper[4760]: I0930 07:34:50.830670 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:34:50 crc kubenswrapper[4760]: I0930 07:34:50.830681 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:34:50 crc kubenswrapper[4760]: I0930 07:34:50.830698 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:34:50 crc kubenswrapper[4760]: I0930 07:34:50.830709 4760 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:34:50Z","lastTransitionTime":"2025-09-30T07:34:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 07:34:50 crc kubenswrapper[4760]: I0930 07:34:50.846067 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9092fdd9-f142-400d-b446-47b8103b9694\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4cc3328dfae8f6d99c152e6a53b86d72f690afbd7bc4847810f745b321e4f3ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":
\\\"2025-09-30T07:33:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5aaa898c76100f011c3ad0ba61c9735a940b46b0e770f3ede7fc4c881a4722a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b718c4a73fd119f66dd346618976f0b3589b789576e53d367588c2772becd66e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/sta
tic-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d00b9a56c23ba3b5701a31412e038c073b5f16eb4e6ebea42134f7d4c0b0f21\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://00d03c8ff6fa9f490ec0550ac57d20786f1f5c10d91011d678116c2249a79e88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ed4236004989a888f1f864535bedd2dc4bcad4c51c87f12be074abbe5f8fe8bd\\\
",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ed4236004989a888f1f864535bedd2dc4bcad4c51c87f12be074abbe5f8fe8bd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T07:33:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T07:33:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://553fa289f27580c1c38da1e9ea12ae7231fc4f60c660a9a3a177391adba9c3f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://553fa289f27580c1c38da1e9ea12ae7231fc4f60c660a9a3a177391adba9c3f2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T07:33:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T07:33:37Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://83b9ee5e77d60387298ccac46ad416a5d56bfd1d81a6ef7f3ea0482520c2ca08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\
"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://83b9ee5e77d60387298ccac46ad416a5d56bfd1d81a6ef7f3ea0482520c2ca08\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T07:33:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T07:33:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T07:33:35Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:34:50Z is after 2025-08-24T17:21:41Z" Sep 30 07:34:50 crc kubenswrapper[4760]: I0930 07:34:50.858085 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"031672b4-4779-4226-97aa-f5c81a729234\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b3e26d92b5602604d33e83e83e677084c642df64c9d0b9f9e022b036091fb92e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://53390cb3cc86d0db1019c461bf6cb987190e6fa8812f735844028dfa1ce96648\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf87f93832356c96c1c5e90e6459ccdd23700fd12c6d7410305c66f99f6ed18f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e4230b2df5bf87138f5a40f146bef562ae63ec6f5b73d023e4e2b60722854882\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:3
8Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9c416bd4523f4384b4a2283eb2ada263982380925cbc17713726e0790e5fdee\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:38Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2cd9c18868b27fcfb67186a86f415b484f009c33c07d18f5d508155ed011b5d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2cd9c18868b27fcfb67186a86f415b484f009c33c07d18f5d508155ed011b5d4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T07:33:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T07:33:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startT
ime\\\":\\\"2025-09-30T07:33:35Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:34:50Z is after 2025-08-24T17:21:41Z" Sep 30 07:34:50 crc kubenswrapper[4760]: I0930 07:34:50.869149 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"67985d3d-0af6-4a8b-b45a-623aed2e502e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43b97af91f474a3b43f207d377ca205e459b782e552a32a134a8b46e89fc8d30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\
\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d748511816054d5f621bd55e203baea2e6e5ac90dec796364b46224308cfd78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bf1b2ce86bf47ce917a20b48f64b7c4419e3a5c551d006d97171311c3b190f0c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"c
ert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e7c9b2cd9f096d99a8b239c0c2dc771e94e6522a3380733b7452969c9fe2dfd2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T07:33:35Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:34:50Z is after 2025-08-24T17:21:41Z" Sep 30 07:34:50 crc kubenswrapper[4760]: I0930 07:34:50.880150 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c5ac421ed3707970914cf165f54a9324134079296d1b6936332c6047b9c23132\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-09-30T07:34:50Z is after 2025-08-24T17:21:41Z" Sep 30 07:34:50 crc kubenswrapper[4760]: I0930 07:34:50.893105 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:53Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:34:50Z is after 2025-08-24T17:21:41Z" Sep 30 07:34:50 crc kubenswrapper[4760]: I0930 07:34:50.906606 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:34:50Z is after 2025-08-24T17:21:41Z" Sep 30 07:34:50 crc kubenswrapper[4760]: I0930 07:34:50.934940 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:34:50 crc kubenswrapper[4760]: I0930 07:34:50.935012 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:34:50 crc kubenswrapper[4760]: I0930 07:34:50.935032 4760 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:34:50 crc kubenswrapper[4760]: I0930 07:34:50.935060 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:34:50 crc kubenswrapper[4760]: I0930 07:34:50.935080 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:34:50Z","lastTransitionTime":"2025-09-30T07:34:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 07:34:51 crc kubenswrapper[4760]: I0930 07:34:51.038422 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:34:51 crc kubenswrapper[4760]: I0930 07:34:51.038491 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:34:51 crc kubenswrapper[4760]: I0930 07:34:51.038515 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:34:51 crc kubenswrapper[4760]: I0930 07:34:51.038541 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:34:51 crc kubenswrapper[4760]: I0930 07:34:51.038559 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:34:51Z","lastTransitionTime":"2025-09-30T07:34:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 07:34:51 crc kubenswrapper[4760]: I0930 07:34:51.066377 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 07:34:51 crc kubenswrapper[4760]: I0930 07:34:51.066421 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wv8fz" Sep 30 07:34:51 crc kubenswrapper[4760]: E0930 07:34:51.066631 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 07:34:51 crc kubenswrapper[4760]: E0930 07:34:51.066762 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-wv8fz" podUID="ce6dcf25-c8ea-450b-9fc6-9f8aeafde757" Sep 30 07:34:51 crc kubenswrapper[4760]: I0930 07:34:51.141461 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:34:51 crc kubenswrapper[4760]: I0930 07:34:51.141558 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:34:51 crc kubenswrapper[4760]: I0930 07:34:51.141583 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:34:51 crc kubenswrapper[4760]: I0930 07:34:51.141613 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:34:51 crc kubenswrapper[4760]: I0930 07:34:51.141636 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:34:51Z","lastTransitionTime":"2025-09-30T07:34:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 07:34:51 crc kubenswrapper[4760]: I0930 07:34:51.244490 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:34:51 crc kubenswrapper[4760]: I0930 07:34:51.244566 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:34:51 crc kubenswrapper[4760]: I0930 07:34:51.244580 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:34:51 crc kubenswrapper[4760]: I0930 07:34:51.244600 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:34:51 crc kubenswrapper[4760]: I0930 07:34:51.244615 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:34:51Z","lastTransitionTime":"2025-09-30T07:34:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 07:34:51 crc kubenswrapper[4760]: I0930 07:34:51.346970 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:34:51 crc kubenswrapper[4760]: I0930 07:34:51.347033 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:34:51 crc kubenswrapper[4760]: I0930 07:34:51.347046 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:34:51 crc kubenswrapper[4760]: I0930 07:34:51.347067 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:34:51 crc kubenswrapper[4760]: I0930 07:34:51.347081 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:34:51Z","lastTransitionTime":"2025-09-30T07:34:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 07:34:51 crc kubenswrapper[4760]: I0930 07:34:51.450467 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:34:51 crc kubenswrapper[4760]: I0930 07:34:51.450520 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:34:51 crc kubenswrapper[4760]: I0930 07:34:51.450534 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:34:51 crc kubenswrapper[4760]: I0930 07:34:51.450551 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:34:51 crc kubenswrapper[4760]: I0930 07:34:51.450561 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:34:51Z","lastTransitionTime":"2025-09-30T07:34:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 07:34:51 crc kubenswrapper[4760]: I0930 07:34:51.559194 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:34:51 crc kubenswrapper[4760]: I0930 07:34:51.559256 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:34:51 crc kubenswrapper[4760]: I0930 07:34:51.559274 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:34:51 crc kubenswrapper[4760]: I0930 07:34:51.559322 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:34:51 crc kubenswrapper[4760]: I0930 07:34:51.559341 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:34:51Z","lastTransitionTime":"2025-09-30T07:34:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 07:34:51 crc kubenswrapper[4760]: I0930 07:34:51.662235 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:34:51 crc kubenswrapper[4760]: I0930 07:34:51.662327 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:34:51 crc kubenswrapper[4760]: I0930 07:34:51.662347 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:34:51 crc kubenswrapper[4760]: I0930 07:34:51.662371 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:34:51 crc kubenswrapper[4760]: I0930 07:34:51.662585 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:34:51Z","lastTransitionTime":"2025-09-30T07:34:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 07:34:51 crc kubenswrapper[4760]: I0930 07:34:51.765112 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:34:51 crc kubenswrapper[4760]: I0930 07:34:51.765173 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:34:51 crc kubenswrapper[4760]: I0930 07:34:51.765195 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:34:51 crc kubenswrapper[4760]: I0930 07:34:51.765211 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:34:51 crc kubenswrapper[4760]: I0930 07:34:51.765223 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:34:51Z","lastTransitionTime":"2025-09-30T07:34:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 07:34:51 crc kubenswrapper[4760]: I0930 07:34:51.867877 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:34:51 crc kubenswrapper[4760]: I0930 07:34:51.867944 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:34:51 crc kubenswrapper[4760]: I0930 07:34:51.867969 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:34:51 crc kubenswrapper[4760]: I0930 07:34:51.867998 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:34:51 crc kubenswrapper[4760]: I0930 07:34:51.868020 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:34:51Z","lastTransitionTime":"2025-09-30T07:34:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 07:34:51 crc kubenswrapper[4760]: I0930 07:34:51.970966 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:34:51 crc kubenswrapper[4760]: I0930 07:34:51.971391 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:34:51 crc kubenswrapper[4760]: I0930 07:34:51.971507 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:34:51 crc kubenswrapper[4760]: I0930 07:34:51.971623 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:34:51 crc kubenswrapper[4760]: I0930 07:34:51.971736 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:34:51Z","lastTransitionTime":"2025-09-30T07:34:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 07:34:52 crc kubenswrapper[4760]: I0930 07:34:52.066887 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 07:34:52 crc kubenswrapper[4760]: I0930 07:34:52.066916 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 07:34:52 crc kubenswrapper[4760]: E0930 07:34:52.067675 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 07:34:52 crc kubenswrapper[4760]: E0930 07:34:52.067811 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 07:34:52 crc kubenswrapper[4760]: I0930 07:34:52.075388 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:34:52 crc kubenswrapper[4760]: I0930 07:34:52.075449 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:34:52 crc kubenswrapper[4760]: I0930 07:34:52.075466 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:34:52 crc kubenswrapper[4760]: I0930 07:34:52.075488 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:34:52 crc kubenswrapper[4760]: I0930 07:34:52.075508 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:34:52Z","lastTransitionTime":"2025-09-30T07:34:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 07:34:52 crc kubenswrapper[4760]: I0930 07:34:52.178713 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:34:52 crc kubenswrapper[4760]: I0930 07:34:52.178768 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:34:52 crc kubenswrapper[4760]: I0930 07:34:52.178785 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:34:52 crc kubenswrapper[4760]: I0930 07:34:52.178811 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:34:52 crc kubenswrapper[4760]: I0930 07:34:52.178828 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:34:52Z","lastTransitionTime":"2025-09-30T07:34:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 07:34:52 crc kubenswrapper[4760]: I0930 07:34:52.282159 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:34:52 crc kubenswrapper[4760]: I0930 07:34:52.282226 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:34:52 crc kubenswrapper[4760]: I0930 07:34:52.282251 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:34:52 crc kubenswrapper[4760]: I0930 07:34:52.282279 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:34:52 crc kubenswrapper[4760]: I0930 07:34:52.282342 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:34:52Z","lastTransitionTime":"2025-09-30T07:34:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 07:34:52 crc kubenswrapper[4760]: I0930 07:34:52.385453 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:34:52 crc kubenswrapper[4760]: I0930 07:34:52.385514 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:34:52 crc kubenswrapper[4760]: I0930 07:34:52.385535 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:34:52 crc kubenswrapper[4760]: I0930 07:34:52.385560 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:34:52 crc kubenswrapper[4760]: I0930 07:34:52.385581 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:34:52Z","lastTransitionTime":"2025-09-30T07:34:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 07:34:52 crc kubenswrapper[4760]: I0930 07:34:52.489700 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:34:52 crc kubenswrapper[4760]: I0930 07:34:52.490107 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:34:52 crc kubenswrapper[4760]: I0930 07:34:52.490217 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:34:52 crc kubenswrapper[4760]: I0930 07:34:52.490371 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:34:52 crc kubenswrapper[4760]: I0930 07:34:52.490510 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:34:52Z","lastTransitionTime":"2025-09-30T07:34:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 07:34:52 crc kubenswrapper[4760]: I0930 07:34:52.594716 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:34:52 crc kubenswrapper[4760]: I0930 07:34:52.594783 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:34:52 crc kubenswrapper[4760]: I0930 07:34:52.594804 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:34:52 crc kubenswrapper[4760]: I0930 07:34:52.594829 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:34:52 crc kubenswrapper[4760]: I0930 07:34:52.594847 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:34:52Z","lastTransitionTime":"2025-09-30T07:34:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 07:34:52 crc kubenswrapper[4760]: I0930 07:34:52.697142 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:34:52 crc kubenswrapper[4760]: I0930 07:34:52.697607 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:34:52 crc kubenswrapper[4760]: I0930 07:34:52.697769 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:34:52 crc kubenswrapper[4760]: I0930 07:34:52.697927 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:34:52 crc kubenswrapper[4760]: I0930 07:34:52.698078 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:34:52Z","lastTransitionTime":"2025-09-30T07:34:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 07:34:52 crc kubenswrapper[4760]: I0930 07:34:52.801253 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:34:52 crc kubenswrapper[4760]: I0930 07:34:52.801296 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:34:52 crc kubenswrapper[4760]: I0930 07:34:52.801330 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:34:52 crc kubenswrapper[4760]: I0930 07:34:52.801349 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:34:52 crc kubenswrapper[4760]: I0930 07:34:52.801361 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:34:52Z","lastTransitionTime":"2025-09-30T07:34:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 07:34:52 crc kubenswrapper[4760]: I0930 07:34:52.905393 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:34:52 crc kubenswrapper[4760]: I0930 07:34:52.905449 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:34:52 crc kubenswrapper[4760]: I0930 07:34:52.905467 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:34:52 crc kubenswrapper[4760]: I0930 07:34:52.905538 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:34:52 crc kubenswrapper[4760]: I0930 07:34:52.905562 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:34:52Z","lastTransitionTime":"2025-09-30T07:34:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 07:34:52 crc kubenswrapper[4760]: I0930 07:34:52.924240 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:34:52 crc kubenswrapper[4760]: I0930 07:34:52.924642 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:34:52 crc kubenswrapper[4760]: I0930 07:34:52.924666 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:34:52 crc kubenswrapper[4760]: I0930 07:34:52.924693 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:34:52 crc kubenswrapper[4760]: I0930 07:34:52.924712 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:34:52Z","lastTransitionTime":"2025-09-30T07:34:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 07:34:52 crc kubenswrapper[4760]: E0930 07:34:52.942791 4760 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T07:34:52Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T07:34:52Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T07:34:52Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T07:34:52Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T07:34:52Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T07:34:52Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T07:34:52Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T07:34:52Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"5c02496b-3bdb-4a64-91fc-57c59208ba25\\\",\\\"systemUUID\\\":\\\"ef0efcac-382e-4544-a77e-6adf149d5981\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:34:52Z is after 2025-08-24T17:21:41Z" Sep 30 07:34:52 crc kubenswrapper[4760]: I0930 07:34:52.947464 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:34:52 crc kubenswrapper[4760]: I0930 07:34:52.947500 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:34:52 crc kubenswrapper[4760]: I0930 07:34:52.947518 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:34:52 crc kubenswrapper[4760]: I0930 07:34:52.947540 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:34:52 crc kubenswrapper[4760]: I0930 07:34:52.947560 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:34:52Z","lastTransitionTime":"2025-09-30T07:34:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 07:34:52 crc kubenswrapper[4760]: E0930 07:34:52.967381 4760 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T07:34:52Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T07:34:52Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T07:34:52Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T07:34:52Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T07:34:52Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T07:34:52Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T07:34:52Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T07:34:52Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"5c02496b-3bdb-4a64-91fc-57c59208ba25\\\",\\\"systemUUID\\\":\\\"ef0efcac-382e-4544-a77e-6adf149d5981\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:34:52Z is after 2025-08-24T17:21:41Z" Sep 30 07:34:52 crc kubenswrapper[4760]: I0930 07:34:52.972700 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:34:52 crc kubenswrapper[4760]: I0930 07:34:52.972758 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:34:52 crc kubenswrapper[4760]: I0930 07:34:52.972775 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:34:52 crc kubenswrapper[4760]: I0930 07:34:52.972797 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:34:52 crc kubenswrapper[4760]: I0930 07:34:52.972813 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:34:52Z","lastTransitionTime":"2025-09-30T07:34:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 07:34:52 crc kubenswrapper[4760]: E0930 07:34:52.993041 4760 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T07:34:52Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T07:34:52Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T07:34:52Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T07:34:52Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T07:34:52Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T07:34:52Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T07:34:52Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T07:34:52Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"5c02496b-3bdb-4a64-91fc-57c59208ba25\\\",\\\"systemUUID\\\":\\\"ef0efcac-382e-4544-a77e-6adf149d5981\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:34:52Z is after 2025-08-24T17:21:41Z" Sep 30 07:34:52 crc kubenswrapper[4760]: I0930 07:34:52.998994 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:34:52 crc kubenswrapper[4760]: I0930 07:34:52.999108 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:34:52 crc kubenswrapper[4760]: I0930 07:34:52.999189 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:34:52 crc kubenswrapper[4760]: I0930 07:34:52.999273 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:34:52 crc kubenswrapper[4760]: I0930 07:34:52.999361 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:34:52Z","lastTransitionTime":"2025-09-30T07:34:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 07:34:53 crc kubenswrapper[4760]: E0930 07:34:53.021877 4760 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T07:34:52Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T07:34:52Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T07:34:52Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T07:34:52Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T07:34:52Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T07:34:52Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T07:34:52Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T07:34:52Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"5c02496b-3bdb-4a64-91fc-57c59208ba25\\\",\\\"systemUUID\\\":\\\"ef0efcac-382e-4544-a77e-6adf149d5981\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:34:53Z is after 2025-08-24T17:21:41Z" Sep 30 07:34:53 crc kubenswrapper[4760]: I0930 07:34:53.028742 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:34:53 crc kubenswrapper[4760]: I0930 07:34:53.028800 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:34:53 crc kubenswrapper[4760]: I0930 07:34:53.028816 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:34:53 crc kubenswrapper[4760]: I0930 07:34:53.028840 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:34:53 crc kubenswrapper[4760]: I0930 07:34:53.028854 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:34:53Z","lastTransitionTime":"2025-09-30T07:34:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 07:34:53 crc kubenswrapper[4760]: E0930 07:34:53.052526 4760 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T07:34:53Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T07:34:53Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T07:34:53Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T07:34:53Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T07:34:53Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T07:34:53Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T07:34:53Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T07:34:53Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"5c02496b-3bdb-4a64-91fc-57c59208ba25\\\",\\\"systemUUID\\\":\\\"ef0efcac-382e-4544-a77e-6adf149d5981\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:34:53Z is after 2025-08-24T17:21:41Z" Sep 30 07:34:53 crc kubenswrapper[4760]: E0930 07:34:53.052718 4760 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Sep 30 07:34:53 crc kubenswrapper[4760]: I0930 07:34:53.054640 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:34:53 crc kubenswrapper[4760]: I0930 07:34:53.054703 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:34:53 crc kubenswrapper[4760]: I0930 07:34:53.054724 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:34:53 crc kubenswrapper[4760]: I0930 07:34:53.054749 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:34:53 crc kubenswrapper[4760]: I0930 07:34:53.054768 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:34:53Z","lastTransitionTime":"2025-09-30T07:34:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 07:34:53 crc kubenswrapper[4760]: I0930 07:34:53.066049 4760 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 07:34:53 crc kubenswrapper[4760]: E0930 07:34:53.066424 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 07:34:53 crc kubenswrapper[4760]: I0930 07:34:53.066668 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wv8fz" Sep 30 07:34:53 crc kubenswrapper[4760]: E0930 07:34:53.066882 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-wv8fz" podUID="ce6dcf25-c8ea-450b-9fc6-9f8aeafde757" Sep 30 07:34:53 crc kubenswrapper[4760]: I0930 07:34:53.158096 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:34:53 crc kubenswrapper[4760]: I0930 07:34:53.158164 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:34:53 crc kubenswrapper[4760]: I0930 07:34:53.158187 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:34:53 crc kubenswrapper[4760]: I0930 07:34:53.158220 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:34:53 crc kubenswrapper[4760]: I0930 07:34:53.158241 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:34:53Z","lastTransitionTime":"2025-09-30T07:34:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 07:34:53 crc kubenswrapper[4760]: I0930 07:34:53.261269 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:34:53 crc kubenswrapper[4760]: I0930 07:34:53.261326 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:34:53 crc kubenswrapper[4760]: I0930 07:34:53.261338 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:34:53 crc kubenswrapper[4760]: I0930 07:34:53.261353 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:34:53 crc kubenswrapper[4760]: I0930 07:34:53.261364 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:34:53Z","lastTransitionTime":"2025-09-30T07:34:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 07:34:53 crc kubenswrapper[4760]: I0930 07:34:53.368942 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:34:53 crc kubenswrapper[4760]: I0930 07:34:53.369021 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:34:53 crc kubenswrapper[4760]: I0930 07:34:53.369047 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:34:53 crc kubenswrapper[4760]: I0930 07:34:53.369079 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:34:53 crc kubenswrapper[4760]: I0930 07:34:53.369112 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:34:53Z","lastTransitionTime":"2025-09-30T07:34:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 07:34:53 crc kubenswrapper[4760]: I0930 07:34:53.473176 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:34:53 crc kubenswrapper[4760]: I0930 07:34:53.473243 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:34:53 crc kubenswrapper[4760]: I0930 07:34:53.473260 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:34:53 crc kubenswrapper[4760]: I0930 07:34:53.473289 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:34:53 crc kubenswrapper[4760]: I0930 07:34:53.473336 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:34:53Z","lastTransitionTime":"2025-09-30T07:34:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 07:34:53 crc kubenswrapper[4760]: I0930 07:34:53.576394 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:34:53 crc kubenswrapper[4760]: I0930 07:34:53.576457 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:34:53 crc kubenswrapper[4760]: I0930 07:34:53.576475 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:34:53 crc kubenswrapper[4760]: I0930 07:34:53.576500 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:34:53 crc kubenswrapper[4760]: I0930 07:34:53.576518 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:34:53Z","lastTransitionTime":"2025-09-30T07:34:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 07:34:53 crc kubenswrapper[4760]: I0930 07:34:53.680138 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:34:53 crc kubenswrapper[4760]: I0930 07:34:53.680239 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:34:53 crc kubenswrapper[4760]: I0930 07:34:53.680272 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:34:53 crc kubenswrapper[4760]: I0930 07:34:53.680340 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:34:53 crc kubenswrapper[4760]: I0930 07:34:53.680359 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:34:53Z","lastTransitionTime":"2025-09-30T07:34:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 07:34:53 crc kubenswrapper[4760]: I0930 07:34:53.783474 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:34:53 crc kubenswrapper[4760]: I0930 07:34:53.783560 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:34:53 crc kubenswrapper[4760]: I0930 07:34:53.783588 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:34:53 crc kubenswrapper[4760]: I0930 07:34:53.783619 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:34:53 crc kubenswrapper[4760]: I0930 07:34:53.783642 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:34:53Z","lastTransitionTime":"2025-09-30T07:34:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 07:34:53 crc kubenswrapper[4760]: I0930 07:34:53.888286 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:34:53 crc kubenswrapper[4760]: I0930 07:34:53.888445 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:34:53 crc kubenswrapper[4760]: I0930 07:34:53.888468 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:34:53 crc kubenswrapper[4760]: I0930 07:34:53.888502 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:34:53 crc kubenswrapper[4760]: I0930 07:34:53.888528 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:34:53Z","lastTransitionTime":"2025-09-30T07:34:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 07:34:53 crc kubenswrapper[4760]: I0930 07:34:53.991629 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:34:53 crc kubenswrapper[4760]: I0930 07:34:53.991758 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:34:53 crc kubenswrapper[4760]: I0930 07:34:53.991780 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:34:53 crc kubenswrapper[4760]: I0930 07:34:53.991805 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:34:53 crc kubenswrapper[4760]: I0930 07:34:53.991854 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:34:53Z","lastTransitionTime":"2025-09-30T07:34:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 07:34:54 crc kubenswrapper[4760]: I0930 07:34:54.066154 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 07:34:54 crc kubenswrapper[4760]: I0930 07:34:54.066358 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 07:34:54 crc kubenswrapper[4760]: E0930 07:34:54.066427 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 07:34:54 crc kubenswrapper[4760]: E0930 07:34:54.066600 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 07:34:54 crc kubenswrapper[4760]: I0930 07:34:54.095162 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:34:54 crc kubenswrapper[4760]: I0930 07:34:54.095224 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:34:54 crc kubenswrapper[4760]: I0930 07:34:54.095239 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:34:54 crc kubenswrapper[4760]: I0930 07:34:54.095268 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:34:54 crc kubenswrapper[4760]: I0930 07:34:54.095282 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:34:54Z","lastTransitionTime":"2025-09-30T07:34:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 07:34:54 crc kubenswrapper[4760]: I0930 07:34:54.198226 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:34:54 crc kubenswrapper[4760]: I0930 07:34:54.198330 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:34:54 crc kubenswrapper[4760]: I0930 07:34:54.198343 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:34:54 crc kubenswrapper[4760]: I0930 07:34:54.198363 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:34:54 crc kubenswrapper[4760]: I0930 07:34:54.198376 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:34:54Z","lastTransitionTime":"2025-09-30T07:34:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 07:34:54 crc kubenswrapper[4760]: I0930 07:34:54.302233 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:34:54 crc kubenswrapper[4760]: I0930 07:34:54.302295 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:34:54 crc kubenswrapper[4760]: I0930 07:34:54.302339 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:34:54 crc kubenswrapper[4760]: I0930 07:34:54.302365 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:34:54 crc kubenswrapper[4760]: I0930 07:34:54.302384 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:34:54Z","lastTransitionTime":"2025-09-30T07:34:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 07:34:54 crc kubenswrapper[4760]: I0930 07:34:54.405799 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:34:54 crc kubenswrapper[4760]: I0930 07:34:54.405855 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:34:54 crc kubenswrapper[4760]: I0930 07:34:54.405872 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:34:54 crc kubenswrapper[4760]: I0930 07:34:54.405898 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:34:54 crc kubenswrapper[4760]: I0930 07:34:54.405916 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:34:54Z","lastTransitionTime":"2025-09-30T07:34:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 07:34:54 crc kubenswrapper[4760]: I0930 07:34:54.509040 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:34:54 crc kubenswrapper[4760]: I0930 07:34:54.509107 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:34:54 crc kubenswrapper[4760]: I0930 07:34:54.509126 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:34:54 crc kubenswrapper[4760]: I0930 07:34:54.509151 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:34:54 crc kubenswrapper[4760]: I0930 07:34:54.509175 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:34:54Z","lastTransitionTime":"2025-09-30T07:34:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 07:34:54 crc kubenswrapper[4760]: I0930 07:34:54.612007 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:34:54 crc kubenswrapper[4760]: I0930 07:34:54.612065 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:34:54 crc kubenswrapper[4760]: I0930 07:34:54.612078 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:34:54 crc kubenswrapper[4760]: I0930 07:34:54.612102 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:34:54 crc kubenswrapper[4760]: I0930 07:34:54.612120 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:34:54Z","lastTransitionTime":"2025-09-30T07:34:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 07:34:54 crc kubenswrapper[4760]: I0930 07:34:54.715442 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:34:54 crc kubenswrapper[4760]: I0930 07:34:54.716103 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:34:54 crc kubenswrapper[4760]: I0930 07:34:54.716132 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:34:54 crc kubenswrapper[4760]: I0930 07:34:54.716172 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:34:54 crc kubenswrapper[4760]: I0930 07:34:54.716201 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:34:54Z","lastTransitionTime":"2025-09-30T07:34:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 07:34:54 crc kubenswrapper[4760]: I0930 07:34:54.821229 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:34:54 crc kubenswrapper[4760]: I0930 07:34:54.821336 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:34:54 crc kubenswrapper[4760]: I0930 07:34:54.821369 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:34:54 crc kubenswrapper[4760]: I0930 07:34:54.821402 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:34:54 crc kubenswrapper[4760]: I0930 07:34:54.821426 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:34:54Z","lastTransitionTime":"2025-09-30T07:34:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 07:34:54 crc kubenswrapper[4760]: I0930 07:34:54.924576 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:34:54 crc kubenswrapper[4760]: I0930 07:34:54.924641 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:34:54 crc kubenswrapper[4760]: I0930 07:34:54.924662 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:34:54 crc kubenswrapper[4760]: I0930 07:34:54.924689 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:34:54 crc kubenswrapper[4760]: I0930 07:34:54.924707 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:34:54Z","lastTransitionTime":"2025-09-30T07:34:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 07:34:55 crc kubenswrapper[4760]: I0930 07:34:55.027531 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:34:55 crc kubenswrapper[4760]: I0930 07:34:55.027616 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:34:55 crc kubenswrapper[4760]: I0930 07:34:55.027634 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:34:55 crc kubenswrapper[4760]: I0930 07:34:55.027661 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:34:55 crc kubenswrapper[4760]: I0930 07:34:55.027679 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:34:55Z","lastTransitionTime":"2025-09-30T07:34:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 07:34:55 crc kubenswrapper[4760]: I0930 07:34:55.066944 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 07:34:55 crc kubenswrapper[4760]: I0930 07:34:55.066977 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wv8fz" Sep 30 07:34:55 crc kubenswrapper[4760]: E0930 07:34:55.067175 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 07:34:55 crc kubenswrapper[4760]: E0930 07:34:55.067582 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wv8fz" podUID="ce6dcf25-c8ea-450b-9fc6-9f8aeafde757" Sep 30 07:34:55 crc kubenswrapper[4760]: I0930 07:34:55.087666 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"943219e9-5457-4767-b29c-cdd155ca3cb1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:34:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:34:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d9171c7e60156e3ffabb5954ff097de7c4cb0629967cdddb44f16bf86ae38c0a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06b
c35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f75ced5ba38de06feef6d8addd42a76b6b14cab6a67a351432fdb580eb0ca3c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5ead42377ef36b0c38401509b29432dd9b6fdf73cc39f2dbea0930415492a4ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":
\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2f5de75ed9124ec7a6cd0d20862001c06327d05876de38fdbb1270a9e9e8df54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2f5de75ed9124ec7a6cd0d20862001c06327d05876de38fdbb1270a9e9e8df54\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T07:33:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T07:33:36Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T07:33:35Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:34:55Z is after 2025-08-24T17:21:41Z" Sep 30 07:34:55 crc kubenswrapper[4760]: I0930 07:34:55.118435 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-sspvl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2c4ca8ea-a714-40e5-9e10-080aef32237b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:56Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:56Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04cb9ff2859a579a0bd97d932177b8c19b786666d0177669e93ccec38abf08d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lllfc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9e64e8566880e723a6cc19ad788bd1cbb02eb639397aea4acd6e9920b034f379\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lllfc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0b6c1fe1d21cc0d118e36d7f9d62fb639a19a42cd8c953700bcd5c289088cdf9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lllfc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bba81e98fd1a75081b7e6a3eaf28b88b4de6e47e1dc158ca72aa2f1c2107933e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:58Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lllfc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://45d8c685d7b690ce81553e4d924934cf761087dd535145c0e8b5a933f2d61e45\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lllfc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef1f6b6ca1a2bbb2f9a166a7244c6411470be4655bc6b2a330fd604c3449d185\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lllfc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9003b7a7d50bae8b7464fd7cbf7f18dcc7a60af1c5297a18e2f9f8482d161912\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9003b7a7d50bae8b7464fd7cbf7f18dcc7a60af1c5297a18e2f9f8482d161912\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-09-30T07:34:49Z\\\",\\\"message\\\":\\\"rvice.beta.openshift.io/serving-cert-signed-by:openshift-service-serving-signer@1740288168] [{config.openshift.io/v1 ClusterVersion version 9101b518-476b-4eea-8fa6-69b0534e5caa 0xc0074eb62b \\\\u003cnil\\\\u003e}] [] []},Spec:ServiceSpec{Ports:[]ServicePort{ServicePort{Name:metrics,Protocol:TCP,Port:8443,TargetPort:{1 0 
metrics},NodePort:0,AppProtocol:nil,},},Selector:map[string]string{app: package-server-manager,},ClusterIP:10.217.4.110,Type:ClusterIP,ExternalIPs:[],SessionAffinity:None,LoadBalancerIP:,LoadBalancerSourceRanges:[],ExternalName:,ExternalTrafficPolicy:,HealthCheckNodePort:0,PublishNotReadyAddresses:false,SessionAffinityConfig:nil,IPFamilyPolicy:*SingleStack,ClusterIPs:[10.217.4.110],IPFamilies:[IPv4],AllocateLoadBalancerNodePorts:nil,LoadBalancerClass:nil,InternalTrafficPolicy:*Cluster,TrafficDistribution:nil,},Status:ServiceStatus{LoadBalancer:LoadBalancerStatus{Ingress:[]LoadBalancerIngress{},},Conditions:[]Condition{},},}\\\\nI0930 07:34:49.085075 6715 services_controller.go:454] Service openshift-marketplace/marketplace-operator-metrics for network=default has 2 cluster-wide, 0 per-node configs, 0 template configs, making 1 (cluster) 0 (per node) and 0 (template) load balancers\\\\nF0930 07:34:49.085082 6715 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T07:34:48Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-sspvl_openshift-ovn-kubernetes(2c4ca8ea-a714-40e5-9e10-080aef32237b)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lllfc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e72f37942962598d925dfb305311c94c3ca920f93243bb6e97839add4a830a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:34:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lllfc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1198d13e2814e4dc38c4411d751fba40440a5967d7ca04c855e34fe90f9a6535\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1198d13e2814e4dc38
c4411d751fba40440a5967d7ca04c855e34fe90f9a6535\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T07:33:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T07:33:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lllfc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T07:33:56Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-sspvl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:34:55Z is after 2025-08-24T17:21:41Z" Sep 30 07:34:55 crc kubenswrapper[4760]: I0930 07:34:55.130913 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:34:55 crc kubenswrapper[4760]: I0930 07:34:55.130995 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:34:55 crc kubenswrapper[4760]: I0930 07:34:55.131017 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:34:55 crc kubenswrapper[4760]: I0930 07:34:55.131050 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:34:55 crc kubenswrapper[4760]: I0930 07:34:55.131147 4760 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:34:55Z","lastTransitionTime":"2025-09-30T07:34:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 07:34:55 crc kubenswrapper[4760]: I0930 07:34:55.141785 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c5ae239e08e07bda9532c0b6b94ef3687cf512cd05965e09123bbb036f915a5b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-ide
ntity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3abb25619d0c80e84a2095aba45fe6a865125d242e44e9f10820e8c41a3b0db9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:34:55Z is after 2025-08-24T17:21:41Z" Sep 30 07:34:55 crc kubenswrapper[4760]: I0930 07:34:55.162664 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:34:55Z is after 2025-08-24T17:21:41Z" Sep 30 07:34:55 crc kubenswrapper[4760]: I0930 07:34:55.179341 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-sv6wk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4a8035f8-210d-4a09-bca5-274ced93774c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://774d38a9cedeb69a2093a9de05db794e856295535f86d73a75c6feea05b61cb3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6pk8k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T07:33:56Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-sv6wk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:34:55Z is after 2025-08-24T17:21:41Z" Sep 30 07:34:55 crc kubenswrapper[4760]: I0930 07:34:55.202724 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-lqpj4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2b6201d8-80fd-4701-a4a8-f7ebca1f34ad\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:34:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:34:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:34:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:34:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://09a33e356ac5e20ea894804e56d62f6220b8b9a5123d1b0acc9bbe33a3083792\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d7
93426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:34:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dc5j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://722e623d70449c90fb388b89511f1e75ed55015ec9caa45d9aaa9ac3d9649778\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:34:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dc5j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T07:34:08Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-lqpj4\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:34:55Z is after 2025-08-24T17:21:41Z" Sep 30 07:34:55 crc kubenswrapper[4760]: I0930 07:34:55.220749 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-wv8fz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ce6dcf25-c8ea-450b-9fc6-9f8aeafde757\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:34:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:34:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:34:10Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:34:10Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcwbs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcwbs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T07:34:10Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-wv8fz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:34:55Z is after 2025-08-24T17:21:41Z" Sep 30 07:34:55 crc 
kubenswrapper[4760]: I0930 07:34:55.237268 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 30 07:34:55 crc kubenswrapper[4760]: I0930 07:34:55.237391 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 30 07:34:55 crc kubenswrapper[4760]: I0930 07:34:55.237415 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 30 07:34:55 crc kubenswrapper[4760]: I0930 07:34:55.237483 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Sep 30 07:34:55 crc kubenswrapper[4760]: I0930 07:34:55.237505 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:34:55Z","lastTransitionTime":"2025-09-30T07:34:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 07:34:55 crc kubenswrapper[4760]: I0930 07:34:55.241006 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:59Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:59Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fd70ee9ea129c67bb1a9ffac6e1d2abc903949ccd6d2d549f5d78472813dd222\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:34:55Z is after 2025-08-24T17:21:41Z" Sep 30 07:34:55 crc kubenswrapper[4760]: I0930 07:34:55.258235 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-f2lrk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7a9c8270-6964-4886-87d0-227b05b76da4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e3cf6cd0daf5df3e48c5d44b1a443b1e5646009b9eba42bc3869ee9841e472d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy
\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cctgt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://90d669a96085f6fb9b5c9507ff1d5c960bdeafa1d01d7c115b89788e14e155d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cctgt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T07:33:56Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-f2lrk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 
2025-09-30T07:34:55Z is after 2025-08-24T17:21:41Z" Sep 30 07:34:55 crc kubenswrapper[4760]: I0930 07:34:55.278759 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-lvdpk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f50c364e-d22c-4fe5-a0aa-66f4e8d8b21e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:34:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:34:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0b2f3cfbeb8083c685469a0d988253e1fd9c2403954dda3cc742b87225c82927\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://96fd613333f4911d54aca94a38724570ef878c3f55ef48caa79eb1c14d0b2014\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-09-30T07:34:43Z\\\",\\\"message\\\":\\\"2025-09-30T07:33:58+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to 
/host/opt/cni/bin/upgrade_f7972461-b57d-4fae-b02a-2e738164e9c7\\\\n2025-09-30T07:33:58+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_f7972461-b57d-4fae-b02a-2e738164e9c7 to /host/opt/cni/bin/\\\\n2025-09-30T07:33:58Z [verbose] multus-daemon started\\\\n2025-09-30T07:33:58Z [verbose] Readiness Indicator file check\\\\n2025-09-30T07:34:43Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T07:33:57Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:34:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.
d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8g8kp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T07:33:56Z\\\"}}\" for pod \"openshift-multus\"/\"multus-lvdpk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:34:55Z is after 2025-08-24T17:21:41Z" Sep 30 07:34:55 crc kubenswrapper[4760]: I0930 07:34:55.302362 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-rpvhp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a8530550-438d-46a5-aa3f-4b10838396f1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:34:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:34:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:34:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0434ec52328c8ee0a2b28fb47833cc6cf12a7048f4c4a217518e029e616fb6a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7j7qm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T07:33:58Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-rpvhp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:34:55Z is after 2025-08-24T17:21:41Z" Sep 30 07:34:55 crc kubenswrapper[4760]: I0930 07:34:55.336195 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9092fdd9-f142-400d-b446-47b8103b9694\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4cc3328dfae8f6d99c152e6a53b86d72f690afbd7bc4847810f745b321e4f3ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastSt
ate\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5aaa898c76100f011c3ad0ba61c9735a940b46b0e770f3ede7fc4c881a4722a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b718c4a73fd119f66dd346618976f0b3589b789576e53d367588c2772becd66e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07
:33:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d00b9a56c23ba3b5701a31412e038c073b5f16eb4e6ebea42134f7d4c0b0f21\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://00d03c8ff6fa9f490ec0550ac57d20786f1f5c10d91011d678116c2249a79e88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"19
2.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ed4236004989a888f1f864535bedd2dc4bcad4c51c87f12be074abbe5f8fe8bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ed4236004989a888f1f864535bedd2dc4bcad4c51c87f12be074abbe5f8fe8bd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T07:33:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T07:33:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://553fa289f27580c1c38da1e9ea12ae7231fc4f60c660a9a3a177391adba9c3f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://553fa289f27580c1c38da1e9ea12ae7231fc4f60c660a9a3a177391adba9c3f2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T07:33:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T07:33:37Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://83b9ee5e77d60387298ccac46ad416a5d56bfd1d81a6ef7f3ea0482520c2ca08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/op
enshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://83b9ee5e77d60387298ccac46ad416a5d56bfd1d81a6ef7f3ea0482520c2ca08\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T07:33:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T07:33:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T07:33:35Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:34:55Z is after 2025-08-24T17:21:41Z" Sep 30 07:34:55 crc kubenswrapper[4760]: I0930 07:34:55.342155 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:34:55 crc kubenswrapper[4760]: I0930 07:34:55.342242 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:34:55 crc kubenswrapper[4760]: I0930 07:34:55.342261 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:34:55 crc kubenswrapper[4760]: I0930 07:34:55.342354 4760 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeNotReady"
Sep 30 07:34:55 crc kubenswrapper[4760]: I0930 07:34:55.342384 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:34:55Z","lastTransitionTime":"2025-09-30T07:34:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Sep 30 07:34:55 crc kubenswrapper[4760]: I0930 07:34:55.360831 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"031672b4-4779-4226-97aa-f5c81a729234\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b3e26d92b5602604d33e83e83e677084c642df64c9d0b9f9e022b036091fb92e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791
fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://53390cb3cc86d0db1019c461bf6cb987190e6fa8812f735844028dfa1ce96648\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf87f93832356c96c1c5e90e6459ccdd23700fd12c6d7410305c66f99f6ed18f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\
":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e4230b2df5bf87138f5a40f146bef562ae63ec6f5b73d023e4e2b60722854882\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9c416bd4523f4384b4a2283eb2ada263982380925cbc17713726e0790e5fdee\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:38Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2cd9c18868b27fcfb67186a86f415b484f009c33c07d18f5d508155ed011b5d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2cd9c18868b27fcfb67186a86f415b484f009c33c07d18f5d508155ed011b5d4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T07:33:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T07:33:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T07:33:35Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:34:55Z is after 2025-08-24T17:21:41Z" Sep 30 07:34:55 crc kubenswrapper[4760]: I0930 07:34:55.381423 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"67985d3d-0af6-4a8b-b45a-623aed2e502e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43b97af91f474a3b43f207d377ca205e459b782e552a32a134a8b46e89fc8d30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d748511816054d5f621bd55e203baea2e6e5ac90dec796364b46224308cfd78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bf1b2ce86bf47ce917a20b48f64b7c4419e3a5c551d006d97171311c3b190f0c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e7c9b2cd9f096d99a8b239c0c2dc771e94e6522a3380733b7452969c9fe2dfd2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-09-30T07:33:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T07:33:35Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:34:55Z is after 2025-08-24T17:21:41Z" Sep 30 07:34:55 crc kubenswrapper[4760]: I0930 07:34:55.404251 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c5ac421ed3707970914cf165f54a9324134079296d1b6936332c6047b9c23132\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-09-30T07:34:55Z is after 2025-08-24T17:21:41Z" Sep 30 07:34:55 crc kubenswrapper[4760]: I0930 07:34:55.426451 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:53Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:34:55Z is after 2025-08-24T17:21:41Z" Sep 30 07:34:55 crc kubenswrapper[4760]: I0930 07:34:55.446733 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:34:55 crc kubenswrapper[4760]: I0930 07:34:55.446829 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:34:55 crc kubenswrapper[4760]: I0930 07:34:55.446848 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:34:55 crc kubenswrapper[4760]: I0930 07:34:55.446909 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:34:55 crc kubenswrapper[4760]: I0930 07:34:55.446928 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:34:55Z","lastTransitionTime":"2025-09-30T07:34:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 07:34:55 crc kubenswrapper[4760]: I0930 07:34:55.447270 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:53Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:34:55Z is after 2025-08-24T17:21:41Z" Sep 30 07:34:55 crc kubenswrapper[4760]: I0930 07:34:55.471748 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-6vfjl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"aade7c8e-aa34-4b19-9000-d724950a70d7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:34:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:34:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:34:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8c7192e617554ba6b5b9533792f166344fbf9681b63e57c180809e721cc18168\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:34:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-df7r7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e9c7c74e636ef9dbade2541863f96b1098f7ae5f1ab02740c8934f8c87c4815e\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e9c7c74e636ef9dbade2541863f96b1098f7ae5f1ab02740c8934f8c87c4815e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T07:33:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T07:33:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-df7r7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7abb316d62bc2246ace40ad59b2926962a1c098c5690adea8c20a5b96459a604\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7abb316d62bc2246ace40ad59b2926962a1c098c5690adea8c20a5b96459a604\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T07:33:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T07:33:58Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-df7r7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://96c45841896c0c1569ad3071a3114e8c4965cc3968244c91719bcf5df92e5d0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://96c45841896c0c1569ad3071a3114e8c4965cc3968244c91719bcf5df92e5d0f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T07:33:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T07:33:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-df7r7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://db222
0320618b4dcdbf56bad96dbe01c910fa37948f97792f498b3336212cc7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://db2220320618b4dcdbf56bad96dbe01c910fa37948f97792f498b3336212cc7f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T07:34:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T07:34:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-df7r7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5980d5ebcac5351bfa8c26b90968593f3fae3d434e824db5a182449ef64b0a13\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5980d5ebcac5351bfa8c26b90968593f3fae3d434e824db5a182449ef64b0a13\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T07:34:01Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-09-30T07:34:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-df7r7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5912656a198607c9831da9564b5507d662eb2df00df104d52cf058442a76be6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5912656a198607c9831da9564b5507d662eb2df00df104d52cf058442a76be6f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T07:34:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T07:34:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-df7r7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T07:33:56Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-6vfjl\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:34:55Z is after 2025-08-24T17:21:41Z" Sep 30 07:34:55 crc kubenswrapper[4760]: I0930 07:34:55.550142 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:34:55 crc kubenswrapper[4760]: I0930 07:34:55.550234 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:34:55 crc kubenswrapper[4760]: I0930 07:34:55.550263 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:34:55 crc kubenswrapper[4760]: I0930 07:34:55.550292 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:34:55 crc kubenswrapper[4760]: I0930 07:34:55.550351 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:34:55Z","lastTransitionTime":"2025-09-30T07:34:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 07:34:55 crc kubenswrapper[4760]: I0930 07:34:55.653881 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:34:55 crc kubenswrapper[4760]: I0930 07:34:55.653928 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:34:55 crc kubenswrapper[4760]: I0930 07:34:55.653940 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:34:55 crc kubenswrapper[4760]: I0930 07:34:55.653957 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:34:55 crc kubenswrapper[4760]: I0930 07:34:55.653969 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:34:55Z","lastTransitionTime":"2025-09-30T07:34:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 07:34:55 crc kubenswrapper[4760]: I0930 07:34:55.757228 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:34:55 crc kubenswrapper[4760]: I0930 07:34:55.757329 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:34:55 crc kubenswrapper[4760]: I0930 07:34:55.757349 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:34:55 crc kubenswrapper[4760]: I0930 07:34:55.757375 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:34:55 crc kubenswrapper[4760]: I0930 07:34:55.757393 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:34:55Z","lastTransitionTime":"2025-09-30T07:34:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 07:34:55 crc kubenswrapper[4760]: I0930 07:34:55.860455 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:34:55 crc kubenswrapper[4760]: I0930 07:34:55.860538 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:34:55 crc kubenswrapper[4760]: I0930 07:34:55.860560 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:34:55 crc kubenswrapper[4760]: I0930 07:34:55.860591 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:34:55 crc kubenswrapper[4760]: I0930 07:34:55.860612 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:34:55Z","lastTransitionTime":"2025-09-30T07:34:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 07:34:55 crc kubenswrapper[4760]: I0930 07:34:55.963636 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:34:55 crc kubenswrapper[4760]: I0930 07:34:55.963696 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:34:55 crc kubenswrapper[4760]: I0930 07:34:55.963714 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:34:55 crc kubenswrapper[4760]: I0930 07:34:55.963738 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:34:55 crc kubenswrapper[4760]: I0930 07:34:55.963757 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:34:55Z","lastTransitionTime":"2025-09-30T07:34:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 07:34:56 crc kubenswrapper[4760]: I0930 07:34:56.065899 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 07:34:56 crc kubenswrapper[4760]: I0930 07:34:56.065899 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 07:34:56 crc kubenswrapper[4760]: E0930 07:34:56.066341 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 07:34:56 crc kubenswrapper[4760]: E0930 07:34:56.066510 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 07:34:56 crc kubenswrapper[4760]: I0930 07:34:56.067059 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:34:56 crc kubenswrapper[4760]: I0930 07:34:56.067134 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:34:56 crc kubenswrapper[4760]: I0930 07:34:56.067152 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:34:56 crc kubenswrapper[4760]: I0930 07:34:56.067178 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:34:56 crc kubenswrapper[4760]: I0930 07:34:56.067197 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:34:56Z","lastTransitionTime":"2025-09-30T07:34:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 07:34:56 crc kubenswrapper[4760]: I0930 07:34:56.171088 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:34:56 crc kubenswrapper[4760]: I0930 07:34:56.171160 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:34:56 crc kubenswrapper[4760]: I0930 07:34:56.171179 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:34:56 crc kubenswrapper[4760]: I0930 07:34:56.171211 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:34:56 crc kubenswrapper[4760]: I0930 07:34:56.171232 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:34:56Z","lastTransitionTime":"2025-09-30T07:34:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 07:34:56 crc kubenswrapper[4760]: I0930 07:34:56.274636 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:34:56 crc kubenswrapper[4760]: I0930 07:34:56.274697 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:34:56 crc kubenswrapper[4760]: I0930 07:34:56.274715 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:34:56 crc kubenswrapper[4760]: I0930 07:34:56.274766 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:34:56 crc kubenswrapper[4760]: I0930 07:34:56.274788 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:34:56Z","lastTransitionTime":"2025-09-30T07:34:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 07:34:56 crc kubenswrapper[4760]: I0930 07:34:56.378197 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:34:56 crc kubenswrapper[4760]: I0930 07:34:56.378292 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:34:56 crc kubenswrapper[4760]: I0930 07:34:56.378337 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:34:56 crc kubenswrapper[4760]: I0930 07:34:56.378366 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:34:56 crc kubenswrapper[4760]: I0930 07:34:56.378389 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:34:56Z","lastTransitionTime":"2025-09-30T07:34:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 07:34:56 crc kubenswrapper[4760]: I0930 07:34:56.481561 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:34:56 crc kubenswrapper[4760]: I0930 07:34:56.481624 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:34:56 crc kubenswrapper[4760]: I0930 07:34:56.481643 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:34:56 crc kubenswrapper[4760]: I0930 07:34:56.481669 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:34:56 crc kubenswrapper[4760]: I0930 07:34:56.481685 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:34:56Z","lastTransitionTime":"2025-09-30T07:34:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 07:34:56 crc kubenswrapper[4760]: I0930 07:34:56.584797 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:34:56 crc kubenswrapper[4760]: I0930 07:34:56.584864 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:34:56 crc kubenswrapper[4760]: I0930 07:34:56.584882 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:34:56 crc kubenswrapper[4760]: I0930 07:34:56.584907 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:34:56 crc kubenswrapper[4760]: I0930 07:34:56.584927 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:34:56Z","lastTransitionTime":"2025-09-30T07:34:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 07:34:56 crc kubenswrapper[4760]: I0930 07:34:56.687614 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:34:56 crc kubenswrapper[4760]: I0930 07:34:56.687679 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:34:56 crc kubenswrapper[4760]: I0930 07:34:56.687698 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:34:56 crc kubenswrapper[4760]: I0930 07:34:56.687725 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:34:56 crc kubenswrapper[4760]: I0930 07:34:56.687742 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:34:56Z","lastTransitionTime":"2025-09-30T07:34:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 07:34:56 crc kubenswrapper[4760]: I0930 07:34:56.791339 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:34:56 crc kubenswrapper[4760]: I0930 07:34:56.791424 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:34:56 crc kubenswrapper[4760]: I0930 07:34:56.791450 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:34:56 crc kubenswrapper[4760]: I0930 07:34:56.791481 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:34:56 crc kubenswrapper[4760]: I0930 07:34:56.791503 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:34:56Z","lastTransitionTime":"2025-09-30T07:34:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 07:34:56 crc kubenswrapper[4760]: I0930 07:34:56.895233 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:34:56 crc kubenswrapper[4760]: I0930 07:34:56.895316 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:34:56 crc kubenswrapper[4760]: I0930 07:34:56.895330 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:34:56 crc kubenswrapper[4760]: I0930 07:34:56.895347 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:34:56 crc kubenswrapper[4760]: I0930 07:34:56.895380 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:34:56Z","lastTransitionTime":"2025-09-30T07:34:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 07:34:56 crc kubenswrapper[4760]: I0930 07:34:56.999182 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:34:56 crc kubenswrapper[4760]: I0930 07:34:56.999232 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:34:56 crc kubenswrapper[4760]: I0930 07:34:56.999244 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:34:56 crc kubenswrapper[4760]: I0930 07:34:56.999264 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:34:56 crc kubenswrapper[4760]: I0930 07:34:56.999280 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:34:56Z","lastTransitionTime":"2025-09-30T07:34:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 07:34:57 crc kubenswrapper[4760]: I0930 07:34:57.066263 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 07:34:57 crc kubenswrapper[4760]: I0930 07:34:57.066449 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wv8fz" Sep 30 07:34:57 crc kubenswrapper[4760]: E0930 07:34:57.066489 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 07:34:57 crc kubenswrapper[4760]: E0930 07:34:57.066658 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wv8fz" podUID="ce6dcf25-c8ea-450b-9fc6-9f8aeafde757" Sep 30 07:34:57 crc kubenswrapper[4760]: I0930 07:34:57.102957 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:34:57 crc kubenswrapper[4760]: I0930 07:34:57.103028 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:34:57 crc kubenswrapper[4760]: I0930 07:34:57.103046 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:34:57 crc kubenswrapper[4760]: I0930 07:34:57.103073 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:34:57 crc kubenswrapper[4760]: I0930 07:34:57.103094 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:34:57Z","lastTransitionTime":"2025-09-30T07:34:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 07:34:57 crc kubenswrapper[4760]: I0930 07:34:57.205745 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:34:57 crc kubenswrapper[4760]: I0930 07:34:57.205822 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:34:57 crc kubenswrapper[4760]: I0930 07:34:57.205843 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:34:57 crc kubenswrapper[4760]: I0930 07:34:57.205874 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:34:57 crc kubenswrapper[4760]: I0930 07:34:57.205893 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:34:57Z","lastTransitionTime":"2025-09-30T07:34:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 07:34:57 crc kubenswrapper[4760]: I0930 07:34:57.309074 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:34:57 crc kubenswrapper[4760]: I0930 07:34:57.309134 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:34:57 crc kubenswrapper[4760]: I0930 07:34:57.309151 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:34:57 crc kubenswrapper[4760]: I0930 07:34:57.309176 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:34:57 crc kubenswrapper[4760]: I0930 07:34:57.309197 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:34:57Z","lastTransitionTime":"2025-09-30T07:34:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 07:34:57 crc kubenswrapper[4760]: I0930 07:34:57.411600 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:34:57 crc kubenswrapper[4760]: I0930 07:34:57.411738 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:34:57 crc kubenswrapper[4760]: I0930 07:34:57.411792 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:34:57 crc kubenswrapper[4760]: I0930 07:34:57.411825 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:34:57 crc kubenswrapper[4760]: I0930 07:34:57.411847 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:34:57Z","lastTransitionTime":"2025-09-30T07:34:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 07:34:57 crc kubenswrapper[4760]: I0930 07:34:57.515747 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:34:57 crc kubenswrapper[4760]: I0930 07:34:57.515811 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:34:57 crc kubenswrapper[4760]: I0930 07:34:57.515832 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:34:57 crc kubenswrapper[4760]: I0930 07:34:57.515858 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:34:57 crc kubenswrapper[4760]: I0930 07:34:57.515876 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:34:57Z","lastTransitionTime":"2025-09-30T07:34:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 07:34:57 crc kubenswrapper[4760]: I0930 07:34:57.617882 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:34:57 crc kubenswrapper[4760]: I0930 07:34:57.617931 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:34:57 crc kubenswrapper[4760]: I0930 07:34:57.617944 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:34:57 crc kubenswrapper[4760]: I0930 07:34:57.617964 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:34:57 crc kubenswrapper[4760]: I0930 07:34:57.617978 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:34:57Z","lastTransitionTime":"2025-09-30T07:34:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 07:34:57 crc kubenswrapper[4760]: I0930 07:34:57.722188 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:34:57 crc kubenswrapper[4760]: I0930 07:34:57.722275 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:34:57 crc kubenswrapper[4760]: I0930 07:34:57.722294 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:34:57 crc kubenswrapper[4760]: I0930 07:34:57.722358 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:34:57 crc kubenswrapper[4760]: I0930 07:34:57.722379 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:34:57Z","lastTransitionTime":"2025-09-30T07:34:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 07:34:57 crc kubenswrapper[4760]: I0930 07:34:57.825580 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:34:57 crc kubenswrapper[4760]: I0930 07:34:57.825639 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:34:57 crc kubenswrapper[4760]: I0930 07:34:57.825658 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:34:57 crc kubenswrapper[4760]: I0930 07:34:57.825683 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:34:57 crc kubenswrapper[4760]: I0930 07:34:57.825701 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:34:57Z","lastTransitionTime":"2025-09-30T07:34:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 07:34:57 crc kubenswrapper[4760]: I0930 07:34:57.930581 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:34:57 crc kubenswrapper[4760]: I0930 07:34:57.930642 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:34:57 crc kubenswrapper[4760]: I0930 07:34:57.930659 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:34:57 crc kubenswrapper[4760]: I0930 07:34:57.930683 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:34:57 crc kubenswrapper[4760]: I0930 07:34:57.930701 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:34:57Z","lastTransitionTime":"2025-09-30T07:34:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 07:34:58 crc kubenswrapper[4760]: I0930 07:34:58.033498 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:34:58 crc kubenswrapper[4760]: I0930 07:34:58.033566 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:34:58 crc kubenswrapper[4760]: I0930 07:34:58.033582 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:34:58 crc kubenswrapper[4760]: I0930 07:34:58.033606 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:34:58 crc kubenswrapper[4760]: I0930 07:34:58.033624 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:34:58Z","lastTransitionTime":"2025-09-30T07:34:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 07:34:58 crc kubenswrapper[4760]: I0930 07:34:58.066080 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 07:34:58 crc kubenswrapper[4760]: I0930 07:34:58.066178 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 07:34:58 crc kubenswrapper[4760]: E0930 07:34:58.066260 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 07:34:58 crc kubenswrapper[4760]: E0930 07:34:58.066395 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 07:34:58 crc kubenswrapper[4760]: I0930 07:34:58.136662 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:34:58 crc kubenswrapper[4760]: I0930 07:34:58.136719 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:34:58 crc kubenswrapper[4760]: I0930 07:34:58.136739 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:34:58 crc kubenswrapper[4760]: I0930 07:34:58.136770 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:34:58 crc kubenswrapper[4760]: I0930 07:34:58.136794 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:34:58Z","lastTransitionTime":"2025-09-30T07:34:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 07:34:58 crc kubenswrapper[4760]: I0930 07:34:58.239425 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:34:58 crc kubenswrapper[4760]: I0930 07:34:58.239494 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:34:58 crc kubenswrapper[4760]: I0930 07:34:58.239512 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:34:58 crc kubenswrapper[4760]: I0930 07:34:58.239535 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:34:58 crc kubenswrapper[4760]: I0930 07:34:58.239551 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:34:58Z","lastTransitionTime":"2025-09-30T07:34:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 07:34:58 crc kubenswrapper[4760]: I0930 07:34:58.342759 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:34:58 crc kubenswrapper[4760]: I0930 07:34:58.342809 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:34:58 crc kubenswrapper[4760]: I0930 07:34:58.342842 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:34:58 crc kubenswrapper[4760]: I0930 07:34:58.342863 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:34:58 crc kubenswrapper[4760]: I0930 07:34:58.342875 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:34:58Z","lastTransitionTime":"2025-09-30T07:34:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 07:34:58 crc kubenswrapper[4760]: I0930 07:34:58.445921 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:34:58 crc kubenswrapper[4760]: I0930 07:34:58.445984 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:34:58 crc kubenswrapper[4760]: I0930 07:34:58.446002 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:34:58 crc kubenswrapper[4760]: I0930 07:34:58.446029 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:34:58 crc kubenswrapper[4760]: I0930 07:34:58.446047 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:34:58Z","lastTransitionTime":"2025-09-30T07:34:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 07:34:58 crc kubenswrapper[4760]: I0930 07:34:58.549887 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:34:58 crc kubenswrapper[4760]: I0930 07:34:58.549951 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:34:58 crc kubenswrapper[4760]: I0930 07:34:58.549966 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:34:58 crc kubenswrapper[4760]: I0930 07:34:58.549992 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:34:58 crc kubenswrapper[4760]: I0930 07:34:58.550013 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:34:58Z","lastTransitionTime":"2025-09-30T07:34:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 07:34:58 crc kubenswrapper[4760]: I0930 07:34:58.653967 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:34:58 crc kubenswrapper[4760]: I0930 07:34:58.654038 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:34:58 crc kubenswrapper[4760]: I0930 07:34:58.654057 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:34:58 crc kubenswrapper[4760]: I0930 07:34:58.654082 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:34:58 crc kubenswrapper[4760]: I0930 07:34:58.654100 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:34:58Z","lastTransitionTime":"2025-09-30T07:34:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 07:34:58 crc kubenswrapper[4760]: I0930 07:34:58.721550 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 07:34:58 crc kubenswrapper[4760]: E0930 07:34:58.721813 4760 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2025-09-30 07:36:02.72178145 +0000 UTC m=+148.364687892 (durationBeforeRetry 1m4s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 07:34:58 crc kubenswrapper[4760]: I0930 07:34:58.721882 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 07:34:58 crc kubenswrapper[4760]: I0930 07:34:58.721976 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 07:34:58 crc kubenswrapper[4760]: E0930 07:34:58.722098 4760 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Sep 30 07:34:58 crc kubenswrapper[4760]: E0930 07:34:58.722098 4760 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Sep 30 07:34:58 crc kubenswrapper[4760]: E0930 07:34:58.722165 4760 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-09-30 07:36:02.72215108 +0000 UTC m=+148.365057522 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Sep 30 07:34:58 crc kubenswrapper[4760]: E0930 07:34:58.722212 4760 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-09-30 07:36:02.722198031 +0000 UTC m=+148.365104483 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Sep 30 07:34:58 crc kubenswrapper[4760]: I0930 07:34:58.756659 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:34:58 crc kubenswrapper[4760]: I0930 07:34:58.756715 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:34:58 crc kubenswrapper[4760]: I0930 07:34:58.756735 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:34:58 crc kubenswrapper[4760]: I0930 07:34:58.756759 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:34:58 crc kubenswrapper[4760]: I0930 07:34:58.756780 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:34:58Z","lastTransitionTime":"2025-09-30T07:34:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 07:34:58 crc kubenswrapper[4760]: I0930 07:34:58.823221 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 07:34:58 crc kubenswrapper[4760]: I0930 07:34:58.823346 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 07:34:58 crc kubenswrapper[4760]: E0930 07:34:58.823557 4760 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Sep 30 07:34:58 crc kubenswrapper[4760]: E0930 07:34:58.823578 4760 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Sep 30 07:34:58 crc kubenswrapper[4760]: E0930 07:34:58.823631 4760 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Sep 30 07:34:58 crc kubenswrapper[4760]: E0930 07:34:58.823589 4760 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Sep 30 07:34:58 crc kubenswrapper[4760]: E0930 07:34:58.823653 4760 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr 
for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Sep 30 07:34:58 crc kubenswrapper[4760]: E0930 07:34:58.823675 4760 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Sep 30 07:34:58 crc kubenswrapper[4760]: E0930 07:34:58.823738 4760 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-09-30 07:36:02.823712229 +0000 UTC m=+148.466618671 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Sep 30 07:34:58 crc kubenswrapper[4760]: E0930 07:34:58.823771 4760 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-09-30 07:36:02.82375807 +0000 UTC m=+148.466664512 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Sep 30 07:34:58 crc kubenswrapper[4760]: I0930 07:34:58.860036 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:34:58 crc kubenswrapper[4760]: I0930 07:34:58.860102 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:34:58 crc kubenswrapper[4760]: I0930 07:34:58.860120 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:34:58 crc kubenswrapper[4760]: I0930 07:34:58.860143 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:34:58 crc kubenswrapper[4760]: I0930 07:34:58.860162 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:34:58Z","lastTransitionTime":"2025-09-30T07:34:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 07:34:58 crc kubenswrapper[4760]: I0930 07:34:58.963921 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:34:58 crc kubenswrapper[4760]: I0930 07:34:58.964014 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:34:58 crc kubenswrapper[4760]: I0930 07:34:58.964034 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:34:58 crc kubenswrapper[4760]: I0930 07:34:58.964060 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:34:58 crc kubenswrapper[4760]: I0930 07:34:58.964078 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:34:58Z","lastTransitionTime":"2025-09-30T07:34:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 07:34:59 crc kubenswrapper[4760]: I0930 07:34:59.066391 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 07:34:59 crc kubenswrapper[4760]: I0930 07:34:59.066466 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wv8fz" Sep 30 07:34:59 crc kubenswrapper[4760]: E0930 07:34:59.066765 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-wv8fz" podUID="ce6dcf25-c8ea-450b-9fc6-9f8aeafde757" Sep 30 07:34:59 crc kubenswrapper[4760]: E0930 07:34:59.066912 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 07:34:59 crc kubenswrapper[4760]: I0930 07:34:59.067773 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:34:59 crc kubenswrapper[4760]: I0930 07:34:59.067817 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:34:59 crc kubenswrapper[4760]: I0930 07:34:59.067838 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:34:59 crc kubenswrapper[4760]: I0930 07:34:59.067862 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:34:59 crc kubenswrapper[4760]: I0930 07:34:59.067881 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:34:59Z","lastTransitionTime":"2025-09-30T07:34:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 07:34:59 crc kubenswrapper[4760]: I0930 07:34:59.171554 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:34:59 crc kubenswrapper[4760]: I0930 07:34:59.171653 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:34:59 crc kubenswrapper[4760]: I0930 07:34:59.171675 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:34:59 crc kubenswrapper[4760]: I0930 07:34:59.171736 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:34:59 crc kubenswrapper[4760]: I0930 07:34:59.171760 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:34:59Z","lastTransitionTime":"2025-09-30T07:34:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 07:34:59 crc kubenswrapper[4760]: I0930 07:34:59.275776 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:34:59 crc kubenswrapper[4760]: I0930 07:34:59.275862 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:34:59 crc kubenswrapper[4760]: I0930 07:34:59.275882 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:34:59 crc kubenswrapper[4760]: I0930 07:34:59.275906 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:34:59 crc kubenswrapper[4760]: I0930 07:34:59.275925 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:34:59Z","lastTransitionTime":"2025-09-30T07:34:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 07:34:59 crc kubenswrapper[4760]: I0930 07:34:59.379934 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:34:59 crc kubenswrapper[4760]: I0930 07:34:59.379987 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:34:59 crc kubenswrapper[4760]: I0930 07:34:59.379998 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:34:59 crc kubenswrapper[4760]: I0930 07:34:59.380016 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:34:59 crc kubenswrapper[4760]: I0930 07:34:59.380051 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:34:59Z","lastTransitionTime":"2025-09-30T07:34:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 07:34:59 crc kubenswrapper[4760]: I0930 07:34:59.482921 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:34:59 crc kubenswrapper[4760]: I0930 07:34:59.483485 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:34:59 crc kubenswrapper[4760]: I0930 07:34:59.483507 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:34:59 crc kubenswrapper[4760]: I0930 07:34:59.483532 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:34:59 crc kubenswrapper[4760]: I0930 07:34:59.483552 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:34:59Z","lastTransitionTime":"2025-09-30T07:34:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 07:34:59 crc kubenswrapper[4760]: I0930 07:34:59.586264 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:34:59 crc kubenswrapper[4760]: I0930 07:34:59.586456 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:34:59 crc kubenswrapper[4760]: I0930 07:34:59.586474 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:34:59 crc kubenswrapper[4760]: I0930 07:34:59.586494 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:34:59 crc kubenswrapper[4760]: I0930 07:34:59.586508 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:34:59Z","lastTransitionTime":"2025-09-30T07:34:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 07:34:59 crc kubenswrapper[4760]: I0930 07:34:59.690980 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:34:59 crc kubenswrapper[4760]: I0930 07:34:59.691071 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:34:59 crc kubenswrapper[4760]: I0930 07:34:59.691096 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:34:59 crc kubenswrapper[4760]: I0930 07:34:59.691129 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:34:59 crc kubenswrapper[4760]: I0930 07:34:59.691163 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:34:59Z","lastTransitionTime":"2025-09-30T07:34:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 07:34:59 crc kubenswrapper[4760]: I0930 07:34:59.796552 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:34:59 crc kubenswrapper[4760]: I0930 07:34:59.796600 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:34:59 crc kubenswrapper[4760]: I0930 07:34:59.796611 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:34:59 crc kubenswrapper[4760]: I0930 07:34:59.796630 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:34:59 crc kubenswrapper[4760]: I0930 07:34:59.796641 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:34:59Z","lastTransitionTime":"2025-09-30T07:34:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 07:34:59 crc kubenswrapper[4760]: I0930 07:34:59.900214 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:34:59 crc kubenswrapper[4760]: I0930 07:34:59.900287 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:34:59 crc kubenswrapper[4760]: I0930 07:34:59.900326 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:34:59 crc kubenswrapper[4760]: I0930 07:34:59.900347 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:34:59 crc kubenswrapper[4760]: I0930 07:34:59.900360 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:34:59Z","lastTransitionTime":"2025-09-30T07:34:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 07:35:00 crc kubenswrapper[4760]: I0930 07:35:00.003060 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:35:00 crc kubenswrapper[4760]: I0930 07:35:00.003173 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:35:00 crc kubenswrapper[4760]: I0930 07:35:00.003187 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:35:00 crc kubenswrapper[4760]: I0930 07:35:00.003208 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:35:00 crc kubenswrapper[4760]: I0930 07:35:00.003221 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:35:00Z","lastTransitionTime":"2025-09-30T07:35:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 07:35:00 crc kubenswrapper[4760]: I0930 07:35:00.066721 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 07:35:00 crc kubenswrapper[4760]: I0930 07:35:00.066964 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 07:35:00 crc kubenswrapper[4760]: E0930 07:35:00.067142 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 07:35:00 crc kubenswrapper[4760]: E0930 07:35:00.067559 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 07:35:00 crc kubenswrapper[4760]: I0930 07:35:00.106114 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:35:00 crc kubenswrapper[4760]: I0930 07:35:00.106201 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:35:00 crc kubenswrapper[4760]: I0930 07:35:00.106220 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:35:00 crc kubenswrapper[4760]: I0930 07:35:00.106245 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:35:00 crc kubenswrapper[4760]: I0930 07:35:00.106263 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:35:00Z","lastTransitionTime":"2025-09-30T07:35:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 07:35:00 crc kubenswrapper[4760]: I0930 07:35:00.209465 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:35:00 crc kubenswrapper[4760]: I0930 07:35:00.209521 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:35:00 crc kubenswrapper[4760]: I0930 07:35:00.209538 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:35:00 crc kubenswrapper[4760]: I0930 07:35:00.209564 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:35:00 crc kubenswrapper[4760]: I0930 07:35:00.209582 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:35:00Z","lastTransitionTime":"2025-09-30T07:35:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 07:35:00 crc kubenswrapper[4760]: I0930 07:35:00.313106 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:35:00 crc kubenswrapper[4760]: I0930 07:35:00.313201 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:35:00 crc kubenswrapper[4760]: I0930 07:35:00.313227 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:35:00 crc kubenswrapper[4760]: I0930 07:35:00.313257 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:35:00 crc kubenswrapper[4760]: I0930 07:35:00.313280 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:35:00Z","lastTransitionTime":"2025-09-30T07:35:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 07:35:00 crc kubenswrapper[4760]: I0930 07:35:00.417409 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:35:00 crc kubenswrapper[4760]: I0930 07:35:00.417472 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:35:00 crc kubenswrapper[4760]: I0930 07:35:00.417489 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:35:00 crc kubenswrapper[4760]: I0930 07:35:00.417517 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:35:00 crc kubenswrapper[4760]: I0930 07:35:00.417534 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:35:00Z","lastTransitionTime":"2025-09-30T07:35:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 07:35:00 crc kubenswrapper[4760]: I0930 07:35:00.522202 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:35:00 crc kubenswrapper[4760]: I0930 07:35:00.522380 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:35:00 crc kubenswrapper[4760]: I0930 07:35:00.522399 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:35:00 crc kubenswrapper[4760]: I0930 07:35:00.522815 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:35:00 crc kubenswrapper[4760]: I0930 07:35:00.522842 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:35:00Z","lastTransitionTime":"2025-09-30T07:35:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 07:35:00 crc kubenswrapper[4760]: I0930 07:35:00.626696 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:35:00 crc kubenswrapper[4760]: I0930 07:35:00.626764 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:35:00 crc kubenswrapper[4760]: I0930 07:35:00.626781 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:35:00 crc kubenswrapper[4760]: I0930 07:35:00.626807 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:35:00 crc kubenswrapper[4760]: I0930 07:35:00.626826 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:35:00Z","lastTransitionTime":"2025-09-30T07:35:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 07:35:00 crc kubenswrapper[4760]: I0930 07:35:00.730557 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:35:00 crc kubenswrapper[4760]: I0930 07:35:00.730710 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:35:00 crc kubenswrapper[4760]: I0930 07:35:00.730737 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:35:00 crc kubenswrapper[4760]: I0930 07:35:00.730770 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:35:00 crc kubenswrapper[4760]: I0930 07:35:00.730791 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:35:00Z","lastTransitionTime":"2025-09-30T07:35:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 07:35:00 crc kubenswrapper[4760]: I0930 07:35:00.834288 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:35:00 crc kubenswrapper[4760]: I0930 07:35:00.834401 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:35:00 crc kubenswrapper[4760]: I0930 07:35:00.834423 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:35:00 crc kubenswrapper[4760]: I0930 07:35:00.834449 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:35:00 crc kubenswrapper[4760]: I0930 07:35:00.834471 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:35:00Z","lastTransitionTime":"2025-09-30T07:35:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 07:35:00 crc kubenswrapper[4760]: I0930 07:35:00.937702 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:35:00 crc kubenswrapper[4760]: I0930 07:35:00.937763 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:35:00 crc kubenswrapper[4760]: I0930 07:35:00.937780 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:35:00 crc kubenswrapper[4760]: I0930 07:35:00.937805 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:35:00 crc kubenswrapper[4760]: I0930 07:35:00.937823 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:35:00Z","lastTransitionTime":"2025-09-30T07:35:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 07:35:01 crc kubenswrapper[4760]: I0930 07:35:01.041376 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:35:01 crc kubenswrapper[4760]: I0930 07:35:01.041447 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:35:01 crc kubenswrapper[4760]: I0930 07:35:01.041467 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:35:01 crc kubenswrapper[4760]: I0930 07:35:01.041493 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:35:01 crc kubenswrapper[4760]: I0930 07:35:01.041511 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:35:01Z","lastTransitionTime":"2025-09-30T07:35:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 07:35:01 crc kubenswrapper[4760]: I0930 07:35:01.066404 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 07:35:01 crc kubenswrapper[4760]: I0930 07:35:01.066418 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wv8fz" Sep 30 07:35:01 crc kubenswrapper[4760]: E0930 07:35:01.066626 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 07:35:01 crc kubenswrapper[4760]: E0930 07:35:01.066921 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wv8fz" podUID="ce6dcf25-c8ea-450b-9fc6-9f8aeafde757" Sep 30 07:35:01 crc kubenswrapper[4760]: I0930 07:35:01.144179 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:35:01 crc kubenswrapper[4760]: I0930 07:35:01.144352 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:35:01 crc kubenswrapper[4760]: I0930 07:35:01.144384 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:35:01 crc kubenswrapper[4760]: I0930 07:35:01.144415 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:35:01 crc kubenswrapper[4760]: I0930 07:35:01.144477 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:35:01Z","lastTransitionTime":"2025-09-30T07:35:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 07:35:01 crc kubenswrapper[4760]: I0930 07:35:01.247436 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:35:01 crc kubenswrapper[4760]: I0930 07:35:01.247510 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:35:01 crc kubenswrapper[4760]: I0930 07:35:01.247531 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:35:01 crc kubenswrapper[4760]: I0930 07:35:01.247557 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:35:01 crc kubenswrapper[4760]: I0930 07:35:01.247575 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:35:01Z","lastTransitionTime":"2025-09-30T07:35:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 07:35:01 crc kubenswrapper[4760]: I0930 07:35:01.351569 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:35:01 crc kubenswrapper[4760]: I0930 07:35:01.351631 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:35:01 crc kubenswrapper[4760]: I0930 07:35:01.351655 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:35:01 crc kubenswrapper[4760]: I0930 07:35:01.351686 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:35:01 crc kubenswrapper[4760]: I0930 07:35:01.351709 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:35:01Z","lastTransitionTime":"2025-09-30T07:35:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 07:35:01 crc kubenswrapper[4760]: I0930 07:35:01.454716 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:35:01 crc kubenswrapper[4760]: I0930 07:35:01.454778 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:35:01 crc kubenswrapper[4760]: I0930 07:35:01.454795 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:35:01 crc kubenswrapper[4760]: I0930 07:35:01.454821 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:35:01 crc kubenswrapper[4760]: I0930 07:35:01.454839 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:35:01Z","lastTransitionTime":"2025-09-30T07:35:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 07:35:01 crc kubenswrapper[4760]: I0930 07:35:01.557547 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:35:01 crc kubenswrapper[4760]: I0930 07:35:01.557624 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:35:01 crc kubenswrapper[4760]: I0930 07:35:01.557647 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:35:01 crc kubenswrapper[4760]: I0930 07:35:01.557676 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:35:01 crc kubenswrapper[4760]: I0930 07:35:01.557699 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:35:01Z","lastTransitionTime":"2025-09-30T07:35:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 07:35:01 crc kubenswrapper[4760]: I0930 07:35:01.660687 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:35:01 crc kubenswrapper[4760]: I0930 07:35:01.660892 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:35:01 crc kubenswrapper[4760]: I0930 07:35:01.660927 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:35:01 crc kubenswrapper[4760]: I0930 07:35:01.660958 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:35:01 crc kubenswrapper[4760]: I0930 07:35:01.660993 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:35:01Z","lastTransitionTime":"2025-09-30T07:35:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 07:35:01 crc kubenswrapper[4760]: I0930 07:35:01.763727 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:35:01 crc kubenswrapper[4760]: I0930 07:35:01.763805 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:35:01 crc kubenswrapper[4760]: I0930 07:35:01.763840 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:35:01 crc kubenswrapper[4760]: I0930 07:35:01.763869 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:35:01 crc kubenswrapper[4760]: I0930 07:35:01.763891 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:35:01Z","lastTransitionTime":"2025-09-30T07:35:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 07:35:01 crc kubenswrapper[4760]: I0930 07:35:01.866327 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:35:01 crc kubenswrapper[4760]: I0930 07:35:01.866381 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:35:01 crc kubenswrapper[4760]: I0930 07:35:01.866393 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:35:01 crc kubenswrapper[4760]: I0930 07:35:01.866414 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:35:01 crc kubenswrapper[4760]: I0930 07:35:01.866426 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:35:01Z","lastTransitionTime":"2025-09-30T07:35:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 07:35:01 crc kubenswrapper[4760]: I0930 07:35:01.969929 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:35:01 crc kubenswrapper[4760]: I0930 07:35:01.969980 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:35:01 crc kubenswrapper[4760]: I0930 07:35:01.969997 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:35:01 crc kubenswrapper[4760]: I0930 07:35:01.970020 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:35:01 crc kubenswrapper[4760]: I0930 07:35:01.970037 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:35:01Z","lastTransitionTime":"2025-09-30T07:35:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 07:35:02 crc kubenswrapper[4760]: I0930 07:35:02.066619 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 07:35:02 crc kubenswrapper[4760]: I0930 07:35:02.066622 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 07:35:02 crc kubenswrapper[4760]: E0930 07:35:02.066899 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 07:35:02 crc kubenswrapper[4760]: E0930 07:35:02.067068 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 07:35:02 crc kubenswrapper[4760]: I0930 07:35:02.073363 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:35:02 crc kubenswrapper[4760]: I0930 07:35:02.073432 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:35:02 crc kubenswrapper[4760]: I0930 07:35:02.073455 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:35:02 crc kubenswrapper[4760]: I0930 07:35:02.073483 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:35:02 crc kubenswrapper[4760]: I0930 07:35:02.073506 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:35:02Z","lastTransitionTime":"2025-09-30T07:35:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 07:35:02 crc kubenswrapper[4760]: I0930 07:35:02.176289 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:35:02 crc kubenswrapper[4760]: I0930 07:35:02.177119 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:35:02 crc kubenswrapper[4760]: I0930 07:35:02.177221 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:35:02 crc kubenswrapper[4760]: I0930 07:35:02.177362 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:35:02 crc kubenswrapper[4760]: I0930 07:35:02.177473 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:35:02Z","lastTransitionTime":"2025-09-30T07:35:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 07:35:02 crc kubenswrapper[4760]: I0930 07:35:02.280721 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:35:02 crc kubenswrapper[4760]: I0930 07:35:02.280796 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:35:02 crc kubenswrapper[4760]: I0930 07:35:02.280819 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:35:02 crc kubenswrapper[4760]: I0930 07:35:02.280854 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:35:02 crc kubenswrapper[4760]: I0930 07:35:02.280878 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:35:02Z","lastTransitionTime":"2025-09-30T07:35:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 07:35:02 crc kubenswrapper[4760]: I0930 07:35:02.384172 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:35:02 crc kubenswrapper[4760]: I0930 07:35:02.384235 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:35:02 crc kubenswrapper[4760]: I0930 07:35:02.384252 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:35:02 crc kubenswrapper[4760]: I0930 07:35:02.384279 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:35:02 crc kubenswrapper[4760]: I0930 07:35:02.384325 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:35:02Z","lastTransitionTime":"2025-09-30T07:35:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 07:35:02 crc kubenswrapper[4760]: I0930 07:35:02.487832 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:35:02 crc kubenswrapper[4760]: I0930 07:35:02.487895 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:35:02 crc kubenswrapper[4760]: I0930 07:35:02.487912 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:35:02 crc kubenswrapper[4760]: I0930 07:35:02.487940 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:35:02 crc kubenswrapper[4760]: I0930 07:35:02.487958 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:35:02Z","lastTransitionTime":"2025-09-30T07:35:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 07:35:02 crc kubenswrapper[4760]: I0930 07:35:02.591004 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:35:02 crc kubenswrapper[4760]: I0930 07:35:02.591109 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:35:02 crc kubenswrapper[4760]: I0930 07:35:02.591132 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:35:02 crc kubenswrapper[4760]: I0930 07:35:02.591200 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:35:02 crc kubenswrapper[4760]: I0930 07:35:02.591222 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:35:02Z","lastTransitionTime":"2025-09-30T07:35:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 07:35:02 crc kubenswrapper[4760]: I0930 07:35:02.694430 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:35:02 crc kubenswrapper[4760]: I0930 07:35:02.694531 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:35:02 crc kubenswrapper[4760]: I0930 07:35:02.694558 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:35:02 crc kubenswrapper[4760]: I0930 07:35:02.694626 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:35:02 crc kubenswrapper[4760]: I0930 07:35:02.694652 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:35:02Z","lastTransitionTime":"2025-09-30T07:35:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 07:35:02 crc kubenswrapper[4760]: I0930 07:35:02.797989 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:35:02 crc kubenswrapper[4760]: I0930 07:35:02.798094 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:35:02 crc kubenswrapper[4760]: I0930 07:35:02.798121 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:35:02 crc kubenswrapper[4760]: I0930 07:35:02.798190 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:35:02 crc kubenswrapper[4760]: I0930 07:35:02.798214 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:35:02Z","lastTransitionTime":"2025-09-30T07:35:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 07:35:02 crc kubenswrapper[4760]: I0930 07:35:02.901741 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:35:02 crc kubenswrapper[4760]: I0930 07:35:02.901805 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:35:02 crc kubenswrapper[4760]: I0930 07:35:02.901822 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:35:02 crc kubenswrapper[4760]: I0930 07:35:02.901850 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:35:02 crc kubenswrapper[4760]: I0930 07:35:02.901876 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:35:02Z","lastTransitionTime":"2025-09-30T07:35:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 07:35:03 crc kubenswrapper[4760]: I0930 07:35:03.006084 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:35:03 crc kubenswrapper[4760]: I0930 07:35:03.006176 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:35:03 crc kubenswrapper[4760]: I0930 07:35:03.006217 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:35:03 crc kubenswrapper[4760]: I0930 07:35:03.006253 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:35:03 crc kubenswrapper[4760]: I0930 07:35:03.006296 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:35:03Z","lastTransitionTime":"2025-09-30T07:35:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 07:35:03 crc kubenswrapper[4760]: I0930 07:35:03.066117 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wv8fz" Sep 30 07:35:03 crc kubenswrapper[4760]: E0930 07:35:03.066260 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wv8fz" podUID="ce6dcf25-c8ea-450b-9fc6-9f8aeafde757" Sep 30 07:35:03 crc kubenswrapper[4760]: I0930 07:35:03.066971 4760 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 07:35:03 crc kubenswrapper[4760]: I0930 07:35:03.067273 4760 scope.go:117] "RemoveContainer" containerID="9003b7a7d50bae8b7464fd7cbf7f18dcc7a60af1c5297a18e2f9f8482d161912" Sep 30 07:35:03 crc kubenswrapper[4760]: E0930 07:35:03.067785 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-sspvl_openshift-ovn-kubernetes(2c4ca8ea-a714-40e5-9e10-080aef32237b)\"" pod="openshift-ovn-kubernetes/ovnkube-node-sspvl" podUID="2c4ca8ea-a714-40e5-9e10-080aef32237b" Sep 30 07:35:03 crc kubenswrapper[4760]: E0930 07:35:03.067765 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 07:35:03 crc kubenswrapper[4760]: I0930 07:35:03.109705 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:35:03 crc kubenswrapper[4760]: I0930 07:35:03.110032 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:35:03 crc kubenswrapper[4760]: I0930 07:35:03.110203 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:35:03 crc kubenswrapper[4760]: I0930 07:35:03.110450 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:35:03 crc kubenswrapper[4760]: I0930 07:35:03.111133 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:35:03Z","lastTransitionTime":"2025-09-30T07:35:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 07:35:03 crc kubenswrapper[4760]: I0930 07:35:03.214868 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:35:03 crc kubenswrapper[4760]: I0930 07:35:03.214936 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:35:03 crc kubenswrapper[4760]: I0930 07:35:03.214955 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:35:03 crc kubenswrapper[4760]: I0930 07:35:03.214981 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:35:03 crc kubenswrapper[4760]: I0930 07:35:03.215000 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:35:03Z","lastTransitionTime":"2025-09-30T07:35:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 07:35:03 crc kubenswrapper[4760]: I0930 07:35:03.318370 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:35:03 crc kubenswrapper[4760]: I0930 07:35:03.318816 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:35:03 crc kubenswrapper[4760]: I0930 07:35:03.319093 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:35:03 crc kubenswrapper[4760]: I0930 07:35:03.319825 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:35:03 crc kubenswrapper[4760]: I0930 07:35:03.320227 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:35:03Z","lastTransitionTime":"2025-09-30T07:35:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 07:35:03 crc kubenswrapper[4760]: I0930 07:35:03.389457 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:35:03 crc kubenswrapper[4760]: I0930 07:35:03.389517 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:35:03 crc kubenswrapper[4760]: I0930 07:35:03.389535 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:35:03 crc kubenswrapper[4760]: I0930 07:35:03.389557 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:35:03 crc kubenswrapper[4760]: I0930 07:35:03.389573 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:35:03Z","lastTransitionTime":"2025-09-30T07:35:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 07:35:03 crc kubenswrapper[4760]: E0930 07:35:03.410897 4760 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T07:35:03Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T07:35:03Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T07:35:03Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T07:35:03Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T07:35:03Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T07:35:03Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T07:35:03Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T07:35:03Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"5c02496b-3bdb-4a64-91fc-57c59208ba25\\\",\\\"systemUUID\\\":\\\"ef0efcac-382e-4544-a77e-6adf149d5981\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:35:03Z is after 2025-08-24T17:21:41Z" Sep 30 07:35:03 crc kubenswrapper[4760]: I0930 07:35:03.417421 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:35:03 crc kubenswrapper[4760]: I0930 07:35:03.417832 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:35:03 crc kubenswrapper[4760]: I0930 07:35:03.418116 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:35:03 crc kubenswrapper[4760]: I0930 07:35:03.418413 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:35:03 crc kubenswrapper[4760]: I0930 07:35:03.418695 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:35:03Z","lastTransitionTime":"2025-09-30T07:35:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 07:35:03 crc kubenswrapper[4760]: E0930 07:35:03.441880 4760 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T07:35:03Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T07:35:03Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T07:35:03Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T07:35:03Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T07:35:03Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T07:35:03Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T07:35:03Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T07:35:03Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"5c02496b-3bdb-4a64-91fc-57c59208ba25\\\",\\\"systemUUID\\\":\\\"ef0efcac-382e-4544-a77e-6adf149d5981\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:35:03Z is after 2025-08-24T17:21:41Z" Sep 30 07:35:03 crc kubenswrapper[4760]: I0930 07:35:03.449820 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:35:03 crc kubenswrapper[4760]: I0930 07:35:03.449932 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:35:03 crc kubenswrapper[4760]: I0930 07:35:03.449994 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:35:03 crc kubenswrapper[4760]: I0930 07:35:03.450021 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:35:03 crc kubenswrapper[4760]: I0930 07:35:03.450080 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:35:03Z","lastTransitionTime":"2025-09-30T07:35:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 07:35:03 crc kubenswrapper[4760]: E0930 07:35:03.473730 4760 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T07:35:03Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T07:35:03Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T07:35:03Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T07:35:03Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T07:35:03Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T07:35:03Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T07:35:03Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T07:35:03Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"5c02496b-3bdb-4a64-91fc-57c59208ba25\\\",\\\"systemUUID\\\":\\\"ef0efcac-382e-4544-a77e-6adf149d5981\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:35:03Z is after 2025-08-24T17:21:41Z" Sep 30 07:35:03 crc kubenswrapper[4760]: I0930 07:35:03.479414 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:35:03 crc kubenswrapper[4760]: I0930 07:35:03.479485 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:35:03 crc kubenswrapper[4760]: I0930 07:35:03.479511 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:35:03 crc kubenswrapper[4760]: I0930 07:35:03.479545 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:35:03 crc kubenswrapper[4760]: I0930 07:35:03.479566 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:35:03Z","lastTransitionTime":"2025-09-30T07:35:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 07:35:03 crc kubenswrapper[4760]: E0930 07:35:03.501544 4760 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T07:35:03Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T07:35:03Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T07:35:03Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T07:35:03Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T07:35:03Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T07:35:03Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T07:35:03Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T07:35:03Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"5c02496b-3bdb-4a64-91fc-57c59208ba25\\\",\\\"systemUUID\\\":\\\"ef0efcac-382e-4544-a77e-6adf149d5981\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:35:03Z is after 2025-08-24T17:21:41Z" Sep 30 07:35:03 crc kubenswrapper[4760]: I0930 07:35:03.506824 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:35:03 crc kubenswrapper[4760]: I0930 07:35:03.506877 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:35:03 crc kubenswrapper[4760]: I0930 07:35:03.506897 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:35:03 crc kubenswrapper[4760]: I0930 07:35:03.506921 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:35:03 crc kubenswrapper[4760]: I0930 07:35:03.506939 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:35:03Z","lastTransitionTime":"2025-09-30T07:35:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 07:35:03 crc kubenswrapper[4760]: E0930 07:35:03.530745 4760 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T07:35:03Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T07:35:03Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T07:35:03Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T07:35:03Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T07:35:03Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T07:35:03Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T07:35:03Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T07:35:03Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"5c02496b-3bdb-4a64-91fc-57c59208ba25\\\",\\\"systemUUID\\\":\\\"ef0efcac-382e-4544-a77e-6adf149d5981\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:35:03Z is after 2025-08-24T17:21:41Z" Sep 30 07:35:03 crc kubenswrapper[4760]: E0930 07:35:03.531378 4760 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Sep 30 07:35:03 crc kubenswrapper[4760]: I0930 07:35:03.534925 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:35:03 crc kubenswrapper[4760]: I0930 07:35:03.534986 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:35:03 crc kubenswrapper[4760]: I0930 07:35:03.535005 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:35:03 crc kubenswrapper[4760]: I0930 07:35:03.535036 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:35:03 crc kubenswrapper[4760]: I0930 07:35:03.535058 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:35:03Z","lastTransitionTime":"2025-09-30T07:35:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 07:35:03 crc kubenswrapper[4760]: I0930 07:35:03.639458 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:35:03 crc kubenswrapper[4760]: I0930 07:35:03.639540 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:35:03 crc kubenswrapper[4760]: I0930 07:35:03.639587 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:35:03 crc kubenswrapper[4760]: I0930 07:35:03.639632 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:35:03 crc kubenswrapper[4760]: I0930 07:35:03.639662 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:35:03Z","lastTransitionTime":"2025-09-30T07:35:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 07:35:03 crc kubenswrapper[4760]: I0930 07:35:03.743746 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:35:03 crc kubenswrapper[4760]: I0930 07:35:03.743849 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:35:03 crc kubenswrapper[4760]: I0930 07:35:03.743877 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:35:03 crc kubenswrapper[4760]: I0930 07:35:03.743913 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:35:03 crc kubenswrapper[4760]: I0930 07:35:03.743941 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:35:03Z","lastTransitionTime":"2025-09-30T07:35:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 07:35:03 crc kubenswrapper[4760]: I0930 07:35:03.847552 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:35:03 crc kubenswrapper[4760]: I0930 07:35:03.847654 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:35:03 crc kubenswrapper[4760]: I0930 07:35:03.847677 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:35:03 crc kubenswrapper[4760]: I0930 07:35:03.847745 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:35:03 crc kubenswrapper[4760]: I0930 07:35:03.847765 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:35:03Z","lastTransitionTime":"2025-09-30T07:35:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 07:35:03 crc kubenswrapper[4760]: I0930 07:35:03.952014 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:35:03 crc kubenswrapper[4760]: I0930 07:35:03.952087 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:35:03 crc kubenswrapper[4760]: I0930 07:35:03.952104 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:35:03 crc kubenswrapper[4760]: I0930 07:35:03.952128 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:35:03 crc kubenswrapper[4760]: I0930 07:35:03.952146 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:35:03Z","lastTransitionTime":"2025-09-30T07:35:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 07:35:04 crc kubenswrapper[4760]: I0930 07:35:04.055744 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:35:04 crc kubenswrapper[4760]: I0930 07:35:04.055808 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:35:04 crc kubenswrapper[4760]: I0930 07:35:04.055827 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:35:04 crc kubenswrapper[4760]: I0930 07:35:04.055849 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:35:04 crc kubenswrapper[4760]: I0930 07:35:04.055867 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:35:04Z","lastTransitionTime":"2025-09-30T07:35:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 07:35:04 crc kubenswrapper[4760]: I0930 07:35:04.066132 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 07:35:04 crc kubenswrapper[4760]: I0930 07:35:04.066132 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 07:35:04 crc kubenswrapper[4760]: E0930 07:35:04.066298 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 07:35:04 crc kubenswrapper[4760]: E0930 07:35:04.066436 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 07:35:04 crc kubenswrapper[4760]: I0930 07:35:04.158628 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:35:04 crc kubenswrapper[4760]: I0930 07:35:04.158706 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:35:04 crc kubenswrapper[4760]: I0930 07:35:04.158729 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:35:04 crc kubenswrapper[4760]: I0930 07:35:04.158759 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:35:04 crc kubenswrapper[4760]: I0930 07:35:04.158788 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:35:04Z","lastTransitionTime":"2025-09-30T07:35:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 07:35:04 crc kubenswrapper[4760]: I0930 07:35:04.263252 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:35:04 crc kubenswrapper[4760]: I0930 07:35:04.263381 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:35:04 crc kubenswrapper[4760]: I0930 07:35:04.263400 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:35:04 crc kubenswrapper[4760]: I0930 07:35:04.263428 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:35:04 crc kubenswrapper[4760]: I0930 07:35:04.263442 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:35:04Z","lastTransitionTime":"2025-09-30T07:35:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 07:35:04 crc kubenswrapper[4760]: I0930 07:35:04.366107 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:35:04 crc kubenswrapper[4760]: I0930 07:35:04.366177 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:35:04 crc kubenswrapper[4760]: I0930 07:35:04.366196 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:35:04 crc kubenswrapper[4760]: I0930 07:35:04.366221 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:35:04 crc kubenswrapper[4760]: I0930 07:35:04.366240 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:35:04Z","lastTransitionTime":"2025-09-30T07:35:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 07:35:04 crc kubenswrapper[4760]: I0930 07:35:04.469423 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:35:04 crc kubenswrapper[4760]: I0930 07:35:04.469502 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:35:04 crc kubenswrapper[4760]: I0930 07:35:04.469534 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:35:04 crc kubenswrapper[4760]: I0930 07:35:04.469558 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:35:04 crc kubenswrapper[4760]: I0930 07:35:04.469570 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:35:04Z","lastTransitionTime":"2025-09-30T07:35:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 07:35:04 crc kubenswrapper[4760]: I0930 07:35:04.573182 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:35:04 crc kubenswrapper[4760]: I0930 07:35:04.573247 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:35:04 crc kubenswrapper[4760]: I0930 07:35:04.573266 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:35:04 crc kubenswrapper[4760]: I0930 07:35:04.573292 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:35:04 crc kubenswrapper[4760]: I0930 07:35:04.573347 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:35:04Z","lastTransitionTime":"2025-09-30T07:35:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 07:35:04 crc kubenswrapper[4760]: I0930 07:35:04.676581 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:35:04 crc kubenswrapper[4760]: I0930 07:35:04.676648 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:35:04 crc kubenswrapper[4760]: I0930 07:35:04.676665 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:35:04 crc kubenswrapper[4760]: I0930 07:35:04.676690 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:35:04 crc kubenswrapper[4760]: I0930 07:35:04.676709 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:35:04Z","lastTransitionTime":"2025-09-30T07:35:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 07:35:04 crc kubenswrapper[4760]: I0930 07:35:04.780043 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:35:04 crc kubenswrapper[4760]: I0930 07:35:04.780138 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:35:04 crc kubenswrapper[4760]: I0930 07:35:04.780170 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:35:04 crc kubenswrapper[4760]: I0930 07:35:04.780206 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:35:04 crc kubenswrapper[4760]: I0930 07:35:04.780234 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:35:04Z","lastTransitionTime":"2025-09-30T07:35:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 07:35:04 crc kubenswrapper[4760]: I0930 07:35:04.883016 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:35:04 crc kubenswrapper[4760]: I0930 07:35:04.883084 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:35:04 crc kubenswrapper[4760]: I0930 07:35:04.883102 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:35:04 crc kubenswrapper[4760]: I0930 07:35:04.883128 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:35:04 crc kubenswrapper[4760]: I0930 07:35:04.883148 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:35:04Z","lastTransitionTime":"2025-09-30T07:35:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 07:35:04 crc kubenswrapper[4760]: I0930 07:35:04.986590 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:35:04 crc kubenswrapper[4760]: I0930 07:35:04.987225 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:35:04 crc kubenswrapper[4760]: I0930 07:35:04.987417 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:35:04 crc kubenswrapper[4760]: I0930 07:35:04.987585 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:35:04 crc kubenswrapper[4760]: I0930 07:35:04.987735 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:35:04Z","lastTransitionTime":"2025-09-30T07:35:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 07:35:05 crc kubenswrapper[4760]: I0930 07:35:05.066391 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wv8fz" Sep 30 07:35:05 crc kubenswrapper[4760]: E0930 07:35:05.067754 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wv8fz" podUID="ce6dcf25-c8ea-450b-9fc6-9f8aeafde757" Sep 30 07:35:05 crc kubenswrapper[4760]: I0930 07:35:05.066514 4760 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 07:35:05 crc kubenswrapper[4760]: E0930 07:35:05.067962 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 07:35:05 crc kubenswrapper[4760]: I0930 07:35:05.087922 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:59Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:59Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fd70ee9ea129c67bb1a9ffac6e1d2abc903949ccd6d2d549f5d78472813dd222\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\
"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:35:05Z is after 2025-08-24T17:21:41Z" Sep 30 07:35:05 crc kubenswrapper[4760]: I0930 07:35:05.090555 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:35:05 crc kubenswrapper[4760]: I0930 07:35:05.090808 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:35:05 crc kubenswrapper[4760]: I0930 07:35:05.090839 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:35:05 crc kubenswrapper[4760]: I0930 07:35:05.090865 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:35:05 crc kubenswrapper[4760]: I0930 07:35:05.090888 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:35:05Z","lastTransitionTime":"2025-09-30T07:35:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in 
/etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 07:35:05 crc kubenswrapper[4760]: I0930 07:35:05.108349 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-f2lrk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7a9c8270-6964-4886-87d0-227b05b76da4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e3cf6cd0daf5df3e48c5d44b1a443b1e5646009b9eba42bc3869ee9841e472d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\
\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cctgt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://90d669a96085f6fb9b5c9507ff1d5c960bdeafa1d01d7c115b89788e14e155d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cctgt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T07:33:56Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-f2lrk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:35:05Z is after 2025-08-24T17:21:41Z" Sep 30 07:35:05 crc kubenswrapper[4760]: I0930 07:35:05.131380 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-lvdpk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f50c364e-d22c-4fe5-a0aa-66f4e8d8b21e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:34:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:34:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0b2f3cfbeb8083c685469a0d988253e1fd9c2403954dda3cc742b87225c82927\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://96fd613333f4911d54aca94a38724570ef878c3f55ef48caa79eb1c14d0b2014\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-09-30T07:34:43Z\\\",\\\"message\\\":\\\"2025-09-30T07:33:58+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_f7972461-b57d-4fae-b02a-2e738164e9c7\\\\n2025-09-30T07:33:58+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_f7972461-b57d-4fae-b02a-2e738164e9c7 to /host/opt/cni/bin/\\\\n2025-09-30T07:33:58Z [verbose] multus-daemon started\\\\n2025-09-30T07:33:58Z [verbose] 
Readiness Indicator file check\\\\n2025-09-30T07:34:43Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T07:33:57Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:34:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8g8kp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T07:33:56Z\\\"}}\" for pod \"openshift-multus\"/\"multus-lvdpk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:35:05Z is after 2025-08-24T17:21:41Z" Sep 30 07:35:05 crc kubenswrapper[4760]: I0930 07:35:05.150931 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-rpvhp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a8530550-438d-46a5-aa3f-4b10838396f1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:34:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:34:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:34:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0434ec52328c8ee0a
2b28fb47833cc6cf12a7048f4c4a217518e029e616fb6a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7j7qm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T07:33:58Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-rpvhp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:35:05Z is after 2025-08-24T17:21:41Z" Sep 30 07:35:05 crc kubenswrapper[4760]: I0930 07:35:05.185809 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9092fdd9-f142-400d-b446-47b8103b9694\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4cc3328dfae8f6d99c152e6a53b86d72f690afbd7bc4847810f745b321e4f3ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5aaa898c76100f011c3ad0ba61c9735a940b46b0e770f3ede7fc4c881a4722a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b718c4a73fd119f66dd346618976f0b3589b789576e53d367588c2772becd66e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d00b9a56c23ba3b5701a31412e038c073b5f16eb4e6ebea42134f7d4c0b0f21\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://00d03c8ff6fa9f490ec0550ac57d20786f1f5c10d91011d678116c2249a79e88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ed4236004989a888f1f864535bedd2dc4bcad4c51c87f12be074abbe5f8fe8bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ed4236004989a888f1f864535bedd2dc4bcad4c51c87f12be074abbe5f8fe8bd\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-09-30T07:33:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T07:33:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://553fa289f27580c1c38da1e9ea12ae7231fc4f60c660a9a3a177391adba9c3f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://553fa289f27580c1c38da1e9ea12ae7231fc4f60c660a9a3a177391adba9c3f2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T07:33:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T07:33:37Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://83b9ee5e77d60387298ccac46ad416a5d56bfd1d81a6ef7f3ea0482520c2ca08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://83b9ee5e77d60387298ccac46ad416a5d56bfd1d81a6ef7f3ea0482520c2ca08\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T07:33:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T07:33:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T07:33:35Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:35:05Z is after 2025-08-24T17:21:41Z" Sep 30 07:35:05 crc kubenswrapper[4760]: I0930 07:35:05.192822 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:35:05 crc kubenswrapper[4760]: I0930 07:35:05.192860 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:35:05 crc kubenswrapper[4760]: I0930 07:35:05.192876 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:35:05 crc kubenswrapper[4760]: I0930 07:35:05.192896 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:35:05 crc kubenswrapper[4760]: I0930 07:35:05.192909 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:35:05Z","lastTransitionTime":"2025-09-30T07:35:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 07:35:05 crc kubenswrapper[4760]: I0930 07:35:05.212701 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"031672b4-4779-4226-97aa-f5c81a729234\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b3e26d92b5602604d33e83e83e677084c642df64c9d0b9f9e022b036091fb92e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-d
ir\\\"}]},{\\\"containerID\\\":\\\"cri-o://53390cb3cc86d0db1019c461bf6cb987190e6fa8812f735844028dfa1ce96648\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf87f93832356c96c1c5e90e6459ccdd23700fd12c6d7410305c66f99f6ed18f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e4230b2df5bf87138f5a40f146bef562ae63ec6f5b73d023e4e2b60722854882\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945
c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9c416bd4523f4384b4a2283eb2ada263982380925cbc17713726e0790e5fdee\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:38Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2cd9c18868b27fcfb67186a86f415b484f009c33c07d18f5d508155ed011b5d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2cd9c18868b27fcfb67186a86f415b484f009c33c07d18f5d508155ed011b5d4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T07:33:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T07:33:36Z\\\"}
},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T07:33:35Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:35:05Z is after 2025-08-24T17:21:41Z" Sep 30 07:35:05 crc kubenswrapper[4760]: I0930 07:35:05.230532 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"67985d3d-0af6-4a8b-b45a-623aed2e502e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43b97af91f474a3b43f207d377ca205e459b782e552a32a134a8b46e89fc8d30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85ae
d21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d748511816054d5f621bd55e203baea2e6e5ac90dec796364b46224308cfd78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bf1b2ce86bf47ce917a20b48f64b7c4419e3a5c551d006d97171311c3b190f0c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"star
tedAt\\\":\\\"2025-09-30T07:33:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e7c9b2cd9f096d99a8b239c0c2dc771e94e6522a3380733b7452969c9fe2dfd2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T07:33:35Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:35:05Z is after 2025-08-24T17:21:41Z" Sep 30 07:35:05 crc kubenswrapper[4760]: I0930 07:35:05.251594 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to 
patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c5ac421ed3707970914cf165f54a9324134079296d1b6936332c6047b9c23132\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify 
certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:35:05Z is after 2025-08-24T17:21:41Z" Sep 30 07:35:05 crc kubenswrapper[4760]: I0930 07:35:05.266613 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:53Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:35:05Z is after 2025-08-24T17:21:41Z" Sep 30 07:35:05 crc kubenswrapper[4760]: I0930 07:35:05.281830 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:35:05Z is after 2025-08-24T17:21:41Z" Sep 30 07:35:05 crc kubenswrapper[4760]: I0930 07:35:05.296254 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:35:05 crc kubenswrapper[4760]: I0930 07:35:05.296353 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:35:05 crc kubenswrapper[4760]: I0930 07:35:05.296372 4760 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:35:05 crc kubenswrapper[4760]: I0930 07:35:05.296396 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:35:05 crc kubenswrapper[4760]: I0930 07:35:05.296413 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:35:05Z","lastTransitionTime":"2025-09-30T07:35:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 07:35:05 crc kubenswrapper[4760]: I0930 07:35:05.306745 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-6vfjl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"aade7c8e-aa34-4b19-9000-d724950a70d7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:34:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:34:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:34:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8c7192e617554ba6b5b9
533792f166344fbf9681b63e57c180809e721cc18168\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:34:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-df7r7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e9c7c74e636ef9dbade2541863f96b1098f7ae5f1ab02740c8934f8c87c4815e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e9c7c74e636ef9dbade2541863f96b1098f7ae5f1ab02740c8934f8c87c4815e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T07:33:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T07:33:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled
\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-df7r7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7abb316d62bc2246ace40ad59b2926962a1c098c5690adea8c20a5b96459a604\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7abb316d62bc2246ace40ad59b2926962a1c098c5690adea8c20a5b96459a604\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T07:33:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T07:33:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-df7r7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://96c45841896c0c1569ad3071a3114e8c4965cc3968244c91719bcf5df92e5d0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64
d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://96c45841896c0c1569ad3071a3114e8c4965cc3968244c91719bcf5df92e5d0f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T07:33:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T07:33:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-df7r7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://db2220320618b4dcdbf56bad96dbe01c910fa37948f97792f498b3336212cc7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://db2220320618b4dcdbf56bad96dbe01c910fa37948f97792f498b3336212cc7f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T07:34:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T07:34:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-df7r7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5980d5ebcac5351bfa8c26b90968593f3fae3d434e824db5a182449ef64b0a13\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5980d5ebcac5351bfa8c26b90968593f3fae3d434e824db5a182449ef64b0a13\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T07:34:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T07:34:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-df7r7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5912656a198607c9831da9564b5507d662eb2df00df104d52cf058442a76be6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\
\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5912656a198607c9831da9564b5507d662eb2df00df104d52cf058442a76be6f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T07:34:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T07:34:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-df7r7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T07:33:56Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-6vfjl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:35:05Z is after 2025-08-24T17:21:41Z" Sep 30 07:35:05 crc kubenswrapper[4760]: I0930 07:35:05.325603 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"943219e9-5457-4767-b29c-cdd155ca3cb1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:34:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:34:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d9171c7e60156e3ffabb5954ff097de7c4cb0629967cdddb44f16bf86ae38c0a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f75ced5ba38de06feef6d8addd42a76b6b14cab6a67a351432fdb580eb0ca3c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5ead42377ef36b0c38401509b29432dd9b6fdf73cc39f2dbea0930415492a4ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2f5de75ed9124ec7a6cd0d20862001c06327d05876de38fdbb1270a9e9e8df54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://2f5de75ed9124ec7a6cd0d20862001c06327d05876de38fdbb1270a9e9e8df54\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T07:33:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T07:33:36Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T07:33:35Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:35:05Z is after 2025-08-24T17:21:41Z" Sep 30 07:35:05 crc kubenswrapper[4760]: I0930 07:35:05.361854 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-sspvl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2c4ca8ea-a714-40e5-9e10-080aef32237b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:56Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:56Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04cb9ff2859a579a0bd97d932177b8c19b786666d0177669e93ccec38abf08d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lllfc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9e64e8566880e723a6cc19ad788bd1cbb02eb639397aea4acd6e9920b034f379\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lllfc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0b6c1fe1d21cc0d118e36d7f9d62fb639a19a42cd8c953700bcd5c289088cdf9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lllfc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bba81e98fd1a75081b7e6a3eaf28b88b4de6e47e1dc158ca72aa2f1c2107933e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lllfc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://45d8c685d7b690ce81553e4d924934cf761087dd535145c0e8b5a933f2d61e45\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lllfc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef1f6b6ca1a2bbb2f9a166a7244c6411470be4655bc6b2a330fd604c3449d185\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lllfc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9003b7a7d50bae8b7464fd7cbf7f18dcc7a60af1c5297a18e2f9f8482d161912\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9003b7a7d50bae8b7464fd7cbf7f18dcc7a60af1c5297a18e2f9f8482d161912\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-09-30T07:34:49Z\\\",\\\"message\\\":\\\"rvice.beta.openshift.io/serving-cert-signed-by:openshift-service-serving-signer@1740288168
] [{config.openshift.io/v1 ClusterVersion version 9101b518-476b-4eea-8fa6-69b0534e5caa 0xc0074eb62b \\\\u003cnil\\\\u003e}] [] []},Spec:ServiceSpec{Ports:[]ServicePort{ServicePort{Name:metrics,Protocol:TCP,Port:8443,TargetPort:{1 0 metrics},NodePort:0,AppProtocol:nil,},},Selector:map[string]string{app: package-server-manager,},ClusterIP:10.217.4.110,Type:ClusterIP,ExternalIPs:[],SessionAffinity:None,LoadBalancerIP:,LoadBalancerSourceRanges:[],ExternalName:,ExternalTrafficPolicy:,HealthCheckNodePort:0,PublishNotReadyAddresses:false,SessionAffinityConfig:nil,IPFamilyPolicy:*SingleStack,ClusterIPs:[10.217.4.110],IPFamilies:[IPv4],AllocateLoadBalancerNodePorts:nil,LoadBalancerClass:nil,InternalTrafficPolicy:*Cluster,TrafficDistribution:nil,},Status:ServiceStatus{LoadBalancer:LoadBalancerStatus{Ingress:[]LoadBalancerIngress{},},Conditions:[]Condition{},},}\\\\nI0930 07:34:49.085075 6715 services_controller.go:454] Service openshift-marketplace/marketplace-operator-metrics for network=default has 2 cluster-wide, 0 per-node configs, 0 template configs, making 1 (cluster) 0 (per node) and 0 (template) load balancers\\\\nF0930 07:34:49.085082 6715 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T07:34:48Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-sspvl_openshift-ovn-kubernetes(2c4ca8ea-a714-40e5-9e10-080aef32237b)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lllfc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e72f37942962598d925dfb305311c94c3ca920f93243bb6e97839add4a830a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:34:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lllfc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1198d13e2814e4dc38c4411d751fba40440a5967d7ca04c855e34fe90f9a6535\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1198d13e2814e4dc38
c4411d751fba40440a5967d7ca04c855e34fe90f9a6535\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T07:33:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T07:33:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lllfc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T07:33:56Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-sspvl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:35:05Z is after 2025-08-24T17:21:41Z" Sep 30 07:35:05 crc kubenswrapper[4760]: I0930 07:35:05.383960 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c5ae239e08e07bda9532c0b6b94ef3687cf512cd05965e09123bbb036f915a5b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3abb25619d0c80e84a2095aba45fe6a865125d242e44e9f10820e8c41a3b0db9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:35:05Z is after 2025-08-24T17:21:41Z" Sep 30 07:35:05 crc kubenswrapper[4760]: I0930 07:35:05.399412 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:35:05 crc kubenswrapper[4760]: I0930 07:35:05.399484 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:35:05 crc kubenswrapper[4760]: I0930 07:35:05.399504 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:35:05 crc kubenswrapper[4760]: I0930 07:35:05.399531 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:35:05 crc kubenswrapper[4760]: I0930 07:35:05.399551 4760 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:35:05Z","lastTransitionTime":"2025-09-30T07:35:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 07:35:05 crc kubenswrapper[4760]: I0930 07:35:05.404003 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:35:05Z is after 2025-08-24T17:21:41Z" Sep 30 07:35:05 crc kubenswrapper[4760]: I0930 07:35:05.422290 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-sv6wk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4a8035f8-210d-4a09-bca5-274ced93774c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:33:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://774d38a9cedeb69a2093a9de05db794e856295535f86d73a75c6feea05b61cb3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:33:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6pk8k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T07:33:56Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-sv6wk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:35:05Z is after 2025-08-24T17:21:41Z" Sep 30 07:35:05 crc kubenswrapper[4760]: I0930 07:35:05.441927 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-lqpj4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2b6201d8-80fd-4701-a4a8-f7ebca1f34ad\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:34:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:34:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:34:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:34:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://09a33e356ac5e20ea894804e56d62f6220b8b9a5123d1b0acc9bbe33a3083792\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d7
93426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:34:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dc5j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://722e623d70449c90fb388b89511f1e75ed55015ec9caa45d9aaa9ac3d9649778\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T07:34:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dc5j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T07:34:08Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-lqpj4\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:35:05Z is after 2025-08-24T17:21:41Z" Sep 30 07:35:05 crc kubenswrapper[4760]: I0930 07:35:05.460122 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-wv8fz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ce6dcf25-c8ea-450b-9fc6-9f8aeafde757\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:34:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:34:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:34:10Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T07:34:10Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcwbs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcwbs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T07:34:10Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-wv8fz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T07:35:05Z is after 2025-08-24T17:21:41Z" Sep 30 07:35:05 crc 
kubenswrapper[4760]: I0930 07:35:05.502968 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:35:05 crc kubenswrapper[4760]: I0930 07:35:05.503037 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:35:05 crc kubenswrapper[4760]: I0930 07:35:05.503082 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:35:05 crc kubenswrapper[4760]: I0930 07:35:05.503115 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:35:05 crc kubenswrapper[4760]: I0930 07:35:05.503138 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:35:05Z","lastTransitionTime":"2025-09-30T07:35:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 07:35:05 crc kubenswrapper[4760]: I0930 07:35:05.606092 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:35:05 crc kubenswrapper[4760]: I0930 07:35:05.606150 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:35:05 crc kubenswrapper[4760]: I0930 07:35:05.606166 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:35:05 crc kubenswrapper[4760]: I0930 07:35:05.606191 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:35:05 crc kubenswrapper[4760]: I0930 07:35:05.606208 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:35:05Z","lastTransitionTime":"2025-09-30T07:35:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 07:35:05 crc kubenswrapper[4760]: I0930 07:35:05.709297 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:35:05 crc kubenswrapper[4760]: I0930 07:35:05.709412 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:35:05 crc kubenswrapper[4760]: I0930 07:35:05.709436 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:35:05 crc kubenswrapper[4760]: I0930 07:35:05.709468 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:35:05 crc kubenswrapper[4760]: I0930 07:35:05.709488 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:35:05Z","lastTransitionTime":"2025-09-30T07:35:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 07:35:05 crc kubenswrapper[4760]: I0930 07:35:05.812870 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:35:05 crc kubenswrapper[4760]: I0930 07:35:05.813006 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:35:05 crc kubenswrapper[4760]: I0930 07:35:05.813035 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:35:05 crc kubenswrapper[4760]: I0930 07:35:05.813077 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:35:05 crc kubenswrapper[4760]: I0930 07:35:05.813108 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:35:05Z","lastTransitionTime":"2025-09-30T07:35:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 07:35:05 crc kubenswrapper[4760]: I0930 07:35:05.916826 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:35:05 crc kubenswrapper[4760]: I0930 07:35:05.916891 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:35:05 crc kubenswrapper[4760]: I0930 07:35:05.916910 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:35:05 crc kubenswrapper[4760]: I0930 07:35:05.916939 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:35:05 crc kubenswrapper[4760]: I0930 07:35:05.916962 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:35:05Z","lastTransitionTime":"2025-09-30T07:35:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 07:35:06 crc kubenswrapper[4760]: I0930 07:35:06.021451 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:35:06 crc kubenswrapper[4760]: I0930 07:35:06.021511 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:35:06 crc kubenswrapper[4760]: I0930 07:35:06.021524 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:35:06 crc kubenswrapper[4760]: I0930 07:35:06.021543 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:35:06 crc kubenswrapper[4760]: I0930 07:35:06.021559 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:35:06Z","lastTransitionTime":"2025-09-30T07:35:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 07:35:06 crc kubenswrapper[4760]: I0930 07:35:06.065925 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 07:35:06 crc kubenswrapper[4760]: I0930 07:35:06.066002 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 07:35:06 crc kubenswrapper[4760]: E0930 07:35:06.066090 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 07:35:06 crc kubenswrapper[4760]: E0930 07:35:06.066678 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 07:35:06 crc kubenswrapper[4760]: I0930 07:35:06.124967 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:35:06 crc kubenswrapper[4760]: I0930 07:35:06.125031 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:35:06 crc kubenswrapper[4760]: I0930 07:35:06.125052 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:35:06 crc kubenswrapper[4760]: I0930 07:35:06.125078 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:35:06 crc kubenswrapper[4760]: I0930 07:35:06.125096 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:35:06Z","lastTransitionTime":"2025-09-30T07:35:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 07:35:06 crc kubenswrapper[4760]: I0930 07:35:06.227844 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:35:06 crc kubenswrapper[4760]: I0930 07:35:06.227881 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:35:06 crc kubenswrapper[4760]: I0930 07:35:06.227908 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:35:06 crc kubenswrapper[4760]: I0930 07:35:06.227924 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:35:06 crc kubenswrapper[4760]: I0930 07:35:06.227934 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:35:06Z","lastTransitionTime":"2025-09-30T07:35:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 07:35:06 crc kubenswrapper[4760]: I0930 07:35:06.331119 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:35:06 crc kubenswrapper[4760]: I0930 07:35:06.331195 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:35:06 crc kubenswrapper[4760]: I0930 07:35:06.331213 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:35:06 crc kubenswrapper[4760]: I0930 07:35:06.331238 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:35:06 crc kubenswrapper[4760]: I0930 07:35:06.331254 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:35:06Z","lastTransitionTime":"2025-09-30T07:35:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 07:35:06 crc kubenswrapper[4760]: I0930 07:35:06.435081 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:35:06 crc kubenswrapper[4760]: I0930 07:35:06.435149 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:35:06 crc kubenswrapper[4760]: I0930 07:35:06.435174 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:35:06 crc kubenswrapper[4760]: I0930 07:35:06.435209 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:35:06 crc kubenswrapper[4760]: I0930 07:35:06.435233 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:35:06Z","lastTransitionTime":"2025-09-30T07:35:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 07:35:06 crc kubenswrapper[4760]: I0930 07:35:06.538197 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:35:06 crc kubenswrapper[4760]: I0930 07:35:06.538267 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:35:06 crc kubenswrapper[4760]: I0930 07:35:06.538285 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:35:06 crc kubenswrapper[4760]: I0930 07:35:06.538334 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:35:06 crc kubenswrapper[4760]: I0930 07:35:06.538358 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:35:06Z","lastTransitionTime":"2025-09-30T07:35:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 07:35:06 crc kubenswrapper[4760]: I0930 07:35:06.641476 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:35:06 crc kubenswrapper[4760]: I0930 07:35:06.641550 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:35:06 crc kubenswrapper[4760]: I0930 07:35:06.641570 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:35:06 crc kubenswrapper[4760]: I0930 07:35:06.641598 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:35:06 crc kubenswrapper[4760]: I0930 07:35:06.641616 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:35:06Z","lastTransitionTime":"2025-09-30T07:35:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 07:35:06 crc kubenswrapper[4760]: I0930 07:35:06.744230 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:35:06 crc kubenswrapper[4760]: I0930 07:35:06.744337 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:35:06 crc kubenswrapper[4760]: I0930 07:35:06.744364 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:35:06 crc kubenswrapper[4760]: I0930 07:35:06.744398 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:35:06 crc kubenswrapper[4760]: I0930 07:35:06.744422 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:35:06Z","lastTransitionTime":"2025-09-30T07:35:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 07:35:06 crc kubenswrapper[4760]: I0930 07:35:06.847552 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:35:06 crc kubenswrapper[4760]: I0930 07:35:06.847615 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:35:06 crc kubenswrapper[4760]: I0930 07:35:06.847632 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:35:06 crc kubenswrapper[4760]: I0930 07:35:06.847659 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:35:06 crc kubenswrapper[4760]: I0930 07:35:06.847677 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:35:06Z","lastTransitionTime":"2025-09-30T07:35:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 07:35:06 crc kubenswrapper[4760]: I0930 07:35:06.952355 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:35:06 crc kubenswrapper[4760]: I0930 07:35:06.952435 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:35:06 crc kubenswrapper[4760]: I0930 07:35:06.952454 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:35:06 crc kubenswrapper[4760]: I0930 07:35:06.952482 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:35:06 crc kubenswrapper[4760]: I0930 07:35:06.952503 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:35:06Z","lastTransitionTime":"2025-09-30T07:35:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 07:35:07 crc kubenswrapper[4760]: I0930 07:35:07.055827 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:35:07 crc kubenswrapper[4760]: I0930 07:35:07.055882 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:35:07 crc kubenswrapper[4760]: I0930 07:35:07.055895 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:35:07 crc kubenswrapper[4760]: I0930 07:35:07.055914 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:35:07 crc kubenswrapper[4760]: I0930 07:35:07.055930 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:35:07Z","lastTransitionTime":"2025-09-30T07:35:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 07:35:07 crc kubenswrapper[4760]: I0930 07:35:07.066257 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 07:35:07 crc kubenswrapper[4760]: I0930 07:35:07.066445 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wv8fz" Sep 30 07:35:07 crc kubenswrapper[4760]: E0930 07:35:07.066624 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 07:35:07 crc kubenswrapper[4760]: E0930 07:35:07.066788 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wv8fz" podUID="ce6dcf25-c8ea-450b-9fc6-9f8aeafde757" Sep 30 07:35:07 crc kubenswrapper[4760]: I0930 07:35:07.159566 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:35:07 crc kubenswrapper[4760]: I0930 07:35:07.159614 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:35:07 crc kubenswrapper[4760]: I0930 07:35:07.159626 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:35:07 crc kubenswrapper[4760]: I0930 07:35:07.159644 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:35:07 crc kubenswrapper[4760]: I0930 07:35:07.159656 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:35:07Z","lastTransitionTime":"2025-09-30T07:35:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 07:35:07 crc kubenswrapper[4760]: I0930 07:35:07.262569 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:35:07 crc kubenswrapper[4760]: I0930 07:35:07.262621 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:35:07 crc kubenswrapper[4760]: I0930 07:35:07.262638 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:35:07 crc kubenswrapper[4760]: I0930 07:35:07.262660 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:35:07 crc kubenswrapper[4760]: I0930 07:35:07.262678 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:35:07Z","lastTransitionTime":"2025-09-30T07:35:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 07:35:07 crc kubenswrapper[4760]: I0930 07:35:07.364804 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:35:07 crc kubenswrapper[4760]: I0930 07:35:07.364853 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:35:07 crc kubenswrapper[4760]: I0930 07:35:07.364862 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:35:07 crc kubenswrapper[4760]: I0930 07:35:07.364877 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:35:07 crc kubenswrapper[4760]: I0930 07:35:07.364887 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:35:07Z","lastTransitionTime":"2025-09-30T07:35:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 07:35:07 crc kubenswrapper[4760]: I0930 07:35:07.467593 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:35:07 crc kubenswrapper[4760]: I0930 07:35:07.467638 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:35:07 crc kubenswrapper[4760]: I0930 07:35:07.467651 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:35:07 crc kubenswrapper[4760]: I0930 07:35:07.467669 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:35:07 crc kubenswrapper[4760]: I0930 07:35:07.467683 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:35:07Z","lastTransitionTime":"2025-09-30T07:35:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 07:35:07 crc kubenswrapper[4760]: I0930 07:35:07.571100 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:35:07 crc kubenswrapper[4760]: I0930 07:35:07.571149 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:35:07 crc kubenswrapper[4760]: I0930 07:35:07.571161 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:35:07 crc kubenswrapper[4760]: I0930 07:35:07.571180 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:35:07 crc kubenswrapper[4760]: I0930 07:35:07.571193 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:35:07Z","lastTransitionTime":"2025-09-30T07:35:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 07:35:07 crc kubenswrapper[4760]: I0930 07:35:07.673761 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:35:07 crc kubenswrapper[4760]: I0930 07:35:07.673821 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:35:07 crc kubenswrapper[4760]: I0930 07:35:07.673834 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:35:07 crc kubenswrapper[4760]: I0930 07:35:07.673854 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:35:07 crc kubenswrapper[4760]: I0930 07:35:07.673868 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:35:07Z","lastTransitionTime":"2025-09-30T07:35:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 07:35:07 crc kubenswrapper[4760]: I0930 07:35:07.777657 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:35:07 crc kubenswrapper[4760]: I0930 07:35:07.777731 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:35:07 crc kubenswrapper[4760]: I0930 07:35:07.777749 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:35:07 crc kubenswrapper[4760]: I0930 07:35:07.777772 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:35:07 crc kubenswrapper[4760]: I0930 07:35:07.777790 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:35:07Z","lastTransitionTime":"2025-09-30T07:35:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 07:35:07 crc kubenswrapper[4760]: I0930 07:35:07.880775 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:35:07 crc kubenswrapper[4760]: I0930 07:35:07.880828 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:35:07 crc kubenswrapper[4760]: I0930 07:35:07.880842 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:35:07 crc kubenswrapper[4760]: I0930 07:35:07.880859 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:35:07 crc kubenswrapper[4760]: I0930 07:35:07.880872 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:35:07Z","lastTransitionTime":"2025-09-30T07:35:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 07:35:07 crc kubenswrapper[4760]: I0930 07:35:07.983596 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:35:07 crc kubenswrapper[4760]: I0930 07:35:07.983673 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:35:07 crc kubenswrapper[4760]: I0930 07:35:07.983702 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:35:07 crc kubenswrapper[4760]: I0930 07:35:07.983732 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:35:07 crc kubenswrapper[4760]: I0930 07:35:07.983849 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:35:07Z","lastTransitionTime":"2025-09-30T07:35:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 07:35:08 crc kubenswrapper[4760]: I0930 07:35:08.066628 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 07:35:08 crc kubenswrapper[4760]: I0930 07:35:08.066906 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 07:35:08 crc kubenswrapper[4760]: E0930 07:35:08.066984 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 07:35:08 crc kubenswrapper[4760]: E0930 07:35:08.067087 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 07:35:08 crc kubenswrapper[4760]: I0930 07:35:08.087152 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:35:08 crc kubenswrapper[4760]: I0930 07:35:08.087230 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:35:08 crc kubenswrapper[4760]: I0930 07:35:08.087249 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:35:08 crc kubenswrapper[4760]: I0930 07:35:08.087275 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:35:08 crc kubenswrapper[4760]: I0930 07:35:08.087295 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:35:08Z","lastTransitionTime":"2025-09-30T07:35:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 07:35:08 crc kubenswrapper[4760]: I0930 07:35:08.190351 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:35:08 crc kubenswrapper[4760]: I0930 07:35:08.190381 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:35:08 crc kubenswrapper[4760]: I0930 07:35:08.190389 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:35:08 crc kubenswrapper[4760]: I0930 07:35:08.190404 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:35:08 crc kubenswrapper[4760]: I0930 07:35:08.190413 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:35:08Z","lastTransitionTime":"2025-09-30T07:35:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 07:35:08 crc kubenswrapper[4760]: I0930 07:35:08.293198 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:35:08 crc kubenswrapper[4760]: I0930 07:35:08.293281 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:35:08 crc kubenswrapper[4760]: I0930 07:35:08.293340 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:35:08 crc kubenswrapper[4760]: I0930 07:35:08.293367 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:35:08 crc kubenswrapper[4760]: I0930 07:35:08.293387 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:35:08Z","lastTransitionTime":"2025-09-30T07:35:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 07:35:08 crc kubenswrapper[4760]: I0930 07:35:08.396762 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:35:08 crc kubenswrapper[4760]: I0930 07:35:08.396840 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:35:08 crc kubenswrapper[4760]: I0930 07:35:08.396864 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:35:08 crc kubenswrapper[4760]: I0930 07:35:08.396894 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:35:08 crc kubenswrapper[4760]: I0930 07:35:08.396916 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:35:08Z","lastTransitionTime":"2025-09-30T07:35:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 07:35:08 crc kubenswrapper[4760]: I0930 07:35:08.500577 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:35:08 crc kubenswrapper[4760]: I0930 07:35:08.500646 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:35:08 crc kubenswrapper[4760]: I0930 07:35:08.500665 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:35:08 crc kubenswrapper[4760]: I0930 07:35:08.500690 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:35:08 crc kubenswrapper[4760]: I0930 07:35:08.500708 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:35:08Z","lastTransitionTime":"2025-09-30T07:35:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 07:35:08 crc kubenswrapper[4760]: I0930 07:35:08.604019 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:35:08 crc kubenswrapper[4760]: I0930 07:35:08.604084 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:35:08 crc kubenswrapper[4760]: I0930 07:35:08.604101 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:35:08 crc kubenswrapper[4760]: I0930 07:35:08.604132 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:35:08 crc kubenswrapper[4760]: I0930 07:35:08.604151 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:35:08Z","lastTransitionTime":"2025-09-30T07:35:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 07:35:08 crc kubenswrapper[4760]: I0930 07:35:08.707894 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:35:08 crc kubenswrapper[4760]: I0930 07:35:08.707967 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:35:08 crc kubenswrapper[4760]: I0930 07:35:08.707992 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:35:08 crc kubenswrapper[4760]: I0930 07:35:08.708020 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:35:08 crc kubenswrapper[4760]: I0930 07:35:08.708039 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:35:08Z","lastTransitionTime":"2025-09-30T07:35:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 07:35:08 crc kubenswrapper[4760]: I0930 07:35:08.810516 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:35:08 crc kubenswrapper[4760]: I0930 07:35:08.810576 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:35:08 crc kubenswrapper[4760]: I0930 07:35:08.810595 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:35:08 crc kubenswrapper[4760]: I0930 07:35:08.810618 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:35:08 crc kubenswrapper[4760]: I0930 07:35:08.810638 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:35:08Z","lastTransitionTime":"2025-09-30T07:35:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 07:35:08 crc kubenswrapper[4760]: I0930 07:35:08.914067 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:35:08 crc kubenswrapper[4760]: I0930 07:35:08.914142 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:35:08 crc kubenswrapper[4760]: I0930 07:35:08.914160 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:35:08 crc kubenswrapper[4760]: I0930 07:35:08.914188 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:35:08 crc kubenswrapper[4760]: I0930 07:35:08.914206 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:35:08Z","lastTransitionTime":"2025-09-30T07:35:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 07:35:09 crc kubenswrapper[4760]: I0930 07:35:09.016830 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:35:09 crc kubenswrapper[4760]: I0930 07:35:09.016887 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:35:09 crc kubenswrapper[4760]: I0930 07:35:09.016904 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:35:09 crc kubenswrapper[4760]: I0930 07:35:09.016930 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:35:09 crc kubenswrapper[4760]: I0930 07:35:09.016948 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:35:09Z","lastTransitionTime":"2025-09-30T07:35:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 07:35:09 crc kubenswrapper[4760]: I0930 07:35:09.066629 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 07:35:09 crc kubenswrapper[4760]: I0930 07:35:09.067439 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wv8fz" Sep 30 07:35:09 crc kubenswrapper[4760]: E0930 07:35:09.067596 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 07:35:09 crc kubenswrapper[4760]: E0930 07:35:09.067865 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wv8fz" podUID="ce6dcf25-c8ea-450b-9fc6-9f8aeafde757" Sep 30 07:35:09 crc kubenswrapper[4760]: I0930 07:35:09.081689 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-crc"] Sep 30 07:35:09 crc kubenswrapper[4760]: I0930 07:35:09.120242 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:35:09 crc kubenswrapper[4760]: I0930 07:35:09.120289 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:35:09 crc kubenswrapper[4760]: I0930 07:35:09.120336 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:35:09 crc kubenswrapper[4760]: I0930 07:35:09.120360 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:35:09 crc kubenswrapper[4760]: I0930 07:35:09.120378 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:35:09Z","lastTransitionTime":"2025-09-30T07:35:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 07:35:09 crc kubenswrapper[4760]: I0930 07:35:09.223703 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:35:09 crc kubenswrapper[4760]: I0930 07:35:09.223765 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:35:09 crc kubenswrapper[4760]: I0930 07:35:09.223783 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:35:09 crc kubenswrapper[4760]: I0930 07:35:09.223808 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:35:09 crc kubenswrapper[4760]: I0930 07:35:09.223827 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:35:09Z","lastTransitionTime":"2025-09-30T07:35:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 07:35:09 crc kubenswrapper[4760]: I0930 07:35:09.327477 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:35:09 crc kubenswrapper[4760]: I0930 07:35:09.327547 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:35:09 crc kubenswrapper[4760]: I0930 07:35:09.327566 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:35:09 crc kubenswrapper[4760]: I0930 07:35:09.327616 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:35:09 crc kubenswrapper[4760]: I0930 07:35:09.327633 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:35:09Z","lastTransitionTime":"2025-09-30T07:35:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 07:35:09 crc kubenswrapper[4760]: I0930 07:35:09.430822 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:35:09 crc kubenswrapper[4760]: I0930 07:35:09.430883 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:35:09 crc kubenswrapper[4760]: I0930 07:35:09.430901 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:35:09 crc kubenswrapper[4760]: I0930 07:35:09.430952 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:35:09 crc kubenswrapper[4760]: I0930 07:35:09.430975 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:35:09Z","lastTransitionTime":"2025-09-30T07:35:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 07:35:09 crc kubenswrapper[4760]: I0930 07:35:09.533208 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:35:09 crc kubenswrapper[4760]: I0930 07:35:09.533273 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:35:09 crc kubenswrapper[4760]: I0930 07:35:09.533290 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:35:09 crc kubenswrapper[4760]: I0930 07:35:09.533348 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:35:09 crc kubenswrapper[4760]: I0930 07:35:09.533366 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:35:09Z","lastTransitionTime":"2025-09-30T07:35:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 07:35:09 crc kubenswrapper[4760]: I0930 07:35:09.636546 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:35:09 crc kubenswrapper[4760]: I0930 07:35:09.636614 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:35:09 crc kubenswrapper[4760]: I0930 07:35:09.636637 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:35:09 crc kubenswrapper[4760]: I0930 07:35:09.636667 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:35:09 crc kubenswrapper[4760]: I0930 07:35:09.636689 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:35:09Z","lastTransitionTime":"2025-09-30T07:35:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 07:35:09 crc kubenswrapper[4760]: I0930 07:35:09.739365 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:35:09 crc kubenswrapper[4760]: I0930 07:35:09.739426 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:35:09 crc kubenswrapper[4760]: I0930 07:35:09.739444 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:35:09 crc kubenswrapper[4760]: I0930 07:35:09.739467 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:35:09 crc kubenswrapper[4760]: I0930 07:35:09.739485 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:35:09Z","lastTransitionTime":"2025-09-30T07:35:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 07:35:09 crc kubenswrapper[4760]: I0930 07:35:09.842409 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:35:09 crc kubenswrapper[4760]: I0930 07:35:09.842465 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:35:09 crc kubenswrapper[4760]: I0930 07:35:09.842484 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:35:09 crc kubenswrapper[4760]: I0930 07:35:09.842507 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:35:09 crc kubenswrapper[4760]: I0930 07:35:09.842523 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:35:09Z","lastTransitionTime":"2025-09-30T07:35:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 07:35:09 crc kubenswrapper[4760]: I0930 07:35:09.947987 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:35:09 crc kubenswrapper[4760]: I0930 07:35:09.948057 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:35:09 crc kubenswrapper[4760]: I0930 07:35:09.948076 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:35:09 crc kubenswrapper[4760]: I0930 07:35:09.948102 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:35:09 crc kubenswrapper[4760]: I0930 07:35:09.948120 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:35:09Z","lastTransitionTime":"2025-09-30T07:35:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 07:35:10 crc kubenswrapper[4760]: I0930 07:35:10.051449 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:35:10 crc kubenswrapper[4760]: I0930 07:35:10.051514 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:35:10 crc kubenswrapper[4760]: I0930 07:35:10.051531 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:35:10 crc kubenswrapper[4760]: I0930 07:35:10.051558 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:35:10 crc kubenswrapper[4760]: I0930 07:35:10.051577 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:35:10Z","lastTransitionTime":"2025-09-30T07:35:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 07:35:10 crc kubenswrapper[4760]: I0930 07:35:10.066943 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 07:35:10 crc kubenswrapper[4760]: I0930 07:35:10.066944 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 07:35:10 crc kubenswrapper[4760]: E0930 07:35:10.067162 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 07:35:10 crc kubenswrapper[4760]: E0930 07:35:10.067377 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 07:35:10 crc kubenswrapper[4760]: I0930 07:35:10.155012 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:35:10 crc kubenswrapper[4760]: I0930 07:35:10.155079 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:35:10 crc kubenswrapper[4760]: I0930 07:35:10.155096 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:35:10 crc kubenswrapper[4760]: I0930 07:35:10.155123 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:35:10 crc kubenswrapper[4760]: I0930 07:35:10.155142 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:35:10Z","lastTransitionTime":"2025-09-30T07:35:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 07:35:10 crc kubenswrapper[4760]: I0930 07:35:10.258812 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:35:10 crc kubenswrapper[4760]: I0930 07:35:10.258882 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:35:10 crc kubenswrapper[4760]: I0930 07:35:10.258901 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:35:10 crc kubenswrapper[4760]: I0930 07:35:10.258929 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:35:10 crc kubenswrapper[4760]: I0930 07:35:10.258951 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:35:10Z","lastTransitionTime":"2025-09-30T07:35:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 07:35:10 crc kubenswrapper[4760]: I0930 07:35:10.362332 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:35:10 crc kubenswrapper[4760]: I0930 07:35:10.362394 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:35:10 crc kubenswrapper[4760]: I0930 07:35:10.362411 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:35:10 crc kubenswrapper[4760]: I0930 07:35:10.362436 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:35:10 crc kubenswrapper[4760]: I0930 07:35:10.362460 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:35:10Z","lastTransitionTime":"2025-09-30T07:35:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 07:35:10 crc kubenswrapper[4760]: I0930 07:35:10.468245 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:35:10 crc kubenswrapper[4760]: I0930 07:35:10.468350 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:35:10 crc kubenswrapper[4760]: I0930 07:35:10.468365 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:35:10 crc kubenswrapper[4760]: I0930 07:35:10.468409 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:35:10 crc kubenswrapper[4760]: I0930 07:35:10.468423 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:35:10Z","lastTransitionTime":"2025-09-30T07:35:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 07:35:10 crc kubenswrapper[4760]: I0930 07:35:10.570464 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:35:10 crc kubenswrapper[4760]: I0930 07:35:10.570515 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:35:10 crc kubenswrapper[4760]: I0930 07:35:10.570526 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:35:10 crc kubenswrapper[4760]: I0930 07:35:10.570545 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:35:10 crc kubenswrapper[4760]: I0930 07:35:10.570557 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:35:10Z","lastTransitionTime":"2025-09-30T07:35:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 07:35:10 crc kubenswrapper[4760]: I0930 07:35:10.672692 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:35:10 crc kubenswrapper[4760]: I0930 07:35:10.672758 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:35:10 crc kubenswrapper[4760]: I0930 07:35:10.672776 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:35:10 crc kubenswrapper[4760]: I0930 07:35:10.672801 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:35:10 crc kubenswrapper[4760]: I0930 07:35:10.672817 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:35:10Z","lastTransitionTime":"2025-09-30T07:35:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 07:35:10 crc kubenswrapper[4760]: I0930 07:35:10.776278 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:35:10 crc kubenswrapper[4760]: I0930 07:35:10.776409 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:35:10 crc kubenswrapper[4760]: I0930 07:35:10.776434 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:35:10 crc kubenswrapper[4760]: I0930 07:35:10.776467 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:35:10 crc kubenswrapper[4760]: I0930 07:35:10.776492 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:35:10Z","lastTransitionTime":"2025-09-30T07:35:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 07:35:10 crc kubenswrapper[4760]: I0930 07:35:10.879212 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:35:10 crc kubenswrapper[4760]: I0930 07:35:10.879256 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:35:10 crc kubenswrapper[4760]: I0930 07:35:10.879269 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:35:10 crc kubenswrapper[4760]: I0930 07:35:10.879288 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:35:10 crc kubenswrapper[4760]: I0930 07:35:10.879327 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:35:10Z","lastTransitionTime":"2025-09-30T07:35:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 07:35:10 crc kubenswrapper[4760]: I0930 07:35:10.983110 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:35:10 crc kubenswrapper[4760]: I0930 07:35:10.983167 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:35:10 crc kubenswrapper[4760]: I0930 07:35:10.983194 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:35:10 crc kubenswrapper[4760]: I0930 07:35:10.983241 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:35:10 crc kubenswrapper[4760]: I0930 07:35:10.983264 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:35:10Z","lastTransitionTime":"2025-09-30T07:35:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 07:35:11 crc kubenswrapper[4760]: I0930 07:35:11.066957 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wv8fz" Sep 30 07:35:11 crc kubenswrapper[4760]: I0930 07:35:11.066962 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 07:35:11 crc kubenswrapper[4760]: E0930 07:35:11.067427 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-wv8fz" podUID="ce6dcf25-c8ea-450b-9fc6-9f8aeafde757" Sep 30 07:35:11 crc kubenswrapper[4760]: E0930 07:35:11.067517 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 07:35:11 crc kubenswrapper[4760]: I0930 07:35:11.085941 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:35:11 crc kubenswrapper[4760]: I0930 07:35:11.086007 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:35:11 crc kubenswrapper[4760]: I0930 07:35:11.086030 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:35:11 crc kubenswrapper[4760]: I0930 07:35:11.086062 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:35:11 crc kubenswrapper[4760]: I0930 07:35:11.086089 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:35:11Z","lastTransitionTime":"2025-09-30T07:35:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 07:35:11 crc kubenswrapper[4760]: I0930 07:35:11.189369 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:35:11 crc kubenswrapper[4760]: I0930 07:35:11.189428 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:35:11 crc kubenswrapper[4760]: I0930 07:35:11.189445 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:35:11 crc kubenswrapper[4760]: I0930 07:35:11.189470 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:35:11 crc kubenswrapper[4760]: I0930 07:35:11.189487 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:35:11Z","lastTransitionTime":"2025-09-30T07:35:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 07:35:11 crc kubenswrapper[4760]: I0930 07:35:11.292534 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:35:11 crc kubenswrapper[4760]: I0930 07:35:11.292605 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:35:11 crc kubenswrapper[4760]: I0930 07:35:11.292629 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:35:11 crc kubenswrapper[4760]: I0930 07:35:11.292660 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:35:11 crc kubenswrapper[4760]: I0930 07:35:11.292687 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:35:11Z","lastTransitionTime":"2025-09-30T07:35:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 07:35:11 crc kubenswrapper[4760]: I0930 07:35:11.395662 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:35:11 crc kubenswrapper[4760]: I0930 07:35:11.395727 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:35:11 crc kubenswrapper[4760]: I0930 07:35:11.395744 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:35:11 crc kubenswrapper[4760]: I0930 07:35:11.395769 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:35:11 crc kubenswrapper[4760]: I0930 07:35:11.395792 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:35:11Z","lastTransitionTime":"2025-09-30T07:35:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 07:35:11 crc kubenswrapper[4760]: I0930 07:35:11.499433 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:35:11 crc kubenswrapper[4760]: I0930 07:35:11.499501 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:35:11 crc kubenswrapper[4760]: I0930 07:35:11.499538 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:35:11 crc kubenswrapper[4760]: I0930 07:35:11.499565 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:35:11 crc kubenswrapper[4760]: I0930 07:35:11.499583 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:35:11Z","lastTransitionTime":"2025-09-30T07:35:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 07:35:11 crc kubenswrapper[4760]: I0930 07:35:11.602182 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:35:11 crc kubenswrapper[4760]: I0930 07:35:11.602247 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:35:11 crc kubenswrapper[4760]: I0930 07:35:11.602264 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:35:11 crc kubenswrapper[4760]: I0930 07:35:11.602290 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:35:11 crc kubenswrapper[4760]: I0930 07:35:11.602348 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:35:11Z","lastTransitionTime":"2025-09-30T07:35:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 07:35:11 crc kubenswrapper[4760]: I0930 07:35:11.704926 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:35:11 crc kubenswrapper[4760]: I0930 07:35:11.704990 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:35:11 crc kubenswrapper[4760]: I0930 07:35:11.705014 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:35:11 crc kubenswrapper[4760]: I0930 07:35:11.705040 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:35:11 crc kubenswrapper[4760]: I0930 07:35:11.705060 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:35:11Z","lastTransitionTime":"2025-09-30T07:35:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 07:35:11 crc kubenswrapper[4760]: I0930 07:35:11.808792 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:35:11 crc kubenswrapper[4760]: I0930 07:35:11.808855 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:35:11 crc kubenswrapper[4760]: I0930 07:35:11.808873 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:35:11 crc kubenswrapper[4760]: I0930 07:35:11.808898 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:35:11 crc kubenswrapper[4760]: I0930 07:35:11.808916 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:35:11Z","lastTransitionTime":"2025-09-30T07:35:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 07:35:11 crc kubenswrapper[4760]: I0930 07:35:11.912065 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:35:11 crc kubenswrapper[4760]: I0930 07:35:11.912122 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:35:11 crc kubenswrapper[4760]: I0930 07:35:11.912136 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:35:11 crc kubenswrapper[4760]: I0930 07:35:11.912152 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:35:11 crc kubenswrapper[4760]: I0930 07:35:11.912161 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:35:11Z","lastTransitionTime":"2025-09-30T07:35:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 07:35:12 crc kubenswrapper[4760]: I0930 07:35:12.014929 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:35:12 crc kubenswrapper[4760]: I0930 07:35:12.015034 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:35:12 crc kubenswrapper[4760]: I0930 07:35:12.015058 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:35:12 crc kubenswrapper[4760]: I0930 07:35:12.015086 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:35:12 crc kubenswrapper[4760]: I0930 07:35:12.015106 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:35:12Z","lastTransitionTime":"2025-09-30T07:35:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 07:35:12 crc kubenswrapper[4760]: I0930 07:35:12.066918 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 07:35:12 crc kubenswrapper[4760]: I0930 07:35:12.067023 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 07:35:12 crc kubenswrapper[4760]: E0930 07:35:12.067164 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 07:35:12 crc kubenswrapper[4760]: E0930 07:35:12.067725 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 07:35:12 crc kubenswrapper[4760]: I0930 07:35:12.119141 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:35:12 crc kubenswrapper[4760]: I0930 07:35:12.119220 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:35:12 crc kubenswrapper[4760]: I0930 07:35:12.119323 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:35:12 crc kubenswrapper[4760]: I0930 07:35:12.119351 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:35:12 crc kubenswrapper[4760]: I0930 07:35:12.119374 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:35:12Z","lastTransitionTime":"2025-09-30T07:35:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 07:35:12 crc kubenswrapper[4760]: I0930 07:35:12.223694 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:35:12 crc kubenswrapper[4760]: I0930 07:35:12.223757 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:35:12 crc kubenswrapper[4760]: I0930 07:35:12.223774 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:35:12 crc kubenswrapper[4760]: I0930 07:35:12.223814 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:35:12 crc kubenswrapper[4760]: I0930 07:35:12.223825 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:35:12Z","lastTransitionTime":"2025-09-30T07:35:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 07:35:12 crc kubenswrapper[4760]: I0930 07:35:12.334525 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:35:12 crc kubenswrapper[4760]: I0930 07:35:12.334608 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:35:12 crc kubenswrapper[4760]: I0930 07:35:12.334629 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:35:12 crc kubenswrapper[4760]: I0930 07:35:12.334657 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:35:12 crc kubenswrapper[4760]: I0930 07:35:12.334679 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:35:12Z","lastTransitionTime":"2025-09-30T07:35:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 07:35:12 crc kubenswrapper[4760]: I0930 07:35:12.438446 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:35:12 crc kubenswrapper[4760]: I0930 07:35:12.438524 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:35:12 crc kubenswrapper[4760]: I0930 07:35:12.438544 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:35:12 crc kubenswrapper[4760]: I0930 07:35:12.438572 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:35:12 crc kubenswrapper[4760]: I0930 07:35:12.438593 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:35:12Z","lastTransitionTime":"2025-09-30T07:35:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 07:35:12 crc kubenswrapper[4760]: I0930 07:35:12.541862 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:35:12 crc kubenswrapper[4760]: I0930 07:35:12.541915 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:35:12 crc kubenswrapper[4760]: I0930 07:35:12.541933 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:35:12 crc kubenswrapper[4760]: I0930 07:35:12.541960 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:35:12 crc kubenswrapper[4760]: I0930 07:35:12.541980 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:35:12Z","lastTransitionTime":"2025-09-30T07:35:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 07:35:12 crc kubenswrapper[4760]: I0930 07:35:12.645078 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:35:12 crc kubenswrapper[4760]: I0930 07:35:12.645171 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:35:12 crc kubenswrapper[4760]: I0930 07:35:12.645188 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:35:12 crc kubenswrapper[4760]: I0930 07:35:12.645212 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:35:12 crc kubenswrapper[4760]: I0930 07:35:12.645229 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:35:12Z","lastTransitionTime":"2025-09-30T07:35:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 07:35:12 crc kubenswrapper[4760]: I0930 07:35:12.748698 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:35:12 crc kubenswrapper[4760]: I0930 07:35:12.748817 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:35:12 crc kubenswrapper[4760]: I0930 07:35:12.748837 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:35:12 crc kubenswrapper[4760]: I0930 07:35:12.748862 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:35:12 crc kubenswrapper[4760]: I0930 07:35:12.748881 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:35:12Z","lastTransitionTime":"2025-09-30T07:35:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 07:35:12 crc kubenswrapper[4760]: I0930 07:35:12.851988 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:35:12 crc kubenswrapper[4760]: I0930 07:35:12.852050 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:35:12 crc kubenswrapper[4760]: I0930 07:35:12.852067 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:35:12 crc kubenswrapper[4760]: I0930 07:35:12.852090 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:35:12 crc kubenswrapper[4760]: I0930 07:35:12.852107 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:35:12Z","lastTransitionTime":"2025-09-30T07:35:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 07:35:12 crc kubenswrapper[4760]: I0930 07:35:12.955920 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:35:12 crc kubenswrapper[4760]: I0930 07:35:12.955971 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:35:12 crc kubenswrapper[4760]: I0930 07:35:12.955988 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:35:12 crc kubenswrapper[4760]: I0930 07:35:12.956012 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:35:12 crc kubenswrapper[4760]: I0930 07:35:12.956030 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:35:12Z","lastTransitionTime":"2025-09-30T07:35:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 07:35:13 crc kubenswrapper[4760]: I0930 07:35:13.058990 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:35:13 crc kubenswrapper[4760]: I0930 07:35:13.059062 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:35:13 crc kubenswrapper[4760]: I0930 07:35:13.059084 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:35:13 crc kubenswrapper[4760]: I0930 07:35:13.059115 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:35:13 crc kubenswrapper[4760]: I0930 07:35:13.059135 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:35:13Z","lastTransitionTime":"2025-09-30T07:35:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 07:35:13 crc kubenswrapper[4760]: I0930 07:35:13.066400 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wv8fz" Sep 30 07:35:13 crc kubenswrapper[4760]: I0930 07:35:13.066564 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 07:35:13 crc kubenswrapper[4760]: E0930 07:35:13.066643 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-wv8fz" podUID="ce6dcf25-c8ea-450b-9fc6-9f8aeafde757" Sep 30 07:35:13 crc kubenswrapper[4760]: E0930 07:35:13.066747 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 07:35:13 crc kubenswrapper[4760]: I0930 07:35:13.162975 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:35:13 crc kubenswrapper[4760]: I0930 07:35:13.163052 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:35:13 crc kubenswrapper[4760]: I0930 07:35:13.163070 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:35:13 crc kubenswrapper[4760]: I0930 07:35:13.163097 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:35:13 crc kubenswrapper[4760]: I0930 07:35:13.163117 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:35:13Z","lastTransitionTime":"2025-09-30T07:35:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 07:35:13 crc kubenswrapper[4760]: I0930 07:35:13.265795 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:35:13 crc kubenswrapper[4760]: I0930 07:35:13.265850 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:35:13 crc kubenswrapper[4760]: I0930 07:35:13.265862 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:35:13 crc kubenswrapper[4760]: I0930 07:35:13.265879 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:35:13 crc kubenswrapper[4760]: I0930 07:35:13.265889 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:35:13Z","lastTransitionTime":"2025-09-30T07:35:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 07:35:13 crc kubenswrapper[4760]: I0930 07:35:13.369000 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:35:13 crc kubenswrapper[4760]: I0930 07:35:13.369045 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:35:13 crc kubenswrapper[4760]: I0930 07:35:13.369056 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:35:13 crc kubenswrapper[4760]: I0930 07:35:13.369077 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:35:13 crc kubenswrapper[4760]: I0930 07:35:13.369088 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:35:13Z","lastTransitionTime":"2025-09-30T07:35:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 07:35:13 crc kubenswrapper[4760]: I0930 07:35:13.472988 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:35:13 crc kubenswrapper[4760]: I0930 07:35:13.473110 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:35:13 crc kubenswrapper[4760]: I0930 07:35:13.473174 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:35:13 crc kubenswrapper[4760]: I0930 07:35:13.473204 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:35:13 crc kubenswrapper[4760]: I0930 07:35:13.473222 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:35:13Z","lastTransitionTime":"2025-09-30T07:35:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 07:35:13 crc kubenswrapper[4760]: I0930 07:35:13.577296 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:35:13 crc kubenswrapper[4760]: I0930 07:35:13.577470 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:35:13 crc kubenswrapper[4760]: I0930 07:35:13.577499 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:35:13 crc kubenswrapper[4760]: I0930 07:35:13.577561 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:35:13 crc kubenswrapper[4760]: I0930 07:35:13.577592 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:35:13Z","lastTransitionTime":"2025-09-30T07:35:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 07:35:13 crc kubenswrapper[4760]: I0930 07:35:13.681264 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:35:13 crc kubenswrapper[4760]: I0930 07:35:13.681405 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:35:13 crc kubenswrapper[4760]: I0930 07:35:13.681436 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:35:13 crc kubenswrapper[4760]: I0930 07:35:13.681468 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:35:13 crc kubenswrapper[4760]: I0930 07:35:13.681489 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:35:13Z","lastTransitionTime":"2025-09-30T07:35:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 07:35:13 crc kubenswrapper[4760]: I0930 07:35:13.752351 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 07:35:13 crc kubenswrapper[4760]: I0930 07:35:13.752430 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 07:35:13 crc kubenswrapper[4760]: I0930 07:35:13.752448 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 07:35:13 crc kubenswrapper[4760]: I0930 07:35:13.752474 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 07:35:13 crc kubenswrapper[4760]: I0930 07:35:13.752492 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T07:35:13Z","lastTransitionTime":"2025-09-30T07:35:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 07:35:13 crc kubenswrapper[4760]: I0930 07:35:13.831241 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-version/cluster-version-operator-5c965bbfc6-969fx"] Sep 30 07:35:13 crc kubenswrapper[4760]: I0930 07:35:13.832161 4760 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-969fx" Sep 30 07:35:13 crc kubenswrapper[4760]: I0930 07:35:13.835266 4760 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Sep 30 07:35:13 crc kubenswrapper[4760]: I0930 07:35:13.838117 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Sep 30 07:35:13 crc kubenswrapper[4760]: I0930 07:35:13.838962 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Sep 30 07:35:13 crc kubenswrapper[4760]: I0930 07:35:13.839408 4760 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Sep 30 07:35:13 crc kubenswrapper[4760]: I0930 07:35:13.881926 4760 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-daemon-f2lrk" podStartSLOduration=78.881892515 podStartE2EDuration="1m18.881892515s" podCreationTimestamp="2025-09-30 07:33:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 07:35:13.881731281 +0000 UTC m=+99.524637753" watchObservedRunningTime="2025-09-30 07:35:13.881892515 +0000 UTC m=+99.524798967" Sep 30 07:35:13 crc kubenswrapper[4760]: I0930 07:35:13.915013 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/08c7964c-4fdf-4342-90db-c6e127b1ddbe-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-969fx\" (UID: \"08c7964c-4fdf-4342-90db-c6e127b1ddbe\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-969fx" Sep 30 07:35:13 crc kubenswrapper[4760]: I0930 07:35:13.915080 4760 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/08c7964c-4fdf-4342-90db-c6e127b1ddbe-service-ca\") pod \"cluster-version-operator-5c965bbfc6-969fx\" (UID: \"08c7964c-4fdf-4342-90db-c6e127b1ddbe\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-969fx" Sep 30 07:35:13 crc kubenswrapper[4760]: I0930 07:35:13.915139 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/08c7964c-4fdf-4342-90db-c6e127b1ddbe-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-969fx\" (UID: \"08c7964c-4fdf-4342-90db-c6e127b1ddbe\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-969fx" Sep 30 07:35:13 crc kubenswrapper[4760]: I0930 07:35:13.915176 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/08c7964c-4fdf-4342-90db-c6e127b1ddbe-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-969fx\" (UID: \"08c7964c-4fdf-4342-90db-c6e127b1ddbe\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-969fx" Sep 30 07:35:13 crc kubenswrapper[4760]: I0930 07:35:13.915208 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/08c7964c-4fdf-4342-90db-c6e127b1ddbe-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-969fx\" (UID: \"08c7964c-4fdf-4342-90db-c6e127b1ddbe\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-969fx" Sep 30 07:35:13 crc kubenswrapper[4760]: I0930 07:35:13.929779 4760 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-lvdpk" podStartSLOduration=78.929736184 podStartE2EDuration="1m18.929736184s" podCreationTimestamp="2025-09-30 
07:33:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 07:35:13.911443064 +0000 UTC m=+99.554349506" watchObservedRunningTime="2025-09-30 07:35:13.929736184 +0000 UTC m=+99.572642636" Sep 30 07:35:13 crc kubenswrapper[4760]: I0930 07:35:13.930551 4760 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-rpvhp" podStartSLOduration=79.930535965 podStartE2EDuration="1m19.930535965s" podCreationTimestamp="2025-09-30 07:33:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 07:35:13.930146605 +0000 UTC m=+99.573053057" watchObservedRunningTime="2025-09-30 07:35:13.930535965 +0000 UTC m=+99.573442407" Sep 30 07:35:14 crc kubenswrapper[4760]: I0930 07:35:14.016360 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/08c7964c-4fdf-4342-90db-c6e127b1ddbe-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-969fx\" (UID: \"08c7964c-4fdf-4342-90db-c6e127b1ddbe\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-969fx" Sep 30 07:35:14 crc kubenswrapper[4760]: I0930 07:35:14.016444 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/08c7964c-4fdf-4342-90db-c6e127b1ddbe-service-ca\") pod \"cluster-version-operator-5c965bbfc6-969fx\" (UID: \"08c7964c-4fdf-4342-90db-c6e127b1ddbe\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-969fx" Sep 30 07:35:14 crc kubenswrapper[4760]: I0930 07:35:14.016514 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/08c7964c-4fdf-4342-90db-c6e127b1ddbe-etc-ssl-certs\") pod 
\"cluster-version-operator-5c965bbfc6-969fx\" (UID: \"08c7964c-4fdf-4342-90db-c6e127b1ddbe\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-969fx" Sep 30 07:35:14 crc kubenswrapper[4760]: I0930 07:35:14.016542 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/08c7964c-4fdf-4342-90db-c6e127b1ddbe-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-969fx\" (UID: \"08c7964c-4fdf-4342-90db-c6e127b1ddbe\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-969fx" Sep 30 07:35:14 crc kubenswrapper[4760]: I0930 07:35:14.016550 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/08c7964c-4fdf-4342-90db-c6e127b1ddbe-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-969fx\" (UID: \"08c7964c-4fdf-4342-90db-c6e127b1ddbe\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-969fx" Sep 30 07:35:14 crc kubenswrapper[4760]: I0930 07:35:14.016659 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/08c7964c-4fdf-4342-90db-c6e127b1ddbe-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-969fx\" (UID: \"08c7964c-4fdf-4342-90db-c6e127b1ddbe\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-969fx" Sep 30 07:35:14 crc kubenswrapper[4760]: I0930 07:35:14.016727 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/08c7964c-4fdf-4342-90db-c6e127b1ddbe-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-969fx\" (UID: \"08c7964c-4fdf-4342-90db-c6e127b1ddbe\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-969fx" Sep 30 07:35:14 crc kubenswrapper[4760]: I0930 07:35:14.017452 4760 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/08c7964c-4fdf-4342-90db-c6e127b1ddbe-service-ca\") pod \"cluster-version-operator-5c965bbfc6-969fx\" (UID: \"08c7964c-4fdf-4342-90db-c6e127b1ddbe\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-969fx" Sep 30 07:35:14 crc kubenswrapper[4760]: I0930 07:35:14.027280 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/08c7964c-4fdf-4342-90db-c6e127b1ddbe-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-969fx\" (UID: \"08c7964c-4fdf-4342-90db-c6e127b1ddbe\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-969fx" Sep 30 07:35:14 crc kubenswrapper[4760]: I0930 07:35:14.043151 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/08c7964c-4fdf-4342-90db-c6e127b1ddbe-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-969fx\" (UID: \"08c7964c-4fdf-4342-90db-c6e127b1ddbe\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-969fx" Sep 30 07:35:14 crc kubenswrapper[4760]: I0930 07:35:14.053479 4760 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-6vfjl" podStartSLOduration=79.053461763 podStartE2EDuration="1m19.053461763s" podCreationTimestamp="2025-09-30 07:33:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 07:35:13.975944882 +0000 UTC m=+99.618851334" watchObservedRunningTime="2025-09-30 07:35:14.053461763 +0000 UTC m=+99.696368175" Sep 30 07:35:14 crc kubenswrapper[4760]: I0930 07:35:14.053647 4760 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd/etcd-crc" podStartSLOduration=79.053642138 podStartE2EDuration="1m19.053642138s" 
podCreationTimestamp="2025-09-30 07:33:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 07:35:14.052587981 +0000 UTC m=+99.695494403" watchObservedRunningTime="2025-09-30 07:35:14.053642138 +0000 UTC m=+99.696548670" Sep 30 07:35:14 crc kubenswrapper[4760]: I0930 07:35:14.065980 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 07:35:14 crc kubenswrapper[4760]: I0930 07:35:14.066090 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 07:35:14 crc kubenswrapper[4760]: E0930 07:35:14.066178 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 07:35:14 crc kubenswrapper[4760]: E0930 07:35:14.066283 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 07:35:14 crc kubenswrapper[4760]: I0930 07:35:14.097102 4760 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=81.097077174 podStartE2EDuration="1m21.097077174s" podCreationTimestamp="2025-09-30 07:33:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 07:35:14.080412876 +0000 UTC m=+99.723319318" watchObservedRunningTime="2025-09-30 07:35:14.097077174 +0000 UTC m=+99.739983586" Sep 30 07:35:14 crc kubenswrapper[4760]: I0930 07:35:14.110671 4760 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podStartSLOduration=80.110648563 podStartE2EDuration="1m20.110648563s" podCreationTimestamp="2025-09-30 07:33:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 07:35:14.09690709 +0000 UTC m=+99.739813522" watchObservedRunningTime="2025-09-30 07:35:14.110648563 +0000 UTC m=+99.753554985" Sep 30 07:35:14 crc kubenswrapper[4760]: I0930 07:35:14.162808 4760 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-969fx" Sep 30 07:35:14 crc kubenswrapper[4760]: I0930 07:35:14.171135 4760 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" podStartSLOduration=49.171114897 podStartE2EDuration="49.171114897s" podCreationTimestamp="2025-09-30 07:34:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 07:35:14.1708627 +0000 UTC m=+99.813769122" watchObservedRunningTime="2025-09-30 07:35:14.171114897 +0000 UTC m=+99.814021309" Sep 30 07:35:14 crc kubenswrapper[4760]: I0930 07:35:14.249108 4760 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" podStartSLOduration=5.24908154 podStartE2EDuration="5.24908154s" podCreationTimestamp="2025-09-30 07:35:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 07:35:14.232690649 +0000 UTC m=+99.875597071" watchObservedRunningTime="2025-09-30 07:35:14.24908154 +0000 UTC m=+99.891987972" Sep 30 07:35:14 crc kubenswrapper[4760]: I0930 07:35:14.279687 4760 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-sv6wk" podStartSLOduration=80.279656776 podStartE2EDuration="1m20.279656776s" podCreationTimestamp="2025-09-30 07:33:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 07:35:14.279466841 +0000 UTC m=+99.922373273" watchObservedRunningTime="2025-09-30 07:35:14.279656776 +0000 UTC m=+99.922563228" Sep 30 07:35:14 crc kubenswrapper[4760]: I0930 07:35:14.296659 4760 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-lqpj4" podStartSLOduration=79.296626802 podStartE2EDuration="1m19.296626802s" podCreationTimestamp="2025-09-30 07:33:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 07:35:14.295093302 +0000 UTC m=+99.937999724" watchObservedRunningTime="2025-09-30 07:35:14.296626802 +0000 UTC m=+99.939533214" Sep 30 07:35:14 crc kubenswrapper[4760]: I0930 07:35:14.686641 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-969fx" event={"ID":"08c7964c-4fdf-4342-90db-c6e127b1ddbe","Type":"ContainerStarted","Data":"bdb0a9fe063d4046d6530c6bfa18be795cf6cd90b17c42096e753c6b71818f65"} Sep 30 07:35:14 crc kubenswrapper[4760]: I0930 07:35:14.686717 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-969fx" event={"ID":"08c7964c-4fdf-4342-90db-c6e127b1ddbe","Type":"ContainerStarted","Data":"4d4cc2517c71676428b2bd64fbb43612b6b3fbd85cd7e05184b226ad0b4ba4ea"} Sep 30 07:35:14 crc kubenswrapper[4760]: I0930 07:35:14.710439 4760 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-969fx" podStartSLOduration=79.710402953 podStartE2EDuration="1m19.710402953s" podCreationTimestamp="2025-09-30 07:33:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 07:35:14.708245418 +0000 UTC m=+100.351151870" watchObservedRunningTime="2025-09-30 07:35:14.710402953 +0000 UTC m=+100.353309415" Sep 30 07:35:14 crc kubenswrapper[4760]: I0930 07:35:14.927197 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ce6dcf25-c8ea-450b-9fc6-9f8aeafde757-metrics-certs\") 
pod \"network-metrics-daemon-wv8fz\" (UID: \"ce6dcf25-c8ea-450b-9fc6-9f8aeafde757\") " pod="openshift-multus/network-metrics-daemon-wv8fz" Sep 30 07:35:14 crc kubenswrapper[4760]: E0930 07:35:14.927611 4760 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Sep 30 07:35:14 crc kubenswrapper[4760]: E0930 07:35:14.928022 4760 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ce6dcf25-c8ea-450b-9fc6-9f8aeafde757-metrics-certs podName:ce6dcf25-c8ea-450b-9fc6-9f8aeafde757 nodeName:}" failed. No retries permitted until 2025-09-30 07:36:18.92780412 +0000 UTC m=+164.570710562 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/ce6dcf25-c8ea-450b-9fc6-9f8aeafde757-metrics-certs") pod "network-metrics-daemon-wv8fz" (UID: "ce6dcf25-c8ea-450b-9fc6-9f8aeafde757") : object "openshift-multus"/"metrics-daemon-secret" not registered Sep 30 07:35:15 crc kubenswrapper[4760]: I0930 07:35:15.066619 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wv8fz" Sep 30 07:35:15 crc kubenswrapper[4760]: I0930 07:35:15.066663 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 07:35:15 crc kubenswrapper[4760]: E0930 07:35:15.069267 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-wv8fz" podUID="ce6dcf25-c8ea-450b-9fc6-9f8aeafde757" Sep 30 07:35:15 crc kubenswrapper[4760]: E0930 07:35:15.069610 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 07:35:16 crc kubenswrapper[4760]: I0930 07:35:16.066549 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 07:35:16 crc kubenswrapper[4760]: I0930 07:35:16.066650 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 07:35:16 crc kubenswrapper[4760]: E0930 07:35:16.067711 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 07:35:16 crc kubenswrapper[4760]: E0930 07:35:16.067773 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 07:35:16 crc kubenswrapper[4760]: I0930 07:35:16.067957 4760 scope.go:117] "RemoveContainer" containerID="9003b7a7d50bae8b7464fd7cbf7f18dcc7a60af1c5297a18e2f9f8482d161912" Sep 30 07:35:16 crc kubenswrapper[4760]: E0930 07:35:16.068200 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-sspvl_openshift-ovn-kubernetes(2c4ca8ea-a714-40e5-9e10-080aef32237b)\"" pod="openshift-ovn-kubernetes/ovnkube-node-sspvl" podUID="2c4ca8ea-a714-40e5-9e10-080aef32237b" Sep 30 07:35:17 crc kubenswrapper[4760]: I0930 07:35:17.066713 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 07:35:17 crc kubenswrapper[4760]: I0930 07:35:17.066890 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wv8fz" Sep 30 07:35:17 crc kubenswrapper[4760]: E0930 07:35:17.066969 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 07:35:17 crc kubenswrapper[4760]: E0930 07:35:17.067113 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-wv8fz" podUID="ce6dcf25-c8ea-450b-9fc6-9f8aeafde757" Sep 30 07:35:18 crc kubenswrapper[4760]: I0930 07:35:18.066254 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 07:35:18 crc kubenswrapper[4760]: I0930 07:35:18.066588 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 07:35:18 crc kubenswrapper[4760]: E0930 07:35:18.066828 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 07:35:18 crc kubenswrapper[4760]: E0930 07:35:18.067246 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 07:35:19 crc kubenswrapper[4760]: I0930 07:35:19.065934 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 07:35:19 crc kubenswrapper[4760]: I0930 07:35:19.066018 4760 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-wv8fz" Sep 30 07:35:19 crc kubenswrapper[4760]: E0930 07:35:19.066159 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 07:35:19 crc kubenswrapper[4760]: E0930 07:35:19.066525 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wv8fz" podUID="ce6dcf25-c8ea-450b-9fc6-9f8aeafde757" Sep 30 07:35:20 crc kubenswrapper[4760]: I0930 07:35:20.066597 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 07:35:20 crc kubenswrapper[4760]: I0930 07:35:20.066639 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 07:35:20 crc kubenswrapper[4760]: E0930 07:35:20.067195 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 07:35:20 crc kubenswrapper[4760]: E0930 07:35:20.067381 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 07:35:21 crc kubenswrapper[4760]: I0930 07:35:21.066128 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wv8fz" Sep 30 07:35:21 crc kubenswrapper[4760]: I0930 07:35:21.066188 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 07:35:21 crc kubenswrapper[4760]: E0930 07:35:21.066353 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wv8fz" podUID="ce6dcf25-c8ea-450b-9fc6-9f8aeafde757" Sep 30 07:35:21 crc kubenswrapper[4760]: E0930 07:35:21.066488 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 07:35:22 crc kubenswrapper[4760]: I0930 07:35:22.066213 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 07:35:22 crc kubenswrapper[4760]: I0930 07:35:22.066355 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 07:35:22 crc kubenswrapper[4760]: E0930 07:35:22.066441 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 07:35:22 crc kubenswrapper[4760]: E0930 07:35:22.066598 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 07:35:23 crc kubenswrapper[4760]: I0930 07:35:23.066932 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wv8fz" Sep 30 07:35:23 crc kubenswrapper[4760]: I0930 07:35:23.067419 4760 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 07:35:23 crc kubenswrapper[4760]: E0930 07:35:23.067834 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 07:35:23 crc kubenswrapper[4760]: E0930 07:35:23.067918 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wv8fz" podUID="ce6dcf25-c8ea-450b-9fc6-9f8aeafde757" Sep 30 07:35:24 crc kubenswrapper[4760]: I0930 07:35:24.066120 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 07:35:24 crc kubenswrapper[4760]: I0930 07:35:24.066164 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 07:35:24 crc kubenswrapper[4760]: E0930 07:35:24.066415 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 07:35:24 crc kubenswrapper[4760]: E0930 07:35:24.066528 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 07:35:25 crc kubenswrapper[4760]: I0930 07:35:25.066610 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wv8fz" Sep 30 07:35:25 crc kubenswrapper[4760]: I0930 07:35:25.066670 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 07:35:25 crc kubenswrapper[4760]: E0930 07:35:25.068391 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wv8fz" podUID="ce6dcf25-c8ea-450b-9fc6-9f8aeafde757" Sep 30 07:35:25 crc kubenswrapper[4760]: E0930 07:35:25.068559 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 07:35:26 crc kubenswrapper[4760]: I0930 07:35:26.066821 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 07:35:26 crc kubenswrapper[4760]: I0930 07:35:26.066821 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 07:35:26 crc kubenswrapper[4760]: E0930 07:35:26.067094 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 07:35:26 crc kubenswrapper[4760]: E0930 07:35:26.067225 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 07:35:27 crc kubenswrapper[4760]: I0930 07:35:27.066292 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wv8fz" Sep 30 07:35:27 crc kubenswrapper[4760]: I0930 07:35:27.066344 4760 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 07:35:27 crc kubenswrapper[4760]: E0930 07:35:27.066520 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wv8fz" podUID="ce6dcf25-c8ea-450b-9fc6-9f8aeafde757" Sep 30 07:35:27 crc kubenswrapper[4760]: E0930 07:35:27.066613 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 07:35:28 crc kubenswrapper[4760]: I0930 07:35:28.066387 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 07:35:28 crc kubenswrapper[4760]: E0930 07:35:28.066550 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 07:35:28 crc kubenswrapper[4760]: I0930 07:35:28.066553 4760 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 07:35:28 crc kubenswrapper[4760]: E0930 07:35:28.066809 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 07:35:29 crc kubenswrapper[4760]: I0930 07:35:29.066656 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wv8fz" Sep 30 07:35:29 crc kubenswrapper[4760]: I0930 07:35:29.066702 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 07:35:29 crc kubenswrapper[4760]: E0930 07:35:29.067260 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wv8fz" podUID="ce6dcf25-c8ea-450b-9fc6-9f8aeafde757" Sep 30 07:35:29 crc kubenswrapper[4760]: E0930 07:35:29.067552 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 07:35:29 crc kubenswrapper[4760]: I0930 07:35:29.067736 4760 scope.go:117] "RemoveContainer" containerID="9003b7a7d50bae8b7464fd7cbf7f18dcc7a60af1c5297a18e2f9f8482d161912" Sep 30 07:35:29 crc kubenswrapper[4760]: E0930 07:35:29.067987 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-sspvl_openshift-ovn-kubernetes(2c4ca8ea-a714-40e5-9e10-080aef32237b)\"" pod="openshift-ovn-kubernetes/ovnkube-node-sspvl" podUID="2c4ca8ea-a714-40e5-9e10-080aef32237b" Sep 30 07:35:30 crc kubenswrapper[4760]: I0930 07:35:30.066208 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 07:35:30 crc kubenswrapper[4760]: I0930 07:35:30.066337 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 07:35:30 crc kubenswrapper[4760]: E0930 07:35:30.066465 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 07:35:30 crc kubenswrapper[4760]: E0930 07:35:30.066702 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 07:35:30 crc kubenswrapper[4760]: I0930 07:35:30.754343 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-lvdpk_f50c364e-d22c-4fe5-a0aa-66f4e8d8b21e/kube-multus/1.log" Sep 30 07:35:30 crc kubenswrapper[4760]: I0930 07:35:30.755380 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-lvdpk_f50c364e-d22c-4fe5-a0aa-66f4e8d8b21e/kube-multus/0.log" Sep 30 07:35:30 crc kubenswrapper[4760]: I0930 07:35:30.755485 4760 generic.go:334] "Generic (PLEG): container finished" podID="f50c364e-d22c-4fe5-a0aa-66f4e8d8b21e" containerID="0b2f3cfbeb8083c685469a0d988253e1fd9c2403954dda3cc742b87225c82927" exitCode=1 Sep 30 07:35:30 crc kubenswrapper[4760]: I0930 07:35:30.755546 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-lvdpk" event={"ID":"f50c364e-d22c-4fe5-a0aa-66f4e8d8b21e","Type":"ContainerDied","Data":"0b2f3cfbeb8083c685469a0d988253e1fd9c2403954dda3cc742b87225c82927"} Sep 30 07:35:30 crc kubenswrapper[4760]: I0930 07:35:30.755609 4760 scope.go:117] "RemoveContainer" containerID="96fd613333f4911d54aca94a38724570ef878c3f55ef48caa79eb1c14d0b2014" Sep 30 07:35:30 crc kubenswrapper[4760]: I0930 07:35:30.757419 4760 scope.go:117] "RemoveContainer" containerID="0b2f3cfbeb8083c685469a0d988253e1fd9c2403954dda3cc742b87225c82927" Sep 30 07:35:30 crc kubenswrapper[4760]: E0930 07:35:30.758020 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-multus pod=multus-lvdpk_openshift-multus(f50c364e-d22c-4fe5-a0aa-66f4e8d8b21e)\"" pod="openshift-multus/multus-lvdpk" podUID="f50c364e-d22c-4fe5-a0aa-66f4e8d8b21e" Sep 30 07:35:31 crc kubenswrapper[4760]: I0930 07:35:31.065973 4760 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 07:35:31 crc kubenswrapper[4760]: I0930 07:35:31.066143 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wv8fz" Sep 30 07:35:31 crc kubenswrapper[4760]: E0930 07:35:31.066405 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 07:35:31 crc kubenswrapper[4760]: E0930 07:35:31.066782 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wv8fz" podUID="ce6dcf25-c8ea-450b-9fc6-9f8aeafde757" Sep 30 07:35:31 crc kubenswrapper[4760]: I0930 07:35:31.761829 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-lvdpk_f50c364e-d22c-4fe5-a0aa-66f4e8d8b21e/kube-multus/1.log" Sep 30 07:35:32 crc kubenswrapper[4760]: I0930 07:35:32.066787 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 07:35:32 crc kubenswrapper[4760]: I0930 07:35:32.066842 4760 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 07:35:32 crc kubenswrapper[4760]: E0930 07:35:32.067002 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 07:35:32 crc kubenswrapper[4760]: E0930 07:35:32.067125 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 07:35:33 crc kubenswrapper[4760]: I0930 07:35:33.066086 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wv8fz" Sep 30 07:35:33 crc kubenswrapper[4760]: I0930 07:35:33.066151 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 07:35:33 crc kubenswrapper[4760]: E0930 07:35:33.066378 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-wv8fz" podUID="ce6dcf25-c8ea-450b-9fc6-9f8aeafde757" Sep 30 07:35:33 crc kubenswrapper[4760]: E0930 07:35:33.066554 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 07:35:34 crc kubenswrapper[4760]: I0930 07:35:34.066140 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 07:35:34 crc kubenswrapper[4760]: E0930 07:35:34.066281 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 07:35:34 crc kubenswrapper[4760]: I0930 07:35:34.066140 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 07:35:34 crc kubenswrapper[4760]: E0930 07:35:34.066389 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 07:35:35 crc kubenswrapper[4760]: E0930 07:35:35.055463 4760 kubelet_node_status.go:497] "Node not becoming ready in time after startup" Sep 30 07:35:35 crc kubenswrapper[4760]: I0930 07:35:35.066770 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 07:35:35 crc kubenswrapper[4760]: I0930 07:35:35.066905 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wv8fz" Sep 30 07:35:35 crc kubenswrapper[4760]: E0930 07:35:35.068662 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 07:35:35 crc kubenswrapper[4760]: E0930 07:35:35.068782 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wv8fz" podUID="ce6dcf25-c8ea-450b-9fc6-9f8aeafde757" Sep 30 07:35:35 crc kubenswrapper[4760]: E0930 07:35:35.196356 4760 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Sep 30 07:35:36 crc kubenswrapper[4760]: I0930 07:35:36.066540 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 07:35:36 crc kubenswrapper[4760]: I0930 07:35:36.066642 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 07:35:36 crc kubenswrapper[4760]: E0930 07:35:36.066747 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 07:35:36 crc kubenswrapper[4760]: E0930 07:35:36.066956 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 07:35:37 crc kubenswrapper[4760]: I0930 07:35:37.065866 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 07:35:37 crc kubenswrapper[4760]: E0930 07:35:37.066323 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 07:35:37 crc kubenswrapper[4760]: I0930 07:35:37.065930 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wv8fz" Sep 30 07:35:37 crc kubenswrapper[4760]: E0930 07:35:37.066566 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wv8fz" podUID="ce6dcf25-c8ea-450b-9fc6-9f8aeafde757" Sep 30 07:35:38 crc kubenswrapper[4760]: I0930 07:35:38.066798 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 07:35:38 crc kubenswrapper[4760]: I0930 07:35:38.066812 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 07:35:38 crc kubenswrapper[4760]: E0930 07:35:38.067007 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 07:35:38 crc kubenswrapper[4760]: E0930 07:35:38.067107 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 07:35:39 crc kubenswrapper[4760]: I0930 07:35:39.066354 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wv8fz" Sep 30 07:35:39 crc kubenswrapper[4760]: I0930 07:35:39.066428 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 07:35:39 crc kubenswrapper[4760]: E0930 07:35:39.066665 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wv8fz" podUID="ce6dcf25-c8ea-450b-9fc6-9f8aeafde757" Sep 30 07:35:39 crc kubenswrapper[4760]: E0930 07:35:39.066896 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 07:35:40 crc kubenswrapper[4760]: I0930 07:35:40.066926 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 07:35:40 crc kubenswrapper[4760]: I0930 07:35:40.067278 4760 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 07:35:40 crc kubenswrapper[4760]: E0930 07:35:40.067387 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 07:35:40 crc kubenswrapper[4760]: E0930 07:35:40.067952 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 07:35:40 crc kubenswrapper[4760]: I0930 07:35:40.068576 4760 scope.go:117] "RemoveContainer" containerID="9003b7a7d50bae8b7464fd7cbf7f18dcc7a60af1c5297a18e2f9f8482d161912" Sep 30 07:35:40 crc kubenswrapper[4760]: E0930 07:35:40.197889 4760 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Sep 30 07:35:40 crc kubenswrapper[4760]: I0930 07:35:40.801504 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-sspvl_2c4ca8ea-a714-40e5-9e10-080aef32237b/ovnkube-controller/3.log" Sep 30 07:35:40 crc kubenswrapper[4760]: I0930 07:35:40.805693 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-sspvl" event={"ID":"2c4ca8ea-a714-40e5-9e10-080aef32237b","Type":"ContainerStarted","Data":"3d9205e4d676fa6a90c4cef893077a6bc91d50564b15f3b221e5827c2b3025ce"} Sep 30 07:35:40 crc kubenswrapper[4760]: I0930 07:35:40.806510 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-sspvl" Sep 30 07:35:41 crc kubenswrapper[4760]: I0930 07:35:41.052868 4760 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-sspvl" podStartSLOduration=106.052832865 podStartE2EDuration="1m46.052832865s" podCreationTimestamp="2025-09-30 07:33:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 07:35:40.855534915 +0000 UTC m=+126.498441347" watchObservedRunningTime="2025-09-30 07:35:41.052832865 +0000 UTC m=+126.695739317" Sep 30 07:35:41 crc kubenswrapper[4760]: I0930 07:35:41.053650 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-wv8fz"] Sep 30 07:35:41 crc kubenswrapper[4760]: I0930 07:35:41.053843 4760 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-wv8fz" Sep 30 07:35:41 crc kubenswrapper[4760]: E0930 07:35:41.054003 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wv8fz" podUID="ce6dcf25-c8ea-450b-9fc6-9f8aeafde757" Sep 30 07:35:41 crc kubenswrapper[4760]: I0930 07:35:41.067105 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 07:35:41 crc kubenswrapper[4760]: E0930 07:35:41.067387 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 07:35:42 crc kubenswrapper[4760]: I0930 07:35:42.065879 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 07:35:42 crc kubenswrapper[4760]: I0930 07:35:42.065906 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 07:35:42 crc kubenswrapper[4760]: E0930 07:35:42.066597 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 07:35:42 crc kubenswrapper[4760]: E0930 07:35:42.066732 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 07:35:43 crc kubenswrapper[4760]: I0930 07:35:43.066532 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wv8fz" Sep 30 07:35:43 crc kubenswrapper[4760]: I0930 07:35:43.066609 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 07:35:43 crc kubenswrapper[4760]: E0930 07:35:43.067439 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wv8fz" podUID="ce6dcf25-c8ea-450b-9fc6-9f8aeafde757" Sep 30 07:35:43 crc kubenswrapper[4760]: E0930 07:35:43.067604 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 07:35:44 crc kubenswrapper[4760]: I0930 07:35:44.066924 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 07:35:44 crc kubenswrapper[4760]: I0930 07:35:44.067042 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 07:35:44 crc kubenswrapper[4760]: E0930 07:35:44.067194 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 07:35:44 crc kubenswrapper[4760]: I0930 07:35:44.067278 4760 scope.go:117] "RemoveContainer" containerID="0b2f3cfbeb8083c685469a0d988253e1fd9c2403954dda3cc742b87225c82927" Sep 30 07:35:44 crc kubenswrapper[4760]: E0930 07:35:44.067753 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 07:35:44 crc kubenswrapper[4760]: I0930 07:35:44.831437 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-lvdpk_f50c364e-d22c-4fe5-a0aa-66f4e8d8b21e/kube-multus/1.log" Sep 30 07:35:44 crc kubenswrapper[4760]: I0930 07:35:44.831527 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-lvdpk" event={"ID":"f50c364e-d22c-4fe5-a0aa-66f4e8d8b21e","Type":"ContainerStarted","Data":"2db52c47db3f1a41355726d96c0fc8510bc80589120da7e96b2b0af67aecea6a"} Sep 30 07:35:45 crc kubenswrapper[4760]: I0930 07:35:45.066006 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 07:35:45 crc kubenswrapper[4760]: I0930 07:35:45.066029 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wv8fz" Sep 30 07:35:45 crc kubenswrapper[4760]: E0930 07:35:45.067189 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 07:35:45 crc kubenswrapper[4760]: E0930 07:35:45.067504 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-wv8fz" podUID="ce6dcf25-c8ea-450b-9fc6-9f8aeafde757" Sep 30 07:35:45 crc kubenswrapper[4760]: E0930 07:35:45.198515 4760 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Sep 30 07:35:46 crc kubenswrapper[4760]: I0930 07:35:46.066526 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 07:35:46 crc kubenswrapper[4760]: I0930 07:35:46.066620 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 07:35:46 crc kubenswrapper[4760]: E0930 07:35:46.066782 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 07:35:46 crc kubenswrapper[4760]: E0930 07:35:46.066943 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 07:35:47 crc kubenswrapper[4760]: I0930 07:35:47.065889 4760 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 07:35:47 crc kubenswrapper[4760]: I0930 07:35:47.066034 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wv8fz" Sep 30 07:35:47 crc kubenswrapper[4760]: E0930 07:35:47.066538 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 07:35:47 crc kubenswrapper[4760]: E0930 07:35:47.066588 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wv8fz" podUID="ce6dcf25-c8ea-450b-9fc6-9f8aeafde757" Sep 30 07:35:48 crc kubenswrapper[4760]: I0930 07:35:48.066785 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 07:35:48 crc kubenswrapper[4760]: I0930 07:35:48.066947 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 07:35:48 crc kubenswrapper[4760]: E0930 07:35:48.067026 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 07:35:48 crc kubenswrapper[4760]: E0930 07:35:48.067116 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 07:35:49 crc kubenswrapper[4760]: I0930 07:35:49.066653 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 07:35:49 crc kubenswrapper[4760]: I0930 07:35:49.066758 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wv8fz" Sep 30 07:35:49 crc kubenswrapper[4760]: E0930 07:35:49.066869 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 07:35:49 crc kubenswrapper[4760]: E0930 07:35:49.066954 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-wv8fz" podUID="ce6dcf25-c8ea-450b-9fc6-9f8aeafde757" Sep 30 07:35:50 crc kubenswrapper[4760]: I0930 07:35:50.066529 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 07:35:50 crc kubenswrapper[4760]: E0930 07:35:50.066695 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 07:35:50 crc kubenswrapper[4760]: I0930 07:35:50.066957 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 07:35:50 crc kubenswrapper[4760]: E0930 07:35:50.067045 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 07:35:51 crc kubenswrapper[4760]: I0930 07:35:51.066630 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 07:35:51 crc kubenswrapper[4760]: I0930 07:35:51.066693 4760 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-wv8fz" Sep 30 07:35:51 crc kubenswrapper[4760]: I0930 07:35:51.069765 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Sep 30 07:35:51 crc kubenswrapper[4760]: I0930 07:35:51.070364 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Sep 30 07:35:51 crc kubenswrapper[4760]: I0930 07:35:51.070718 4760 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Sep 30 07:35:51 crc kubenswrapper[4760]: I0930 07:35:51.071153 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Sep 30 07:35:52 crc kubenswrapper[4760]: I0930 07:35:52.065977 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 07:35:52 crc kubenswrapper[4760]: I0930 07:35:52.066593 4760 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 07:35:52 crc kubenswrapper[4760]: I0930 07:35:52.068990 4760 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Sep 30 07:35:52 crc kubenswrapper[4760]: I0930 07:35:52.069237 4760 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Sep 30 07:35:53 crc kubenswrapper[4760]: I0930 07:35:53.245985 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-sspvl" Sep 30 07:35:54 crc kubenswrapper[4760]: I0930 07:35:54.656670 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeReady" Sep 30 07:35:54 crc kubenswrapper[4760]: I0930 07:35:54.705959 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-bhzlk"] Sep 30 07:35:54 crc kubenswrapper[4760]: I0930 07:35:54.707127 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-bhzlk" Sep 30 07:35:54 crc kubenswrapper[4760]: I0930 07:35:54.709924 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-6cvdd"] Sep 30 07:35:54 crc kubenswrapper[4760]: I0930 07:35:54.710548 4760 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-6cvdd" Sep 30 07:35:54 crc kubenswrapper[4760]: I0930 07:35:54.713885 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Sep 30 07:35:54 crc kubenswrapper[4760]: I0930 07:35:54.715447 4760 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Sep 30 07:35:54 crc kubenswrapper[4760]: I0930 07:35:54.715488 4760 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Sep 30 07:35:54 crc kubenswrapper[4760]: I0930 07:35:54.715906 4760 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Sep 30 07:35:54 crc kubenswrapper[4760]: I0930 07:35:54.716205 4760 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Sep 30 07:35:54 crc kubenswrapper[4760]: I0930 07:35:54.717024 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-4lv9x"] Sep 30 07:35:54 crc kubenswrapper[4760]: I0930 07:35:54.717831 4760 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-4lv9x" Sep 30 07:35:54 crc kubenswrapper[4760]: I0930 07:35:54.717849 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Sep 30 07:35:54 crc kubenswrapper[4760]: I0930 07:35:54.719438 4760 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Sep 30 07:35:54 crc kubenswrapper[4760]: I0930 07:35:54.719609 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-m8vl7"] Sep 30 07:35:54 crc kubenswrapper[4760]: I0930 07:35:54.720113 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-m8vl7" Sep 30 07:35:54 crc kubenswrapper[4760]: I0930 07:35:54.724695 4760 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Sep 30 07:35:54 crc kubenswrapper[4760]: I0930 07:35:54.725202 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-qc22x"] Sep 30 07:35:54 crc kubenswrapper[4760]: I0930 07:35:54.725578 4760 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Sep 30 07:35:54 crc kubenswrapper[4760]: I0930 07:35:54.725898 4760 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Sep 30 07:35:54 crc kubenswrapper[4760]: I0930 07:35:54.725947 4760 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Sep 30 07:35:54 crc kubenswrapper[4760]: I0930 07:35:54.726117 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff" Sep 30 
07:35:54 crc kubenswrapper[4760]: I0930 07:35:54.726224 4760 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images" Sep 30 07:35:54 crc kubenswrapper[4760]: I0930 07:35:54.726296 4760 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy" Sep 30 07:35:54 crc kubenswrapper[4760]: I0930 07:35:54.726406 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-qc22x" Sep 30 07:35:54 crc kubenswrapper[4760]: I0930 07:35:54.728351 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Sep 30 07:35:54 crc kubenswrapper[4760]: I0930 07:35:54.728986 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7" Sep 30 07:35:54 crc kubenswrapper[4760]: I0930 07:35:54.729531 4760 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Sep 30 07:35:54 crc kubenswrapper[4760]: I0930 07:35:54.729798 4760 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Sep 30 07:35:54 crc kubenswrapper[4760]: I0930 07:35:54.730198 4760 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Sep 30 07:35:54 crc kubenswrapper[4760]: I0930 07:35:54.731812 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Sep 30 07:35:54 crc kubenswrapper[4760]: I0930 07:35:54.732549 4760 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Sep 30 07:35:54 crc kubenswrapper[4760]: I0930 07:35:54.732614 4760 reflector.go:368] Caches populated for *v1.ConfigMap 
from object-"openshift-machine-api"/"kube-root-ca.crt" Sep 30 07:35:54 crc kubenswrapper[4760]: I0930 07:35:54.732816 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Sep 30 07:35:54 crc kubenswrapper[4760]: I0930 07:35:54.732930 4760 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Sep 30 07:35:54 crc kubenswrapper[4760]: I0930 07:35:54.734736 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-q5spr"] Sep 30 07:35:54 crc kubenswrapper[4760]: I0930 07:35:54.735513 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-q5spr" Sep 30 07:35:54 crc kubenswrapper[4760]: I0930 07:35:54.735705 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Sep 30 07:35:54 crc kubenswrapper[4760]: I0930 07:35:54.736150 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Sep 30 07:35:54 crc kubenswrapper[4760]: I0930 07:35:54.736495 4760 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Sep 30 07:35:54 crc kubenswrapper[4760]: I0930 07:35:54.736743 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Sep 30 07:35:54 crc kubenswrapper[4760]: I0930 07:35:54.736994 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls" Sep 30 07:35:54 crc kubenswrapper[4760]: I0930 07:35:54.737139 4760 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Sep 30 07:35:54 crc kubenswrapper[4760]: I0930 07:35:54.737340 4760 reflector.go:368] Caches populated for 
*v1.Secret from object-"openshift-apiserver"/"serving-cert" Sep 30 07:35:54 crc kubenswrapper[4760]: I0930 07:35:54.740154 4760 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Sep 30 07:35:54 crc kubenswrapper[4760]: I0930 07:35:54.741325 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-machine-approver/machine-approver-56656f9798-lgt64"] Sep 30 07:35:54 crc kubenswrapper[4760]: I0930 07:35:54.741984 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-ntj6x"] Sep 30 07:35:54 crc kubenswrapper[4760]: I0930 07:35:54.742432 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-ntj6x" Sep 30 07:35:54 crc kubenswrapper[4760]: I0930 07:35:54.742888 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-lgt64" Sep 30 07:35:54 crc kubenswrapper[4760]: I0930 07:35:54.751429 4760 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Sep 30 07:35:54 crc kubenswrapper[4760]: I0930 07:35:54.751466 4760 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Sep 30 07:35:54 crc kubenswrapper[4760]: I0930 07:35:54.758529 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-77ttc"] Sep 30 07:35:54 crc kubenswrapper[4760]: I0930 07:35:54.767590 4760 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-77ttc" Sep 30 07:35:54 crc kubenswrapper[4760]: I0930 07:35:54.770321 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-6cvdd"] Sep 30 07:35:54 crc kubenswrapper[4760]: I0930 07:35:54.776542 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq" Sep 30 07:35:54 crc kubenswrapper[4760]: I0930 07:35:54.776988 4760 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Sep 30 07:35:54 crc kubenswrapper[4760]: I0930 07:35:54.777206 4760 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Sep 30 07:35:54 crc kubenswrapper[4760]: I0930 07:35:54.781200 4760 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Sep 30 07:35:54 crc kubenswrapper[4760]: I0930 07:35:54.782777 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2fb43f32-6ad4-4450-8a05-80570020d5e8-config\") pod \"machine-api-operator-5694c8668f-4lv9x\" (UID: \"2fb43f32-6ad4-4450-8a05-80570020d5e8\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-4lv9x" Sep 30 07:35:54 crc kubenswrapper[4760]: I0930 07:35:54.782814 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fa5b63b3-2bc6-496c-8841-471e2f43021c-serving-cert\") pod \"controller-manager-879f6c89f-6cvdd\" (UID: \"fa5b63b3-2bc6-496c-8841-471e2f43021c\") " pod="openshift-controller-manager/controller-manager-879f6c89f-6cvdd" Sep 30 07:35:54 crc kubenswrapper[4760]: I0930 07:35:54.782841 4760 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-djkdl\" (UniqueName: \"kubernetes.io/projected/fa5b63b3-2bc6-496c-8841-471e2f43021c-kube-api-access-djkdl\") pod \"controller-manager-879f6c89f-6cvdd\" (UID: \"fa5b63b3-2bc6-496c-8841-471e2f43021c\") " pod="openshift-controller-manager/controller-manager-879f6c89f-6cvdd" Sep 30 07:35:54 crc kubenswrapper[4760]: I0930 07:35:54.782863 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9cmtf\" (UniqueName: \"kubernetes.io/projected/7c05864f-63f6-4fdf-9207-0d63dd89fc49-kube-api-access-9cmtf\") pod \"apiserver-7bbb656c7d-qc22x\" (UID: \"7c05864f-63f6-4fdf-9207-0d63dd89fc49\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-qc22x" Sep 30 07:35:54 crc kubenswrapper[4760]: I0930 07:35:54.782883 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4f56f\" (UniqueName: \"kubernetes.io/projected/179c4bc2-b28d-445b-98f2-aa307d57cd9f-kube-api-access-4f56f\") pod \"apiserver-76f77b778f-bhzlk\" (UID: \"179c4bc2-b28d-445b-98f2-aa307d57cd9f\") " pod="openshift-apiserver/apiserver-76f77b778f-bhzlk" Sep 30 07:35:54 crc kubenswrapper[4760]: I0930 07:35:54.782904 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/179c4bc2-b28d-445b-98f2-aa307d57cd9f-encryption-config\") pod \"apiserver-76f77b778f-bhzlk\" (UID: \"179c4bc2-b28d-445b-98f2-aa307d57cd9f\") " pod="openshift-apiserver/apiserver-76f77b778f-bhzlk" Sep 30 07:35:54 crc kubenswrapper[4760]: I0930 07:35:54.782925 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/f2f25243-2a0b-498f-8de6-8b0a21c72c49-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-77ttc\" (UID: 
\"f2f25243-2a0b-498f-8de6-8b0a21c72c49\") " pod="openshift-authentication/oauth-openshift-558db77b4-77ttc" Sep 30 07:35:54 crc kubenswrapper[4760]: I0930 07:35:54.782945 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3f80e44c-0738-48d0-b6e5-783d696c5ec4-service-ca-bundle\") pod \"authentication-operator-69f744f599-q5spr\" (UID: \"3f80e44c-0738-48d0-b6e5-783d696c5ec4\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-q5spr" Sep 30 07:35:54 crc kubenswrapper[4760]: I0930 07:35:54.782967 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/2bcd213c-83e6-4a23-9542-727b1254b17d-machine-approver-tls\") pod \"machine-approver-56656f9798-lgt64\" (UID: \"2bcd213c-83e6-4a23-9542-727b1254b17d\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-lgt64" Sep 30 07:35:54 crc kubenswrapper[4760]: I0930 07:35:54.783001 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3f80e44c-0738-48d0-b6e5-783d696c5ec4-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-q5spr\" (UID: \"3f80e44c-0738-48d0-b6e5-783d696c5ec4\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-q5spr" Sep 30 07:35:54 crc kubenswrapper[4760]: I0930 07:35:54.783051 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fa5b63b3-2bc6-496c-8841-471e2f43021c-config\") pod \"controller-manager-879f6c89f-6cvdd\" (UID: \"fa5b63b3-2bc6-496c-8841-471e2f43021c\") " pod="openshift-controller-manager/controller-manager-879f6c89f-6cvdd" Sep 30 07:35:54 crc kubenswrapper[4760]: I0930 07:35:54.783071 4760 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ff7ef449-b6c7-4e55-886d-b66dd8327e7b-client-ca\") pod \"route-controller-manager-6576b87f9c-m8vl7\" (UID: \"ff7ef449-b6c7-4e55-886d-b66dd8327e7b\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-m8vl7" Sep 30 07:35:54 crc kubenswrapper[4760]: I0930 07:35:54.783091 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3f80e44c-0738-48d0-b6e5-783d696c5ec4-config\") pod \"authentication-operator-69f744f599-q5spr\" (UID: \"3f80e44c-0738-48d0-b6e5-783d696c5ec4\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-q5spr" Sep 30 07:35:54 crc kubenswrapper[4760]: I0930 07:35:54.783109 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/179c4bc2-b28d-445b-98f2-aa307d57cd9f-serving-cert\") pod \"apiserver-76f77b778f-bhzlk\" (UID: \"179c4bc2-b28d-445b-98f2-aa307d57cd9f\") " pod="openshift-apiserver/apiserver-76f77b778f-bhzlk" Sep 30 07:35:54 crc kubenswrapper[4760]: I0930 07:35:54.783129 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/f2f25243-2a0b-498f-8de6-8b0a21c72c49-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-77ttc\" (UID: \"f2f25243-2a0b-498f-8de6-8b0a21c72c49\") " pod="openshift-authentication/oauth-openshift-558db77b4-77ttc" Sep 30 07:35:54 crc kubenswrapper[4760]: I0930 07:35:54.783147 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3f80e44c-0738-48d0-b6e5-783d696c5ec4-serving-cert\") pod \"authentication-operator-69f744f599-q5spr\" (UID: 
\"3f80e44c-0738-48d0-b6e5-783d696c5ec4\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-q5spr" Sep 30 07:35:54 crc kubenswrapper[4760]: I0930 07:35:54.783166 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/7c05864f-63f6-4fdf-9207-0d63dd89fc49-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-qc22x\" (UID: \"7c05864f-63f6-4fdf-9207-0d63dd89fc49\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-qc22x" Sep 30 07:35:54 crc kubenswrapper[4760]: I0930 07:35:54.783196 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/cbe4e2d5-6999-435b-b0af-d585a0ef1f5f-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-ntj6x\" (UID: \"cbe4e2d5-6999-435b-b0af-d585a0ef1f5f\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-ntj6x" Sep 30 07:35:54 crc kubenswrapper[4760]: I0930 07:35:54.783214 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/179c4bc2-b28d-445b-98f2-aa307d57cd9f-audit-dir\") pod \"apiserver-76f77b778f-bhzlk\" (UID: \"179c4bc2-b28d-445b-98f2-aa307d57cd9f\") " pod="openshift-apiserver/apiserver-76f77b778f-bhzlk" Sep 30 07:35:54 crc kubenswrapper[4760]: I0930 07:35:54.783232 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/f2f25243-2a0b-498f-8de6-8b0a21c72c49-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-77ttc\" (UID: \"f2f25243-2a0b-498f-8de6-8b0a21c72c49\") " pod="openshift-authentication/oauth-openshift-558db77b4-77ttc" Sep 30 07:35:54 crc kubenswrapper[4760]: I0930 07:35:54.783251 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/179c4bc2-b28d-445b-98f2-aa307d57cd9f-audit\") pod \"apiserver-76f77b778f-bhzlk\" (UID: \"179c4bc2-b28d-445b-98f2-aa307d57cd9f\") " pod="openshift-apiserver/apiserver-76f77b778f-bhzlk" Sep 30 07:35:54 crc kubenswrapper[4760]: I0930 07:35:54.783270 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c8vvk\" (UniqueName: \"kubernetes.io/projected/cbe4e2d5-6999-435b-b0af-d585a0ef1f5f-kube-api-access-c8vvk\") pod \"openshift-apiserver-operator-796bbdcf4f-ntj6x\" (UID: \"cbe4e2d5-6999-435b-b0af-d585a0ef1f5f\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-ntj6x" Sep 30 07:35:54 crc kubenswrapper[4760]: I0930 07:35:54.783289 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/f2f25243-2a0b-498f-8de6-8b0a21c72c49-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-77ttc\" (UID: \"f2f25243-2a0b-498f-8de6-8b0a21c72c49\") " pod="openshift-authentication/oauth-openshift-558db77b4-77ttc" Sep 30 07:35:54 crc kubenswrapper[4760]: I0930 07:35:54.783325 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/f2f25243-2a0b-498f-8de6-8b0a21c72c49-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-77ttc\" (UID: \"f2f25243-2a0b-498f-8de6-8b0a21c72c49\") " pod="openshift-authentication/oauth-openshift-558db77b4-77ttc" Sep 30 07:35:54 crc kubenswrapper[4760]: I0930 07:35:54.783346 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/fa5b63b3-2bc6-496c-8841-471e2f43021c-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-6cvdd\" (UID: 
\"fa5b63b3-2bc6-496c-8841-471e2f43021c\") " pod="openshift-controller-manager/controller-manager-879f6c89f-6cvdd" Sep 30 07:35:54 crc kubenswrapper[4760]: I0930 07:35:54.783418 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/7c05864f-63f6-4fdf-9207-0d63dd89fc49-etcd-client\") pod \"apiserver-7bbb656c7d-qc22x\" (UID: \"7c05864f-63f6-4fdf-9207-0d63dd89fc49\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-qc22x" Sep 30 07:35:54 crc kubenswrapper[4760]: I0930 07:35:54.783455 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f2f25243-2a0b-498f-8de6-8b0a21c72c49-audit-dir\") pod \"oauth-openshift-558db77b4-77ttc\" (UID: \"f2f25243-2a0b-498f-8de6-8b0a21c72c49\") " pod="openshift-authentication/oauth-openshift-558db77b4-77ttc" Sep 30 07:35:54 crc kubenswrapper[4760]: I0930 07:35:54.783502 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/179c4bc2-b28d-445b-98f2-aa307d57cd9f-etcd-serving-ca\") pod \"apiserver-76f77b778f-bhzlk\" (UID: \"179c4bc2-b28d-445b-98f2-aa307d57cd9f\") " pod="openshift-apiserver/apiserver-76f77b778f-bhzlk" Sep 30 07:35:54 crc kubenswrapper[4760]: I0930 07:35:54.783548 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2bcd213c-83e6-4a23-9542-727b1254b17d-config\") pod \"machine-approver-56656f9798-lgt64\" (UID: \"2bcd213c-83e6-4a23-9542-727b1254b17d\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-lgt64" Sep 30 07:35:54 crc kubenswrapper[4760]: I0930 07:35:54.783574 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/cbe4e2d5-6999-435b-b0af-d585a0ef1f5f-config\") pod \"openshift-apiserver-operator-796bbdcf4f-ntj6x\" (UID: \"cbe4e2d5-6999-435b-b0af-d585a0ef1f5f\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-ntj6x" Sep 30 07:35:54 crc kubenswrapper[4760]: I0930 07:35:54.783609 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n62bz\" (UniqueName: \"kubernetes.io/projected/ff7ef449-b6c7-4e55-886d-b66dd8327e7b-kube-api-access-n62bz\") pod \"route-controller-manager-6576b87f9c-m8vl7\" (UID: \"ff7ef449-b6c7-4e55-886d-b66dd8327e7b\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-m8vl7" Sep 30 07:35:54 crc kubenswrapper[4760]: I0930 07:35:54.783632 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/179c4bc2-b28d-445b-98f2-aa307d57cd9f-node-pullsecrets\") pod \"apiserver-76f77b778f-bhzlk\" (UID: \"179c4bc2-b28d-445b-98f2-aa307d57cd9f\") " pod="openshift-apiserver/apiserver-76f77b778f-bhzlk" Sep 30 07:35:54 crc kubenswrapper[4760]: I0930 07:35:54.783662 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2cw6g\" (UniqueName: \"kubernetes.io/projected/2fb43f32-6ad4-4450-8a05-80570020d5e8-kube-api-access-2cw6g\") pod \"machine-api-operator-5694c8668f-4lv9x\" (UID: \"2fb43f32-6ad4-4450-8a05-80570020d5e8\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-4lv9x" Sep 30 07:35:54 crc kubenswrapper[4760]: I0930 07:35:54.783689 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7c05864f-63f6-4fdf-9207-0d63dd89fc49-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-qc22x\" (UID: \"7c05864f-63f6-4fdf-9207-0d63dd89fc49\") " 
pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-qc22x" Sep 30 07:35:54 crc kubenswrapper[4760]: I0930 07:35:54.783726 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/f2f25243-2a0b-498f-8de6-8b0a21c72c49-audit-policies\") pod \"oauth-openshift-558db77b4-77ttc\" (UID: \"f2f25243-2a0b-498f-8de6-8b0a21c72c49\") " pod="openshift-authentication/oauth-openshift-558db77b4-77ttc" Sep 30 07:35:54 crc kubenswrapper[4760]: I0930 07:35:54.783749 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xtgwl\" (UniqueName: \"kubernetes.io/projected/f2f25243-2a0b-498f-8de6-8b0a21c72c49-kube-api-access-xtgwl\") pod \"oauth-openshift-558db77b4-77ttc\" (UID: \"f2f25243-2a0b-498f-8de6-8b0a21c72c49\") " pod="openshift-authentication/oauth-openshift-558db77b4-77ttc" Sep 30 07:35:54 crc kubenswrapper[4760]: I0930 07:35:54.783783 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/2bcd213c-83e6-4a23-9542-727b1254b17d-auth-proxy-config\") pod \"machine-approver-56656f9798-lgt64\" (UID: \"2bcd213c-83e6-4a23-9542-727b1254b17d\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-lgt64" Sep 30 07:35:54 crc kubenswrapper[4760]: I0930 07:35:54.783806 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/f2f25243-2a0b-498f-8de6-8b0a21c72c49-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-77ttc\" (UID: \"f2f25243-2a0b-498f-8de6-8b0a21c72c49\") " pod="openshift-authentication/oauth-openshift-558db77b4-77ttc" Sep 30 07:35:54 crc kubenswrapper[4760]: I0930 07:35:54.783837 4760 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/2fb43f32-6ad4-4450-8a05-80570020d5e8-images\") pod \"machine-api-operator-5694c8668f-4lv9x\" (UID: \"2fb43f32-6ad4-4450-8a05-80570020d5e8\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-4lv9x" Sep 30 07:35:54 crc kubenswrapper[4760]: I0930 07:35:54.783860 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/179c4bc2-b28d-445b-98f2-aa307d57cd9f-trusted-ca-bundle\") pod \"apiserver-76f77b778f-bhzlk\" (UID: \"179c4bc2-b28d-445b-98f2-aa307d57cd9f\") " pod="openshift-apiserver/apiserver-76f77b778f-bhzlk" Sep 30 07:35:54 crc kubenswrapper[4760]: I0930 07:35:54.783881 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/f2f25243-2a0b-498f-8de6-8b0a21c72c49-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-77ttc\" (UID: \"f2f25243-2a0b-498f-8de6-8b0a21c72c49\") " pod="openshift-authentication/oauth-openshift-558db77b4-77ttc" Sep 30 07:35:54 crc kubenswrapper[4760]: I0930 07:35:54.783899 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7c05864f-63f6-4fdf-9207-0d63dd89fc49-serving-cert\") pod \"apiserver-7bbb656c7d-qc22x\" (UID: \"7c05864f-63f6-4fdf-9207-0d63dd89fc49\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-qc22x" Sep 30 07:35:54 crc kubenswrapper[4760]: I0930 07:35:54.783941 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/7c05864f-63f6-4fdf-9207-0d63dd89fc49-audit-dir\") pod \"apiserver-7bbb656c7d-qc22x\" (UID: \"7c05864f-63f6-4fdf-9207-0d63dd89fc49\") " 
pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-qc22x" Sep 30 07:35:54 crc kubenswrapper[4760]: I0930 07:35:54.783977 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f2f25243-2a0b-498f-8de6-8b0a21c72c49-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-77ttc\" (UID: \"f2f25243-2a0b-498f-8de6-8b0a21c72c49\") " pod="openshift-authentication/oauth-openshift-558db77b4-77ttc" Sep 30 07:35:54 crc kubenswrapper[4760]: I0930 07:35:54.784014 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/7c05864f-63f6-4fdf-9207-0d63dd89fc49-encryption-config\") pod \"apiserver-7bbb656c7d-qc22x\" (UID: \"7c05864f-63f6-4fdf-9207-0d63dd89fc49\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-qc22x" Sep 30 07:35:54 crc kubenswrapper[4760]: I0930 07:35:54.784044 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/f2f25243-2a0b-498f-8de6-8b0a21c72c49-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-77ttc\" (UID: \"f2f25243-2a0b-498f-8de6-8b0a21c72c49\") " pod="openshift-authentication/oauth-openshift-558db77b4-77ttc" Sep 30 07:35:54 crc kubenswrapper[4760]: I0930 07:35:54.784073 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ph5pw\" (UniqueName: \"kubernetes.io/projected/2bcd213c-83e6-4a23-9542-727b1254b17d-kube-api-access-ph5pw\") pod \"machine-approver-56656f9798-lgt64\" (UID: \"2bcd213c-83e6-4a23-9542-727b1254b17d\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-lgt64" Sep 30 07:35:54 crc kubenswrapper[4760]: I0930 07:35:54.784095 4760 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/f2f25243-2a0b-498f-8de6-8b0a21c72c49-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-77ttc\" (UID: \"f2f25243-2a0b-498f-8de6-8b0a21c72c49\") " pod="openshift-authentication/oauth-openshift-558db77b4-77ttc" Sep 30 07:35:54 crc kubenswrapper[4760]: I0930 07:35:54.784124 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ff7ef449-b6c7-4e55-886d-b66dd8327e7b-config\") pod \"route-controller-manager-6576b87f9c-m8vl7\" (UID: \"ff7ef449-b6c7-4e55-886d-b66dd8327e7b\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-m8vl7" Sep 30 07:35:54 crc kubenswrapper[4760]: I0930 07:35:54.784146 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/179c4bc2-b28d-445b-98f2-aa307d57cd9f-image-import-ca\") pod \"apiserver-76f77b778f-bhzlk\" (UID: \"179c4bc2-b28d-445b-98f2-aa307d57cd9f\") " pod="openshift-apiserver/apiserver-76f77b778f-bhzlk" Sep 30 07:35:54 crc kubenswrapper[4760]: I0930 07:35:54.784175 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/2fb43f32-6ad4-4450-8a05-80570020d5e8-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-4lv9x\" (UID: \"2fb43f32-6ad4-4450-8a05-80570020d5e8\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-4lv9x" Sep 30 07:35:54 crc kubenswrapper[4760]: I0930 07:35:54.784206 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mvmm2\" (UniqueName: \"kubernetes.io/projected/3f80e44c-0738-48d0-b6e5-783d696c5ec4-kube-api-access-mvmm2\") pod 
\"authentication-operator-69f744f599-q5spr\" (UID: \"3f80e44c-0738-48d0-b6e5-783d696c5ec4\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-q5spr" Sep 30 07:35:54 crc kubenswrapper[4760]: I0930 07:35:54.784228 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/7c05864f-63f6-4fdf-9207-0d63dd89fc49-audit-policies\") pod \"apiserver-7bbb656c7d-qc22x\" (UID: \"7c05864f-63f6-4fdf-9207-0d63dd89fc49\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-qc22x" Sep 30 07:35:54 crc kubenswrapper[4760]: I0930 07:35:54.784272 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/fa5b63b3-2bc6-496c-8841-471e2f43021c-client-ca\") pod \"controller-manager-879f6c89f-6cvdd\" (UID: \"fa5b63b3-2bc6-496c-8841-471e2f43021c\") " pod="openshift-controller-manager/controller-manager-879f6c89f-6cvdd" Sep 30 07:35:54 crc kubenswrapper[4760]: I0930 07:35:54.784330 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ff7ef449-b6c7-4e55-886d-b66dd8327e7b-serving-cert\") pod \"route-controller-manager-6576b87f9c-m8vl7\" (UID: \"ff7ef449-b6c7-4e55-886d-b66dd8327e7b\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-m8vl7" Sep 30 07:35:54 crc kubenswrapper[4760]: I0930 07:35:54.784352 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/179c4bc2-b28d-445b-98f2-aa307d57cd9f-config\") pod \"apiserver-76f77b778f-bhzlk\" (UID: \"179c4bc2-b28d-445b-98f2-aa307d57cd9f\") " pod="openshift-apiserver/apiserver-76f77b778f-bhzlk" Sep 30 07:35:54 crc kubenswrapper[4760]: I0930 07:35:54.784371 4760 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/179c4bc2-b28d-445b-98f2-aa307d57cd9f-etcd-client\") pod \"apiserver-76f77b778f-bhzlk\" (UID: \"179c4bc2-b28d-445b-98f2-aa307d57cd9f\") " pod="openshift-apiserver/apiserver-76f77b778f-bhzlk" Sep 30 07:35:54 crc kubenswrapper[4760]: I0930 07:35:54.784393 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/f2f25243-2a0b-498f-8de6-8b0a21c72c49-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-77ttc\" (UID: \"f2f25243-2a0b-498f-8de6-8b0a21c72c49\") " pod="openshift-authentication/oauth-openshift-558db77b4-77ttc" Sep 30 07:35:54 crc kubenswrapper[4760]: I0930 07:35:54.784940 4760 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Sep 30 07:35:54 crc kubenswrapper[4760]: I0930 07:35:54.785174 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Sep 30 07:35:54 crc kubenswrapper[4760]: I0930 07:35:54.785275 4760 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Sep 30 07:35:54 crc kubenswrapper[4760]: I0930 07:35:54.785575 4760 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Sep 30 07:35:54 crc kubenswrapper[4760]: I0930 07:35:54.785837 4760 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Sep 30 07:35:54 crc kubenswrapper[4760]: I0930 07:35:54.785987 4760 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Sep 30 07:35:54 crc kubenswrapper[4760]: I0930 07:35:54.786086 4760 reflector.go:368] Caches 
populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv" Sep 30 07:35:54 crc kubenswrapper[4760]: I0930 07:35:54.786175 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Sep 30 07:35:54 crc kubenswrapper[4760]: I0930 07:35:54.786403 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Sep 30 07:35:54 crc kubenswrapper[4760]: I0930 07:35:54.786617 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj" Sep 30 07:35:54 crc kubenswrapper[4760]: I0930 07:35:54.786871 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Sep 30 07:35:54 crc kubenswrapper[4760]: I0930 07:35:54.786967 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Sep 30 07:35:54 crc kubenswrapper[4760]: I0930 07:35:54.787077 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Sep 30 07:35:54 crc kubenswrapper[4760]: I0930 07:35:54.787381 4760 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Sep 30 07:35:54 crc kubenswrapper[4760]: I0930 07:35:54.787469 4760 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Sep 30 07:35:54 crc kubenswrapper[4760]: I0930 07:35:54.787499 4760 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Sep 30 07:35:54 crc kubenswrapper[4760]: I0930 07:35:54.787578 4760 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-authentication"/"v4-0-config-system-router-certs" Sep 30 07:35:54 crc kubenswrapper[4760]: I0930 07:35:54.787608 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Sep 30 07:35:54 crc kubenswrapper[4760]: I0930 07:35:54.787670 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Sep 30 07:35:54 crc kubenswrapper[4760]: I0930 07:35:54.787700 4760 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Sep 30 07:35:54 crc kubenswrapper[4760]: I0930 07:35:54.787723 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Sep 30 07:35:54 crc kubenswrapper[4760]: I0930 07:35:54.787815 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Sep 30 07:35:54 crc kubenswrapper[4760]: I0930 07:35:54.787840 4760 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Sep 30 07:35:54 crc kubenswrapper[4760]: I0930 07:35:54.787984 4760 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config" Sep 30 07:35:54 crc kubenswrapper[4760]: I0930 07:35:54.787993 4760 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Sep 30 07:35:54 crc kubenswrapper[4760]: I0930 07:35:54.788147 4760 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Sep 30 07:35:54 crc kubenswrapper[4760]: I0930 07:35:54.794090 4760 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" 
Sep 30 07:35:54 crc kubenswrapper[4760]: I0930 07:35:54.797707 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-4lv9x"] Sep 30 07:35:54 crc kubenswrapper[4760]: I0930 07:35:54.798801 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4" Sep 30 07:35:54 crc kubenswrapper[4760]: I0930 07:35:54.800837 4760 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Sep 30 07:35:54 crc kubenswrapper[4760]: I0930 07:35:54.801013 4760 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Sep 30 07:35:54 crc kubenswrapper[4760]: I0930 07:35:54.801501 4760 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Sep 30 07:35:54 crc kubenswrapper[4760]: I0930 07:35:54.803534 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Sep 30 07:35:54 crc kubenswrapper[4760]: I0930 07:35:54.803683 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-cfnkh"] Sep 30 07:35:54 crc kubenswrapper[4760]: I0930 07:35:54.804156 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-bhzlk"] Sep 30 07:35:54 crc kubenswrapper[4760]: I0930 07:35:54.804229 4760 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-cfnkh" Sep 30 07:35:54 crc kubenswrapper[4760]: I0930 07:35:54.805061 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-ntj6x"] Sep 30 07:35:54 crc kubenswrapper[4760]: I0930 07:35:54.808063 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-f9d7485db-ntgr2"] Sep 30 07:35:54 crc kubenswrapper[4760]: I0930 07:35:54.808540 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-p96nl"] Sep 30 07:35:54 crc kubenswrapper[4760]: I0930 07:35:54.809007 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-p96nl" Sep 30 07:35:54 crc kubenswrapper[4760]: I0930 07:35:54.813392 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-ntgr2" Sep 30 07:35:54 crc kubenswrapper[4760]: I0930 07:35:54.820437 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-m8vl7"] Sep 30 07:35:54 crc kubenswrapper[4760]: I0930 07:35:54.820472 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-tlrrq"] Sep 30 07:35:54 crc kubenswrapper[4760]: I0930 07:35:54.820856 4760 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-tlrrq" Sep 30 07:35:54 crc kubenswrapper[4760]: I0930 07:35:54.822445 4760 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Sep 30 07:35:54 crc kubenswrapper[4760]: I0930 07:35:54.827168 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Sep 30 07:35:54 crc kubenswrapper[4760]: I0930 07:35:54.827337 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Sep 30 07:35:54 crc kubenswrapper[4760]: I0930 07:35:54.827913 4760 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Sep 30 07:35:54 crc kubenswrapper[4760]: I0930 07:35:54.828150 4760 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Sep 30 07:35:54 crc kubenswrapper[4760]: I0930 07:35:54.828245 4760 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Sep 30 07:35:54 crc kubenswrapper[4760]: I0930 07:35:54.828339 4760 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt" Sep 30 07:35:54 crc kubenswrapper[4760]: I0930 07:35:54.828440 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-75sr2"] Sep 30 07:35:54 crc kubenswrapper[4760]: I0930 07:35:54.836766 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z" Sep 30 07:35:54 crc kubenswrapper[4760]: I0930 07:35:54.837075 4760 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-console"/"console-serving-cert" Sep 30 07:35:54 crc kubenswrapper[4760]: I0930 07:35:54.837218 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w" Sep 30 07:35:54 crc kubenswrapper[4760]: I0930 07:35:54.837408 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Sep 30 07:35:54 crc kubenswrapper[4760]: I0930 07:35:54.837532 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw" Sep 30 07:35:54 crc kubenswrapper[4760]: I0930 07:35:54.837706 4760 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Sep 30 07:35:54 crc kubenswrapper[4760]: I0930 07:35:54.837871 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Sep 30 07:35:54 crc kubenswrapper[4760]: I0930 07:35:54.838151 4760 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config" Sep 30 07:35:54 crc kubenswrapper[4760]: I0930 07:35:54.838267 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Sep 30 07:35:54 crc kubenswrapper[4760]: I0930 07:35:54.839227 4760 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt" Sep 30 07:35:54 crc kubenswrapper[4760]: I0930 07:35:54.841957 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-dmqhx"] Sep 30 07:35:54 crc kubenswrapper[4760]: I0930 07:35:54.842389 4760 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-75sr2" Sep 30 07:35:54 crc kubenswrapper[4760]: I0930 07:35:54.843125 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-dmqhx" Sep 30 07:35:54 crc kubenswrapper[4760]: I0930 07:35:54.855760 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt" Sep 30 07:35:54 crc kubenswrapper[4760]: I0930 07:35:54.855840 4760 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt" Sep 30 07:35:54 crc kubenswrapper[4760]: I0930 07:35:54.856308 4760 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert" Sep 30 07:35:54 crc kubenswrapper[4760]: I0930 07:35:54.857419 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-pznb4"] Sep 30 07:35:54 crc kubenswrapper[4760]: I0930 07:35:54.858197 4760 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-pznb4" Sep 30 07:35:54 crc kubenswrapper[4760]: I0930 07:35:54.858741 4760 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Sep 30 07:35:54 crc kubenswrapper[4760]: I0930 07:35:54.859367 4760 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Sep 30 07:35:54 crc kubenswrapper[4760]: I0930 07:35:54.861039 4760 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Sep 30 07:35:54 crc kubenswrapper[4760]: I0930 07:35:54.861059 4760 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Sep 30 07:35:54 crc kubenswrapper[4760]: I0930 07:35:54.861070 4760 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Sep 30 07:35:54 crc kubenswrapper[4760]: I0930 07:35:54.861191 4760 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Sep 30 07:35:54 crc kubenswrapper[4760]: I0930 07:35:54.861289 4760 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Sep 30 07:35:54 crc kubenswrapper[4760]: I0930 07:35:54.861361 4760 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Sep 30 07:35:54 crc kubenswrapper[4760]: I0930 07:35:54.861385 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw" Sep 30 07:35:54 crc kubenswrapper[4760]: I0930 
07:35:54.861472 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Sep 30 07:35:54 crc kubenswrapper[4760]: I0930 07:35:54.861607 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw" Sep 30 07:35:54 crc kubenswrapper[4760]: I0930 07:35:54.862102 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Sep 30 07:35:54 crc kubenswrapper[4760]: I0930 07:35:54.864911 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d" Sep 30 07:35:54 crc kubenswrapper[4760]: I0930 07:35:54.871029 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-gsddw"] Sep 30 07:35:54 crc kubenswrapper[4760]: I0930 07:35:54.872190 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-gsddw" Sep 30 07:35:54 crc kubenswrapper[4760]: I0930 07:35:54.873106 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-fwcqc"] Sep 30 07:35:54 crc kubenswrapper[4760]: I0930 07:35:54.873817 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-fwcqc" Sep 30 07:35:54 crc kubenswrapper[4760]: I0930 07:35:54.874650 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-7278d"] Sep 30 07:35:54 crc kubenswrapper[4760]: I0930 07:35:54.875934 4760 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-7278d" Sep 30 07:35:54 crc kubenswrapper[4760]: I0930 07:35:54.876416 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Sep 30 07:35:54 crc kubenswrapper[4760]: I0930 07:35:54.876807 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-s7mth"] Sep 30 07:35:54 crc kubenswrapper[4760]: I0930 07:35:54.884365 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-s7mth" Sep 30 07:35:54 crc kubenswrapper[4760]: I0930 07:35:54.884440 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-qgs9v"] Sep 30 07:35:54 crc kubenswrapper[4760]: I0930 07:35:54.885063 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/179c4bc2-b28d-445b-98f2-aa307d57cd9f-etcd-serving-ca\") pod \"apiserver-76f77b778f-bhzlk\" (UID: \"179c4bc2-b28d-445b-98f2-aa307d57cd9f\") " pod="openshift-apiserver/apiserver-76f77b778f-bhzlk" Sep 30 07:35:54 crc kubenswrapper[4760]: I0930 07:35:54.885096 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2bcd213c-83e6-4a23-9542-727b1254b17d-config\") pod \"machine-approver-56656f9798-lgt64\" (UID: \"2bcd213c-83e6-4a23-9542-727b1254b17d\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-lgt64" Sep 30 07:35:54 crc kubenswrapper[4760]: I0930 07:35:54.885116 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cbe4e2d5-6999-435b-b0af-d585a0ef1f5f-config\") pod 
\"openshift-apiserver-operator-796bbdcf4f-ntj6x\" (UID: \"cbe4e2d5-6999-435b-b0af-d585a0ef1f5f\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-ntj6x" Sep 30 07:35:54 crc kubenswrapper[4760]: I0930 07:35:54.885123 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-qgs9v" Sep 30 07:35:54 crc kubenswrapper[4760]: I0930 07:35:54.885138 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xbt5m\" (UniqueName: \"kubernetes.io/projected/86638afb-4930-4496-a00d-8f243be3ab33-kube-api-access-xbt5m\") pod \"cluster-samples-operator-665b6dd947-cfnkh\" (UID: \"86638afb-4930-4496-a00d-8f243be3ab33\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-cfnkh" Sep 30 07:35:54 crc kubenswrapper[4760]: I0930 07:35:54.885159 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/cc4f6325-5a2a-4f15-8ee9-860617a9d7ce-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-gsddw\" (UID: \"cc4f6325-5a2a-4f15-8ee9-860617a9d7ce\") " pod="openshift-marketplace/marketplace-operator-79b997595-gsddw" Sep 30 07:35:54 crc kubenswrapper[4760]: I0930 07:35:54.885177 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2cw6g\" (UniqueName: \"kubernetes.io/projected/2fb43f32-6ad4-4450-8a05-80570020d5e8-kube-api-access-2cw6g\") pod \"machine-api-operator-5694c8668f-4lv9x\" (UID: \"2fb43f32-6ad4-4450-8a05-80570020d5e8\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-4lv9x" Sep 30 07:35:54 crc kubenswrapper[4760]: I0930 07:35:54.885195 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n62bz\" (UniqueName: 
\"kubernetes.io/projected/ff7ef449-b6c7-4e55-886d-b66dd8327e7b-kube-api-access-n62bz\") pod \"route-controller-manager-6576b87f9c-m8vl7\" (UID: \"ff7ef449-b6c7-4e55-886d-b66dd8327e7b\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-m8vl7" Sep 30 07:35:54 crc kubenswrapper[4760]: I0930 07:35:54.885216 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/179c4bc2-b28d-445b-98f2-aa307d57cd9f-node-pullsecrets\") pod \"apiserver-76f77b778f-bhzlk\" (UID: \"179c4bc2-b28d-445b-98f2-aa307d57cd9f\") " pod="openshift-apiserver/apiserver-76f77b778f-bhzlk" Sep 30 07:35:54 crc kubenswrapper[4760]: I0930 07:35:54.885253 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7c05864f-63f6-4fdf-9207-0d63dd89fc49-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-qc22x\" (UID: \"7c05864f-63f6-4fdf-9207-0d63dd89fc49\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-qc22x" Sep 30 07:35:54 crc kubenswrapper[4760]: I0930 07:35:54.885277 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/f2f25243-2a0b-498f-8de6-8b0a21c72c49-audit-policies\") pod \"oauth-openshift-558db77b4-77ttc\" (UID: \"f2f25243-2a0b-498f-8de6-8b0a21c72c49\") " pod="openshift-authentication/oauth-openshift-558db77b4-77ttc" Sep 30 07:35:54 crc kubenswrapper[4760]: I0930 07:35:54.885315 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xtgwl\" (UniqueName: \"kubernetes.io/projected/f2f25243-2a0b-498f-8de6-8b0a21c72c49-kube-api-access-xtgwl\") pod \"oauth-openshift-558db77b4-77ttc\" (UID: \"f2f25243-2a0b-498f-8de6-8b0a21c72c49\") " pod="openshift-authentication/oauth-openshift-558db77b4-77ttc" Sep 30 07:35:54 crc kubenswrapper[4760]: I0930 07:35:54.885331 4760 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/08d362f3-5c04-45fe-9981-ada11b028f83-console-oauth-config\") pod \"console-f9d7485db-ntgr2\" (UID: \"08d362f3-5c04-45fe-9981-ada11b028f83\") " pod="openshift-console/console-f9d7485db-ntgr2" Sep 30 07:35:54 crc kubenswrapper[4760]: I0930 07:35:54.885352 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/2fb43f32-6ad4-4450-8a05-80570020d5e8-images\") pod \"machine-api-operator-5694c8668f-4lv9x\" (UID: \"2fb43f32-6ad4-4450-8a05-80570020d5e8\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-4lv9x" Sep 30 07:35:54 crc kubenswrapper[4760]: I0930 07:35:54.885367 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/2bcd213c-83e6-4a23-9542-727b1254b17d-auth-proxy-config\") pod \"machine-approver-56656f9798-lgt64\" (UID: \"2bcd213c-83e6-4a23-9542-727b1254b17d\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-lgt64" Sep 30 07:35:54 crc kubenswrapper[4760]: I0930 07:35:54.885385 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/f2f25243-2a0b-498f-8de6-8b0a21c72c49-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-77ttc\" (UID: \"f2f25243-2a0b-498f-8de6-8b0a21c72c49\") " pod="openshift-authentication/oauth-openshift-558db77b4-77ttc" Sep 30 07:35:54 crc kubenswrapper[4760]: I0930 07:35:54.885401 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/179c4bc2-b28d-445b-98f2-aa307d57cd9f-trusted-ca-bundle\") pod \"apiserver-76f77b778f-bhzlk\" (UID: \"179c4bc2-b28d-445b-98f2-aa307d57cd9f\") " 
pod="openshift-apiserver/apiserver-76f77b778f-bhzlk" Sep 30 07:35:54 crc kubenswrapper[4760]: I0930 07:35:54.885419 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/f2f25243-2a0b-498f-8de6-8b0a21c72c49-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-77ttc\" (UID: \"f2f25243-2a0b-498f-8de6-8b0a21c72c49\") " pod="openshift-authentication/oauth-openshift-558db77b4-77ttc" Sep 30 07:35:54 crc kubenswrapper[4760]: I0930 07:35:54.885437 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kcddb\" (UniqueName: \"kubernetes.io/projected/09aeaa85-1a1d-426f-b7f2-611f67942f2c-kube-api-access-kcddb\") pod \"kube-storage-version-migrator-operator-b67b599dd-dmqhx\" (UID: \"09aeaa85-1a1d-426f-b7f2-611f67942f2c\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-dmqhx" Sep 30 07:35:54 crc kubenswrapper[4760]: I0930 07:35:54.885443 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/179c4bc2-b28d-445b-98f2-aa307d57cd9f-node-pullsecrets\") pod \"apiserver-76f77b778f-bhzlk\" (UID: \"179c4bc2-b28d-445b-98f2-aa307d57cd9f\") " pod="openshift-apiserver/apiserver-76f77b778f-bhzlk" Sep 30 07:35:54 crc kubenswrapper[4760]: I0930 07:35:54.885458 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cgch4\" (UniqueName: \"kubernetes.io/projected/cc4f6325-5a2a-4f15-8ee9-860617a9d7ce-kube-api-access-cgch4\") pod \"marketplace-operator-79b997595-gsddw\" (UID: \"cc4f6325-5a2a-4f15-8ee9-860617a9d7ce\") " pod="openshift-marketplace/marketplace-operator-79b997595-gsddw" Sep 30 07:35:54 crc kubenswrapper[4760]: I0930 07:35:54.885506 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"serving-cert\" (UniqueName: \"kubernetes.io/secret/7c05864f-63f6-4fdf-9207-0d63dd89fc49-serving-cert\") pod \"apiserver-7bbb656c7d-qc22x\" (UID: \"7c05864f-63f6-4fdf-9207-0d63dd89fc49\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-qc22x" Sep 30 07:35:54 crc kubenswrapper[4760]: I0930 07:35:54.885668 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2bcd213c-83e6-4a23-9542-727b1254b17d-config\") pod \"machine-approver-56656f9798-lgt64\" (UID: \"2bcd213c-83e6-4a23-9542-727b1254b17d\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-lgt64" Sep 30 07:35:54 crc kubenswrapper[4760]: I0930 07:35:54.885674 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vpf4d\" (UniqueName: \"kubernetes.io/projected/d0f0f394-f395-4181-a0f9-afa9b7467013-kube-api-access-vpf4d\") pod \"openshift-controller-manager-operator-756b6f6bc6-75sr2\" (UID: \"d0f0f394-f395-4181-a0f9-afa9b7467013\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-75sr2" Sep 30 07:35:54 crc kubenswrapper[4760]: I0930 07:35:54.885697 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/179c4bc2-b28d-445b-98f2-aa307d57cd9f-etcd-serving-ca\") pod \"apiserver-76f77b778f-bhzlk\" (UID: \"179c4bc2-b28d-445b-98f2-aa307d57cd9f\") " pod="openshift-apiserver/apiserver-76f77b778f-bhzlk" Sep 30 07:35:54 crc kubenswrapper[4760]: I0930 07:35:54.885717 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/2611e16f-2c0b-44b4-929b-21f16b1b2e4d-available-featuregates\") pod \"openshift-config-operator-7777fb866f-p96nl\" (UID: \"2611e16f-2c0b-44b4-929b-21f16b1b2e4d\") " 
pod="openshift-config-operator/openshift-config-operator-7777fb866f-p96nl" Sep 30 07:35:54 crc kubenswrapper[4760]: I0930 07:35:54.885749 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/7c05864f-63f6-4fdf-9207-0d63dd89fc49-audit-dir\") pod \"apiserver-7bbb656c7d-qc22x\" (UID: \"7c05864f-63f6-4fdf-9207-0d63dd89fc49\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-qc22x" Sep 30 07:35:54 crc kubenswrapper[4760]: I0930 07:35:54.885785 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f2f25243-2a0b-498f-8de6-8b0a21c72c49-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-77ttc\" (UID: \"f2f25243-2a0b-498f-8de6-8b0a21c72c49\") " pod="openshift-authentication/oauth-openshift-558db77b4-77ttc" Sep 30 07:35:54 crc kubenswrapper[4760]: I0930 07:35:54.885810 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tm59j\" (UniqueName: \"kubernetes.io/projected/a96a9516-5f80-4391-a1f2-f4b7531e65fa-kube-api-access-tm59j\") pod \"control-plane-machine-set-operator-78cbb6b69f-tlrrq\" (UID: \"a96a9516-5f80-4391-a1f2-f4b7531e65fa\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-tlrrq" Sep 30 07:35:54 crc kubenswrapper[4760]: I0930 07:35:54.885827 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/08d362f3-5c04-45fe-9981-ada11b028f83-trusted-ca-bundle\") pod \"console-f9d7485db-ntgr2\" (UID: \"08d362f3-5c04-45fe-9981-ada11b028f83\") " pod="openshift-console/console-f9d7485db-ntgr2" Sep 30 07:35:54 crc kubenswrapper[4760]: I0930 07:35:54.885843 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" 
(UniqueName: \"kubernetes.io/configmap/08d362f3-5c04-45fe-9981-ada11b028f83-oauth-serving-cert\") pod \"console-f9d7485db-ntgr2\" (UID: \"08d362f3-5c04-45fe-9981-ada11b028f83\") " pod="openshift-console/console-f9d7485db-ntgr2" Sep 30 07:35:54 crc kubenswrapper[4760]: I0930 07:35:54.885858 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d0f0f394-f395-4181-a0f9-afa9b7467013-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-75sr2\" (UID: \"d0f0f394-f395-4181-a0f9-afa9b7467013\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-75sr2" Sep 30 07:35:54 crc kubenswrapper[4760]: I0930 07:35:54.885875 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/7c05864f-63f6-4fdf-9207-0d63dd89fc49-encryption-config\") pod \"apiserver-7bbb656c7d-qc22x\" (UID: \"7c05864f-63f6-4fdf-9207-0d63dd89fc49\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-qc22x" Sep 30 07:35:54 crc kubenswrapper[4760]: I0930 07:35:54.885931 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ph5pw\" (UniqueName: \"kubernetes.io/projected/2bcd213c-83e6-4a23-9542-727b1254b17d-kube-api-access-ph5pw\") pod \"machine-approver-56656f9798-lgt64\" (UID: \"2bcd213c-83e6-4a23-9542-727b1254b17d\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-lgt64" Sep 30 07:35:54 crc kubenswrapper[4760]: I0930 07:35:54.885948 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/f2f25243-2a0b-498f-8de6-8b0a21c72c49-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-77ttc\" (UID: \"f2f25243-2a0b-498f-8de6-8b0a21c72c49\") " 
pod="openshift-authentication/oauth-openshift-558db77b4-77ttc" Sep 30 07:35:54 crc kubenswrapper[4760]: I0930 07:35:54.885965 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ff7ef449-b6c7-4e55-886d-b66dd8327e7b-config\") pod \"route-controller-manager-6576b87f9c-m8vl7\" (UID: \"ff7ef449-b6c7-4e55-886d-b66dd8327e7b\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-m8vl7" Sep 30 07:35:54 crc kubenswrapper[4760]: I0930 07:35:54.885982 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/f2f25243-2a0b-498f-8de6-8b0a21c72c49-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-77ttc\" (UID: \"f2f25243-2a0b-498f-8de6-8b0a21c72c49\") " pod="openshift-authentication/oauth-openshift-558db77b4-77ttc" Sep 30 07:35:54 crc kubenswrapper[4760]: I0930 07:35:54.885999 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/2fb43f32-6ad4-4450-8a05-80570020d5e8-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-4lv9x\" (UID: \"2fb43f32-6ad4-4450-8a05-80570020d5e8\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-4lv9x" Sep 30 07:35:54 crc kubenswrapper[4760]: I0930 07:35:54.886015 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/179c4bc2-b28d-445b-98f2-aa307d57cd9f-image-import-ca\") pod \"apiserver-76f77b778f-bhzlk\" (UID: \"179c4bc2-b28d-445b-98f2-aa307d57cd9f\") " pod="openshift-apiserver/apiserver-76f77b778f-bhzlk" Sep 30 07:35:54 crc kubenswrapper[4760]: I0930 07:35:54.886030 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: 
\"kubernetes.io/secret/08d362f3-5c04-45fe-9981-ada11b028f83-console-serving-cert\") pod \"console-f9d7485db-ntgr2\" (UID: \"08d362f3-5c04-45fe-9981-ada11b028f83\") " pod="openshift-console/console-f9d7485db-ntgr2" Sep 30 07:35:54 crc kubenswrapper[4760]: I0930 07:35:54.886278 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cbe4e2d5-6999-435b-b0af-d585a0ef1f5f-config\") pod \"openshift-apiserver-operator-796bbdcf4f-ntj6x\" (UID: \"cbe4e2d5-6999-435b-b0af-d585a0ef1f5f\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-ntj6x" Sep 30 07:35:54 crc kubenswrapper[4760]: I0930 07:35:54.886733 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/f2f25243-2a0b-498f-8de6-8b0a21c72c49-audit-policies\") pod \"oauth-openshift-558db77b4-77ttc\" (UID: \"f2f25243-2a0b-498f-8de6-8b0a21c72c49\") " pod="openshift-authentication/oauth-openshift-558db77b4-77ttc" Sep 30 07:35:54 crc kubenswrapper[4760]: I0930 07:35:54.887140 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7c05864f-63f6-4fdf-9207-0d63dd89fc49-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-qc22x\" (UID: \"7c05864f-63f6-4fdf-9207-0d63dd89fc49\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-qc22x" Sep 30 07:35:54 crc kubenswrapper[4760]: I0930 07:35:54.887142 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f2f25243-2a0b-498f-8de6-8b0a21c72c49-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-77ttc\" (UID: \"f2f25243-2a0b-498f-8de6-8b0a21c72c49\") " pod="openshift-authentication/oauth-openshift-558db77b4-77ttc" Sep 30 07:35:54 crc kubenswrapper[4760]: I0930 07:35:54.887184 4760 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-etcd-operator/etcd-operator-b45778765-ptzmt"] Sep 30 07:35:54 crc kubenswrapper[4760]: I0930 07:35:54.887194 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/179c4bc2-b28d-445b-98f2-aa307d57cd9f-trusted-ca-bundle\") pod \"apiserver-76f77b778f-bhzlk\" (UID: \"179c4bc2-b28d-445b-98f2-aa307d57cd9f\") " pod="openshift-apiserver/apiserver-76f77b778f-bhzlk" Sep 30 07:35:54 crc kubenswrapper[4760]: I0930 07:35:54.887369 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ff7ef449-b6c7-4e55-886d-b66dd8327e7b-config\") pod \"route-controller-manager-6576b87f9c-m8vl7\" (UID: \"ff7ef449-b6c7-4e55-886d-b66dd8327e7b\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-m8vl7" Sep 30 07:35:54 crc kubenswrapper[4760]: I0930 07:35:54.887385 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/2bcd213c-83e6-4a23-9542-727b1254b17d-auth-proxy-config\") pod \"machine-approver-56656f9798-lgt64\" (UID: \"2bcd213c-83e6-4a23-9542-727b1254b17d\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-lgt64" Sep 30 07:35:54 crc kubenswrapper[4760]: I0930 07:35:54.887409 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/08d362f3-5c04-45fe-9981-ada11b028f83-console-config\") pod \"console-f9d7485db-ntgr2\" (UID: \"08d362f3-5c04-45fe-9981-ada11b028f83\") " pod="openshift-console/console-f9d7485db-ntgr2" Sep 30 07:35:54 crc kubenswrapper[4760]: I0930 07:35:54.887433 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c7zxl\" (UniqueName: \"kubernetes.io/projected/08d362f3-5c04-45fe-9981-ada11b028f83-kube-api-access-c7zxl\") pod 
\"console-f9d7485db-ntgr2\" (UID: \"08d362f3-5c04-45fe-9981-ada11b028f83\") " pod="openshift-console/console-f9d7485db-ntgr2" Sep 30 07:35:54 crc kubenswrapper[4760]: I0930 07:35:54.887436 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/7c05864f-63f6-4fdf-9207-0d63dd89fc49-audit-dir\") pod \"apiserver-7bbb656c7d-qc22x\" (UID: \"7c05864f-63f6-4fdf-9207-0d63dd89fc49\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-qc22x" Sep 30 07:35:54 crc kubenswrapper[4760]: I0930 07:35:54.887478 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mvmm2\" (UniqueName: \"kubernetes.io/projected/3f80e44c-0738-48d0-b6e5-783d696c5ec4-kube-api-access-mvmm2\") pod \"authentication-operator-69f744f599-q5spr\" (UID: \"3f80e44c-0738-48d0-b6e5-783d696c5ec4\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-q5spr" Sep 30 07:35:54 crc kubenswrapper[4760]: I0930 07:35:54.887563 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/7c05864f-63f6-4fdf-9207-0d63dd89fc49-audit-policies\") pod \"apiserver-7bbb656c7d-qc22x\" (UID: \"7c05864f-63f6-4fdf-9207-0d63dd89fc49\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-qc22x" Sep 30 07:35:54 crc kubenswrapper[4760]: I0930 07:35:54.887592 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09aeaa85-1a1d-426f-b7f2-611f67942f2c-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-dmqhx\" (UID: \"09aeaa85-1a1d-426f-b7f2-611f67942f2c\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-dmqhx" Sep 30 07:35:54 crc kubenswrapper[4760]: I0930 07:35:54.887643 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2611e16f-2c0b-44b4-929b-21f16b1b2e4d-serving-cert\") pod \"openshift-config-operator-7777fb866f-p96nl\" (UID: \"2611e16f-2c0b-44b4-929b-21f16b1b2e4d\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-p96nl" Sep 30 07:35:54 crc kubenswrapper[4760]: I0930 07:35:54.887875 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-ptzmt" Sep 30 07:35:54 crc kubenswrapper[4760]: I0930 07:35:54.888544 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/2fb43f32-6ad4-4450-8a05-80570020d5e8-images\") pod \"machine-api-operator-5694c8668f-4lv9x\" (UID: \"2fb43f32-6ad4-4450-8a05-80570020d5e8\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-4lv9x" Sep 30 07:35:54 crc kubenswrapper[4760]: I0930 07:35:54.889500 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/7c05864f-63f6-4fdf-9207-0d63dd89fc49-audit-policies\") pod \"apiserver-7bbb656c7d-qc22x\" (UID: \"7c05864f-63f6-4fdf-9207-0d63dd89fc49\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-qc22x" Sep 30 07:35:54 crc kubenswrapper[4760]: I0930 07:35:54.889540 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/fa5b63b3-2bc6-496c-8841-471e2f43021c-client-ca\") pod \"controller-manager-879f6c89f-6cvdd\" (UID: \"fa5b63b3-2bc6-496c-8841-471e2f43021c\") " pod="openshift-controller-manager/controller-manager-879f6c89f-6cvdd" Sep 30 07:35:54 crc kubenswrapper[4760]: I0930 07:35:54.890220 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/fa5b63b3-2bc6-496c-8841-471e2f43021c-client-ca\") pod \"controller-manager-879f6c89f-6cvdd\" (UID: 
\"fa5b63b3-2bc6-496c-8841-471e2f43021c\") " pod="openshift-controller-manager/controller-manager-879f6c89f-6cvdd" Sep 30 07:35:54 crc kubenswrapper[4760]: I0930 07:35:54.890276 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d8e4c5c1-b6db-4baf-8bbe-9fc39e1ed6ff-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-pznb4\" (UID: \"d8e4c5c1-b6db-4baf-8bbe-9fc39e1ed6ff\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-pznb4" Sep 30 07:35:54 crc kubenswrapper[4760]: I0930 07:35:54.890350 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ff7ef449-b6c7-4e55-886d-b66dd8327e7b-serving-cert\") pod \"route-controller-manager-6576b87f9c-m8vl7\" (UID: \"ff7ef449-b6c7-4e55-886d-b66dd8327e7b\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-m8vl7" Sep 30 07:35:54 crc kubenswrapper[4760]: I0930 07:35:54.890382 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/179c4bc2-b28d-445b-98f2-aa307d57cd9f-config\") pod \"apiserver-76f77b778f-bhzlk\" (UID: \"179c4bc2-b28d-445b-98f2-aa307d57cd9f\") " pod="openshift-apiserver/apiserver-76f77b778f-bhzlk" Sep 30 07:35:54 crc kubenswrapper[4760]: I0930 07:35:54.890399 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/179c4bc2-b28d-445b-98f2-aa307d57cd9f-etcd-client\") pod \"apiserver-76f77b778f-bhzlk\" (UID: \"179c4bc2-b28d-445b-98f2-aa307d57cd9f\") " pod="openshift-apiserver/apiserver-76f77b778f-bhzlk" Sep 30 07:35:54 crc kubenswrapper[4760]: I0930 07:35:54.890435 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: 
\"kubernetes.io/configmap/f2f25243-2a0b-498f-8de6-8b0a21c72c49-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-77ttc\" (UID: \"f2f25243-2a0b-498f-8de6-8b0a21c72c49\") " pod="openshift-authentication/oauth-openshift-558db77b4-77ttc" Sep 30 07:35:54 crc kubenswrapper[4760]: I0930 07:35:54.890459 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d8e4c5c1-b6db-4baf-8bbe-9fc39e1ed6ff-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-pznb4\" (UID: \"d8e4c5c1-b6db-4baf-8bbe-9fc39e1ed6ff\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-pznb4" Sep 30 07:35:54 crc kubenswrapper[4760]: I0930 07:35:54.891336 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/179c4bc2-b28d-445b-98f2-aa307d57cd9f-image-import-ca\") pod \"apiserver-76f77b778f-bhzlk\" (UID: \"179c4bc2-b28d-445b-98f2-aa307d57cd9f\") " pod="openshift-apiserver/apiserver-76f77b778f-bhzlk" Sep 30 07:35:54 crc kubenswrapper[4760]: I0930 07:35:54.891370 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/f2f25243-2a0b-498f-8de6-8b0a21c72c49-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-77ttc\" (UID: \"f2f25243-2a0b-498f-8de6-8b0a21c72c49\") " pod="openshift-authentication/oauth-openshift-558db77b4-77ttc" Sep 30 07:35:54 crc kubenswrapper[4760]: I0930 07:35:54.891621 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/f2f25243-2a0b-498f-8de6-8b0a21c72c49-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-77ttc\" (UID: \"f2f25243-2a0b-498f-8de6-8b0a21c72c49\") " pod="openshift-authentication/oauth-openshift-558db77b4-77ttc" 
Sep 30 07:35:54 crc kubenswrapper[4760]: I0930 07:35:54.891793 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2fb43f32-6ad4-4450-8a05-80570020d5e8-config\") pod \"machine-api-operator-5694c8668f-4lv9x\" (UID: \"2fb43f32-6ad4-4450-8a05-80570020d5e8\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-4lv9x" Sep 30 07:35:54 crc kubenswrapper[4760]: I0930 07:35:54.891818 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/cc4f6325-5a2a-4f15-8ee9-860617a9d7ce-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-gsddw\" (UID: \"cc4f6325-5a2a-4f15-8ee9-860617a9d7ce\") " pod="openshift-marketplace/marketplace-operator-79b997595-gsddw" Sep 30 07:35:54 crc kubenswrapper[4760]: I0930 07:35:54.891868 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/f2f25243-2a0b-498f-8de6-8b0a21c72c49-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-77ttc\" (UID: \"f2f25243-2a0b-498f-8de6-8b0a21c72c49\") " pod="openshift-authentication/oauth-openshift-558db77b4-77ttc" Sep 30 07:35:54 crc kubenswrapper[4760]: I0930 07:35:54.891914 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fa5b63b3-2bc6-496c-8841-471e2f43021c-serving-cert\") pod \"controller-manager-879f6c89f-6cvdd\" (UID: \"fa5b63b3-2bc6-496c-8841-471e2f43021c\") " pod="openshift-controller-manager/controller-manager-879f6c89f-6cvdd" Sep 30 07:35:54 crc kubenswrapper[4760]: I0930 07:35:54.891935 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-djkdl\" (UniqueName: \"kubernetes.io/projected/fa5b63b3-2bc6-496c-8841-471e2f43021c-kube-api-access-djkdl\") pod 
\"controller-manager-879f6c89f-6cvdd\" (UID: \"fa5b63b3-2bc6-496c-8841-471e2f43021c\") " pod="openshift-controller-manager/controller-manager-879f6c89f-6cvdd" Sep 30 07:35:54 crc kubenswrapper[4760]: I0930 07:35:54.891972 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9cmtf\" (UniqueName: \"kubernetes.io/projected/7c05864f-63f6-4fdf-9207-0d63dd89fc49-kube-api-access-9cmtf\") pod \"apiserver-7bbb656c7d-qc22x\" (UID: \"7c05864f-63f6-4fdf-9207-0d63dd89fc49\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-qc22x" Sep 30 07:35:54 crc kubenswrapper[4760]: I0930 07:35:54.891989 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4f56f\" (UniqueName: \"kubernetes.io/projected/179c4bc2-b28d-445b-98f2-aa307d57cd9f-kube-api-access-4f56f\") pod \"apiserver-76f77b778f-bhzlk\" (UID: \"179c4bc2-b28d-445b-98f2-aa307d57cd9f\") " pod="openshift-apiserver/apiserver-76f77b778f-bhzlk" Sep 30 07:35:54 crc kubenswrapper[4760]: I0930 07:35:54.892031 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/179c4bc2-b28d-445b-98f2-aa307d57cd9f-config\") pod \"apiserver-76f77b778f-bhzlk\" (UID: \"179c4bc2-b28d-445b-98f2-aa307d57cd9f\") " pod="openshift-apiserver/apiserver-76f77b778f-bhzlk" Sep 30 07:35:54 crc kubenswrapper[4760]: I0930 07:35:54.892614 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7c05864f-63f6-4fdf-9207-0d63dd89fc49-serving-cert\") pod \"apiserver-7bbb656c7d-qc22x\" (UID: \"7c05864f-63f6-4fdf-9207-0d63dd89fc49\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-qc22x" Sep 30 07:35:54 crc kubenswrapper[4760]: I0930 07:35:54.892758 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/3f80e44c-0738-48d0-b6e5-783d696c5ec4-service-ca-bundle\") pod \"authentication-operator-69f744f599-q5spr\" (UID: \"3f80e44c-0738-48d0-b6e5-783d696c5ec4\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-q5spr" Sep 30 07:35:54 crc kubenswrapper[4760]: I0930 07:35:54.892788 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/179c4bc2-b28d-445b-98f2-aa307d57cd9f-encryption-config\") pod \"apiserver-76f77b778f-bhzlk\" (UID: \"179c4bc2-b28d-445b-98f2-aa307d57cd9f\") " pod="openshift-apiserver/apiserver-76f77b778f-bhzlk" Sep 30 07:35:54 crc kubenswrapper[4760]: I0930 07:35:54.892805 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/f2f25243-2a0b-498f-8de6-8b0a21c72c49-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-77ttc\" (UID: \"f2f25243-2a0b-498f-8de6-8b0a21c72c49\") " pod="openshift-authentication/oauth-openshift-558db77b4-77ttc" Sep 30 07:35:54 crc kubenswrapper[4760]: I0930 07:35:54.892848 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/a96a9516-5f80-4391-a1f2-f4b7531e65fa-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-tlrrq\" (UID: \"a96a9516-5f80-4391-a1f2-f4b7531e65fa\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-tlrrq" Sep 30 07:35:54 crc kubenswrapper[4760]: I0930 07:35:54.893221 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2fb43f32-6ad4-4450-8a05-80570020d5e8-config\") pod \"machine-api-operator-5694c8668f-4lv9x\" (UID: \"2fb43f32-6ad4-4450-8a05-80570020d5e8\") " 
pod="openshift-machine-api/machine-api-operator-5694c8668f-4lv9x" Sep 30 07:35:54 crc kubenswrapper[4760]: I0930 07:35:54.894375 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/2bcd213c-83e6-4a23-9542-727b1254b17d-machine-approver-tls\") pod \"machine-approver-56656f9798-lgt64\" (UID: \"2bcd213c-83e6-4a23-9542-727b1254b17d\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-lgt64" Sep 30 07:35:54 crc kubenswrapper[4760]: I0930 07:35:54.894436 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ff7ef449-b6c7-4e55-886d-b66dd8327e7b-serving-cert\") pod \"route-controller-manager-6576b87f9c-m8vl7\" (UID: \"ff7ef449-b6c7-4e55-886d-b66dd8327e7b\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-m8vl7" Sep 30 07:35:54 crc kubenswrapper[4760]: I0930 07:35:54.894495 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3f80e44c-0738-48d0-b6e5-783d696c5ec4-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-q5spr\" (UID: \"3f80e44c-0738-48d0-b6e5-783d696c5ec4\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-q5spr" Sep 30 07:35:54 crc kubenswrapper[4760]: I0930 07:35:54.894530 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d0f0f394-f395-4181-a0f9-afa9b7467013-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-75sr2\" (UID: \"d0f0f394-f395-4181-a0f9-afa9b7467013\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-75sr2" Sep 30 07:35:54 crc kubenswrapper[4760]: I0930 07:35:54.894562 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3f80e44c-0738-48d0-b6e5-783d696c5ec4-service-ca-bundle\") pod \"authentication-operator-69f744f599-q5spr\" (UID: \"3f80e44c-0738-48d0-b6e5-783d696c5ec4\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-q5spr" Sep 30 07:35:54 crc kubenswrapper[4760]: I0930 07:35:54.894688 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/f2f25243-2a0b-498f-8de6-8b0a21c72c49-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-77ttc\" (UID: \"f2f25243-2a0b-498f-8de6-8b0a21c72c49\") " pod="openshift-authentication/oauth-openshift-558db77b4-77ttc" Sep 30 07:35:54 crc kubenswrapper[4760]: I0930 07:35:54.894775 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fa5b63b3-2bc6-496c-8841-471e2f43021c-config\") pod \"controller-manager-879f6c89f-6cvdd\" (UID: \"fa5b63b3-2bc6-496c-8841-471e2f43021c\") " pod="openshift-controller-manager/controller-manager-879f6c89f-6cvdd" Sep 30 07:35:54 crc kubenswrapper[4760]: I0930 07:35:54.894814 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ff7ef449-b6c7-4e55-886d-b66dd8327e7b-client-ca\") pod \"route-controller-manager-6576b87f9c-m8vl7\" (UID: \"ff7ef449-b6c7-4e55-886d-b66dd8327e7b\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-m8vl7" Sep 30 07:35:54 crc kubenswrapper[4760]: I0930 07:35:54.894832 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3f80e44c-0738-48d0-b6e5-783d696c5ec4-config\") pod \"authentication-operator-69f744f599-q5spr\" (UID: \"3f80e44c-0738-48d0-b6e5-783d696c5ec4\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-q5spr" Sep 30 
07:35:54 crc kubenswrapper[4760]: I0930 07:35:54.894869 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/179c4bc2-b28d-445b-98f2-aa307d57cd9f-serving-cert\") pod \"apiserver-76f77b778f-bhzlk\" (UID: \"179c4bc2-b28d-445b-98f2-aa307d57cd9f\") " pod="openshift-apiserver/apiserver-76f77b778f-bhzlk" Sep 30 07:35:54 crc kubenswrapper[4760]: I0930 07:35:54.894886 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/f2f25243-2a0b-498f-8de6-8b0a21c72c49-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-77ttc\" (UID: \"f2f25243-2a0b-498f-8de6-8b0a21c72c49\") " pod="openshift-authentication/oauth-openshift-558db77b4-77ttc" Sep 30 07:35:54 crc kubenswrapper[4760]: I0930 07:35:54.894903 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d8e4c5c1-b6db-4baf-8bbe-9fc39e1ed6ff-config\") pod \"kube-controller-manager-operator-78b949d7b-pznb4\" (UID: \"d8e4c5c1-b6db-4baf-8bbe-9fc39e1ed6ff\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-pznb4" Sep 30 07:35:54 crc kubenswrapper[4760]: I0930 07:35:54.895370 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3f80e44c-0738-48d0-b6e5-783d696c5ec4-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-q5spr\" (UID: \"3f80e44c-0738-48d0-b6e5-783d696c5ec4\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-q5spr" Sep 30 07:35:54 crc kubenswrapper[4760]: I0930 07:35:54.895642 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3f80e44c-0738-48d0-b6e5-783d696c5ec4-serving-cert\") pod 
\"authentication-operator-69f744f599-q5spr\" (UID: \"3f80e44c-0738-48d0-b6e5-783d696c5ec4\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-q5spr" Sep 30 07:35:54 crc kubenswrapper[4760]: I0930 07:35:54.895735 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/7c05864f-63f6-4fdf-9207-0d63dd89fc49-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-qc22x\" (UID: \"7c05864f-63f6-4fdf-9207-0d63dd89fc49\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-qc22x" Sep 30 07:35:54 crc kubenswrapper[4760]: I0930 07:35:54.895755 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/cbe4e2d5-6999-435b-b0af-d585a0ef1f5f-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-ntj6x\" (UID: \"cbe4e2d5-6999-435b-b0af-d585a0ef1f5f\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-ntj6x" Sep 30 07:35:54 crc kubenswrapper[4760]: I0930 07:35:54.895815 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/179c4bc2-b28d-445b-98f2-aa307d57cd9f-audit-dir\") pod \"apiserver-76f77b778f-bhzlk\" (UID: \"179c4bc2-b28d-445b-98f2-aa307d57cd9f\") " pod="openshift-apiserver/apiserver-76f77b778f-bhzlk" Sep 30 07:35:54 crc kubenswrapper[4760]: I0930 07:35:54.895833 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/f2f25243-2a0b-498f-8de6-8b0a21c72c49-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-77ttc\" (UID: \"f2f25243-2a0b-498f-8de6-8b0a21c72c49\") " pod="openshift-authentication/oauth-openshift-558db77b4-77ttc" Sep 30 07:35:54 crc kubenswrapper[4760]: I0930 07:35:54.895872 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/fa5b63b3-2bc6-496c-8841-471e2f43021c-serving-cert\") pod \"controller-manager-879f6c89f-6cvdd\" (UID: \"fa5b63b3-2bc6-496c-8841-471e2f43021c\") " pod="openshift-controller-manager/controller-manager-879f6c89f-6cvdd" Sep 30 07:35:54 crc kubenswrapper[4760]: I0930 07:35:54.895885 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3f80e44c-0738-48d0-b6e5-783d696c5ec4-config\") pod \"authentication-operator-69f744f599-q5spr\" (UID: \"3f80e44c-0738-48d0-b6e5-783d696c5ec4\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-q5spr" Sep 30 07:35:54 crc kubenswrapper[4760]: I0930 07:35:54.896012 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/08d362f3-5c04-45fe-9981-ada11b028f83-service-ca\") pod \"console-f9d7485db-ntgr2\" (UID: \"08d362f3-5c04-45fe-9981-ada11b028f83\") " pod="openshift-console/console-f9d7485db-ntgr2" Sep 30 07:35:54 crc kubenswrapper[4760]: I0930 07:35:54.896117 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/179c4bc2-b28d-445b-98f2-aa307d57cd9f-audit\") pod \"apiserver-76f77b778f-bhzlk\" (UID: \"179c4bc2-b28d-445b-98f2-aa307d57cd9f\") " pod="openshift-apiserver/apiserver-76f77b778f-bhzlk" Sep 30 07:35:54 crc kubenswrapper[4760]: I0930 07:35:54.896139 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/fa5b63b3-2bc6-496c-8841-471e2f43021c-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-6cvdd\" (UID: \"fa5b63b3-2bc6-496c-8841-471e2f43021c\") " pod="openshift-controller-manager/controller-manager-879f6c89f-6cvdd" Sep 30 07:35:54 crc kubenswrapper[4760]: I0930 07:35:54.896196 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-c8vvk\" (UniqueName: \"kubernetes.io/projected/cbe4e2d5-6999-435b-b0af-d585a0ef1f5f-kube-api-access-c8vvk\") pod \"openshift-apiserver-operator-796bbdcf4f-ntj6x\" (UID: \"cbe4e2d5-6999-435b-b0af-d585a0ef1f5f\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-ntj6x" Sep 30 07:35:54 crc kubenswrapper[4760]: I0930 07:35:54.896213 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/f2f25243-2a0b-498f-8de6-8b0a21c72c49-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-77ttc\" (UID: \"f2f25243-2a0b-498f-8de6-8b0a21c72c49\") " pod="openshift-authentication/oauth-openshift-558db77b4-77ttc" Sep 30 07:35:54 crc kubenswrapper[4760]: I0930 07:35:54.896228 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/f2f25243-2a0b-498f-8de6-8b0a21c72c49-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-77ttc\" (UID: \"f2f25243-2a0b-498f-8de6-8b0a21c72c49\") " pod="openshift-authentication/oauth-openshift-558db77b4-77ttc" Sep 30 07:35:54 crc kubenswrapper[4760]: I0930 07:35:54.896271 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tbkv8\" (UniqueName: \"kubernetes.io/projected/2611e16f-2c0b-44b4-929b-21f16b1b2e4d-kube-api-access-tbkv8\") pod \"openshift-config-operator-7777fb866f-p96nl\" (UID: \"2611e16f-2c0b-44b4-929b-21f16b1b2e4d\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-p96nl" Sep 30 07:35:54 crc kubenswrapper[4760]: I0930 07:35:54.896386 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/7c05864f-63f6-4fdf-9207-0d63dd89fc49-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-qc22x\" (UID: \"7c05864f-63f6-4fdf-9207-0d63dd89fc49\") " 
pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-qc22x" Sep 30 07:35:54 crc kubenswrapper[4760]: I0930 07:35:54.896421 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/179c4bc2-b28d-445b-98f2-aa307d57cd9f-audit-dir\") pod \"apiserver-76f77b778f-bhzlk\" (UID: \"179c4bc2-b28d-445b-98f2-aa307d57cd9f\") " pod="openshift-apiserver/apiserver-76f77b778f-bhzlk" Sep 30 07:35:54 crc kubenswrapper[4760]: I0930 07:35:54.896906 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fa5b63b3-2bc6-496c-8841-471e2f43021c-config\") pod \"controller-manager-879f6c89f-6cvdd\" (UID: \"fa5b63b3-2bc6-496c-8841-471e2f43021c\") " pod="openshift-controller-manager/controller-manager-879f6c89f-6cvdd" Sep 30 07:35:54 crc kubenswrapper[4760]: I0930 07:35:54.897195 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ff7ef449-b6c7-4e55-886d-b66dd8327e7b-client-ca\") pod \"route-controller-manager-6576b87f9c-m8vl7\" (UID: \"ff7ef449-b6c7-4e55-886d-b66dd8327e7b\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-m8vl7" Sep 30 07:35:54 crc kubenswrapper[4760]: I0930 07:35:54.898523 4760 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Sep 30 07:35:54 crc kubenswrapper[4760]: I0930 07:35:54.898667 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/fa5b63b3-2bc6-496c-8841-471e2f43021c-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-6cvdd\" (UID: \"fa5b63b3-2bc6-496c-8841-471e2f43021c\") " pod="openshift-controller-manager/controller-manager-879f6c89f-6cvdd" Sep 30 07:35:54 crc kubenswrapper[4760]: I0930 07:35:54.898729 4760 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/7c05864f-63f6-4fdf-9207-0d63dd89fc49-etcd-client\") pod \"apiserver-7bbb656c7d-qc22x\" (UID: \"7c05864f-63f6-4fdf-9207-0d63dd89fc49\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-qc22x" Sep 30 07:35:54 crc kubenswrapper[4760]: I0930 07:35:54.898786 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f2f25243-2a0b-498f-8de6-8b0a21c72c49-audit-dir\") pod \"oauth-openshift-558db77b4-77ttc\" (UID: \"f2f25243-2a0b-498f-8de6-8b0a21c72c49\") " pod="openshift-authentication/oauth-openshift-558db77b4-77ttc" Sep 30 07:35:54 crc kubenswrapper[4760]: I0930 07:35:54.898809 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09aeaa85-1a1d-426f-b7f2-611f67942f2c-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-dmqhx\" (UID: \"09aeaa85-1a1d-426f-b7f2-611f67942f2c\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-dmqhx" Sep 30 07:35:54 crc kubenswrapper[4760]: I0930 07:35:54.898829 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/86638afb-4930-4496-a00d-8f243be3ab33-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-cfnkh\" (UID: \"86638afb-4930-4496-a00d-8f243be3ab33\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-cfnkh" Sep 30 07:35:54 crc kubenswrapper[4760]: I0930 07:35:54.899363 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/2bcd213c-83e6-4a23-9542-727b1254b17d-machine-approver-tls\") pod \"machine-approver-56656f9798-lgt64\" (UID: 
\"2bcd213c-83e6-4a23-9542-727b1254b17d\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-lgt64" Sep 30 07:35:54 crc kubenswrapper[4760]: I0930 07:35:54.899647 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3f80e44c-0738-48d0-b6e5-783d696c5ec4-serving-cert\") pod \"authentication-operator-69f744f599-q5spr\" (UID: \"3f80e44c-0738-48d0-b6e5-783d696c5ec4\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-q5spr" Sep 30 07:35:54 crc kubenswrapper[4760]: I0930 07:35:54.899659 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/179c4bc2-b28d-445b-98f2-aa307d57cd9f-etcd-client\") pod \"apiserver-76f77b778f-bhzlk\" (UID: \"179c4bc2-b28d-445b-98f2-aa307d57cd9f\") " pod="openshift-apiserver/apiserver-76f77b778f-bhzlk" Sep 30 07:35:54 crc kubenswrapper[4760]: I0930 07:35:54.899724 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f2f25243-2a0b-498f-8de6-8b0a21c72c49-audit-dir\") pod \"oauth-openshift-558db77b4-77ttc\" (UID: \"f2f25243-2a0b-498f-8de6-8b0a21c72c49\") " pod="openshift-authentication/oauth-openshift-558db77b4-77ttc" Sep 30 07:35:54 crc kubenswrapper[4760]: I0930 07:35:54.899874 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/f2f25243-2a0b-498f-8de6-8b0a21c72c49-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-77ttc\" (UID: \"f2f25243-2a0b-498f-8de6-8b0a21c72c49\") " pod="openshift-authentication/oauth-openshift-558db77b4-77ttc" Sep 30 07:35:54 crc kubenswrapper[4760]: I0930 07:35:54.900102 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/179c4bc2-b28d-445b-98f2-aa307d57cd9f-audit\") pod 
\"apiserver-76f77b778f-bhzlk\" (UID: \"179c4bc2-b28d-445b-98f2-aa307d57cd9f\") " pod="openshift-apiserver/apiserver-76f77b778f-bhzlk" Sep 30 07:35:54 crc kubenswrapper[4760]: I0930 07:35:54.900827 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/179c4bc2-b28d-445b-98f2-aa307d57cd9f-encryption-config\") pod \"apiserver-76f77b778f-bhzlk\" (UID: \"179c4bc2-b28d-445b-98f2-aa307d57cd9f\") " pod="openshift-apiserver/apiserver-76f77b778f-bhzlk" Sep 30 07:35:54 crc kubenswrapper[4760]: I0930 07:35:54.901376 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/7c05864f-63f6-4fdf-9207-0d63dd89fc49-etcd-client\") pod \"apiserver-7bbb656c7d-qc22x\" (UID: \"7c05864f-63f6-4fdf-9207-0d63dd89fc49\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-qc22x" Sep 30 07:35:54 crc kubenswrapper[4760]: I0930 07:35:54.901724 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/f2f25243-2a0b-498f-8de6-8b0a21c72c49-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-77ttc\" (UID: \"f2f25243-2a0b-498f-8de6-8b0a21c72c49\") " pod="openshift-authentication/oauth-openshift-558db77b4-77ttc" Sep 30 07:35:54 crc kubenswrapper[4760]: I0930 07:35:54.902336 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/cbe4e2d5-6999-435b-b0af-d585a0ef1f5f-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-ntj6x\" (UID: \"cbe4e2d5-6999-435b-b0af-d585a0ef1f5f\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-ntj6x" Sep 30 07:35:54 crc kubenswrapper[4760]: I0930 07:35:54.902508 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: 
\"kubernetes.io/secret/f2f25243-2a0b-498f-8de6-8b0a21c72c49-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-77ttc\" (UID: \"f2f25243-2a0b-498f-8de6-8b0a21c72c49\") " pod="openshift-authentication/oauth-openshift-558db77b4-77ttc" Sep 30 07:35:54 crc kubenswrapper[4760]: I0930 07:35:54.902570 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/f2f25243-2a0b-498f-8de6-8b0a21c72c49-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-77ttc\" (UID: \"f2f25243-2a0b-498f-8de6-8b0a21c72c49\") " pod="openshift-authentication/oauth-openshift-558db77b4-77ttc" Sep 30 07:35:54 crc kubenswrapper[4760]: I0930 07:35:54.902675 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/179c4bc2-b28d-445b-98f2-aa307d57cd9f-serving-cert\") pod \"apiserver-76f77b778f-bhzlk\" (UID: \"179c4bc2-b28d-445b-98f2-aa307d57cd9f\") " pod="openshift-apiserver/apiserver-76f77b778f-bhzlk" Sep 30 07:35:54 crc kubenswrapper[4760]: I0930 07:35:54.905430 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-hgcbk"] Sep 30 07:35:54 crc kubenswrapper[4760]: I0930 07:35:54.907492 4760 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-hgcbk" Sep 30 07:35:54 crc kubenswrapper[4760]: I0930 07:35:54.908600 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/f2f25243-2a0b-498f-8de6-8b0a21c72c49-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-77ttc\" (UID: \"f2f25243-2a0b-498f-8de6-8b0a21c72c49\") " pod="openshift-authentication/oauth-openshift-558db77b4-77ttc" Sep 30 07:35:54 crc kubenswrapper[4760]: I0930 07:35:54.910145 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg" Sep 30 07:35:54 crc kubenswrapper[4760]: I0930 07:35:54.913292 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-pn92b"] Sep 30 07:35:54 crc kubenswrapper[4760]: I0930 07:35:54.913790 4760 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-pn92b" Sep 30 07:35:54 crc kubenswrapper[4760]: I0930 07:35:54.914254 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/2fb43f32-6ad4-4450-8a05-80570020d5e8-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-4lv9x\" (UID: \"2fb43f32-6ad4-4450-8a05-80570020d5e8\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-4lv9x" Sep 30 07:35:54 crc kubenswrapper[4760]: I0930 07:35:54.921730 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/f2f25243-2a0b-498f-8de6-8b0a21c72c49-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-77ttc\" (UID: \"f2f25243-2a0b-498f-8de6-8b0a21c72c49\") " pod="openshift-authentication/oauth-openshift-558db77b4-77ttc" Sep 30 07:35:54 crc kubenswrapper[4760]: I0930 07:35:54.924569 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-z9mdp"] Sep 30 07:35:54 crc kubenswrapper[4760]: I0930 07:35:54.925055 4760 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-z9mdp" Sep 30 07:35:54 crc kubenswrapper[4760]: I0930 07:35:54.928906 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Sep 30 07:35:54 crc kubenswrapper[4760]: I0930 07:35:54.929295 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/7c05864f-63f6-4fdf-9207-0d63dd89fc49-encryption-config\") pod \"apiserver-7bbb656c7d-qc22x\" (UID: \"7c05864f-63f6-4fdf-9207-0d63dd89fc49\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-qc22x" Sep 30 07:35:54 crc kubenswrapper[4760]: I0930 07:35:54.930136 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-7954f5f757-jx7lt"] Sep 30 07:35:54 crc kubenswrapper[4760]: I0930 07:35:54.930568 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-7954f5f757-jx7lt" Sep 30 07:35:54 crc kubenswrapper[4760]: I0930 07:35:54.931275 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-kkfmw"] Sep 30 07:35:54 crc kubenswrapper[4760]: I0930 07:35:54.931907 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-kkfmw" Sep 30 07:35:54 crc kubenswrapper[4760]: I0930 07:35:54.933882 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-58897d9998-zwnlf"] Sep 30 07:35:54 crc kubenswrapper[4760]: I0930 07:35:54.934501 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-zpzhj"] Sep 30 07:35:54 crc kubenswrapper[4760]: I0930 07:35:54.934894 4760 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-zwnlf" Sep 30 07:35:54 crc kubenswrapper[4760]: I0930 07:35:54.935237 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-zpzhj" Sep 30 07:35:54 crc kubenswrapper[4760]: I0930 07:35:54.936242 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-n5tk5"] Sep 30 07:35:54 crc kubenswrapper[4760]: I0930 07:35:54.936992 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-n5tk5" Sep 30 07:35:54 crc kubenswrapper[4760]: I0930 07:35:54.937360 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-5lscq"] Sep 30 07:35:54 crc kubenswrapper[4760]: I0930 07:35:54.940894 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-vgrsq"] Sep 30 07:35:54 crc kubenswrapper[4760]: I0930 07:35:54.941061 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-5lscq" Sep 30 07:35:54 crc kubenswrapper[4760]: I0930 07:35:54.941552 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-vgrsq" Sep 30 07:35:54 crc kubenswrapper[4760]: I0930 07:35:54.941607 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-2fn4r"] Sep 30 07:35:54 crc kubenswrapper[4760]: I0930 07:35:54.943657 4760 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-2fn4r" Sep 30 07:35:54 crc kubenswrapper[4760]: I0930 07:35:54.943696 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-mqkwn"] Sep 30 07:35:54 crc kubenswrapper[4760]: I0930 07:35:54.944051 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-mqkwn" Sep 30 07:35:54 crc kubenswrapper[4760]: I0930 07:35:54.944981 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29320290-w8n6p"] Sep 30 07:35:54 crc kubenswrapper[4760]: I0930 07:35:54.945513 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29320290-w8n6p" Sep 30 07:35:54 crc kubenswrapper[4760]: I0930 07:35:54.946145 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-9gnwv"] Sep 30 07:35:54 crc kubenswrapper[4760]: I0930 07:35:54.947264 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-9gnwv" Sep 30 07:35:54 crc kubenswrapper[4760]: I0930 07:35:54.948441 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-5444994796-vjd5w"] Sep 30 07:35:54 crc kubenswrapper[4760]: I0930 07:35:54.948965 4760 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/router-default-5444994796-vjd5w" Sep 30 07:35:54 crc kubenswrapper[4760]: I0930 07:35:54.949345 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-mncn2"] Sep 30 07:35:54 crc kubenswrapper[4760]: I0930 07:35:54.950022 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-mncn2" Sep 30 07:35:54 crc kubenswrapper[4760]: I0930 07:35:54.950624 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-qc22x"] Sep 30 07:35:54 crc kubenswrapper[4760]: I0930 07:35:54.951954 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-82mws"] Sep 30 07:35:54 crc kubenswrapper[4760]: I0930 07:35:54.952893 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-82mws" Sep 30 07:35:54 crc kubenswrapper[4760]: I0930 07:35:54.953339 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-cfnkh"] Sep 30 07:35:54 crc kubenswrapper[4760]: I0930 07:35:54.954417 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-ntgr2"] Sep 30 07:35:54 crc kubenswrapper[4760]: I0930 07:35:54.955614 4760 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Sep 30 07:35:54 crc kubenswrapper[4760]: I0930 07:35:54.956271 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-dmqhx"] Sep 30 07:35:54 crc kubenswrapper[4760]: I0930 07:35:54.957627 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-authentication-operator/authentication-operator-69f744f599-q5spr"] Sep 30 07:35:54 crc kubenswrapper[4760]: I0930 07:35:54.959725 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-75sr2"] Sep 30 07:35:54 crc kubenswrapper[4760]: I0930 07:35:54.965242 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-tlrrq"] Sep 30 07:35:54 crc kubenswrapper[4760]: I0930 07:35:54.968424 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-pznb4"] Sep 30 07:35:54 crc kubenswrapper[4760]: I0930 07:35:54.969369 4760 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Sep 30 07:35:54 crc kubenswrapper[4760]: I0930 07:35:54.970558 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-7278d"] Sep 30 07:35:54 crc kubenswrapper[4760]: I0930 07:35:54.971590 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-gsddw"] Sep 30 07:35:54 crc kubenswrapper[4760]: I0930 07:35:54.975790 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-z9mdp"] Sep 30 07:35:54 crc kubenswrapper[4760]: I0930 07:35:54.975836 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-p96nl"] Sep 30 07:35:54 crc kubenswrapper[4760]: I0930 07:35:54.975847 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-pn92b"] Sep 30 07:35:54 crc kubenswrapper[4760]: I0930 07:35:54.977125 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-s7mth"] Sep 30 07:35:54 crc kubenswrapper[4760]: I0930 07:35:54.978333 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-qgs9v"] Sep 30 07:35:54 crc kubenswrapper[4760]: I0930 07:35:54.979135 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-jx7lt"] Sep 30 07:35:54 crc kubenswrapper[4760]: I0930 07:35:54.980855 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29320290-w8n6p"] Sep 30 07:35:54 crc kubenswrapper[4760]: I0930 07:35:54.982039 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-hgcbk"] Sep 30 07:35:54 crc kubenswrapper[4760]: I0930 07:35:54.983165 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-2fn4r"] Sep 30 07:35:54 crc kubenswrapper[4760]: I0930 07:35:54.984263 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-ptzmt"] Sep 30 07:35:54 crc kubenswrapper[4760]: I0930 07:35:54.985376 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-9gnwv"] Sep 30 07:35:54 crc kubenswrapper[4760]: I0930 07:35:54.986436 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-77ttc"] Sep 30 07:35:54 crc kubenswrapper[4760]: I0930 07:35:54.987436 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-fwcqc"] Sep 30 07:35:54 crc kubenswrapper[4760]: I0930 07:35:54.988795 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-82mws"] Sep 30 07:35:54 crc 
kubenswrapper[4760]: I0930 07:35:54.989077 4760 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Sep 30 07:35:54 crc kubenswrapper[4760]: I0930 07:35:54.989838 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-zrlmg"] Sep 30 07:35:54 crc kubenswrapper[4760]: I0930 07:35:54.990843 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-zrlmg" Sep 30 07:35:54 crc kubenswrapper[4760]: I0930 07:35:54.991404 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-76w2l"] Sep 30 07:35:54 crc kubenswrapper[4760]: I0930 07:35:54.992378 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-76w2l" Sep 30 07:35:54 crc kubenswrapper[4760]: I0930 07:35:54.993506 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-5lscq"] Sep 30 07:35:54 crc kubenswrapper[4760]: I0930 07:35:54.994434 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-kkfmw"] Sep 30 07:35:54 crc kubenswrapper[4760]: I0930 07:35:54.995629 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-mncn2"] Sep 30 07:35:54 crc kubenswrapper[4760]: I0930 07:35:54.996829 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-mqkwn"] Sep 30 07:35:54 crc kubenswrapper[4760]: I0930 07:35:54.997917 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-zpzhj"] Sep 30 07:35:54 crc kubenswrapper[4760]: I0930 07:35:54.999024 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-image-registry/image-registry-697d97f7c8-vgrsq"] Sep 30 07:35:54 crc kubenswrapper[4760]: I0930 07:35:54.999461 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/08d362f3-5c04-45fe-9981-ada11b028f83-console-oauth-config\") pod \"console-f9d7485db-ntgr2\" (UID: \"08d362f3-5c04-45fe-9981-ada11b028f83\") " pod="openshift-console/console-f9d7485db-ntgr2" Sep 30 07:35:54 crc kubenswrapper[4760]: I0930 07:35:54.999487 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/85f89a8e-5f37-458f-9896-fe3940cc68b6-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-mncn2\" (UID: \"85f89a8e-5f37-458f-9896-fe3940cc68b6\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-mncn2" Sep 30 07:35:54 crc kubenswrapper[4760]: I0930 07:35:54.999508 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kcddb\" (UniqueName: \"kubernetes.io/projected/09aeaa85-1a1d-426f-b7f2-611f67942f2c-kube-api-access-kcddb\") pod \"kube-storage-version-migrator-operator-b67b599dd-dmqhx\" (UID: \"09aeaa85-1a1d-426f-b7f2-611f67942f2c\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-dmqhx" Sep 30 07:35:54 crc kubenswrapper[4760]: I0930 07:35:54.999525 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cgch4\" (UniqueName: \"kubernetes.io/projected/cc4f6325-5a2a-4f15-8ee9-860617a9d7ce-kube-api-access-cgch4\") pod \"marketplace-operator-79b997595-gsddw\" (UID: \"cc4f6325-5a2a-4f15-8ee9-860617a9d7ce\") " pod="openshift-marketplace/marketplace-operator-79b997595-gsddw" Sep 30 07:35:54 crc kubenswrapper[4760]: I0930 07:35:54.999591 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"kube-api-access-njj95\" (UniqueName: \"kubernetes.io/projected/ea1dde7c-e014-4b78-b8e9-0fc5a0e161ac-kube-api-access-njj95\") pod \"service-ca-9c57cc56f-z9mdp\" (UID: \"ea1dde7c-e014-4b78-b8e9-0fc5a0e161ac\") " pod="openshift-service-ca/service-ca-9c57cc56f-z9mdp" Sep 30 07:35:54 crc kubenswrapper[4760]: I0930 07:35:54.999620 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vpf4d\" (UniqueName: \"kubernetes.io/projected/d0f0f394-f395-4181-a0f9-afa9b7467013-kube-api-access-vpf4d\") pod \"openshift-controller-manager-operator-756b6f6bc6-75sr2\" (UID: \"d0f0f394-f395-4181-a0f9-afa9b7467013\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-75sr2" Sep 30 07:35:54 crc kubenswrapper[4760]: I0930 07:35:54.999641 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/2611e16f-2c0b-44b4-929b-21f16b1b2e4d-available-featuregates\") pod \"openshift-config-operator-7777fb866f-p96nl\" (UID: \"2611e16f-2c0b-44b4-929b-21f16b1b2e4d\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-p96nl" Sep 30 07:35:54 crc kubenswrapper[4760]: I0930 07:35:54.999659 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/26bc4724-af08-4012-9656-d1cd06b533ef-profile-collector-cert\") pod \"olm-operator-6b444d44fb-5lscq\" (UID: \"26bc4724-af08-4012-9656-d1cd06b533ef\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-5lscq" Sep 30 07:35:54 crc kubenswrapper[4760]: I0930 07:35:54.999679 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tm59j\" (UniqueName: \"kubernetes.io/projected/a96a9516-5f80-4391-a1f2-f4b7531e65fa-kube-api-access-tm59j\") pod \"control-plane-machine-set-operator-78cbb6b69f-tlrrq\" 
(UID: \"a96a9516-5f80-4391-a1f2-f4b7531e65fa\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-tlrrq" Sep 30 07:35:54 crc kubenswrapper[4760]: I0930 07:35:54.999697 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/08d362f3-5c04-45fe-9981-ada11b028f83-trusted-ca-bundle\") pod \"console-f9d7485db-ntgr2\" (UID: \"08d362f3-5c04-45fe-9981-ada11b028f83\") " pod="openshift-console/console-f9d7485db-ntgr2" Sep 30 07:35:54 crc kubenswrapper[4760]: I0930 07:35:54.999712 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/08d362f3-5c04-45fe-9981-ada11b028f83-oauth-serving-cert\") pod \"console-f9d7485db-ntgr2\" (UID: \"08d362f3-5c04-45fe-9981-ada11b028f83\") " pod="openshift-console/console-f9d7485db-ntgr2" Sep 30 07:35:54 crc kubenswrapper[4760]: I0930 07:35:54.999730 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d0f0f394-f395-4181-a0f9-afa9b7467013-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-75sr2\" (UID: \"d0f0f394-f395-4181-a0f9-afa9b7467013\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-75sr2" Sep 30 07:35:54 crc kubenswrapper[4760]: I0930 07:35:54.999748 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/63503165-1bff-42d3-99f4-2af2d7f490ec-config\") pod \"service-ca-operator-777779d784-mqkwn\" (UID: \"63503165-1bff-42d3-99f4-2af2d7f490ec\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-mqkwn" Sep 30 07:35:54 crc kubenswrapper[4760]: I0930 07:35:54.999765 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: 
\"kubernetes.io/projected/85f89a8e-5f37-458f-9896-fe3940cc68b6-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-mncn2\" (UID: \"85f89a8e-5f37-458f-9896-fe3940cc68b6\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-mncn2" Sep 30 07:35:54 crc kubenswrapper[4760]: I0930 07:35:54.999785 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q45c6\" (UniqueName: \"kubernetes.io/projected/dd4856c0-fc17-49b2-b37b-b0414e6a2f48-kube-api-access-q45c6\") pod \"migrator-59844c95c7-s7mth\" (UID: \"dd4856c0-fc17-49b2-b37b-b0414e6a2f48\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-s7mth" Sep 30 07:35:54 crc kubenswrapper[4760]: I0930 07:35:54.999811 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/08d362f3-5c04-45fe-9981-ada11b028f83-console-serving-cert\") pod \"console-f9d7485db-ntgr2\" (UID: \"08d362f3-5c04-45fe-9981-ada11b028f83\") " pod="openshift-console/console-f9d7485db-ntgr2" Sep 30 07:35:54 crc kubenswrapper[4760]: I0930 07:35:54.999828 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/08d362f3-5c04-45fe-9981-ada11b028f83-console-config\") pod \"console-f9d7485db-ntgr2\" (UID: \"08d362f3-5c04-45fe-9981-ada11b028f83\") " pod="openshift-console/console-f9d7485db-ntgr2" Sep 30 07:35:54 crc kubenswrapper[4760]: I0930 07:35:54.999845 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c7zxl\" (UniqueName: \"kubernetes.io/projected/08d362f3-5c04-45fe-9981-ada11b028f83-kube-api-access-c7zxl\") pod \"console-f9d7485db-ntgr2\" (UID: \"08d362f3-5c04-45fe-9981-ada11b028f83\") " pod="openshift-console/console-f9d7485db-ntgr2" Sep 30 07:35:54 crc kubenswrapper[4760]: I0930 07:35:54.999877 4760 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09aeaa85-1a1d-426f-b7f2-611f67942f2c-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-dmqhx\" (UID: \"09aeaa85-1a1d-426f-b7f2-611f67942f2c\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-dmqhx" Sep 30 07:35:54 crc kubenswrapper[4760]: I0930 07:35:54.999896 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2611e16f-2c0b-44b4-929b-21f16b1b2e4d-serving-cert\") pod \"openshift-config-operator-7777fb866f-p96nl\" (UID: \"2611e16f-2c0b-44b4-929b-21f16b1b2e4d\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-p96nl" Sep 30 07:35:54 crc kubenswrapper[4760]: I0930 07:35:54.999914 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/63503165-1bff-42d3-99f4-2af2d7f490ec-serving-cert\") pod \"service-ca-operator-777779d784-mqkwn\" (UID: \"63503165-1bff-42d3-99f4-2af2d7f490ec\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-mqkwn" Sep 30 07:35:54 crc kubenswrapper[4760]: I0930 07:35:54.999934 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/4c51dda7-332e-497f-96ed-932d5349ee59-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-fwcqc\" (UID: \"4c51dda7-332e-497f-96ed-932d5349ee59\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-fwcqc" Sep 30 07:35:54 crc kubenswrapper[4760]: I0930 07:35:54.999953 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: 
\"kubernetes.io/projected/d8e4c5c1-b6db-4baf-8bbe-9fc39e1ed6ff-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-pznb4\" (UID: \"d8e4c5c1-b6db-4baf-8bbe-9fc39e1ed6ff\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-pznb4" Sep 30 07:35:55 crc kubenswrapper[4760]: I0930 07:35:54.999968 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/4c51dda7-332e-497f-96ed-932d5349ee59-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-fwcqc\" (UID: \"4c51dda7-332e-497f-96ed-932d5349ee59\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-fwcqc" Sep 30 07:35:55 crc kubenswrapper[4760]: I0930 07:35:54.999985 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d8e4c5c1-b6db-4baf-8bbe-9fc39e1ed6ff-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-pznb4\" (UID: \"d8e4c5c1-b6db-4baf-8bbe-9fc39e1ed6ff\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-pznb4" Sep 30 07:35:55 crc kubenswrapper[4760]: I0930 07:35:55.000002 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/cc4f6325-5a2a-4f15-8ee9-860617a9d7ce-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-gsddw\" (UID: \"cc4f6325-5a2a-4f15-8ee9-860617a9d7ce\") " pod="openshift-marketplace/marketplace-operator-79b997595-gsddw" Sep 30 07:35:55 crc kubenswrapper[4760]: I0930 07:35:55.000018 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/26bc4724-af08-4012-9656-d1cd06b533ef-srv-cert\") pod \"olm-operator-6b444d44fb-5lscq\" (UID: \"26bc4724-af08-4012-9656-d1cd06b533ef\") " 
pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-5lscq" Sep 30 07:35:55 crc kubenswrapper[4760]: I0930 07:35:55.000051 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/85f89a8e-5f37-458f-9896-fe3940cc68b6-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-mncn2\" (UID: \"85f89a8e-5f37-458f-9896-fe3940cc68b6\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-mncn2" Sep 30 07:35:55 crc kubenswrapper[4760]: I0930 07:35:55.000072 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/a96a9516-5f80-4391-a1f2-f4b7531e65fa-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-tlrrq\" (UID: \"a96a9516-5f80-4391-a1f2-f4b7531e65fa\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-tlrrq" Sep 30 07:35:55 crc kubenswrapper[4760]: I0930 07:35:55.000100 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d0f0f394-f395-4181-a0f9-afa9b7467013-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-75sr2\" (UID: \"d0f0f394-f395-4181-a0f9-afa9b7467013\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-75sr2" Sep 30 07:35:55 crc kubenswrapper[4760]: I0930 07:35:55.000120 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d8e4c5c1-b6db-4baf-8bbe-9fc39e1ed6ff-config\") pod \"kube-controller-manager-operator-78b949d7b-pznb4\" (UID: \"d8e4c5c1-b6db-4baf-8bbe-9fc39e1ed6ff\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-pznb4" Sep 30 07:35:55 crc kubenswrapper[4760]: I0930 07:35:55.000135 
4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/ea1dde7c-e014-4b78-b8e9-0fc5a0e161ac-signing-key\") pod \"service-ca-9c57cc56f-z9mdp\" (UID: \"ea1dde7c-e014-4b78-b8e9-0fc5a0e161ac\") " pod="openshift-service-ca/service-ca-9c57cc56f-z9mdp" Sep 30 07:35:55 crc kubenswrapper[4760]: I0930 07:35:55.000153 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/ea1dde7c-e014-4b78-b8e9-0fc5a0e161ac-signing-cabundle\") pod \"service-ca-9c57cc56f-z9mdp\" (UID: \"ea1dde7c-e014-4b78-b8e9-0fc5a0e161ac\") " pod="openshift-service-ca/service-ca-9c57cc56f-z9mdp" Sep 30 07:35:55 crc kubenswrapper[4760]: I0930 07:35:55.000154 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/2611e16f-2c0b-44b4-929b-21f16b1b2e4d-available-featuregates\") pod \"openshift-config-operator-7777fb866f-p96nl\" (UID: \"2611e16f-2c0b-44b4-929b-21f16b1b2e4d\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-p96nl" Sep 30 07:35:55 crc kubenswrapper[4760]: I0930 07:35:55.000175 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/08d362f3-5c04-45fe-9981-ada11b028f83-service-ca\") pod \"console-f9d7485db-ntgr2\" (UID: \"08d362f3-5c04-45fe-9981-ada11b028f83\") " pod="openshift-console/console-f9d7485db-ntgr2" Sep 30 07:35:55 crc kubenswrapper[4760]: I0930 07:35:55.000200 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xfkkd\" (UniqueName: \"kubernetes.io/projected/26bc4724-af08-4012-9656-d1cd06b533ef-kube-api-access-xfkkd\") pod \"olm-operator-6b444d44fb-5lscq\" (UID: \"26bc4724-af08-4012-9656-d1cd06b533ef\") " 
pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-5lscq" Sep 30 07:35:55 crc kubenswrapper[4760]: I0930 07:35:55.000449 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-n5tk5"] Sep 30 07:35:55 crc kubenswrapper[4760]: I0930 07:35:55.000687 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/08d362f3-5c04-45fe-9981-ada11b028f83-oauth-serving-cert\") pod \"console-f9d7485db-ntgr2\" (UID: \"08d362f3-5c04-45fe-9981-ada11b028f83\") " pod="openshift-console/console-f9d7485db-ntgr2" Sep 30 07:35:55 crc kubenswrapper[4760]: I0930 07:35:55.000759 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d0f0f394-f395-4181-a0f9-afa9b7467013-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-75sr2\" (UID: \"d0f0f394-f395-4181-a0f9-afa9b7467013\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-75sr2" Sep 30 07:35:55 crc kubenswrapper[4760]: I0930 07:35:55.001376 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/08d362f3-5c04-45fe-9981-ada11b028f83-trusted-ca-bundle\") pod \"console-f9d7485db-ntgr2\" (UID: \"08d362f3-5c04-45fe-9981-ada11b028f83\") " pod="openshift-console/console-f9d7485db-ntgr2" Sep 30 07:35:55 crc kubenswrapper[4760]: I0930 07:35:55.001442 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/08d362f3-5c04-45fe-9981-ada11b028f83-service-ca\") pod \"console-f9d7485db-ntgr2\" (UID: \"08d362f3-5c04-45fe-9981-ada11b028f83\") " pod="openshift-console/console-f9d7485db-ntgr2" Sep 30 07:35:55 crc kubenswrapper[4760]: I0930 07:35:55.001877 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" 
(UniqueName: \"kubernetes.io/configmap/09aeaa85-1a1d-426f-b7f2-611f67942f2c-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-dmqhx\" (UID: \"09aeaa85-1a1d-426f-b7f2-611f67942f2c\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-dmqhx" Sep 30 07:35:55 crc kubenswrapper[4760]: I0930 07:35:55.001959 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/08d362f3-5c04-45fe-9981-ada11b028f83-console-config\") pod \"console-f9d7485db-ntgr2\" (UID: \"08d362f3-5c04-45fe-9981-ada11b028f83\") " pod="openshift-console/console-f9d7485db-ntgr2" Sep 30 07:35:55 crc kubenswrapper[4760]: I0930 07:35:55.002014 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-zwnlf"] Sep 30 07:35:55 crc kubenswrapper[4760]: I0930 07:35:55.002379 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-zrlmg"] Sep 30 07:35:55 crc kubenswrapper[4760]: I0930 07:35:55.002703 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/08d362f3-5c04-45fe-9981-ada11b028f83-console-oauth-config\") pod \"console-f9d7485db-ntgr2\" (UID: \"08d362f3-5c04-45fe-9981-ada11b028f83\") " pod="openshift-console/console-f9d7485db-ntgr2" Sep 30 07:35:55 crc kubenswrapper[4760]: I0930 07:35:55.002757 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d8e4c5c1-b6db-4baf-8bbe-9fc39e1ed6ff-config\") pod \"kube-controller-manager-operator-78b949d7b-pznb4\" (UID: \"d8e4c5c1-b6db-4baf-8bbe-9fc39e1ed6ff\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-pznb4" Sep 30 07:35:55 crc kubenswrapper[4760]: I0930 07:35:55.002791 4760 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"kube-api-access-tbkv8\" (UniqueName: \"kubernetes.io/projected/2611e16f-2c0b-44b4-929b-21f16b1b2e4d-kube-api-access-tbkv8\") pod \"openshift-config-operator-7777fb866f-p96nl\" (UID: \"2611e16f-2c0b-44b4-929b-21f16b1b2e4d\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-p96nl" Sep 30 07:35:55 crc kubenswrapper[4760]: I0930 07:35:55.003183 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09aeaa85-1a1d-426f-b7f2-611f67942f2c-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-dmqhx\" (UID: \"09aeaa85-1a1d-426f-b7f2-611f67942f2c\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-dmqhx" Sep 30 07:35:55 crc kubenswrapper[4760]: I0930 07:35:55.003249 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/86638afb-4930-4496-a00d-8f243be3ab33-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-cfnkh\" (UID: \"86638afb-4930-4496-a00d-8f243be3ab33\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-cfnkh" Sep 30 07:35:55 crc kubenswrapper[4760]: I0930 07:35:55.003285 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2ksf5\" (UniqueName: \"kubernetes.io/projected/4c51dda7-332e-497f-96ed-932d5349ee59-kube-api-access-2ksf5\") pod \"cluster-image-registry-operator-dc59b4c8b-fwcqc\" (UID: \"4c51dda7-332e-497f-96ed-932d5349ee59\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-fwcqc" Sep 30 07:35:55 crc kubenswrapper[4760]: I0930 07:35:55.003442 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xbt5m\" (UniqueName: \"kubernetes.io/projected/86638afb-4930-4496-a00d-8f243be3ab33-kube-api-access-xbt5m\") pod 
\"cluster-samples-operator-665b6dd947-cfnkh\" (UID: \"86638afb-4930-4496-a00d-8f243be3ab33\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-cfnkh" Sep 30 07:35:55 crc kubenswrapper[4760]: I0930 07:35:55.003488 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/cc4f6325-5a2a-4f15-8ee9-860617a9d7ce-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-gsddw\" (UID: \"cc4f6325-5a2a-4f15-8ee9-860617a9d7ce\") " pod="openshift-marketplace/marketplace-operator-79b997595-gsddw" Sep 30 07:35:55 crc kubenswrapper[4760]: I0930 07:35:55.003559 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/08d362f3-5c04-45fe-9981-ada11b028f83-console-serving-cert\") pod \"console-f9d7485db-ntgr2\" (UID: \"08d362f3-5c04-45fe-9981-ada11b028f83\") " pod="openshift-console/console-f9d7485db-ntgr2" Sep 30 07:35:55 crc kubenswrapper[4760]: I0930 07:35:55.003644 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c7s2x\" (UniqueName: \"kubernetes.io/projected/63503165-1bff-42d3-99f4-2af2d7f490ec-kube-api-access-c7s2x\") pod \"service-ca-operator-777779d784-mqkwn\" (UID: \"63503165-1bff-42d3-99f4-2af2d7f490ec\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-mqkwn" Sep 30 07:35:55 crc kubenswrapper[4760]: I0930 07:35:55.003700 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/4c51dda7-332e-497f-96ed-932d5349ee59-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-fwcqc\" (UID: \"4c51dda7-332e-497f-96ed-932d5349ee59\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-fwcqc" Sep 30 07:35:55 crc kubenswrapper[4760]: I0930 07:35:55.003905 4760 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d0f0f394-f395-4181-a0f9-afa9b7467013-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-75sr2\" (UID: \"d0f0f394-f395-4181-a0f9-afa9b7467013\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-75sr2" Sep 30 07:35:55 crc kubenswrapper[4760]: I0930 07:35:55.003962 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/a96a9516-5f80-4391-a1f2-f4b7531e65fa-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-tlrrq\" (UID: \"a96a9516-5f80-4391-a1f2-f4b7531e65fa\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-tlrrq" Sep 30 07:35:55 crc kubenswrapper[4760]: I0930 07:35:55.004211 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-76w2l"] Sep 30 07:35:55 crc kubenswrapper[4760]: I0930 07:35:55.004344 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/cc4f6325-5a2a-4f15-8ee9-860617a9d7ce-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-gsddw\" (UID: \"cc4f6325-5a2a-4f15-8ee9-860617a9d7ce\") " pod="openshift-marketplace/marketplace-operator-79b997595-gsddw" Sep 30 07:35:55 crc kubenswrapper[4760]: I0930 07:35:55.004569 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d8e4c5c1-b6db-4baf-8bbe-9fc39e1ed6ff-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-pznb4\" (UID: \"d8e4c5c1-b6db-4baf-8bbe-9fc39e1ed6ff\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-pznb4" Sep 30 07:35:55 crc kubenswrapper[4760]: I0930 07:35:55.005986 4760 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2611e16f-2c0b-44b4-929b-21f16b1b2e4d-serving-cert\") pod \"openshift-config-operator-7777fb866f-p96nl\" (UID: \"2611e16f-2c0b-44b4-929b-21f16b1b2e4d\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-p96nl" Sep 30 07:35:55 crc kubenswrapper[4760]: I0930 07:35:55.006888 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-server-wtpqg"] Sep 30 07:35:55 crc kubenswrapper[4760]: I0930 07:35:55.007905 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-wtpqg" Sep 30 07:35:55 crc kubenswrapper[4760]: I0930 07:35:55.013279 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09aeaa85-1a1d-426f-b7f2-611f67942f2c-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-dmqhx\" (UID: \"09aeaa85-1a1d-426f-b7f2-611f67942f2c\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-dmqhx" Sep 30 07:35:55 crc kubenswrapper[4760]: I0930 07:35:55.015021 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/86638afb-4930-4496-a00d-8f243be3ab33-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-cfnkh\" (UID: \"86638afb-4930-4496-a00d-8f243be3ab33\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-cfnkh" Sep 30 07:35:55 crc kubenswrapper[4760]: I0930 07:35:55.015252 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/cc4f6325-5a2a-4f15-8ee9-860617a9d7ce-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-gsddw\" (UID: \"cc4f6325-5a2a-4f15-8ee9-860617a9d7ce\") " 
pod="openshift-marketplace/marketplace-operator-79b997595-gsddw" Sep 30 07:35:55 crc kubenswrapper[4760]: I0930 07:35:55.017804 4760 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Sep 30 07:35:55 crc kubenswrapper[4760]: I0930 07:35:55.028898 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx" Sep 30 07:35:55 crc kubenswrapper[4760]: I0930 07:35:55.050570 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Sep 30 07:35:55 crc kubenswrapper[4760]: I0930 07:35:55.089485 4760 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Sep 30 07:35:55 crc kubenswrapper[4760]: I0930 07:35:55.104295 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/85f89a8e-5f37-458f-9896-fe3940cc68b6-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-mncn2\" (UID: \"85f89a8e-5f37-458f-9896-fe3940cc68b6\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-mncn2" Sep 30 07:35:55 crc kubenswrapper[4760]: I0930 07:35:55.104410 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-njj95\" (UniqueName: \"kubernetes.io/projected/ea1dde7c-e014-4b78-b8e9-0fc5a0e161ac-kube-api-access-njj95\") pod \"service-ca-9c57cc56f-z9mdp\" (UID: \"ea1dde7c-e014-4b78-b8e9-0fc5a0e161ac\") " pod="openshift-service-ca/service-ca-9c57cc56f-z9mdp" Sep 30 07:35:55 crc kubenswrapper[4760]: I0930 07:35:55.104438 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/26bc4724-af08-4012-9656-d1cd06b533ef-profile-collector-cert\") pod \"olm-operator-6b444d44fb-5lscq\" (UID: 
\"26bc4724-af08-4012-9656-d1cd06b533ef\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-5lscq" Sep 30 07:35:55 crc kubenswrapper[4760]: I0930 07:35:55.104464 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/63503165-1bff-42d3-99f4-2af2d7f490ec-config\") pod \"service-ca-operator-777779d784-mqkwn\" (UID: \"63503165-1bff-42d3-99f4-2af2d7f490ec\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-mqkwn" Sep 30 07:35:55 crc kubenswrapper[4760]: I0930 07:35:55.104480 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/85f89a8e-5f37-458f-9896-fe3940cc68b6-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-mncn2\" (UID: \"85f89a8e-5f37-458f-9896-fe3940cc68b6\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-mncn2" Sep 30 07:35:55 crc kubenswrapper[4760]: I0930 07:35:55.104497 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q45c6\" (UniqueName: \"kubernetes.io/projected/dd4856c0-fc17-49b2-b37b-b0414e6a2f48-kube-api-access-q45c6\") pod \"migrator-59844c95c7-s7mth\" (UID: \"dd4856c0-fc17-49b2-b37b-b0414e6a2f48\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-s7mth" Sep 30 07:35:55 crc kubenswrapper[4760]: I0930 07:35:55.104543 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/63503165-1bff-42d3-99f4-2af2d7f490ec-serving-cert\") pod \"service-ca-operator-777779d784-mqkwn\" (UID: \"63503165-1bff-42d3-99f4-2af2d7f490ec\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-mqkwn" Sep 30 07:35:55 crc kubenswrapper[4760]: I0930 07:35:55.104560 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/4c51dda7-332e-497f-96ed-932d5349ee59-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-fwcqc\" (UID: \"4c51dda7-332e-497f-96ed-932d5349ee59\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-fwcqc" Sep 30 07:35:55 crc kubenswrapper[4760]: I0930 07:35:55.104581 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/4c51dda7-332e-497f-96ed-932d5349ee59-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-fwcqc\" (UID: \"4c51dda7-332e-497f-96ed-932d5349ee59\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-fwcqc" Sep 30 07:35:55 crc kubenswrapper[4760]: I0930 07:35:55.104599 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/26bc4724-af08-4012-9656-d1cd06b533ef-srv-cert\") pod \"olm-operator-6b444d44fb-5lscq\" (UID: \"26bc4724-af08-4012-9656-d1cd06b533ef\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-5lscq" Sep 30 07:35:55 crc kubenswrapper[4760]: I0930 07:35:55.104631 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/85f89a8e-5f37-458f-9896-fe3940cc68b6-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-mncn2\" (UID: \"85f89a8e-5f37-458f-9896-fe3940cc68b6\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-mncn2" Sep 30 07:35:55 crc kubenswrapper[4760]: I0930 07:35:55.104661 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/ea1dde7c-e014-4b78-b8e9-0fc5a0e161ac-signing-key\") pod \"service-ca-9c57cc56f-z9mdp\" (UID: \"ea1dde7c-e014-4b78-b8e9-0fc5a0e161ac\") " pod="openshift-service-ca/service-ca-9c57cc56f-z9mdp" Sep 
30 07:35:55 crc kubenswrapper[4760]: I0930 07:35:55.104675 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/ea1dde7c-e014-4b78-b8e9-0fc5a0e161ac-signing-cabundle\") pod \"service-ca-9c57cc56f-z9mdp\" (UID: \"ea1dde7c-e014-4b78-b8e9-0fc5a0e161ac\") " pod="openshift-service-ca/service-ca-9c57cc56f-z9mdp" Sep 30 07:35:55 crc kubenswrapper[4760]: I0930 07:35:55.104700 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xfkkd\" (UniqueName: \"kubernetes.io/projected/26bc4724-af08-4012-9656-d1cd06b533ef-kube-api-access-xfkkd\") pod \"olm-operator-6b444d44fb-5lscq\" (UID: \"26bc4724-af08-4012-9656-d1cd06b533ef\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-5lscq" Sep 30 07:35:55 crc kubenswrapper[4760]: I0930 07:35:55.104736 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2ksf5\" (UniqueName: \"kubernetes.io/projected/4c51dda7-332e-497f-96ed-932d5349ee59-kube-api-access-2ksf5\") pod \"cluster-image-registry-operator-dc59b4c8b-fwcqc\" (UID: \"4c51dda7-332e-497f-96ed-932d5349ee59\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-fwcqc" Sep 30 07:35:55 crc kubenswrapper[4760]: I0930 07:35:55.104771 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c7s2x\" (UniqueName: \"kubernetes.io/projected/63503165-1bff-42d3-99f4-2af2d7f490ec-kube-api-access-c7s2x\") pod \"service-ca-operator-777779d784-mqkwn\" (UID: \"63503165-1bff-42d3-99f4-2af2d7f490ec\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-mqkwn" Sep 30 07:35:55 crc kubenswrapper[4760]: I0930 07:35:55.104793 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/4c51dda7-332e-497f-96ed-932d5349ee59-trusted-ca\") pod 
\"cluster-image-registry-operator-dc59b4c8b-fwcqc\" (UID: \"4c51dda7-332e-497f-96ed-932d5349ee59\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-fwcqc" Sep 30 07:35:55 crc kubenswrapper[4760]: I0930 07:35:55.106142 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/4c51dda7-332e-497f-96ed-932d5349ee59-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-fwcqc\" (UID: \"4c51dda7-332e-497f-96ed-932d5349ee59\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-fwcqc" Sep 30 07:35:55 crc kubenswrapper[4760]: I0930 07:35:55.107154 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/4c51dda7-332e-497f-96ed-932d5349ee59-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-fwcqc\" (UID: \"4c51dda7-332e-497f-96ed-932d5349ee59\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-fwcqc" Sep 30 07:35:55 crc kubenswrapper[4760]: I0930 07:35:55.109806 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5" Sep 30 07:35:55 crc kubenswrapper[4760]: I0930 07:35:55.128901 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Sep 30 07:35:55 crc kubenswrapper[4760]: I0930 07:35:55.149421 4760 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Sep 30 07:35:55 crc kubenswrapper[4760]: I0930 07:35:55.169817 4760 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Sep 30 07:35:55 crc kubenswrapper[4760]: I0930 07:35:55.189886 4760 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg" Sep 30 07:35:55 crc kubenswrapper[4760]: I0930 07:35:55.209254 4760 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Sep 30 07:35:55 crc kubenswrapper[4760]: I0930 07:35:55.229089 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr" Sep 30 07:35:55 crc kubenswrapper[4760]: I0930 07:35:55.250047 4760 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Sep 30 07:35:55 crc kubenswrapper[4760]: I0930 07:35:55.269630 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Sep 30 07:35:55 crc kubenswrapper[4760]: I0930 07:35:55.289455 4760 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Sep 30 07:35:55 crc kubenswrapper[4760]: I0930 07:35:55.343249 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2cw6g\" (UniqueName: \"kubernetes.io/projected/2fb43f32-6ad4-4450-8a05-80570020d5e8-kube-api-access-2cw6g\") pod \"machine-api-operator-5694c8668f-4lv9x\" (UID: \"2fb43f32-6ad4-4450-8a05-80570020d5e8\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-4lv9x" Sep 30 07:35:55 crc kubenswrapper[4760]: I0930 07:35:55.353977 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n62bz\" (UniqueName: \"kubernetes.io/projected/ff7ef449-b6c7-4e55-886d-b66dd8327e7b-kube-api-access-n62bz\") pod \"route-controller-manager-6576b87f9c-m8vl7\" (UID: \"ff7ef449-b6c7-4e55-886d-b66dd8327e7b\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-m8vl7" Sep 30 07:35:55 crc 
kubenswrapper[4760]: I0930 07:35:55.373660 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ph5pw\" (UniqueName: \"kubernetes.io/projected/2bcd213c-83e6-4a23-9542-727b1254b17d-kube-api-access-ph5pw\") pod \"machine-approver-56656f9798-lgt64\" (UID: \"2bcd213c-83e6-4a23-9542-727b1254b17d\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-lgt64" Sep 30 07:35:55 crc kubenswrapper[4760]: I0930 07:35:55.384923 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xtgwl\" (UniqueName: \"kubernetes.io/projected/f2f25243-2a0b-498f-8de6-8b0a21c72c49-kube-api-access-xtgwl\") pod \"oauth-openshift-558db77b4-77ttc\" (UID: \"f2f25243-2a0b-498f-8de6-8b0a21c72c49\") " pod="openshift-authentication/oauth-openshift-558db77b4-77ttc" Sep 30 07:35:55 crc kubenswrapper[4760]: I0930 07:35:55.389717 4760 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Sep 30 07:35:55 crc kubenswrapper[4760]: I0930 07:35:55.406568 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-4lv9x" Sep 30 07:35:55 crc kubenswrapper[4760]: I0930 07:35:55.409201 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn" Sep 30 07:35:55 crc kubenswrapper[4760]: I0930 07:35:55.429886 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Sep 30 07:35:55 crc kubenswrapper[4760]: I0930 07:35:55.437232 4760 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-m8vl7" Sep 30 07:35:55 crc kubenswrapper[4760]: I0930 07:35:55.449910 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Sep 30 07:35:55 crc kubenswrapper[4760]: I0930 07:35:55.470138 4760 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Sep 30 07:35:55 crc kubenswrapper[4760]: I0930 07:35:55.490970 4760 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Sep 30 07:35:55 crc kubenswrapper[4760]: I0930 07:35:55.494058 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-lgt64" Sep 30 07:35:55 crc kubenswrapper[4760]: I0930 07:35:55.511514 4760 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Sep 30 07:35:55 crc kubenswrapper[4760]: W0930 07:35:55.517031 4760 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2bcd213c_83e6_4a23_9542_727b1254b17d.slice/crio-cc0622426cf1bc3c85c9ca159016d6f5d7b44f66c72c2478bda0b646e4416f02 WatchSource:0}: Error finding container cc0622426cf1bc3c85c9ca159016d6f5d7b44f66c72c2478bda0b646e4416f02: Status 404 returned error can't find the container with id cc0622426cf1bc3c85c9ca159016d6f5d7b44f66c72c2478bda0b646e4416f02 Sep 30 07:35:55 crc kubenswrapper[4760]: I0930 07:35:55.530544 4760 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Sep 30 07:35:55 crc kubenswrapper[4760]: I0930 07:35:55.554261 4760 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-77ttc" Sep 30 07:35:55 crc kubenswrapper[4760]: I0930 07:35:55.581610 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mvmm2\" (UniqueName: \"kubernetes.io/projected/3f80e44c-0738-48d0-b6e5-783d696c5ec4-kube-api-access-mvmm2\") pod \"authentication-operator-69f744f599-q5spr\" (UID: \"3f80e44c-0738-48d0-b6e5-783d696c5ec4\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-q5spr" Sep 30 07:35:55 crc kubenswrapper[4760]: I0930 07:35:55.595442 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-djkdl\" (UniqueName: \"kubernetes.io/projected/fa5b63b3-2bc6-496c-8841-471e2f43021c-kube-api-access-djkdl\") pod \"controller-manager-879f6c89f-6cvdd\" (UID: \"fa5b63b3-2bc6-496c-8841-471e2f43021c\") " pod="openshift-controller-manager/controller-manager-879f6c89f-6cvdd" Sep 30 07:35:55 crc kubenswrapper[4760]: I0930 07:35:55.614906 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4f56f\" (UniqueName: \"kubernetes.io/projected/179c4bc2-b28d-445b-98f2-aa307d57cd9f-kube-api-access-4f56f\") pod \"apiserver-76f77b778f-bhzlk\" (UID: \"179c4bc2-b28d-445b-98f2-aa307d57cd9f\") " pod="openshift-apiserver/apiserver-76f77b778f-bhzlk" Sep 30 07:35:55 crc kubenswrapper[4760]: I0930 07:35:55.626022 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9cmtf\" (UniqueName: \"kubernetes.io/projected/7c05864f-63f6-4fdf-9207-0d63dd89fc49-kube-api-access-9cmtf\") pod \"apiserver-7bbb656c7d-qc22x\" (UID: \"7c05864f-63f6-4fdf-9207-0d63dd89fc49\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-qc22x" Sep 30 07:35:55 crc kubenswrapper[4760]: I0930 07:35:55.648933 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c8vvk\" (UniqueName: 
\"kubernetes.io/projected/cbe4e2d5-6999-435b-b0af-d585a0ef1f5f-kube-api-access-c8vvk\") pod \"openshift-apiserver-operator-796bbdcf4f-ntj6x\" (UID: \"cbe4e2d5-6999-435b-b0af-d585a0ef1f5f\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-ntj6x" Sep 30 07:35:55 crc kubenswrapper[4760]: I0930 07:35:55.649234 4760 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images" Sep 30 07:35:55 crc kubenswrapper[4760]: I0930 07:35:55.652523 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-bhzlk" Sep 30 07:35:55 crc kubenswrapper[4760]: I0930 07:35:55.669519 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87" Sep 30 07:35:55 crc kubenswrapper[4760]: I0930 07:35:55.670723 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-6cvdd" Sep 30 07:35:55 crc kubenswrapper[4760]: I0930 07:35:55.685753 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-m8vl7"] Sep 30 07:35:55 crc kubenswrapper[4760]: I0930 07:35:55.695393 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Sep 30 07:35:55 crc kubenswrapper[4760]: I0930 07:35:55.712138 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Sep 30 07:35:55 crc kubenswrapper[4760]: W0930 07:35:55.725098 4760 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podff7ef449_b6c7_4e55_886d_b66dd8327e7b.slice/crio-f861ceb180324f3be1769729c36af6fbc0e81f22726b4f24d612254489bc51d0 WatchSource:0}: Error 
finding container f861ceb180324f3be1769729c36af6fbc0e81f22726b4f24d612254489bc51d0: Status 404 returned error can't find the container with id f861ceb180324f3be1769729c36af6fbc0e81f22726b4f24d612254489bc51d0 Sep 30 07:35:55 crc kubenswrapper[4760]: I0930 07:35:55.736637 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-4lv9x"] Sep 30 07:35:55 crc kubenswrapper[4760]: I0930 07:35:55.737803 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk" Sep 30 07:35:55 crc kubenswrapper[4760]: I0930 07:35:55.749780 4760 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Sep 30 07:35:55 crc kubenswrapper[4760]: I0930 07:35:55.752888 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-qc22x" Sep 30 07:35:55 crc kubenswrapper[4760]: I0930 07:35:55.759801 4760 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-q5spr" Sep 30 07:35:55 crc kubenswrapper[4760]: W0930 07:35:55.769903 4760 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2fb43f32_6ad4_4450_8a05_80570020d5e8.slice/crio-cb9b9037753cc166dfaf010949a149889e643b06c66a573b4a410aceb59c2eef WatchSource:0}: Error finding container cb9b9037753cc166dfaf010949a149889e643b06c66a573b4a410aceb59c2eef: Status 404 returned error can't find the container with id cb9b9037753cc166dfaf010949a149889e643b06c66a573b4a410aceb59c2eef Sep 30 07:35:55 crc kubenswrapper[4760]: I0930 07:35:55.771076 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert" Sep 30 07:35:55 crc kubenswrapper[4760]: I0930 07:35:55.771282 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-ntj6x" Sep 30 07:35:55 crc kubenswrapper[4760]: I0930 07:35:55.778969 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/26bc4724-af08-4012-9656-d1cd06b533ef-profile-collector-cert\") pod \"olm-operator-6b444d44fb-5lscq\" (UID: \"26bc4724-af08-4012-9656-d1cd06b533ef\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-5lscq" Sep 30 07:35:55 crc kubenswrapper[4760]: I0930 07:35:55.789495 4760 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Sep 30 07:35:55 crc kubenswrapper[4760]: I0930 07:35:55.810460 4760 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Sep 30 07:35:55 crc kubenswrapper[4760]: I0930 07:35:55.815569 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/ea1dde7c-e014-4b78-b8e9-0fc5a0e161ac-signing-cabundle\") pod \"service-ca-9c57cc56f-z9mdp\" (UID: \"ea1dde7c-e014-4b78-b8e9-0fc5a0e161ac\") " pod="openshift-service-ca/service-ca-9c57cc56f-z9mdp" Sep 30 07:35:55 crc kubenswrapper[4760]: I0930 07:35:55.819616 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-77ttc"] Sep 30 07:35:55 crc kubenswrapper[4760]: W0930 07:35:55.826709 4760 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf2f25243_2a0b_498f_8de6_8b0a21c72c49.slice/crio-e36a05fee101b4f8fabf93553fe763dedb5db45b5cc415b145b5e1ab8c7eea5f WatchSource:0}: Error finding container e36a05fee101b4f8fabf93553fe763dedb5db45b5cc415b145b5e1ab8c7eea5f: Status 404 returned error can't find the container with id e36a05fee101b4f8fabf93553fe763dedb5db45b5cc415b145b5e1ab8c7eea5f Sep 30 07:35:55 crc kubenswrapper[4760]: I0930 07:35:55.829742 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Sep 30 07:35:55 crc kubenswrapper[4760]: I0930 07:35:55.840601 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/ea1dde7c-e014-4b78-b8e9-0fc5a0e161ac-signing-key\") pod \"service-ca-9c57cc56f-z9mdp\" (UID: \"ea1dde7c-e014-4b78-b8e9-0fc5a0e161ac\") " pod="openshift-service-ca/service-ca-9c57cc56f-z9mdp" Sep 30 07:35:55 crc kubenswrapper[4760]: I0930 07:35:55.857631 4760 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Sep 30 07:35:55 crc kubenswrapper[4760]: I0930 07:35:55.876998 4760 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Sep 30 07:35:55 crc kubenswrapper[4760]: I0930 07:35:55.893380 4760 reflector.go:368] Caches populated for *v1.Secret 
from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c" Sep 30 07:35:55 crc kubenswrapper[4760]: I0930 07:35:55.907906 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-bhzlk"] Sep 30 07:35:55 crc kubenswrapper[4760]: I0930 07:35:55.910160 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx" Sep 30 07:35:55 crc kubenswrapper[4760]: I0930 07:35:55.932806 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-m8vl7" event={"ID":"ff7ef449-b6c7-4e55-886d-b66dd8327e7b","Type":"ContainerStarted","Data":"f861ceb180324f3be1769729c36af6fbc0e81f22726b4f24d612254489bc51d0"} Sep 30 07:35:55 crc kubenswrapper[4760]: I0930 07:35:55.935234 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx" Sep 30 07:35:55 crc kubenswrapper[4760]: I0930 07:35:55.937420 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-77ttc" event={"ID":"f2f25243-2a0b-498f-8de6-8b0a21c72c49","Type":"ContainerStarted","Data":"e36a05fee101b4f8fabf93553fe763dedb5db45b5cc415b145b5e1ab8c7eea5f"} Sep 30 07:35:55 crc kubenswrapper[4760]: I0930 07:35:55.939722 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-lgt64" event={"ID":"2bcd213c-83e6-4a23-9542-727b1254b17d","Type":"ContainerStarted","Data":"6110a1b93ba6577b42204e73d3d6003883a3f2a8efc5e0887db5ee3b74b7d8f8"} Sep 30 07:35:55 crc kubenswrapper[4760]: I0930 07:35:55.939752 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-lgt64" event={"ID":"2bcd213c-83e6-4a23-9542-727b1254b17d","Type":"ContainerStarted","Data":"cc0622426cf1bc3c85c9ca159016d6f5d7b44f66c72c2478bda0b646e4416f02"} 
Sep 30 07:35:55 crc kubenswrapper[4760]: I0930 07:35:55.948131 4760 request.go:700] Waited for 1.015999203s due to client-side throttling, not priority and fairness, request: GET:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-machine-config-operator/secrets?fieldSelector=metadata.name%3Dmcc-proxy-tls&limit=500&resourceVersion=0 Sep 30 07:35:55 crc kubenswrapper[4760]: I0930 07:35:55.949349 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Sep 30 07:35:55 crc kubenswrapper[4760]: I0930 07:35:55.955934 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-4lv9x" event={"ID":"2fb43f32-6ad4-4450-8a05-80570020d5e8","Type":"ContainerStarted","Data":"255a2865b5bd2b26c838d970a369b778e34223a62cd7ab8f5ce578fd1218cf9b"} Sep 30 07:35:55 crc kubenswrapper[4760]: I0930 07:35:55.955988 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-4lv9x" event={"ID":"2fb43f32-6ad4-4450-8a05-80570020d5e8","Type":"ContainerStarted","Data":"cb9b9037753cc166dfaf010949a149889e643b06c66a573b4a410aceb59c2eef"} Sep 30 07:35:55 crc kubenswrapper[4760]: I0930 07:35:55.956003 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-6cvdd"] Sep 30 07:35:55 crc kubenswrapper[4760]: W0930 07:35:55.966391 4760 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod179c4bc2_b28d_445b_98f2_aa307d57cd9f.slice/crio-f1545e716e6f8979fec6823b868c8453cf9f607c016645da029c4d9a91296fcc WatchSource:0}: Error finding container f1545e716e6f8979fec6823b868c8453cf9f607c016645da029c4d9a91296fcc: Status 404 returned error can't find the container with id f1545e716e6f8979fec6823b868c8453cf9f607c016645da029c4d9a91296fcc Sep 30 07:35:55 crc kubenswrapper[4760]: I0930 07:35:55.970025 4760 
reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt" Sep 30 07:35:55 crc kubenswrapper[4760]: W0930 07:35:55.979319 4760 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfa5b63b3_2bc6_496c_8841_471e2f43021c.slice/crio-f431cc1791f296a50a8a526b7bf95cb78b3ebb7c315c533a717ccd1e25ba54e4 WatchSource:0}: Error finding container f431cc1791f296a50a8a526b7bf95cb78b3ebb7c315c533a717ccd1e25ba54e4: Status 404 returned error can't find the container with id f431cc1791f296a50a8a526b7bf95cb78b3ebb7c315c533a717ccd1e25ba54e4 Sep 30 07:35:55 crc kubenswrapper[4760]: I0930 07:35:55.989176 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr" Sep 30 07:35:56 crc kubenswrapper[4760]: I0930 07:35:56.010150 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert" Sep 30 07:35:56 crc kubenswrapper[4760]: I0930 07:35:56.028770 4760 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt" Sep 30 07:35:56 crc kubenswrapper[4760]: I0930 07:35:56.040623 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-qc22x"] Sep 30 07:35:56 crc kubenswrapper[4760]: I0930 07:35:56.050111 4760 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" Sep 30 07:35:56 crc kubenswrapper[4760]: W0930 07:35:56.061416 4760 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7c05864f_63f6_4fdf_9207_0d63dd89fc49.slice/crio-c7462bdb83eee92a2a47d91fe048ff1e9c4ddd2a4f7a5a7650712d3a1b4e5784 WatchSource:0}: Error finding container c7462bdb83eee92a2a47d91fe048ff1e9c4ddd2a4f7a5a7650712d3a1b4e5784: Status 
404 returned error can't find the container with id c7462bdb83eee92a2a47d91fe048ff1e9c4ddd2a4f7a5a7650712d3a1b4e5784 Sep 30 07:35:56 crc kubenswrapper[4760]: I0930 07:35:56.076127 4760 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Sep 30 07:35:56 crc kubenswrapper[4760]: I0930 07:35:56.090175 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Sep 30 07:35:56 crc kubenswrapper[4760]: I0930 07:35:56.096531 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-q5spr"] Sep 30 07:35:56 crc kubenswrapper[4760]: W0930 07:35:56.101617 4760 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3f80e44c_0738_48d0_b6e5_783d696c5ec4.slice/crio-67bd7b7c0c2318eeb7d5e87b44fdbbbdc43d53b84fdce00bf06d1021a14e5913 WatchSource:0}: Error finding container 67bd7b7c0c2318eeb7d5e87b44fdbbbdc43d53b84fdce00bf06d1021a14e5913: Status 404 returned error can't find the container with id 67bd7b7c0c2318eeb7d5e87b44fdbbbdc43d53b84fdce00bf06d1021a14e5913 Sep 30 07:35:56 crc kubenswrapper[4760]: E0930 07:35:56.106507 4760 secret.go:188] Couldn't get secret openshift-kube-scheduler-operator/kube-scheduler-operator-serving-cert: failed to sync secret cache: timed out waiting for the condition Sep 30 07:35:56 crc kubenswrapper[4760]: E0930 07:35:56.106547 4760 configmap.go:193] Couldn't get configMap openshift-service-ca-operator/service-ca-operator-config: failed to sync configmap cache: timed out waiting for the condition Sep 30 07:35:56 crc kubenswrapper[4760]: E0930 07:35:56.106575 4760 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/85f89a8e-5f37-458f-9896-fe3940cc68b6-serving-cert podName:85f89a8e-5f37-458f-9896-fe3940cc68b6 nodeName:}" failed. 
No retries permitted until 2025-09-30 07:35:56.606556594 +0000 UTC m=+142.249463006 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/85f89a8e-5f37-458f-9896-fe3940cc68b6-serving-cert") pod "openshift-kube-scheduler-operator-5fdd9b5758-mncn2" (UID: "85f89a8e-5f37-458f-9896-fe3940cc68b6") : failed to sync secret cache: timed out waiting for the condition Sep 30 07:35:56 crc kubenswrapper[4760]: E0930 07:35:56.106628 4760 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/63503165-1bff-42d3-99f4-2af2d7f490ec-config podName:63503165-1bff-42d3-99f4-2af2d7f490ec nodeName:}" failed. No retries permitted until 2025-09-30 07:35:56.606605345 +0000 UTC m=+142.249511837 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "config" (UniqueName: "kubernetes.io/configmap/63503165-1bff-42d3-99f4-2af2d7f490ec-config") pod "service-ca-operator-777779d784-mqkwn" (UID: "63503165-1bff-42d3-99f4-2af2d7f490ec") : failed to sync configmap cache: timed out waiting for the condition Sep 30 07:35:56 crc kubenswrapper[4760]: E0930 07:35:56.106785 4760 configmap.go:193] Couldn't get configMap openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-config: failed to sync configmap cache: timed out waiting for the condition Sep 30 07:35:56 crc kubenswrapper[4760]: E0930 07:35:56.106846 4760 secret.go:188] Couldn't get secret openshift-operator-lifecycle-manager/olm-operator-serving-cert: failed to sync secret cache: timed out waiting for the condition Sep 30 07:35:56 crc kubenswrapper[4760]: E0930 07:35:56.106997 4760 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/85f89a8e-5f37-458f-9896-fe3940cc68b6-config podName:85f89a8e-5f37-458f-9896-fe3940cc68b6 nodeName:}" failed. No retries permitted until 2025-09-30 07:35:56.606804281 +0000 UTC m=+142.249710693 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "config" (UniqueName: "kubernetes.io/configmap/85f89a8e-5f37-458f-9896-fe3940cc68b6-config") pod "openshift-kube-scheduler-operator-5fdd9b5758-mncn2" (UID: "85f89a8e-5f37-458f-9896-fe3940cc68b6") : failed to sync configmap cache: timed out waiting for the condition Sep 30 07:35:56 crc kubenswrapper[4760]: E0930 07:35:56.107024 4760 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/26bc4724-af08-4012-9656-d1cd06b533ef-srv-cert podName:26bc4724-af08-4012-9656-d1cd06b533ef nodeName:}" failed. No retries permitted until 2025-09-30 07:35:56.607016317 +0000 UTC m=+142.249922729 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "srv-cert" (UniqueName: "kubernetes.io/secret/26bc4724-af08-4012-9656-d1cd06b533ef-srv-cert") pod "olm-operator-6b444d44fb-5lscq" (UID: "26bc4724-af08-4012-9656-d1cd06b533ef") : failed to sync secret cache: timed out waiting for the condition Sep 30 07:35:56 crc kubenswrapper[4760]: E0930 07:35:56.107142 4760 secret.go:188] Couldn't get secret openshift-service-ca-operator/serving-cert: failed to sync secret cache: timed out waiting for the condition Sep 30 07:35:56 crc kubenswrapper[4760]: E0930 07:35:56.107255 4760 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/63503165-1bff-42d3-99f4-2af2d7f490ec-serving-cert podName:63503165-1bff-42d3-99f4-2af2d7f490ec nodeName:}" failed. No retries permitted until 2025-09-30 07:35:56.607232453 +0000 UTC m=+142.250138965 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/63503165-1bff-42d3-99f4-2af2d7f490ec-serving-cert") pod "service-ca-operator-777779d784-mqkwn" (UID: "63503165-1bff-42d3-99f4-2af2d7f490ec") : failed to sync secret cache: timed out waiting for the condition Sep 30 07:35:56 crc kubenswrapper[4760]: I0930 07:35:56.109096 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk" Sep 30 07:35:56 crc kubenswrapper[4760]: I0930 07:35:56.133562 4760 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Sep 30 07:35:56 crc kubenswrapper[4760]: I0930 07:35:56.145375 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-ntj6x"] Sep 30 07:35:56 crc kubenswrapper[4760]: I0930 07:35:56.151732 4760 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Sep 30 07:35:56 crc kubenswrapper[4760]: I0930 07:35:56.168947 4760 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Sep 30 07:35:56 crc kubenswrapper[4760]: I0930 07:35:56.189842 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf" Sep 30 07:35:56 crc kubenswrapper[4760]: I0930 07:35:56.209332 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Sep 30 07:35:56 crc kubenswrapper[4760]: I0930 07:35:56.232936 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Sep 30 07:35:56 crc kubenswrapper[4760]: I0930 07:35:56.250290 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd" Sep 30 07:35:56 
crc kubenswrapper[4760]: I0930 07:35:56.271754 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls" Sep 30 07:35:56 crc kubenswrapper[4760]: I0930 07:35:56.289939 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets" Sep 30 07:35:56 crc kubenswrapper[4760]: I0930 07:35:56.308963 4760 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Sep 30 07:35:56 crc kubenswrapper[4760]: I0930 07:35:56.328723 4760 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Sep 30 07:35:56 crc kubenswrapper[4760]: I0930 07:35:56.349239 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Sep 30 07:35:56 crc kubenswrapper[4760]: I0930 07:35:56.370145 4760 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Sep 30 07:35:56 crc kubenswrapper[4760]: I0930 07:35:56.389022 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl" Sep 30 07:35:56 crc kubenswrapper[4760]: I0930 07:35:56.410388 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Sep 30 07:35:56 crc kubenswrapper[4760]: I0930 07:35:56.429495 4760 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Sep 30 07:35:56 crc kubenswrapper[4760]: I0930 07:35:56.449779 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Sep 30 07:35:56 crc kubenswrapper[4760]: I0930 07:35:56.469789 4760 reflector.go:368] Caches populated 
for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Sep 30 07:35:56 crc kubenswrapper[4760]: I0930 07:35:56.489144 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86" Sep 30 07:35:56 crc kubenswrapper[4760]: I0930 07:35:56.510616 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Sep 30 07:35:56 crc kubenswrapper[4760]: I0930 07:35:56.529900 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Sep 30 07:35:56 crc kubenswrapper[4760]: I0930 07:35:56.549757 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default" Sep 30 07:35:56 crc kubenswrapper[4760]: I0930 07:35:56.572549 4760 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Sep 30 07:35:56 crc kubenswrapper[4760]: I0930 07:35:56.591020 4760 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Sep 30 07:35:56 crc kubenswrapper[4760]: I0930 07:35:56.618424 4760 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" Sep 30 07:35:56 crc kubenswrapper[4760]: I0930 07:35:56.629404 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r" Sep 30 07:35:56 crc kubenswrapper[4760]: I0930 07:35:56.632195 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/85f89a8e-5f37-458f-9896-fe3940cc68b6-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-mncn2\" (UID: \"85f89a8e-5f37-458f-9896-fe3940cc68b6\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-mncn2" 
Sep 30 07:35:56 crc kubenswrapper[4760]: I0930 07:35:56.632440 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/63503165-1bff-42d3-99f4-2af2d7f490ec-config\") pod \"service-ca-operator-777779d784-mqkwn\" (UID: \"63503165-1bff-42d3-99f4-2af2d7f490ec\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-mqkwn" Sep 30 07:35:56 crc kubenswrapper[4760]: I0930 07:35:56.632570 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/63503165-1bff-42d3-99f4-2af2d7f490ec-serving-cert\") pod \"service-ca-operator-777779d784-mqkwn\" (UID: \"63503165-1bff-42d3-99f4-2af2d7f490ec\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-mqkwn" Sep 30 07:35:56 crc kubenswrapper[4760]: I0930 07:35:56.632639 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/26bc4724-af08-4012-9656-d1cd06b533ef-srv-cert\") pod \"olm-operator-6b444d44fb-5lscq\" (UID: \"26bc4724-af08-4012-9656-d1cd06b533ef\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-5lscq" Sep 30 07:35:56 crc kubenswrapper[4760]: I0930 07:35:56.632678 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/85f89a8e-5f37-458f-9896-fe3940cc68b6-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-mncn2\" (UID: \"85f89a8e-5f37-458f-9896-fe3940cc68b6\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-mncn2" Sep 30 07:35:56 crc kubenswrapper[4760]: I0930 07:35:56.633775 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/63503165-1bff-42d3-99f4-2af2d7f490ec-config\") pod \"service-ca-operator-777779d784-mqkwn\" (UID: \"63503165-1bff-42d3-99f4-2af2d7f490ec\") " 
pod="openshift-service-ca-operator/service-ca-operator-777779d784-mqkwn" Sep 30 07:35:56 crc kubenswrapper[4760]: I0930 07:35:56.638588 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/26bc4724-af08-4012-9656-d1cd06b533ef-srv-cert\") pod \"olm-operator-6b444d44fb-5lscq\" (UID: \"26bc4724-af08-4012-9656-d1cd06b533ef\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-5lscq" Sep 30 07:35:56 crc kubenswrapper[4760]: I0930 07:35:56.639985 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/63503165-1bff-42d3-99f4-2af2d7f490ec-serving-cert\") pod \"service-ca-operator-777779d784-mqkwn\" (UID: \"63503165-1bff-42d3-99f4-2af2d7f490ec\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-mqkwn" Sep 30 07:35:56 crc kubenswrapper[4760]: I0930 07:35:56.650267 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Sep 30 07:35:56 crc kubenswrapper[4760]: I0930 07:35:56.657185 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/85f89a8e-5f37-458f-9896-fe3940cc68b6-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-mncn2\" (UID: \"85f89a8e-5f37-458f-9896-fe3940cc68b6\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-mncn2" Sep 30 07:35:56 crc kubenswrapper[4760]: I0930 07:35:56.670197 4760 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Sep 30 07:35:56 crc kubenswrapper[4760]: I0930 07:35:56.692327 4760 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Sep 30 07:35:56 crc kubenswrapper[4760]: I0930 07:35:56.695021 4760 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/85f89a8e-5f37-458f-9896-fe3940cc68b6-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-mncn2\" (UID: \"85f89a8e-5f37-458f-9896-fe3940cc68b6\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-mncn2" Sep 30 07:35:56 crc kubenswrapper[4760]: I0930 07:35:56.709883 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert" Sep 30 07:35:56 crc kubenswrapper[4760]: I0930 07:35:56.729887 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx" Sep 30 07:35:56 crc kubenswrapper[4760]: I0930 07:35:56.750623 4760 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Sep 30 07:35:56 crc kubenswrapper[4760]: I0930 07:35:56.770002 4760 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt" Sep 30 07:35:56 crc kubenswrapper[4760]: I0930 07:35:56.810226 4760 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt" Sep 30 07:35:56 crc kubenswrapper[4760]: I0930 07:35:56.832155 4760 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k" Sep 30 07:35:56 crc kubenswrapper[4760]: I0930 07:35:56.850453 4760 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt" Sep 30 07:35:56 crc kubenswrapper[4760]: I0930 07:35:56.871102 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh" Sep 30 07:35:56 crc kubenswrapper[4760]: I0930 07:35:56.890789 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" Sep 30 
07:35:56 crc kubenswrapper[4760]: I0930 07:35:56.910731 4760 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Sep 30 07:35:56 crc kubenswrapper[4760]: I0930 07:35:56.948602 4760 request.go:700] Waited for 1.948862672s due to client-side throttling, not priority and fairness, request: POST:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-controller-manager-operator/serviceaccounts/openshift-controller-manager-operator/token Sep 30 07:35:56 crc kubenswrapper[4760]: I0930 07:35:56.951161 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cgch4\" (UniqueName: \"kubernetes.io/projected/cc4f6325-5a2a-4f15-8ee9-860617a9d7ce-kube-api-access-cgch4\") pod \"marketplace-operator-79b997595-gsddw\" (UID: \"cc4f6325-5a2a-4f15-8ee9-860617a9d7ce\") " pod="openshift-marketplace/marketplace-operator-79b997595-gsddw" Sep 30 07:35:56 crc kubenswrapper[4760]: I0930 07:35:56.964671 4760 generic.go:334] "Generic (PLEG): container finished" podID="179c4bc2-b28d-445b-98f2-aa307d57cd9f" containerID="e8cc494d0980721654049dfcfe4109819d06b5225f6931b89b2fbde5567572b0" exitCode=0 Sep 30 07:35:56 crc kubenswrapper[4760]: I0930 07:35:56.964738 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-bhzlk" event={"ID":"179c4bc2-b28d-445b-98f2-aa307d57cd9f","Type":"ContainerDied","Data":"e8cc494d0980721654049dfcfe4109819d06b5225f6931b89b2fbde5567572b0"} Sep 30 07:35:56 crc kubenswrapper[4760]: I0930 07:35:56.964768 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-bhzlk" event={"ID":"179c4bc2-b28d-445b-98f2-aa307d57cd9f","Type":"ContainerStarted","Data":"f1545e716e6f8979fec6823b868c8453cf9f607c016645da029c4d9a91296fcc"} Sep 30 07:35:56 crc kubenswrapper[4760]: I0930 07:35:56.966176 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-ntj6x" event={"ID":"cbe4e2d5-6999-435b-b0af-d585a0ef1f5f","Type":"ContainerStarted","Data":"02e4d46e641265da2a179a2296b7acded914802fa4deb1df292db9f756595912"} Sep 30 07:35:56 crc kubenswrapper[4760]: I0930 07:35:56.966403 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-ntj6x" event={"ID":"cbe4e2d5-6999-435b-b0af-d585a0ef1f5f","Type":"ContainerStarted","Data":"2820fdcfa1dd5b62991f89ec5dd052506dfbcb2d92095c06e62352b538f5753f"} Sep 30 07:35:56 crc kubenswrapper[4760]: I0930 07:35:56.968670 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-q5spr" event={"ID":"3f80e44c-0738-48d0-b6e5-783d696c5ec4","Type":"ContainerStarted","Data":"7c744cdee60ea6b9095259d81957dacdcbebaa7a0ca2770cb7963709e19b647b"} Sep 30 07:35:56 crc kubenswrapper[4760]: I0930 07:35:56.968690 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-q5spr" event={"ID":"3f80e44c-0738-48d0-b6e5-783d696c5ec4","Type":"ContainerStarted","Data":"67bd7b7c0c2318eeb7d5e87b44fdbbbdc43d53b84fdce00bf06d1021a14e5913"} Sep 30 07:35:56 crc kubenswrapper[4760]: I0930 07:35:56.981121 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vpf4d\" (UniqueName: \"kubernetes.io/projected/d0f0f394-f395-4181-a0f9-afa9b7467013-kube-api-access-vpf4d\") pod \"openshift-controller-manager-operator-756b6f6bc6-75sr2\" (UID: \"d0f0f394-f395-4181-a0f9-afa9b7467013\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-75sr2" Sep 30 07:35:56 crc kubenswrapper[4760]: I0930 07:35:56.981499 4760 generic.go:334] "Generic (PLEG): container finished" podID="7c05864f-63f6-4fdf-9207-0d63dd89fc49" 
containerID="9098a46a3af8ff9bcb644e3408d3fadd28f2b85e3c2354a572741adc5a8ce725" exitCode=0 Sep 30 07:35:56 crc kubenswrapper[4760]: I0930 07:35:56.982495 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-qc22x" event={"ID":"7c05864f-63f6-4fdf-9207-0d63dd89fc49","Type":"ContainerDied","Data":"9098a46a3af8ff9bcb644e3408d3fadd28f2b85e3c2354a572741adc5a8ce725"} Sep 30 07:35:56 crc kubenswrapper[4760]: I0930 07:35:56.982588 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-qc22x" event={"ID":"7c05864f-63f6-4fdf-9207-0d63dd89fc49","Type":"ContainerStarted","Data":"c7462bdb83eee92a2a47d91fe048ff1e9c4ddd2a4f7a5a7650712d3a1b4e5784"} Sep 30 07:35:56 crc kubenswrapper[4760]: I0930 07:35:56.984934 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kcddb\" (UniqueName: \"kubernetes.io/projected/09aeaa85-1a1d-426f-b7f2-611f67942f2c-kube-api-access-kcddb\") pod \"kube-storage-version-migrator-operator-b67b599dd-dmqhx\" (UID: \"09aeaa85-1a1d-426f-b7f2-611f67942f2c\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-dmqhx" Sep 30 07:35:56 crc kubenswrapper[4760]: I0930 07:35:56.990466 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-77ttc" event={"ID":"f2f25243-2a0b-498f-8de6-8b0a21c72c49","Type":"ContainerStarted","Data":"323a28996706e5ebe7872e9428bd083a04ad7970621cd63de64d2d6cf9c3fad6"} Sep 30 07:35:56 crc kubenswrapper[4760]: I0930 07:35:56.991373 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-558db77b4-77ttc" Sep 30 07:35:57 crc kubenswrapper[4760]: I0930 07:35:57.009194 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-lgt64" 
event={"ID":"2bcd213c-83e6-4a23-9542-727b1254b17d","Type":"ContainerStarted","Data":"411dc5ffd3620fdca07bd2b67d6105b1be3909a2a66c3a44514e5cc6bf65800f"} Sep 30 07:35:57 crc kubenswrapper[4760]: I0930 07:35:57.020710 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tm59j\" (UniqueName: \"kubernetes.io/projected/a96a9516-5f80-4391-a1f2-f4b7531e65fa-kube-api-access-tm59j\") pod \"control-plane-machine-set-operator-78cbb6b69f-tlrrq\" (UID: \"a96a9516-5f80-4391-a1f2-f4b7531e65fa\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-tlrrq" Sep 30 07:35:57 crc kubenswrapper[4760]: I0930 07:35:57.023984 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d8e4c5c1-b6db-4baf-8bbe-9fc39e1ed6ff-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-pznb4\" (UID: \"d8e4c5c1-b6db-4baf-8bbe-9fc39e1ed6ff\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-pznb4" Sep 30 07:35:57 crc kubenswrapper[4760]: I0930 07:35:57.026115 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-4lv9x" event={"ID":"2fb43f32-6ad4-4450-8a05-80570020d5e8","Type":"ContainerStarted","Data":"6232e75654ebce49b8ff1bbf3c6d14dff622fd0c68aaf36d68f8c9453d64f8ca"} Sep 30 07:35:57 crc kubenswrapper[4760]: I0930 07:35:57.028657 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-m8vl7" event={"ID":"ff7ef449-b6c7-4e55-886d-b66dd8327e7b","Type":"ContainerStarted","Data":"cc747737e43584279625d25fc7dc963aeac706b61e096c03ccadbd90231adbad"} Sep 30 07:35:57 crc kubenswrapper[4760]: I0930 07:35:57.029368 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-m8vl7" Sep 30 07:35:57 crc 
kubenswrapper[4760]: I0930 07:35:57.030479 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-6cvdd" event={"ID":"fa5b63b3-2bc6-496c-8841-471e2f43021c","Type":"ContainerStarted","Data":"acc2e1c979d3ac62e71f41efccb9e82ae2a4395ac33f98d2ef759d67ef452324"} Sep 30 07:35:57 crc kubenswrapper[4760]: I0930 07:35:57.030503 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-6cvdd" event={"ID":"fa5b63b3-2bc6-496c-8841-471e2f43021c","Type":"ContainerStarted","Data":"f431cc1791f296a50a8a526b7bf95cb78b3ebb7c315c533a717ccd1e25ba54e4"} Sep 30 07:35:57 crc kubenswrapper[4760]: I0930 07:35:57.030927 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-879f6c89f-6cvdd" Sep 30 07:35:57 crc kubenswrapper[4760]: I0930 07:35:57.034396 4760 patch_prober.go:28] interesting pod/controller-manager-879f6c89f-6cvdd container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.6:8443/healthz\": dial tcp 10.217.0.6:8443: connect: connection refused" start-of-body= Sep 30 07:35:57 crc kubenswrapper[4760]: I0930 07:35:57.034449 4760 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-879f6c89f-6cvdd" podUID="fa5b63b3-2bc6-496c-8841-471e2f43021c" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.6:8443/healthz\": dial tcp 10.217.0.6:8443: connect: connection refused" Sep 30 07:35:57 crc kubenswrapper[4760]: I0930 07:35:57.035623 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-m8vl7" Sep 30 07:35:57 crc kubenswrapper[4760]: I0930 07:35:57.069876 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tbkv8\" 
(UniqueName: \"kubernetes.io/projected/2611e16f-2c0b-44b4-929b-21f16b1b2e4d-kube-api-access-tbkv8\") pod \"openshift-config-operator-7777fb866f-p96nl\" (UID: \"2611e16f-2c0b-44b4-929b-21f16b1b2e4d\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-p96nl" Sep 30 07:35:57 crc kubenswrapper[4760]: I0930 07:35:57.071447 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c7zxl\" (UniqueName: \"kubernetes.io/projected/08d362f3-5c04-45fe-9981-ada11b028f83-kube-api-access-c7zxl\") pod \"console-f9d7485db-ntgr2\" (UID: \"08d362f3-5c04-45fe-9981-ada11b028f83\") " pod="openshift-console/console-f9d7485db-ntgr2" Sep 30 07:35:57 crc kubenswrapper[4760]: I0930 07:35:57.081419 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-tlrrq" Sep 30 07:35:57 crc kubenswrapper[4760]: I0930 07:35:57.086567 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-75sr2" Sep 30 07:35:57 crc kubenswrapper[4760]: I0930 07:35:57.093571 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls" Sep 30 07:35:57 crc kubenswrapper[4760]: I0930 07:35:57.093769 4760 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-dmqhx" Sep 30 07:35:57 crc kubenswrapper[4760]: I0930 07:35:57.093910 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xbt5m\" (UniqueName: \"kubernetes.io/projected/86638afb-4930-4496-a00d-8f243be3ab33-kube-api-access-xbt5m\") pod \"cluster-samples-operator-665b6dd947-cfnkh\" (UID: \"86638afb-4930-4496-a00d-8f243be3ab33\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-cfnkh" Sep 30 07:35:57 crc kubenswrapper[4760]: I0930 07:35:57.101411 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-pznb4" Sep 30 07:35:57 crc kubenswrapper[4760]: I0930 07:35:57.109637 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd" Sep 30 07:35:57 crc kubenswrapper[4760]: I0930 07:35:57.131584 4760 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-gsddw" Sep 30 07:35:57 crc kubenswrapper[4760]: I0930 07:35:57.134133 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Sep 30 07:35:57 crc kubenswrapper[4760]: I0930 07:35:57.184565 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/85f89a8e-5f37-458f-9896-fe3940cc68b6-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-mncn2\" (UID: \"85f89a8e-5f37-458f-9896-fe3940cc68b6\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-mncn2" Sep 30 07:35:57 crc kubenswrapper[4760]: I0930 07:35:57.209966 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-njj95\" (UniqueName: \"kubernetes.io/projected/ea1dde7c-e014-4b78-b8e9-0fc5a0e161ac-kube-api-access-njj95\") pod \"service-ca-9c57cc56f-z9mdp\" (UID: \"ea1dde7c-e014-4b78-b8e9-0fc5a0e161ac\") " pod="openshift-service-ca/service-ca-9c57cc56f-z9mdp" Sep 30 07:35:57 crc kubenswrapper[4760]: I0930 07:35:57.230633 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q45c6\" (UniqueName: \"kubernetes.io/projected/dd4856c0-fc17-49b2-b37b-b0414e6a2f48-kube-api-access-q45c6\") pod \"migrator-59844c95c7-s7mth\" (UID: \"dd4856c0-fc17-49b2-b37b-b0414e6a2f48\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-s7mth" Sep 30 07:35:57 crc kubenswrapper[4760]: I0930 07:35:57.250275 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/4c51dda7-332e-497f-96ed-932d5349ee59-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-fwcqc\" (UID: \"4c51dda7-332e-497f-96ed-932d5349ee59\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-fwcqc" Sep 30 
07:35:57 crc kubenswrapper[4760]: I0930 07:35:57.268739 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xfkkd\" (UniqueName: \"kubernetes.io/projected/26bc4724-af08-4012-9656-d1cd06b533ef-kube-api-access-xfkkd\") pod \"olm-operator-6b444d44fb-5lscq\" (UID: \"26bc4724-af08-4012-9656-d1cd06b533ef\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-5lscq" Sep 30 07:35:57 crc kubenswrapper[4760]: I0930 07:35:57.293256 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-mncn2" Sep 30 07:35:57 crc kubenswrapper[4760]: I0930 07:35:57.296982 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2ksf5\" (UniqueName: \"kubernetes.io/projected/4c51dda7-332e-497f-96ed-932d5349ee59-kube-api-access-2ksf5\") pod \"cluster-image-registry-operator-dc59b4c8b-fwcqc\" (UID: \"4c51dda7-332e-497f-96ed-932d5349ee59\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-fwcqc" Sep 30 07:35:57 crc kubenswrapper[4760]: I0930 07:35:57.310194 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c7s2x\" (UniqueName: \"kubernetes.io/projected/63503165-1bff-42d3-99f4-2af2d7f490ec-kube-api-access-c7s2x\") pod \"service-ca-operator-777779d784-mqkwn\" (UID: \"63503165-1bff-42d3-99f4-2af2d7f490ec\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-mqkwn" Sep 30 07:35:57 crc kubenswrapper[4760]: I0930 07:35:57.341266 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/cd4422f1-0405-44b0-9256-fec03b6dc2f0-bound-sa-token\") pod \"image-registry-697d97f7c8-vgrsq\" (UID: \"cd4422f1-0405-44b0-9256-fec03b6dc2f0\") " pod="openshift-image-registry/image-registry-697d97f7c8-vgrsq" Sep 30 07:35:57 crc kubenswrapper[4760]: 
I0930 07:35:57.341665 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/6fddba3c-1ff7-42f4-99da-8f282c6095fc-bound-sa-token\") pod \"ingress-operator-5b745b69d9-zpzhj\" (UID: \"6fddba3c-1ff7-42f4-99da-8f282c6095fc\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-zpzhj" Sep 30 07:35:57 crc kubenswrapper[4760]: I0930 07:35:57.341698 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5r4xr\" (UniqueName: \"kubernetes.io/projected/cd4422f1-0405-44b0-9256-fec03b6dc2f0-kube-api-access-5r4xr\") pod \"image-registry-697d97f7c8-vgrsq\" (UID: \"cd4422f1-0405-44b0-9256-fec03b6dc2f0\") " pod="openshift-image-registry/image-registry-697d97f7c8-vgrsq" Sep 30 07:35:57 crc kubenswrapper[4760]: I0930 07:35:57.341726 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e57ef7b7-e3ef-4d7b-8ad0-2c5015fa2210-config\") pod \"etcd-operator-b45778765-ptzmt\" (UID: \"e57ef7b7-e3ef-4d7b-8ad0-2c5015fa2210\") " pod="openshift-etcd-operator/etcd-operator-b45778765-ptzmt" Sep 30 07:35:57 crc kubenswrapper[4760]: I0930 07:35:57.341783 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9q62t\" (UniqueName: \"kubernetes.io/projected/e57ef7b7-e3ef-4d7b-8ad0-2c5015fa2210-kube-api-access-9q62t\") pod \"etcd-operator-b45778765-ptzmt\" (UID: \"e57ef7b7-e3ef-4d7b-8ad0-2c5015fa2210\") " pod="openshift-etcd-operator/etcd-operator-b45778765-ptzmt" Sep 30 07:35:57 crc kubenswrapper[4760]: I0930 07:35:57.341848 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/32814054-fdc5-432f-a901-43f758af1b44-metrics-tls\") pod \"dns-operator-744455d44c-7278d\" (UID: 
\"32814054-fdc5-432f-a901-43f758af1b44\") " pod="openshift-dns-operator/dns-operator-744455d44c-7278d" Sep 30 07:35:57 crc kubenswrapper[4760]: I0930 07:35:57.341907 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/84902f21-f09f-4c6b-b92f-1569d4aa2fcc-srv-cert\") pod \"catalog-operator-68c6474976-pn92b\" (UID: \"84902f21-f09f-4c6b-b92f-1569d4aa2fcc\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-pn92b" Sep 30 07:35:57 crc kubenswrapper[4760]: I0930 07:35:57.341960 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/9873ee29-2db0-462b-8c6c-6efc009193fa-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-9gnwv\" (UID: \"9873ee29-2db0-462b-8c6c-6efc009193fa\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-9gnwv" Sep 30 07:35:57 crc kubenswrapper[4760]: I0930 07:35:57.342018 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sq72b\" (UniqueName: \"kubernetes.io/projected/d2195fee-f00a-408c-a44d-c74b59078ad7-kube-api-access-sq72b\") pod \"ingress-canary-82mws\" (UID: \"d2195fee-f00a-408c-a44d-c74b59078ad7\") " pod="openshift-ingress-canary/ingress-canary-82mws" Sep 30 07:35:57 crc kubenswrapper[4760]: I0930 07:35:57.342049 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3e98f7cd-47e7-4344-abc2-845b404874a4-serving-cert\") pod \"console-operator-58897d9998-zwnlf\" (UID: \"3e98f7cd-47e7-4344-abc2-845b404874a4\") " pod="openshift-console-operator/console-operator-58897d9998-zwnlf" Sep 30 07:35:57 crc kubenswrapper[4760]: I0930 07:35:57.342617 4760 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/6fddba3c-1ff7-42f4-99da-8f282c6095fc-metrics-tls\") pod \"ingress-operator-5b745b69d9-zpzhj\" (UID: \"6fddba3c-1ff7-42f4-99da-8f282c6095fc\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-zpzhj" Sep 30 07:35:57 crc kubenswrapper[4760]: I0930 07:35:57.342648 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/cd4422f1-0405-44b0-9256-fec03b6dc2f0-installation-pull-secrets\") pod \"image-registry-697d97f7c8-vgrsq\" (UID: \"cd4422f1-0405-44b0-9256-fec03b6dc2f0\") " pod="openshift-image-registry/image-registry-697d97f7c8-vgrsq" Sep 30 07:35:57 crc kubenswrapper[4760]: I0930 07:35:57.342668 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d2195fee-f00a-408c-a44d-c74b59078ad7-cert\") pod \"ingress-canary-82mws\" (UID: \"d2195fee-f00a-408c-a44d-c74b59078ad7\") " pod="openshift-ingress-canary/ingress-canary-82mws" Sep 30 07:35:57 crc kubenswrapper[4760]: I0930 07:35:57.342686 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/eb4bc73c-b474-4efa-b348-63ff26045c24-metrics-certs\") pod \"router-default-5444994796-vjd5w\" (UID: \"eb4bc73c-b474-4efa-b348-63ff26045c24\") " pod="openshift-ingress/router-default-5444994796-vjd5w" Sep 30 07:35:57 crc kubenswrapper[4760]: I0930 07:35:57.342713 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/e57ef7b7-e3ef-4d7b-8ad0-2c5015fa2210-etcd-client\") pod \"etcd-operator-b45778765-ptzmt\" (UID: \"e57ef7b7-e3ef-4d7b-8ad0-2c5015fa2210\") " pod="openshift-etcd-operator/etcd-operator-b45778765-ptzmt" Sep 
30 07:35:57 crc kubenswrapper[4760]: I0930 07:35:57.342750 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6n729\" (UniqueName: \"kubernetes.io/projected/3e98f7cd-47e7-4344-abc2-845b404874a4-kube-api-access-6n729\") pod \"console-operator-58897d9998-zwnlf\" (UID: \"3e98f7cd-47e7-4344-abc2-845b404874a4\") " pod="openshift-console-operator/console-operator-58897d9998-zwnlf" Sep 30 07:35:57 crc kubenswrapper[4760]: I0930 07:35:57.342774 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e57ef7b7-e3ef-4d7b-8ad0-2c5015fa2210-serving-cert\") pod \"etcd-operator-b45778765-ptzmt\" (UID: \"e57ef7b7-e3ef-4d7b-8ad0-2c5015fa2210\") " pod="openshift-etcd-operator/etcd-operator-b45778765-ptzmt" Sep 30 07:35:57 crc kubenswrapper[4760]: I0930 07:35:57.342800 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hnrzj\" (UniqueName: \"kubernetes.io/projected/32814054-fdc5-432f-a901-43f758af1b44-kube-api-access-hnrzj\") pod \"dns-operator-744455d44c-7278d\" (UID: \"32814054-fdc5-432f-a901-43f758af1b44\") " pod="openshift-dns-operator/dns-operator-744455d44c-7278d" Sep 30 07:35:57 crc kubenswrapper[4760]: I0930 07:35:57.342835 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c6sg7\" (UniqueName: \"kubernetes.io/projected/84902f21-f09f-4c6b-b92f-1569d4aa2fcc-kube-api-access-c6sg7\") pod \"catalog-operator-68c6474976-pn92b\" (UID: \"84902f21-f09f-4c6b-b92f-1569d4aa2fcc\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-pn92b" Sep 30 07:35:57 crc kubenswrapper[4760]: I0930 07:35:57.342854 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s4vvx\" (UniqueName: 
\"kubernetes.io/projected/db316304-e02f-43df-b4e2-6cdd6ee3b7eb-kube-api-access-s4vvx\") pod \"collect-profiles-29320290-w8n6p\" (UID: \"db316304-e02f-43df-b4e2-6cdd6ee3b7eb\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320290-w8n6p" Sep 30 07:35:57 crc kubenswrapper[4760]: I0930 07:35:57.342904 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/7a196114-d286-4138-8ffc-baeb5ecc02df-webhook-cert\") pod \"packageserver-d55dfcdfc-2fn4r\" (UID: \"7a196114-d286-4138-8ffc-baeb5ecc02df\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-2fn4r" Sep 30 07:35:57 crc kubenswrapper[4760]: I0930 07:35:57.342921 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/cd4422f1-0405-44b0-9256-fec03b6dc2f0-registry-certificates\") pod \"image-registry-697d97f7c8-vgrsq\" (UID: \"cd4422f1-0405-44b0-9256-fec03b6dc2f0\") " pod="openshift-image-registry/image-registry-697d97f7c8-vgrsq" Sep 30 07:35:57 crc kubenswrapper[4760]: I0930 07:35:57.342975 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/23c1276d-8d7b-4ffa-9290-3bd09756c660-proxy-tls\") pod \"machine-config-controller-84d6567774-kkfmw\" (UID: \"23c1276d-8d7b-4ffa-9290-3bd09756c660\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-kkfmw" Sep 30 07:35:57 crc kubenswrapper[4760]: I0930 07:35:57.343012 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3e98f7cd-47e7-4344-abc2-845b404874a4-config\") pod \"console-operator-58897d9998-zwnlf\" (UID: \"3e98f7cd-47e7-4344-abc2-845b404874a4\") " pod="openshift-console-operator/console-operator-58897d9998-zwnlf" Sep 30 
07:35:57 crc kubenswrapper[4760]: I0930 07:35:57.343031 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/cd4422f1-0405-44b0-9256-fec03b6dc2f0-trusted-ca\") pod \"image-registry-697d97f7c8-vgrsq\" (UID: \"cd4422f1-0405-44b0-9256-fec03b6dc2f0\") " pod="openshift-image-registry/image-registry-697d97f7c8-vgrsq" Sep 30 07:35:57 crc kubenswrapper[4760]: I0930 07:35:57.343049 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ac72e4c7-2db7-411a-8f4b-28687be463f3-config\") pod \"kube-apiserver-operator-766d6c64bb-qgs9v\" (UID: \"ac72e4c7-2db7-411a-8f4b-28687be463f3\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-qgs9v" Sep 30 07:35:57 crc kubenswrapper[4760]: I0930 07:35:57.343065 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/eb4bc73c-b474-4efa-b348-63ff26045c24-stats-auth\") pod \"router-default-5444994796-vjd5w\" (UID: \"eb4bc73c-b474-4efa-b348-63ff26045c24\") " pod="openshift-ingress/router-default-5444994796-vjd5w" Sep 30 07:35:57 crc kubenswrapper[4760]: I0930 07:35:57.343112 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/391a89e2-2467-4f7e-aa8d-d4c939845a67-images\") pod \"machine-config-operator-74547568cd-hgcbk\" (UID: \"391a89e2-2467-4f7e-aa8d-d4c939845a67\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-hgcbk" Sep 30 07:35:57 crc kubenswrapper[4760]: I0930 07:35:57.343139 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/e57ef7b7-e3ef-4d7b-8ad0-2c5015fa2210-etcd-ca\") pod 
\"etcd-operator-b45778765-ptzmt\" (UID: \"e57ef7b7-e3ef-4d7b-8ad0-2c5015fa2210\") " pod="openshift-etcd-operator/etcd-operator-b45778765-ptzmt" Sep 30 07:35:57 crc kubenswrapper[4760]: I0930 07:35:57.343173 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/3e98f7cd-47e7-4344-abc2-845b404874a4-trusted-ca\") pod \"console-operator-58897d9998-zwnlf\" (UID: \"3e98f7cd-47e7-4344-abc2-845b404874a4\") " pod="openshift-console-operator/console-operator-58897d9998-zwnlf" Sep 30 07:35:57 crc kubenswrapper[4760]: I0930 07:35:57.343192 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/7a196114-d286-4138-8ffc-baeb5ecc02df-tmpfs\") pod \"packageserver-d55dfcdfc-2fn4r\" (UID: \"7a196114-d286-4138-8ffc-baeb5ecc02df\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-2fn4r" Sep 30 07:35:57 crc kubenswrapper[4760]: I0930 07:35:57.343208 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ac72e4c7-2db7-411a-8f4b-28687be463f3-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-qgs9v\" (UID: \"ac72e4c7-2db7-411a-8f4b-28687be463f3\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-qgs9v" Sep 30 07:35:57 crc kubenswrapper[4760]: I0930 07:35:57.343224 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k8qh7\" (UniqueName: \"kubernetes.io/projected/85616cb5-98c5-4296-858a-462f8ca42702-kube-api-access-k8qh7\") pod \"multus-admission-controller-857f4d67dd-n5tk5\" (UID: \"85616cb5-98c5-4296-858a-462f8ca42702\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-n5tk5" Sep 30 07:35:57 crc kubenswrapper[4760]: I0930 07:35:57.343241 4760 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rqlvh\" (UniqueName: \"kubernetes.io/projected/391a89e2-2467-4f7e-aa8d-d4c939845a67-kube-api-access-rqlvh\") pod \"machine-config-operator-74547568cd-hgcbk\" (UID: \"391a89e2-2467-4f7e-aa8d-d4c939845a67\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-hgcbk" Sep 30 07:35:57 crc kubenswrapper[4760]: I0930 07:35:57.343259 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/cd4422f1-0405-44b0-9256-fec03b6dc2f0-registry-tls\") pod \"image-registry-697d97f7c8-vgrsq\" (UID: \"cd4422f1-0405-44b0-9256-fec03b6dc2f0\") " pod="openshift-image-registry/image-registry-697d97f7c8-vgrsq" Sep 30 07:35:57 crc kubenswrapper[4760]: I0930 07:35:57.343286 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/e57ef7b7-e3ef-4d7b-8ad0-2c5015fa2210-etcd-service-ca\") pod \"etcd-operator-b45778765-ptzmt\" (UID: \"e57ef7b7-e3ef-4d7b-8ad0-2c5015fa2210\") " pod="openshift-etcd-operator/etcd-operator-b45778765-ptzmt" Sep 30 07:35:57 crc kubenswrapper[4760]: I0930 07:35:57.343357 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/7a196114-d286-4138-8ffc-baeb5ecc02df-apiservice-cert\") pod \"packageserver-d55dfcdfc-2fn4r\" (UID: \"7a196114-d286-4138-8ffc-baeb5ecc02df\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-2fn4r" Sep 30 07:35:57 crc kubenswrapper[4760]: I0930 07:35:57.343410 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/eb4bc73c-b474-4efa-b348-63ff26045c24-default-certificate\") pod \"router-default-5444994796-vjd5w\" (UID: 
\"eb4bc73c-b474-4efa-b348-63ff26045c24\") " pod="openshift-ingress/router-default-5444994796-vjd5w" Sep 30 07:35:57 crc kubenswrapper[4760]: I0930 07:35:57.343434 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vgrsq\" (UID: \"cd4422f1-0405-44b0-9256-fec03b6dc2f0\") " pod="openshift-image-registry/image-registry-697d97f7c8-vgrsq" Sep 30 07:35:57 crc kubenswrapper[4760]: I0930 07:35:57.343454 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-psh8s\" (UniqueName: \"kubernetes.io/projected/7a196114-d286-4138-8ffc-baeb5ecc02df-kube-api-access-psh8s\") pod \"packageserver-d55dfcdfc-2fn4r\" (UID: \"7a196114-d286-4138-8ffc-baeb5ecc02df\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-2fn4r" Sep 30 07:35:57 crc kubenswrapper[4760]: I0930 07:35:57.343472 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dp2gs\" (UniqueName: \"kubernetes.io/projected/3b428f41-cff0-421d-a763-987d15be26eb-kube-api-access-dp2gs\") pod \"downloads-7954f5f757-jx7lt\" (UID: \"3b428f41-cff0-421d-a763-987d15be26eb\") " pod="openshift-console/downloads-7954f5f757-jx7lt" Sep 30 07:35:57 crc kubenswrapper[4760]: I0930 07:35:57.343488 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dpszj\" (UniqueName: \"kubernetes.io/projected/eb4bc73c-b474-4efa-b348-63ff26045c24-kube-api-access-dpszj\") pod \"router-default-5444994796-vjd5w\" (UID: \"eb4bc73c-b474-4efa-b348-63ff26045c24\") " pod="openshift-ingress/router-default-5444994796-vjd5w" Sep 30 07:35:57 crc kubenswrapper[4760]: I0930 07:35:57.343536 4760 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/eb4bc73c-b474-4efa-b348-63ff26045c24-service-ca-bundle\") pod \"router-default-5444994796-vjd5w\" (UID: \"eb4bc73c-b474-4efa-b348-63ff26045c24\") " pod="openshift-ingress/router-default-5444994796-vjd5w" Sep 30 07:35:57 crc kubenswrapper[4760]: I0930 07:35:57.343572 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/ac72e4c7-2db7-411a-8f4b-28687be463f3-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-qgs9v\" (UID: \"ac72e4c7-2db7-411a-8f4b-28687be463f3\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-qgs9v" Sep 30 07:35:57 crc kubenswrapper[4760]: I0930 07:35:57.343606 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j2frr\" (UniqueName: \"kubernetes.io/projected/6fddba3c-1ff7-42f4-99da-8f282c6095fc-kube-api-access-j2frr\") pod \"ingress-operator-5b745b69d9-zpzhj\" (UID: \"6fddba3c-1ff7-42f4-99da-8f282c6095fc\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-zpzhj" Sep 30 07:35:57 crc kubenswrapper[4760]: I0930 07:35:57.343918 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/391a89e2-2467-4f7e-aa8d-d4c939845a67-proxy-tls\") pod \"machine-config-operator-74547568cd-hgcbk\" (UID: \"391a89e2-2467-4f7e-aa8d-d4c939845a67\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-hgcbk" Sep 30 07:35:57 crc kubenswrapper[4760]: I0930 07:35:57.343970 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/85616cb5-98c5-4296-858a-462f8ca42702-webhook-certs\") pod 
\"multus-admission-controller-857f4d67dd-n5tk5\" (UID: \"85616cb5-98c5-4296-858a-462f8ca42702\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-n5tk5" Sep 30 07:35:57 crc kubenswrapper[4760]: I0930 07:35:57.344013 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/db316304-e02f-43df-b4e2-6cdd6ee3b7eb-secret-volume\") pod \"collect-profiles-29320290-w8n6p\" (UID: \"db316304-e02f-43df-b4e2-6cdd6ee3b7eb\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320290-w8n6p" Sep 30 07:35:57 crc kubenswrapper[4760]: E0930 07:35:57.344033 4760 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 07:35:57.844022227 +0000 UTC m=+143.486928639 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-vgrsq" (UID: "cd4422f1-0405-44b0-9256-fec03b6dc2f0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 07:35:57 crc kubenswrapper[4760]: I0930 07:35:57.344064 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/cd4422f1-0405-44b0-9256-fec03b6dc2f0-ca-trust-extracted\") pod \"image-registry-697d97f7c8-vgrsq\" (UID: \"cd4422f1-0405-44b0-9256-fec03b6dc2f0\") " pod="openshift-image-registry/image-registry-697d97f7c8-vgrsq" Sep 30 07:35:57 crc kubenswrapper[4760]: I0930 07:35:57.344083 4760 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fzx7d\" (UniqueName: \"kubernetes.io/projected/23c1276d-8d7b-4ffa-9290-3bd09756c660-kube-api-access-fzx7d\") pod \"machine-config-controller-84d6567774-kkfmw\" (UID: \"23c1276d-8d7b-4ffa-9290-3bd09756c660\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-kkfmw" Sep 30 07:35:57 crc kubenswrapper[4760]: I0930 07:35:57.344100 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/23c1276d-8d7b-4ffa-9290-3bd09756c660-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-kkfmw\" (UID: \"23c1276d-8d7b-4ffa-9290-3bd09756c660\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-kkfmw" Sep 30 07:35:57 crc kubenswrapper[4760]: I0930 07:35:57.344116 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/391a89e2-2467-4f7e-aa8d-d4c939845a67-auth-proxy-config\") pod \"machine-config-operator-74547568cd-hgcbk\" (UID: \"391a89e2-2467-4f7e-aa8d-d4c939845a67\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-hgcbk" Sep 30 07:35:57 crc kubenswrapper[4760]: I0930 07:35:57.344133 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/db316304-e02f-43df-b4e2-6cdd6ee3b7eb-config-volume\") pod \"collect-profiles-29320290-w8n6p\" (UID: \"db316304-e02f-43df-b4e2-6cdd6ee3b7eb\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320290-w8n6p" Sep 30 07:35:57 crc kubenswrapper[4760]: I0930 07:35:57.344167 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: 
\"kubernetes.io/configmap/6fddba3c-1ff7-42f4-99da-8f282c6095fc-trusted-ca\") pod \"ingress-operator-5b745b69d9-zpzhj\" (UID: \"6fddba3c-1ff7-42f4-99da-8f282c6095fc\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-zpzhj" Sep 30 07:35:57 crc kubenswrapper[4760]: I0930 07:35:57.344187 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/84902f21-f09f-4c6b-b92f-1569d4aa2fcc-profile-collector-cert\") pod \"catalog-operator-68c6474976-pn92b\" (UID: \"84902f21-f09f-4c6b-b92f-1569d4aa2fcc\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-pn92b" Sep 30 07:35:57 crc kubenswrapper[4760]: I0930 07:35:57.344208 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5gpn2\" (UniqueName: \"kubernetes.io/projected/9873ee29-2db0-462b-8c6c-6efc009193fa-kube-api-access-5gpn2\") pod \"package-server-manager-789f6589d5-9gnwv\" (UID: \"9873ee29-2db0-462b-8c6c-6efc009193fa\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-9gnwv" Sep 30 07:35:57 crc kubenswrapper[4760]: I0930 07:35:57.365594 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-p96nl" Sep 30 07:35:57 crc kubenswrapper[4760]: I0930 07:35:57.366582 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-cfnkh" Sep 30 07:35:57 crc kubenswrapper[4760]: I0930 07:35:57.370879 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-ntgr2" Sep 30 07:35:57 crc kubenswrapper[4760]: I0930 07:35:57.433777 4760 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-fwcqc" Sep 30 07:35:57 crc kubenswrapper[4760]: I0930 07:35:57.446512 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 07:35:57 crc kubenswrapper[4760]: I0930 07:35:57.446733 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/7a196114-d286-4138-8ffc-baeb5ecc02df-apiservice-cert\") pod \"packageserver-d55dfcdfc-2fn4r\" (UID: \"7a196114-d286-4138-8ffc-baeb5ecc02df\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-2fn4r" Sep 30 07:35:57 crc kubenswrapper[4760]: I0930 07:35:57.446756 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/eb4bc73c-b474-4efa-b348-63ff26045c24-default-certificate\") pod \"router-default-5444994796-vjd5w\" (UID: \"eb4bc73c-b474-4efa-b348-63ff26045c24\") " pod="openshift-ingress/router-default-5444994796-vjd5w" Sep 30 07:35:57 crc kubenswrapper[4760]: I0930 07:35:57.446778 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/86c4ea2a-21a5-46e4-bfd9-28d7a429b7ae-config-volume\") pod \"dns-default-76w2l\" (UID: \"86c4ea2a-21a5-46e4-bfd9-28d7a429b7ae\") " pod="openshift-dns/dns-default-76w2l" Sep 30 07:35:57 crc kubenswrapper[4760]: I0930 07:35:57.446829 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-psh8s\" (UniqueName: \"kubernetes.io/projected/7a196114-d286-4138-8ffc-baeb5ecc02df-kube-api-access-psh8s\") pod 
\"packageserver-d55dfcdfc-2fn4r\" (UID: \"7a196114-d286-4138-8ffc-baeb5ecc02df\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-2fn4r" Sep 30 07:35:57 crc kubenswrapper[4760]: I0930 07:35:57.446851 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dp2gs\" (UniqueName: \"kubernetes.io/projected/3b428f41-cff0-421d-a763-987d15be26eb-kube-api-access-dp2gs\") pod \"downloads-7954f5f757-jx7lt\" (UID: \"3b428f41-cff0-421d-a763-987d15be26eb\") " pod="openshift-console/downloads-7954f5f757-jx7lt" Sep 30 07:35:57 crc kubenswrapper[4760]: I0930 07:35:57.446866 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dpszj\" (UniqueName: \"kubernetes.io/projected/eb4bc73c-b474-4efa-b348-63ff26045c24-kube-api-access-dpszj\") pod \"router-default-5444994796-vjd5w\" (UID: \"eb4bc73c-b474-4efa-b348-63ff26045c24\") " pod="openshift-ingress/router-default-5444994796-vjd5w" Sep 30 07:35:57 crc kubenswrapper[4760]: I0930 07:35:57.446894 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/eb4bc73c-b474-4efa-b348-63ff26045c24-service-ca-bundle\") pod \"router-default-5444994796-vjd5w\" (UID: \"eb4bc73c-b474-4efa-b348-63ff26045c24\") " pod="openshift-ingress/router-default-5444994796-vjd5w" Sep 30 07:35:57 crc kubenswrapper[4760]: E0930 07:35:57.448068 4760 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 07:35:57.94804278 +0000 UTC m=+143.590949242 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 07:35:57 crc kubenswrapper[4760]: I0930 07:35:57.446924 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/ac72e4c7-2db7-411a-8f4b-28687be463f3-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-qgs9v\" (UID: \"ac72e4c7-2db7-411a-8f4b-28687be463f3\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-qgs9v" Sep 30 07:35:57 crc kubenswrapper[4760]: I0930 07:35:57.448894 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j2frr\" (UniqueName: \"kubernetes.io/projected/6fddba3c-1ff7-42f4-99da-8f282c6095fc-kube-api-access-j2frr\") pod \"ingress-operator-5b745b69d9-zpzhj\" (UID: \"6fddba3c-1ff7-42f4-99da-8f282c6095fc\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-zpzhj" Sep 30 07:35:57 crc kubenswrapper[4760]: I0930 07:35:57.448915 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/391a89e2-2467-4f7e-aa8d-d4c939845a67-proxy-tls\") pod \"machine-config-operator-74547568cd-hgcbk\" (UID: \"391a89e2-2467-4f7e-aa8d-d4c939845a67\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-hgcbk" Sep 30 07:35:57 crc kubenswrapper[4760]: I0930 07:35:57.448933 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: 
\"kubernetes.io/host-path/de49f8fe-30f8-44ae-beaa-fe61cf7b0a16-registration-dir\") pod \"csi-hostpathplugin-zrlmg\" (UID: \"de49f8fe-30f8-44ae-beaa-fe61cf7b0a16\") " pod="hostpath-provisioner/csi-hostpathplugin-zrlmg" Sep 30 07:35:57 crc kubenswrapper[4760]: I0930 07:35:57.448951 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/85616cb5-98c5-4296-858a-462f8ca42702-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-n5tk5\" (UID: \"85616cb5-98c5-4296-858a-462f8ca42702\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-n5tk5" Sep 30 07:35:57 crc kubenswrapper[4760]: I0930 07:35:57.448992 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/db316304-e02f-43df-b4e2-6cdd6ee3b7eb-secret-volume\") pod \"collect-profiles-29320290-w8n6p\" (UID: \"db316304-e02f-43df-b4e2-6cdd6ee3b7eb\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320290-w8n6p" Sep 30 07:35:57 crc kubenswrapper[4760]: I0930 07:35:57.449010 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/de49f8fe-30f8-44ae-beaa-fe61cf7b0a16-csi-data-dir\") pod \"csi-hostpathplugin-zrlmg\" (UID: \"de49f8fe-30f8-44ae-beaa-fe61cf7b0a16\") " pod="hostpath-provisioner/csi-hostpathplugin-zrlmg" Sep 30 07:35:57 crc kubenswrapper[4760]: I0930 07:35:57.449039 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/cd4422f1-0405-44b0-9256-fec03b6dc2f0-ca-trust-extracted\") pod \"image-registry-697d97f7c8-vgrsq\" (UID: \"cd4422f1-0405-44b0-9256-fec03b6dc2f0\") " pod="openshift-image-registry/image-registry-697d97f7c8-vgrsq" Sep 30 07:35:57 crc kubenswrapper[4760]: I0930 07:35:57.449056 4760 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-fzx7d\" (UniqueName: \"kubernetes.io/projected/23c1276d-8d7b-4ffa-9290-3bd09756c660-kube-api-access-fzx7d\") pod \"machine-config-controller-84d6567774-kkfmw\" (UID: \"23c1276d-8d7b-4ffa-9290-3bd09756c660\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-kkfmw" Sep 30 07:35:57 crc kubenswrapper[4760]: I0930 07:35:57.449073 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/23c1276d-8d7b-4ffa-9290-3bd09756c660-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-kkfmw\" (UID: \"23c1276d-8d7b-4ffa-9290-3bd09756c660\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-kkfmw" Sep 30 07:35:57 crc kubenswrapper[4760]: I0930 07:35:57.449088 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/391a89e2-2467-4f7e-aa8d-d4c939845a67-auth-proxy-config\") pod \"machine-config-operator-74547568cd-hgcbk\" (UID: \"391a89e2-2467-4f7e-aa8d-d4c939845a67\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-hgcbk" Sep 30 07:35:57 crc kubenswrapper[4760]: I0930 07:35:57.449102 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/db316304-e02f-43df-b4e2-6cdd6ee3b7eb-config-volume\") pod \"collect-profiles-29320290-w8n6p\" (UID: \"db316304-e02f-43df-b4e2-6cdd6ee3b7eb\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320290-w8n6p" Sep 30 07:35:57 crc kubenswrapper[4760]: I0930 07:35:57.449116 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/86c4ea2a-21a5-46e4-bfd9-28d7a429b7ae-metrics-tls\") pod \"dns-default-76w2l\" (UID: 
\"86c4ea2a-21a5-46e4-bfd9-28d7a429b7ae\") " pod="openshift-dns/dns-default-76w2l" Sep 30 07:35:57 crc kubenswrapper[4760]: I0930 07:35:57.449193 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/6fddba3c-1ff7-42f4-99da-8f282c6095fc-trusted-ca\") pod \"ingress-operator-5b745b69d9-zpzhj\" (UID: \"6fddba3c-1ff7-42f4-99da-8f282c6095fc\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-zpzhj" Sep 30 07:35:57 crc kubenswrapper[4760]: I0930 07:35:57.449220 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5gpn2\" (UniqueName: \"kubernetes.io/projected/9873ee29-2db0-462b-8c6c-6efc009193fa-kube-api-access-5gpn2\") pod \"package-server-manager-789f6589d5-9gnwv\" (UID: \"9873ee29-2db0-462b-8c6c-6efc009193fa\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-9gnwv" Sep 30 07:35:57 crc kubenswrapper[4760]: I0930 07:35:57.449242 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/de49f8fe-30f8-44ae-beaa-fe61cf7b0a16-socket-dir\") pod \"csi-hostpathplugin-zrlmg\" (UID: \"de49f8fe-30f8-44ae-beaa-fe61cf7b0a16\") " pod="hostpath-provisioner/csi-hostpathplugin-zrlmg" Sep 30 07:35:57 crc kubenswrapper[4760]: I0930 07:35:57.449274 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/84902f21-f09f-4c6b-b92f-1569d4aa2fcc-profile-collector-cert\") pod \"catalog-operator-68c6474976-pn92b\" (UID: \"84902f21-f09f-4c6b-b92f-1569d4aa2fcc\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-pn92b" Sep 30 07:35:57 crc kubenswrapper[4760]: I0930 07:35:57.449291 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-bootstrap-token\" (UniqueName: 
\"kubernetes.io/secret/4f24c041-1e7c-4c07-8139-d8d47f9d3539-node-bootstrap-token\") pod \"machine-config-server-wtpqg\" (UID: \"4f24c041-1e7c-4c07-8139-d8d47f9d3539\") " pod="openshift-machine-config-operator/machine-config-server-wtpqg" Sep 30 07:35:57 crc kubenswrapper[4760]: I0930 07:35:57.449335 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/cd4422f1-0405-44b0-9256-fec03b6dc2f0-bound-sa-token\") pod \"image-registry-697d97f7c8-vgrsq\" (UID: \"cd4422f1-0405-44b0-9256-fec03b6dc2f0\") " pod="openshift-image-registry/image-registry-697d97f7c8-vgrsq" Sep 30 07:35:57 crc kubenswrapper[4760]: I0930 07:35:57.449351 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/6fddba3c-1ff7-42f4-99da-8f282c6095fc-bound-sa-token\") pod \"ingress-operator-5b745b69d9-zpzhj\" (UID: \"6fddba3c-1ff7-42f4-99da-8f282c6095fc\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-zpzhj" Sep 30 07:35:57 crc kubenswrapper[4760]: I0930 07:35:57.449406 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5r4xr\" (UniqueName: \"kubernetes.io/projected/cd4422f1-0405-44b0-9256-fec03b6dc2f0-kube-api-access-5r4xr\") pod \"image-registry-697d97f7c8-vgrsq\" (UID: \"cd4422f1-0405-44b0-9256-fec03b6dc2f0\") " pod="openshift-image-registry/image-registry-697d97f7c8-vgrsq" Sep 30 07:35:57 crc kubenswrapper[4760]: I0930 07:35:57.449425 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vsr5t\" (UniqueName: \"kubernetes.io/projected/de49f8fe-30f8-44ae-beaa-fe61cf7b0a16-kube-api-access-vsr5t\") pod \"csi-hostpathplugin-zrlmg\" (UID: \"de49f8fe-30f8-44ae-beaa-fe61cf7b0a16\") " pod="hostpath-provisioner/csi-hostpathplugin-zrlmg" Sep 30 07:35:57 crc kubenswrapper[4760]: I0930 07:35:57.449461 4760 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e57ef7b7-e3ef-4d7b-8ad0-2c5015fa2210-config\") pod \"etcd-operator-b45778765-ptzmt\" (UID: \"e57ef7b7-e3ef-4d7b-8ad0-2c5015fa2210\") " pod="openshift-etcd-operator/etcd-operator-b45778765-ptzmt" Sep 30 07:35:57 crc kubenswrapper[4760]: I0930 07:35:57.449480 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9q62t\" (UniqueName: \"kubernetes.io/projected/e57ef7b7-e3ef-4d7b-8ad0-2c5015fa2210-kube-api-access-9q62t\") pod \"etcd-operator-b45778765-ptzmt\" (UID: \"e57ef7b7-e3ef-4d7b-8ad0-2c5015fa2210\") " pod="openshift-etcd-operator/etcd-operator-b45778765-ptzmt" Sep 30 07:35:57 crc kubenswrapper[4760]: I0930 07:35:57.449528 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/32814054-fdc5-432f-a901-43f758af1b44-metrics-tls\") pod \"dns-operator-744455d44c-7278d\" (UID: \"32814054-fdc5-432f-a901-43f758af1b44\") " pod="openshift-dns-operator/dns-operator-744455d44c-7278d" Sep 30 07:35:57 crc kubenswrapper[4760]: I0930 07:35:57.449555 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/84902f21-f09f-4c6b-b92f-1569d4aa2fcc-srv-cert\") pod \"catalog-operator-68c6474976-pn92b\" (UID: \"84902f21-f09f-4c6b-b92f-1569d4aa2fcc\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-pn92b" Sep 30 07:35:57 crc kubenswrapper[4760]: I0930 07:35:57.449571 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/9873ee29-2db0-462b-8c6c-6efc009193fa-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-9gnwv\" (UID: \"9873ee29-2db0-462b-8c6c-6efc009193fa\") " 
pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-9gnwv" Sep 30 07:35:57 crc kubenswrapper[4760]: I0930 07:35:57.449610 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9lwx6\" (UniqueName: \"kubernetes.io/projected/86c4ea2a-21a5-46e4-bfd9-28d7a429b7ae-kube-api-access-9lwx6\") pod \"dns-default-76w2l\" (UID: \"86c4ea2a-21a5-46e4-bfd9-28d7a429b7ae\") " pod="openshift-dns/dns-default-76w2l" Sep 30 07:35:57 crc kubenswrapper[4760]: I0930 07:35:57.449628 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sq72b\" (UniqueName: \"kubernetes.io/projected/d2195fee-f00a-408c-a44d-c74b59078ad7-kube-api-access-sq72b\") pod \"ingress-canary-82mws\" (UID: \"d2195fee-f00a-408c-a44d-c74b59078ad7\") " pod="openshift-ingress-canary/ingress-canary-82mws" Sep 30 07:35:57 crc kubenswrapper[4760]: I0930 07:35:57.449646 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3e98f7cd-47e7-4344-abc2-845b404874a4-serving-cert\") pod \"console-operator-58897d9998-zwnlf\" (UID: \"3e98f7cd-47e7-4344-abc2-845b404874a4\") " pod="openshift-console-operator/console-operator-58897d9998-zwnlf" Sep 30 07:35:57 crc kubenswrapper[4760]: I0930 07:35:57.449662 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/6fddba3c-1ff7-42f4-99da-8f282c6095fc-metrics-tls\") pod \"ingress-operator-5b745b69d9-zpzhj\" (UID: \"6fddba3c-1ff7-42f4-99da-8f282c6095fc\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-zpzhj" Sep 30 07:35:57 crc kubenswrapper[4760]: I0930 07:35:57.449697 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/eb4bc73c-b474-4efa-b348-63ff26045c24-metrics-certs\") pod \"router-default-5444994796-vjd5w\" 
(UID: \"eb4bc73c-b474-4efa-b348-63ff26045c24\") " pod="openshift-ingress/router-default-5444994796-vjd5w" Sep 30 07:35:57 crc kubenswrapper[4760]: I0930 07:35:57.449731 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/cd4422f1-0405-44b0-9256-fec03b6dc2f0-installation-pull-secrets\") pod \"image-registry-697d97f7c8-vgrsq\" (UID: \"cd4422f1-0405-44b0-9256-fec03b6dc2f0\") " pod="openshift-image-registry/image-registry-697d97f7c8-vgrsq" Sep 30 07:35:57 crc kubenswrapper[4760]: I0930 07:35:57.449746 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d2195fee-f00a-408c-a44d-c74b59078ad7-cert\") pod \"ingress-canary-82mws\" (UID: \"d2195fee-f00a-408c-a44d-c74b59078ad7\") " pod="openshift-ingress-canary/ingress-canary-82mws" Sep 30 07:35:57 crc kubenswrapper[4760]: I0930 07:35:57.449784 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/de49f8fe-30f8-44ae-beaa-fe61cf7b0a16-plugins-dir\") pod \"csi-hostpathplugin-zrlmg\" (UID: \"de49f8fe-30f8-44ae-beaa-fe61cf7b0a16\") " pod="hostpath-provisioner/csi-hostpathplugin-zrlmg" Sep 30 07:35:57 crc kubenswrapper[4760]: I0930 07:35:57.449809 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/e57ef7b7-e3ef-4d7b-8ad0-2c5015fa2210-etcd-client\") pod \"etcd-operator-b45778765-ptzmt\" (UID: \"e57ef7b7-e3ef-4d7b-8ad0-2c5015fa2210\") " pod="openshift-etcd-operator/etcd-operator-b45778765-ptzmt" Sep 30 07:35:57 crc kubenswrapper[4760]: I0930 07:35:57.449845 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6n729\" (UniqueName: \"kubernetes.io/projected/3e98f7cd-47e7-4344-abc2-845b404874a4-kube-api-access-6n729\") pod 
\"console-operator-58897d9998-zwnlf\" (UID: \"3e98f7cd-47e7-4344-abc2-845b404874a4\") " pod="openshift-console-operator/console-operator-58897d9998-zwnlf" Sep 30 07:35:57 crc kubenswrapper[4760]: I0930 07:35:57.449861 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e57ef7b7-e3ef-4d7b-8ad0-2c5015fa2210-serving-cert\") pod \"etcd-operator-b45778765-ptzmt\" (UID: \"e57ef7b7-e3ef-4d7b-8ad0-2c5015fa2210\") " pod="openshift-etcd-operator/etcd-operator-b45778765-ptzmt" Sep 30 07:35:57 crc kubenswrapper[4760]: I0930 07:35:57.449880 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hnrzj\" (UniqueName: \"kubernetes.io/projected/32814054-fdc5-432f-a901-43f758af1b44-kube-api-access-hnrzj\") pod \"dns-operator-744455d44c-7278d\" (UID: \"32814054-fdc5-432f-a901-43f758af1b44\") " pod="openshift-dns-operator/dns-operator-744455d44c-7278d" Sep 30 07:35:57 crc kubenswrapper[4760]: I0930 07:35:57.449897 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s4vvx\" (UniqueName: \"kubernetes.io/projected/db316304-e02f-43df-b4e2-6cdd6ee3b7eb-kube-api-access-s4vvx\") pod \"collect-profiles-29320290-w8n6p\" (UID: \"db316304-e02f-43df-b4e2-6cdd6ee3b7eb\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320290-w8n6p" Sep 30 07:35:57 crc kubenswrapper[4760]: I0930 07:35:57.449917 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c6sg7\" (UniqueName: \"kubernetes.io/projected/84902f21-f09f-4c6b-b92f-1569d4aa2fcc-kube-api-access-c6sg7\") pod \"catalog-operator-68c6474976-pn92b\" (UID: \"84902f21-f09f-4c6b-b92f-1569d4aa2fcc\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-pn92b" Sep 30 07:35:57 crc kubenswrapper[4760]: I0930 07:35:57.449939 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"webhook-cert\" (UniqueName: \"kubernetes.io/secret/7a196114-d286-4138-8ffc-baeb5ecc02df-webhook-cert\") pod \"packageserver-d55dfcdfc-2fn4r\" (UID: \"7a196114-d286-4138-8ffc-baeb5ecc02df\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-2fn4r" Sep 30 07:35:57 crc kubenswrapper[4760]: I0930 07:35:57.449974 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/cd4422f1-0405-44b0-9256-fec03b6dc2f0-registry-certificates\") pod \"image-registry-697d97f7c8-vgrsq\" (UID: \"cd4422f1-0405-44b0-9256-fec03b6dc2f0\") " pod="openshift-image-registry/image-registry-697d97f7c8-vgrsq" Sep 30 07:35:57 crc kubenswrapper[4760]: I0930 07:35:57.449994 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/23c1276d-8d7b-4ffa-9290-3bd09756c660-proxy-tls\") pod \"machine-config-controller-84d6567774-kkfmw\" (UID: \"23c1276d-8d7b-4ffa-9290-3bd09756c660\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-kkfmw" Sep 30 07:35:57 crc kubenswrapper[4760]: I0930 07:35:57.450020 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/de49f8fe-30f8-44ae-beaa-fe61cf7b0a16-mountpoint-dir\") pod \"csi-hostpathplugin-zrlmg\" (UID: \"de49f8fe-30f8-44ae-beaa-fe61cf7b0a16\") " pod="hostpath-provisioner/csi-hostpathplugin-zrlmg" Sep 30 07:35:57 crc kubenswrapper[4760]: I0930 07:35:57.450049 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3e98f7cd-47e7-4344-abc2-845b404874a4-config\") pod \"console-operator-58897d9998-zwnlf\" (UID: \"3e98f7cd-47e7-4344-abc2-845b404874a4\") " pod="openshift-console-operator/console-operator-58897d9998-zwnlf" Sep 30 07:35:57 crc kubenswrapper[4760]: I0930 07:35:57.450065 4760 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/eb4bc73c-b474-4efa-b348-63ff26045c24-stats-auth\") pod \"router-default-5444994796-vjd5w\" (UID: \"eb4bc73c-b474-4efa-b348-63ff26045c24\") " pod="openshift-ingress/router-default-5444994796-vjd5w" Sep 30 07:35:57 crc kubenswrapper[4760]: I0930 07:35:57.450081 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/4f24c041-1e7c-4c07-8139-d8d47f9d3539-certs\") pod \"machine-config-server-wtpqg\" (UID: \"4f24c041-1e7c-4c07-8139-d8d47f9d3539\") " pod="openshift-machine-config-operator/machine-config-server-wtpqg" Sep 30 07:35:57 crc kubenswrapper[4760]: I0930 07:35:57.450106 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/cd4422f1-0405-44b0-9256-fec03b6dc2f0-trusted-ca\") pod \"image-registry-697d97f7c8-vgrsq\" (UID: \"cd4422f1-0405-44b0-9256-fec03b6dc2f0\") " pod="openshift-image-registry/image-registry-697d97f7c8-vgrsq" Sep 30 07:35:57 crc kubenswrapper[4760]: I0930 07:35:57.450121 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ac72e4c7-2db7-411a-8f4b-28687be463f3-config\") pod \"kube-apiserver-operator-766d6c64bb-qgs9v\" (UID: \"ac72e4c7-2db7-411a-8f4b-28687be463f3\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-qgs9v" Sep 30 07:35:57 crc kubenswrapper[4760]: I0930 07:35:57.450154 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/391a89e2-2467-4f7e-aa8d-d4c939845a67-images\") pod \"machine-config-operator-74547568cd-hgcbk\" (UID: \"391a89e2-2467-4f7e-aa8d-d4c939845a67\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-hgcbk" Sep 30 07:35:57 crc 
kubenswrapper[4760]: I0930 07:35:57.450172 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/e57ef7b7-e3ef-4d7b-8ad0-2c5015fa2210-etcd-ca\") pod \"etcd-operator-b45778765-ptzmt\" (UID: \"e57ef7b7-e3ef-4d7b-8ad0-2c5015fa2210\") " pod="openshift-etcd-operator/etcd-operator-b45778765-ptzmt" Sep 30 07:35:57 crc kubenswrapper[4760]: I0930 07:35:57.450190 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ac72e4c7-2db7-411a-8f4b-28687be463f3-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-qgs9v\" (UID: \"ac72e4c7-2db7-411a-8f4b-28687be463f3\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-qgs9v" Sep 30 07:35:57 crc kubenswrapper[4760]: I0930 07:35:57.450207 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k8qh7\" (UniqueName: \"kubernetes.io/projected/85616cb5-98c5-4296-858a-462f8ca42702-kube-api-access-k8qh7\") pod \"multus-admission-controller-857f4d67dd-n5tk5\" (UID: \"85616cb5-98c5-4296-858a-462f8ca42702\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-n5tk5" Sep 30 07:35:57 crc kubenswrapper[4760]: I0930 07:35:57.450224 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rqlvh\" (UniqueName: \"kubernetes.io/projected/391a89e2-2467-4f7e-aa8d-d4c939845a67-kube-api-access-rqlvh\") pod \"machine-config-operator-74547568cd-hgcbk\" (UID: \"391a89e2-2467-4f7e-aa8d-d4c939845a67\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-hgcbk" Sep 30 07:35:57 crc kubenswrapper[4760]: I0930 07:35:57.450246 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/3e98f7cd-47e7-4344-abc2-845b404874a4-trusted-ca\") pod \"console-operator-58897d9998-zwnlf\" (UID: 
\"3e98f7cd-47e7-4344-abc2-845b404874a4\") " pod="openshift-console-operator/console-operator-58897d9998-zwnlf" Sep 30 07:35:57 crc kubenswrapper[4760]: I0930 07:35:57.450261 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/7a196114-d286-4138-8ffc-baeb5ecc02df-tmpfs\") pod \"packageserver-d55dfcdfc-2fn4r\" (UID: \"7a196114-d286-4138-8ffc-baeb5ecc02df\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-2fn4r" Sep 30 07:35:57 crc kubenswrapper[4760]: I0930 07:35:57.450276 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p6k2w\" (UniqueName: \"kubernetes.io/projected/4f24c041-1e7c-4c07-8139-d8d47f9d3539-kube-api-access-p6k2w\") pod \"machine-config-server-wtpqg\" (UID: \"4f24c041-1e7c-4c07-8139-d8d47f9d3539\") " pod="openshift-machine-config-operator/machine-config-server-wtpqg" Sep 30 07:35:57 crc kubenswrapper[4760]: I0930 07:35:57.450310 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/cd4422f1-0405-44b0-9256-fec03b6dc2f0-registry-tls\") pod \"image-registry-697d97f7c8-vgrsq\" (UID: \"cd4422f1-0405-44b0-9256-fec03b6dc2f0\") " pod="openshift-image-registry/image-registry-697d97f7c8-vgrsq" Sep 30 07:35:57 crc kubenswrapper[4760]: I0930 07:35:57.450348 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/e57ef7b7-e3ef-4d7b-8ad0-2c5015fa2210-etcd-service-ca\") pod \"etcd-operator-b45778765-ptzmt\" (UID: \"e57ef7b7-e3ef-4d7b-8ad0-2c5015fa2210\") " pod="openshift-etcd-operator/etcd-operator-b45778765-ptzmt" Sep 30 07:35:57 crc kubenswrapper[4760]: I0930 07:35:57.450903 4760 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-s7mth" Sep 30 07:35:57 crc kubenswrapper[4760]: I0930 07:35:57.457006 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/eb4bc73c-b474-4efa-b348-63ff26045c24-service-ca-bundle\") pod \"router-default-5444994796-vjd5w\" (UID: \"eb4bc73c-b474-4efa-b348-63ff26045c24\") " pod="openshift-ingress/router-default-5444994796-vjd5w" Sep 30 07:35:57 crc kubenswrapper[4760]: I0930 07:35:57.459933 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-558db77b4-77ttc" Sep 30 07:35:57 crc kubenswrapper[4760]: I0930 07:35:57.460875 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/7a196114-d286-4138-8ffc-baeb5ecc02df-apiservice-cert\") pod \"packageserver-d55dfcdfc-2fn4r\" (UID: \"7a196114-d286-4138-8ffc-baeb5ecc02df\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-2fn4r" Sep 30 07:35:57 crc kubenswrapper[4760]: I0930 07:35:57.461888 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/eb4bc73c-b474-4efa-b348-63ff26045c24-default-certificate\") pod \"router-default-5444994796-vjd5w\" (UID: \"eb4bc73c-b474-4efa-b348-63ff26045c24\") " pod="openshift-ingress/router-default-5444994796-vjd5w" Sep 30 07:35:57 crc kubenswrapper[4760]: I0930 07:35:57.462018 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e57ef7b7-e3ef-4d7b-8ad0-2c5015fa2210-config\") pod \"etcd-operator-b45778765-ptzmt\" (UID: \"e57ef7b7-e3ef-4d7b-8ad0-2c5015fa2210\") " pod="openshift-etcd-operator/etcd-operator-b45778765-ptzmt" Sep 30 07:35:57 crc kubenswrapper[4760]: I0930 07:35:57.462946 4760 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/cd4422f1-0405-44b0-9256-fec03b6dc2f0-ca-trust-extracted\") pod \"image-registry-697d97f7c8-vgrsq\" (UID: \"cd4422f1-0405-44b0-9256-fec03b6dc2f0\") " pod="openshift-image-registry/image-registry-697d97f7c8-vgrsq" Sep 30 07:35:57 crc kubenswrapper[4760]: I0930 07:35:57.463699 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/cd4422f1-0405-44b0-9256-fec03b6dc2f0-registry-certificates\") pod \"image-registry-697d97f7c8-vgrsq\" (UID: \"cd4422f1-0405-44b0-9256-fec03b6dc2f0\") " pod="openshift-image-registry/image-registry-697d97f7c8-vgrsq" Sep 30 07:35:57 crc kubenswrapper[4760]: I0930 07:35:57.470288 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/7a196114-d286-4138-8ffc-baeb5ecc02df-webhook-cert\") pod \"packageserver-d55dfcdfc-2fn4r\" (UID: \"7a196114-d286-4138-8ffc-baeb5ecc02df\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-2fn4r" Sep 30 07:35:57 crc kubenswrapper[4760]: I0930 07:35:57.470585 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/6fddba3c-1ff7-42f4-99da-8f282c6095fc-trusted-ca\") pod \"ingress-operator-5b745b69d9-zpzhj\" (UID: \"6fddba3c-1ff7-42f4-99da-8f282c6095fc\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-zpzhj" Sep 30 07:35:57 crc kubenswrapper[4760]: I0930 07:35:57.471070 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/9873ee29-2db0-462b-8c6c-6efc009193fa-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-9gnwv\" (UID: \"9873ee29-2db0-462b-8c6c-6efc009193fa\") " 
pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-9gnwv" Sep 30 07:35:57 crc kubenswrapper[4760]: I0930 07:35:57.471496 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/23c1276d-8d7b-4ffa-9290-3bd09756c660-proxy-tls\") pod \"machine-config-controller-84d6567774-kkfmw\" (UID: \"23c1276d-8d7b-4ffa-9290-3bd09756c660\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-kkfmw" Sep 30 07:35:57 crc kubenswrapper[4760]: I0930 07:35:57.471893 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/84902f21-f09f-4c6b-b92f-1569d4aa2fcc-profile-collector-cert\") pod \"catalog-operator-68c6474976-pn92b\" (UID: \"84902f21-f09f-4c6b-b92f-1569d4aa2fcc\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-pn92b" Sep 30 07:35:57 crc kubenswrapper[4760]: I0930 07:35:57.472691 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/cd4422f1-0405-44b0-9256-fec03b6dc2f0-installation-pull-secrets\") pod \"image-registry-697d97f7c8-vgrsq\" (UID: \"cd4422f1-0405-44b0-9256-fec03b6dc2f0\") " pod="openshift-image-registry/image-registry-697d97f7c8-vgrsq" Sep 30 07:35:57 crc kubenswrapper[4760]: I0930 07:35:57.473511 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/23c1276d-8d7b-4ffa-9290-3bd09756c660-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-kkfmw\" (UID: \"23c1276d-8d7b-4ffa-9290-3bd09756c660\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-kkfmw" Sep 30 07:35:57 crc kubenswrapper[4760]: I0930 07:35:57.473717 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: 
\"kubernetes.io/secret/d2195fee-f00a-408c-a44d-c74b59078ad7-cert\") pod \"ingress-canary-82mws\" (UID: \"d2195fee-f00a-408c-a44d-c74b59078ad7\") " pod="openshift-ingress-canary/ingress-canary-82mws" Sep 30 07:35:57 crc kubenswrapper[4760]: I0930 07:35:57.475491 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/391a89e2-2467-4f7e-aa8d-d4c939845a67-auth-proxy-config\") pod \"machine-config-operator-74547568cd-hgcbk\" (UID: \"391a89e2-2467-4f7e-aa8d-d4c939845a67\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-hgcbk" Sep 30 07:35:57 crc kubenswrapper[4760]: I0930 07:35:57.475816 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/7a196114-d286-4138-8ffc-baeb5ecc02df-tmpfs\") pod \"packageserver-d55dfcdfc-2fn4r\" (UID: \"7a196114-d286-4138-8ffc-baeb5ecc02df\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-2fn4r" Sep 30 07:35:57 crc kubenswrapper[4760]: I0930 07:35:57.475948 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/db316304-e02f-43df-b4e2-6cdd6ee3b7eb-config-volume\") pod \"collect-profiles-29320290-w8n6p\" (UID: \"db316304-e02f-43df-b4e2-6cdd6ee3b7eb\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320290-w8n6p" Sep 30 07:35:57 crc kubenswrapper[4760]: I0930 07:35:57.476204 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/e57ef7b7-e3ef-4d7b-8ad0-2c5015fa2210-etcd-ca\") pod \"etcd-operator-b45778765-ptzmt\" (UID: \"e57ef7b7-e3ef-4d7b-8ad0-2c5015fa2210\") " pod="openshift-etcd-operator/etcd-operator-b45778765-ptzmt" Sep 30 07:35:57 crc kubenswrapper[4760]: I0930 07:35:57.476986 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/3e98f7cd-47e7-4344-abc2-845b404874a4-config\") pod \"console-operator-58897d9998-zwnlf\" (UID: \"3e98f7cd-47e7-4344-abc2-845b404874a4\") " pod="openshift-console-operator/console-operator-58897d9998-zwnlf" Sep 30 07:35:57 crc kubenswrapper[4760]: I0930 07:35:57.478013 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/84902f21-f09f-4c6b-b92f-1569d4aa2fcc-srv-cert\") pod \"catalog-operator-68c6474976-pn92b\" (UID: \"84902f21-f09f-4c6b-b92f-1569d4aa2fcc\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-pn92b" Sep 30 07:35:57 crc kubenswrapper[4760]: I0930 07:35:57.478460 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ac72e4c7-2db7-411a-8f4b-28687be463f3-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-qgs9v\" (UID: \"ac72e4c7-2db7-411a-8f4b-28687be463f3\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-qgs9v" Sep 30 07:35:57 crc kubenswrapper[4760]: I0930 07:35:57.479548 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/3e98f7cd-47e7-4344-abc2-845b404874a4-trusted-ca\") pod \"console-operator-58897d9998-zwnlf\" (UID: \"3e98f7cd-47e7-4344-abc2-845b404874a4\") " pod="openshift-console-operator/console-operator-58897d9998-zwnlf" Sep 30 07:35:57 crc kubenswrapper[4760]: I0930 07:35:57.487514 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/eb4bc73c-b474-4efa-b348-63ff26045c24-metrics-certs\") pod \"router-default-5444994796-vjd5w\" (UID: \"eb4bc73c-b474-4efa-b348-63ff26045c24\") " pod="openshift-ingress/router-default-5444994796-vjd5w" Sep 30 07:35:57 crc kubenswrapper[4760]: I0930 07:35:57.490475 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" 
(UniqueName: \"kubernetes.io/secret/e57ef7b7-e3ef-4d7b-8ad0-2c5015fa2210-etcd-client\") pod \"etcd-operator-b45778765-ptzmt\" (UID: \"e57ef7b7-e3ef-4d7b-8ad0-2c5015fa2210\") " pod="openshift-etcd-operator/etcd-operator-b45778765-ptzmt" Sep 30 07:35:57 crc kubenswrapper[4760]: I0930 07:35:57.490776 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/391a89e2-2467-4f7e-aa8d-d4c939845a67-proxy-tls\") pod \"machine-config-operator-74547568cd-hgcbk\" (UID: \"391a89e2-2467-4f7e-aa8d-d4c939845a67\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-hgcbk" Sep 30 07:35:57 crc kubenswrapper[4760]: I0930 07:35:57.490928 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/db316304-e02f-43df-b4e2-6cdd6ee3b7eb-secret-volume\") pod \"collect-profiles-29320290-w8n6p\" (UID: \"db316304-e02f-43df-b4e2-6cdd6ee3b7eb\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320290-w8n6p" Sep 30 07:35:57 crc kubenswrapper[4760]: I0930 07:35:57.491452 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3e98f7cd-47e7-4344-abc2-845b404874a4-serving-cert\") pod \"console-operator-58897d9998-zwnlf\" (UID: \"3e98f7cd-47e7-4344-abc2-845b404874a4\") " pod="openshift-console-operator/console-operator-58897d9998-zwnlf" Sep 30 07:35:57 crc kubenswrapper[4760]: I0930 07:35:57.492900 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/6fddba3c-1ff7-42f4-99da-8f282c6095fc-metrics-tls\") pod \"ingress-operator-5b745b69d9-zpzhj\" (UID: \"6fddba3c-1ff7-42f4-99da-8f282c6095fc\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-zpzhj" Sep 30 07:35:57 crc kubenswrapper[4760]: I0930 07:35:57.496661 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-dpszj\" (UniqueName: \"kubernetes.io/projected/eb4bc73c-b474-4efa-b348-63ff26045c24-kube-api-access-dpszj\") pod \"router-default-5444994796-vjd5w\" (UID: \"eb4bc73c-b474-4efa-b348-63ff26045c24\") " pod="openshift-ingress/router-default-5444994796-vjd5w" Sep 30 07:35:57 crc kubenswrapper[4760]: I0930 07:35:57.501502 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/391a89e2-2467-4f7e-aa8d-d4c939845a67-images\") pod \"machine-config-operator-74547568cd-hgcbk\" (UID: \"391a89e2-2467-4f7e-aa8d-d4c939845a67\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-hgcbk" Sep 30 07:35:57 crc kubenswrapper[4760]: I0930 07:35:57.501779 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/cd4422f1-0405-44b0-9256-fec03b6dc2f0-trusted-ca\") pod \"image-registry-697d97f7c8-vgrsq\" (UID: \"cd4422f1-0405-44b0-9256-fec03b6dc2f0\") " pod="openshift-image-registry/image-registry-697d97f7c8-vgrsq" Sep 30 07:35:57 crc kubenswrapper[4760]: I0930 07:35:57.501874 4760 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-z9mdp" Sep 30 07:35:57 crc kubenswrapper[4760]: I0930 07:35:57.502423 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ac72e4c7-2db7-411a-8f4b-28687be463f3-config\") pod \"kube-apiserver-operator-766d6c64bb-qgs9v\" (UID: \"ac72e4c7-2db7-411a-8f4b-28687be463f3\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-qgs9v" Sep 30 07:35:57 crc kubenswrapper[4760]: I0930 07:35:57.503072 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/85616cb5-98c5-4296-858a-462f8ca42702-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-n5tk5\" (UID: \"85616cb5-98c5-4296-858a-462f8ca42702\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-n5tk5" Sep 30 07:35:57 crc kubenswrapper[4760]: I0930 07:35:57.504707 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/eb4bc73c-b474-4efa-b348-63ff26045c24-stats-auth\") pod \"router-default-5444994796-vjd5w\" (UID: \"eb4bc73c-b474-4efa-b348-63ff26045c24\") " pod="openshift-ingress/router-default-5444994796-vjd5w" Sep 30 07:35:57 crc kubenswrapper[4760]: I0930 07:35:57.505112 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e57ef7b7-e3ef-4d7b-8ad0-2c5015fa2210-serving-cert\") pod \"etcd-operator-b45778765-ptzmt\" (UID: \"e57ef7b7-e3ef-4d7b-8ad0-2c5015fa2210\") " pod="openshift-etcd-operator/etcd-operator-b45778765-ptzmt" Sep 30 07:35:57 crc kubenswrapper[4760]: I0930 07:35:57.505873 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/cd4422f1-0405-44b0-9256-fec03b6dc2f0-registry-tls\") pod \"image-registry-697d97f7c8-vgrsq\" (UID: 
\"cd4422f1-0405-44b0-9256-fec03b6dc2f0\") " pod="openshift-image-registry/image-registry-697d97f7c8-vgrsq" Sep 30 07:35:57 crc kubenswrapper[4760]: I0930 07:35:57.509064 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/e57ef7b7-e3ef-4d7b-8ad0-2c5015fa2210-etcd-service-ca\") pod \"etcd-operator-b45778765-ptzmt\" (UID: \"e57ef7b7-e3ef-4d7b-8ad0-2c5015fa2210\") " pod="openshift-etcd-operator/etcd-operator-b45778765-ptzmt" Sep 30 07:35:57 crc kubenswrapper[4760]: I0930 07:35:57.518406 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/32814054-fdc5-432f-a901-43f758af1b44-metrics-tls\") pod \"dns-operator-744455d44c-7278d\" (UID: \"32814054-fdc5-432f-a901-43f758af1b44\") " pod="openshift-dns-operator/dns-operator-744455d44c-7278d" Sep 30 07:35:57 crc kubenswrapper[4760]: I0930 07:35:57.537114 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-psh8s\" (UniqueName: \"kubernetes.io/projected/7a196114-d286-4138-8ffc-baeb5ecc02df-kube-api-access-psh8s\") pod \"packageserver-d55dfcdfc-2fn4r\" (UID: \"7a196114-d286-4138-8ffc-baeb5ecc02df\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-2fn4r" Sep 30 07:35:57 crc kubenswrapper[4760]: I0930 07:35:57.537142 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dp2gs\" (UniqueName: \"kubernetes.io/projected/3b428f41-cff0-421d-a763-987d15be26eb-kube-api-access-dp2gs\") pod \"downloads-7954f5f757-jx7lt\" (UID: \"3b428f41-cff0-421d-a763-987d15be26eb\") " pod="openshift-console/downloads-7954f5f757-jx7lt" Sep 30 07:35:57 crc kubenswrapper[4760]: I0930 07:35:57.540634 4760 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-5lscq" Sep 30 07:35:57 crc kubenswrapper[4760]: I0930 07:35:57.541165 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-gsddw"] Sep 30 07:35:57 crc kubenswrapper[4760]: I0930 07:35:57.547818 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/ac72e4c7-2db7-411a-8f4b-28687be463f3-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-qgs9v\" (UID: \"ac72e4c7-2db7-411a-8f4b-28687be463f3\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-qgs9v" Sep 30 07:35:57 crc kubenswrapper[4760]: I0930 07:35:57.551239 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/de49f8fe-30f8-44ae-beaa-fe61cf7b0a16-mountpoint-dir\") pod \"csi-hostpathplugin-zrlmg\" (UID: \"de49f8fe-30f8-44ae-beaa-fe61cf7b0a16\") " pod="hostpath-provisioner/csi-hostpathplugin-zrlmg" Sep 30 07:35:57 crc kubenswrapper[4760]: I0930 07:35:57.551276 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/4f24c041-1e7c-4c07-8139-d8d47f9d3539-certs\") pod \"machine-config-server-wtpqg\" (UID: \"4f24c041-1e7c-4c07-8139-d8d47f9d3539\") " pod="openshift-machine-config-operator/machine-config-server-wtpqg" Sep 30 07:35:57 crc kubenswrapper[4760]: I0930 07:35:57.551332 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p6k2w\" (UniqueName: \"kubernetes.io/projected/4f24c041-1e7c-4c07-8139-d8d47f9d3539-kube-api-access-p6k2w\") pod \"machine-config-server-wtpqg\" (UID: \"4f24c041-1e7c-4c07-8139-d8d47f9d3539\") " pod="openshift-machine-config-operator/machine-config-server-wtpqg" Sep 30 07:35:57 crc kubenswrapper[4760]: I0930 07:35:57.551363 4760 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/86c4ea2a-21a5-46e4-bfd9-28d7a429b7ae-config-volume\") pod \"dns-default-76w2l\" (UID: \"86c4ea2a-21a5-46e4-bfd9-28d7a429b7ae\") " pod="openshift-dns/dns-default-76w2l"
Sep 30 07:35:57 crc kubenswrapper[4760]: I0930 07:35:57.551385 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vgrsq\" (UID: \"cd4422f1-0405-44b0-9256-fec03b6dc2f0\") " pod="openshift-image-registry/image-registry-697d97f7c8-vgrsq"
Sep 30 07:35:57 crc kubenswrapper[4760]: I0930 07:35:57.551419 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/de49f8fe-30f8-44ae-beaa-fe61cf7b0a16-registration-dir\") pod \"csi-hostpathplugin-zrlmg\" (UID: \"de49f8fe-30f8-44ae-beaa-fe61cf7b0a16\") " pod="hostpath-provisioner/csi-hostpathplugin-zrlmg"
Sep 30 07:35:57 crc kubenswrapper[4760]: I0930 07:35:57.551438 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/de49f8fe-30f8-44ae-beaa-fe61cf7b0a16-csi-data-dir\") pod \"csi-hostpathplugin-zrlmg\" (UID: \"de49f8fe-30f8-44ae-beaa-fe61cf7b0a16\") " pod="hostpath-provisioner/csi-hostpathplugin-zrlmg"
Sep 30 07:35:57 crc kubenswrapper[4760]: I0930 07:35:57.551461 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/86c4ea2a-21a5-46e4-bfd9-28d7a429b7ae-metrics-tls\") pod \"dns-default-76w2l\" (UID: \"86c4ea2a-21a5-46e4-bfd9-28d7a429b7ae\") " pod="openshift-dns/dns-default-76w2l"
Sep 30 07:35:57 crc kubenswrapper[4760]: I0930 07:35:57.551489 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/de49f8fe-30f8-44ae-beaa-fe61cf7b0a16-socket-dir\") pod \"csi-hostpathplugin-zrlmg\" (UID: \"de49f8fe-30f8-44ae-beaa-fe61cf7b0a16\") " pod="hostpath-provisioner/csi-hostpathplugin-zrlmg"
Sep 30 07:35:57 crc kubenswrapper[4760]: I0930 07:35:57.551513 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/4f24c041-1e7c-4c07-8139-d8d47f9d3539-node-bootstrap-token\") pod \"machine-config-server-wtpqg\" (UID: \"4f24c041-1e7c-4c07-8139-d8d47f9d3539\") " pod="openshift-machine-config-operator/machine-config-server-wtpqg"
Sep 30 07:35:57 crc kubenswrapper[4760]: I0930 07:35:57.551542 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vsr5t\" (UniqueName: \"kubernetes.io/projected/de49f8fe-30f8-44ae-beaa-fe61cf7b0a16-kube-api-access-vsr5t\") pod \"csi-hostpathplugin-zrlmg\" (UID: \"de49f8fe-30f8-44ae-beaa-fe61cf7b0a16\") " pod="hostpath-provisioner/csi-hostpathplugin-zrlmg"
Sep 30 07:35:57 crc kubenswrapper[4760]: I0930 07:35:57.551586 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9lwx6\" (UniqueName: \"kubernetes.io/projected/86c4ea2a-21a5-46e4-bfd9-28d7a429b7ae-kube-api-access-9lwx6\") pod \"dns-default-76w2l\" (UID: \"86c4ea2a-21a5-46e4-bfd9-28d7a429b7ae\") " pod="openshift-dns/dns-default-76w2l"
Sep 30 07:35:57 crc kubenswrapper[4760]: I0930 07:35:57.551591 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/de49f8fe-30f8-44ae-beaa-fe61cf7b0a16-mountpoint-dir\") pod \"csi-hostpathplugin-zrlmg\" (UID: \"de49f8fe-30f8-44ae-beaa-fe61cf7b0a16\") " pod="hostpath-provisioner/csi-hostpathplugin-zrlmg"
Sep 30 07:35:57 crc kubenswrapper[4760]: I0930 07:35:57.551617 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/de49f8fe-30f8-44ae-beaa-fe61cf7b0a16-plugins-dir\") pod \"csi-hostpathplugin-zrlmg\" (UID: \"de49f8fe-30f8-44ae-beaa-fe61cf7b0a16\") " pod="hostpath-provisioner/csi-hostpathplugin-zrlmg"
Sep 30 07:35:57 crc kubenswrapper[4760]: I0930 07:35:57.551791 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/de49f8fe-30f8-44ae-beaa-fe61cf7b0a16-plugins-dir\") pod \"csi-hostpathplugin-zrlmg\" (UID: \"de49f8fe-30f8-44ae-beaa-fe61cf7b0a16\") " pod="hostpath-provisioner/csi-hostpathplugin-zrlmg"
Sep 30 07:35:57 crc kubenswrapper[4760]: I0930 07:35:57.551845 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/de49f8fe-30f8-44ae-beaa-fe61cf7b0a16-socket-dir\") pod \"csi-hostpathplugin-zrlmg\" (UID: \"de49f8fe-30f8-44ae-beaa-fe61cf7b0a16\") " pod="hostpath-provisioner/csi-hostpathplugin-zrlmg"
Sep 30 07:35:57 crc kubenswrapper[4760]: I0930 07:35:57.551868 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/de49f8fe-30f8-44ae-beaa-fe61cf7b0a16-registration-dir\") pod \"csi-hostpathplugin-zrlmg\" (UID: \"de49f8fe-30f8-44ae-beaa-fe61cf7b0a16\") " pod="hostpath-provisioner/csi-hostpathplugin-zrlmg"
Sep 30 07:35:57 crc kubenswrapper[4760]: I0930 07:35:57.552604 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/de49f8fe-30f8-44ae-beaa-fe61cf7b0a16-csi-data-dir\") pod \"csi-hostpathplugin-zrlmg\" (UID: \"de49f8fe-30f8-44ae-beaa-fe61cf7b0a16\") " pod="hostpath-provisioner/csi-hostpathplugin-zrlmg"
Sep 30 07:35:57 crc kubenswrapper[4760]: E0930 07:35:57.552906 4760 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 07:35:58.052890866 +0000 UTC m=+143.695797278 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-vgrsq" (UID: "cd4422f1-0405-44b0-9256-fec03b6dc2f0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Sep 30 07:35:57 crc kubenswrapper[4760]: I0930 07:35:57.553200 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/86c4ea2a-21a5-46e4-bfd9-28d7a429b7ae-config-volume\") pod \"dns-default-76w2l\" (UID: \"86c4ea2a-21a5-46e4-bfd9-28d7a429b7ae\") " pod="openshift-dns/dns-default-76w2l"
Sep 30 07:35:57 crc kubenswrapper[4760]: I0930 07:35:57.554632 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-mqkwn"
Sep 30 07:35:57 crc kubenswrapper[4760]: I0930 07:35:57.557147 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/secret/4f24c041-1e7c-4c07-8139-d8d47f9d3539-certs\") pod \"machine-config-server-wtpqg\" (UID: \"4f24c041-1e7c-4c07-8139-d8d47f9d3539\") " pod="openshift-machine-config-operator/machine-config-server-wtpqg"
Sep 30 07:35:57 crc kubenswrapper[4760]: I0930 07:35:57.560397 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/4f24c041-1e7c-4c07-8139-d8d47f9d3539-node-bootstrap-token\") pod \"machine-config-server-wtpqg\" (UID: \"4f24c041-1e7c-4c07-8139-d8d47f9d3539\") " pod="openshift-machine-config-operator/machine-config-server-wtpqg"
Sep 30 07:35:57 crc kubenswrapper[4760]: I0930 07:35:57.563267 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-2fn4r"
Sep 30 07:35:57 crc kubenswrapper[4760]: I0930 07:35:57.568957 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/86c4ea2a-21a5-46e4-bfd9-28d7a429b7ae-metrics-tls\") pod \"dns-default-76w2l\" (UID: \"86c4ea2a-21a5-46e4-bfd9-28d7a429b7ae\") " pod="openshift-dns/dns-default-76w2l"
Sep 30 07:35:57 crc kubenswrapper[4760]: I0930 07:35:57.585757 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-vjd5w"
Sep 30 07:35:57 crc kubenswrapper[4760]: I0930 07:35:57.588155 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6n729\" (UniqueName: \"kubernetes.io/projected/3e98f7cd-47e7-4344-abc2-845b404874a4-kube-api-access-6n729\") pod \"console-operator-58897d9998-zwnlf\" (UID: \"3e98f7cd-47e7-4344-abc2-845b404874a4\") " pod="openshift-console-operator/console-operator-58897d9998-zwnlf"
Sep 30 07:35:57 crc kubenswrapper[4760]: I0930 07:35:57.611334 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j2frr\" (UniqueName: \"kubernetes.io/projected/6fddba3c-1ff7-42f4-99da-8f282c6095fc-kube-api-access-j2frr\") pod \"ingress-operator-5b745b69d9-zpzhj\" (UID: \"6fddba3c-1ff7-42f4-99da-8f282c6095fc\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-zpzhj"
Sep 30 07:35:57 crc kubenswrapper[4760]: I0930 07:35:57.614411 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9q62t\" (UniqueName: \"kubernetes.io/projected/e57ef7b7-e3ef-4d7b-8ad0-2c5015fa2210-kube-api-access-9q62t\") pod \"etcd-operator-b45778765-ptzmt\" (UID: \"e57ef7b7-e3ef-4d7b-8ad0-2c5015fa2210\") " pod="openshift-etcd-operator/etcd-operator-b45778765-ptzmt"
Sep 30 07:35:57 crc kubenswrapper[4760]: I0930 07:35:57.640262 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-75sr2"]
Sep 30 07:35:57 crc kubenswrapper[4760]: I0930 07:35:57.650629 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hnrzj\" (UniqueName: \"kubernetes.io/projected/32814054-fdc5-432f-a901-43f758af1b44-kube-api-access-hnrzj\") pod \"dns-operator-744455d44c-7278d\" (UID: \"32814054-fdc5-432f-a901-43f758af1b44\") " pod="openshift-dns-operator/dns-operator-744455d44c-7278d"
Sep 30 07:35:57 crc kubenswrapper[4760]: I0930 07:35:57.653156 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Sep 30 07:35:57 crc kubenswrapper[4760]: E0930 07:35:57.653619 4760 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 07:35:58.153603874 +0000 UTC m=+143.796510286 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Sep 30 07:35:57 crc kubenswrapper[4760]: I0930 07:35:57.655374 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-tlrrq"]
Sep 30 07:35:57 crc kubenswrapper[4760]: I0930 07:35:57.669469 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fzx7d\" (UniqueName: \"kubernetes.io/projected/23c1276d-8d7b-4ffa-9290-3bd09756c660-kube-api-access-fzx7d\") pod \"machine-config-controller-84d6567774-kkfmw\" (UID: \"23c1276d-8d7b-4ffa-9290-3bd09756c660\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-kkfmw"
Sep 30 07:35:57 crc kubenswrapper[4760]: W0930 07:35:57.683057 4760 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd0f0f394_f395_4181_a0f9_afa9b7467013.slice/crio-a76059eb2141791659a33abd2e7d34c89dc2050cfdaddc79f88e737126252c3f WatchSource:0}: Error finding container a76059eb2141791659a33abd2e7d34c89dc2050cfdaddc79f88e737126252c3f: Status 404 returned error can't find the container with id a76059eb2141791659a33abd2e7d34c89dc2050cfdaddc79f88e737126252c3f
Sep 30 07:35:57 crc kubenswrapper[4760]: I0930 07:35:57.685209 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s4vvx\" (UniqueName: \"kubernetes.io/projected/db316304-e02f-43df-b4e2-6cdd6ee3b7eb-kube-api-access-s4vvx\") pod \"collect-profiles-29320290-w8n6p\" (UID: \"db316304-e02f-43df-b4e2-6cdd6ee3b7eb\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320290-w8n6p"
Sep 30 07:35:57 crc kubenswrapper[4760]: I0930 07:35:57.691878 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/6fddba3c-1ff7-42f4-99da-8f282c6095fc-bound-sa-token\") pod \"ingress-operator-5b745b69d9-zpzhj\" (UID: \"6fddba3c-1ff7-42f4-99da-8f282c6095fc\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-zpzhj"
Sep 30 07:35:57 crc kubenswrapper[4760]: I0930 07:35:57.709267 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/cd4422f1-0405-44b0-9256-fec03b6dc2f0-bound-sa-token\") pod \"image-registry-697d97f7c8-vgrsq\" (UID: \"cd4422f1-0405-44b0-9256-fec03b6dc2f0\") " pod="openshift-image-registry/image-registry-697d97f7c8-vgrsq"
Sep 30 07:35:57 crc kubenswrapper[4760]: I0930 07:35:57.730721 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sq72b\" (UniqueName: \"kubernetes.io/projected/d2195fee-f00a-408c-a44d-c74b59078ad7-kube-api-access-sq72b\") pod \"ingress-canary-82mws\" (UID: \"d2195fee-f00a-408c-a44d-c74b59078ad7\") " pod="openshift-ingress-canary/ingress-canary-82mws"
Sep 30 07:35:57 crc kubenswrapper[4760]: I0930 07:35:57.753244 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-7278d"
Sep 30 07:35:57 crc kubenswrapper[4760]: I0930 07:35:57.754295 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vgrsq\" (UID: \"cd4422f1-0405-44b0-9256-fec03b6dc2f0\") " pod="openshift-image-registry/image-registry-697d97f7c8-vgrsq"
Sep 30 07:35:57 crc kubenswrapper[4760]: E0930 07:35:57.754625 4760 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 07:35:58.254613771 +0000 UTC m=+143.897520183 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-vgrsq" (UID: "cd4422f1-0405-44b0-9256-fec03b6dc2f0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Sep 30 07:35:57 crc kubenswrapper[4760]: I0930 07:35:57.758097 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5gpn2\" (UniqueName: \"kubernetes.io/projected/9873ee29-2db0-462b-8c6c-6efc009193fa-kube-api-access-5gpn2\") pod \"package-server-manager-789f6589d5-9gnwv\" (UID: \"9873ee29-2db0-462b-8c6c-6efc009193fa\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-9gnwv"
Sep 30 07:35:57 crc kubenswrapper[4760]: I0930 07:35:57.760450 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-qgs9v"
Sep 30 07:35:57 crc kubenswrapper[4760]: I0930 07:35:57.771626 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-ptzmt"
Sep 30 07:35:57 crc kubenswrapper[4760]: I0930 07:35:57.782926 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k8qh7\" (UniqueName: \"kubernetes.io/projected/85616cb5-98c5-4296-858a-462f8ca42702-kube-api-access-k8qh7\") pod \"multus-admission-controller-857f4d67dd-n5tk5\" (UID: \"85616cb5-98c5-4296-858a-462f8ca42702\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-n5tk5"
Sep 30 07:35:57 crc kubenswrapper[4760]: I0930 07:35:57.786594 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rqlvh\" (UniqueName: \"kubernetes.io/projected/391a89e2-2467-4f7e-aa8d-d4c939845a67-kube-api-access-rqlvh\") pod \"machine-config-operator-74547568cd-hgcbk\" (UID: \"391a89e2-2467-4f7e-aa8d-d4c939845a67\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-hgcbk"
Sep 30 07:35:57 crc kubenswrapper[4760]: I0930 07:35:57.800858 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-mncn2"]
Sep 30 07:35:57 crc kubenswrapper[4760]: I0930 07:35:57.802889 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-pznb4"]
Sep 30 07:35:57 crc kubenswrapper[4760]: I0930 07:35:57.806040 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-7954f5f757-jx7lt"
Sep 30 07:35:57 crc kubenswrapper[4760]: I0930 07:35:57.807058 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5r4xr\" (UniqueName: \"kubernetes.io/projected/cd4422f1-0405-44b0-9256-fec03b6dc2f0-kube-api-access-5r4xr\") pod \"image-registry-697d97f7c8-vgrsq\" (UID: \"cd4422f1-0405-44b0-9256-fec03b6dc2f0\") " pod="openshift-image-registry/image-registry-697d97f7c8-vgrsq"
Sep 30 07:35:57 crc kubenswrapper[4760]: I0930 07:35:57.811540 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-kkfmw"
Sep 30 07:35:57 crc kubenswrapper[4760]: I0930 07:35:57.820636 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-zwnlf"
Sep 30 07:35:57 crc kubenswrapper[4760]: I0930 07:35:57.825059 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-zpzhj"
Sep 30 07:35:57 crc kubenswrapper[4760]: I0930 07:35:57.831910 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-n5tk5"
Sep 30 07:35:57 crc kubenswrapper[4760]: W0930 07:35:57.840580 4760 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod85f89a8e_5f37_458f_9896_fe3940cc68b6.slice/crio-0004d30d2aec27f811358ab23769c0155ef31ae4cfcb0205de7cac265b296ff2 WatchSource:0}: Error finding container 0004d30d2aec27f811358ab23769c0155ef31ae4cfcb0205de7cac265b296ff2: Status 404 returned error can't find the container with id 0004d30d2aec27f811358ab23769c0155ef31ae4cfcb0205de7cac265b296ff2
Sep 30 07:35:57 crc kubenswrapper[4760]: I0930 07:35:57.855613 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Sep 30 07:35:57 crc kubenswrapper[4760]: E0930 07:35:57.856599 4760 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 07:35:58.356560964 +0000 UTC m=+143.999467386 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Sep 30 07:35:57 crc kubenswrapper[4760]: I0930 07:35:57.859206 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vgrsq\" (UID: \"cd4422f1-0405-44b0-9256-fec03b6dc2f0\") " pod="openshift-image-registry/image-registry-697d97f7c8-vgrsq"
Sep 30 07:35:57 crc kubenswrapper[4760]: E0930 07:35:57.859751 4760 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 07:35:58.359732915 +0000 UTC m=+144.002639327 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-vgrsq" (UID: "cd4422f1-0405-44b0-9256-fec03b6dc2f0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Sep 30 07:35:57 crc kubenswrapper[4760]: I0930 07:35:57.862727 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-dmqhx"]
Sep 30 07:35:57 crc kubenswrapper[4760]: I0930 07:35:57.871612 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c6sg7\" (UniqueName: \"kubernetes.io/projected/84902f21-f09f-4c6b-b92f-1569d4aa2fcc-kube-api-access-c6sg7\") pod \"catalog-operator-68c6474976-pn92b\" (UID: \"84902f21-f09f-4c6b-b92f-1569d4aa2fcc\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-pn92b"
Sep 30 07:35:57 crc kubenswrapper[4760]: I0930 07:35:57.872785 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29320290-w8n6p"
Sep 30 07:35:57 crc kubenswrapper[4760]: I0930 07:35:57.877828 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-9gnwv"
Sep 30 07:35:57 crc kubenswrapper[4760]: I0930 07:35:57.889520 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p6k2w\" (UniqueName: \"kubernetes.io/projected/4f24c041-1e7c-4c07-8139-d8d47f9d3539-kube-api-access-p6k2w\") pod \"machine-config-server-wtpqg\" (UID: \"4f24c041-1e7c-4c07-8139-d8d47f9d3539\") " pod="openshift-machine-config-operator/machine-config-server-wtpqg"
Sep 30 07:35:57 crc kubenswrapper[4760]: I0930 07:35:57.903053 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-82mws"
Sep 30 07:35:57 crc kubenswrapper[4760]: I0930 07:35:57.904147 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vsr5t\" (UniqueName: \"kubernetes.io/projected/de49f8fe-30f8-44ae-beaa-fe61cf7b0a16-kube-api-access-vsr5t\") pod \"csi-hostpathplugin-zrlmg\" (UID: \"de49f8fe-30f8-44ae-beaa-fe61cf7b0a16\") " pod="hostpath-provisioner/csi-hostpathplugin-zrlmg"
Sep 30 07:35:57 crc kubenswrapper[4760]: I0930 07:35:57.928868 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-zrlmg"
Sep 30 07:35:57 crc kubenswrapper[4760]: I0930 07:35:57.938598 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-wtpqg"
Sep 30 07:35:57 crc kubenswrapper[4760]: I0930 07:35:57.961536 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Sep 30 07:35:57 crc kubenswrapper[4760]: E0930 07:35:57.961943 4760 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 07:35:58.461922855 +0000 UTC m=+144.104829267 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Sep 30 07:35:57 crc kubenswrapper[4760]: I0930 07:35:57.964904 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9lwx6\" (UniqueName: \"kubernetes.io/projected/86c4ea2a-21a5-46e4-bfd9-28d7a429b7ae-kube-api-access-9lwx6\") pod \"dns-default-76w2l\" (UID: \"86c4ea2a-21a5-46e4-bfd9-28d7a429b7ae\") " pod="openshift-dns/dns-default-76w2l"
Sep 30 07:35:58 crc kubenswrapper[4760]: I0930 07:35:58.039228 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-s7mth"]
Sep 30 07:35:58 crc kubenswrapper[4760]: I0930 07:35:58.053295 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-tlrrq" event={"ID":"a96a9516-5f80-4391-a1f2-f4b7531e65fa","Type":"ContainerStarted","Data":"a19b4e0416036ca48acbe2a6488435f85a42c3c5fe0850e3a3741162bdfa8310"}
Sep 30 07:35:58 crc kubenswrapper[4760]: I0930 07:35:58.063387 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vgrsq\" (UID: \"cd4422f1-0405-44b0-9256-fec03b6dc2f0\") " pod="openshift-image-registry/image-registry-697d97f7c8-vgrsq"
Sep 30 07:35:58 crc kubenswrapper[4760]: E0930 07:35:58.063732 4760 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 07:35:58.563721073 +0000 UTC m=+144.206627475 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-vgrsq" (UID: "cd4422f1-0405-44b0-9256-fec03b6dc2f0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Sep 30 07:35:58 crc kubenswrapper[4760]: I0930 07:35:58.069886 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-qc22x" event={"ID":"7c05864f-63f6-4fdf-9207-0d63dd89fc49","Type":"ContainerStarted","Data":"11282a731b2d92e12331c441b19f0c900f0d81b4148ba79b8eb94c438a612600"}
Sep 30 07:35:58 crc kubenswrapper[4760]: I0930 07:35:58.071249 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-vjd5w" event={"ID":"eb4bc73c-b474-4efa-b348-63ff26045c24","Type":"ContainerStarted","Data":"23087104eeb4692e30df205e58ee45c8a7ec7542c73c724b01d524da83729dab"}
Sep 30 07:35:58 crc kubenswrapper[4760]: I0930 07:35:58.078375 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-dmqhx" event={"ID":"09aeaa85-1a1d-426f-b7f2-611f67942f2c","Type":"ContainerStarted","Data":"d307a616a72c18a9fbfa07bbc231fa0db56778f0c8ad62530992e63734d59077"}
Sep 30 07:35:58 crc kubenswrapper[4760]: I0930 07:35:58.082088 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-hgcbk"
Sep 30 07:35:58 crc kubenswrapper[4760]: I0930 07:35:58.082144 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-gsddw" event={"ID":"cc4f6325-5a2a-4f15-8ee9-860617a9d7ce","Type":"ContainerStarted","Data":"52bb5880b1ba1e2a699223268f5ecb0f555d929cf119d904d6704135df89b0ae"}
Sep 30 07:35:58 crc kubenswrapper[4760]: I0930 07:35:58.082171 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-gsddw" event={"ID":"cc4f6325-5a2a-4f15-8ee9-860617a9d7ce","Type":"ContainerStarted","Data":"5d2e908fa0e01a3196cb3916e635d6049df1ba9e495bb303a23317e343e83e63"}
Sep 30 07:35:58 crc kubenswrapper[4760]: I0930 07:35:58.083055 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-gsddw"
Sep 30 07:35:58 crc kubenswrapper[4760]: I0930 07:35:58.089635 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-pn92b"
Sep 30 07:35:58 crc kubenswrapper[4760]: I0930 07:35:58.097390 4760 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-gsddw container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.18:8080/healthz\": dial tcp 10.217.0.18:8080: connect: connection refused" start-of-body=
Sep 30 07:35:58 crc kubenswrapper[4760]: I0930 07:35:58.097479 4760 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-gsddw" podUID="cc4f6325-5a2a-4f15-8ee9-860617a9d7ce" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.18:8080/healthz\": dial tcp 10.217.0.18:8080: connect: connection refused"
Sep 30 07:35:58 crc kubenswrapper[4760]: I0930 07:35:58.097718 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-5lscq"]
Sep 30 07:35:58 crc kubenswrapper[4760]: I0930 07:35:58.099930 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-2fn4r"]
Sep 30 07:35:58 crc kubenswrapper[4760]: I0930 07:35:58.099969 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-z9mdp"]
Sep 30 07:35:58 crc kubenswrapper[4760]: I0930 07:35:58.103662 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-mncn2" event={"ID":"85f89a8e-5f37-458f-9896-fe3940cc68b6","Type":"ContainerStarted","Data":"0004d30d2aec27f811358ab23769c0155ef31ae4cfcb0205de7cac265b296ff2"}
Sep 30 07:35:58 crc kubenswrapper[4760]: I0930 07:35:58.105664 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-p96nl"]
Sep 30 07:35:58 crc kubenswrapper[4760]: I0930 07:35:58.113499 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-pznb4" event={"ID":"d8e4c5c1-b6db-4baf-8bbe-9fc39e1ed6ff","Type":"ContainerStarted","Data":"761cd9da92c2452b54b40fe0c4f99a8219b01e611a395a592d4cf34429b87375"}
Sep 30 07:35:58 crc kubenswrapper[4760]: I0930 07:35:58.115959 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-bhzlk" event={"ID":"179c4bc2-b28d-445b-98f2-aa307d57cd9f","Type":"ContainerStarted","Data":"0126c71425acbc760813319421fb58323df1330d624b82d81ead54f891ba2bad"}
Sep 30 07:35:58 crc kubenswrapper[4760]: I0930 07:35:58.116001 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-bhzlk" event={"ID":"179c4bc2-b28d-445b-98f2-aa307d57cd9f","Type":"ContainerStarted","Data":"17c8f777049bd05389bf1f2c8d0a8643feba035af3a53e21ba58c8ddd4ab9bce"}
Sep 30 07:35:58 crc kubenswrapper[4760]: I0930 07:35:58.117407 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-75sr2" event={"ID":"d0f0f394-f395-4181-a0f9-afa9b7467013","Type":"ContainerStarted","Data":"a76059eb2141791659a33abd2e7d34c89dc2050cfdaddc79f88e737126252c3f"}
Sep 30 07:35:58 crc kubenswrapper[4760]: I0930 07:35:58.134630 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-879f6c89f-6cvdd"
Sep 30 07:35:58 crc kubenswrapper[4760]: I0930 07:35:58.136482 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-mqkwn"]
Sep 30 07:35:58 crc kubenswrapper[4760]: I0930 07:35:58.172212 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Sep 30 07:35:58 crc kubenswrapper[4760]: E0930 07:35:58.173284 4760 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 07:35:58.673261064 +0000 UTC m=+144.316167476 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Sep 30 07:35:58 crc kubenswrapper[4760]: I0930 07:35:58.201910 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-cfnkh"]
Sep 30 07:35:58 crc kubenswrapper[4760]: I0930 07:35:58.232248 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-76w2l"
Sep 30 07:35:58 crc kubenswrapper[4760]: I0930 07:35:58.259404 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-ntgr2"]
Sep 30 07:35:58 crc kubenswrapper[4760]: I0930 07:35:58.274118 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vgrsq\" (UID: \"cd4422f1-0405-44b0-9256-fec03b6dc2f0\") " pod="openshift-image-registry/image-registry-697d97f7c8-vgrsq"
Sep 30 07:35:58 crc kubenswrapper[4760]: E0930 07:35:58.282196 4760 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 07:35:58.782182747 +0000 UTC m=+144.425089159 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-vgrsq" (UID: "cd4422f1-0405-44b0-9256-fec03b6dc2f0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Sep 30 07:35:58 crc kubenswrapper[4760]: I0930 07:35:58.298659 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-fwcqc"]
Sep 30 07:35:58 crc kubenswrapper[4760]: I0930 07:35:58.330919 4760 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-879f6c89f-6cvdd" podStartSLOduration=123.330899664 podStartE2EDuration="2m3.330899664s" podCreationTimestamp="2025-09-30 07:33:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 07:35:58.329194295 +0000 UTC m=+143.972100707" watchObservedRunningTime="2025-09-30 07:35:58.330899664 +0000 UTC m=+143.973806076"
Sep 30 07:35:58 crc kubenswrapper[4760]: W0930 07:35:58.341214 4760 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod63503165_1bff_42d3_99f4_2af2d7f490ec.slice/crio-403e7b7857be9c870c1f6cb8d2f8b972047c67ac082684daa77701270a61704c WatchSource:0}: Error finding container 403e7b7857be9c870c1f6cb8d2f8b972047c67ac082684daa77701270a61704c: Status 404 returned error can't find the container with id 403e7b7857be9c870c1f6cb8d2f8b972047c67ac082684daa77701270a61704c
Sep 30 07:35:58 crc kubenswrapper[4760]: I0930 07:35:58.397471 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName:
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 07:35:58 crc kubenswrapper[4760]: E0930 07:35:58.397835 4760 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 07:35:58.897792802 +0000 UTC m=+144.540699214 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 07:35:58 crc kubenswrapper[4760]: I0930 07:35:58.398218 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vgrsq\" (UID: \"cd4422f1-0405-44b0-9256-fec03b6dc2f0\") " pod="openshift-image-registry/image-registry-697d97f7c8-vgrsq" Sep 30 07:35:58 crc kubenswrapper[4760]: E0930 07:35:58.399733 4760 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 07:35:58.899718247 +0000 UTC m=+144.542624659 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-vgrsq" (UID: "cd4422f1-0405-44b0-9256-fec03b6dc2f0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 07:35:58 crc kubenswrapper[4760]: W0930 07:35:58.404090 4760 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4c51dda7_332e_497f_96ed_932d5349ee59.slice/crio-e6025642f12e93e1bea28b9136aacd7db061b955aa7d80217ef2f57803e4e943 WatchSource:0}: Error finding container e6025642f12e93e1bea28b9136aacd7db061b955aa7d80217ef2f57803e4e943: Status 404 returned error can't find the container with id e6025642f12e93e1bea28b9136aacd7db061b955aa7d80217ef2f57803e4e943 Sep 30 07:35:58 crc kubenswrapper[4760]: I0930 07:35:58.415510 4760 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-gsddw" podStartSLOduration=123.41548769 podStartE2EDuration="2m3.41548769s" podCreationTimestamp="2025-09-30 07:33:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 07:35:58.408090037 +0000 UTC m=+144.050996449" watchObservedRunningTime="2025-09-30 07:35:58.41548769 +0000 UTC m=+144.058394102" Sep 30 07:35:58 crc kubenswrapper[4760]: I0930 07:35:58.480740 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-ptzmt"] Sep 30 07:35:58 crc kubenswrapper[4760]: I0930 07:35:58.502943 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 07:35:58 crc kubenswrapper[4760]: E0930 07:35:58.503696 4760 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 07:35:59.003674758 +0000 UTC m=+144.646581170 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 07:35:58 crc kubenswrapper[4760]: I0930 07:35:58.503822 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vgrsq\" (UID: \"cd4422f1-0405-44b0-9256-fec03b6dc2f0\") " pod="openshift-image-registry/image-registry-697d97f7c8-vgrsq" Sep 30 07:35:58 crc kubenswrapper[4760]: E0930 07:35:58.504355 4760 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 07:35:59.004337897 +0000 UTC m=+144.647244309 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-vgrsq" (UID: "cd4422f1-0405-44b0-9256-fec03b6dc2f0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 07:35:58 crc kubenswrapper[4760]: I0930 07:35:58.587188 4760 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication-operator/authentication-operator-69f744f599-q5spr" podStartSLOduration=124.587164942 podStartE2EDuration="2m4.587164942s" podCreationTimestamp="2025-09-30 07:33:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 07:35:58.586840663 +0000 UTC m=+144.229747075" watchObservedRunningTime="2025-09-30 07:35:58.587164942 +0000 UTC m=+144.230071364" Sep 30 07:35:58 crc kubenswrapper[4760]: I0930 07:35:58.607878 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 07:35:58 crc kubenswrapper[4760]: E0930 07:35:58.608246 4760 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 07:35:59.108228226 +0000 UTC m=+144.751134648 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 07:35:58 crc kubenswrapper[4760]: I0930 07:35:58.669317 4760 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-558db77b4-77ttc" podStartSLOduration=124.669284077 podStartE2EDuration="2m4.669284077s" podCreationTimestamp="2025-09-30 07:33:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 07:35:58.636874178 +0000 UTC m=+144.279780590" watchObservedRunningTime="2025-09-30 07:35:58.669284077 +0000 UTC m=+144.312190489" Sep 30 07:35:58 crc kubenswrapper[4760]: I0930 07:35:58.718023 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vgrsq\" (UID: \"cd4422f1-0405-44b0-9256-fec03b6dc2f0\") " pod="openshift-image-registry/image-registry-697d97f7c8-vgrsq" Sep 30 07:35:58 crc kubenswrapper[4760]: E0930 07:35:58.718353 4760 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 07:35:59.218342244 +0000 UTC m=+144.861248656 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-vgrsq" (UID: "cd4422f1-0405-44b0-9256-fec03b6dc2f0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 07:35:58 crc kubenswrapper[4760]: I0930 07:35:58.790642 4760 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver/apiserver-76f77b778f-bhzlk" podStartSLOduration=124.790608756 podStartE2EDuration="2m4.790608756s" podCreationTimestamp="2025-09-30 07:33:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 07:35:58.790571645 +0000 UTC m=+144.433478057" watchObservedRunningTime="2025-09-30 07:35:58.790608756 +0000 UTC m=+144.433515168" Sep 30 07:35:58 crc kubenswrapper[4760]: I0930 07:35:58.819070 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 07:35:58 crc kubenswrapper[4760]: E0930 07:35:58.819602 4760 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 07:35:59.319583527 +0000 UTC m=+144.962489929 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 07:35:58 crc kubenswrapper[4760]: W0930 07:35:58.825422 4760 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode57ef7b7_e3ef_4d7b_8ad0_2c5015fa2210.slice/crio-b60f39bde564914f811619ff960cf6f1c51e98dab07867b5fa63ab233932ec66 WatchSource:0}: Error finding container b60f39bde564914f811619ff960cf6f1c51e98dab07867b5fa63ab233932ec66: Status 404 returned error can't find the container with id b60f39bde564914f811619ff960cf6f1c51e98dab07867b5fa63ab233932ec66 Sep 30 07:35:58 crc kubenswrapper[4760]: I0930 07:35:58.922543 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vgrsq\" (UID: \"cd4422f1-0405-44b0-9256-fec03b6dc2f0\") " pod="openshift-image-registry/image-registry-697d97f7c8-vgrsq" Sep 30 07:35:58 crc kubenswrapper[4760]: E0930 07:35:58.922970 4760 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 07:35:59.42294514 +0000 UTC m=+145.065851552 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-vgrsq" (UID: "cd4422f1-0405-44b0-9256-fec03b6dc2f0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 07:35:58 crc kubenswrapper[4760]: I0930 07:35:58.987070 4760 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-ntj6x" podStartSLOduration=124.987054279 podStartE2EDuration="2m4.987054279s" podCreationTimestamp="2025-09-30 07:33:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 07:35:58.985798283 +0000 UTC m=+144.628704695" watchObservedRunningTime="2025-09-30 07:35:58.987054279 +0000 UTC m=+144.629960691" Sep 30 07:35:59 crc kubenswrapper[4760]: I0930 07:35:59.024490 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 07:35:59 crc kubenswrapper[4760]: E0930 07:35:59.024831 4760 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 07:35:59.524801351 +0000 UTC m=+145.167707763 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 07:35:59 crc kubenswrapper[4760]: I0930 07:35:59.025009 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vgrsq\" (UID: \"cd4422f1-0405-44b0-9256-fec03b6dc2f0\") " pod="openshift-image-registry/image-registry-697d97f7c8-vgrsq" Sep 30 07:35:59 crc kubenswrapper[4760]: E0930 07:35:59.025445 4760 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 07:35:59.525429549 +0000 UTC m=+145.168335951 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-vgrsq" (UID: "cd4422f1-0405-44b0-9256-fec03b6dc2f0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 07:35:59 crc kubenswrapper[4760]: I0930 07:35:59.086243 4760 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-qc22x" podStartSLOduration=124.086216382 podStartE2EDuration="2m4.086216382s" podCreationTimestamp="2025-09-30 07:33:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 07:35:59.084242385 +0000 UTC m=+144.727148797" watchObservedRunningTime="2025-09-30 07:35:59.086216382 +0000 UTC m=+144.729122794" Sep 30 07:35:59 crc kubenswrapper[4760]: I0930 07:35:59.127042 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 07:35:59 crc kubenswrapper[4760]: E0930 07:35:59.127419 4760 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 07:35:59.627405533 +0000 UTC m=+145.270311945 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 07:35:59 crc kubenswrapper[4760]: I0930 07:35:59.198784 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-mqkwn" event={"ID":"63503165-1bff-42d3-99f4-2af2d7f490ec","Type":"ContainerStarted","Data":"403e7b7857be9c870c1f6cb8d2f8b972047c67ac082684daa77701270a61704c"} Sep 30 07:35:59 crc kubenswrapper[4760]: I0930 07:35:59.230393 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vgrsq\" (UID: \"cd4422f1-0405-44b0-9256-fec03b6dc2f0\") " pod="openshift-image-registry/image-registry-697d97f7c8-vgrsq" Sep 30 07:35:59 crc kubenswrapper[4760]: E0930 07:35:59.230693 4760 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 07:35:59.730681844 +0000 UTC m=+145.373588256 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-vgrsq" (UID: "cd4422f1-0405-44b0-9256-fec03b6dc2f0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 07:35:59 crc kubenswrapper[4760]: I0930 07:35:59.231767 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-z9mdp" event={"ID":"ea1dde7c-e014-4b78-b8e9-0fc5a0e161ac","Type":"ContainerStarted","Data":"74c2fef913f2a8479c04e6665bdc4228f7fa465077934293982679927860466d"} Sep 30 07:35:59 crc kubenswrapper[4760]: I0930 07:35:59.231827 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-z9mdp" event={"ID":"ea1dde7c-e014-4b78-b8e9-0fc5a0e161ac","Type":"ContainerStarted","Data":"a9b3eb41131d7702a7a68c51bc3ef509305ba03be697e5ac09b093b271b0edd0"} Sep 30 07:35:59 crc kubenswrapper[4760]: I0930 07:35:59.327066 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-tlrrq" event={"ID":"a96a9516-5f80-4391-a1f2-f4b7531e65fa","Type":"ContainerStarted","Data":"58c1ab6c12aff1f44bc35a0ca73d1e5aa861c45d2b88d361da83082fe10f23d4"} Sep 30 07:35:59 crc kubenswrapper[4760]: E0930 07:35:59.331852 4760 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 07:35:59.831820535 +0000 UTC m=+145.474726937 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 07:35:59 crc kubenswrapper[4760]: I0930 07:35:59.340601 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-ntgr2" event={"ID":"08d362f3-5c04-45fe-9981-ada11b028f83","Type":"ContainerStarted","Data":"c70666f53dc49292dc221d6fc7071042cbe8f4b5eab3d617f6714de809f462ae"} Sep 30 07:35:59 crc kubenswrapper[4760]: I0930 07:35:59.358932 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-ptzmt" event={"ID":"e57ef7b7-e3ef-4d7b-8ad0-2c5015fa2210","Type":"ContainerStarted","Data":"b60f39bde564914f811619ff960cf6f1c51e98dab07867b5fa63ab233932ec66"} Sep 30 07:35:59 crc kubenswrapper[4760]: I0930 07:35:59.359630 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 07:35:59 crc kubenswrapper[4760]: I0930 07:35:59.360026 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vgrsq\" (UID: \"cd4422f1-0405-44b0-9256-fec03b6dc2f0\") " pod="openshift-image-registry/image-registry-697d97f7c8-vgrsq" Sep 30 07:35:59 crc kubenswrapper[4760]: E0930 
07:35:59.361318 4760 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 07:35:59.861280709 +0000 UTC m=+145.504187121 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-vgrsq" (UID: "cd4422f1-0405-44b0-9256-fec03b6dc2f0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 07:35:59 crc kubenswrapper[4760]: I0930 07:35:59.390430 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-75sr2" event={"ID":"d0f0f394-f395-4181-a0f9-afa9b7467013","Type":"ContainerStarted","Data":"113f7de711ce9dfdfdc66d414b95f2db585334f0cba51e17a5242a33bbda1407"} Sep 30 07:35:59 crc kubenswrapper[4760]: I0930 07:35:59.424751 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-kkfmw"] Sep 30 07:35:59 crc kubenswrapper[4760]: I0930 07:35:59.457630 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-pznb4" event={"ID":"d8e4c5c1-b6db-4baf-8bbe-9fc39e1ed6ff","Type":"ContainerStarted","Data":"d31ae7d79cce40090c248b749913376845c29ce770e14625851849024a991297"} Sep 30 07:35:59 crc kubenswrapper[4760]: I0930 07:35:59.461195 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 07:35:59 crc kubenswrapper[4760]: E0930 07:35:59.462878 4760 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 07:35:59.962860692 +0000 UTC m=+145.605767104 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 07:35:59 crc kubenswrapper[4760]: I0930 07:35:59.478726 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-fwcqc" event={"ID":"4c51dda7-332e-497f-96ed-932d5349ee59","Type":"ContainerStarted","Data":"e6025642f12e93e1bea28b9136aacd7db061b955aa7d80217ef2f57803e4e943"} Sep 30 07:35:59 crc kubenswrapper[4760]: I0930 07:35:59.481362 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-zpzhj"] Sep 30 07:35:59 crc kubenswrapper[4760]: I0930 07:35:59.498270 4760 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/machine-api-operator-5694c8668f-4lv9x" podStartSLOduration=124.498255807 podStartE2EDuration="2m4.498255807s" podCreationTimestamp="2025-09-30 07:33:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 
UTC" observedRunningTime="2025-09-30 07:35:59.496082115 +0000 UTC m=+145.138988537" watchObservedRunningTime="2025-09-30 07:35:59.498255807 +0000 UTC m=+145.141162219" Sep 30 07:35:59 crc kubenswrapper[4760]: I0930 07:35:59.498486 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-2fn4r" event={"ID":"7a196114-d286-4138-8ffc-baeb5ecc02df","Type":"ContainerStarted","Data":"46bbd5e3ec4bbdffd583a1a45c1486af1367d62ac542bd856c23f022bc41b161"} Sep 30 07:35:59 crc kubenswrapper[4760]: I0930 07:35:59.498806 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-2fn4r" Sep 30 07:35:59 crc kubenswrapper[4760]: I0930 07:35:59.508258 4760 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-2fn4r container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.29:5443/healthz\": dial tcp 10.217.0.29:5443: connect: connection refused" start-of-body= Sep 30 07:35:59 crc kubenswrapper[4760]: I0930 07:35:59.508317 4760 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-2fn4r" podUID="7a196114-d286-4138-8ffc-baeb5ecc02df" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.29:5443/healthz\": dial tcp 10.217.0.29:5443: connect: connection refused" Sep 30 07:35:59 crc kubenswrapper[4760]: I0930 07:35:59.531840 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-p96nl" event={"ID":"2611e16f-2c0b-44b4-929b-21f16b1b2e4d","Type":"ContainerStarted","Data":"c37dab05a1d360185191f97791ce236ef09a6ff4f075baff601404b1d2a169a8"} Sep 30 07:35:59 crc kubenswrapper[4760]: I0930 07:35:59.531902 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-config-operator/openshift-config-operator-7777fb866f-p96nl" event={"ID":"2611e16f-2c0b-44b4-929b-21f16b1b2e4d","Type":"ContainerStarted","Data":"e01b103911e01ccf50dab6257c80d0318fda686d00f50b2bb28d110a82b30883"} Sep 30 07:35:59 crc kubenswrapper[4760]: I0930 07:35:59.545606 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-vjd5w" event={"ID":"eb4bc73c-b474-4efa-b348-63ff26045c24","Type":"ContainerStarted","Data":"62b7af4f0282ba4efc2100bc3ad9619cfdc9675a91472e56c5deda7faccc2981"} Sep 30 07:35:59 crc kubenswrapper[4760]: I0930 07:35:59.547233 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-s7mth" event={"ID":"dd4856c0-fc17-49b2-b37b-b0414e6a2f48","Type":"ContainerStarted","Data":"a03655233f9d1a19e3d98b39e6ca7e1f2c0edd85c21a40cc776b58d2df5cd9d3"} Sep 30 07:35:59 crc kubenswrapper[4760]: I0930 07:35:59.547942 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-cfnkh" event={"ID":"86638afb-4930-4496-a00d-8f243be3ab33","Type":"ContainerStarted","Data":"8801e5ed9a40ad2b84fd39b654f039c2dbb7e144bf43539c1731d9d6546f5209"} Sep 30 07:35:59 crc kubenswrapper[4760]: I0930 07:35:59.556605 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-5lscq" event={"ID":"26bc4724-af08-4012-9656-d1cd06b533ef","Type":"ContainerStarted","Data":"28038db76ea07d16e1fcfe0b4650f5e7d8a01fd4a5bff13a0b499d516f1ab2da"} Sep 30 07:35:59 crc kubenswrapper[4760]: I0930 07:35:59.559152 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-wtpqg" event={"ID":"4f24c041-1e7c-4c07-8139-d8d47f9d3539","Type":"ContainerStarted","Data":"c77e03de8b3416b7d8cea3adaa200120866ceb358998f138051b5e521a883e29"} Sep 30 07:35:59 crc kubenswrapper[4760]: I0930 07:35:59.559475 4760 
patch_prober.go:28] interesting pod/marketplace-operator-79b997595-gsddw container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.18:8080/healthz\": dial tcp 10.217.0.18:8080: connect: connection refused" start-of-body= Sep 30 07:35:59 crc kubenswrapper[4760]: I0930 07:35:59.559509 4760 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-gsddw" podUID="cc4f6325-5a2a-4f15-8ee9-860617a9d7ce" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.18:8080/healthz\": dial tcp 10.217.0.18:8080: connect: connection refused" Sep 30 07:35:59 crc kubenswrapper[4760]: I0930 07:35:59.562389 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vgrsq\" (UID: \"cd4422f1-0405-44b0-9256-fec03b6dc2f0\") " pod="openshift-image-registry/image-registry-697d97f7c8-vgrsq" Sep 30 07:35:59 crc kubenswrapper[4760]: E0930 07:35:59.564039 4760 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 07:36:00.064026523 +0000 UTC m=+145.706932935 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-vgrsq" (UID: "cd4422f1-0405-44b0-9256-fec03b6dc2f0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 07:35:59 crc kubenswrapper[4760]: I0930 07:35:59.572552 4760 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-m8vl7" podStartSLOduration=124.572522916 podStartE2EDuration="2m4.572522916s" podCreationTimestamp="2025-09-30 07:33:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 07:35:59.535555986 +0000 UTC m=+145.178462398" watchObservedRunningTime="2025-09-30 07:35:59.572522916 +0000 UTC m=+145.215429328" Sep 30 07:35:59 crc kubenswrapper[4760]: I0930 07:35:59.579573 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-hgcbk"] Sep 30 07:35:59 crc kubenswrapper[4760]: I0930 07:35:59.579873 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-qgs9v"] Sep 30 07:35:59 crc kubenswrapper[4760]: I0930 07:35:59.588839 4760 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-5444994796-vjd5w" Sep 30 07:35:59 crc kubenswrapper[4760]: I0930 07:35:59.595992 4760 patch_prober.go:28] interesting pod/router-default-5444994796-vjd5w container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Sep 30 07:35:59 crc 
kubenswrapper[4760]: [-]has-synced failed: reason withheld Sep 30 07:35:59 crc kubenswrapper[4760]: [+]process-running ok Sep 30 07:35:59 crc kubenswrapper[4760]: healthz check failed Sep 30 07:35:59 crc kubenswrapper[4760]: I0930 07:35:59.596033 4760 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-vjd5w" podUID="eb4bc73c-b474-4efa-b348-63ff26045c24" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Sep 30 07:35:59 crc kubenswrapper[4760]: W0930 07:35:59.660057 4760 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podac72e4c7_2db7_411a_8f4b_28687be463f3.slice/crio-f88e91a59837631e74958687fe12c7fce9bbfb7b801e553db44fd17f510e7c0e WatchSource:0}: Error finding container f88e91a59837631e74958687fe12c7fce9bbfb7b801e553db44fd17f510e7c0e: Status 404 returned error can't find the container with id f88e91a59837631e74958687fe12c7fce9bbfb7b801e553db44fd17f510e7c0e Sep 30 07:35:59 crc kubenswrapper[4760]: I0930 07:35:59.667531 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 07:35:59 crc kubenswrapper[4760]: E0930 07:35:59.670094 4760 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 07:36:00.170078014 +0000 UTC m=+145.812984426 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 07:35:59 crc kubenswrapper[4760]: I0930 07:35:59.671060 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vgrsq\" (UID: \"cd4422f1-0405-44b0-9256-fec03b6dc2f0\") " pod="openshift-image-registry/image-registry-697d97f7c8-vgrsq" Sep 30 07:35:59 crc kubenswrapper[4760]: E0930 07:35:59.675389 4760 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 07:36:00.175376406 +0000 UTC m=+145.818282818 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-vgrsq" (UID: "cd4422f1-0405-44b0-9256-fec03b6dc2f0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 07:35:59 crc kubenswrapper[4760]: I0930 07:35:59.741725 4760 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-lgt64" podStartSLOduration=125.741690977 podStartE2EDuration="2m5.741690977s" podCreationTimestamp="2025-09-30 07:33:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 07:35:59.736436846 +0000 UTC m=+145.379343258" watchObservedRunningTime="2025-09-30 07:35:59.741690977 +0000 UTC m=+145.384597389" Sep 30 07:35:59 crc kubenswrapper[4760]: I0930 07:35:59.773863 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 07:35:59 crc kubenswrapper[4760]: E0930 07:35:59.774016 4760 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 07:36:00.273986543 +0000 UTC m=+145.916892955 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 07:35:59 crc kubenswrapper[4760]: I0930 07:35:59.774236 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vgrsq\" (UID: \"cd4422f1-0405-44b0-9256-fec03b6dc2f0\") " pod="openshift-image-registry/image-registry-697d97f7c8-vgrsq" Sep 30 07:35:59 crc kubenswrapper[4760]: E0930 07:35:59.774602 4760 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 07:36:00.27458346 +0000 UTC m=+145.917489872 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-vgrsq" (UID: "cd4422f1-0405-44b0-9256-fec03b6dc2f0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 07:35:59 crc kubenswrapper[4760]: I0930 07:35:59.843504 4760 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-pznb4" podStartSLOduration=124.843483596 podStartE2EDuration="2m4.843483596s" podCreationTimestamp="2025-09-30 07:33:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 07:35:59.828390643 +0000 UTC m=+145.471297055" watchObservedRunningTime="2025-09-30 07:35:59.843483596 +0000 UTC m=+145.486389998" Sep 30 07:35:59 crc kubenswrapper[4760]: I0930 07:35:59.852362 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-zwnlf"] Sep 30 07:35:59 crc kubenswrapper[4760]: I0930 07:35:59.854502 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-76w2l"] Sep 30 07:35:59 crc kubenswrapper[4760]: I0930 07:35:59.884637 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 07:35:59 crc kubenswrapper[4760]: E0930 07:35:59.885160 4760 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 07:36:00.38514006 +0000 UTC m=+146.028046472 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 07:35:59 crc kubenswrapper[4760]: I0930 07:35:59.886706 4760 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-9c57cc56f-z9mdp" podStartSLOduration=124.886682625 podStartE2EDuration="2m4.886682625s" podCreationTimestamp="2025-09-30 07:33:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 07:35:59.861623156 +0000 UTC m=+145.504529568" watchObservedRunningTime="2025-09-30 07:35:59.886682625 +0000 UTC m=+145.529589097" Sep 30 07:35:59 crc kubenswrapper[4760]: I0930 07:35:59.890367 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-n5tk5"] Sep 30 07:35:59 crc kubenswrapper[4760]: I0930 07:35:59.915130 4760 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-2fn4r" podStartSLOduration=124.915075899 podStartE2EDuration="2m4.915075899s" podCreationTimestamp="2025-09-30 07:33:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 07:35:59.908350946 +0000 UTC 
m=+145.551257358" watchObservedRunningTime="2025-09-30 07:35:59.915075899 +0000 UTC m=+145.557982311" Sep 30 07:35:59 crc kubenswrapper[4760]: I0930 07:35:59.939005 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-7278d"] Sep 30 07:35:59 crc kubenswrapper[4760]: I0930 07:35:59.944282 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29320290-w8n6p"] Sep 30 07:35:59 crc kubenswrapper[4760]: I0930 07:35:59.945538 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-jx7lt"] Sep 30 07:35:59 crc kubenswrapper[4760]: I0930 07:35:59.954253 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-82mws"] Sep 30 07:35:59 crc kubenswrapper[4760]: W0930 07:35:59.985753 4760 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3e98f7cd_47e7_4344_abc2_845b404874a4.slice/crio-825780969a8f9d1f3c523753c6eaf0bbd75872c222b9fa591422a17cfe416908 WatchSource:0}: Error finding container 825780969a8f9d1f3c523753c6eaf0bbd75872c222b9fa591422a17cfe416908: Status 404 returned error can't find the container with id 825780969a8f9d1f3c523753c6eaf0bbd75872c222b9fa591422a17cfe416908 Sep 30 07:35:59 crc kubenswrapper[4760]: I0930 07:35:59.987057 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vgrsq\" (UID: \"cd4422f1-0405-44b0-9256-fec03b6dc2f0\") " pod="openshift-image-registry/image-registry-697d97f7c8-vgrsq" Sep 30 07:35:59 crc kubenswrapper[4760]: E0930 07:35:59.987403 4760 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 07:36:00.487393473 +0000 UTC m=+146.130299885 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-vgrsq" (UID: "cd4422f1-0405-44b0-9256-fec03b6dc2f0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 07:36:00 crc kubenswrapper[4760]: I0930 07:36:00.004702 4760 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-tlrrq" podStartSLOduration=125.004680728 podStartE2EDuration="2m5.004680728s" podCreationTimestamp="2025-09-30 07:33:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 07:35:59.960384298 +0000 UTC m=+145.603290730" watchObservedRunningTime="2025-09-30 07:36:00.004680728 +0000 UTC m=+145.647587140" Sep 30 07:36:00 crc kubenswrapper[4760]: I0930 07:36:00.033055 4760 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-5444994796-vjd5w" podStartSLOduration=125.033040441 podStartE2EDuration="2m5.033040441s" podCreationTimestamp="2025-09-30 07:33:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 07:36:00.031744774 +0000 UTC m=+145.674651186" watchObservedRunningTime="2025-09-30 07:36:00.033040441 +0000 UTC m=+145.675946853" Sep 30 07:36:00 crc kubenswrapper[4760]: I0930 07:36:00.033668 4760 pod_startup_latency_tracker.go:104] 
"Observed pod startup duration" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-75sr2" podStartSLOduration=125.033663519 podStartE2EDuration="2m5.033663519s" podCreationTimestamp="2025-09-30 07:33:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 07:35:59.999604203 +0000 UTC m=+145.642510615" watchObservedRunningTime="2025-09-30 07:36:00.033663519 +0000 UTC m=+145.676569931" Sep 30 07:36:00 crc kubenswrapper[4760]: I0930 07:36:00.050452 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-pn92b"] Sep 30 07:36:00 crc kubenswrapper[4760]: I0930 07:36:00.087569 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-zrlmg"] Sep 30 07:36:00 crc kubenswrapper[4760]: I0930 07:36:00.087617 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-9gnwv"] Sep 30 07:36:00 crc kubenswrapper[4760]: I0930 07:36:00.088023 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 07:36:00 crc kubenswrapper[4760]: E0930 07:36:00.088205 4760 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 07:36:00.588189483 +0000 UTC m=+146.231095895 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 07:36:00 crc kubenswrapper[4760]: I0930 07:36:00.088289 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vgrsq\" (UID: \"cd4422f1-0405-44b0-9256-fec03b6dc2f0\") " pod="openshift-image-registry/image-registry-697d97f7c8-vgrsq" Sep 30 07:36:00 crc kubenswrapper[4760]: E0930 07:36:00.088572 4760 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 07:36:00.588564113 +0000 UTC m=+146.231470535 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-vgrsq" (UID: "cd4422f1-0405-44b0-9256-fec03b6dc2f0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 07:36:00 crc kubenswrapper[4760]: W0930 07:36:00.148650 4760 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9873ee29_2db0_462b_8c6c_6efc009193fa.slice/crio-3fe466c889b42c1c5b35534ff0593bd17d2c17169056b7d6cf8edca5193d6ab0 WatchSource:0}: Error finding container 3fe466c889b42c1c5b35534ff0593bd17d2c17169056b7d6cf8edca5193d6ab0: Status 404 returned error can't find the container with id 3fe466c889b42c1c5b35534ff0593bd17d2c17169056b7d6cf8edca5193d6ab0 Sep 30 07:36:00 crc kubenswrapper[4760]: I0930 07:36:00.189213 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 07:36:00 crc kubenswrapper[4760]: E0930 07:36:00.190009 4760 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 07:36:00.689981292 +0000 UTC m=+146.332887704 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 07:36:00 crc kubenswrapper[4760]: W0930 07:36:00.201738 4760 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podde49f8fe_30f8_44ae_beaa_fe61cf7b0a16.slice/crio-e2828202c94e0cd9e05b52fd487b343b4babda14be6d236f99262ad384693992 WatchSource:0}: Error finding container e2828202c94e0cd9e05b52fd487b343b4babda14be6d236f99262ad384693992: Status 404 returned error can't find the container with id e2828202c94e0cd9e05b52fd487b343b4babda14be6d236f99262ad384693992 Sep 30 07:36:00 crc kubenswrapper[4760]: I0930 07:36:00.294808 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vgrsq\" (UID: \"cd4422f1-0405-44b0-9256-fec03b6dc2f0\") " pod="openshift-image-registry/image-registry-697d97f7c8-vgrsq" Sep 30 07:36:00 crc kubenswrapper[4760]: E0930 07:36:00.295200 4760 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 07:36:00.795189858 +0000 UTC m=+146.438096270 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-vgrsq" (UID: "cd4422f1-0405-44b0-9256-fec03b6dc2f0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 07:36:00 crc kubenswrapper[4760]: I0930 07:36:00.397012 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 07:36:00 crc kubenswrapper[4760]: E0930 07:36:00.397333 4760 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 07:36:00.897311367 +0000 UTC m=+146.540217779 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 07:36:00 crc kubenswrapper[4760]: I0930 07:36:00.397572 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vgrsq\" (UID: \"cd4422f1-0405-44b0-9256-fec03b6dc2f0\") " pod="openshift-image-registry/image-registry-697d97f7c8-vgrsq" Sep 30 07:36:00 crc kubenswrapper[4760]: E0930 07:36:00.397816 4760 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 07:36:00.897809121 +0000 UTC m=+146.540715523 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-vgrsq" (UID: "cd4422f1-0405-44b0-9256-fec03b6dc2f0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 07:36:00 crc kubenswrapper[4760]: I0930 07:36:00.499890 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 07:36:00 crc kubenswrapper[4760]: E0930 07:36:00.500042 4760 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 07:36:01.000006921 +0000 UTC m=+146.642913333 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 07:36:00 crc kubenswrapper[4760]: I0930 07:36:00.500610 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vgrsq\" (UID: \"cd4422f1-0405-44b0-9256-fec03b6dc2f0\") " pod="openshift-image-registry/image-registry-697d97f7c8-vgrsq" Sep 30 07:36:00 crc kubenswrapper[4760]: E0930 07:36:00.500902 4760 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 07:36:01.000890297 +0000 UTC m=+146.643796709 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-vgrsq" (UID: "cd4422f1-0405-44b0-9256-fec03b6dc2f0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 07:36:00 crc kubenswrapper[4760]: I0930 07:36:00.592700 4760 patch_prober.go:28] interesting pod/router-default-5444994796-vjd5w container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Sep 30 07:36:00 crc kubenswrapper[4760]: [-]has-synced failed: reason withheld Sep 30 07:36:00 crc kubenswrapper[4760]: [+]process-running ok Sep 30 07:36:00 crc kubenswrapper[4760]: healthz check failed Sep 30 07:36:00 crc kubenswrapper[4760]: I0930 07:36:00.592751 4760 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-vjd5w" podUID="eb4bc73c-b474-4efa-b348-63ff26045c24" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Sep 30 07:36:00 crc kubenswrapper[4760]: I0930 07:36:00.601541 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 07:36:00 crc kubenswrapper[4760]: E0930 07:36:00.601868 4760 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2025-09-30 07:36:01.101853582 +0000 UTC m=+146.744759994 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 07:36:00 crc kubenswrapper[4760]: I0930 07:36:00.603760 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-pn92b" event={"ID":"84902f21-f09f-4c6b-b92f-1569d4aa2fcc","Type":"ContainerStarted","Data":"1de7dc1f61612d0ccbb9ac67a82593b09daadea6bd291c8ac84e5641366a5bb4"} Sep 30 07:36:00 crc kubenswrapper[4760]: I0930 07:36:00.603803 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-pn92b" event={"ID":"84902f21-f09f-4c6b-b92f-1569d4aa2fcc","Type":"ContainerStarted","Data":"fbe86de53754d8e039265aade455ac67be7f1746fe90475baed1d941f9dd379a"} Sep 30 07:36:00 crc kubenswrapper[4760]: I0930 07:36:00.604223 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-pn92b" Sep 30 07:36:00 crc kubenswrapper[4760]: I0930 07:36:00.615412 4760 patch_prober.go:28] interesting pod/catalog-operator-68c6474976-pn92b container/catalog-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.21:8443/healthz\": dial tcp 10.217.0.21:8443: connect: connection refused" start-of-body= Sep 30 07:36:00 crc kubenswrapper[4760]: I0930 07:36:00.615484 4760 prober.go:107] "Probe failed" probeType="Readiness" 
pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-pn92b" podUID="84902f21-f09f-4c6b-b92f-1569d4aa2fcc" containerName="catalog-operator" probeResult="failure" output="Get \"https://10.217.0.21:8443/healthz\": dial tcp 10.217.0.21:8443: connect: connection refused" Sep 30 07:36:00 crc kubenswrapper[4760]: I0930 07:36:00.624077 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-7278d" event={"ID":"32814054-fdc5-432f-a901-43f758af1b44","Type":"ContainerStarted","Data":"686919f53d72f2ad2c72df33c23b2c48d7ebd16ba130a8589d3735852ccf9cf2"} Sep 30 07:36:00 crc kubenswrapper[4760]: I0930 07:36:00.626064 4760 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-pn92b" podStartSLOduration=125.626046865 podStartE2EDuration="2m5.626046865s" podCreationTimestamp="2025-09-30 07:33:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 07:36:00.624727978 +0000 UTC m=+146.267634390" watchObservedRunningTime="2025-09-30 07:36:00.626046865 +0000 UTC m=+146.268953267" Sep 30 07:36:00 crc kubenswrapper[4760]: I0930 07:36:00.626214 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29320290-w8n6p" event={"ID":"db316304-e02f-43df-b4e2-6cdd6ee3b7eb","Type":"ContainerStarted","Data":"a6a4df1c779996b80c57ea79d6eadfb720c63d9facc8b5e93311cdc717996acc"} Sep 30 07:36:00 crc kubenswrapper[4760]: I0930 07:36:00.627017 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29320290-w8n6p" event={"ID":"db316304-e02f-43df-b4e2-6cdd6ee3b7eb","Type":"ContainerStarted","Data":"ece71c94ecade083af1a35bc8d5ffc8421c6053f3a8debb97bde3a68724e8cad"} Sep 30 07:36:00 crc kubenswrapper[4760]: I0930 07:36:00.649695 4760 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-mncn2" event={"ID":"85f89a8e-5f37-458f-9896-fe3940cc68b6","Type":"ContainerStarted","Data":"aab4de0a85595f19a1e0878a16de5b3473a7e485f8ad042118e4a5fb2a500cf5"} Sep 30 07:36:00 crc kubenswrapper[4760]: I0930 07:36:00.656545 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-apiserver/apiserver-76f77b778f-bhzlk" Sep 30 07:36:00 crc kubenswrapper[4760]: I0930 07:36:00.658329 4760 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-apiserver/apiserver-76f77b778f-bhzlk" Sep 30 07:36:00 crc kubenswrapper[4760]: I0930 07:36:00.660066 4760 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29320290-w8n6p" podStartSLOduration=125.660056641 podStartE2EDuration="2m5.660056641s" podCreationTimestamp="2025-09-30 07:33:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 07:36:00.650771204 +0000 UTC m=+146.293677616" watchObservedRunningTime="2025-09-30 07:36:00.660056641 +0000 UTC m=+146.302963053" Sep 30 07:36:00 crc kubenswrapper[4760]: I0930 07:36:00.674446 4760 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-mncn2" podStartSLOduration=125.674427853 podStartE2EDuration="2m5.674427853s" podCreationTimestamp="2025-09-30 07:33:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 07:36:00.673582978 +0000 UTC m=+146.316489390" watchObservedRunningTime="2025-09-30 07:36:00.674427853 +0000 UTC m=+146.317334265" Sep 30 07:36:00 crc kubenswrapper[4760]: I0930 07:36:00.686404 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-machine-config-operator/machine-config-server-wtpqg" event={"ID":"4f24c041-1e7c-4c07-8139-d8d47f9d3539","Type":"ContainerStarted","Data":"9c9312d7c1be118d00cffa3d4d7893a4defa9114ccf91207c3c664334f7d1a5b"} Sep 30 07:36:00 crc kubenswrapper[4760]: I0930 07:36:00.714466 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vgrsq\" (UID: \"cd4422f1-0405-44b0-9256-fec03b6dc2f0\") " pod="openshift-image-registry/image-registry-697d97f7c8-vgrsq" Sep 30 07:36:00 crc kubenswrapper[4760]: E0930 07:36:00.716836 4760 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 07:36:01.216823408 +0000 UTC m=+146.859729820 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-vgrsq" (UID: "cd4422f1-0405-44b0-9256-fec03b6dc2f0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 07:36:00 crc kubenswrapper[4760]: I0930 07:36:00.726085 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-dmqhx" event={"ID":"09aeaa85-1a1d-426f-b7f2-611f67942f2c","Type":"ContainerStarted","Data":"b03358bb975d0e40c92850a3a8ae62cad409d4b4bb865283a72aecd75f0ce463"} Sep 30 07:36:00 crc kubenswrapper[4760]: I0930 07:36:00.736463 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-s7mth" event={"ID":"dd4856c0-fc17-49b2-b37b-b0414e6a2f48","Type":"ContainerStarted","Data":"df7c577cb1dea6fa9ae5819c130c4f490cd17f1f140b11a307c2c2e75376ce2a"} Sep 30 07:36:00 crc kubenswrapper[4760]: I0930 07:36:00.736493 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-s7mth" event={"ID":"dd4856c0-fc17-49b2-b37b-b0414e6a2f48","Type":"ContainerStarted","Data":"47b26f8c9b95d1c41cd767217250a8ee351c257fcc74e8ba80219e5c5c90ad51"} Sep 30 07:36:00 crc kubenswrapper[4760]: I0930 07:36:00.744958 4760 generic.go:334] "Generic (PLEG): container finished" podID="2611e16f-2c0b-44b4-929b-21f16b1b2e4d" containerID="c37dab05a1d360185191f97791ce236ef09a6ff4f075baff601404b1d2a169a8" exitCode=0 Sep 30 07:36:00 crc kubenswrapper[4760]: I0930 07:36:00.745007 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-p96nl" 
event={"ID":"2611e16f-2c0b-44b4-929b-21f16b1b2e4d","Type":"ContainerDied","Data":"c37dab05a1d360185191f97791ce236ef09a6ff4f075baff601404b1d2a169a8"} Sep 30 07:36:00 crc kubenswrapper[4760]: I0930 07:36:00.745024 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-p96nl" event={"ID":"2611e16f-2c0b-44b4-929b-21f16b1b2e4d","Type":"ContainerStarted","Data":"c58229efa6403de07682bfd3afff2b101eca41c75edc50be810094063b062d85"} Sep 30 07:36:00 crc kubenswrapper[4760]: I0930 07:36:00.745526 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-config-operator/openshift-config-operator-7777fb866f-p96nl" Sep 30 07:36:00 crc kubenswrapper[4760]: I0930 07:36:00.747235 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-76w2l" event={"ID":"86c4ea2a-21a5-46e4-bfd9-28d7a429b7ae","Type":"ContainerStarted","Data":"11c3767986353418345fee2833d9cfe244020aab127f8c81e887515c5f3a9b66"} Sep 30 07:36:00 crc kubenswrapper[4760]: I0930 07:36:00.754141 4760 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-qc22x" Sep 30 07:36:00 crc kubenswrapper[4760]: I0930 07:36:00.754182 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-qc22x" Sep 30 07:36:00 crc kubenswrapper[4760]: I0930 07:36:00.782233 4760 patch_prober.go:28] interesting pod/apiserver-76f77b778f-bhzlk container/openshift-apiserver namespace/openshift-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]ping ok Sep 30 07:36:00 crc kubenswrapper[4760]: [+]log ok Sep 30 07:36:00 crc kubenswrapper[4760]: [+]etcd ok Sep 30 07:36:00 crc kubenswrapper[4760]: [+]poststarthook/start-apiserver-admission-initializer ok Sep 30 07:36:00 crc kubenswrapper[4760]: [+]poststarthook/generic-apiserver-start-informers ok Sep 30 
07:36:00 crc kubenswrapper[4760]: [+]poststarthook/max-in-flight-filter ok Sep 30 07:36:00 crc kubenswrapper[4760]: [+]poststarthook/storage-object-count-tracker-hook ok Sep 30 07:36:00 crc kubenswrapper[4760]: [+]poststarthook/image.openshift.io-apiserver-caches ok Sep 30 07:36:00 crc kubenswrapper[4760]: [-]poststarthook/authorization.openshift.io-bootstrapclusterroles failed: reason withheld Sep 30 07:36:00 crc kubenswrapper[4760]: [-]poststarthook/authorization.openshift.io-ensurenodebootstrap-sa failed: reason withheld Sep 30 07:36:00 crc kubenswrapper[4760]: [+]poststarthook/project.openshift.io-projectcache ok Sep 30 07:36:00 crc kubenswrapper[4760]: [+]poststarthook/project.openshift.io-projectauthorizationcache ok Sep 30 07:36:00 crc kubenswrapper[4760]: [+]poststarthook/openshift.io-startinformers ok Sep 30 07:36:00 crc kubenswrapper[4760]: [+]poststarthook/openshift.io-restmapperupdater ok Sep 30 07:36:00 crc kubenswrapper[4760]: [+]poststarthook/quota.openshift.io-clusterquotamapping ok Sep 30 07:36:00 crc kubenswrapper[4760]: livez check failed Sep 30 07:36:00 crc kubenswrapper[4760]: I0930 07:36:00.782269 4760 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-apiserver/apiserver-76f77b778f-bhzlk" podUID="179c4bc2-b28d-445b-98f2-aa307d57cd9f" containerName="openshift-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 500" Sep 30 07:36:00 crc kubenswrapper[4760]: I0930 07:36:00.784173 4760 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-server-wtpqg" podStartSLOduration=6.784163829 podStartE2EDuration="6.784163829s" podCreationTimestamp="2025-09-30 07:35:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 07:36:00.782551803 +0000 UTC m=+146.425458215" watchObservedRunningTime="2025-09-30 07:36:00.784163829 +0000 UTC m=+146.427070241" Sep 30 
07:36:00 crc kubenswrapper[4760]: I0930 07:36:00.788620 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-mqkwn" event={"ID":"63503165-1bff-42d3-99f4-2af2d7f490ec","Type":"ContainerStarted","Data":"0b844a339d480e2ef4ec5e6693d83139903af45d4fb4ccbfd6377da0b84390e8"} Sep 30 07:36:00 crc kubenswrapper[4760]: I0930 07:36:00.789114 4760 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-qc22x" Sep 30 07:36:00 crc kubenswrapper[4760]: I0930 07:36:00.805885 4760 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-s7mth" podStartSLOduration=125.805873062 podStartE2EDuration="2m5.805873062s" podCreationTimestamp="2025-09-30 07:33:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 07:36:00.803903825 +0000 UTC m=+146.446810237" watchObservedRunningTime="2025-09-30 07:36:00.805873062 +0000 UTC m=+146.448779474" Sep 30 07:36:00 crc kubenswrapper[4760]: I0930 07:36:00.817582 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 07:36:00 crc kubenswrapper[4760]: E0930 07:36:00.819510 4760 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 07:36:01.319489912 +0000 UTC m=+146.962396324 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 07:36:00 crc kubenswrapper[4760]: I0930 07:36:00.836589 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-n5tk5" event={"ID":"85616cb5-98c5-4296-858a-462f8ca42702","Type":"ContainerStarted","Data":"39dfb8a21685dc1c56cceb1dcd85b271ac8bdc479fd689f768e70d83c09c5f09"} Sep 30 07:36:00 crc kubenswrapper[4760]: I0930 07:36:00.838594 4760 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-dmqhx" podStartSLOduration=125.83857982 podStartE2EDuration="2m5.83857982s" podCreationTimestamp="2025-09-30 07:33:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 07:36:00.837291853 +0000 UTC m=+146.480198265" watchObservedRunningTime="2025-09-30 07:36:00.83857982 +0000 UTC m=+146.481486232" Sep 30 07:36:00 crc kubenswrapper[4760]: I0930 07:36:00.861923 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-jx7lt" event={"ID":"3b428f41-cff0-421d-a763-987d15be26eb","Type":"ContainerStarted","Data":"8532edfeb745aceb5fb3eb0e50342a46cc716203c961bb6afcc3fa16c37e108c"} Sep 30 07:36:00 crc kubenswrapper[4760]: I0930 07:36:00.862737 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/downloads-7954f5f757-jx7lt" Sep 30 07:36:00 crc kubenswrapper[4760]: I0930 07:36:00.863673 4760 
patch_prober.go:28] interesting pod/downloads-7954f5f757-jx7lt container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.25:8080/\": dial tcp 10.217.0.25:8080: connect: connection refused" start-of-body= Sep 30 07:36:00 crc kubenswrapper[4760]: I0930 07:36:00.863769 4760 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-jx7lt" podUID="3b428f41-cff0-421d-a763-987d15be26eb" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.25:8080/\": dial tcp 10.217.0.25:8080: connect: connection refused" Sep 30 07:36:00 crc kubenswrapper[4760]: I0930 07:36:00.872546 4760 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-config-operator/openshift-config-operator-7777fb866f-p96nl" podStartSLOduration=125.872530353 podStartE2EDuration="2m5.872530353s" podCreationTimestamp="2025-09-30 07:33:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 07:36:00.87171429 +0000 UTC m=+146.514620702" watchObservedRunningTime="2025-09-30 07:36:00.872530353 +0000 UTC m=+146.515436765" Sep 30 07:36:00 crc kubenswrapper[4760]: I0930 07:36:00.888629 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-ptzmt" event={"ID":"e57ef7b7-e3ef-4d7b-8ad0-2c5015fa2210","Type":"ContainerStarted","Data":"83684727ec8be51078314a89112ea68142874b25e44fe3062dccd2fb0b421fb4"} Sep 30 07:36:00 crc kubenswrapper[4760]: I0930 07:36:00.919473 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vgrsq\" (UID: \"cd4422f1-0405-44b0-9256-fec03b6dc2f0\") " 
pod="openshift-image-registry/image-registry-697d97f7c8-vgrsq" Sep 30 07:36:00 crc kubenswrapper[4760]: E0930 07:36:00.920825 4760 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 07:36:01.420811228 +0000 UTC m=+147.063717640 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-vgrsq" (UID: "cd4422f1-0405-44b0-9256-fec03b6dc2f0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 07:36:00 crc kubenswrapper[4760]: I0930 07:36:00.933953 4760 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-7954f5f757-jx7lt" podStartSLOduration=125.933935964 podStartE2EDuration="2m5.933935964s" podCreationTimestamp="2025-09-30 07:33:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 07:36:00.906457066 +0000 UTC m=+146.549363478" watchObservedRunningTime="2025-09-30 07:36:00.933935964 +0000 UTC m=+146.576842376" Sep 30 07:36:00 crc kubenswrapper[4760]: I0930 07:36:00.948529 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-kkfmw" event={"ID":"23c1276d-8d7b-4ffa-9290-3bd09756c660","Type":"ContainerStarted","Data":"a1a39c2877a2a99792ce7794a599f4d2bb140f935f9d763216e6c9d30a979ce8"} Sep 30 07:36:00 crc kubenswrapper[4760]: I0930 07:36:00.948566 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-machine-config-operator/machine-config-controller-84d6567774-kkfmw" event={"ID":"23c1276d-8d7b-4ffa-9290-3bd09756c660","Type":"ContainerStarted","Data":"37ebe28d54a53b3383866b03e668ed5faf0bbadf16d9280f49befa5d9413ba5b"} Sep 30 07:36:00 crc kubenswrapper[4760]: I0930 07:36:00.986962 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-cfnkh" event={"ID":"86638afb-4930-4496-a00d-8f243be3ab33","Type":"ContainerStarted","Data":"11d0b11460a804bbfb77ae485ce7868249881d722f030fb89d1b1186fc528bfe"} Sep 30 07:36:00 crc kubenswrapper[4760]: I0930 07:36:00.987345 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-cfnkh" event={"ID":"86638afb-4930-4496-a00d-8f243be3ab33","Type":"ContainerStarted","Data":"ba2eec757ef752f87aceb0326334911a83a960e2fa3a43c4e9706afb1a8b6b69"} Sep 30 07:36:00 crc kubenswrapper[4760]: I0930 07:36:00.990727 4760 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-777779d784-mqkwn" podStartSLOduration=125.990712072 podStartE2EDuration="2m5.990712072s" podCreationTimestamp="2025-09-30 07:33:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 07:36:00.936287101 +0000 UTC m=+146.579193513" watchObservedRunningTime="2025-09-30 07:36:00.990712072 +0000 UTC m=+146.633618474" Sep 30 07:36:01 crc kubenswrapper[4760]: I0930 07:36:01.020915 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-zwnlf" event={"ID":"3e98f7cd-47e7-4344-abc2-845b404874a4","Type":"ContainerStarted","Data":"63ed4f34b6ae0e3e4aaed1904dd33fe20778421518fe61dda87aa72cead3f312"} Sep 30 07:36:01 crc kubenswrapper[4760]: I0930 07:36:01.020956 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-console-operator/console-operator-58897d9998-zwnlf" event={"ID":"3e98f7cd-47e7-4344-abc2-845b404874a4","Type":"ContainerStarted","Data":"825780969a8f9d1f3c523753c6eaf0bbd75872c222b9fa591422a17cfe416908"} Sep 30 07:36:01 crc kubenswrapper[4760]: I0930 07:36:01.021739 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console-operator/console-operator-58897d9998-zwnlf" Sep 30 07:36:01 crc kubenswrapper[4760]: I0930 07:36:01.022798 4760 patch_prober.go:28] interesting pod/console-operator-58897d9998-zwnlf container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.40:8443/readyz\": dial tcp 10.217.0.40:8443: connect: connection refused" start-of-body= Sep 30 07:36:01 crc kubenswrapper[4760]: I0930 07:36:01.022826 4760 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-zwnlf" podUID="3e98f7cd-47e7-4344-abc2-845b404874a4" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.40:8443/readyz\": dial tcp 10.217.0.40:8443: connect: connection refused" Sep 30 07:36:01 crc kubenswrapper[4760]: I0930 07:36:01.024994 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 07:36:01 crc kubenswrapper[4760]: E0930 07:36:01.025985 4760 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 07:36:01.525970213 +0000 UTC m=+147.168876625 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 07:36:01 crc kubenswrapper[4760]: I0930 07:36:01.055454 4760 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-kkfmw" podStartSLOduration=126.055437458 podStartE2EDuration="2m6.055437458s" podCreationTimestamp="2025-09-30 07:33:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 07:36:01.025674734 +0000 UTC m=+146.668581146" watchObservedRunningTime="2025-09-30 07:36:01.055437458 +0000 UTC m=+146.698343870" Sep 30 07:36:01 crc kubenswrapper[4760]: I0930 07:36:01.056230 4760 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd-operator/etcd-operator-b45778765-ptzmt" podStartSLOduration=126.05622614 podStartE2EDuration="2m6.05622614s" podCreationTimestamp="2025-09-30 07:33:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 07:36:01.053860493 +0000 UTC m=+146.696766905" watchObservedRunningTime="2025-09-30 07:36:01.05622614 +0000 UTC m=+146.699132552" Sep 30 07:36:01 crc kubenswrapper[4760]: I0930 07:36:01.059384 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-hgcbk" event={"ID":"391a89e2-2467-4f7e-aa8d-d4c939845a67","Type":"ContainerStarted","Data":"5cce05fe780e3b80746cff6334d318cbb2f538b639798f4e3f8161387fae8ad3"} Sep 
30 07:36:01 crc kubenswrapper[4760]: I0930 07:36:01.059426 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-hgcbk" event={"ID":"391a89e2-2467-4f7e-aa8d-d4c939845a67","Type":"ContainerStarted","Data":"104489e21a9a45ec615568d69377a84e1c59132698bfc3c2215dc7e88ff10cb3"} Sep 30 07:36:01 crc kubenswrapper[4760]: I0930 07:36:01.063894 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-zpzhj" event={"ID":"6fddba3c-1ff7-42f4-99da-8f282c6095fc","Type":"ContainerStarted","Data":"9893d2861387d19a86a072ca1112f54fd1e50de02a23d209d407eae9a4b3dc57"} Sep 30 07:36:01 crc kubenswrapper[4760]: I0930 07:36:01.063937 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-zpzhj" event={"ID":"6fddba3c-1ff7-42f4-99da-8f282c6095fc","Type":"ContainerStarted","Data":"f8bfe01923fe8497d18cb0db4543594f836098590bb499724f7a1d2f8dda7dee"} Sep 30 07:36:01 crc kubenswrapper[4760]: I0930 07:36:01.065127 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-zrlmg" event={"ID":"de49f8fe-30f8-44ae-beaa-fe61cf7b0a16","Type":"ContainerStarted","Data":"e2828202c94e0cd9e05b52fd487b343b4babda14be6d236f99262ad384693992"} Sep 30 07:36:01 crc kubenswrapper[4760]: I0930 07:36:01.090161 4760 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-cfnkh" podStartSLOduration=126.090145483 podStartE2EDuration="2m6.090145483s" podCreationTimestamp="2025-09-30 07:33:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 07:36:01.068648037 +0000 UTC m=+146.711554449" watchObservedRunningTime="2025-09-30 07:36:01.090145483 +0000 UTC m=+146.733051895" Sep 30 07:36:01 crc kubenswrapper[4760]: I0930 
07:36:01.090678 4760 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-zpzhj" podStartSLOduration=126.090673748 podStartE2EDuration="2m6.090673748s" podCreationTimestamp="2025-09-30 07:33:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 07:36:01.088734753 +0000 UTC m=+146.731641155" watchObservedRunningTime="2025-09-30 07:36:01.090673748 +0000 UTC m=+146.733580160" Sep 30 07:36:01 crc kubenswrapper[4760]: I0930 07:36:01.091933 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-5lscq" event={"ID":"26bc4724-af08-4012-9656-d1cd06b533ef","Type":"ContainerStarted","Data":"dfcd9b46a6e96e7d5eb3e9ed80dbba45643303d7d84fc6a72ad280448d8f93c8"} Sep 30 07:36:01 crc kubenswrapper[4760]: I0930 07:36:01.096600 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-5lscq" Sep 30 07:36:01 crc kubenswrapper[4760]: I0930 07:36:01.096840 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-9gnwv" event={"ID":"9873ee29-2db0-462b-8c6c-6efc009193fa","Type":"ContainerStarted","Data":"3fe466c889b42c1c5b35534ff0593bd17d2c17169056b7d6cf8edca5193d6ab0"} Sep 30 07:36:01 crc kubenswrapper[4760]: I0930 07:36:01.096921 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-9gnwv" Sep 30 07:36:01 crc kubenswrapper[4760]: I0930 07:36:01.104680 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-5lscq" Sep 30 07:36:01 crc kubenswrapper[4760]: I0930 07:36:01.105866 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-console/console-f9d7485db-ntgr2" event={"ID":"08d362f3-5c04-45fe-9981-ada11b028f83","Type":"ContainerStarted","Data":"e449778f7af6943855cd2606785aa2721d60c5eee2bb35b3fc070fa9165cc05c"} Sep 30 07:36:01 crc kubenswrapper[4760]: I0930 07:36:01.137598 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-qgs9v" event={"ID":"ac72e4c7-2db7-411a-8f4b-28687be463f3","Type":"ContainerStarted","Data":"c52172262f4c646bb0b66b9ea73a19ff3882a7a8d38c0dcb129d6ed916c8083f"} Sep 30 07:36:01 crc kubenswrapper[4760]: I0930 07:36:01.138640 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-qgs9v" event={"ID":"ac72e4c7-2db7-411a-8f4b-28687be463f3","Type":"ContainerStarted","Data":"f88e91a59837631e74958687fe12c7fce9bbfb7b801e553db44fd17f510e7c0e"} Sep 30 07:36:01 crc kubenswrapper[4760]: I0930 07:36:01.139084 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vgrsq\" (UID: \"cd4422f1-0405-44b0-9256-fec03b6dc2f0\") " pod="openshift-image-registry/image-registry-697d97f7c8-vgrsq" Sep 30 07:36:01 crc kubenswrapper[4760]: E0930 07:36:01.141099 4760 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 07:36:01.641088394 +0000 UTC m=+147.283994806 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-vgrsq" (UID: "cd4422f1-0405-44b0-9256-fec03b6dc2f0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 07:36:01 crc kubenswrapper[4760]: I0930 07:36:01.163217 4760 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-hgcbk" podStartSLOduration=126.163199908 podStartE2EDuration="2m6.163199908s" podCreationTimestamp="2025-09-30 07:33:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 07:36:01.162743985 +0000 UTC m=+146.805650407" watchObservedRunningTime="2025-09-30 07:36:01.163199908 +0000 UTC m=+146.806106320" Sep 30 07:36:01 crc kubenswrapper[4760]: I0930 07:36:01.166070 4760 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-58897d9998-zwnlf" podStartSLOduration=126.1660586 podStartE2EDuration="2m6.1660586s" podCreationTimestamp="2025-09-30 07:33:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 07:36:01.13853031 +0000 UTC m=+146.781436722" watchObservedRunningTime="2025-09-30 07:36:01.1660586 +0000 UTC m=+146.808965012" Sep 30 07:36:01 crc kubenswrapper[4760]: I0930 07:36:01.178638 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-82mws" event={"ID":"d2195fee-f00a-408c-a44d-c74b59078ad7","Type":"ContainerStarted","Data":"2356e53149a309f2273d935acaf5d390c8ffe5d7c38070332f53fba2c577e886"} Sep 30 07:36:01 crc 
kubenswrapper[4760]: I0930 07:36:01.181842 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-2fn4r" event={"ID":"7a196114-d286-4138-8ffc-baeb5ecc02df","Type":"ContainerStarted","Data":"2b38ed3146bf6a8374429bcf32f4d84e3cbf92391084b5394a98be33f0e11c1f"} Sep 30 07:36:01 crc kubenswrapper[4760]: I0930 07:36:01.190753 4760 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-9gnwv" podStartSLOduration=126.190738387 podStartE2EDuration="2m6.190738387s" podCreationTimestamp="2025-09-30 07:33:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 07:36:01.190715667 +0000 UTC m=+146.833622079" watchObservedRunningTime="2025-09-30 07:36:01.190738387 +0000 UTC m=+146.833644799" Sep 30 07:36:01 crc kubenswrapper[4760]: I0930 07:36:01.198109 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-fwcqc" event={"ID":"4c51dda7-332e-497f-96ed-932d5349ee59","Type":"ContainerStarted","Data":"5fbe4895f819261d4a837320191008ea4a817587914caf662b5a572bf04ca571"} Sep 30 07:36:01 crc kubenswrapper[4760]: I0930 07:36:01.208750 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-gsddw" Sep 30 07:36:01 crc kubenswrapper[4760]: I0930 07:36:01.215677 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-qc22x" Sep 30 07:36:01 crc kubenswrapper[4760]: I0930 07:36:01.239063 4760 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-qgs9v" podStartSLOduration=126.239049063 podStartE2EDuration="2m6.239049063s" 
podCreationTimestamp="2025-09-30 07:33:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 07:36:01.217586797 +0000 UTC m=+146.860493209" watchObservedRunningTime="2025-09-30 07:36:01.239049063 +0000 UTC m=+146.881955475" Sep 30 07:36:01 crc kubenswrapper[4760]: I0930 07:36:01.239556 4760 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-5lscq" podStartSLOduration=126.239552737 podStartE2EDuration="2m6.239552737s" podCreationTimestamp="2025-09-30 07:33:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 07:36:01.238832877 +0000 UTC m=+146.881739299" watchObservedRunningTime="2025-09-30 07:36:01.239552737 +0000 UTC m=+146.882459149" Sep 30 07:36:01 crc kubenswrapper[4760]: I0930 07:36:01.247042 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 07:36:01 crc kubenswrapper[4760]: E0930 07:36:01.247478 4760 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 07:36:01.747455544 +0000 UTC m=+147.390361956 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 07:36:01 crc kubenswrapper[4760]: I0930 07:36:01.249611 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vgrsq\" (UID: \"cd4422f1-0405-44b0-9256-fec03b6dc2f0\") " pod="openshift-image-registry/image-registry-697d97f7c8-vgrsq" Sep 30 07:36:01 crc kubenswrapper[4760]: E0930 07:36:01.251467 4760 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 07:36:01.751443728 +0000 UTC m=+147.394350140 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-vgrsq" (UID: "cd4422f1-0405-44b0-9256-fec03b6dc2f0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 07:36:01 crc kubenswrapper[4760]: I0930 07:36:01.321136 4760 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-fwcqc" podStartSLOduration=126.321121246 podStartE2EDuration="2m6.321121246s" podCreationTimestamp="2025-09-30 07:33:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 07:36:01.319951402 +0000 UTC m=+146.962857814" watchObservedRunningTime="2025-09-30 07:36:01.321121246 +0000 UTC m=+146.964027658" Sep 30 07:36:01 crc kubenswrapper[4760]: I0930 07:36:01.324495 4760 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-f9d7485db-ntgr2" podStartSLOduration=126.324464162 podStartE2EDuration="2m6.324464162s" podCreationTimestamp="2025-09-30 07:33:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 07:36:01.277803974 +0000 UTC m=+146.920710386" watchObservedRunningTime="2025-09-30 07:36:01.324464162 +0000 UTC m=+146.967370574" Sep 30 07:36:01 crc kubenswrapper[4760]: I0930 07:36:01.351527 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: 
\"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 07:36:01 crc kubenswrapper[4760]: E0930 07:36:01.357799 4760 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 07:36:01.857775777 +0000 UTC m=+147.500682189 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 07:36:01 crc kubenswrapper[4760]: I0930 07:36:01.358605 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vgrsq\" (UID: \"cd4422f1-0405-44b0-9256-fec03b6dc2f0\") " pod="openshift-image-registry/image-registry-697d97f7c8-vgrsq" Sep 30 07:36:01 crc kubenswrapper[4760]: E0930 07:36:01.359388 4760 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 07:36:01.859380393 +0000 UTC m=+147.502286805 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-vgrsq" (UID: "cd4422f1-0405-44b0-9256-fec03b6dc2f0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 07:36:01 crc kubenswrapper[4760]: I0930 07:36:01.460001 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 07:36:01 crc kubenswrapper[4760]: E0930 07:36:01.460390 4760 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 07:36:01.960376339 +0000 UTC m=+147.603282751 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 07:36:01 crc kubenswrapper[4760]: I0930 07:36:01.495338 4760 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-82mws" podStartSLOduration=7.495320871 podStartE2EDuration="7.495320871s" podCreationTimestamp="2025-09-30 07:35:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 07:36:01.409694746 +0000 UTC m=+147.052601158" watchObservedRunningTime="2025-09-30 07:36:01.495320871 +0000 UTC m=+147.138227283" Sep 30 07:36:01 crc kubenswrapper[4760]: I0930 07:36:01.561772 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vgrsq\" (UID: \"cd4422f1-0405-44b0-9256-fec03b6dc2f0\") " pod="openshift-image-registry/image-registry-697d97f7c8-vgrsq" Sep 30 07:36:01 crc kubenswrapper[4760]: E0930 07:36:01.562132 4760 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 07:36:02.062117496 +0000 UTC m=+147.705023918 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-vgrsq" (UID: "cd4422f1-0405-44b0-9256-fec03b6dc2f0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 07:36:01 crc kubenswrapper[4760]: I0930 07:36:01.597221 4760 patch_prober.go:28] interesting pod/router-default-5444994796-vjd5w container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Sep 30 07:36:01 crc kubenswrapper[4760]: [-]has-synced failed: reason withheld Sep 30 07:36:01 crc kubenswrapper[4760]: [+]process-running ok Sep 30 07:36:01 crc kubenswrapper[4760]: healthz check failed Sep 30 07:36:01 crc kubenswrapper[4760]: I0930 07:36:01.597272 4760 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-vjd5w" podUID="eb4bc73c-b474-4efa-b348-63ff26045c24" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Sep 30 07:36:01 crc kubenswrapper[4760]: I0930 07:36:01.663757 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 07:36:01 crc kubenswrapper[4760]: E0930 07:36:01.664557 4760 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2025-09-30 07:36:02.164542882 +0000 UTC m=+147.807449294 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 07:36:01 crc kubenswrapper[4760]: I0930 07:36:01.765074 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vgrsq\" (UID: \"cd4422f1-0405-44b0-9256-fec03b6dc2f0\") " pod="openshift-image-registry/image-registry-697d97f7c8-vgrsq" Sep 30 07:36:01 crc kubenswrapper[4760]: E0930 07:36:01.765477 4760 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 07:36:02.265459476 +0000 UTC m=+147.908365888 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-vgrsq" (UID: "cd4422f1-0405-44b0-9256-fec03b6dc2f0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 07:36:01 crc kubenswrapper[4760]: I0930 07:36:01.866706 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 07:36:01 crc kubenswrapper[4760]: E0930 07:36:01.867029 4760 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 07:36:02.367014238 +0000 UTC m=+148.009920650 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 07:36:01 crc kubenswrapper[4760]: I0930 07:36:01.968220 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vgrsq\" (UID: \"cd4422f1-0405-44b0-9256-fec03b6dc2f0\") " pod="openshift-image-registry/image-registry-697d97f7c8-vgrsq" Sep 30 07:36:01 crc kubenswrapper[4760]: E0930 07:36:01.968925 4760 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 07:36:02.46891339 +0000 UTC m=+148.111819802 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-vgrsq" (UID: "cd4422f1-0405-44b0-9256-fec03b6dc2f0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 07:36:02 crc kubenswrapper[4760]: I0930 07:36:02.069786 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 07:36:02 crc kubenswrapper[4760]: E0930 07:36:02.070084 4760 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 07:36:02.57006936 +0000 UTC m=+148.212975772 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 07:36:02 crc kubenswrapper[4760]: I0930 07:36:02.168566 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-2fn4r" Sep 30 07:36:02 crc kubenswrapper[4760]: I0930 07:36:02.171160 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vgrsq\" (UID: \"cd4422f1-0405-44b0-9256-fec03b6dc2f0\") " pod="openshift-image-registry/image-registry-697d97f7c8-vgrsq" Sep 30 07:36:02 crc kubenswrapper[4760]: E0930 07:36:02.171586 4760 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 07:36:02.671565361 +0000 UTC m=+148.314471773 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-vgrsq" (UID: "cd4422f1-0405-44b0-9256-fec03b6dc2f0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 07:36:02 crc kubenswrapper[4760]: I0930 07:36:02.204106 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-zrlmg" event={"ID":"de49f8fe-30f8-44ae-beaa-fe61cf7b0a16","Type":"ContainerStarted","Data":"fb78f348cd73687677888b2fcaf1da6e221c21373e6136aa001bb3e48c872b7a"} Sep 30 07:36:02 crc kubenswrapper[4760]: I0930 07:36:02.220011 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-82mws" event={"ID":"d2195fee-f00a-408c-a44d-c74b59078ad7","Type":"ContainerStarted","Data":"dd9e4531198e7419009677d47bfc02d5a07265f1a4112ff6f22558a7bd481ae9"} Sep 30 07:36:02 crc kubenswrapper[4760]: I0930 07:36:02.225524 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-zpzhj" event={"ID":"6fddba3c-1ff7-42f4-99da-8f282c6095fc","Type":"ContainerStarted","Data":"f42066760d975594c26dcbceeacd77a3ad6642cd748b78ead94091d7216794fb"} Sep 30 07:36:02 crc kubenswrapper[4760]: I0930 07:36:02.228857 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-76w2l" event={"ID":"86c4ea2a-21a5-46e4-bfd9-28d7a429b7ae","Type":"ContainerStarted","Data":"0c8eaef194ab5afa37220d68a02d03cf38ebaeac555426b8514d9dbcd10911f9"} Sep 30 07:36:02 crc kubenswrapper[4760]: I0930 07:36:02.228882 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-76w2l" 
event={"ID":"86c4ea2a-21a5-46e4-bfd9-28d7a429b7ae","Type":"ContainerStarted","Data":"eec8f2d83c9c9ed74e95fa80e284881cdff07542a640b40430c7416152cd8bd5"} Sep 30 07:36:02 crc kubenswrapper[4760]: I0930 07:36:02.229425 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-dns/dns-default-76w2l" Sep 30 07:36:02 crc kubenswrapper[4760]: I0930 07:36:02.231708 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-9gnwv" event={"ID":"9873ee29-2db0-462b-8c6c-6efc009193fa","Type":"ContainerStarted","Data":"e4cb3ec41695628893dd1237ce0caa7b1d52d8edc89a2e630fa8a8474e82d957"} Sep 30 07:36:02 crc kubenswrapper[4760]: I0930 07:36:02.231733 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-9gnwv" event={"ID":"9873ee29-2db0-462b-8c6c-6efc009193fa","Type":"ContainerStarted","Data":"63267ec8f010f26f5626ce08035826892cd06c917f42c765dab6046ad65b04ff"} Sep 30 07:36:02 crc kubenswrapper[4760]: I0930 07:36:02.235938 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-kkfmw" event={"ID":"23c1276d-8d7b-4ffa-9290-3bd09756c660","Type":"ContainerStarted","Data":"db5cda0a7ec115a83557724eaf06baf6583ff58d79ed8fa0e134d8c3ab6217de"} Sep 30 07:36:02 crc kubenswrapper[4760]: I0930 07:36:02.241439 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-hgcbk" event={"ID":"391a89e2-2467-4f7e-aa8d-d4c939845a67","Type":"ContainerStarted","Data":"4fd60aa78277f8fbbe3cced5aec642c7c7cc95b6f87928f287ccdb66bb0672fb"} Sep 30 07:36:02 crc kubenswrapper[4760]: I0930 07:36:02.248199 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-n5tk5" 
event={"ID":"85616cb5-98c5-4296-858a-462f8ca42702","Type":"ContainerStarted","Data":"8ebc1c910081d2ad4e4977721fc286389a5e1db5884d4afb1d1e8536e7d6c353"} Sep 30 07:36:02 crc kubenswrapper[4760]: I0930 07:36:02.248234 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-n5tk5" event={"ID":"85616cb5-98c5-4296-858a-462f8ca42702","Type":"ContainerStarted","Data":"16b9dc04fb173c7bbc71111d46759b62c17487d3f2a7246ee744ec21607cd004"} Sep 30 07:36:02 crc kubenswrapper[4760]: I0930 07:36:02.253571 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-jx7lt" event={"ID":"3b428f41-cff0-421d-a763-987d15be26eb","Type":"ContainerStarted","Data":"d607e67fa027fd0dfe871e9d0905307e89b937537ca83da4b7c7f8387ec306c4"} Sep 30 07:36:02 crc kubenswrapper[4760]: I0930 07:36:02.254487 4760 patch_prober.go:28] interesting pod/downloads-7954f5f757-jx7lt container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.25:8080/\": dial tcp 10.217.0.25:8080: connect: connection refused" start-of-body= Sep 30 07:36:02 crc kubenswrapper[4760]: I0930 07:36:02.254519 4760 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-jx7lt" podUID="3b428f41-cff0-421d-a763-987d15be26eb" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.25:8080/\": dial tcp 10.217.0.25:8080: connect: connection refused" Sep 30 07:36:02 crc kubenswrapper[4760]: I0930 07:36:02.264525 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-7278d" event={"ID":"32814054-fdc5-432f-a901-43f758af1b44","Type":"ContainerStarted","Data":"b6029e8fb7fcb78d9f426f497c1079251dab20d8e71dc067ab1ab2b73e18c3ad"} Sep 30 07:36:02 crc kubenswrapper[4760]: I0930 07:36:02.264560 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-dns-operator/dns-operator-744455d44c-7278d" event={"ID":"32814054-fdc5-432f-a901-43f758af1b44","Type":"ContainerStarted","Data":"da3fa0355faab7a8a87d63ca2fb850f733ff1df1151f4b817b664100c36e1f94"} Sep 30 07:36:02 crc kubenswrapper[4760]: I0930 07:36:02.270715 4760 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-76w2l" podStartSLOduration=8.270700843 podStartE2EDuration="8.270700843s" podCreationTimestamp="2025-09-30 07:35:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 07:36:02.255829047 +0000 UTC m=+147.898735459" watchObservedRunningTime="2025-09-30 07:36:02.270700843 +0000 UTC m=+147.913607245" Sep 30 07:36:02 crc kubenswrapper[4760]: I0930 07:36:02.271179 4760 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-admission-controller-857f4d67dd-n5tk5" podStartSLOduration=127.271174817 podStartE2EDuration="2m7.271174817s" podCreationTimestamp="2025-09-30 07:33:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 07:36:02.26815713 +0000 UTC m=+147.911063542" watchObservedRunningTime="2025-09-30 07:36:02.271174817 +0000 UTC m=+147.914081229" Sep 30 07:36:02 crc kubenswrapper[4760]: I0930 07:36:02.271686 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 07:36:02 crc kubenswrapper[4760]: E0930 07:36:02.272128 4760 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 
podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 07:36:02.772114524 +0000 UTC m=+148.415020936 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 07:36:02 crc kubenswrapper[4760]: I0930 07:36:02.283512 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-pn92b" Sep 30 07:36:02 crc kubenswrapper[4760]: I0930 07:36:02.328861 4760 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns-operator/dns-operator-744455d44c-7278d" podStartSLOduration=127.328846961 podStartE2EDuration="2m7.328846961s" podCreationTimestamp="2025-09-30 07:33:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 07:36:02.298708316 +0000 UTC m=+147.941614728" watchObservedRunningTime="2025-09-30 07:36:02.328846961 +0000 UTC m=+147.971753373" Sep 30 07:36:02 crc kubenswrapper[4760]: I0930 07:36:02.378998 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vgrsq\" (UID: \"cd4422f1-0405-44b0-9256-fec03b6dc2f0\") " pod="openshift-image-registry/image-registry-697d97f7c8-vgrsq" Sep 30 07:36:02 crc kubenswrapper[4760]: E0930 07:36:02.381609 4760 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 07:36:02.881598753 +0000 UTC m=+148.524505165 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-vgrsq" (UID: "cd4422f1-0405-44b0-9256-fec03b6dc2f0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 07:36:02 crc kubenswrapper[4760]: I0930 07:36:02.481802 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 07:36:02 crc kubenswrapper[4760]: E0930 07:36:02.482434 4760 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 07:36:02.982418574 +0000 UTC m=+148.625324986 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 07:36:02 crc kubenswrapper[4760]: I0930 07:36:02.583200 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vgrsq\" (UID: \"cd4422f1-0405-44b0-9256-fec03b6dc2f0\") " pod="openshift-image-registry/image-registry-697d97f7c8-vgrsq" Sep 30 07:36:02 crc kubenswrapper[4760]: E0930 07:36:02.583549 4760 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 07:36:03.083537104 +0000 UTC m=+148.726443516 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-vgrsq" (UID: "cd4422f1-0405-44b0-9256-fec03b6dc2f0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 07:36:02 crc kubenswrapper[4760]: I0930 07:36:02.594709 4760 patch_prober.go:28] interesting pod/router-default-5444994796-vjd5w container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Sep 30 07:36:02 crc kubenswrapper[4760]: [-]has-synced failed: reason withheld Sep 30 07:36:02 crc kubenswrapper[4760]: [+]process-running ok Sep 30 07:36:02 crc kubenswrapper[4760]: healthz check failed Sep 30 07:36:02 crc kubenswrapper[4760]: I0930 07:36:02.595131 4760 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-vjd5w" podUID="eb4bc73c-b474-4efa-b348-63ff26045c24" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Sep 30 07:36:02 crc kubenswrapper[4760]: I0930 07:36:02.684774 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 07:36:02 crc kubenswrapper[4760]: E0930 07:36:02.685123 4760 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2025-09-30 07:36:03.185109026 +0000 UTC m=+148.828015438 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 07:36:02 crc kubenswrapper[4760]: I0930 07:36:02.786619 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 07:36:02 crc kubenswrapper[4760]: I0930 07:36:02.786715 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vgrsq\" (UID: \"cd4422f1-0405-44b0-9256-fec03b6dc2f0\") " pod="openshift-image-registry/image-registry-697d97f7c8-vgrsq" Sep 30 07:36:02 crc kubenswrapper[4760]: I0930 07:36:02.786740 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 07:36:02 crc kubenswrapper[4760]: E0930 07:36:02.787810 4760 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 07:36:03.287793211 +0000 UTC m=+148.930699623 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-vgrsq" (UID: "cd4422f1-0405-44b0-9256-fec03b6dc2f0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 07:36:02 crc kubenswrapper[4760]: I0930 07:36:02.788016 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 07:36:02 crc kubenswrapper[4760]: I0930 07:36:02.795599 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-58897d9998-zwnlf" Sep 30 07:36:02 crc kubenswrapper[4760]: I0930 07:36:02.804114 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 07:36:02 crc kubenswrapper[4760]: I0930 07:36:02.887695 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 07:36:02 crc kubenswrapper[4760]: E0930 07:36:02.887890 4760 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 07:36:03.38785783 +0000 UTC m=+149.030764252 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 07:36:02 crc kubenswrapper[4760]: I0930 07:36:02.887925 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 07:36:02 crc kubenswrapper[4760]: I0930 07:36:02.887958 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vgrsq\" (UID: \"cd4422f1-0405-44b0-9256-fec03b6dc2f0\") " pod="openshift-image-registry/image-registry-697d97f7c8-vgrsq" Sep 30 07:36:02 crc 
kubenswrapper[4760]: I0930 07:36:02.887991 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 07:36:02 crc kubenswrapper[4760]: E0930 07:36:02.888610 4760 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 07:36:03.388593901 +0000 UTC m=+149.031500303 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-vgrsq" (UID: "cd4422f1-0405-44b0-9256-fec03b6dc2f0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 07:36:02 crc kubenswrapper[4760]: I0930 07:36:02.912182 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 07:36:02 crc kubenswrapper[4760]: I0930 07:36:02.912895 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: 
\"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 07:36:02 crc kubenswrapper[4760]: I0930 07:36:02.988827 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 07:36:02 crc kubenswrapper[4760]: E0930 07:36:02.988996 4760 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 07:36:03.488969239 +0000 UTC m=+149.131875651 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 07:36:02 crc kubenswrapper[4760]: I0930 07:36:02.989480 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vgrsq\" (UID: \"cd4422f1-0405-44b0-9256-fec03b6dc2f0\") " pod="openshift-image-registry/image-registry-697d97f7c8-vgrsq" Sep 30 07:36:02 crc kubenswrapper[4760]: E0930 07:36:02.989789 4760 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 07:36:03.489775862 +0000 UTC m=+149.132682274 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-vgrsq" (UID: "cd4422f1-0405-44b0-9256-fec03b6dc2f0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 07:36:03 crc kubenswrapper[4760]: I0930 07:36:03.085471 4760 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock" Sep 30 07:36:03 crc kubenswrapper[4760]: I0930 07:36:03.090198 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 07:36:03 crc kubenswrapper[4760]: E0930 07:36:03.090874 4760 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 07:36:03.590854591 +0000 UTC m=+149.233761023 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 07:36:03 crc kubenswrapper[4760]: I0930 07:36:03.097137 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 07:36:03 crc kubenswrapper[4760]: I0930 07:36:03.188150 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 07:36:03 crc kubenswrapper[4760]: I0930 07:36:03.192134 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vgrsq\" (UID: \"cd4422f1-0405-44b0-9256-fec03b6dc2f0\") " pod="openshift-image-registry/image-registry-697d97f7c8-vgrsq" Sep 30 07:36:03 crc kubenswrapper[4760]: E0930 07:36:03.192530 4760 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 07:36:03.692514836 +0000 UTC m=+149.335421248 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-vgrsq" (UID: "cd4422f1-0405-44b0-9256-fec03b6dc2f0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 07:36:03 crc kubenswrapper[4760]: I0930 07:36:03.202480 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 07:36:03 crc kubenswrapper[4760]: I0930 07:36:03.270592 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-zrlmg" event={"ID":"de49f8fe-30f8-44ae-beaa-fe61cf7b0a16","Type":"ContainerStarted","Data":"f814689e438a2b36e68cfe63855e8c0977f2ad34aa5b959b4d463079a26bd59c"} Sep 30 07:36:03 crc kubenswrapper[4760]: I0930 07:36:03.270865 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-zrlmg" event={"ID":"de49f8fe-30f8-44ae-beaa-fe61cf7b0a16","Type":"ContainerStarted","Data":"c5c605d89a6321d4b17e9681b44fab719e8b5e0664e9dfe21f63a8eae19cdcd1"} Sep 30 07:36:03 crc kubenswrapper[4760]: I0930 07:36:03.270875 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-zrlmg" event={"ID":"de49f8fe-30f8-44ae-beaa-fe61cf7b0a16","Type":"ContainerStarted","Data":"472d3e1bf98548adc35feb917f79bd48708b38afd8d3c030b22e28fa0a54c774"} Sep 30 07:36:03 crc kubenswrapper[4760]: I0930 07:36:03.273408 4760 generic.go:334] "Generic (PLEG): container finished" podID="db316304-e02f-43df-b4e2-6cdd6ee3b7eb" containerID="a6a4df1c779996b80c57ea79d6eadfb720c63d9facc8b5e93311cdc717996acc" exitCode=0 Sep 30 07:36:03 crc kubenswrapper[4760]: I0930 07:36:03.273540 4760 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29320290-w8n6p" event={"ID":"db316304-e02f-43df-b4e2-6cdd6ee3b7eb","Type":"ContainerDied","Data":"a6a4df1c779996b80c57ea79d6eadfb720c63d9facc8b5e93311cdc717996acc"} Sep 30 07:36:03 crc kubenswrapper[4760]: I0930 07:36:03.275069 4760 patch_prober.go:28] interesting pod/downloads-7954f5f757-jx7lt container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.25:8080/\": dial tcp 10.217.0.25:8080: connect: connection refused" start-of-body= Sep 30 07:36:03 crc kubenswrapper[4760]: I0930 07:36:03.275121 4760 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-jx7lt" podUID="3b428f41-cff0-421d-a763-987d15be26eb" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.25:8080/\": dial tcp 10.217.0.25:8080: connect: connection refused" Sep 30 07:36:03 crc kubenswrapper[4760]: I0930 07:36:03.287177 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-config-operator/openshift-config-operator-7777fb866f-p96nl" Sep 30 07:36:03 crc kubenswrapper[4760]: I0930 07:36:03.289980 4760 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="hostpath-provisioner/csi-hostpathplugin-zrlmg" podStartSLOduration=9.28995553 podStartE2EDuration="9.28995553s" podCreationTimestamp="2025-09-30 07:35:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 07:36:03.284698559 +0000 UTC m=+148.927604991" watchObservedRunningTime="2025-09-30 07:36:03.28995553 +0000 UTC m=+148.932861942" Sep 30 07:36:03 crc kubenswrapper[4760]: I0930 07:36:03.292949 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 07:36:03 crc kubenswrapper[4760]: E0930 07:36:03.293277 4760 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 07:36:03.793258324 +0000 UTC m=+149.436164736 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 07:36:03 crc kubenswrapper[4760]: I0930 07:36:03.313234 4760 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock","Timestamp":"2025-09-30T07:36:03.085666502Z","Handler":null,"Name":""} Sep 30 07:36:03 crc kubenswrapper[4760]: I0930 07:36:03.315403 4760 csi_plugin.go:100] kubernetes.io/csi: Trying to validate a new CSI Driver with name: kubevirt.io.hostpath-provisioner endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock versions: 1.0.0 Sep 30 07:36:03 crc kubenswrapper[4760]: I0930 07:36:03.315420 4760 csi_plugin.go:113] kubernetes.io/csi: Register new plugin with name: kubevirt.io.hostpath-provisioner at endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock Sep 30 07:36:03 crc kubenswrapper[4760]: I0930 07:36:03.394013 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vgrsq\" (UID: \"cd4422f1-0405-44b0-9256-fec03b6dc2f0\") " pod="openshift-image-registry/image-registry-697d97f7c8-vgrsq" Sep 30 07:36:03 crc kubenswrapper[4760]: I0930 07:36:03.404798 4760 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Sep 30 07:36:03 crc kubenswrapper[4760]: I0930 07:36:03.404838 4760 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vgrsq\" (UID: \"cd4422f1-0405-44b0-9256-fec03b6dc2f0\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount\"" pod="openshift-image-registry/image-registry-697d97f7c8-vgrsq" Sep 30 07:36:03 crc kubenswrapper[4760]: I0930 07:36:03.438274 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vgrsq\" (UID: \"cd4422f1-0405-44b0-9256-fec03b6dc2f0\") " pod="openshift-image-registry/image-registry-697d97f7c8-vgrsq" Sep 30 07:36:03 crc kubenswrapper[4760]: I0930 07:36:03.496119 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 07:36:03 crc 
kubenswrapper[4760]: I0930 07:36:03.500061 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue "" Sep 30 07:36:03 crc kubenswrapper[4760]: I0930 07:36:03.511686 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-tjdgw"] Sep 30 07:36:03 crc kubenswrapper[4760]: I0930 07:36:03.512549 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-tjdgw" Sep 30 07:36:03 crc kubenswrapper[4760]: I0930 07:36:03.516008 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Sep 30 07:36:03 crc kubenswrapper[4760]: I0930 07:36:03.533163 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-tjdgw"] Sep 30 07:36:03 crc kubenswrapper[4760]: I0930 07:36:03.551390 4760 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-vgrsq" Sep 30 07:36:03 crc kubenswrapper[4760]: I0930 07:36:03.595602 4760 patch_prober.go:28] interesting pod/router-default-5444994796-vjd5w container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Sep 30 07:36:03 crc kubenswrapper[4760]: [-]has-synced failed: reason withheld Sep 30 07:36:03 crc kubenswrapper[4760]: [+]process-running ok Sep 30 07:36:03 crc kubenswrapper[4760]: healthz check failed Sep 30 07:36:03 crc kubenswrapper[4760]: I0930 07:36:03.595650 4760 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-vjd5w" podUID="eb4bc73c-b474-4efa-b348-63ff26045c24" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Sep 30 07:36:03 crc kubenswrapper[4760]: I0930 07:36:03.598509 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hnfp9\" (UniqueName: \"kubernetes.io/projected/a4ec1e48-4ef3-4cfc-9eb2-cd3a0d47e6dc-kube-api-access-hnfp9\") pod \"community-operators-tjdgw\" (UID: \"a4ec1e48-4ef3-4cfc-9eb2-cd3a0d47e6dc\") " pod="openshift-marketplace/community-operators-tjdgw" Sep 30 07:36:03 crc kubenswrapper[4760]: I0930 07:36:03.598565 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a4ec1e48-4ef3-4cfc-9eb2-cd3a0d47e6dc-utilities\") pod \"community-operators-tjdgw\" (UID: \"a4ec1e48-4ef3-4cfc-9eb2-cd3a0d47e6dc\") " pod="openshift-marketplace/community-operators-tjdgw" Sep 30 07:36:03 crc kubenswrapper[4760]: I0930 07:36:03.598600 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/a4ec1e48-4ef3-4cfc-9eb2-cd3a0d47e6dc-catalog-content\") pod \"community-operators-tjdgw\" (UID: \"a4ec1e48-4ef3-4cfc-9eb2-cd3a0d47e6dc\") " pod="openshift-marketplace/community-operators-tjdgw" Sep 30 07:36:03 crc kubenswrapper[4760]: I0930 07:36:03.700103 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a4ec1e48-4ef3-4cfc-9eb2-cd3a0d47e6dc-catalog-content\") pod \"community-operators-tjdgw\" (UID: \"a4ec1e48-4ef3-4cfc-9eb2-cd3a0d47e6dc\") " pod="openshift-marketplace/community-operators-tjdgw" Sep 30 07:36:03 crc kubenswrapper[4760]: I0930 07:36:03.700171 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hnfp9\" (UniqueName: \"kubernetes.io/projected/a4ec1e48-4ef3-4cfc-9eb2-cd3a0d47e6dc-kube-api-access-hnfp9\") pod \"community-operators-tjdgw\" (UID: \"a4ec1e48-4ef3-4cfc-9eb2-cd3a0d47e6dc\") " pod="openshift-marketplace/community-operators-tjdgw" Sep 30 07:36:03 crc kubenswrapper[4760]: I0930 07:36:03.700207 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a4ec1e48-4ef3-4cfc-9eb2-cd3a0d47e6dc-utilities\") pod \"community-operators-tjdgw\" (UID: \"a4ec1e48-4ef3-4cfc-9eb2-cd3a0d47e6dc\") " pod="openshift-marketplace/community-operators-tjdgw" Sep 30 07:36:03 crc kubenswrapper[4760]: I0930 07:36:03.700653 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a4ec1e48-4ef3-4cfc-9eb2-cd3a0d47e6dc-utilities\") pod \"community-operators-tjdgw\" (UID: \"a4ec1e48-4ef3-4cfc-9eb2-cd3a0d47e6dc\") " pod="openshift-marketplace/community-operators-tjdgw" Sep 30 07:36:03 crc kubenswrapper[4760]: I0930 07:36:03.700657 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/a4ec1e48-4ef3-4cfc-9eb2-cd3a0d47e6dc-catalog-content\") pod \"community-operators-tjdgw\" (UID: \"a4ec1e48-4ef3-4cfc-9eb2-cd3a0d47e6dc\") " pod="openshift-marketplace/community-operators-tjdgw" Sep 30 07:36:03 crc kubenswrapper[4760]: I0930 07:36:03.706071 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-52kxf"] Sep 30 07:36:03 crc kubenswrapper[4760]: I0930 07:36:03.707422 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-52kxf" Sep 30 07:36:03 crc kubenswrapper[4760]: I0930 07:36:03.711199 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Sep 30 07:36:03 crc kubenswrapper[4760]: I0930 07:36:03.717487 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-52kxf"] Sep 30 07:36:03 crc kubenswrapper[4760]: I0930 07:36:03.733742 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hnfp9\" (UniqueName: \"kubernetes.io/projected/a4ec1e48-4ef3-4cfc-9eb2-cd3a0d47e6dc-kube-api-access-hnfp9\") pod \"community-operators-tjdgw\" (UID: \"a4ec1e48-4ef3-4cfc-9eb2-cd3a0d47e6dc\") " pod="openshift-marketplace/community-operators-tjdgw" Sep 30 07:36:03 crc kubenswrapper[4760]: I0930 07:36:03.783693 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-vgrsq"] Sep 30 07:36:03 crc kubenswrapper[4760]: W0930 07:36:03.788131 4760 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcd4422f1_0405_44b0_9256_fec03b6dc2f0.slice/crio-7129f2f7b0843c528b48a3ea2f621f159c41dd2f894ff0b1002cfe8804b89434 WatchSource:0}: Error finding container 7129f2f7b0843c528b48a3ea2f621f159c41dd2f894ff0b1002cfe8804b89434: Status 404 returned error can't find the container 
with id 7129f2f7b0843c528b48a3ea2f621f159c41dd2f894ff0b1002cfe8804b89434 Sep 30 07:36:03 crc kubenswrapper[4760]: I0930 07:36:03.801257 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6qqp5\" (UniqueName: \"kubernetes.io/projected/ffeadfaf-9c4c-4dce-99d5-36f0e42571d7-kube-api-access-6qqp5\") pod \"certified-operators-52kxf\" (UID: \"ffeadfaf-9c4c-4dce-99d5-36f0e42571d7\") " pod="openshift-marketplace/certified-operators-52kxf" Sep 30 07:36:03 crc kubenswrapper[4760]: I0930 07:36:03.801321 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ffeadfaf-9c4c-4dce-99d5-36f0e42571d7-catalog-content\") pod \"certified-operators-52kxf\" (UID: \"ffeadfaf-9c4c-4dce-99d5-36f0e42571d7\") " pod="openshift-marketplace/certified-operators-52kxf" Sep 30 07:36:03 crc kubenswrapper[4760]: I0930 07:36:03.801370 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ffeadfaf-9c4c-4dce-99d5-36f0e42571d7-utilities\") pod \"certified-operators-52kxf\" (UID: \"ffeadfaf-9c4c-4dce-99d5-36f0e42571d7\") " pod="openshift-marketplace/certified-operators-52kxf" Sep 30 07:36:03 crc kubenswrapper[4760]: I0930 07:36:03.835808 4760 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-tjdgw" Sep 30 07:36:03 crc kubenswrapper[4760]: I0930 07:36:03.903888 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6qqp5\" (UniqueName: \"kubernetes.io/projected/ffeadfaf-9c4c-4dce-99d5-36f0e42571d7-kube-api-access-6qqp5\") pod \"certified-operators-52kxf\" (UID: \"ffeadfaf-9c4c-4dce-99d5-36f0e42571d7\") " pod="openshift-marketplace/certified-operators-52kxf" Sep 30 07:36:03 crc kubenswrapper[4760]: I0930 07:36:03.903930 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ffeadfaf-9c4c-4dce-99d5-36f0e42571d7-catalog-content\") pod \"certified-operators-52kxf\" (UID: \"ffeadfaf-9c4c-4dce-99d5-36f0e42571d7\") " pod="openshift-marketplace/certified-operators-52kxf" Sep 30 07:36:03 crc kubenswrapper[4760]: I0930 07:36:03.903980 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ffeadfaf-9c4c-4dce-99d5-36f0e42571d7-utilities\") pod \"certified-operators-52kxf\" (UID: \"ffeadfaf-9c4c-4dce-99d5-36f0e42571d7\") " pod="openshift-marketplace/certified-operators-52kxf" Sep 30 07:36:03 crc kubenswrapper[4760]: I0930 07:36:03.904380 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ffeadfaf-9c4c-4dce-99d5-36f0e42571d7-utilities\") pod \"certified-operators-52kxf\" (UID: \"ffeadfaf-9c4c-4dce-99d5-36f0e42571d7\") " pod="openshift-marketplace/certified-operators-52kxf" Sep 30 07:36:03 crc kubenswrapper[4760]: I0930 07:36:03.904608 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ffeadfaf-9c4c-4dce-99d5-36f0e42571d7-catalog-content\") pod \"certified-operators-52kxf\" (UID: \"ffeadfaf-9c4c-4dce-99d5-36f0e42571d7\") " 
pod="openshift-marketplace/certified-operators-52kxf" Sep 30 07:36:03 crc kubenswrapper[4760]: I0930 07:36:03.909578 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-s4pkh"] Sep 30 07:36:03 crc kubenswrapper[4760]: I0930 07:36:03.910489 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-s4pkh" Sep 30 07:36:03 crc kubenswrapper[4760]: I0930 07:36:03.932417 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6qqp5\" (UniqueName: \"kubernetes.io/projected/ffeadfaf-9c4c-4dce-99d5-36f0e42571d7-kube-api-access-6qqp5\") pod \"certified-operators-52kxf\" (UID: \"ffeadfaf-9c4c-4dce-99d5-36f0e42571d7\") " pod="openshift-marketplace/certified-operators-52kxf" Sep 30 07:36:03 crc kubenswrapper[4760]: I0930 07:36:03.933878 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-s4pkh"] Sep 30 07:36:04 crc kubenswrapper[4760]: I0930 07:36:04.005003 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/58cb5713-4587-4909-9aaa-5eae3a314c9e-utilities\") pod \"community-operators-s4pkh\" (UID: \"58cb5713-4587-4909-9aaa-5eae3a314c9e\") " pod="openshift-marketplace/community-operators-s4pkh" Sep 30 07:36:04 crc kubenswrapper[4760]: I0930 07:36:04.005127 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/58cb5713-4587-4909-9aaa-5eae3a314c9e-catalog-content\") pod \"community-operators-s4pkh\" (UID: \"58cb5713-4587-4909-9aaa-5eae3a314c9e\") " pod="openshift-marketplace/community-operators-s4pkh" Sep 30 07:36:04 crc kubenswrapper[4760]: I0930 07:36:04.005214 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bpw29\" 
(UniqueName: \"kubernetes.io/projected/58cb5713-4587-4909-9aaa-5eae3a314c9e-kube-api-access-bpw29\") pod \"community-operators-s4pkh\" (UID: \"58cb5713-4587-4909-9aaa-5eae3a314c9e\") " pod="openshift-marketplace/community-operators-s4pkh" Sep 30 07:36:04 crc kubenswrapper[4760]: I0930 07:36:04.028679 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-tjdgw"] Sep 30 07:36:04 crc kubenswrapper[4760]: W0930 07:36:04.033618 4760 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda4ec1e48_4ef3_4cfc_9eb2_cd3a0d47e6dc.slice/crio-dcc0e7a7a638fe1bb7c2ff6bddb3da6cf762f578d372d30a1bcaffdf99194f71 WatchSource:0}: Error finding container dcc0e7a7a638fe1bb7c2ff6bddb3da6cf762f578d372d30a1bcaffdf99194f71: Status 404 returned error can't find the container with id dcc0e7a7a638fe1bb7c2ff6bddb3da6cf762f578d372d30a1bcaffdf99194f71 Sep 30 07:36:04 crc kubenswrapper[4760]: I0930 07:36:04.048294 4760 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-52kxf" Sep 30 07:36:04 crc kubenswrapper[4760]: I0930 07:36:04.104544 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-8f8g7"] Sep 30 07:36:04 crc kubenswrapper[4760]: I0930 07:36:04.105842 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/58cb5713-4587-4909-9aaa-5eae3a314c9e-utilities\") pod \"community-operators-s4pkh\" (UID: \"58cb5713-4587-4909-9aaa-5eae3a314c9e\") " pod="openshift-marketplace/community-operators-s4pkh" Sep 30 07:36:04 crc kubenswrapper[4760]: I0930 07:36:04.105899 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/58cb5713-4587-4909-9aaa-5eae3a314c9e-catalog-content\") pod \"community-operators-s4pkh\" (UID: \"58cb5713-4587-4909-9aaa-5eae3a314c9e\") " pod="openshift-marketplace/community-operators-s4pkh" Sep 30 07:36:04 crc kubenswrapper[4760]: I0930 07:36:04.105928 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bpw29\" (UniqueName: \"kubernetes.io/projected/58cb5713-4587-4909-9aaa-5eae3a314c9e-kube-api-access-bpw29\") pod \"community-operators-s4pkh\" (UID: \"58cb5713-4587-4909-9aaa-5eae3a314c9e\") " pod="openshift-marketplace/community-operators-s4pkh" Sep 30 07:36:04 crc kubenswrapper[4760]: I0930 07:36:04.106336 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/58cb5713-4587-4909-9aaa-5eae3a314c9e-utilities\") pod \"community-operators-s4pkh\" (UID: \"58cb5713-4587-4909-9aaa-5eae3a314c9e\") " pod="openshift-marketplace/community-operators-s4pkh" Sep 30 07:36:04 crc kubenswrapper[4760]: I0930 07:36:04.106457 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/58cb5713-4587-4909-9aaa-5eae3a314c9e-catalog-content\") pod \"community-operators-s4pkh\" (UID: \"58cb5713-4587-4909-9aaa-5eae3a314c9e\") " pod="openshift-marketplace/community-operators-s4pkh" Sep 30 07:36:04 crc kubenswrapper[4760]: I0930 07:36:04.106465 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-8f8g7" Sep 30 07:36:04 crc kubenswrapper[4760]: I0930 07:36:04.114597 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-8f8g7"] Sep 30 07:36:04 crc kubenswrapper[4760]: I0930 07:36:04.142255 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bpw29\" (UniqueName: \"kubernetes.io/projected/58cb5713-4587-4909-9aaa-5eae3a314c9e-kube-api-access-bpw29\") pod \"community-operators-s4pkh\" (UID: \"58cb5713-4587-4909-9aaa-5eae3a314c9e\") " pod="openshift-marketplace/community-operators-s4pkh" Sep 30 07:36:04 crc kubenswrapper[4760]: I0930 07:36:04.207242 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/93384c2b-6a0e-41e8-a873-501fb43090a5-catalog-content\") pod \"certified-operators-8f8g7\" (UID: \"93384c2b-6a0e-41e8-a873-501fb43090a5\") " pod="openshift-marketplace/certified-operators-8f8g7" Sep 30 07:36:04 crc kubenswrapper[4760]: I0930 07:36:04.207322 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nqfz7\" (UniqueName: \"kubernetes.io/projected/93384c2b-6a0e-41e8-a873-501fb43090a5-kube-api-access-nqfz7\") pod \"certified-operators-8f8g7\" (UID: \"93384c2b-6a0e-41e8-a873-501fb43090a5\") " pod="openshift-marketplace/certified-operators-8f8g7" Sep 30 07:36:04 crc kubenswrapper[4760]: I0930 07:36:04.207409 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"utilities\" (UniqueName: \"kubernetes.io/empty-dir/93384c2b-6a0e-41e8-a873-501fb43090a5-utilities\") pod \"certified-operators-8f8g7\" (UID: \"93384c2b-6a0e-41e8-a873-501fb43090a5\") " pod="openshift-marketplace/certified-operators-8f8g7" Sep 30 07:36:04 crc kubenswrapper[4760]: I0930 07:36:04.235912 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Sep 30 07:36:04 crc kubenswrapper[4760]: I0930 07:36:04.236529 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Sep 30 07:36:04 crc kubenswrapper[4760]: I0930 07:36:04.238763 4760 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager"/"kube-root-ca.crt" Sep 30 07:36:04 crc kubenswrapper[4760]: I0930 07:36:04.238763 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager"/"installer-sa-dockercfg-kjl2n" Sep 30 07:36:04 crc kubenswrapper[4760]: I0930 07:36:04.239620 4760 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-s4pkh" Sep 30 07:36:04 crc kubenswrapper[4760]: I0930 07:36:04.247139 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Sep 30 07:36:04 crc kubenswrapper[4760]: I0930 07:36:04.252477 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-52kxf"] Sep 30 07:36:04 crc kubenswrapper[4760]: I0930 07:36:04.286453 4760 generic.go:334] "Generic (PLEG): container finished" podID="a4ec1e48-4ef3-4cfc-9eb2-cd3a0d47e6dc" containerID="86ab3b367ffcddf6e73455becb5c4c38a481f14d3ba6d6f3b256b281a1e160ab" exitCode=0 Sep 30 07:36:04 crc kubenswrapper[4760]: I0930 07:36:04.286503 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-tjdgw" event={"ID":"a4ec1e48-4ef3-4cfc-9eb2-cd3a0d47e6dc","Type":"ContainerDied","Data":"86ab3b367ffcddf6e73455becb5c4c38a481f14d3ba6d6f3b256b281a1e160ab"} Sep 30 07:36:04 crc kubenswrapper[4760]: I0930 07:36:04.286527 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-tjdgw" event={"ID":"a4ec1e48-4ef3-4cfc-9eb2-cd3a0d47e6dc","Type":"ContainerStarted","Data":"dcc0e7a7a638fe1bb7c2ff6bddb3da6cf762f578d372d30a1bcaffdf99194f71"} Sep 30 07:36:04 crc kubenswrapper[4760]: I0930 07:36:04.288831 4760 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Sep 30 07:36:04 crc kubenswrapper[4760]: I0930 07:36:04.290318 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"08c3b43db87d73f1993b1754d5b60050fe14a987df7dbffe3da85cf1b3c1960c"} Sep 30 07:36:04 crc kubenswrapper[4760]: I0930 07:36:04.290344 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"c03f46e0ca777e7e7d907df5091d16cf18ace43f6db167eed6933fd68147322c"} Sep 30 07:36:04 crc kubenswrapper[4760]: I0930 07:36:04.290740 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 07:36:04 crc kubenswrapper[4760]: I0930 07:36:04.293991 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"26d57e0af606c8490f35d014d0bfcba53fb5fee70b5026b392cd88cefe813e07"} Sep 30 07:36:04 crc kubenswrapper[4760]: I0930 07:36:04.294013 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"4ae9519552e3753f7d64d471d26440557ea963035698c9c78d7230b00c7cf18c"} Sep 30 07:36:04 crc kubenswrapper[4760]: I0930 07:36:04.298926 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"c8673a71ab989b7e72cd35a7176d2d3182aa0f57abcdc7bbcab7d3bfda43289f"} Sep 30 07:36:04 crc kubenswrapper[4760]: I0930 07:36:04.298955 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"5ce974c2b902ae146601de960cd6df3b92a853d7cd8dca8a5f415afa81b3dba8"} Sep 30 07:36:04 crc kubenswrapper[4760]: I0930 07:36:04.306942 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-vgrsq" 
event={"ID":"cd4422f1-0405-44b0-9256-fec03b6dc2f0","Type":"ContainerStarted","Data":"821a3622ecd7a40223be57e113504644b686b8fd664caaf9cea9bfa727166a49"} Sep 30 07:36:04 crc kubenswrapper[4760]: I0930 07:36:04.306973 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-vgrsq" event={"ID":"cd4422f1-0405-44b0-9256-fec03b6dc2f0","Type":"ContainerStarted","Data":"7129f2f7b0843c528b48a3ea2f621f159c41dd2f894ff0b1002cfe8804b89434"} Sep 30 07:36:04 crc kubenswrapper[4760]: I0930 07:36:04.308387 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/3c080c6f-c01c-4838-bc54-69cfae495f77-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"3c080c6f-c01c-4838-bc54-69cfae495f77\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Sep 30 07:36:04 crc kubenswrapper[4760]: I0930 07:36:04.308410 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/3c080c6f-c01c-4838-bc54-69cfae495f77-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"3c080c6f-c01c-4838-bc54-69cfae495f77\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Sep 30 07:36:04 crc kubenswrapper[4760]: I0930 07:36:04.308435 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/93384c2b-6a0e-41e8-a873-501fb43090a5-catalog-content\") pod \"certified-operators-8f8g7\" (UID: \"93384c2b-6a0e-41e8-a873-501fb43090a5\") " pod="openshift-marketplace/certified-operators-8f8g7" Sep 30 07:36:04 crc kubenswrapper[4760]: I0930 07:36:04.308461 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nqfz7\" (UniqueName: \"kubernetes.io/projected/93384c2b-6a0e-41e8-a873-501fb43090a5-kube-api-access-nqfz7\") pod 
\"certified-operators-8f8g7\" (UID: \"93384c2b-6a0e-41e8-a873-501fb43090a5\") " pod="openshift-marketplace/certified-operators-8f8g7" Sep 30 07:36:04 crc kubenswrapper[4760]: I0930 07:36:04.308514 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/93384c2b-6a0e-41e8-a873-501fb43090a5-utilities\") pod \"certified-operators-8f8g7\" (UID: \"93384c2b-6a0e-41e8-a873-501fb43090a5\") " pod="openshift-marketplace/certified-operators-8f8g7" Sep 30 07:36:04 crc kubenswrapper[4760]: I0930 07:36:04.308967 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/93384c2b-6a0e-41e8-a873-501fb43090a5-utilities\") pod \"certified-operators-8f8g7\" (UID: \"93384c2b-6a0e-41e8-a873-501fb43090a5\") " pod="openshift-marketplace/certified-operators-8f8g7" Sep 30 07:36:04 crc kubenswrapper[4760]: I0930 07:36:04.309000 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-697d97f7c8-vgrsq" Sep 30 07:36:04 crc kubenswrapper[4760]: I0930 07:36:04.309237 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/93384c2b-6a0e-41e8-a873-501fb43090a5-catalog-content\") pod \"certified-operators-8f8g7\" (UID: \"93384c2b-6a0e-41e8-a873-501fb43090a5\") " pod="openshift-marketplace/certified-operators-8f8g7" Sep 30 07:36:04 crc kubenswrapper[4760]: I0930 07:36:04.312723 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-52kxf" event={"ID":"ffeadfaf-9c4c-4dce-99d5-36f0e42571d7","Type":"ContainerStarted","Data":"3caeb18152aba8c700b54cb3b3a6c4df897571b6c28e0f3c667f1700af988929"} Sep 30 07:36:04 crc kubenswrapper[4760]: I0930 07:36:04.328118 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nqfz7\" (UniqueName: 
\"kubernetes.io/projected/93384c2b-6a0e-41e8-a873-501fb43090a5-kube-api-access-nqfz7\") pod \"certified-operators-8f8g7\" (UID: \"93384c2b-6a0e-41e8-a873-501fb43090a5\") " pod="openshift-marketplace/certified-operators-8f8g7" Sep 30 07:36:04 crc kubenswrapper[4760]: I0930 07:36:04.394050 4760 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-697d97f7c8-vgrsq" podStartSLOduration=129.394033358 podStartE2EDuration="2m9.394033358s" podCreationTimestamp="2025-09-30 07:33:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 07:36:04.391178656 +0000 UTC m=+150.034085068" watchObservedRunningTime="2025-09-30 07:36:04.394033358 +0000 UTC m=+150.036939770" Sep 30 07:36:04 crc kubenswrapper[4760]: I0930 07:36:04.410333 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/3c080c6f-c01c-4838-bc54-69cfae495f77-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"3c080c6f-c01c-4838-bc54-69cfae495f77\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Sep 30 07:36:04 crc kubenswrapper[4760]: I0930 07:36:04.410386 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/3c080c6f-c01c-4838-bc54-69cfae495f77-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"3c080c6f-c01c-4838-bc54-69cfae495f77\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Sep 30 07:36:04 crc kubenswrapper[4760]: I0930 07:36:04.410431 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/3c080c6f-c01c-4838-bc54-69cfae495f77-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"3c080c6f-c01c-4838-bc54-69cfae495f77\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Sep 
30 07:36:04 crc kubenswrapper[4760]: I0930 07:36:04.422602 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-8f8g7" Sep 30 07:36:04 crc kubenswrapper[4760]: I0930 07:36:04.428960 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/3c080c6f-c01c-4838-bc54-69cfae495f77-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"3c080c6f-c01c-4838-bc54-69cfae495f77\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Sep 30 07:36:04 crc kubenswrapper[4760]: I0930 07:36:04.538259 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-s4pkh"] Sep 30 07:36:04 crc kubenswrapper[4760]: I0930 07:36:04.560354 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Sep 30 07:36:04 crc kubenswrapper[4760]: I0930 07:36:04.593284 4760 patch_prober.go:28] interesting pod/router-default-5444994796-vjd5w container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Sep 30 07:36:04 crc kubenswrapper[4760]: [-]has-synced failed: reason withheld Sep 30 07:36:04 crc kubenswrapper[4760]: [+]process-running ok Sep 30 07:36:04 crc kubenswrapper[4760]: healthz check failed Sep 30 07:36:04 crc kubenswrapper[4760]: I0930 07:36:04.593360 4760 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-vjd5w" podUID="eb4bc73c-b474-4efa-b348-63ff26045c24" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Sep 30 07:36:04 crc kubenswrapper[4760]: I0930 07:36:04.652702 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-8f8g7"] Sep 30 07:36:04 crc kubenswrapper[4760]: 
W0930 07:36:04.683443 4760 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod93384c2b_6a0e_41e8_a873_501fb43090a5.slice/crio-05c8f101fa3f67c9d42e072cce16ae9ef5c75e752ca1eb4b026ae3c1d0296d5b WatchSource:0}: Error finding container 05c8f101fa3f67c9d42e072cce16ae9ef5c75e752ca1eb4b026ae3c1d0296d5b: Status 404 returned error can't find the container with id 05c8f101fa3f67c9d42e072cce16ae9ef5c75e752ca1eb4b026ae3c1d0296d5b Sep 30 07:36:04 crc kubenswrapper[4760]: I0930 07:36:04.747033 4760 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29320290-w8n6p" Sep 30 07:36:04 crc kubenswrapper[4760]: I0930 07:36:04.804486 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Sep 30 07:36:04 crc kubenswrapper[4760]: I0930 07:36:04.818139 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/db316304-e02f-43df-b4e2-6cdd6ee3b7eb-secret-volume\") pod \"db316304-e02f-43df-b4e2-6cdd6ee3b7eb\" (UID: \"db316304-e02f-43df-b4e2-6cdd6ee3b7eb\") " Sep 30 07:36:04 crc kubenswrapper[4760]: I0930 07:36:04.818264 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s4vvx\" (UniqueName: \"kubernetes.io/projected/db316304-e02f-43df-b4e2-6cdd6ee3b7eb-kube-api-access-s4vvx\") pod \"db316304-e02f-43df-b4e2-6cdd6ee3b7eb\" (UID: \"db316304-e02f-43df-b4e2-6cdd6ee3b7eb\") " Sep 30 07:36:04 crc kubenswrapper[4760]: I0930 07:36:04.818315 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/db316304-e02f-43df-b4e2-6cdd6ee3b7eb-config-volume\") pod \"db316304-e02f-43df-b4e2-6cdd6ee3b7eb\" (UID: \"db316304-e02f-43df-b4e2-6cdd6ee3b7eb\") " Sep 30 07:36:04 crc 
kubenswrapper[4760]: I0930 07:36:04.820151 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/db316304-e02f-43df-b4e2-6cdd6ee3b7eb-config-volume" (OuterVolumeSpecName: "config-volume") pod "db316304-e02f-43df-b4e2-6cdd6ee3b7eb" (UID: "db316304-e02f-43df-b4e2-6cdd6ee3b7eb"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 07:36:04 crc kubenswrapper[4760]: I0930 07:36:04.824332 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/db316304-e02f-43df-b4e2-6cdd6ee3b7eb-kube-api-access-s4vvx" (OuterVolumeSpecName: "kube-api-access-s4vvx") pod "db316304-e02f-43df-b4e2-6cdd6ee3b7eb" (UID: "db316304-e02f-43df-b4e2-6cdd6ee3b7eb"). InnerVolumeSpecName "kube-api-access-s4vvx". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 07:36:04 crc kubenswrapper[4760]: I0930 07:36:04.827132 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/db316304-e02f-43df-b4e2-6cdd6ee3b7eb-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "db316304-e02f-43df-b4e2-6cdd6ee3b7eb" (UID: "db316304-e02f-43df-b4e2-6cdd6ee3b7eb"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 07:36:04 crc kubenswrapper[4760]: W0930 07:36:04.829052 4760 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod3c080c6f_c01c_4838_bc54_69cfae495f77.slice/crio-35844374756cf22593ca36b43bb808b1f7a725e40327a3f8d2e127b702c3b855 WatchSource:0}: Error finding container 35844374756cf22593ca36b43bb808b1f7a725e40327a3f8d2e127b702c3b855: Status 404 returned error can't find the container with id 35844374756cf22593ca36b43bb808b1f7a725e40327a3f8d2e127b702c3b855 Sep 30 07:36:04 crc kubenswrapper[4760]: I0930 07:36:04.919479 4760 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s4vvx\" (UniqueName: \"kubernetes.io/projected/db316304-e02f-43df-b4e2-6cdd6ee3b7eb-kube-api-access-s4vvx\") on node \"crc\" DevicePath \"\"" Sep 30 07:36:04 crc kubenswrapper[4760]: I0930 07:36:04.919734 4760 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/db316304-e02f-43df-b4e2-6cdd6ee3b7eb-config-volume\") on node \"crc\" DevicePath \"\"" Sep 30 07:36:04 crc kubenswrapper[4760]: I0930 07:36:04.919744 4760 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/db316304-e02f-43df-b4e2-6cdd6ee3b7eb-secret-volume\") on node \"crc\" DevicePath \"\"" Sep 30 07:36:05 crc kubenswrapper[4760]: I0930 07:36:05.074116 4760 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8f668bae-612b-4b75-9490-919e737c6a3b" path="/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes" Sep 30 07:36:05 crc kubenswrapper[4760]: I0930 07:36:05.320988 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29320290-w8n6p" event={"ID":"db316304-e02f-43df-b4e2-6cdd6ee3b7eb","Type":"ContainerDied","Data":"ece71c94ecade083af1a35bc8d5ffc8421c6053f3a8debb97bde3a68724e8cad"} Sep 30 07:36:05 crc 
kubenswrapper[4760]: I0930 07:36:05.321027 4760 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ece71c94ecade083af1a35bc8d5ffc8421c6053f3a8debb97bde3a68724e8cad" Sep 30 07:36:05 crc kubenswrapper[4760]: I0930 07:36:05.321050 4760 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29320290-w8n6p" Sep 30 07:36:05 crc kubenswrapper[4760]: I0930 07:36:05.322819 4760 generic.go:334] "Generic (PLEG): container finished" podID="93384c2b-6a0e-41e8-a873-501fb43090a5" containerID="665b5e019ceb22d177916e0ad39b164434ac71f0f641b24a716f4e4c0015aa76" exitCode=0 Sep 30 07:36:05 crc kubenswrapper[4760]: I0930 07:36:05.323386 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8f8g7" event={"ID":"93384c2b-6a0e-41e8-a873-501fb43090a5","Type":"ContainerDied","Data":"665b5e019ceb22d177916e0ad39b164434ac71f0f641b24a716f4e4c0015aa76"} Sep 30 07:36:05 crc kubenswrapper[4760]: I0930 07:36:05.323430 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8f8g7" event={"ID":"93384c2b-6a0e-41e8-a873-501fb43090a5","Type":"ContainerStarted","Data":"05c8f101fa3f67c9d42e072cce16ae9ef5c75e752ca1eb4b026ae3c1d0296d5b"} Sep 30 07:36:05 crc kubenswrapper[4760]: I0930 07:36:05.325845 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"3c080c6f-c01c-4838-bc54-69cfae495f77","Type":"ContainerStarted","Data":"5ce5e888967d775e41b83e50408f558d258ad673c7c79d75c147d926428214c1"} Sep 30 07:36:05 crc kubenswrapper[4760]: I0930 07:36:05.325887 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"3c080c6f-c01c-4838-bc54-69cfae495f77","Type":"ContainerStarted","Data":"35844374756cf22593ca36b43bb808b1f7a725e40327a3f8d2e127b702c3b855"} Sep 30 07:36:05 
crc kubenswrapper[4760]: I0930 07:36:05.335706 4760 generic.go:334] "Generic (PLEG): container finished" podID="ffeadfaf-9c4c-4dce-99d5-36f0e42571d7" containerID="391940cdfb659af1e0e3bf92ae30609a677b891832f2bfdbc81853bde0e1d645" exitCode=0 Sep 30 07:36:05 crc kubenswrapper[4760]: I0930 07:36:05.336518 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-52kxf" event={"ID":"ffeadfaf-9c4c-4dce-99d5-36f0e42571d7","Type":"ContainerDied","Data":"391940cdfb659af1e0e3bf92ae30609a677b891832f2bfdbc81853bde0e1d645"} Sep 30 07:36:05 crc kubenswrapper[4760]: I0930 07:36:05.345548 4760 generic.go:334] "Generic (PLEG): container finished" podID="58cb5713-4587-4909-9aaa-5eae3a314c9e" containerID="edc0c4ac0dd66b6ae4c241d280e5685d9b96ebcf01d4765777e9f1a3a2dda9a4" exitCode=0 Sep 30 07:36:05 crc kubenswrapper[4760]: I0930 07:36:05.346625 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-s4pkh" event={"ID":"58cb5713-4587-4909-9aaa-5eae3a314c9e","Type":"ContainerDied","Data":"edc0c4ac0dd66b6ae4c241d280e5685d9b96ebcf01d4765777e9f1a3a2dda9a4"} Sep 30 07:36:05 crc kubenswrapper[4760]: I0930 07:36:05.346649 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-s4pkh" event={"ID":"58cb5713-4587-4909-9aaa-5eae3a314c9e","Type":"ContainerStarted","Data":"c056d9e7473f3b40a0c5d9293f8348d9bdbb7664756d2392da5a7ed137462847"} Sep 30 07:36:05 crc kubenswrapper[4760]: I0930 07:36:05.371968 4760 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/revision-pruner-9-crc" podStartSLOduration=1.371947458 podStartE2EDuration="1.371947458s" podCreationTimestamp="2025-09-30 07:36:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 07:36:05.35910872 +0000 UTC m=+151.002015132" watchObservedRunningTime="2025-09-30 
07:36:05.371947458 +0000 UTC m=+151.014853870" Sep 30 07:36:05 crc kubenswrapper[4760]: I0930 07:36:05.514218 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-dp82r"] Sep 30 07:36:05 crc kubenswrapper[4760]: E0930 07:36:05.514504 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="db316304-e02f-43df-b4e2-6cdd6ee3b7eb" containerName="collect-profiles" Sep 30 07:36:05 crc kubenswrapper[4760]: I0930 07:36:05.514521 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="db316304-e02f-43df-b4e2-6cdd6ee3b7eb" containerName="collect-profiles" Sep 30 07:36:05 crc kubenswrapper[4760]: I0930 07:36:05.514639 4760 memory_manager.go:354] "RemoveStaleState removing state" podUID="db316304-e02f-43df-b4e2-6cdd6ee3b7eb" containerName="collect-profiles" Sep 30 07:36:05 crc kubenswrapper[4760]: I0930 07:36:05.515534 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-dp82r" Sep 30 07:36:05 crc kubenswrapper[4760]: I0930 07:36:05.518478 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Sep 30 07:36:05 crc kubenswrapper[4760]: I0930 07:36:05.520939 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-dp82r"] Sep 30 07:36:05 crc kubenswrapper[4760]: I0930 07:36:05.589797 4760 patch_prober.go:28] interesting pod/router-default-5444994796-vjd5w container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Sep 30 07:36:05 crc kubenswrapper[4760]: [-]has-synced failed: reason withheld Sep 30 07:36:05 crc kubenswrapper[4760]: [+]process-running ok Sep 30 07:36:05 crc kubenswrapper[4760]: healthz check failed Sep 30 07:36:05 crc kubenswrapper[4760]: I0930 07:36:05.589843 4760 prober.go:107] "Probe failed" 
probeType="Startup" pod="openshift-ingress/router-default-5444994796-vjd5w" podUID="eb4bc73c-b474-4efa-b348-63ff26045c24" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Sep 30 07:36:05 crc kubenswrapper[4760]: I0930 07:36:05.628799 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/36f4e442-c66f-4b6b-b6b3-98c8fd676ba7-catalog-content\") pod \"redhat-marketplace-dp82r\" (UID: \"36f4e442-c66f-4b6b-b6b3-98c8fd676ba7\") " pod="openshift-marketplace/redhat-marketplace-dp82r" Sep 30 07:36:05 crc kubenswrapper[4760]: I0930 07:36:05.628885 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/36f4e442-c66f-4b6b-b6b3-98c8fd676ba7-utilities\") pod \"redhat-marketplace-dp82r\" (UID: \"36f4e442-c66f-4b6b-b6b3-98c8fd676ba7\") " pod="openshift-marketplace/redhat-marketplace-dp82r" Sep 30 07:36:05 crc kubenswrapper[4760]: I0930 07:36:05.628914 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7r4bg\" (UniqueName: \"kubernetes.io/projected/36f4e442-c66f-4b6b-b6b3-98c8fd676ba7-kube-api-access-7r4bg\") pod \"redhat-marketplace-dp82r\" (UID: \"36f4e442-c66f-4b6b-b6b3-98c8fd676ba7\") " pod="openshift-marketplace/redhat-marketplace-dp82r" Sep 30 07:36:05 crc kubenswrapper[4760]: I0930 07:36:05.660220 4760 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-apiserver/apiserver-76f77b778f-bhzlk" Sep 30 07:36:05 crc kubenswrapper[4760]: I0930 07:36:05.667974 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-apiserver/apiserver-76f77b778f-bhzlk" Sep 30 07:36:05 crc kubenswrapper[4760]: I0930 07:36:05.729585 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/36f4e442-c66f-4b6b-b6b3-98c8fd676ba7-catalog-content\") pod \"redhat-marketplace-dp82r\" (UID: \"36f4e442-c66f-4b6b-b6b3-98c8fd676ba7\") " pod="openshift-marketplace/redhat-marketplace-dp82r" Sep 30 07:36:05 crc kubenswrapper[4760]: I0930 07:36:05.729734 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/36f4e442-c66f-4b6b-b6b3-98c8fd676ba7-utilities\") pod \"redhat-marketplace-dp82r\" (UID: \"36f4e442-c66f-4b6b-b6b3-98c8fd676ba7\") " pod="openshift-marketplace/redhat-marketplace-dp82r" Sep 30 07:36:05 crc kubenswrapper[4760]: I0930 07:36:05.729762 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7r4bg\" (UniqueName: \"kubernetes.io/projected/36f4e442-c66f-4b6b-b6b3-98c8fd676ba7-kube-api-access-7r4bg\") pod \"redhat-marketplace-dp82r\" (UID: \"36f4e442-c66f-4b6b-b6b3-98c8fd676ba7\") " pod="openshift-marketplace/redhat-marketplace-dp82r" Sep 30 07:36:05 crc kubenswrapper[4760]: I0930 07:36:05.731495 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/36f4e442-c66f-4b6b-b6b3-98c8fd676ba7-catalog-content\") pod \"redhat-marketplace-dp82r\" (UID: \"36f4e442-c66f-4b6b-b6b3-98c8fd676ba7\") " pod="openshift-marketplace/redhat-marketplace-dp82r" Sep 30 07:36:05 crc kubenswrapper[4760]: I0930 07:36:05.731536 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/36f4e442-c66f-4b6b-b6b3-98c8fd676ba7-utilities\") pod \"redhat-marketplace-dp82r\" (UID: \"36f4e442-c66f-4b6b-b6b3-98c8fd676ba7\") " pod="openshift-marketplace/redhat-marketplace-dp82r" Sep 30 07:36:05 crc kubenswrapper[4760]: I0930 07:36:05.772458 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7r4bg\" (UniqueName: 
\"kubernetes.io/projected/36f4e442-c66f-4b6b-b6b3-98c8fd676ba7-kube-api-access-7r4bg\") pod \"redhat-marketplace-dp82r\" (UID: \"36f4e442-c66f-4b6b-b6b3-98c8fd676ba7\") " pod="openshift-marketplace/redhat-marketplace-dp82r" Sep 30 07:36:05 crc kubenswrapper[4760]: I0930 07:36:05.882602 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-dp82r" Sep 30 07:36:05 crc kubenswrapper[4760]: I0930 07:36:05.919782 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-xk9tc"] Sep 30 07:36:05 crc kubenswrapper[4760]: I0930 07:36:05.920724 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-xk9tc" Sep 30 07:36:05 crc kubenswrapper[4760]: I0930 07:36:05.937242 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-xk9tc"] Sep 30 07:36:06 crc kubenswrapper[4760]: I0930 07:36:06.035729 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8a700923-3c70-481e-9fad-c9de1e186301-catalog-content\") pod \"redhat-marketplace-xk9tc\" (UID: \"8a700923-3c70-481e-9fad-c9de1e186301\") " pod="openshift-marketplace/redhat-marketplace-xk9tc" Sep 30 07:36:06 crc kubenswrapper[4760]: I0930 07:36:06.035775 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8a700923-3c70-481e-9fad-c9de1e186301-utilities\") pod \"redhat-marketplace-xk9tc\" (UID: \"8a700923-3c70-481e-9fad-c9de1e186301\") " pod="openshift-marketplace/redhat-marketplace-xk9tc" Sep 30 07:36:06 crc kubenswrapper[4760]: I0930 07:36:06.035817 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-576xx\" (UniqueName: 
\"kubernetes.io/projected/8a700923-3c70-481e-9fad-c9de1e186301-kube-api-access-576xx\") pod \"redhat-marketplace-xk9tc\" (UID: \"8a700923-3c70-481e-9fad-c9de1e186301\") " pod="openshift-marketplace/redhat-marketplace-xk9tc" Sep 30 07:36:06 crc kubenswrapper[4760]: I0930 07:36:06.137277 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-576xx\" (UniqueName: \"kubernetes.io/projected/8a700923-3c70-481e-9fad-c9de1e186301-kube-api-access-576xx\") pod \"redhat-marketplace-xk9tc\" (UID: \"8a700923-3c70-481e-9fad-c9de1e186301\") " pod="openshift-marketplace/redhat-marketplace-xk9tc" Sep 30 07:36:06 crc kubenswrapper[4760]: I0930 07:36:06.137432 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8a700923-3c70-481e-9fad-c9de1e186301-catalog-content\") pod \"redhat-marketplace-xk9tc\" (UID: \"8a700923-3c70-481e-9fad-c9de1e186301\") " pod="openshift-marketplace/redhat-marketplace-xk9tc" Sep 30 07:36:06 crc kubenswrapper[4760]: I0930 07:36:06.137459 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8a700923-3c70-481e-9fad-c9de1e186301-utilities\") pod \"redhat-marketplace-xk9tc\" (UID: \"8a700923-3c70-481e-9fad-c9de1e186301\") " pod="openshift-marketplace/redhat-marketplace-xk9tc" Sep 30 07:36:06 crc kubenswrapper[4760]: I0930 07:36:06.138399 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8a700923-3c70-481e-9fad-c9de1e186301-utilities\") pod \"redhat-marketplace-xk9tc\" (UID: \"8a700923-3c70-481e-9fad-c9de1e186301\") " pod="openshift-marketplace/redhat-marketplace-xk9tc" Sep 30 07:36:06 crc kubenswrapper[4760]: I0930 07:36:06.138505 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/8a700923-3c70-481e-9fad-c9de1e186301-catalog-content\") pod \"redhat-marketplace-xk9tc\" (UID: \"8a700923-3c70-481e-9fad-c9de1e186301\") " pod="openshift-marketplace/redhat-marketplace-xk9tc" Sep 30 07:36:06 crc kubenswrapper[4760]: I0930 07:36:06.162767 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-576xx\" (UniqueName: \"kubernetes.io/projected/8a700923-3c70-481e-9fad-c9de1e186301-kube-api-access-576xx\") pod \"redhat-marketplace-xk9tc\" (UID: \"8a700923-3c70-481e-9fad-c9de1e186301\") " pod="openshift-marketplace/redhat-marketplace-xk9tc" Sep 30 07:36:06 crc kubenswrapper[4760]: I0930 07:36:06.262328 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-dp82r"] Sep 30 07:36:06 crc kubenswrapper[4760]: W0930 07:36:06.271320 4760 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod36f4e442_c66f_4b6b_b6b3_98c8fd676ba7.slice/crio-c976640224ed8f3d13b6f420a937e00e348c1a161a0a6a34b5c04f7622a1f039 WatchSource:0}: Error finding container c976640224ed8f3d13b6f420a937e00e348c1a161a0a6a34b5c04f7622a1f039: Status 404 returned error can't find the container with id c976640224ed8f3d13b6f420a937e00e348c1a161a0a6a34b5c04f7622a1f039 Sep 30 07:36:06 crc kubenswrapper[4760]: I0930 07:36:06.296816 4760 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-xk9tc" Sep 30 07:36:06 crc kubenswrapper[4760]: I0930 07:36:06.354036 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-dp82r" event={"ID":"36f4e442-c66f-4b6b-b6b3-98c8fd676ba7","Type":"ContainerStarted","Data":"c976640224ed8f3d13b6f420a937e00e348c1a161a0a6a34b5c04f7622a1f039"} Sep 30 07:36:06 crc kubenswrapper[4760]: I0930 07:36:06.356550 4760 generic.go:334] "Generic (PLEG): container finished" podID="3c080c6f-c01c-4838-bc54-69cfae495f77" containerID="5ce5e888967d775e41b83e50408f558d258ad673c7c79d75c147d926428214c1" exitCode=0 Sep 30 07:36:06 crc kubenswrapper[4760]: I0930 07:36:06.357352 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"3c080c6f-c01c-4838-bc54-69cfae495f77","Type":"ContainerDied","Data":"5ce5e888967d775e41b83e50408f558d258ad673c7c79d75c147d926428214c1"} Sep 30 07:36:06 crc kubenswrapper[4760]: I0930 07:36:06.541725 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-xk9tc"] Sep 30 07:36:06 crc kubenswrapper[4760]: I0930 07:36:06.590608 4760 patch_prober.go:28] interesting pod/router-default-5444994796-vjd5w container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Sep 30 07:36:06 crc kubenswrapper[4760]: [-]has-synced failed: reason withheld Sep 30 07:36:06 crc kubenswrapper[4760]: [+]process-running ok Sep 30 07:36:06 crc kubenswrapper[4760]: healthz check failed Sep 30 07:36:06 crc kubenswrapper[4760]: I0930 07:36:06.590670 4760 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-vjd5w" podUID="eb4bc73c-b474-4efa-b348-63ff26045c24" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Sep 30 
07:36:06 crc kubenswrapper[4760]: I0930 07:36:06.706825 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-zj9hk"] Sep 30 07:36:06 crc kubenswrapper[4760]: I0930 07:36:06.708516 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-zj9hk" Sep 30 07:36:06 crc kubenswrapper[4760]: I0930 07:36:06.711556 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Sep 30 07:36:06 crc kubenswrapper[4760]: I0930 07:36:06.716865 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-zj9hk"] Sep 30 07:36:06 crc kubenswrapper[4760]: I0930 07:36:06.750019 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/97af4546-9849-4500-a18e-994ec8158af0-catalog-content\") pod \"redhat-operators-zj9hk\" (UID: \"97af4546-9849-4500-a18e-994ec8158af0\") " pod="openshift-marketplace/redhat-operators-zj9hk" Sep 30 07:36:06 crc kubenswrapper[4760]: I0930 07:36:06.750071 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9242t\" (UniqueName: \"kubernetes.io/projected/97af4546-9849-4500-a18e-994ec8158af0-kube-api-access-9242t\") pod \"redhat-operators-zj9hk\" (UID: \"97af4546-9849-4500-a18e-994ec8158af0\") " pod="openshift-marketplace/redhat-operators-zj9hk" Sep 30 07:36:06 crc kubenswrapper[4760]: I0930 07:36:06.750098 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/97af4546-9849-4500-a18e-994ec8158af0-utilities\") pod \"redhat-operators-zj9hk\" (UID: \"97af4546-9849-4500-a18e-994ec8158af0\") " pod="openshift-marketplace/redhat-operators-zj9hk" Sep 30 07:36:06 crc kubenswrapper[4760]: I0930 
07:36:06.851210 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/97af4546-9849-4500-a18e-994ec8158af0-catalog-content\") pod \"redhat-operators-zj9hk\" (UID: \"97af4546-9849-4500-a18e-994ec8158af0\") " pod="openshift-marketplace/redhat-operators-zj9hk" Sep 30 07:36:06 crc kubenswrapper[4760]: I0930 07:36:06.851255 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9242t\" (UniqueName: \"kubernetes.io/projected/97af4546-9849-4500-a18e-994ec8158af0-kube-api-access-9242t\") pod \"redhat-operators-zj9hk\" (UID: \"97af4546-9849-4500-a18e-994ec8158af0\") " pod="openshift-marketplace/redhat-operators-zj9hk" Sep 30 07:36:06 crc kubenswrapper[4760]: I0930 07:36:06.851280 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/97af4546-9849-4500-a18e-994ec8158af0-utilities\") pod \"redhat-operators-zj9hk\" (UID: \"97af4546-9849-4500-a18e-994ec8158af0\") " pod="openshift-marketplace/redhat-operators-zj9hk" Sep 30 07:36:06 crc kubenswrapper[4760]: I0930 07:36:06.851817 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/97af4546-9849-4500-a18e-994ec8158af0-catalog-content\") pod \"redhat-operators-zj9hk\" (UID: \"97af4546-9849-4500-a18e-994ec8158af0\") " pod="openshift-marketplace/redhat-operators-zj9hk" Sep 30 07:36:06 crc kubenswrapper[4760]: I0930 07:36:06.851837 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/97af4546-9849-4500-a18e-994ec8158af0-utilities\") pod \"redhat-operators-zj9hk\" (UID: \"97af4546-9849-4500-a18e-994ec8158af0\") " pod="openshift-marketplace/redhat-operators-zj9hk" Sep 30 07:36:06 crc kubenswrapper[4760]: I0930 07:36:06.871986 4760 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-9242t\" (UniqueName: \"kubernetes.io/projected/97af4546-9849-4500-a18e-994ec8158af0-kube-api-access-9242t\") pod \"redhat-operators-zj9hk\" (UID: \"97af4546-9849-4500-a18e-994ec8158af0\") " pod="openshift-marketplace/redhat-operators-zj9hk" Sep 30 07:36:07 crc kubenswrapper[4760]: I0930 07:36:07.030928 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-zj9hk" Sep 30 07:36:07 crc kubenswrapper[4760]: I0930 07:36:07.109399 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-9mtn4"] Sep 30 07:36:07 crc kubenswrapper[4760]: I0930 07:36:07.111848 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-9mtn4" Sep 30 07:36:07 crc kubenswrapper[4760]: I0930 07:36:07.119763 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-9mtn4"] Sep 30 07:36:07 crc kubenswrapper[4760]: I0930 07:36:07.157792 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tk2pb\" (UniqueName: \"kubernetes.io/projected/ebfeeebf-0ff3-4f39-a225-4bbef7a61bee-kube-api-access-tk2pb\") pod \"redhat-operators-9mtn4\" (UID: \"ebfeeebf-0ff3-4f39-a225-4bbef7a61bee\") " pod="openshift-marketplace/redhat-operators-9mtn4" Sep 30 07:36:07 crc kubenswrapper[4760]: I0930 07:36:07.157852 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ebfeeebf-0ff3-4f39-a225-4bbef7a61bee-utilities\") pod \"redhat-operators-9mtn4\" (UID: \"ebfeeebf-0ff3-4f39-a225-4bbef7a61bee\") " pod="openshift-marketplace/redhat-operators-9mtn4" Sep 30 07:36:07 crc kubenswrapper[4760]: I0930 07:36:07.157900 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" 
(UniqueName: \"kubernetes.io/empty-dir/ebfeeebf-0ff3-4f39-a225-4bbef7a61bee-catalog-content\") pod \"redhat-operators-9mtn4\" (UID: \"ebfeeebf-0ff3-4f39-a225-4bbef7a61bee\") " pod="openshift-marketplace/redhat-operators-9mtn4" Sep 30 07:36:07 crc kubenswrapper[4760]: I0930 07:36:07.259522 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tk2pb\" (UniqueName: \"kubernetes.io/projected/ebfeeebf-0ff3-4f39-a225-4bbef7a61bee-kube-api-access-tk2pb\") pod \"redhat-operators-9mtn4\" (UID: \"ebfeeebf-0ff3-4f39-a225-4bbef7a61bee\") " pod="openshift-marketplace/redhat-operators-9mtn4" Sep 30 07:36:07 crc kubenswrapper[4760]: I0930 07:36:07.259581 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ebfeeebf-0ff3-4f39-a225-4bbef7a61bee-utilities\") pod \"redhat-operators-9mtn4\" (UID: \"ebfeeebf-0ff3-4f39-a225-4bbef7a61bee\") " pod="openshift-marketplace/redhat-operators-9mtn4" Sep 30 07:36:07 crc kubenswrapper[4760]: I0930 07:36:07.259620 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ebfeeebf-0ff3-4f39-a225-4bbef7a61bee-catalog-content\") pod \"redhat-operators-9mtn4\" (UID: \"ebfeeebf-0ff3-4f39-a225-4bbef7a61bee\") " pod="openshift-marketplace/redhat-operators-9mtn4" Sep 30 07:36:07 crc kubenswrapper[4760]: I0930 07:36:07.260056 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ebfeeebf-0ff3-4f39-a225-4bbef7a61bee-catalog-content\") pod \"redhat-operators-9mtn4\" (UID: \"ebfeeebf-0ff3-4f39-a225-4bbef7a61bee\") " pod="openshift-marketplace/redhat-operators-9mtn4" Sep 30 07:36:07 crc kubenswrapper[4760]: I0930 07:36:07.260724 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/ebfeeebf-0ff3-4f39-a225-4bbef7a61bee-utilities\") pod \"redhat-operators-9mtn4\" (UID: \"ebfeeebf-0ff3-4f39-a225-4bbef7a61bee\") " pod="openshift-marketplace/redhat-operators-9mtn4" Sep 30 07:36:07 crc kubenswrapper[4760]: I0930 07:36:07.282182 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tk2pb\" (UniqueName: \"kubernetes.io/projected/ebfeeebf-0ff3-4f39-a225-4bbef7a61bee-kube-api-access-tk2pb\") pod \"redhat-operators-9mtn4\" (UID: \"ebfeeebf-0ff3-4f39-a225-4bbef7a61bee\") " pod="openshift-marketplace/redhat-operators-9mtn4" Sep 30 07:36:07 crc kubenswrapper[4760]: I0930 07:36:07.292063 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-zj9hk"] Sep 30 07:36:07 crc kubenswrapper[4760]: W0930 07:36:07.327166 4760 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod97af4546_9849_4500_a18e_994ec8158af0.slice/crio-12ef421c2b85f611b54f842e529425bbc4f7ea1c13cbc68d02242a5c9044d83f WatchSource:0}: Error finding container 12ef421c2b85f611b54f842e529425bbc4f7ea1c13cbc68d02242a5c9044d83f: Status 404 returned error can't find the container with id 12ef421c2b85f611b54f842e529425bbc4f7ea1c13cbc68d02242a5c9044d83f Sep 30 07:36:07 crc kubenswrapper[4760]: I0930 07:36:07.371063 4760 generic.go:334] "Generic (PLEG): container finished" podID="36f4e442-c66f-4b6b-b6b3-98c8fd676ba7" containerID="a3ac86061330ee8cb670dbcc161921fa74a3ec58b68857906a49a40e87f3d895" exitCode=0 Sep 30 07:36:07 crc kubenswrapper[4760]: I0930 07:36:07.371140 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-dp82r" event={"ID":"36f4e442-c66f-4b6b-b6b3-98c8fd676ba7","Type":"ContainerDied","Data":"a3ac86061330ee8cb670dbcc161921fa74a3ec58b68857906a49a40e87f3d895"} Sep 30 07:36:07 crc kubenswrapper[4760]: I0930 07:36:07.372487 4760 kubelet.go:2542] "SyncLoop (probe)" 
probe="startup" status="unhealthy" pod="openshift-console/console-f9d7485db-ntgr2" Sep 30 07:36:07 crc kubenswrapper[4760]: I0930 07:36:07.372519 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-f9d7485db-ntgr2" Sep 30 07:36:07 crc kubenswrapper[4760]: I0930 07:36:07.373243 4760 patch_prober.go:28] interesting pod/console-f9d7485db-ntgr2 container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.12:8443/health\": dial tcp 10.217.0.12:8443: connect: connection refused" start-of-body= Sep 30 07:36:07 crc kubenswrapper[4760]: I0930 07:36:07.373299 4760 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-ntgr2" podUID="08d362f3-5c04-45fe-9981-ada11b028f83" containerName="console" probeResult="failure" output="Get \"https://10.217.0.12:8443/health\": dial tcp 10.217.0.12:8443: connect: connection refused" Sep 30 07:36:07 crc kubenswrapper[4760]: I0930 07:36:07.376845 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zj9hk" event={"ID":"97af4546-9849-4500-a18e-994ec8158af0","Type":"ContainerStarted","Data":"12ef421c2b85f611b54f842e529425bbc4f7ea1c13cbc68d02242a5c9044d83f"} Sep 30 07:36:07 crc kubenswrapper[4760]: I0930 07:36:07.380074 4760 generic.go:334] "Generic (PLEG): container finished" podID="8a700923-3c70-481e-9fad-c9de1e186301" containerID="295bbe9be0ddea029ca918ff33b3d22fd1b078b8687add4e6f732f50683b0b72" exitCode=0 Sep 30 07:36:07 crc kubenswrapper[4760]: I0930 07:36:07.380195 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-xk9tc" event={"ID":"8a700923-3c70-481e-9fad-c9de1e186301","Type":"ContainerDied","Data":"295bbe9be0ddea029ca918ff33b3d22fd1b078b8687add4e6f732f50683b0b72"} Sep 30 07:36:07 crc kubenswrapper[4760]: I0930 07:36:07.380247 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-marketplace-xk9tc" event={"ID":"8a700923-3c70-481e-9fad-c9de1e186301","Type":"ContainerStarted","Data":"ff46fd9c4a775059aefdd3b5d98c1f9d730c2f326eaf51daf62496430ab2a466"} Sep 30 07:36:07 crc kubenswrapper[4760]: I0930 07:36:07.439247 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-9mtn4" Sep 30 07:36:07 crc kubenswrapper[4760]: I0930 07:36:07.587678 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ingress/router-default-5444994796-vjd5w" Sep 30 07:36:07 crc kubenswrapper[4760]: I0930 07:36:07.590581 4760 patch_prober.go:28] interesting pod/router-default-5444994796-vjd5w container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Sep 30 07:36:07 crc kubenswrapper[4760]: [-]has-synced failed: reason withheld Sep 30 07:36:07 crc kubenswrapper[4760]: [+]process-running ok Sep 30 07:36:07 crc kubenswrapper[4760]: healthz check failed Sep 30 07:36:07 crc kubenswrapper[4760]: I0930 07:36:07.590632 4760 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-vjd5w" podUID="eb4bc73c-b474-4efa-b348-63ff26045c24" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Sep 30 07:36:07 crc kubenswrapper[4760]: I0930 07:36:07.689802 4760 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Sep 30 07:36:07 crc kubenswrapper[4760]: I0930 07:36:07.764399 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/3c080c6f-c01c-4838-bc54-69cfae495f77-kubelet-dir\") pod \"3c080c6f-c01c-4838-bc54-69cfae495f77\" (UID: \"3c080c6f-c01c-4838-bc54-69cfae495f77\") " Sep 30 07:36:07 crc kubenswrapper[4760]: I0930 07:36:07.764537 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/3c080c6f-c01c-4838-bc54-69cfae495f77-kube-api-access\") pod \"3c080c6f-c01c-4838-bc54-69cfae495f77\" (UID: \"3c080c6f-c01c-4838-bc54-69cfae495f77\") " Sep 30 07:36:07 crc kubenswrapper[4760]: I0930 07:36:07.764523 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/3c080c6f-c01c-4838-bc54-69cfae495f77-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "3c080c6f-c01c-4838-bc54-69cfae495f77" (UID: "3c080c6f-c01c-4838-bc54-69cfae495f77"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Sep 30 07:36:07 crc kubenswrapper[4760]: I0930 07:36:07.764845 4760 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/3c080c6f-c01c-4838-bc54-69cfae495f77-kubelet-dir\") on node \"crc\" DevicePath \"\"" Sep 30 07:36:07 crc kubenswrapper[4760]: I0930 07:36:07.787553 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3c080c6f-c01c-4838-bc54-69cfae495f77-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "3c080c6f-c01c-4838-bc54-69cfae495f77" (UID: "3c080c6f-c01c-4838-bc54-69cfae495f77"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 07:36:07 crc kubenswrapper[4760]: I0930 07:36:07.807925 4760 patch_prober.go:28] interesting pod/downloads-7954f5f757-jx7lt container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.25:8080/\": dial tcp 10.217.0.25:8080: connect: connection refused" start-of-body= Sep 30 07:36:07 crc kubenswrapper[4760]: I0930 07:36:07.807998 4760 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-jx7lt" podUID="3b428f41-cff0-421d-a763-987d15be26eb" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.25:8080/\": dial tcp 10.217.0.25:8080: connect: connection refused" Sep 30 07:36:07 crc kubenswrapper[4760]: I0930 07:36:07.807925 4760 patch_prober.go:28] interesting pod/downloads-7954f5f757-jx7lt container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.25:8080/\": dial tcp 10.217.0.25:8080: connect: connection refused" start-of-body= Sep 30 07:36:07 crc kubenswrapper[4760]: I0930 07:36:07.808242 4760 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-jx7lt" podUID="3b428f41-cff0-421d-a763-987d15be26eb" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.25:8080/\": dial tcp 10.217.0.25:8080: connect: connection refused" Sep 30 07:36:07 crc kubenswrapper[4760]: I0930 07:36:07.829048 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-9mtn4"] Sep 30 07:36:07 crc kubenswrapper[4760]: I0930 07:36:07.866699 4760 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/3c080c6f-c01c-4838-bc54-69cfae495f77-kube-api-access\") on node \"crc\" DevicePath \"\"" Sep 30 07:36:08 crc kubenswrapper[4760]: I0930 07:36:08.404490 4760 util.go:48] "No ready sandbox for pod 
can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Sep 30 07:36:08 crc kubenswrapper[4760]: I0930 07:36:08.404496 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"3c080c6f-c01c-4838-bc54-69cfae495f77","Type":"ContainerDied","Data":"35844374756cf22593ca36b43bb808b1f7a725e40327a3f8d2e127b702c3b855"} Sep 30 07:36:08 crc kubenswrapper[4760]: I0930 07:36:08.404530 4760 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="35844374756cf22593ca36b43bb808b1f7a725e40327a3f8d2e127b702c3b855" Sep 30 07:36:08 crc kubenswrapper[4760]: I0930 07:36:08.410501 4760 generic.go:334] "Generic (PLEG): container finished" podID="ebfeeebf-0ff3-4f39-a225-4bbef7a61bee" containerID="ffdb865142dfd2c237049f2e2e7a02169957aacc9694c7747e58207eab063e7b" exitCode=0 Sep 30 07:36:08 crc kubenswrapper[4760]: I0930 07:36:08.410610 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9mtn4" event={"ID":"ebfeeebf-0ff3-4f39-a225-4bbef7a61bee","Type":"ContainerDied","Data":"ffdb865142dfd2c237049f2e2e7a02169957aacc9694c7747e58207eab063e7b"} Sep 30 07:36:08 crc kubenswrapper[4760]: I0930 07:36:08.410639 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9mtn4" event={"ID":"ebfeeebf-0ff3-4f39-a225-4bbef7a61bee","Type":"ContainerStarted","Data":"6fc3a4c83771cfa1179dba9390d2fedf329c77b6bd5e0bcea4fce9b7600e8da8"} Sep 30 07:36:08 crc kubenswrapper[4760]: I0930 07:36:08.415289 4760 generic.go:334] "Generic (PLEG): container finished" podID="97af4546-9849-4500-a18e-994ec8158af0" containerID="af92c8ab55f3635fe1254979c54172fcd2bd36edbf77421c77b3a223c0e27aa2" exitCode=0 Sep 30 07:36:08 crc kubenswrapper[4760]: I0930 07:36:08.415353 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zj9hk" 
event={"ID":"97af4546-9849-4500-a18e-994ec8158af0","Type":"ContainerDied","Data":"af92c8ab55f3635fe1254979c54172fcd2bd36edbf77421c77b3a223c0e27aa2"} Sep 30 07:36:08 crc kubenswrapper[4760]: I0930 07:36:08.589171 4760 patch_prober.go:28] interesting pod/router-default-5444994796-vjd5w container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Sep 30 07:36:08 crc kubenswrapper[4760]: [+]has-synced ok Sep 30 07:36:08 crc kubenswrapper[4760]: [+]process-running ok Sep 30 07:36:08 crc kubenswrapper[4760]: healthz check failed Sep 30 07:36:08 crc kubenswrapper[4760]: I0930 07:36:08.589246 4760 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-vjd5w" podUID="eb4bc73c-b474-4efa-b348-63ff26045c24" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Sep 30 07:36:09 crc kubenswrapper[4760]: I0930 07:36:09.327035 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Sep 30 07:36:09 crc kubenswrapper[4760]: E0930 07:36:09.327450 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3c080c6f-c01c-4838-bc54-69cfae495f77" containerName="pruner" Sep 30 07:36:09 crc kubenswrapper[4760]: I0930 07:36:09.327462 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="3c080c6f-c01c-4838-bc54-69cfae495f77" containerName="pruner" Sep 30 07:36:09 crc kubenswrapper[4760]: I0930 07:36:09.327571 4760 memory_manager.go:354] "RemoveStaleState removing state" podUID="3c080c6f-c01c-4838-bc54-69cfae495f77" containerName="pruner" Sep 30 07:36:09 crc kubenswrapper[4760]: I0930 07:36:09.327890 4760 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Sep 30 07:36:09 crc kubenswrapper[4760]: I0930 07:36:09.329639 4760 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Sep 30 07:36:09 crc kubenswrapper[4760]: I0930 07:36:09.330176 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Sep 30 07:36:09 crc kubenswrapper[4760]: I0930 07:36:09.340394 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Sep 30 07:36:09 crc kubenswrapper[4760]: I0930 07:36:09.384716 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/7d9daed2-2539-4be9-995a-0972b45d9b96-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"7d9daed2-2539-4be9-995a-0972b45d9b96\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Sep 30 07:36:09 crc kubenswrapper[4760]: I0930 07:36:09.384814 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/7d9daed2-2539-4be9-995a-0972b45d9b96-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"7d9daed2-2539-4be9-995a-0972b45d9b96\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Sep 30 07:36:09 crc kubenswrapper[4760]: I0930 07:36:09.495970 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/7d9daed2-2539-4be9-995a-0972b45d9b96-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"7d9daed2-2539-4be9-995a-0972b45d9b96\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Sep 30 07:36:09 crc kubenswrapper[4760]: I0930 07:36:09.496052 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: 
\"kubernetes.io/projected/7d9daed2-2539-4be9-995a-0972b45d9b96-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"7d9daed2-2539-4be9-995a-0972b45d9b96\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Sep 30 07:36:09 crc kubenswrapper[4760]: I0930 07:36:09.496495 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/7d9daed2-2539-4be9-995a-0972b45d9b96-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"7d9daed2-2539-4be9-995a-0972b45d9b96\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Sep 30 07:36:09 crc kubenswrapper[4760]: I0930 07:36:09.513768 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/7d9daed2-2539-4be9-995a-0972b45d9b96-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"7d9daed2-2539-4be9-995a-0972b45d9b96\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Sep 30 07:36:09 crc kubenswrapper[4760]: I0930 07:36:09.589164 4760 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-5444994796-vjd5w" Sep 30 07:36:09 crc kubenswrapper[4760]: I0930 07:36:09.591381 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-5444994796-vjd5w" Sep 30 07:36:09 crc kubenswrapper[4760]: I0930 07:36:09.649015 4760 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Sep 30 07:36:10 crc kubenswrapper[4760]: I0930 07:36:10.234287 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-76w2l" Sep 30 07:36:17 crc kubenswrapper[4760]: I0930 07:36:17.377400 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Sep 30 07:36:17 crc kubenswrapper[4760]: I0930 07:36:17.378788 4760 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-f9d7485db-ntgr2" Sep 30 07:36:17 crc kubenswrapper[4760]: I0930 07:36:17.383146 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-f9d7485db-ntgr2" Sep 30 07:36:17 crc kubenswrapper[4760]: I0930 07:36:17.809360 4760 patch_prober.go:28] interesting pod/downloads-7954f5f757-jx7lt container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.25:8080/\": dial tcp 10.217.0.25:8080: connect: connection refused" start-of-body= Sep 30 07:36:17 crc kubenswrapper[4760]: I0930 07:36:17.809411 4760 patch_prober.go:28] interesting pod/downloads-7954f5f757-jx7lt container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.25:8080/\": dial tcp 10.217.0.25:8080: connect: connection refused" start-of-body= Sep 30 07:36:17 crc kubenswrapper[4760]: I0930 07:36:17.809443 4760 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-jx7lt" podUID="3b428f41-cff0-421d-a763-987d15be26eb" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.25:8080/\": dial tcp 10.217.0.25:8080: connect: connection refused" Sep 30 07:36:17 crc kubenswrapper[4760]: I0930 07:36:17.809474 4760 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-jx7lt" 
podUID="3b428f41-cff0-421d-a763-987d15be26eb" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.25:8080/\": dial tcp 10.217.0.25:8080: connect: connection refused" Sep 30 07:36:19 crc kubenswrapper[4760]: I0930 07:36:19.019670 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ce6dcf25-c8ea-450b-9fc6-9f8aeafde757-metrics-certs\") pod \"network-metrics-daemon-wv8fz\" (UID: \"ce6dcf25-c8ea-450b-9fc6-9f8aeafde757\") " pod="openshift-multus/network-metrics-daemon-wv8fz" Sep 30 07:36:19 crc kubenswrapper[4760]: I0930 07:36:19.029935 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ce6dcf25-c8ea-450b-9fc6-9f8aeafde757-metrics-certs\") pod \"network-metrics-daemon-wv8fz\" (UID: \"ce6dcf25-c8ea-450b-9fc6-9f8aeafde757\") " pod="openshift-multus/network-metrics-daemon-wv8fz" Sep 30 07:36:19 crc kubenswrapper[4760]: I0930 07:36:19.113662 4760 patch_prober.go:28] interesting pod/machine-config-daemon-f2lrk container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 07:36:19 crc kubenswrapper[4760]: I0930 07:36:19.113735 4760 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-f2lrk" podUID="7a9c8270-6964-4886-87d0-227b05b76da4" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 30 07:36:19 crc kubenswrapper[4760]: I0930 07:36:19.309600 4760 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-wv8fz" Sep 30 07:36:23 crc kubenswrapper[4760]: I0930 07:36:23.556362 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-697d97f7c8-vgrsq" Sep 30 07:36:24 crc kubenswrapper[4760]: I0930 07:36:24.519259 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"7d9daed2-2539-4be9-995a-0972b45d9b96","Type":"ContainerStarted","Data":"d8abe17d9ec25397cd743ab8d0f3ac2246a9ffa235bd1d4f5a515a7dc333ea06"} Sep 30 07:36:27 crc kubenswrapper[4760]: I0930 07:36:27.827876 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-7954f5f757-jx7lt" Sep 30 07:36:28 crc kubenswrapper[4760]: I0930 07:36:28.376086 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-wv8fz"] Sep 30 07:36:28 crc kubenswrapper[4760]: I0930 07:36:28.539733 4760 generic.go:334] "Generic (PLEG): container finished" podID="ffeadfaf-9c4c-4dce-99d5-36f0e42571d7" containerID="10ade4ac36cd7b622e165a9e5f47a830f2068f65caec2b544f4f87c9a33f7476" exitCode=0 Sep 30 07:36:28 crc kubenswrapper[4760]: I0930 07:36:28.539803 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-52kxf" event={"ID":"ffeadfaf-9c4c-4dce-99d5-36f0e42571d7","Type":"ContainerDied","Data":"10ade4ac36cd7b622e165a9e5f47a830f2068f65caec2b544f4f87c9a33f7476"} Sep 30 07:36:28 crc kubenswrapper[4760]: I0930 07:36:28.541569 4760 generic.go:334] "Generic (PLEG): container finished" podID="58cb5713-4587-4909-9aaa-5eae3a314c9e" containerID="95d45ba661a925bc72a4652aa7a0090394b0f4463f5f57da234ae94aeccac51f" exitCode=0 Sep 30 07:36:28 crc kubenswrapper[4760]: I0930 07:36:28.541621 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-s4pkh" 
event={"ID":"58cb5713-4587-4909-9aaa-5eae3a314c9e","Type":"ContainerDied","Data":"95d45ba661a925bc72a4652aa7a0090394b0f4463f5f57da234ae94aeccac51f"} Sep 30 07:36:28 crc kubenswrapper[4760]: I0930 07:36:28.544269 4760 generic.go:334] "Generic (PLEG): container finished" podID="36f4e442-c66f-4b6b-b6b3-98c8fd676ba7" containerID="800a71bc090c3629094e4cb65a311961134bb056219144c4d8202dfe8705a0fd" exitCode=0 Sep 30 07:36:28 crc kubenswrapper[4760]: I0930 07:36:28.544375 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-dp82r" event={"ID":"36f4e442-c66f-4b6b-b6b3-98c8fd676ba7","Type":"ContainerDied","Data":"800a71bc090c3629094e4cb65a311961134bb056219144c4d8202dfe8705a0fd"} Sep 30 07:36:28 crc kubenswrapper[4760]: I0930 07:36:28.547743 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"7d9daed2-2539-4be9-995a-0972b45d9b96","Type":"ContainerStarted","Data":"8af4f6d1ae812c4094b3e1cf2de6882711648db84eb061b6725372f33317f301"} Sep 30 07:36:28 crc kubenswrapper[4760]: I0930 07:36:28.548774 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-wv8fz" event={"ID":"ce6dcf25-c8ea-450b-9fc6-9f8aeafde757","Type":"ContainerStarted","Data":"76b9236e16bfc6ac6e26a45f965e81b0e8bf5348468eb990e823bd962bd14e17"} Sep 30 07:36:28 crc kubenswrapper[4760]: I0930 07:36:28.553790 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zj9hk" event={"ID":"97af4546-9849-4500-a18e-994ec8158af0","Type":"ContainerStarted","Data":"3c51d7ef70bb13e88de80fa08e0034ac694e591f9853db2daf5162585be5467c"} Sep 30 07:36:28 crc kubenswrapper[4760]: I0930 07:36:28.566128 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9mtn4" 
event={"ID":"ebfeeebf-0ff3-4f39-a225-4bbef7a61bee","Type":"ContainerStarted","Data":"c2a1ed5135956e36ec1cfa4cf372ec13c1db47a9f6a9758287020e0b798448b3"} Sep 30 07:36:28 crc kubenswrapper[4760]: I0930 07:36:28.568560 4760 generic.go:334] "Generic (PLEG): container finished" podID="a4ec1e48-4ef3-4cfc-9eb2-cd3a0d47e6dc" containerID="7207c5bbb51eb7ff6fa7ed6241d558cbd698d0ef8e53002ad1a6b4880fad525e" exitCode=0 Sep 30 07:36:28 crc kubenswrapper[4760]: I0930 07:36:28.568667 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-tjdgw" event={"ID":"a4ec1e48-4ef3-4cfc-9eb2-cd3a0d47e6dc","Type":"ContainerDied","Data":"7207c5bbb51eb7ff6fa7ed6241d558cbd698d0ef8e53002ad1a6b4880fad525e"} Sep 30 07:36:28 crc kubenswrapper[4760]: I0930 07:36:28.572141 4760 generic.go:334] "Generic (PLEG): container finished" podID="93384c2b-6a0e-41e8-a873-501fb43090a5" containerID="f5d7ed56864a9b67d94649dc3780105e1b394f238e9e5cd472c772ab93537cbb" exitCode=0 Sep 30 07:36:28 crc kubenswrapper[4760]: I0930 07:36:28.572399 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8f8g7" event={"ID":"93384c2b-6a0e-41e8-a873-501fb43090a5","Type":"ContainerDied","Data":"f5d7ed56864a9b67d94649dc3780105e1b394f238e9e5cd472c772ab93537cbb"} Sep 30 07:36:28 crc kubenswrapper[4760]: I0930 07:36:28.574768 4760 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/revision-pruner-8-crc" podStartSLOduration=19.57475446 podStartE2EDuration="19.57475446s" podCreationTimestamp="2025-09-30 07:36:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 07:36:28.573922166 +0000 UTC m=+174.216828568" watchObservedRunningTime="2025-09-30 07:36:28.57475446 +0000 UTC m=+174.217660872" Sep 30 07:36:28 crc kubenswrapper[4760]: I0930 07:36:28.596601 4760 generic.go:334] "Generic (PLEG): container 
finished" podID="8a700923-3c70-481e-9fad-c9de1e186301" containerID="6d534bae1200474d27ce2e99c1355fd36fe30c49b486f8ad53f35ab80d4368ad" exitCode=0 Sep 30 07:36:28 crc kubenswrapper[4760]: I0930 07:36:28.596651 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-xk9tc" event={"ID":"8a700923-3c70-481e-9fad-c9de1e186301","Type":"ContainerDied","Data":"6d534bae1200474d27ce2e99c1355fd36fe30c49b486f8ad53f35ab80d4368ad"} Sep 30 07:36:29 crc kubenswrapper[4760]: I0930 07:36:29.607698 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-wv8fz" event={"ID":"ce6dcf25-c8ea-450b-9fc6-9f8aeafde757","Type":"ContainerStarted","Data":"4c6c88ecce7591a2d18eb4845d059088965c02392a2b1d495cd2f905b048afa1"} Sep 30 07:36:29 crc kubenswrapper[4760]: I0930 07:36:29.608124 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-wv8fz" event={"ID":"ce6dcf25-c8ea-450b-9fc6-9f8aeafde757","Type":"ContainerStarted","Data":"3ca459506b9edb4d7ca91dcfa3ed344552488aab292e3d962d771ecef9b669a2"} Sep 30 07:36:29 crc kubenswrapper[4760]: I0930 07:36:29.613720 4760 generic.go:334] "Generic (PLEG): container finished" podID="97af4546-9849-4500-a18e-994ec8158af0" containerID="3c51d7ef70bb13e88de80fa08e0034ac694e591f9853db2daf5162585be5467c" exitCode=0 Sep 30 07:36:29 crc kubenswrapper[4760]: I0930 07:36:29.613765 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zj9hk" event={"ID":"97af4546-9849-4500-a18e-994ec8158af0","Type":"ContainerDied","Data":"3c51d7ef70bb13e88de80fa08e0034ac694e591f9853db2daf5162585be5467c"} Sep 30 07:36:29 crc kubenswrapper[4760]: I0930 07:36:29.619044 4760 generic.go:334] "Generic (PLEG): container finished" podID="ebfeeebf-0ff3-4f39-a225-4bbef7a61bee" containerID="c2a1ed5135956e36ec1cfa4cf372ec13c1db47a9f6a9758287020e0b798448b3" exitCode=0 Sep 30 07:36:29 crc kubenswrapper[4760]: I0930 
07:36:29.619175 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9mtn4" event={"ID":"ebfeeebf-0ff3-4f39-a225-4bbef7a61bee","Type":"ContainerDied","Data":"c2a1ed5135956e36ec1cfa4cf372ec13c1db47a9f6a9758287020e0b798448b3"} Sep 30 07:36:29 crc kubenswrapper[4760]: I0930 07:36:29.623107 4760 generic.go:334] "Generic (PLEG): container finished" podID="7d9daed2-2539-4be9-995a-0972b45d9b96" containerID="8af4f6d1ae812c4094b3e1cf2de6882711648db84eb061b6725372f33317f301" exitCode=0 Sep 30 07:36:29 crc kubenswrapper[4760]: I0930 07:36:29.623152 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"7d9daed2-2539-4be9-995a-0972b45d9b96","Type":"ContainerDied","Data":"8af4f6d1ae812c4094b3e1cf2de6882711648db84eb061b6725372f33317f301"} Sep 30 07:36:30 crc kubenswrapper[4760]: I0930 07:36:30.629440 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-xk9tc" event={"ID":"8a700923-3c70-481e-9fad-c9de1e186301","Type":"ContainerStarted","Data":"af6ca898785454667dfecd70ae70a919e4225eb67700a7b12dd2a32fb811ea24"} Sep 30 07:36:30 crc kubenswrapper[4760]: I0930 07:36:30.631863 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-dp82r" event={"ID":"36f4e442-c66f-4b6b-b6b3-98c8fd676ba7","Type":"ContainerStarted","Data":"90b088c8681826e2d5b74cf2e1c2733f3398f9126ac83ca553f04a107c6948be"} Sep 30 07:36:30 crc kubenswrapper[4760]: I0930 07:36:30.633629 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zj9hk" event={"ID":"97af4546-9849-4500-a18e-994ec8158af0","Type":"ContainerStarted","Data":"0d990bd19eafbf236c77646298fac63e37e3328e111bc2ed19e935062999d069"} Sep 30 07:36:30 crc kubenswrapper[4760]: I0930 07:36:30.635197 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9mtn4" 
event={"ID":"ebfeeebf-0ff3-4f39-a225-4bbef7a61bee","Type":"ContainerStarted","Data":"0be3a731f5eea49e31df742364cd070d4f7a0797f1c7543c4e575cc2bd3c4609"} Sep 30 07:36:30 crc kubenswrapper[4760]: I0930 07:36:30.636785 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-52kxf" event={"ID":"ffeadfaf-9c4c-4dce-99d5-36f0e42571d7","Type":"ContainerStarted","Data":"503d47c18b5bbd7eb35699ca96edf4c3989e91f6e3946b2c68e92e5594d1c2ab"} Sep 30 07:36:30 crc kubenswrapper[4760]: I0930 07:36:30.638328 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-tjdgw" event={"ID":"a4ec1e48-4ef3-4cfc-9eb2-cd3a0d47e6dc","Type":"ContainerStarted","Data":"8dd1d07f8be35389c3eed219a69f21128a2506ffc0d49eea49d0075d7f69a8d8"} Sep 30 07:36:30 crc kubenswrapper[4760]: I0930 07:36:30.639825 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-s4pkh" event={"ID":"58cb5713-4587-4909-9aaa-5eae3a314c9e","Type":"ContainerStarted","Data":"3a2bd664924e45a33a85207cc636ccf59ecdb0d7f0016ce7db5162d88b85affb"} Sep 30 07:36:30 crc kubenswrapper[4760]: I0930 07:36:30.641407 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8f8g7" event={"ID":"93384c2b-6a0e-41e8-a873-501fb43090a5","Type":"ContainerStarted","Data":"6cd8db9f360884e8ba75357abab181841f8828804da237f8dbf42a32432e1254"} Sep 30 07:36:30 crc kubenswrapper[4760]: I0930 07:36:30.653396 4760 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-xk9tc" podStartSLOduration=3.05253204 podStartE2EDuration="25.653380112s" podCreationTimestamp="2025-09-30 07:36:05 +0000 UTC" firstStartedPulling="2025-09-30 07:36:07.400231847 +0000 UTC m=+153.043138259" lastFinishedPulling="2025-09-30 07:36:30.001079849 +0000 UTC m=+175.643986331" observedRunningTime="2025-09-30 07:36:30.652489916 +0000 UTC m=+176.295396328" 
watchObservedRunningTime="2025-09-30 07:36:30.653380112 +0000 UTC m=+176.296286524" Sep 30 07:36:30 crc kubenswrapper[4760]: I0930 07:36:30.697242 4760 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-tjdgw" podStartSLOduration=1.9681276680000002 podStartE2EDuration="27.697222959s" podCreationTimestamp="2025-09-30 07:36:03 +0000 UTC" firstStartedPulling="2025-09-30 07:36:04.288551573 +0000 UTC m=+149.931457985" lastFinishedPulling="2025-09-30 07:36:30.017646864 +0000 UTC m=+175.660553276" observedRunningTime="2025-09-30 07:36:30.676438443 +0000 UTC m=+176.319344865" watchObservedRunningTime="2025-09-30 07:36:30.697222959 +0000 UTC m=+176.340129361" Sep 30 07:36:30 crc kubenswrapper[4760]: I0930 07:36:30.697907 4760 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-8f8g7" podStartSLOduration=2.079579197 podStartE2EDuration="26.697898648s" podCreationTimestamp="2025-09-30 07:36:04 +0000 UTC" firstStartedPulling="2025-09-30 07:36:05.324765675 +0000 UTC m=+150.967672087" lastFinishedPulling="2025-09-30 07:36:29.943085096 +0000 UTC m=+175.585991538" observedRunningTime="2025-09-30 07:36:30.696977332 +0000 UTC m=+176.339883744" watchObservedRunningTime="2025-09-30 07:36:30.697898648 +0000 UTC m=+176.340805060" Sep 30 07:36:30 crc kubenswrapper[4760]: I0930 07:36:30.724825 4760 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-zj9hk" podStartSLOduration=2.892115704 podStartE2EDuration="24.72480819s" podCreationTimestamp="2025-09-30 07:36:06 +0000 UTC" firstStartedPulling="2025-09-30 07:36:08.419907205 +0000 UTC m=+154.062813617" lastFinishedPulling="2025-09-30 07:36:30.252599691 +0000 UTC m=+175.895506103" observedRunningTime="2025-09-30 07:36:30.720731073 +0000 UTC m=+176.363637495" watchObservedRunningTime="2025-09-30 07:36:30.72480819 +0000 UTC m=+176.367714602" Sep 30 07:36:30 
crc kubenswrapper[4760]: I0930 07:36:30.738963 4760 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-dp82r" podStartSLOduration=3.272489626 podStartE2EDuration="25.738947485s" podCreationTimestamp="2025-09-30 07:36:05 +0000 UTC" firstStartedPulling="2025-09-30 07:36:07.375730434 +0000 UTC m=+153.018636846" lastFinishedPulling="2025-09-30 07:36:29.842188253 +0000 UTC m=+175.485094705" observedRunningTime="2025-09-30 07:36:30.735199928 +0000 UTC m=+176.378106350" watchObservedRunningTime="2025-09-30 07:36:30.738947485 +0000 UTC m=+176.381853897" Sep 30 07:36:30 crc kubenswrapper[4760]: I0930 07:36:30.756089 4760 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-9mtn4" podStartSLOduration=1.880332547 podStartE2EDuration="23.756073756s" podCreationTimestamp="2025-09-30 07:36:07 +0000 UTC" firstStartedPulling="2025-09-30 07:36:08.42042959 +0000 UTC m=+154.063336002" lastFinishedPulling="2025-09-30 07:36:30.296170799 +0000 UTC m=+175.939077211" observedRunningTime="2025-09-30 07:36:30.753606335 +0000 UTC m=+176.396512747" watchObservedRunningTime="2025-09-30 07:36:30.756073756 +0000 UTC m=+176.398980168" Sep 30 07:36:30 crc kubenswrapper[4760]: I0930 07:36:30.775529 4760 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-52kxf" podStartSLOduration=3.303066006 podStartE2EDuration="27.775512254s" podCreationTimestamp="2025-09-30 07:36:03 +0000 UTC" firstStartedPulling="2025-09-30 07:36:05.337771898 +0000 UTC m=+150.980678310" lastFinishedPulling="2025-09-30 07:36:29.810218116 +0000 UTC m=+175.453124558" observedRunningTime="2025-09-30 07:36:30.77295182 +0000 UTC m=+176.415858242" watchObservedRunningTime="2025-09-30 07:36:30.775512254 +0000 UTC m=+176.418418666" Sep 30 07:36:30 crc kubenswrapper[4760]: I0930 07:36:30.819294 4760 pod_startup_latency_tracker.go:104] "Observed pod startup 
duration" pod="openshift-marketplace/community-operators-s4pkh" podStartSLOduration=2.9615191210000003 podStartE2EDuration="27.819273468s" podCreationTimestamp="2025-09-30 07:36:03 +0000 UTC" firstStartedPulling="2025-09-30 07:36:05.349683209 +0000 UTC m=+150.992589621" lastFinishedPulling="2025-09-30 07:36:30.207437556 +0000 UTC m=+175.850343968" observedRunningTime="2025-09-30 07:36:30.792438699 +0000 UTC m=+176.435345121" watchObservedRunningTime="2025-09-30 07:36:30.819273468 +0000 UTC m=+176.462179880" Sep 30 07:36:30 crc kubenswrapper[4760]: I0930 07:36:30.827742 4760 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-wv8fz" podStartSLOduration=155.827720981 podStartE2EDuration="2m35.827720981s" podCreationTimestamp="2025-09-30 07:33:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 07:36:30.819013651 +0000 UTC m=+176.461920063" watchObservedRunningTime="2025-09-30 07:36:30.827720981 +0000 UTC m=+176.470627393" Sep 30 07:36:31 crc kubenswrapper[4760]: I0930 07:36:31.061728 4760 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Sep 30 07:36:31 crc kubenswrapper[4760]: I0930 07:36:31.217371 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/7d9daed2-2539-4be9-995a-0972b45d9b96-kube-api-access\") pod \"7d9daed2-2539-4be9-995a-0972b45d9b96\" (UID: \"7d9daed2-2539-4be9-995a-0972b45d9b96\") " Sep 30 07:36:31 crc kubenswrapper[4760]: I0930 07:36:31.217465 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/7d9daed2-2539-4be9-995a-0972b45d9b96-kubelet-dir\") pod \"7d9daed2-2539-4be9-995a-0972b45d9b96\" (UID: \"7d9daed2-2539-4be9-995a-0972b45d9b96\") " Sep 30 07:36:31 crc kubenswrapper[4760]: I0930 07:36:31.217601 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/7d9daed2-2539-4be9-995a-0972b45d9b96-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "7d9daed2-2539-4be9-995a-0972b45d9b96" (UID: "7d9daed2-2539-4be9-995a-0972b45d9b96"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Sep 30 07:36:31 crc kubenswrapper[4760]: I0930 07:36:31.217851 4760 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/7d9daed2-2539-4be9-995a-0972b45d9b96-kubelet-dir\") on node \"crc\" DevicePath \"\"" Sep 30 07:36:31 crc kubenswrapper[4760]: I0930 07:36:31.230042 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7d9daed2-2539-4be9-995a-0972b45d9b96-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "7d9daed2-2539-4be9-995a-0972b45d9b96" (UID: "7d9daed2-2539-4be9-995a-0972b45d9b96"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 07:36:31 crc kubenswrapper[4760]: I0930 07:36:31.318922 4760 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/7d9daed2-2539-4be9-995a-0972b45d9b96-kube-api-access\") on node \"crc\" DevicePath \"\"" Sep 30 07:36:31 crc kubenswrapper[4760]: I0930 07:36:31.646449 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"7d9daed2-2539-4be9-995a-0972b45d9b96","Type":"ContainerDied","Data":"d8abe17d9ec25397cd743ab8d0f3ac2246a9ffa235bd1d4f5a515a7dc333ea06"} Sep 30 07:36:31 crc kubenswrapper[4760]: I0930 07:36:31.646491 4760 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d8abe17d9ec25397cd743ab8d0f3ac2246a9ffa235bd1d4f5a515a7dc333ea06" Sep 30 07:36:31 crc kubenswrapper[4760]: I0930 07:36:31.646498 4760 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Sep 30 07:36:33 crc kubenswrapper[4760]: I0930 07:36:33.198641 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 07:36:33 crc kubenswrapper[4760]: I0930 07:36:33.836557 4760 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-tjdgw" Sep 30 07:36:33 crc kubenswrapper[4760]: I0930 07:36:33.836652 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-tjdgw" Sep 30 07:36:34 crc kubenswrapper[4760]: I0930 07:36:34.048634 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-52kxf" Sep 30 07:36:34 crc kubenswrapper[4760]: I0930 07:36:34.048683 4760 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openshift-marketplace/certified-operators-52kxf" Sep 30 07:36:34 crc kubenswrapper[4760]: I0930 07:36:34.105138 4760 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-tjdgw" Sep 30 07:36:34 crc kubenswrapper[4760]: I0930 07:36:34.106420 4760 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-52kxf" Sep 30 07:36:34 crc kubenswrapper[4760]: I0930 07:36:34.240643 4760 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-s4pkh" Sep 30 07:36:34 crc kubenswrapper[4760]: I0930 07:36:34.240692 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-s4pkh" Sep 30 07:36:34 crc kubenswrapper[4760]: I0930 07:36:34.321360 4760 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-s4pkh" Sep 30 07:36:34 crc kubenswrapper[4760]: I0930 07:36:34.422960 4760 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-8f8g7" Sep 30 07:36:34 crc kubenswrapper[4760]: I0930 07:36:34.423022 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-8f8g7" Sep 30 07:36:34 crc kubenswrapper[4760]: I0930 07:36:34.477376 4760 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-8f8g7" Sep 30 07:36:35 crc kubenswrapper[4760]: I0930 07:36:35.884113 4760 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-dp82r" Sep 30 07:36:35 crc kubenswrapper[4760]: I0930 07:36:35.884419 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-dp82r" Sep 30 07:36:35 crc kubenswrapper[4760]: I0930 
07:36:35.922959 4760 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-dp82r" Sep 30 07:36:36 crc kubenswrapper[4760]: I0930 07:36:36.298225 4760 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-xk9tc" Sep 30 07:36:36 crc kubenswrapper[4760]: I0930 07:36:36.298707 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-xk9tc" Sep 30 07:36:36 crc kubenswrapper[4760]: I0930 07:36:36.369192 4760 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-xk9tc" Sep 30 07:36:36 crc kubenswrapper[4760]: I0930 07:36:36.734909 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-xk9tc" Sep 30 07:36:36 crc kubenswrapper[4760]: I0930 07:36:36.747032 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-dp82r" Sep 30 07:36:37 crc kubenswrapper[4760]: I0930 07:36:37.031057 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-zj9hk" Sep 30 07:36:37 crc kubenswrapper[4760]: I0930 07:36:37.031366 4760 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-zj9hk" Sep 30 07:36:37 crc kubenswrapper[4760]: I0930 07:36:37.082199 4760 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-zj9hk" Sep 30 07:36:37 crc kubenswrapper[4760]: I0930 07:36:37.440623 4760 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-9mtn4" Sep 30 07:36:37 crc kubenswrapper[4760]: I0930 07:36:37.440997 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-marketplace/redhat-operators-9mtn4" Sep 30 07:36:37 crc kubenswrapper[4760]: I0930 07:36:37.484922 4760 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-9mtn4" Sep 30 07:36:37 crc kubenswrapper[4760]: I0930 07:36:37.748178 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-9mtn4" Sep 30 07:36:37 crc kubenswrapper[4760]: I0930 07:36:37.751653 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-zj9hk" Sep 30 07:36:37 crc kubenswrapper[4760]: I0930 07:36:37.882522 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-9gnwv" Sep 30 07:36:38 crc kubenswrapper[4760]: I0930 07:36:38.970802 4760 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-xk9tc"] Sep 30 07:36:38 crc kubenswrapper[4760]: I0930 07:36:38.971247 4760 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-xk9tc" podUID="8a700923-3c70-481e-9fad-c9de1e186301" containerName="registry-server" containerID="cri-o://af6ca898785454667dfecd70ae70a919e4225eb67700a7b12dd2a32fb811ea24" gracePeriod=2 Sep 30 07:36:39 crc kubenswrapper[4760]: I0930 07:36:39.501802 4760 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-xk9tc" Sep 30 07:36:39 crc kubenswrapper[4760]: I0930 07:36:39.655371 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8a700923-3c70-481e-9fad-c9de1e186301-utilities\") pod \"8a700923-3c70-481e-9fad-c9de1e186301\" (UID: \"8a700923-3c70-481e-9fad-c9de1e186301\") " Sep 30 07:36:39 crc kubenswrapper[4760]: I0930 07:36:39.655489 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-576xx\" (UniqueName: \"kubernetes.io/projected/8a700923-3c70-481e-9fad-c9de1e186301-kube-api-access-576xx\") pod \"8a700923-3c70-481e-9fad-c9de1e186301\" (UID: \"8a700923-3c70-481e-9fad-c9de1e186301\") " Sep 30 07:36:39 crc kubenswrapper[4760]: I0930 07:36:39.655561 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8a700923-3c70-481e-9fad-c9de1e186301-catalog-content\") pod \"8a700923-3c70-481e-9fad-c9de1e186301\" (UID: \"8a700923-3c70-481e-9fad-c9de1e186301\") " Sep 30 07:36:39 crc kubenswrapper[4760]: I0930 07:36:39.656225 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8a700923-3c70-481e-9fad-c9de1e186301-utilities" (OuterVolumeSpecName: "utilities") pod "8a700923-3c70-481e-9fad-c9de1e186301" (UID: "8a700923-3c70-481e-9fad-c9de1e186301"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 07:36:39 crc kubenswrapper[4760]: I0930 07:36:39.664940 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8a700923-3c70-481e-9fad-c9de1e186301-kube-api-access-576xx" (OuterVolumeSpecName: "kube-api-access-576xx") pod "8a700923-3c70-481e-9fad-c9de1e186301" (UID: "8a700923-3c70-481e-9fad-c9de1e186301"). InnerVolumeSpecName "kube-api-access-576xx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 07:36:39 crc kubenswrapper[4760]: I0930 07:36:39.672844 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8a700923-3c70-481e-9fad-c9de1e186301-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "8a700923-3c70-481e-9fad-c9de1e186301" (UID: "8a700923-3c70-481e-9fad-c9de1e186301"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 07:36:39 crc kubenswrapper[4760]: I0930 07:36:39.697912 4760 generic.go:334] "Generic (PLEG): container finished" podID="8a700923-3c70-481e-9fad-c9de1e186301" containerID="af6ca898785454667dfecd70ae70a919e4225eb67700a7b12dd2a32fb811ea24" exitCode=0 Sep 30 07:36:39 crc kubenswrapper[4760]: I0930 07:36:39.699100 4760 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-xk9tc" Sep 30 07:36:39 crc kubenswrapper[4760]: I0930 07:36:39.703502 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-xk9tc" event={"ID":"8a700923-3c70-481e-9fad-c9de1e186301","Type":"ContainerDied","Data":"af6ca898785454667dfecd70ae70a919e4225eb67700a7b12dd2a32fb811ea24"} Sep 30 07:36:39 crc kubenswrapper[4760]: I0930 07:36:39.703559 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-xk9tc" event={"ID":"8a700923-3c70-481e-9fad-c9de1e186301","Type":"ContainerDied","Data":"ff46fd9c4a775059aefdd3b5d98c1f9d730c2f326eaf51daf62496430ab2a466"} Sep 30 07:36:39 crc kubenswrapper[4760]: I0930 07:36:39.703590 4760 scope.go:117] "RemoveContainer" containerID="af6ca898785454667dfecd70ae70a919e4225eb67700a7b12dd2a32fb811ea24" Sep 30 07:36:39 crc kubenswrapper[4760]: I0930 07:36:39.751871 4760 scope.go:117] "RemoveContainer" containerID="6d534bae1200474d27ce2e99c1355fd36fe30c49b486f8ad53f35ab80d4368ad" Sep 30 07:36:39 crc kubenswrapper[4760]: I0930 
07:36:39.757709 4760 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8a700923-3c70-481e-9fad-c9de1e186301-utilities\") on node \"crc\" DevicePath \"\"" Sep 30 07:36:39 crc kubenswrapper[4760]: I0930 07:36:39.757762 4760 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-576xx\" (UniqueName: \"kubernetes.io/projected/8a700923-3c70-481e-9fad-c9de1e186301-kube-api-access-576xx\") on node \"crc\" DevicePath \"\"" Sep 30 07:36:39 crc kubenswrapper[4760]: I0930 07:36:39.757799 4760 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8a700923-3c70-481e-9fad-c9de1e186301-catalog-content\") on node \"crc\" DevicePath \"\"" Sep 30 07:36:39 crc kubenswrapper[4760]: I0930 07:36:39.762901 4760 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-xk9tc"] Sep 30 07:36:39 crc kubenswrapper[4760]: I0930 07:36:39.770665 4760 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-xk9tc"] Sep 30 07:36:39 crc kubenswrapper[4760]: I0930 07:36:39.794668 4760 scope.go:117] "RemoveContainer" containerID="295bbe9be0ddea029ca918ff33b3d22fd1b078b8687add4e6f732f50683b0b72" Sep 30 07:36:39 crc kubenswrapper[4760]: I0930 07:36:39.817121 4760 scope.go:117] "RemoveContainer" containerID="af6ca898785454667dfecd70ae70a919e4225eb67700a7b12dd2a32fb811ea24" Sep 30 07:36:39 crc kubenswrapper[4760]: E0930 07:36:39.817689 4760 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"af6ca898785454667dfecd70ae70a919e4225eb67700a7b12dd2a32fb811ea24\": container with ID starting with af6ca898785454667dfecd70ae70a919e4225eb67700a7b12dd2a32fb811ea24 not found: ID does not exist" containerID="af6ca898785454667dfecd70ae70a919e4225eb67700a7b12dd2a32fb811ea24" Sep 30 07:36:39 crc kubenswrapper[4760]: I0930 07:36:39.817727 4760 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"af6ca898785454667dfecd70ae70a919e4225eb67700a7b12dd2a32fb811ea24"} err="failed to get container status \"af6ca898785454667dfecd70ae70a919e4225eb67700a7b12dd2a32fb811ea24\": rpc error: code = NotFound desc = could not find container \"af6ca898785454667dfecd70ae70a919e4225eb67700a7b12dd2a32fb811ea24\": container with ID starting with af6ca898785454667dfecd70ae70a919e4225eb67700a7b12dd2a32fb811ea24 not found: ID does not exist" Sep 30 07:36:39 crc kubenswrapper[4760]: I0930 07:36:39.817767 4760 scope.go:117] "RemoveContainer" containerID="6d534bae1200474d27ce2e99c1355fd36fe30c49b486f8ad53f35ab80d4368ad" Sep 30 07:36:39 crc kubenswrapper[4760]: E0930 07:36:39.818239 4760 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6d534bae1200474d27ce2e99c1355fd36fe30c49b486f8ad53f35ab80d4368ad\": container with ID starting with 6d534bae1200474d27ce2e99c1355fd36fe30c49b486f8ad53f35ab80d4368ad not found: ID does not exist" containerID="6d534bae1200474d27ce2e99c1355fd36fe30c49b486f8ad53f35ab80d4368ad" Sep 30 07:36:39 crc kubenswrapper[4760]: I0930 07:36:39.818288 4760 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6d534bae1200474d27ce2e99c1355fd36fe30c49b486f8ad53f35ab80d4368ad"} err="failed to get container status \"6d534bae1200474d27ce2e99c1355fd36fe30c49b486f8ad53f35ab80d4368ad\": rpc error: code = NotFound desc = could not find container \"6d534bae1200474d27ce2e99c1355fd36fe30c49b486f8ad53f35ab80d4368ad\": container with ID starting with 6d534bae1200474d27ce2e99c1355fd36fe30c49b486f8ad53f35ab80d4368ad not found: ID does not exist" Sep 30 07:36:39 crc kubenswrapper[4760]: I0930 07:36:39.818338 4760 scope.go:117] "RemoveContainer" containerID="295bbe9be0ddea029ca918ff33b3d22fd1b078b8687add4e6f732f50683b0b72" Sep 30 07:36:39 crc kubenswrapper[4760]: E0930 
07:36:39.818890 4760 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"295bbe9be0ddea029ca918ff33b3d22fd1b078b8687add4e6f732f50683b0b72\": container with ID starting with 295bbe9be0ddea029ca918ff33b3d22fd1b078b8687add4e6f732f50683b0b72 not found: ID does not exist" containerID="295bbe9be0ddea029ca918ff33b3d22fd1b078b8687add4e6f732f50683b0b72" Sep 30 07:36:39 crc kubenswrapper[4760]: I0930 07:36:39.818910 4760 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"295bbe9be0ddea029ca918ff33b3d22fd1b078b8687add4e6f732f50683b0b72"} err="failed to get container status \"295bbe9be0ddea029ca918ff33b3d22fd1b078b8687add4e6f732f50683b0b72\": rpc error: code = NotFound desc = could not find container \"295bbe9be0ddea029ca918ff33b3d22fd1b078b8687add4e6f732f50683b0b72\": container with ID starting with 295bbe9be0ddea029ca918ff33b3d22fd1b078b8687add4e6f732f50683b0b72 not found: ID does not exist" Sep 30 07:36:41 crc kubenswrapper[4760]: I0930 07:36:41.080263 4760 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8a700923-3c70-481e-9fad-c9de1e186301" path="/var/lib/kubelet/pods/8a700923-3c70-481e-9fad-c9de1e186301/volumes" Sep 30 07:36:41 crc kubenswrapper[4760]: I0930 07:36:41.373766 4760 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-9mtn4"] Sep 30 07:36:41 crc kubenswrapper[4760]: I0930 07:36:41.374168 4760 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-9mtn4" podUID="ebfeeebf-0ff3-4f39-a225-4bbef7a61bee" containerName="registry-server" containerID="cri-o://0be3a731f5eea49e31df742364cd070d4f7a0797f1c7543c4e575cc2bd3c4609" gracePeriod=2 Sep 30 07:36:42 crc kubenswrapper[4760]: I0930 07:36:42.723834 4760 generic.go:334] "Generic (PLEG): container finished" podID="ebfeeebf-0ff3-4f39-a225-4bbef7a61bee" 
containerID="0be3a731f5eea49e31df742364cd070d4f7a0797f1c7543c4e575cc2bd3c4609" exitCode=0 Sep 30 07:36:42 crc kubenswrapper[4760]: I0930 07:36:42.723903 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9mtn4" event={"ID":"ebfeeebf-0ff3-4f39-a225-4bbef7a61bee","Type":"ContainerDied","Data":"0be3a731f5eea49e31df742364cd070d4f7a0797f1c7543c4e575cc2bd3c4609"} Sep 30 07:36:42 crc kubenswrapper[4760]: I0930 07:36:42.837142 4760 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-9mtn4" Sep 30 07:36:43 crc kubenswrapper[4760]: I0930 07:36:43.008855 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ebfeeebf-0ff3-4f39-a225-4bbef7a61bee-utilities\") pod \"ebfeeebf-0ff3-4f39-a225-4bbef7a61bee\" (UID: \"ebfeeebf-0ff3-4f39-a225-4bbef7a61bee\") " Sep 30 07:36:43 crc kubenswrapper[4760]: I0930 07:36:43.009084 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tk2pb\" (UniqueName: \"kubernetes.io/projected/ebfeeebf-0ff3-4f39-a225-4bbef7a61bee-kube-api-access-tk2pb\") pod \"ebfeeebf-0ff3-4f39-a225-4bbef7a61bee\" (UID: \"ebfeeebf-0ff3-4f39-a225-4bbef7a61bee\") " Sep 30 07:36:43 crc kubenswrapper[4760]: I0930 07:36:43.009161 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ebfeeebf-0ff3-4f39-a225-4bbef7a61bee-catalog-content\") pod \"ebfeeebf-0ff3-4f39-a225-4bbef7a61bee\" (UID: \"ebfeeebf-0ff3-4f39-a225-4bbef7a61bee\") " Sep 30 07:36:43 crc kubenswrapper[4760]: I0930 07:36:43.012663 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ebfeeebf-0ff3-4f39-a225-4bbef7a61bee-utilities" (OuterVolumeSpecName: "utilities") pod "ebfeeebf-0ff3-4f39-a225-4bbef7a61bee" (UID: 
"ebfeeebf-0ff3-4f39-a225-4bbef7a61bee"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 07:36:43 crc kubenswrapper[4760]: I0930 07:36:43.014772 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ebfeeebf-0ff3-4f39-a225-4bbef7a61bee-kube-api-access-tk2pb" (OuterVolumeSpecName: "kube-api-access-tk2pb") pod "ebfeeebf-0ff3-4f39-a225-4bbef7a61bee" (UID: "ebfeeebf-0ff3-4f39-a225-4bbef7a61bee"). InnerVolumeSpecName "kube-api-access-tk2pb". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 07:36:43 crc kubenswrapper[4760]: I0930 07:36:43.108379 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ebfeeebf-0ff3-4f39-a225-4bbef7a61bee-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "ebfeeebf-0ff3-4f39-a225-4bbef7a61bee" (UID: "ebfeeebf-0ff3-4f39-a225-4bbef7a61bee"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 07:36:43 crc kubenswrapper[4760]: I0930 07:36:43.111013 4760 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tk2pb\" (UniqueName: \"kubernetes.io/projected/ebfeeebf-0ff3-4f39-a225-4bbef7a61bee-kube-api-access-tk2pb\") on node \"crc\" DevicePath \"\"" Sep 30 07:36:43 crc kubenswrapper[4760]: I0930 07:36:43.111066 4760 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ebfeeebf-0ff3-4f39-a225-4bbef7a61bee-catalog-content\") on node \"crc\" DevicePath \"\"" Sep 30 07:36:43 crc kubenswrapper[4760]: I0930 07:36:43.111085 4760 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ebfeeebf-0ff3-4f39-a225-4bbef7a61bee-utilities\") on node \"crc\" DevicePath \"\"" Sep 30 07:36:43 crc kubenswrapper[4760]: I0930 07:36:43.733194 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-operators-9mtn4" event={"ID":"ebfeeebf-0ff3-4f39-a225-4bbef7a61bee","Type":"ContainerDied","Data":"6fc3a4c83771cfa1179dba9390d2fedf329c77b6bd5e0bcea4fce9b7600e8da8"} Sep 30 07:36:43 crc kubenswrapper[4760]: I0930 07:36:43.733320 4760 scope.go:117] "RemoveContainer" containerID="0be3a731f5eea49e31df742364cd070d4f7a0797f1c7543c4e575cc2bd3c4609" Sep 30 07:36:43 crc kubenswrapper[4760]: I0930 07:36:43.733494 4760 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-9mtn4" Sep 30 07:36:43 crc kubenswrapper[4760]: I0930 07:36:43.769164 4760 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-9mtn4"] Sep 30 07:36:43 crc kubenswrapper[4760]: I0930 07:36:43.774886 4760 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-9mtn4"] Sep 30 07:36:43 crc kubenswrapper[4760]: I0930 07:36:43.777531 4760 scope.go:117] "RemoveContainer" containerID="c2a1ed5135956e36ec1cfa4cf372ec13c1db47a9f6a9758287020e0b798448b3" Sep 30 07:36:43 crc kubenswrapper[4760]: I0930 07:36:43.798383 4760 scope.go:117] "RemoveContainer" containerID="ffdb865142dfd2c237049f2e2e7a02169957aacc9694c7747e58207eab063e7b" Sep 30 07:36:43 crc kubenswrapper[4760]: I0930 07:36:43.886315 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-tjdgw" Sep 30 07:36:44 crc kubenswrapper[4760]: I0930 07:36:44.108037 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-52kxf" Sep 30 07:36:44 crc kubenswrapper[4760]: I0930 07:36:44.316234 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-s4pkh" Sep 30 07:36:44 crc kubenswrapper[4760]: I0930 07:36:44.466801 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-marketplace/certified-operators-8f8g7" Sep 30 07:36:45 crc kubenswrapper[4760]: I0930 07:36:45.074241 4760 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ebfeeebf-0ff3-4f39-a225-4bbef7a61bee" path="/var/lib/kubelet/pods/ebfeeebf-0ff3-4f39-a225-4bbef7a61bee/volumes" Sep 30 07:36:46 crc kubenswrapper[4760]: I0930 07:36:46.772373 4760 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-s4pkh"] Sep 30 07:36:46 crc kubenswrapper[4760]: I0930 07:36:46.774424 4760 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-s4pkh" podUID="58cb5713-4587-4909-9aaa-5eae3a314c9e" containerName="registry-server" containerID="cri-o://3a2bd664924e45a33a85207cc636ccf59ecdb0d7f0016ce7db5162d88b85affb" gracePeriod=2 Sep 30 07:36:47 crc kubenswrapper[4760]: I0930 07:36:47.257260 4760 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-s4pkh" Sep 30 07:36:47 crc kubenswrapper[4760]: I0930 07:36:47.376895 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/58cb5713-4587-4909-9aaa-5eae3a314c9e-catalog-content\") pod \"58cb5713-4587-4909-9aaa-5eae3a314c9e\" (UID: \"58cb5713-4587-4909-9aaa-5eae3a314c9e\") " Sep 30 07:36:47 crc kubenswrapper[4760]: I0930 07:36:47.377021 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bpw29\" (UniqueName: \"kubernetes.io/projected/58cb5713-4587-4909-9aaa-5eae3a314c9e-kube-api-access-bpw29\") pod \"58cb5713-4587-4909-9aaa-5eae3a314c9e\" (UID: \"58cb5713-4587-4909-9aaa-5eae3a314c9e\") " Sep 30 07:36:47 crc kubenswrapper[4760]: I0930 07:36:47.377061 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/58cb5713-4587-4909-9aaa-5eae3a314c9e-utilities\") pod \"58cb5713-4587-4909-9aaa-5eae3a314c9e\" (UID: \"58cb5713-4587-4909-9aaa-5eae3a314c9e\") " Sep 30 07:36:47 crc kubenswrapper[4760]: I0930 07:36:47.378867 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/58cb5713-4587-4909-9aaa-5eae3a314c9e-utilities" (OuterVolumeSpecName: "utilities") pod "58cb5713-4587-4909-9aaa-5eae3a314c9e" (UID: "58cb5713-4587-4909-9aaa-5eae3a314c9e"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 07:36:47 crc kubenswrapper[4760]: I0930 07:36:47.401794 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/58cb5713-4587-4909-9aaa-5eae3a314c9e-kube-api-access-bpw29" (OuterVolumeSpecName: "kube-api-access-bpw29") pod "58cb5713-4587-4909-9aaa-5eae3a314c9e" (UID: "58cb5713-4587-4909-9aaa-5eae3a314c9e"). InnerVolumeSpecName "kube-api-access-bpw29". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 07:36:47 crc kubenswrapper[4760]: I0930 07:36:47.432762 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/58cb5713-4587-4909-9aaa-5eae3a314c9e-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "58cb5713-4587-4909-9aaa-5eae3a314c9e" (UID: "58cb5713-4587-4909-9aaa-5eae3a314c9e"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 07:36:47 crc kubenswrapper[4760]: I0930 07:36:47.479756 4760 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/58cb5713-4587-4909-9aaa-5eae3a314c9e-catalog-content\") on node \"crc\" DevicePath \"\"" Sep 30 07:36:47 crc kubenswrapper[4760]: I0930 07:36:47.479791 4760 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bpw29\" (UniqueName: \"kubernetes.io/projected/58cb5713-4587-4909-9aaa-5eae3a314c9e-kube-api-access-bpw29\") on node \"crc\" DevicePath \"\"" Sep 30 07:36:47 crc kubenswrapper[4760]: I0930 07:36:47.479804 4760 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/58cb5713-4587-4909-9aaa-5eae3a314c9e-utilities\") on node \"crc\" DevicePath \"\"" Sep 30 07:36:47 crc kubenswrapper[4760]: I0930 07:36:47.760682 4760 generic.go:334] "Generic (PLEG): container finished" podID="58cb5713-4587-4909-9aaa-5eae3a314c9e" containerID="3a2bd664924e45a33a85207cc636ccf59ecdb0d7f0016ce7db5162d88b85affb" exitCode=0 Sep 30 07:36:47 crc kubenswrapper[4760]: I0930 07:36:47.760724 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-s4pkh" event={"ID":"58cb5713-4587-4909-9aaa-5eae3a314c9e","Type":"ContainerDied","Data":"3a2bd664924e45a33a85207cc636ccf59ecdb0d7f0016ce7db5162d88b85affb"} Sep 30 07:36:47 crc kubenswrapper[4760]: I0930 07:36:47.760750 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-s4pkh" event={"ID":"58cb5713-4587-4909-9aaa-5eae3a314c9e","Type":"ContainerDied","Data":"c056d9e7473f3b40a0c5d9293f8348d9bdbb7664756d2392da5a7ed137462847"} Sep 30 07:36:47 crc kubenswrapper[4760]: I0930 07:36:47.760766 4760 scope.go:117] "RemoveContainer" containerID="3a2bd664924e45a33a85207cc636ccf59ecdb0d7f0016ce7db5162d88b85affb" Sep 30 07:36:47 crc kubenswrapper[4760]: I0930 
07:36:47.760892 4760 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-s4pkh" Sep 30 07:36:47 crc kubenswrapper[4760]: I0930 07:36:47.769261 4760 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-8f8g7"] Sep 30 07:36:47 crc kubenswrapper[4760]: I0930 07:36:47.769537 4760 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-8f8g7" podUID="93384c2b-6a0e-41e8-a873-501fb43090a5" containerName="registry-server" containerID="cri-o://6cd8db9f360884e8ba75357abab181841f8828804da237f8dbf42a32432e1254" gracePeriod=2 Sep 30 07:36:47 crc kubenswrapper[4760]: I0930 07:36:47.777459 4760 scope.go:117] "RemoveContainer" containerID="95d45ba661a925bc72a4652aa7a0090394b0f4463f5f57da234ae94aeccac51f" Sep 30 07:36:47 crc kubenswrapper[4760]: I0930 07:36:47.791979 4760 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-s4pkh"] Sep 30 07:36:47 crc kubenswrapper[4760]: I0930 07:36:47.795862 4760 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-s4pkh"] Sep 30 07:36:47 crc kubenswrapper[4760]: I0930 07:36:47.814977 4760 scope.go:117] "RemoveContainer" containerID="edc0c4ac0dd66b6ae4c241d280e5685d9b96ebcf01d4765777e9f1a3a2dda9a4" Sep 30 07:36:47 crc kubenswrapper[4760]: I0930 07:36:47.832754 4760 scope.go:117] "RemoveContainer" containerID="3a2bd664924e45a33a85207cc636ccf59ecdb0d7f0016ce7db5162d88b85affb" Sep 30 07:36:47 crc kubenswrapper[4760]: E0930 07:36:47.833059 4760 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3a2bd664924e45a33a85207cc636ccf59ecdb0d7f0016ce7db5162d88b85affb\": container with ID starting with 3a2bd664924e45a33a85207cc636ccf59ecdb0d7f0016ce7db5162d88b85affb not found: ID does not exist" 
containerID="3a2bd664924e45a33a85207cc636ccf59ecdb0d7f0016ce7db5162d88b85affb" Sep 30 07:36:47 crc kubenswrapper[4760]: I0930 07:36:47.833094 4760 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3a2bd664924e45a33a85207cc636ccf59ecdb0d7f0016ce7db5162d88b85affb"} err="failed to get container status \"3a2bd664924e45a33a85207cc636ccf59ecdb0d7f0016ce7db5162d88b85affb\": rpc error: code = NotFound desc = could not find container \"3a2bd664924e45a33a85207cc636ccf59ecdb0d7f0016ce7db5162d88b85affb\": container with ID starting with 3a2bd664924e45a33a85207cc636ccf59ecdb0d7f0016ce7db5162d88b85affb not found: ID does not exist" Sep 30 07:36:47 crc kubenswrapper[4760]: I0930 07:36:47.833115 4760 scope.go:117] "RemoveContainer" containerID="95d45ba661a925bc72a4652aa7a0090394b0f4463f5f57da234ae94aeccac51f" Sep 30 07:36:47 crc kubenswrapper[4760]: E0930 07:36:47.834092 4760 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"95d45ba661a925bc72a4652aa7a0090394b0f4463f5f57da234ae94aeccac51f\": container with ID starting with 95d45ba661a925bc72a4652aa7a0090394b0f4463f5f57da234ae94aeccac51f not found: ID does not exist" containerID="95d45ba661a925bc72a4652aa7a0090394b0f4463f5f57da234ae94aeccac51f" Sep 30 07:36:47 crc kubenswrapper[4760]: I0930 07:36:47.834114 4760 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"95d45ba661a925bc72a4652aa7a0090394b0f4463f5f57da234ae94aeccac51f"} err="failed to get container status \"95d45ba661a925bc72a4652aa7a0090394b0f4463f5f57da234ae94aeccac51f\": rpc error: code = NotFound desc = could not find container \"95d45ba661a925bc72a4652aa7a0090394b0f4463f5f57da234ae94aeccac51f\": container with ID starting with 95d45ba661a925bc72a4652aa7a0090394b0f4463f5f57da234ae94aeccac51f not found: ID does not exist" Sep 30 07:36:47 crc kubenswrapper[4760]: I0930 07:36:47.834128 4760 scope.go:117] 
"RemoveContainer" containerID="edc0c4ac0dd66b6ae4c241d280e5685d9b96ebcf01d4765777e9f1a3a2dda9a4" Sep 30 07:36:47 crc kubenswrapper[4760]: E0930 07:36:47.834435 4760 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"edc0c4ac0dd66b6ae4c241d280e5685d9b96ebcf01d4765777e9f1a3a2dda9a4\": container with ID starting with edc0c4ac0dd66b6ae4c241d280e5685d9b96ebcf01d4765777e9f1a3a2dda9a4 not found: ID does not exist" containerID="edc0c4ac0dd66b6ae4c241d280e5685d9b96ebcf01d4765777e9f1a3a2dda9a4" Sep 30 07:36:47 crc kubenswrapper[4760]: I0930 07:36:47.834453 4760 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"edc0c4ac0dd66b6ae4c241d280e5685d9b96ebcf01d4765777e9f1a3a2dda9a4"} err="failed to get container status \"edc0c4ac0dd66b6ae4c241d280e5685d9b96ebcf01d4765777e9f1a3a2dda9a4\": rpc error: code = NotFound desc = could not find container \"edc0c4ac0dd66b6ae4c241d280e5685d9b96ebcf01d4765777e9f1a3a2dda9a4\": container with ID starting with edc0c4ac0dd66b6ae4c241d280e5685d9b96ebcf01d4765777e9f1a3a2dda9a4 not found: ID does not exist" Sep 30 07:36:48 crc kubenswrapper[4760]: I0930 07:36:48.174169 4760 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-8f8g7" Sep 30 07:36:48 crc kubenswrapper[4760]: I0930 07:36:48.288825 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/93384c2b-6a0e-41e8-a873-501fb43090a5-catalog-content\") pod \"93384c2b-6a0e-41e8-a873-501fb43090a5\" (UID: \"93384c2b-6a0e-41e8-a873-501fb43090a5\") " Sep 30 07:36:48 crc kubenswrapper[4760]: I0930 07:36:48.288870 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/93384c2b-6a0e-41e8-a873-501fb43090a5-utilities\") pod \"93384c2b-6a0e-41e8-a873-501fb43090a5\" (UID: \"93384c2b-6a0e-41e8-a873-501fb43090a5\") " Sep 30 07:36:48 crc kubenswrapper[4760]: I0930 07:36:48.288938 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nqfz7\" (UniqueName: \"kubernetes.io/projected/93384c2b-6a0e-41e8-a873-501fb43090a5-kube-api-access-nqfz7\") pod \"93384c2b-6a0e-41e8-a873-501fb43090a5\" (UID: \"93384c2b-6a0e-41e8-a873-501fb43090a5\") " Sep 30 07:36:48 crc kubenswrapper[4760]: I0930 07:36:48.289820 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/93384c2b-6a0e-41e8-a873-501fb43090a5-utilities" (OuterVolumeSpecName: "utilities") pod "93384c2b-6a0e-41e8-a873-501fb43090a5" (UID: "93384c2b-6a0e-41e8-a873-501fb43090a5"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 07:36:48 crc kubenswrapper[4760]: I0930 07:36:48.294515 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/93384c2b-6a0e-41e8-a873-501fb43090a5-kube-api-access-nqfz7" (OuterVolumeSpecName: "kube-api-access-nqfz7") pod "93384c2b-6a0e-41e8-a873-501fb43090a5" (UID: "93384c2b-6a0e-41e8-a873-501fb43090a5"). InnerVolumeSpecName "kube-api-access-nqfz7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 07:36:48 crc kubenswrapper[4760]: I0930 07:36:48.340747 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/93384c2b-6a0e-41e8-a873-501fb43090a5-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "93384c2b-6a0e-41e8-a873-501fb43090a5" (UID: "93384c2b-6a0e-41e8-a873-501fb43090a5"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 07:36:48 crc kubenswrapper[4760]: I0930 07:36:48.390853 4760 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nqfz7\" (UniqueName: \"kubernetes.io/projected/93384c2b-6a0e-41e8-a873-501fb43090a5-kube-api-access-nqfz7\") on node \"crc\" DevicePath \"\"" Sep 30 07:36:48 crc kubenswrapper[4760]: I0930 07:36:48.390885 4760 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/93384c2b-6a0e-41e8-a873-501fb43090a5-catalog-content\") on node \"crc\" DevicePath \"\"" Sep 30 07:36:48 crc kubenswrapper[4760]: I0930 07:36:48.390895 4760 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/93384c2b-6a0e-41e8-a873-501fb43090a5-utilities\") on node \"crc\" DevicePath \"\"" Sep 30 07:36:48 crc kubenswrapper[4760]: I0930 07:36:48.769944 4760 generic.go:334] "Generic (PLEG): container finished" podID="93384c2b-6a0e-41e8-a873-501fb43090a5" containerID="6cd8db9f360884e8ba75357abab181841f8828804da237f8dbf42a32432e1254" exitCode=0 Sep 30 07:36:48 crc kubenswrapper[4760]: I0930 07:36:48.769985 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8f8g7" event={"ID":"93384c2b-6a0e-41e8-a873-501fb43090a5","Type":"ContainerDied","Data":"6cd8db9f360884e8ba75357abab181841f8828804da237f8dbf42a32432e1254"} Sep 30 07:36:48 crc kubenswrapper[4760]: I0930 07:36:48.770255 4760 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-marketplace/certified-operators-8f8g7" event={"ID":"93384c2b-6a0e-41e8-a873-501fb43090a5","Type":"ContainerDied","Data":"05c8f101fa3f67c9d42e072cce16ae9ef5c75e752ca1eb4b026ae3c1d0296d5b"} Sep 30 07:36:48 crc kubenswrapper[4760]: I0930 07:36:48.770273 4760 scope.go:117] "RemoveContainer" containerID="6cd8db9f360884e8ba75357abab181841f8828804da237f8dbf42a32432e1254" Sep 30 07:36:48 crc kubenswrapper[4760]: I0930 07:36:48.770012 4760 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-8f8g7" Sep 30 07:36:48 crc kubenswrapper[4760]: I0930 07:36:48.794740 4760 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-8f8g7"] Sep 30 07:36:48 crc kubenswrapper[4760]: I0930 07:36:48.796005 4760 scope.go:117] "RemoveContainer" containerID="f5d7ed56864a9b67d94649dc3780105e1b394f238e9e5cd472c772ab93537cbb" Sep 30 07:36:48 crc kubenswrapper[4760]: I0930 07:36:48.804559 4760 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-8f8g7"] Sep 30 07:36:48 crc kubenswrapper[4760]: I0930 07:36:48.827319 4760 scope.go:117] "RemoveContainer" containerID="665b5e019ceb22d177916e0ad39b164434ac71f0f641b24a716f4e4c0015aa76" Sep 30 07:36:48 crc kubenswrapper[4760]: I0930 07:36:48.844903 4760 scope.go:117] "RemoveContainer" containerID="6cd8db9f360884e8ba75357abab181841f8828804da237f8dbf42a32432e1254" Sep 30 07:36:48 crc kubenswrapper[4760]: E0930 07:36:48.845508 4760 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6cd8db9f360884e8ba75357abab181841f8828804da237f8dbf42a32432e1254\": container with ID starting with 6cd8db9f360884e8ba75357abab181841f8828804da237f8dbf42a32432e1254 not found: ID does not exist" containerID="6cd8db9f360884e8ba75357abab181841f8828804da237f8dbf42a32432e1254" Sep 30 07:36:48 crc kubenswrapper[4760]: I0930 
07:36:48.845542 4760 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6cd8db9f360884e8ba75357abab181841f8828804da237f8dbf42a32432e1254"} err="failed to get container status \"6cd8db9f360884e8ba75357abab181841f8828804da237f8dbf42a32432e1254\": rpc error: code = NotFound desc = could not find container \"6cd8db9f360884e8ba75357abab181841f8828804da237f8dbf42a32432e1254\": container with ID starting with 6cd8db9f360884e8ba75357abab181841f8828804da237f8dbf42a32432e1254 not found: ID does not exist" Sep 30 07:36:48 crc kubenswrapper[4760]: I0930 07:36:48.845566 4760 scope.go:117] "RemoveContainer" containerID="f5d7ed56864a9b67d94649dc3780105e1b394f238e9e5cd472c772ab93537cbb" Sep 30 07:36:48 crc kubenswrapper[4760]: E0930 07:36:48.845999 4760 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f5d7ed56864a9b67d94649dc3780105e1b394f238e9e5cd472c772ab93537cbb\": container with ID starting with f5d7ed56864a9b67d94649dc3780105e1b394f238e9e5cd472c772ab93537cbb not found: ID does not exist" containerID="f5d7ed56864a9b67d94649dc3780105e1b394f238e9e5cd472c772ab93537cbb" Sep 30 07:36:48 crc kubenswrapper[4760]: I0930 07:36:48.846019 4760 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f5d7ed56864a9b67d94649dc3780105e1b394f238e9e5cd472c772ab93537cbb"} err="failed to get container status \"f5d7ed56864a9b67d94649dc3780105e1b394f238e9e5cd472c772ab93537cbb\": rpc error: code = NotFound desc = could not find container \"f5d7ed56864a9b67d94649dc3780105e1b394f238e9e5cd472c772ab93537cbb\": container with ID starting with f5d7ed56864a9b67d94649dc3780105e1b394f238e9e5cd472c772ab93537cbb not found: ID does not exist" Sep 30 07:36:48 crc kubenswrapper[4760]: I0930 07:36:48.846032 4760 scope.go:117] "RemoveContainer" containerID="665b5e019ceb22d177916e0ad39b164434ac71f0f641b24a716f4e4c0015aa76" Sep 30 07:36:48 crc 
kubenswrapper[4760]: E0930 07:36:48.846284 4760 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"665b5e019ceb22d177916e0ad39b164434ac71f0f641b24a716f4e4c0015aa76\": container with ID starting with 665b5e019ceb22d177916e0ad39b164434ac71f0f641b24a716f4e4c0015aa76 not found: ID does not exist" containerID="665b5e019ceb22d177916e0ad39b164434ac71f0f641b24a716f4e4c0015aa76" Sep 30 07:36:48 crc kubenswrapper[4760]: I0930 07:36:48.846414 4760 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"665b5e019ceb22d177916e0ad39b164434ac71f0f641b24a716f4e4c0015aa76"} err="failed to get container status \"665b5e019ceb22d177916e0ad39b164434ac71f0f641b24a716f4e4c0015aa76\": rpc error: code = NotFound desc = could not find container \"665b5e019ceb22d177916e0ad39b164434ac71f0f641b24a716f4e4c0015aa76\": container with ID starting with 665b5e019ceb22d177916e0ad39b164434ac71f0f641b24a716f4e4c0015aa76 not found: ID does not exist" Sep 30 07:36:49 crc kubenswrapper[4760]: I0930 07:36:49.076734 4760 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="58cb5713-4587-4909-9aaa-5eae3a314c9e" path="/var/lib/kubelet/pods/58cb5713-4587-4909-9aaa-5eae3a314c9e/volumes" Sep 30 07:36:49 crc kubenswrapper[4760]: I0930 07:36:49.077701 4760 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="93384c2b-6a0e-41e8-a873-501fb43090a5" path="/var/lib/kubelet/pods/93384c2b-6a0e-41e8-a873-501fb43090a5/volumes" Sep 30 07:36:49 crc kubenswrapper[4760]: I0930 07:36:49.113334 4760 patch_prober.go:28] interesting pod/machine-config-daemon-f2lrk container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 07:36:49 crc kubenswrapper[4760]: I0930 07:36:49.113445 4760 prober.go:107] "Probe failed" 
probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-f2lrk" podUID="7a9c8270-6964-4886-87d0-227b05b76da4" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 30 07:36:56 crc kubenswrapper[4760]: I0930 07:36:56.129365 4760 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-77ttc"] Sep 30 07:37:19 crc kubenswrapper[4760]: I0930 07:37:19.113257 4760 patch_prober.go:28] interesting pod/machine-config-daemon-f2lrk container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 07:37:19 crc kubenswrapper[4760]: I0930 07:37:19.114019 4760 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-f2lrk" podUID="7a9c8270-6964-4886-87d0-227b05b76da4" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 30 07:37:19 crc kubenswrapper[4760]: I0930 07:37:19.114106 4760 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-f2lrk" Sep 30 07:37:19 crc kubenswrapper[4760]: I0930 07:37:19.115344 4760 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"90d669a96085f6fb9b5c9507ff1d5c960bdeafa1d01d7c115b89788e14e155d7"} pod="openshift-machine-config-operator/machine-config-daemon-f2lrk" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Sep 30 07:37:19 crc kubenswrapper[4760]: I0930 07:37:19.115455 4760 kuberuntime_container.go:808] "Killing container with a grace 
period" pod="openshift-machine-config-operator/machine-config-daemon-f2lrk" podUID="7a9c8270-6964-4886-87d0-227b05b76da4" containerName="machine-config-daemon" containerID="cri-o://90d669a96085f6fb9b5c9507ff1d5c960bdeafa1d01d7c115b89788e14e155d7" gracePeriod=600 Sep 30 07:37:19 crc kubenswrapper[4760]: I0930 07:37:19.955636 4760 generic.go:334] "Generic (PLEG): container finished" podID="7a9c8270-6964-4886-87d0-227b05b76da4" containerID="90d669a96085f6fb9b5c9507ff1d5c960bdeafa1d01d7c115b89788e14e155d7" exitCode=0 Sep 30 07:37:19 crc kubenswrapper[4760]: I0930 07:37:19.955761 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-f2lrk" event={"ID":"7a9c8270-6964-4886-87d0-227b05b76da4","Type":"ContainerDied","Data":"90d669a96085f6fb9b5c9507ff1d5c960bdeafa1d01d7c115b89788e14e155d7"} Sep 30 07:37:19 crc kubenswrapper[4760]: I0930 07:37:19.956464 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-f2lrk" event={"ID":"7a9c8270-6964-4886-87d0-227b05b76da4","Type":"ContainerStarted","Data":"316186f24b90a3f80ce14b2c1f47627d59bb457c83e7ed3a00cd62894b2b866d"} Sep 30 07:37:21 crc kubenswrapper[4760]: I0930 07:37:21.152925 4760 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-authentication/oauth-openshift-558db77b4-77ttc" podUID="f2f25243-2a0b-498f-8de6-8b0a21c72c49" containerName="oauth-openshift" containerID="cri-o://323a28996706e5ebe7872e9428bd083a04ad7970621cd63de64d2d6cf9c3fad6" gracePeriod=15 Sep 30 07:37:21 crc kubenswrapper[4760]: I0930 07:37:21.657252 4760 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-77ttc" Sep 30 07:37:21 crc kubenswrapper[4760]: I0930 07:37:21.704109 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/f2f25243-2a0b-498f-8de6-8b0a21c72c49-v4-0-config-user-idp-0-file-data\") pod \"f2f25243-2a0b-498f-8de6-8b0a21c72c49\" (UID: \"f2f25243-2a0b-498f-8de6-8b0a21c72c49\") " Sep 30 07:37:21 crc kubenswrapper[4760]: I0930 07:37:21.704204 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/f2f25243-2a0b-498f-8de6-8b0a21c72c49-v4-0-config-system-service-ca\") pod \"f2f25243-2a0b-498f-8de6-8b0a21c72c49\" (UID: \"f2f25243-2a0b-498f-8de6-8b0a21c72c49\") " Sep 30 07:37:21 crc kubenswrapper[4760]: I0930 07:37:21.704258 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/f2f25243-2a0b-498f-8de6-8b0a21c72c49-v4-0-config-system-cliconfig\") pod \"f2f25243-2a0b-498f-8de6-8b0a21c72c49\" (UID: \"f2f25243-2a0b-498f-8de6-8b0a21c72c49\") " Sep 30 07:37:21 crc kubenswrapper[4760]: I0930 07:37:21.704327 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/f2f25243-2a0b-498f-8de6-8b0a21c72c49-audit-policies\") pod \"f2f25243-2a0b-498f-8de6-8b0a21c72c49\" (UID: \"f2f25243-2a0b-498f-8de6-8b0a21c72c49\") " Sep 30 07:37:21 crc kubenswrapper[4760]: I0930 07:37:21.704372 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/f2f25243-2a0b-498f-8de6-8b0a21c72c49-v4-0-config-system-router-certs\") pod \"f2f25243-2a0b-498f-8de6-8b0a21c72c49\" (UID: \"f2f25243-2a0b-498f-8de6-8b0a21c72c49\") " Sep 30 07:37:21 crc 
kubenswrapper[4760]: I0930 07:37:21.704405 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/f2f25243-2a0b-498f-8de6-8b0a21c72c49-v4-0-config-system-session\") pod \"f2f25243-2a0b-498f-8de6-8b0a21c72c49\" (UID: \"f2f25243-2a0b-498f-8de6-8b0a21c72c49\") " Sep 30 07:37:21 crc kubenswrapper[4760]: I0930 07:37:21.704474 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/f2f25243-2a0b-498f-8de6-8b0a21c72c49-v4-0-config-system-serving-cert\") pod \"f2f25243-2a0b-498f-8de6-8b0a21c72c49\" (UID: \"f2f25243-2a0b-498f-8de6-8b0a21c72c49\") " Sep 30 07:37:21 crc kubenswrapper[4760]: I0930 07:37:21.704530 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f2f25243-2a0b-498f-8de6-8b0a21c72c49-v4-0-config-system-trusted-ca-bundle\") pod \"f2f25243-2a0b-498f-8de6-8b0a21c72c49\" (UID: \"f2f25243-2a0b-498f-8de6-8b0a21c72c49\") " Sep 30 07:37:21 crc kubenswrapper[4760]: I0930 07:37:21.704588 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f2f25243-2a0b-498f-8de6-8b0a21c72c49-audit-dir\") pod \"f2f25243-2a0b-498f-8de6-8b0a21c72c49\" (UID: \"f2f25243-2a0b-498f-8de6-8b0a21c72c49\") " Sep 30 07:37:21 crc kubenswrapper[4760]: I0930 07:37:21.704667 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/f2f25243-2a0b-498f-8de6-8b0a21c72c49-v4-0-config-user-template-login\") pod \"f2f25243-2a0b-498f-8de6-8b0a21c72c49\" (UID: \"f2f25243-2a0b-498f-8de6-8b0a21c72c49\") " Sep 30 07:37:21 crc kubenswrapper[4760]: I0930 07:37:21.704707 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/f2f25243-2a0b-498f-8de6-8b0a21c72c49-v4-0-config-user-template-error\") pod \"f2f25243-2a0b-498f-8de6-8b0a21c72c49\" (UID: \"f2f25243-2a0b-498f-8de6-8b0a21c72c49\") " Sep 30 07:37:21 crc kubenswrapper[4760]: I0930 07:37:21.704754 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/f2f25243-2a0b-498f-8de6-8b0a21c72c49-v4-0-config-user-template-provider-selection\") pod \"f2f25243-2a0b-498f-8de6-8b0a21c72c49\" (UID: \"f2f25243-2a0b-498f-8de6-8b0a21c72c49\") " Sep 30 07:37:21 crc kubenswrapper[4760]: I0930 07:37:21.704798 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xtgwl\" (UniqueName: \"kubernetes.io/projected/f2f25243-2a0b-498f-8de6-8b0a21c72c49-kube-api-access-xtgwl\") pod \"f2f25243-2a0b-498f-8de6-8b0a21c72c49\" (UID: \"f2f25243-2a0b-498f-8de6-8b0a21c72c49\") " Sep 30 07:37:21 crc kubenswrapper[4760]: I0930 07:37:21.704857 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/f2f25243-2a0b-498f-8de6-8b0a21c72c49-v4-0-config-system-ocp-branding-template\") pod \"f2f25243-2a0b-498f-8de6-8b0a21c72c49\" (UID: \"f2f25243-2a0b-498f-8de6-8b0a21c72c49\") " Sep 30 07:37:21 crc kubenswrapper[4760]: I0930 07:37:21.706780 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f2f25243-2a0b-498f-8de6-8b0a21c72c49-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "f2f25243-2a0b-498f-8de6-8b0a21c72c49" (UID: "f2f25243-2a0b-498f-8de6-8b0a21c72c49"). InnerVolumeSpecName "v4-0-config-system-service-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 07:37:21 crc kubenswrapper[4760]: I0930 07:37:21.706829 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f2f25243-2a0b-498f-8de6-8b0a21c72c49-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "f2f25243-2a0b-498f-8de6-8b0a21c72c49" (UID: "f2f25243-2a0b-498f-8de6-8b0a21c72c49"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 07:37:21 crc kubenswrapper[4760]: I0930 07:37:21.706896 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-679fb67f4b-dgqcq"] Sep 30 07:37:21 crc kubenswrapper[4760]: I0930 07:37:21.707030 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f2f25243-2a0b-498f-8de6-8b0a21c72c49-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "f2f25243-2a0b-498f-8de6-8b0a21c72c49" (UID: "f2f25243-2a0b-498f-8de6-8b0a21c72c49"). InnerVolumeSpecName "audit-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Sep 30 07:37:21 crc kubenswrapper[4760]: E0930 07:37:21.707349 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="58cb5713-4587-4909-9aaa-5eae3a314c9e" containerName="extract-content" Sep 30 07:37:21 crc kubenswrapper[4760]: I0930 07:37:21.707378 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="58cb5713-4587-4909-9aaa-5eae3a314c9e" containerName="extract-content" Sep 30 07:37:21 crc kubenswrapper[4760]: E0930 07:37:21.707405 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7d9daed2-2539-4be9-995a-0972b45d9b96" containerName="pruner" Sep 30 07:37:21 crc kubenswrapper[4760]: I0930 07:37:21.707422 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="7d9daed2-2539-4be9-995a-0972b45d9b96" containerName="pruner" Sep 30 07:37:21 crc kubenswrapper[4760]: E0930 07:37:21.707444 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="93384c2b-6a0e-41e8-a873-501fb43090a5" containerName="registry-server" Sep 30 07:37:21 crc kubenswrapper[4760]: I0930 07:37:21.707460 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="93384c2b-6a0e-41e8-a873-501fb43090a5" containerName="registry-server" Sep 30 07:37:21 crc kubenswrapper[4760]: E0930 07:37:21.707487 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f2f25243-2a0b-498f-8de6-8b0a21c72c49" containerName="oauth-openshift" Sep 30 07:37:21 crc kubenswrapper[4760]: I0930 07:37:21.707505 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="f2f25243-2a0b-498f-8de6-8b0a21c72c49" containerName="oauth-openshift" Sep 30 07:37:21 crc kubenswrapper[4760]: E0930 07:37:21.707527 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ebfeeebf-0ff3-4f39-a225-4bbef7a61bee" containerName="registry-server" Sep 30 07:37:21 crc kubenswrapper[4760]: I0930 07:37:21.707544 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="ebfeeebf-0ff3-4f39-a225-4bbef7a61bee" 
containerName="registry-server" Sep 30 07:37:21 crc kubenswrapper[4760]: E0930 07:37:21.707573 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8a700923-3c70-481e-9fad-c9de1e186301" containerName="extract-utilities" Sep 30 07:37:21 crc kubenswrapper[4760]: I0930 07:37:21.707589 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="8a700923-3c70-481e-9fad-c9de1e186301" containerName="extract-utilities" Sep 30 07:37:21 crc kubenswrapper[4760]: E0930 07:37:21.707615 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="93384c2b-6a0e-41e8-a873-501fb43090a5" containerName="extract-utilities" Sep 30 07:37:21 crc kubenswrapper[4760]: I0930 07:37:21.707631 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="93384c2b-6a0e-41e8-a873-501fb43090a5" containerName="extract-utilities" Sep 30 07:37:21 crc kubenswrapper[4760]: E0930 07:37:21.707655 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ebfeeebf-0ff3-4f39-a225-4bbef7a61bee" containerName="extract-utilities" Sep 30 07:37:21 crc kubenswrapper[4760]: I0930 07:37:21.707669 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="ebfeeebf-0ff3-4f39-a225-4bbef7a61bee" containerName="extract-utilities" Sep 30 07:37:21 crc kubenswrapper[4760]: E0930 07:37:21.707691 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8a700923-3c70-481e-9fad-c9de1e186301" containerName="registry-server" Sep 30 07:37:21 crc kubenswrapper[4760]: I0930 07:37:21.707707 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="8a700923-3c70-481e-9fad-c9de1e186301" containerName="registry-server" Sep 30 07:37:21 crc kubenswrapper[4760]: E0930 07:37:21.707733 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ebfeeebf-0ff3-4f39-a225-4bbef7a61bee" containerName="extract-content" Sep 30 07:37:21 crc kubenswrapper[4760]: I0930 07:37:21.707749 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="ebfeeebf-0ff3-4f39-a225-4bbef7a61bee" 
containerName="extract-content" Sep 30 07:37:21 crc kubenswrapper[4760]: E0930 07:37:21.707772 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="58cb5713-4587-4909-9aaa-5eae3a314c9e" containerName="extract-utilities" Sep 30 07:37:21 crc kubenswrapper[4760]: I0930 07:37:21.707788 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="58cb5713-4587-4909-9aaa-5eae3a314c9e" containerName="extract-utilities" Sep 30 07:37:21 crc kubenswrapper[4760]: E0930 07:37:21.707813 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="93384c2b-6a0e-41e8-a873-501fb43090a5" containerName="extract-content" Sep 30 07:37:21 crc kubenswrapper[4760]: I0930 07:37:21.707830 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="93384c2b-6a0e-41e8-a873-501fb43090a5" containerName="extract-content" Sep 30 07:37:21 crc kubenswrapper[4760]: I0930 07:37:21.707824 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f2f25243-2a0b-498f-8de6-8b0a21c72c49-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "f2f25243-2a0b-498f-8de6-8b0a21c72c49" (UID: "f2f25243-2a0b-498f-8de6-8b0a21c72c49"). InnerVolumeSpecName "audit-policies". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 07:37:21 crc kubenswrapper[4760]: E0930 07:37:21.707848 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="58cb5713-4587-4909-9aaa-5eae3a314c9e" containerName="registry-server" Sep 30 07:37:21 crc kubenswrapper[4760]: I0930 07:37:21.707889 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="58cb5713-4587-4909-9aaa-5eae3a314c9e" containerName="registry-server" Sep 30 07:37:21 crc kubenswrapper[4760]: E0930 07:37:21.707915 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8a700923-3c70-481e-9fad-c9de1e186301" containerName="extract-content" Sep 30 07:37:21 crc kubenswrapper[4760]: I0930 07:37:21.707931 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="8a700923-3c70-481e-9fad-c9de1e186301" containerName="extract-content" Sep 30 07:37:21 crc kubenswrapper[4760]: I0930 07:37:21.708159 4760 memory_manager.go:354] "RemoveStaleState removing state" podUID="7d9daed2-2539-4be9-995a-0972b45d9b96" containerName="pruner" Sep 30 07:37:21 crc kubenswrapper[4760]: I0930 07:37:21.708191 4760 memory_manager.go:354] "RemoveStaleState removing state" podUID="58cb5713-4587-4909-9aaa-5eae3a314c9e" containerName="registry-server" Sep 30 07:37:21 crc kubenswrapper[4760]: I0930 07:37:21.708210 4760 memory_manager.go:354] "RemoveStaleState removing state" podUID="ebfeeebf-0ff3-4f39-a225-4bbef7a61bee" containerName="registry-server" Sep 30 07:37:21 crc kubenswrapper[4760]: I0930 07:37:21.708242 4760 memory_manager.go:354] "RemoveStaleState removing state" podUID="8a700923-3c70-481e-9fad-c9de1e186301" containerName="registry-server" Sep 30 07:37:21 crc kubenswrapper[4760]: I0930 07:37:21.708266 4760 memory_manager.go:354] "RemoveStaleState removing state" podUID="f2f25243-2a0b-498f-8de6-8b0a21c72c49" containerName="oauth-openshift" Sep 30 07:37:21 crc kubenswrapper[4760]: I0930 07:37:21.708285 4760 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="93384c2b-6a0e-41e8-a873-501fb43090a5" containerName="registry-server" Sep 30 07:37:21 crc kubenswrapper[4760]: I0930 07:37:21.708491 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f2f25243-2a0b-498f-8de6-8b0a21c72c49-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "f2f25243-2a0b-498f-8de6-8b0a21c72c49" (UID: "f2f25243-2a0b-498f-8de6-8b0a21c72c49"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 07:37:21 crc kubenswrapper[4760]: I0930 07:37:21.709221 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-679fb67f4b-dgqcq" Sep 30 07:37:21 crc kubenswrapper[4760]: I0930 07:37:21.726269 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f2f25243-2a0b-498f-8de6-8b0a21c72c49-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "f2f25243-2a0b-498f-8de6-8b0a21c72c49" (UID: "f2f25243-2a0b-498f-8de6-8b0a21c72c49"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 07:37:21 crc kubenswrapper[4760]: I0930 07:37:21.726339 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-679fb67f4b-dgqcq"] Sep 30 07:37:21 crc kubenswrapper[4760]: I0930 07:37:21.727129 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f2f25243-2a0b-498f-8de6-8b0a21c72c49-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "f2f25243-2a0b-498f-8de6-8b0a21c72c49" (UID: "f2f25243-2a0b-498f-8de6-8b0a21c72c49"). InnerVolumeSpecName "v4-0-config-system-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 07:37:21 crc kubenswrapper[4760]: I0930 07:37:21.727393 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f2f25243-2a0b-498f-8de6-8b0a21c72c49-kube-api-access-xtgwl" (OuterVolumeSpecName: "kube-api-access-xtgwl") pod "f2f25243-2a0b-498f-8de6-8b0a21c72c49" (UID: "f2f25243-2a0b-498f-8de6-8b0a21c72c49"). InnerVolumeSpecName "kube-api-access-xtgwl". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 07:37:21 crc kubenswrapper[4760]: I0930 07:37:21.727979 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f2f25243-2a0b-498f-8de6-8b0a21c72c49-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "f2f25243-2a0b-498f-8de6-8b0a21c72c49" (UID: "f2f25243-2a0b-498f-8de6-8b0a21c72c49"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 07:37:21 crc kubenswrapper[4760]: I0930 07:37:21.728961 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f2f25243-2a0b-498f-8de6-8b0a21c72c49-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "f2f25243-2a0b-498f-8de6-8b0a21c72c49" (UID: "f2f25243-2a0b-498f-8de6-8b0a21c72c49"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 07:37:21 crc kubenswrapper[4760]: I0930 07:37:21.730826 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f2f25243-2a0b-498f-8de6-8b0a21c72c49-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "f2f25243-2a0b-498f-8de6-8b0a21c72c49" (UID: "f2f25243-2a0b-498f-8de6-8b0a21c72c49"). InnerVolumeSpecName "v4-0-config-system-router-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 07:37:21 crc kubenswrapper[4760]: I0930 07:37:21.739040 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f2f25243-2a0b-498f-8de6-8b0a21c72c49-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "f2f25243-2a0b-498f-8de6-8b0a21c72c49" (UID: "f2f25243-2a0b-498f-8de6-8b0a21c72c49"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 07:37:21 crc kubenswrapper[4760]: I0930 07:37:21.739853 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f2f25243-2a0b-498f-8de6-8b0a21c72c49-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "f2f25243-2a0b-498f-8de6-8b0a21c72c49" (UID: "f2f25243-2a0b-498f-8de6-8b0a21c72c49"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 07:37:21 crc kubenswrapper[4760]: I0930 07:37:21.745421 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f2f25243-2a0b-498f-8de6-8b0a21c72c49-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "f2f25243-2a0b-498f-8de6-8b0a21c72c49" (UID: "f2f25243-2a0b-498f-8de6-8b0a21c72c49"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 07:37:21 crc kubenswrapper[4760]: I0930 07:37:21.805926 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/a89b40e6-b924-4693-8cb0-a2ac8ab79162-audit-dir\") pod \"oauth-openshift-679fb67f4b-dgqcq\" (UID: \"a89b40e6-b924-4693-8cb0-a2ac8ab79162\") " pod="openshift-authentication/oauth-openshift-679fb67f4b-dgqcq" Sep 30 07:37:21 crc kubenswrapper[4760]: I0930 07:37:21.805978 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/a89b40e6-b924-4693-8cb0-a2ac8ab79162-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-679fb67f4b-dgqcq\" (UID: \"a89b40e6-b924-4693-8cb0-a2ac8ab79162\") " pod="openshift-authentication/oauth-openshift-679fb67f4b-dgqcq" Sep 30 07:37:21 crc kubenswrapper[4760]: I0930 07:37:21.806026 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/a89b40e6-b924-4693-8cb0-a2ac8ab79162-v4-0-config-system-session\") pod \"oauth-openshift-679fb67f4b-dgqcq\" (UID: \"a89b40e6-b924-4693-8cb0-a2ac8ab79162\") " pod="openshift-authentication/oauth-openshift-679fb67f4b-dgqcq" Sep 30 07:37:21 crc kubenswrapper[4760]: I0930 07:37:21.806051 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/a89b40e6-b924-4693-8cb0-a2ac8ab79162-v4-0-config-system-cliconfig\") pod \"oauth-openshift-679fb67f4b-dgqcq\" (UID: \"a89b40e6-b924-4693-8cb0-a2ac8ab79162\") " pod="openshift-authentication/oauth-openshift-679fb67f4b-dgqcq" Sep 30 07:37:21 crc kubenswrapper[4760]: I0930 07:37:21.806076 4760 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/a89b40e6-b924-4693-8cb0-a2ac8ab79162-v4-0-config-system-service-ca\") pod \"oauth-openshift-679fb67f4b-dgqcq\" (UID: \"a89b40e6-b924-4693-8cb0-a2ac8ab79162\") " pod="openshift-authentication/oauth-openshift-679fb67f4b-dgqcq" Sep 30 07:37:21 crc kubenswrapper[4760]: I0930 07:37:21.806145 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a89b40e6-b924-4693-8cb0-a2ac8ab79162-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-679fb67f4b-dgqcq\" (UID: \"a89b40e6-b924-4693-8cb0-a2ac8ab79162\") " pod="openshift-authentication/oauth-openshift-679fb67f4b-dgqcq" Sep 30 07:37:21 crc kubenswrapper[4760]: I0930 07:37:21.806223 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/a89b40e6-b924-4693-8cb0-a2ac8ab79162-v4-0-config-system-serving-cert\") pod \"oauth-openshift-679fb67f4b-dgqcq\" (UID: \"a89b40e6-b924-4693-8cb0-a2ac8ab79162\") " pod="openshift-authentication/oauth-openshift-679fb67f4b-dgqcq" Sep 30 07:37:21 crc kubenswrapper[4760]: I0930 07:37:21.806326 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6hqbw\" (UniqueName: \"kubernetes.io/projected/a89b40e6-b924-4693-8cb0-a2ac8ab79162-kube-api-access-6hqbw\") pod \"oauth-openshift-679fb67f4b-dgqcq\" (UID: \"a89b40e6-b924-4693-8cb0-a2ac8ab79162\") " pod="openshift-authentication/oauth-openshift-679fb67f4b-dgqcq" Sep 30 07:37:21 crc kubenswrapper[4760]: I0930 07:37:21.806373 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: 
\"kubernetes.io/secret/a89b40e6-b924-4693-8cb0-a2ac8ab79162-v4-0-config-user-template-login\") pod \"oauth-openshift-679fb67f4b-dgqcq\" (UID: \"a89b40e6-b924-4693-8cb0-a2ac8ab79162\") " pod="openshift-authentication/oauth-openshift-679fb67f4b-dgqcq" Sep 30 07:37:21 crc kubenswrapper[4760]: I0930 07:37:21.806407 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/a89b40e6-b924-4693-8cb0-a2ac8ab79162-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-679fb67f4b-dgqcq\" (UID: \"a89b40e6-b924-4693-8cb0-a2ac8ab79162\") " pod="openshift-authentication/oauth-openshift-679fb67f4b-dgqcq" Sep 30 07:37:21 crc kubenswrapper[4760]: I0930 07:37:21.806435 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/a89b40e6-b924-4693-8cb0-a2ac8ab79162-v4-0-config-user-template-error\") pod \"oauth-openshift-679fb67f4b-dgqcq\" (UID: \"a89b40e6-b924-4693-8cb0-a2ac8ab79162\") " pod="openshift-authentication/oauth-openshift-679fb67f4b-dgqcq" Sep 30 07:37:21 crc kubenswrapper[4760]: I0930 07:37:21.806458 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/a89b40e6-b924-4693-8cb0-a2ac8ab79162-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-679fb67f4b-dgqcq\" (UID: \"a89b40e6-b924-4693-8cb0-a2ac8ab79162\") " pod="openshift-authentication/oauth-openshift-679fb67f4b-dgqcq" Sep 30 07:37:21 crc kubenswrapper[4760]: I0930 07:37:21.806534 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/a89b40e6-b924-4693-8cb0-a2ac8ab79162-v4-0-config-system-router-certs\") pod 
\"oauth-openshift-679fb67f4b-dgqcq\" (UID: \"a89b40e6-b924-4693-8cb0-a2ac8ab79162\") " pod="openshift-authentication/oauth-openshift-679fb67f4b-dgqcq" Sep 30 07:37:21 crc kubenswrapper[4760]: I0930 07:37:21.806563 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/a89b40e6-b924-4693-8cb0-a2ac8ab79162-audit-policies\") pod \"oauth-openshift-679fb67f4b-dgqcq\" (UID: \"a89b40e6-b924-4693-8cb0-a2ac8ab79162\") " pod="openshift-authentication/oauth-openshift-679fb67f4b-dgqcq" Sep 30 07:37:21 crc kubenswrapper[4760]: I0930 07:37:21.806637 4760 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/f2f25243-2a0b-498f-8de6-8b0a21c72c49-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Sep 30 07:37:21 crc kubenswrapper[4760]: I0930 07:37:21.806648 4760 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/f2f25243-2a0b-498f-8de6-8b0a21c72c49-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Sep 30 07:37:21 crc kubenswrapper[4760]: I0930 07:37:21.806658 4760 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/f2f25243-2a0b-498f-8de6-8b0a21c72c49-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Sep 30 07:37:21 crc kubenswrapper[4760]: I0930 07:37:21.806668 4760 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/f2f25243-2a0b-498f-8de6-8b0a21c72c49-audit-policies\") on node \"crc\" DevicePath \"\"" Sep 30 07:37:21 crc kubenswrapper[4760]: I0930 07:37:21.806678 4760 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: 
\"kubernetes.io/secret/f2f25243-2a0b-498f-8de6-8b0a21c72c49-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Sep 30 07:37:21 crc kubenswrapper[4760]: I0930 07:37:21.806740 4760 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/f2f25243-2a0b-498f-8de6-8b0a21c72c49-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Sep 30 07:37:21 crc kubenswrapper[4760]: I0930 07:37:21.806754 4760 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/f2f25243-2a0b-498f-8de6-8b0a21c72c49-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Sep 30 07:37:21 crc kubenswrapper[4760]: I0930 07:37:21.806765 4760 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f2f25243-2a0b-498f-8de6-8b0a21c72c49-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 07:37:21 crc kubenswrapper[4760]: I0930 07:37:21.806779 4760 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f2f25243-2a0b-498f-8de6-8b0a21c72c49-audit-dir\") on node \"crc\" DevicePath \"\"" Sep 30 07:37:21 crc kubenswrapper[4760]: I0930 07:37:21.806790 4760 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/f2f25243-2a0b-498f-8de6-8b0a21c72c49-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Sep 30 07:37:21 crc kubenswrapper[4760]: I0930 07:37:21.806799 4760 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/f2f25243-2a0b-498f-8de6-8b0a21c72c49-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Sep 30 07:37:21 crc kubenswrapper[4760]: I0930 07:37:21.806808 4760 reconciler_common.go:293] "Volume detached for 
volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/f2f25243-2a0b-498f-8de6-8b0a21c72c49-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Sep 30 07:37:21 crc kubenswrapper[4760]: I0930 07:37:21.806816 4760 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xtgwl\" (UniqueName: \"kubernetes.io/projected/f2f25243-2a0b-498f-8de6-8b0a21c72c49-kube-api-access-xtgwl\") on node \"crc\" DevicePath \"\"" Sep 30 07:37:21 crc kubenswrapper[4760]: I0930 07:37:21.806826 4760 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/f2f25243-2a0b-498f-8de6-8b0a21c72c49-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Sep 30 07:37:21 crc kubenswrapper[4760]: I0930 07:37:21.908567 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/a89b40e6-b924-4693-8cb0-a2ac8ab79162-audit-policies\") pod \"oauth-openshift-679fb67f4b-dgqcq\" (UID: \"a89b40e6-b924-4693-8cb0-a2ac8ab79162\") " pod="openshift-authentication/oauth-openshift-679fb67f4b-dgqcq" Sep 30 07:37:21 crc kubenswrapper[4760]: I0930 07:37:21.908961 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/a89b40e6-b924-4693-8cb0-a2ac8ab79162-audit-dir\") pod \"oauth-openshift-679fb67f4b-dgqcq\" (UID: \"a89b40e6-b924-4693-8cb0-a2ac8ab79162\") " pod="openshift-authentication/oauth-openshift-679fb67f4b-dgqcq" Sep 30 07:37:21 crc kubenswrapper[4760]: I0930 07:37:21.909049 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/a89b40e6-b924-4693-8cb0-a2ac8ab79162-audit-dir\") pod \"oauth-openshift-679fb67f4b-dgqcq\" (UID: \"a89b40e6-b924-4693-8cb0-a2ac8ab79162\") " 
pod="openshift-authentication/oauth-openshift-679fb67f4b-dgqcq" Sep 30 07:37:21 crc kubenswrapper[4760]: I0930 07:37:21.909128 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/a89b40e6-b924-4693-8cb0-a2ac8ab79162-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-679fb67f4b-dgqcq\" (UID: \"a89b40e6-b924-4693-8cb0-a2ac8ab79162\") " pod="openshift-authentication/oauth-openshift-679fb67f4b-dgqcq" Sep 30 07:37:21 crc kubenswrapper[4760]: I0930 07:37:21.909240 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/a89b40e6-b924-4693-8cb0-a2ac8ab79162-v4-0-config-system-session\") pod \"oauth-openshift-679fb67f4b-dgqcq\" (UID: \"a89b40e6-b924-4693-8cb0-a2ac8ab79162\") " pod="openshift-authentication/oauth-openshift-679fb67f4b-dgqcq" Sep 30 07:37:21 crc kubenswrapper[4760]: I0930 07:37:21.909278 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/a89b40e6-b924-4693-8cb0-a2ac8ab79162-v4-0-config-system-cliconfig\") pod \"oauth-openshift-679fb67f4b-dgqcq\" (UID: \"a89b40e6-b924-4693-8cb0-a2ac8ab79162\") " pod="openshift-authentication/oauth-openshift-679fb67f4b-dgqcq" Sep 30 07:37:21 crc kubenswrapper[4760]: I0930 07:37:21.909371 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/a89b40e6-b924-4693-8cb0-a2ac8ab79162-v4-0-config-system-service-ca\") pod \"oauth-openshift-679fb67f4b-dgqcq\" (UID: \"a89b40e6-b924-4693-8cb0-a2ac8ab79162\") " pod="openshift-authentication/oauth-openshift-679fb67f4b-dgqcq" Sep 30 07:37:21 crc kubenswrapper[4760]: I0930 07:37:21.909406 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a89b40e6-b924-4693-8cb0-a2ac8ab79162-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-679fb67f4b-dgqcq\" (UID: \"a89b40e6-b924-4693-8cb0-a2ac8ab79162\") " pod="openshift-authentication/oauth-openshift-679fb67f4b-dgqcq" Sep 30 07:37:21 crc kubenswrapper[4760]: I0930 07:37:21.909370 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/a89b40e6-b924-4693-8cb0-a2ac8ab79162-audit-policies\") pod \"oauth-openshift-679fb67f4b-dgqcq\" (UID: \"a89b40e6-b924-4693-8cb0-a2ac8ab79162\") " pod="openshift-authentication/oauth-openshift-679fb67f4b-dgqcq" Sep 30 07:37:21 crc kubenswrapper[4760]: I0930 07:37:21.909448 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/a89b40e6-b924-4693-8cb0-a2ac8ab79162-v4-0-config-system-serving-cert\") pod \"oauth-openshift-679fb67f4b-dgqcq\" (UID: \"a89b40e6-b924-4693-8cb0-a2ac8ab79162\") " pod="openshift-authentication/oauth-openshift-679fb67f4b-dgqcq" Sep 30 07:37:21 crc kubenswrapper[4760]: I0930 07:37:21.909630 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6hqbw\" (UniqueName: \"kubernetes.io/projected/a89b40e6-b924-4693-8cb0-a2ac8ab79162-kube-api-access-6hqbw\") pod \"oauth-openshift-679fb67f4b-dgqcq\" (UID: \"a89b40e6-b924-4693-8cb0-a2ac8ab79162\") " pod="openshift-authentication/oauth-openshift-679fb67f4b-dgqcq" Sep 30 07:37:21 crc kubenswrapper[4760]: I0930 07:37:21.909717 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/a89b40e6-b924-4693-8cb0-a2ac8ab79162-v4-0-config-user-template-login\") pod \"oauth-openshift-679fb67f4b-dgqcq\" (UID: \"a89b40e6-b924-4693-8cb0-a2ac8ab79162\") " 
pod="openshift-authentication/oauth-openshift-679fb67f4b-dgqcq" Sep 30 07:37:21 crc kubenswrapper[4760]: I0930 07:37:21.909772 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/a89b40e6-b924-4693-8cb0-a2ac8ab79162-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-679fb67f4b-dgqcq\" (UID: \"a89b40e6-b924-4693-8cb0-a2ac8ab79162\") " pod="openshift-authentication/oauth-openshift-679fb67f4b-dgqcq" Sep 30 07:37:21 crc kubenswrapper[4760]: I0930 07:37:21.909830 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/a89b40e6-b924-4693-8cb0-a2ac8ab79162-v4-0-config-user-template-error\") pod \"oauth-openshift-679fb67f4b-dgqcq\" (UID: \"a89b40e6-b924-4693-8cb0-a2ac8ab79162\") " pod="openshift-authentication/oauth-openshift-679fb67f4b-dgqcq" Sep 30 07:37:21 crc kubenswrapper[4760]: I0930 07:37:21.909866 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/a89b40e6-b924-4693-8cb0-a2ac8ab79162-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-679fb67f4b-dgqcq\" (UID: \"a89b40e6-b924-4693-8cb0-a2ac8ab79162\") " pod="openshift-authentication/oauth-openshift-679fb67f4b-dgqcq" Sep 30 07:37:21 crc kubenswrapper[4760]: I0930 07:37:21.909896 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/a89b40e6-b924-4693-8cb0-a2ac8ab79162-v4-0-config-system-router-certs\") pod \"oauth-openshift-679fb67f4b-dgqcq\" (UID: \"a89b40e6-b924-4693-8cb0-a2ac8ab79162\") " pod="openshift-authentication/oauth-openshift-679fb67f4b-dgqcq" Sep 30 07:37:21 crc kubenswrapper[4760]: I0930 07:37:21.910368 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a89b40e6-b924-4693-8cb0-a2ac8ab79162-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-679fb67f4b-dgqcq\" (UID: \"a89b40e6-b924-4693-8cb0-a2ac8ab79162\") " pod="openshift-authentication/oauth-openshift-679fb67f4b-dgqcq" Sep 30 07:37:21 crc kubenswrapper[4760]: I0930 07:37:21.910729 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/a89b40e6-b924-4693-8cb0-a2ac8ab79162-v4-0-config-system-service-ca\") pod \"oauth-openshift-679fb67f4b-dgqcq\" (UID: \"a89b40e6-b924-4693-8cb0-a2ac8ab79162\") " pod="openshift-authentication/oauth-openshift-679fb67f4b-dgqcq" Sep 30 07:37:21 crc kubenswrapper[4760]: I0930 07:37:21.911610 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/a89b40e6-b924-4693-8cb0-a2ac8ab79162-v4-0-config-system-cliconfig\") pod \"oauth-openshift-679fb67f4b-dgqcq\" (UID: \"a89b40e6-b924-4693-8cb0-a2ac8ab79162\") " pod="openshift-authentication/oauth-openshift-679fb67f4b-dgqcq" Sep 30 07:37:21 crc kubenswrapper[4760]: I0930 07:37:21.913799 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/a89b40e6-b924-4693-8cb0-a2ac8ab79162-v4-0-config-system-session\") pod \"oauth-openshift-679fb67f4b-dgqcq\" (UID: \"a89b40e6-b924-4693-8cb0-a2ac8ab79162\") " pod="openshift-authentication/oauth-openshift-679fb67f4b-dgqcq" Sep 30 07:37:21 crc kubenswrapper[4760]: I0930 07:37:21.913861 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/a89b40e6-b924-4693-8cb0-a2ac8ab79162-v4-0-config-system-router-certs\") pod \"oauth-openshift-679fb67f4b-dgqcq\" (UID: \"a89b40e6-b924-4693-8cb0-a2ac8ab79162\") " 
pod="openshift-authentication/oauth-openshift-679fb67f4b-dgqcq" Sep 30 07:37:21 crc kubenswrapper[4760]: I0930 07:37:21.914814 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/a89b40e6-b924-4693-8cb0-a2ac8ab79162-v4-0-config-user-template-login\") pod \"oauth-openshift-679fb67f4b-dgqcq\" (UID: \"a89b40e6-b924-4693-8cb0-a2ac8ab79162\") " pod="openshift-authentication/oauth-openshift-679fb67f4b-dgqcq" Sep 30 07:37:21 crc kubenswrapper[4760]: I0930 07:37:21.915169 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/a89b40e6-b924-4693-8cb0-a2ac8ab79162-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-679fb67f4b-dgqcq\" (UID: \"a89b40e6-b924-4693-8cb0-a2ac8ab79162\") " pod="openshift-authentication/oauth-openshift-679fb67f4b-dgqcq" Sep 30 07:37:21 crc kubenswrapper[4760]: I0930 07:37:21.915266 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/a89b40e6-b924-4693-8cb0-a2ac8ab79162-v4-0-config-system-serving-cert\") pod \"oauth-openshift-679fb67f4b-dgqcq\" (UID: \"a89b40e6-b924-4693-8cb0-a2ac8ab79162\") " pod="openshift-authentication/oauth-openshift-679fb67f4b-dgqcq" Sep 30 07:37:21 crc kubenswrapper[4760]: I0930 07:37:21.916243 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/a89b40e6-b924-4693-8cb0-a2ac8ab79162-v4-0-config-user-template-error\") pod \"oauth-openshift-679fb67f4b-dgqcq\" (UID: \"a89b40e6-b924-4693-8cb0-a2ac8ab79162\") " pod="openshift-authentication/oauth-openshift-679fb67f4b-dgqcq" Sep 30 07:37:21 crc kubenswrapper[4760]: I0930 07:37:21.917345 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/a89b40e6-b924-4693-8cb0-a2ac8ab79162-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-679fb67f4b-dgqcq\" (UID: \"a89b40e6-b924-4693-8cb0-a2ac8ab79162\") " pod="openshift-authentication/oauth-openshift-679fb67f4b-dgqcq" Sep 30 07:37:21 crc kubenswrapper[4760]: I0930 07:37:21.922684 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/a89b40e6-b924-4693-8cb0-a2ac8ab79162-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-679fb67f4b-dgqcq\" (UID: \"a89b40e6-b924-4693-8cb0-a2ac8ab79162\") " pod="openshift-authentication/oauth-openshift-679fb67f4b-dgqcq" Sep 30 07:37:21 crc kubenswrapper[4760]: I0930 07:37:21.942584 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6hqbw\" (UniqueName: \"kubernetes.io/projected/a89b40e6-b924-4693-8cb0-a2ac8ab79162-kube-api-access-6hqbw\") pod \"oauth-openshift-679fb67f4b-dgqcq\" (UID: \"a89b40e6-b924-4693-8cb0-a2ac8ab79162\") " pod="openshift-authentication/oauth-openshift-679fb67f4b-dgqcq" Sep 30 07:37:21 crc kubenswrapper[4760]: I0930 07:37:21.970604 4760 generic.go:334] "Generic (PLEG): container finished" podID="f2f25243-2a0b-498f-8de6-8b0a21c72c49" containerID="323a28996706e5ebe7872e9428bd083a04ad7970621cd63de64d2d6cf9c3fad6" exitCode=0 Sep 30 07:37:21 crc kubenswrapper[4760]: I0930 07:37:21.970677 4760 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-77ttc" Sep 30 07:37:21 crc kubenswrapper[4760]: I0930 07:37:21.970678 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-77ttc" event={"ID":"f2f25243-2a0b-498f-8de6-8b0a21c72c49","Type":"ContainerDied","Data":"323a28996706e5ebe7872e9428bd083a04ad7970621cd63de64d2d6cf9c3fad6"} Sep 30 07:37:21 crc kubenswrapper[4760]: I0930 07:37:21.970839 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-77ttc" event={"ID":"f2f25243-2a0b-498f-8de6-8b0a21c72c49","Type":"ContainerDied","Data":"e36a05fee101b4f8fabf93553fe763dedb5db45b5cc415b145b5e1ab8c7eea5f"} Sep 30 07:37:21 crc kubenswrapper[4760]: I0930 07:37:21.970873 4760 scope.go:117] "RemoveContainer" containerID="323a28996706e5ebe7872e9428bd083a04ad7970621cd63de64d2d6cf9c3fad6" Sep 30 07:37:21 crc kubenswrapper[4760]: I0930 07:37:21.992745 4760 scope.go:117] "RemoveContainer" containerID="323a28996706e5ebe7872e9428bd083a04ad7970621cd63de64d2d6cf9c3fad6" Sep 30 07:37:21 crc kubenswrapper[4760]: E0930 07:37:21.993238 4760 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"323a28996706e5ebe7872e9428bd083a04ad7970621cd63de64d2d6cf9c3fad6\": container with ID starting with 323a28996706e5ebe7872e9428bd083a04ad7970621cd63de64d2d6cf9c3fad6 not found: ID does not exist" containerID="323a28996706e5ebe7872e9428bd083a04ad7970621cd63de64d2d6cf9c3fad6" Sep 30 07:37:21 crc kubenswrapper[4760]: I0930 07:37:21.993286 4760 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"323a28996706e5ebe7872e9428bd083a04ad7970621cd63de64d2d6cf9c3fad6"} err="failed to get container status \"323a28996706e5ebe7872e9428bd083a04ad7970621cd63de64d2d6cf9c3fad6\": rpc error: code = NotFound desc = could not find container 
\"323a28996706e5ebe7872e9428bd083a04ad7970621cd63de64d2d6cf9c3fad6\": container with ID starting with 323a28996706e5ebe7872e9428bd083a04ad7970621cd63de64d2d6cf9c3fad6 not found: ID does not exist" Sep 30 07:37:22 crc kubenswrapper[4760]: I0930 07:37:22.016248 4760 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-77ttc"] Sep 30 07:37:22 crc kubenswrapper[4760]: I0930 07:37:22.022940 4760 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-77ttc"] Sep 30 07:37:22 crc kubenswrapper[4760]: I0930 07:37:22.077526 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-679fb67f4b-dgqcq" Sep 30 07:37:22 crc kubenswrapper[4760]: I0930 07:37:22.566820 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-679fb67f4b-dgqcq"] Sep 30 07:37:22 crc kubenswrapper[4760]: I0930 07:37:22.982724 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-679fb67f4b-dgqcq" event={"ID":"a89b40e6-b924-4693-8cb0-a2ac8ab79162","Type":"ContainerStarted","Data":"7daaac937284bc63c11e062a5214540391990e8001730fe03099b8c980605395"} Sep 30 07:37:22 crc kubenswrapper[4760]: I0930 07:37:22.983176 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-679fb67f4b-dgqcq" event={"ID":"a89b40e6-b924-4693-8cb0-a2ac8ab79162","Type":"ContainerStarted","Data":"f6bbaf5b23f24aad942000bc16b7b9b39d38c5e627155e6d7574312ff8f4fe58"} Sep 30 07:37:22 crc kubenswrapper[4760]: I0930 07:37:22.983214 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-679fb67f4b-dgqcq" Sep 30 07:37:23 crc kubenswrapper[4760]: I0930 07:37:23.028551 4760 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-authentication/oauth-openshift-679fb67f4b-dgqcq" podStartSLOduration=27.028523674 podStartE2EDuration="27.028523674s" podCreationTimestamp="2025-09-30 07:36:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 07:37:23.016642663 +0000 UTC m=+228.659549115" watchObservedRunningTime="2025-09-30 07:37:23.028523674 +0000 UTC m=+228.671430126" Sep 30 07:37:23 crc kubenswrapper[4760]: I0930 07:37:23.080539 4760 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f2f25243-2a0b-498f-8de6-8b0a21c72c49" path="/var/lib/kubelet/pods/f2f25243-2a0b-498f-8de6-8b0a21c72c49/volumes" Sep 30 07:37:23 crc kubenswrapper[4760]: I0930 07:37:23.496056 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-679fb67f4b-dgqcq" Sep 30 07:37:53 crc kubenswrapper[4760]: I0930 07:37:53.823552 4760 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-52kxf"] Sep 30 07:37:53 crc kubenswrapper[4760]: I0930 07:37:53.824598 4760 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-52kxf" podUID="ffeadfaf-9c4c-4dce-99d5-36f0e42571d7" containerName="registry-server" containerID="cri-o://503d47c18b5bbd7eb35699ca96edf4c3989e91f6e3946b2c68e92e5594d1c2ab" gracePeriod=30 Sep 30 07:37:53 crc kubenswrapper[4760]: I0930 07:37:53.844117 4760 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-tjdgw"] Sep 30 07:37:53 crc kubenswrapper[4760]: I0930 07:37:53.845251 4760 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-tjdgw" podUID="a4ec1e48-4ef3-4cfc-9eb2-cd3a0d47e6dc" containerName="registry-server" containerID="cri-o://8dd1d07f8be35389c3eed219a69f21128a2506ffc0d49eea49d0075d7f69a8d8" gracePeriod=30 Sep 
30 07:37:53 crc kubenswrapper[4760]: I0930 07:37:53.850441 4760 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-gsddw"] Sep 30 07:37:53 crc kubenswrapper[4760]: I0930 07:37:53.850700 4760 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/marketplace-operator-79b997595-gsddw" podUID="cc4f6325-5a2a-4f15-8ee9-860617a9d7ce" containerName="marketplace-operator" containerID="cri-o://52bb5880b1ba1e2a699223268f5ecb0f555d929cf119d904d6704135df89b0ae" gracePeriod=30 Sep 30 07:37:53 crc kubenswrapper[4760]: I0930 07:37:53.860350 4760 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-dp82r"] Sep 30 07:37:53 crc kubenswrapper[4760]: I0930 07:37:53.860590 4760 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-dp82r" podUID="36f4e442-c66f-4b6b-b6b3-98c8fd676ba7" containerName="registry-server" containerID="cri-o://90b088c8681826e2d5b74cf2e1c2733f3398f9126ac83ca553f04a107c6948be" gracePeriod=30 Sep 30 07:37:53 crc kubenswrapper[4760]: I0930 07:37:53.882333 4760 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-zj9hk"] Sep 30 07:37:53 crc kubenswrapper[4760]: I0930 07:37:53.882850 4760 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-zj9hk" podUID="97af4546-9849-4500-a18e-994ec8158af0" containerName="registry-server" containerID="cri-o://0d990bd19eafbf236c77646298fac63e37e3328e111bc2ed19e935062999d069" gracePeriod=30 Sep 30 07:37:53 crc kubenswrapper[4760]: I0930 07:37:53.889260 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-cqqsj"] Sep 30 07:37:53 crc kubenswrapper[4760]: I0930 07:37:53.890379 4760 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-cqqsj" Sep 30 07:37:53 crc kubenswrapper[4760]: I0930 07:37:53.900797 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-cqqsj"] Sep 30 07:37:53 crc kubenswrapper[4760]: I0930 07:37:53.941104 4760 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-marketplace/community-operators-tjdgw" podUID="a4ec1e48-4ef3-4cfc-9eb2-cd3a0d47e6dc" containerName="registry-server" probeResult="failure" output="" Sep 30 07:37:53 crc kubenswrapper[4760]: I0930 07:37:53.941498 4760 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/community-operators-tjdgw" podUID="a4ec1e48-4ef3-4cfc-9eb2-cd3a0d47e6dc" containerName="registry-server" probeResult="failure" output="" Sep 30 07:37:54 crc kubenswrapper[4760]: I0930 07:37:54.010776 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/97e466f4-974e-4d3c-b041-c4d01ad15fb4-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-cqqsj\" (UID: \"97e466f4-974e-4d3c-b041-c4d01ad15fb4\") " pod="openshift-marketplace/marketplace-operator-79b997595-cqqsj" Sep 30 07:37:54 crc kubenswrapper[4760]: I0930 07:37:54.010822 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rpldg\" (UniqueName: \"kubernetes.io/projected/97e466f4-974e-4d3c-b041-c4d01ad15fb4-kube-api-access-rpldg\") pod \"marketplace-operator-79b997595-cqqsj\" (UID: \"97e466f4-974e-4d3c-b041-c4d01ad15fb4\") " pod="openshift-marketplace/marketplace-operator-79b997595-cqqsj" Sep 30 07:37:54 crc kubenswrapper[4760]: I0930 07:37:54.010843 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: 
\"kubernetes.io/secret/97e466f4-974e-4d3c-b041-c4d01ad15fb4-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-cqqsj\" (UID: \"97e466f4-974e-4d3c-b041-c4d01ad15fb4\") " pod="openshift-marketplace/marketplace-operator-79b997595-cqqsj" Sep 30 07:37:54 crc kubenswrapper[4760]: E0930 07:37:54.061775 4760 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 503d47c18b5bbd7eb35699ca96edf4c3989e91f6e3946b2c68e92e5594d1c2ab is running failed: container process not found" containerID="503d47c18b5bbd7eb35699ca96edf4c3989e91f6e3946b2c68e92e5594d1c2ab" cmd=["grpc_health_probe","-addr=:50051"] Sep 30 07:37:54 crc kubenswrapper[4760]: E0930 07:37:54.062384 4760 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 503d47c18b5bbd7eb35699ca96edf4c3989e91f6e3946b2c68e92e5594d1c2ab is running failed: container process not found" containerID="503d47c18b5bbd7eb35699ca96edf4c3989e91f6e3946b2c68e92e5594d1c2ab" cmd=["grpc_health_probe","-addr=:50051"] Sep 30 07:37:54 crc kubenswrapper[4760]: E0930 07:37:54.062682 4760 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 503d47c18b5bbd7eb35699ca96edf4c3989e91f6e3946b2c68e92e5594d1c2ab is running failed: container process not found" containerID="503d47c18b5bbd7eb35699ca96edf4c3989e91f6e3946b2c68e92e5594d1c2ab" cmd=["grpc_health_probe","-addr=:50051"] Sep 30 07:37:54 crc kubenswrapper[4760]: E0930 07:37:54.062718 4760 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 503d47c18b5bbd7eb35699ca96edf4c3989e91f6e3946b2c68e92e5594d1c2ab is running failed: container process not found" probeType="Readiness" pod="openshift-marketplace/certified-operators-52kxf" 
podUID="ffeadfaf-9c4c-4dce-99d5-36f0e42571d7" containerName="registry-server" Sep 30 07:37:54 crc kubenswrapper[4760]: I0930 07:37:54.112094 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/97e466f4-974e-4d3c-b041-c4d01ad15fb4-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-cqqsj\" (UID: \"97e466f4-974e-4d3c-b041-c4d01ad15fb4\") " pod="openshift-marketplace/marketplace-operator-79b997595-cqqsj" Sep 30 07:37:54 crc kubenswrapper[4760]: I0930 07:37:54.112146 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rpldg\" (UniqueName: \"kubernetes.io/projected/97e466f4-974e-4d3c-b041-c4d01ad15fb4-kube-api-access-rpldg\") pod \"marketplace-operator-79b997595-cqqsj\" (UID: \"97e466f4-974e-4d3c-b041-c4d01ad15fb4\") " pod="openshift-marketplace/marketplace-operator-79b997595-cqqsj" Sep 30 07:37:54 crc kubenswrapper[4760]: I0930 07:37:54.112164 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/97e466f4-974e-4d3c-b041-c4d01ad15fb4-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-cqqsj\" (UID: \"97e466f4-974e-4d3c-b041-c4d01ad15fb4\") " pod="openshift-marketplace/marketplace-operator-79b997595-cqqsj" Sep 30 07:37:54 crc kubenswrapper[4760]: I0930 07:37:54.113125 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/97e466f4-974e-4d3c-b041-c4d01ad15fb4-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-cqqsj\" (UID: \"97e466f4-974e-4d3c-b041-c4d01ad15fb4\") " pod="openshift-marketplace/marketplace-operator-79b997595-cqqsj" Sep 30 07:37:54 crc kubenswrapper[4760]: I0930 07:37:54.126744 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: 
\"kubernetes.io/secret/97e466f4-974e-4d3c-b041-c4d01ad15fb4-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-cqqsj\" (UID: \"97e466f4-974e-4d3c-b041-c4d01ad15fb4\") " pod="openshift-marketplace/marketplace-operator-79b997595-cqqsj" Sep 30 07:37:54 crc kubenswrapper[4760]: I0930 07:37:54.129973 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rpldg\" (UniqueName: \"kubernetes.io/projected/97e466f4-974e-4d3c-b041-c4d01ad15fb4-kube-api-access-rpldg\") pod \"marketplace-operator-79b997595-cqqsj\" (UID: \"97e466f4-974e-4d3c-b041-c4d01ad15fb4\") " pod="openshift-marketplace/marketplace-operator-79b997595-cqqsj" Sep 30 07:37:54 crc kubenswrapper[4760]: I0930 07:37:54.171668 4760 generic.go:334] "Generic (PLEG): container finished" podID="ffeadfaf-9c4c-4dce-99d5-36f0e42571d7" containerID="503d47c18b5bbd7eb35699ca96edf4c3989e91f6e3946b2c68e92e5594d1c2ab" exitCode=0 Sep 30 07:37:54 crc kubenswrapper[4760]: I0930 07:37:54.171755 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-52kxf" event={"ID":"ffeadfaf-9c4c-4dce-99d5-36f0e42571d7","Type":"ContainerDied","Data":"503d47c18b5bbd7eb35699ca96edf4c3989e91f6e3946b2c68e92e5594d1c2ab"} Sep 30 07:37:54 crc kubenswrapper[4760]: I0930 07:37:54.171821 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-52kxf" event={"ID":"ffeadfaf-9c4c-4dce-99d5-36f0e42571d7","Type":"ContainerDied","Data":"3caeb18152aba8c700b54cb3b3a6c4df897571b6c28e0f3c667f1700af988929"} Sep 30 07:37:54 crc kubenswrapper[4760]: I0930 07:37:54.171836 4760 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3caeb18152aba8c700b54cb3b3a6c4df897571b6c28e0f3c667f1700af988929" Sep 30 07:37:54 crc kubenswrapper[4760]: I0930 07:37:54.173827 4760 generic.go:334] "Generic (PLEG): container finished" podID="a4ec1e48-4ef3-4cfc-9eb2-cd3a0d47e6dc" 
containerID="8dd1d07f8be35389c3eed219a69f21128a2506ffc0d49eea49d0075d7f69a8d8" exitCode=0 Sep 30 07:37:54 crc kubenswrapper[4760]: I0930 07:37:54.173882 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-tjdgw" event={"ID":"a4ec1e48-4ef3-4cfc-9eb2-cd3a0d47e6dc","Type":"ContainerDied","Data":"8dd1d07f8be35389c3eed219a69f21128a2506ffc0d49eea49d0075d7f69a8d8"} Sep 30 07:37:54 crc kubenswrapper[4760]: I0930 07:37:54.175158 4760 generic.go:334] "Generic (PLEG): container finished" podID="cc4f6325-5a2a-4f15-8ee9-860617a9d7ce" containerID="52bb5880b1ba1e2a699223268f5ecb0f555d929cf119d904d6704135df89b0ae" exitCode=0 Sep 30 07:37:54 crc kubenswrapper[4760]: I0930 07:37:54.175217 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-gsddw" event={"ID":"cc4f6325-5a2a-4f15-8ee9-860617a9d7ce","Type":"ContainerDied","Data":"52bb5880b1ba1e2a699223268f5ecb0f555d929cf119d904d6704135df89b0ae"} Sep 30 07:37:54 crc kubenswrapper[4760]: I0930 07:37:54.176754 4760 generic.go:334] "Generic (PLEG): container finished" podID="36f4e442-c66f-4b6b-b6b3-98c8fd676ba7" containerID="90b088c8681826e2d5b74cf2e1c2733f3398f9126ac83ca553f04a107c6948be" exitCode=0 Sep 30 07:37:54 crc kubenswrapper[4760]: I0930 07:37:54.176810 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-dp82r" event={"ID":"36f4e442-c66f-4b6b-b6b3-98c8fd676ba7","Type":"ContainerDied","Data":"90b088c8681826e2d5b74cf2e1c2733f3398f9126ac83ca553f04a107c6948be"} Sep 30 07:37:54 crc kubenswrapper[4760]: I0930 07:37:54.178311 4760 generic.go:334] "Generic (PLEG): container finished" podID="97af4546-9849-4500-a18e-994ec8158af0" containerID="0d990bd19eafbf236c77646298fac63e37e3328e111bc2ed19e935062999d069" exitCode=0 Sep 30 07:37:54 crc kubenswrapper[4760]: I0930 07:37:54.178331 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zj9hk" 
event={"ID":"97af4546-9849-4500-a18e-994ec8158af0","Type":"ContainerDied","Data":"0d990bd19eafbf236c77646298fac63e37e3328e111bc2ed19e935062999d069"} Sep 30 07:37:54 crc kubenswrapper[4760]: I0930 07:37:54.212846 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-cqqsj" Sep 30 07:37:54 crc kubenswrapper[4760]: I0930 07:37:54.287853 4760 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-52kxf" Sep 30 07:37:54 crc kubenswrapper[4760]: I0930 07:37:54.297295 4760 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-tjdgw" Sep 30 07:37:54 crc kubenswrapper[4760]: I0930 07:37:54.302062 4760 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-gsddw" Sep 30 07:37:54 crc kubenswrapper[4760]: I0930 07:37:54.383448 4760 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-dp82r" Sep 30 07:37:54 crc kubenswrapper[4760]: I0930 07:37:54.385981 4760 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-zj9hk" Sep 30 07:37:54 crc kubenswrapper[4760]: I0930 07:37:54.417900 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/cc4f6325-5a2a-4f15-8ee9-860617a9d7ce-marketplace-operator-metrics\") pod \"cc4f6325-5a2a-4f15-8ee9-860617a9d7ce\" (UID: \"cc4f6325-5a2a-4f15-8ee9-860617a9d7ce\") " Sep 30 07:37:54 crc kubenswrapper[4760]: I0930 07:37:54.417960 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ffeadfaf-9c4c-4dce-99d5-36f0e42571d7-utilities\") pod \"ffeadfaf-9c4c-4dce-99d5-36f0e42571d7\" (UID: \"ffeadfaf-9c4c-4dce-99d5-36f0e42571d7\") " Sep 30 07:37:54 crc kubenswrapper[4760]: I0930 07:37:54.417986 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cgch4\" (UniqueName: \"kubernetes.io/projected/cc4f6325-5a2a-4f15-8ee9-860617a9d7ce-kube-api-access-cgch4\") pod \"cc4f6325-5a2a-4f15-8ee9-860617a9d7ce\" (UID: \"cc4f6325-5a2a-4f15-8ee9-860617a9d7ce\") " Sep 30 07:37:54 crc kubenswrapper[4760]: I0930 07:37:54.418020 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6qqp5\" (UniqueName: \"kubernetes.io/projected/ffeadfaf-9c4c-4dce-99d5-36f0e42571d7-kube-api-access-6qqp5\") pod \"ffeadfaf-9c4c-4dce-99d5-36f0e42571d7\" (UID: \"ffeadfaf-9c4c-4dce-99d5-36f0e42571d7\") " Sep 30 07:37:54 crc kubenswrapper[4760]: I0930 07:37:54.418054 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a4ec1e48-4ef3-4cfc-9eb2-cd3a0d47e6dc-utilities\") pod \"a4ec1e48-4ef3-4cfc-9eb2-cd3a0d47e6dc\" (UID: \"a4ec1e48-4ef3-4cfc-9eb2-cd3a0d47e6dc\") " Sep 30 07:37:54 crc kubenswrapper[4760]: I0930 07:37:54.418102 4760 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/cc4f6325-5a2a-4f15-8ee9-860617a9d7ce-marketplace-trusted-ca\") pod \"cc4f6325-5a2a-4f15-8ee9-860617a9d7ce\" (UID: \"cc4f6325-5a2a-4f15-8ee9-860617a9d7ce\") " Sep 30 07:37:54 crc kubenswrapper[4760]: I0930 07:37:54.418141 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a4ec1e48-4ef3-4cfc-9eb2-cd3a0d47e6dc-catalog-content\") pod \"a4ec1e48-4ef3-4cfc-9eb2-cd3a0d47e6dc\" (UID: \"a4ec1e48-4ef3-4cfc-9eb2-cd3a0d47e6dc\") " Sep 30 07:37:54 crc kubenswrapper[4760]: I0930 07:37:54.418168 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hnfp9\" (UniqueName: \"kubernetes.io/projected/a4ec1e48-4ef3-4cfc-9eb2-cd3a0d47e6dc-kube-api-access-hnfp9\") pod \"a4ec1e48-4ef3-4cfc-9eb2-cd3a0d47e6dc\" (UID: \"a4ec1e48-4ef3-4cfc-9eb2-cd3a0d47e6dc\") " Sep 30 07:37:54 crc kubenswrapper[4760]: I0930 07:37:54.418199 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ffeadfaf-9c4c-4dce-99d5-36f0e42571d7-catalog-content\") pod \"ffeadfaf-9c4c-4dce-99d5-36f0e42571d7\" (UID: \"ffeadfaf-9c4c-4dce-99d5-36f0e42571d7\") " Sep 30 07:37:54 crc kubenswrapper[4760]: I0930 07:37:54.419318 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cc4f6325-5a2a-4f15-8ee9-860617a9d7ce-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "cc4f6325-5a2a-4f15-8ee9-860617a9d7ce" (UID: "cc4f6325-5a2a-4f15-8ee9-860617a9d7ce"). InnerVolumeSpecName "marketplace-trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 07:37:54 crc kubenswrapper[4760]: I0930 07:37:54.419449 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ffeadfaf-9c4c-4dce-99d5-36f0e42571d7-utilities" (OuterVolumeSpecName: "utilities") pod "ffeadfaf-9c4c-4dce-99d5-36f0e42571d7" (UID: "ffeadfaf-9c4c-4dce-99d5-36f0e42571d7"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 07:37:54 crc kubenswrapper[4760]: I0930 07:37:54.420174 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a4ec1e48-4ef3-4cfc-9eb2-cd3a0d47e6dc-utilities" (OuterVolumeSpecName: "utilities") pod "a4ec1e48-4ef3-4cfc-9eb2-cd3a0d47e6dc" (UID: "a4ec1e48-4ef3-4cfc-9eb2-cd3a0d47e6dc"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 07:37:54 crc kubenswrapper[4760]: I0930 07:37:54.422641 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cc4f6325-5a2a-4f15-8ee9-860617a9d7ce-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "cc4f6325-5a2a-4f15-8ee9-860617a9d7ce" (UID: "cc4f6325-5a2a-4f15-8ee9-860617a9d7ce"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 07:37:54 crc kubenswrapper[4760]: I0930 07:37:54.422821 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cc4f6325-5a2a-4f15-8ee9-860617a9d7ce-kube-api-access-cgch4" (OuterVolumeSpecName: "kube-api-access-cgch4") pod "cc4f6325-5a2a-4f15-8ee9-860617a9d7ce" (UID: "cc4f6325-5a2a-4f15-8ee9-860617a9d7ce"). InnerVolumeSpecName "kube-api-access-cgch4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 07:37:54 crc kubenswrapper[4760]: I0930 07:37:54.423176 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a4ec1e48-4ef3-4cfc-9eb2-cd3a0d47e6dc-kube-api-access-hnfp9" (OuterVolumeSpecName: "kube-api-access-hnfp9") pod "a4ec1e48-4ef3-4cfc-9eb2-cd3a0d47e6dc" (UID: "a4ec1e48-4ef3-4cfc-9eb2-cd3a0d47e6dc"). InnerVolumeSpecName "kube-api-access-hnfp9". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 07:37:54 crc kubenswrapper[4760]: I0930 07:37:54.423209 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ffeadfaf-9c4c-4dce-99d5-36f0e42571d7-kube-api-access-6qqp5" (OuterVolumeSpecName: "kube-api-access-6qqp5") pod "ffeadfaf-9c4c-4dce-99d5-36f0e42571d7" (UID: "ffeadfaf-9c4c-4dce-99d5-36f0e42571d7"). InnerVolumeSpecName "kube-api-access-6qqp5". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 07:37:54 crc kubenswrapper[4760]: I0930 07:37:54.470373 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ffeadfaf-9c4c-4dce-99d5-36f0e42571d7-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "ffeadfaf-9c4c-4dce-99d5-36f0e42571d7" (UID: "ffeadfaf-9c4c-4dce-99d5-36f0e42571d7"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 07:37:54 crc kubenswrapper[4760]: I0930 07:37:54.480058 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a4ec1e48-4ef3-4cfc-9eb2-cd3a0d47e6dc-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "a4ec1e48-4ef3-4cfc-9eb2-cd3a0d47e6dc" (UID: "a4ec1e48-4ef3-4cfc-9eb2-cd3a0d47e6dc"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 07:37:54 crc kubenswrapper[4760]: I0930 07:37:54.519205 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7r4bg\" (UniqueName: \"kubernetes.io/projected/36f4e442-c66f-4b6b-b6b3-98c8fd676ba7-kube-api-access-7r4bg\") pod \"36f4e442-c66f-4b6b-b6b3-98c8fd676ba7\" (UID: \"36f4e442-c66f-4b6b-b6b3-98c8fd676ba7\") " Sep 30 07:37:54 crc kubenswrapper[4760]: I0930 07:37:54.519257 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9242t\" (UniqueName: \"kubernetes.io/projected/97af4546-9849-4500-a18e-994ec8158af0-kube-api-access-9242t\") pod \"97af4546-9849-4500-a18e-994ec8158af0\" (UID: \"97af4546-9849-4500-a18e-994ec8158af0\") " Sep 30 07:37:54 crc kubenswrapper[4760]: I0930 07:37:54.519367 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/36f4e442-c66f-4b6b-b6b3-98c8fd676ba7-catalog-content\") pod \"36f4e442-c66f-4b6b-b6b3-98c8fd676ba7\" (UID: \"36f4e442-c66f-4b6b-b6b3-98c8fd676ba7\") " Sep 30 07:37:54 crc kubenswrapper[4760]: I0930 07:37:54.519406 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/97af4546-9849-4500-a18e-994ec8158af0-utilities\") pod \"97af4546-9849-4500-a18e-994ec8158af0\" (UID: \"97af4546-9849-4500-a18e-994ec8158af0\") " Sep 30 07:37:54 crc kubenswrapper[4760]: I0930 07:37:54.519510 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/36f4e442-c66f-4b6b-b6b3-98c8fd676ba7-utilities\") pod \"36f4e442-c66f-4b6b-b6b3-98c8fd676ba7\" (UID: \"36f4e442-c66f-4b6b-b6b3-98c8fd676ba7\") " Sep 30 07:37:54 crc kubenswrapper[4760]: I0930 07:37:54.519559 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/97af4546-9849-4500-a18e-994ec8158af0-catalog-content\") pod \"97af4546-9849-4500-a18e-994ec8158af0\" (UID: \"97af4546-9849-4500-a18e-994ec8158af0\") " Sep 30 07:37:54 crc kubenswrapper[4760]: I0930 07:37:54.519734 4760 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a4ec1e48-4ef3-4cfc-9eb2-cd3a0d47e6dc-catalog-content\") on node \"crc\" DevicePath \"\"" Sep 30 07:37:54 crc kubenswrapper[4760]: I0930 07:37:54.519746 4760 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hnfp9\" (UniqueName: \"kubernetes.io/projected/a4ec1e48-4ef3-4cfc-9eb2-cd3a0d47e6dc-kube-api-access-hnfp9\") on node \"crc\" DevicePath \"\"" Sep 30 07:37:54 crc kubenswrapper[4760]: I0930 07:37:54.519757 4760 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ffeadfaf-9c4c-4dce-99d5-36f0e42571d7-catalog-content\") on node \"crc\" DevicePath \"\"" Sep 30 07:37:54 crc kubenswrapper[4760]: I0930 07:37:54.519766 4760 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/cc4f6325-5a2a-4f15-8ee9-860617a9d7ce-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Sep 30 07:37:54 crc kubenswrapper[4760]: I0930 07:37:54.519777 4760 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ffeadfaf-9c4c-4dce-99d5-36f0e42571d7-utilities\") on node \"crc\" DevicePath \"\"" Sep 30 07:37:54 crc kubenswrapper[4760]: I0930 07:37:54.519788 4760 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cgch4\" (UniqueName: \"kubernetes.io/projected/cc4f6325-5a2a-4f15-8ee9-860617a9d7ce-kube-api-access-cgch4\") on node \"crc\" DevicePath \"\"" Sep 30 07:37:54 crc kubenswrapper[4760]: I0930 07:37:54.519796 4760 reconciler_common.go:293] "Volume detached for volume 
\"kube-api-access-6qqp5\" (UniqueName: \"kubernetes.io/projected/ffeadfaf-9c4c-4dce-99d5-36f0e42571d7-kube-api-access-6qqp5\") on node \"crc\" DevicePath \"\"" Sep 30 07:37:54 crc kubenswrapper[4760]: I0930 07:37:54.519807 4760 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a4ec1e48-4ef3-4cfc-9eb2-cd3a0d47e6dc-utilities\") on node \"crc\" DevicePath \"\"" Sep 30 07:37:54 crc kubenswrapper[4760]: I0930 07:37:54.519815 4760 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/cc4f6325-5a2a-4f15-8ee9-860617a9d7ce-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Sep 30 07:37:54 crc kubenswrapper[4760]: I0930 07:37:54.520611 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/97af4546-9849-4500-a18e-994ec8158af0-utilities" (OuterVolumeSpecName: "utilities") pod "97af4546-9849-4500-a18e-994ec8158af0" (UID: "97af4546-9849-4500-a18e-994ec8158af0"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 07:37:54 crc kubenswrapper[4760]: I0930 07:37:54.520694 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/36f4e442-c66f-4b6b-b6b3-98c8fd676ba7-utilities" (OuterVolumeSpecName: "utilities") pod "36f4e442-c66f-4b6b-b6b3-98c8fd676ba7" (UID: "36f4e442-c66f-4b6b-b6b3-98c8fd676ba7"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 07:37:54 crc kubenswrapper[4760]: I0930 07:37:54.521864 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/97af4546-9849-4500-a18e-994ec8158af0-kube-api-access-9242t" (OuterVolumeSpecName: "kube-api-access-9242t") pod "97af4546-9849-4500-a18e-994ec8158af0" (UID: "97af4546-9849-4500-a18e-994ec8158af0"). InnerVolumeSpecName "kube-api-access-9242t". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 07:37:54 crc kubenswrapper[4760]: I0930 07:37:54.522390 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/36f4e442-c66f-4b6b-b6b3-98c8fd676ba7-kube-api-access-7r4bg" (OuterVolumeSpecName: "kube-api-access-7r4bg") pod "36f4e442-c66f-4b6b-b6b3-98c8fd676ba7" (UID: "36f4e442-c66f-4b6b-b6b3-98c8fd676ba7"). InnerVolumeSpecName "kube-api-access-7r4bg". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 07:37:54 crc kubenswrapper[4760]: I0930 07:37:54.536028 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/36f4e442-c66f-4b6b-b6b3-98c8fd676ba7-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "36f4e442-c66f-4b6b-b6b3-98c8fd676ba7" (UID: "36f4e442-c66f-4b6b-b6b3-98c8fd676ba7"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 07:37:54 crc kubenswrapper[4760]: I0930 07:37:54.602736 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/97af4546-9849-4500-a18e-994ec8158af0-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "97af4546-9849-4500-a18e-994ec8158af0" (UID: "97af4546-9849-4500-a18e-994ec8158af0"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 07:37:54 crc kubenswrapper[4760]: I0930 07:37:54.620553 4760 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/36f4e442-c66f-4b6b-b6b3-98c8fd676ba7-utilities\") on node \"crc\" DevicePath \"\"" Sep 30 07:37:54 crc kubenswrapper[4760]: I0930 07:37:54.620584 4760 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/97af4546-9849-4500-a18e-994ec8158af0-catalog-content\") on node \"crc\" DevicePath \"\"" Sep 30 07:37:54 crc kubenswrapper[4760]: I0930 07:37:54.620596 4760 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7r4bg\" (UniqueName: \"kubernetes.io/projected/36f4e442-c66f-4b6b-b6b3-98c8fd676ba7-kube-api-access-7r4bg\") on node \"crc\" DevicePath \"\"" Sep 30 07:37:54 crc kubenswrapper[4760]: I0930 07:37:54.620604 4760 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9242t\" (UniqueName: \"kubernetes.io/projected/97af4546-9849-4500-a18e-994ec8158af0-kube-api-access-9242t\") on node \"crc\" DevicePath \"\"" Sep 30 07:37:54 crc kubenswrapper[4760]: I0930 07:37:54.620613 4760 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/36f4e442-c66f-4b6b-b6b3-98c8fd676ba7-catalog-content\") on node \"crc\" DevicePath \"\"" Sep 30 07:37:54 crc kubenswrapper[4760]: I0930 07:37:54.620621 4760 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/97af4546-9849-4500-a18e-994ec8158af0-utilities\") on node \"crc\" DevicePath \"\"" Sep 30 07:37:54 crc kubenswrapper[4760]: I0930 07:37:54.641951 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-cqqsj"] Sep 30 07:37:55 crc kubenswrapper[4760]: I0930 07:37:55.185251 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/marketplace-operator-79b997595-gsddw" event={"ID":"cc4f6325-5a2a-4f15-8ee9-860617a9d7ce","Type":"ContainerDied","Data":"5d2e908fa0e01a3196cb3916e635d6049df1ba9e495bb303a23317e343e83e63"} Sep 30 07:37:55 crc kubenswrapper[4760]: I0930 07:37:55.185568 4760 scope.go:117] "RemoveContainer" containerID="52bb5880b1ba1e2a699223268f5ecb0f555d929cf119d904d6704135df89b0ae" Sep 30 07:37:55 crc kubenswrapper[4760]: I0930 07:37:55.185286 4760 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-gsddw" Sep 30 07:37:55 crc kubenswrapper[4760]: I0930 07:37:55.189937 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-dp82r" event={"ID":"36f4e442-c66f-4b6b-b6b3-98c8fd676ba7","Type":"ContainerDied","Data":"c976640224ed8f3d13b6f420a937e00e348c1a161a0a6a34b5c04f7622a1f039"} Sep 30 07:37:55 crc kubenswrapper[4760]: I0930 07:37:55.190068 4760 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-dp82r" Sep 30 07:37:55 crc kubenswrapper[4760]: I0930 07:37:55.193586 4760 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-zj9hk" Sep 30 07:37:55 crc kubenswrapper[4760]: I0930 07:37:55.193645 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zj9hk" event={"ID":"97af4546-9849-4500-a18e-994ec8158af0","Type":"ContainerDied","Data":"12ef421c2b85f611b54f842e529425bbc4f7ea1c13cbc68d02242a5c9044d83f"} Sep 30 07:37:55 crc kubenswrapper[4760]: I0930 07:37:55.197516 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-tjdgw" event={"ID":"a4ec1e48-4ef3-4cfc-9eb2-cd3a0d47e6dc","Type":"ContainerDied","Data":"dcc0e7a7a638fe1bb7c2ff6bddb3da6cf762f578d372d30a1bcaffdf99194f71"} Sep 30 07:37:55 crc kubenswrapper[4760]: I0930 07:37:55.197602 4760 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-tjdgw" Sep 30 07:37:55 crc kubenswrapper[4760]: I0930 07:37:55.200033 4760 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-52kxf" Sep 30 07:37:55 crc kubenswrapper[4760]: I0930 07:37:55.200501 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-cqqsj" event={"ID":"97e466f4-974e-4d3c-b041-c4d01ad15fb4","Type":"ContainerStarted","Data":"68df3c67b5e5ff8cfdb488fb11136bfb8512aaa53493769a45584fcfff5af1ed"} Sep 30 07:37:55 crc kubenswrapper[4760]: I0930 07:37:55.200522 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-cqqsj" event={"ID":"97e466f4-974e-4d3c-b041-c4d01ad15fb4","Type":"ContainerStarted","Data":"747cbeba87f89211ad907f287a3d5f2d0cb5c77a47078cd9d94bb02880ad4c10"} Sep 30 07:37:55 crc kubenswrapper[4760]: I0930 07:37:55.204013 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-cqqsj" Sep 30 07:37:55 crc kubenswrapper[4760]: I0930 07:37:55.207509 4760 scope.go:117] "RemoveContainer" containerID="90b088c8681826e2d5b74cf2e1c2733f3398f9126ac83ca553f04a107c6948be" Sep 30 07:37:55 crc kubenswrapper[4760]: I0930 07:37:55.211342 4760 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-gsddw"] Sep 30 07:37:55 crc kubenswrapper[4760]: I0930 07:37:55.225550 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-cqqsj" Sep 30 07:37:55 crc kubenswrapper[4760]: I0930 07:37:55.226089 4760 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-gsddw"] Sep 30 07:37:55 crc kubenswrapper[4760]: I0930 07:37:55.230060 4760 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-dp82r"] Sep 30 07:37:55 crc kubenswrapper[4760]: I0930 07:37:55.231713 4760 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openshift-marketplace/redhat-marketplace-dp82r"] Sep 30 07:37:55 crc kubenswrapper[4760]: I0930 07:37:55.238480 4760 scope.go:117] "RemoveContainer" containerID="800a71bc090c3629094e4cb65a311961134bb056219144c4d8202dfe8705a0fd" Sep 30 07:37:55 crc kubenswrapper[4760]: I0930 07:37:55.242878 4760 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-52kxf"] Sep 30 07:37:55 crc kubenswrapper[4760]: I0930 07:37:55.247956 4760 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-52kxf"] Sep 30 07:37:55 crc kubenswrapper[4760]: I0930 07:37:55.253258 4760 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-tjdgw"] Sep 30 07:37:55 crc kubenswrapper[4760]: I0930 07:37:55.264221 4760 scope.go:117] "RemoveContainer" containerID="a3ac86061330ee8cb670dbcc161921fa74a3ec58b68857906a49a40e87f3d895" Sep 30 07:37:55 crc kubenswrapper[4760]: I0930 07:37:55.264466 4760 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-tjdgw"] Sep 30 07:37:55 crc kubenswrapper[4760]: I0930 07:37:55.267621 4760 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-zj9hk"] Sep 30 07:37:55 crc kubenswrapper[4760]: I0930 07:37:55.270564 4760 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-zj9hk"] Sep 30 07:37:55 crc kubenswrapper[4760]: I0930 07:37:55.276285 4760 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-cqqsj" podStartSLOduration=2.276271678 podStartE2EDuration="2.276271678s" podCreationTimestamp="2025-09-30 07:37:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 07:37:55.272619976 +0000 UTC m=+260.915526398" watchObservedRunningTime="2025-09-30 
07:37:55.276271678 +0000 UTC m=+260.919178100" Sep 30 07:37:55 crc kubenswrapper[4760]: I0930 07:37:55.288956 4760 scope.go:117] "RemoveContainer" containerID="0d990bd19eafbf236c77646298fac63e37e3328e111bc2ed19e935062999d069" Sep 30 07:37:55 crc kubenswrapper[4760]: I0930 07:37:55.313441 4760 scope.go:117] "RemoveContainer" containerID="3c51d7ef70bb13e88de80fa08e0034ac694e591f9853db2daf5162585be5467c" Sep 30 07:37:55 crc kubenswrapper[4760]: I0930 07:37:55.327287 4760 scope.go:117] "RemoveContainer" containerID="af92c8ab55f3635fe1254979c54172fcd2bd36edbf77421c77b3a223c0e27aa2" Sep 30 07:37:55 crc kubenswrapper[4760]: I0930 07:37:55.338213 4760 scope.go:117] "RemoveContainer" containerID="8dd1d07f8be35389c3eed219a69f21128a2506ffc0d49eea49d0075d7f69a8d8" Sep 30 07:37:55 crc kubenswrapper[4760]: I0930 07:37:55.354743 4760 scope.go:117] "RemoveContainer" containerID="7207c5bbb51eb7ff6fa7ed6241d558cbd698d0ef8e53002ad1a6b4880fad525e" Sep 30 07:37:55 crc kubenswrapper[4760]: I0930 07:37:55.369148 4760 scope.go:117] "RemoveContainer" containerID="86ab3b367ffcddf6e73455becb5c4c38a481f14d3ba6d6f3b256b281a1e160ab" Sep 30 07:37:56 crc kubenswrapper[4760]: I0930 07:37:56.052270 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-4k7l5"] Sep 30 07:37:56 crc kubenswrapper[4760]: E0930 07:37:56.052747 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="36f4e442-c66f-4b6b-b6b3-98c8fd676ba7" containerName="extract-utilities" Sep 30 07:37:56 crc kubenswrapper[4760]: I0930 07:37:56.052827 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="36f4e442-c66f-4b6b-b6b3-98c8fd676ba7" containerName="extract-utilities" Sep 30 07:37:56 crc kubenswrapper[4760]: E0930 07:37:56.052903 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ffeadfaf-9c4c-4dce-99d5-36f0e42571d7" containerName="extract-utilities" Sep 30 07:37:56 crc kubenswrapper[4760]: I0930 07:37:56.052975 4760 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="ffeadfaf-9c4c-4dce-99d5-36f0e42571d7" containerName="extract-utilities" Sep 30 07:37:56 crc kubenswrapper[4760]: E0930 07:37:56.053045 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="97af4546-9849-4500-a18e-994ec8158af0" containerName="registry-server" Sep 30 07:37:56 crc kubenswrapper[4760]: I0930 07:37:56.053105 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="97af4546-9849-4500-a18e-994ec8158af0" containerName="registry-server" Sep 30 07:37:56 crc kubenswrapper[4760]: E0930 07:37:56.053179 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="36f4e442-c66f-4b6b-b6b3-98c8fd676ba7" containerName="extract-content" Sep 30 07:37:56 crc kubenswrapper[4760]: I0930 07:37:56.053251 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="36f4e442-c66f-4b6b-b6b3-98c8fd676ba7" containerName="extract-content" Sep 30 07:37:56 crc kubenswrapper[4760]: E0930 07:37:56.053340 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a4ec1e48-4ef3-4cfc-9eb2-cd3a0d47e6dc" containerName="extract-content" Sep 30 07:37:56 crc kubenswrapper[4760]: I0930 07:37:56.053416 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="a4ec1e48-4ef3-4cfc-9eb2-cd3a0d47e6dc" containerName="extract-content" Sep 30 07:37:56 crc kubenswrapper[4760]: E0930 07:37:56.053478 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a4ec1e48-4ef3-4cfc-9eb2-cd3a0d47e6dc" containerName="registry-server" Sep 30 07:37:56 crc kubenswrapper[4760]: I0930 07:37:56.053543 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="a4ec1e48-4ef3-4cfc-9eb2-cd3a0d47e6dc" containerName="registry-server" Sep 30 07:37:56 crc kubenswrapper[4760]: E0930 07:37:56.053622 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ffeadfaf-9c4c-4dce-99d5-36f0e42571d7" containerName="registry-server" Sep 30 07:37:56 crc kubenswrapper[4760]: I0930 07:37:56.053692 4760 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="ffeadfaf-9c4c-4dce-99d5-36f0e42571d7" containerName="registry-server" Sep 30 07:37:56 crc kubenswrapper[4760]: E0930 07:37:56.053771 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ffeadfaf-9c4c-4dce-99d5-36f0e42571d7" containerName="extract-content" Sep 30 07:37:56 crc kubenswrapper[4760]: I0930 07:37:56.053839 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="ffeadfaf-9c4c-4dce-99d5-36f0e42571d7" containerName="extract-content" Sep 30 07:37:56 crc kubenswrapper[4760]: E0930 07:37:56.053917 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="97af4546-9849-4500-a18e-994ec8158af0" containerName="extract-content" Sep 30 07:37:56 crc kubenswrapper[4760]: I0930 07:37:56.053990 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="97af4546-9849-4500-a18e-994ec8158af0" containerName="extract-content" Sep 30 07:37:56 crc kubenswrapper[4760]: E0930 07:37:56.054068 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a4ec1e48-4ef3-4cfc-9eb2-cd3a0d47e6dc" containerName="extract-utilities" Sep 30 07:37:56 crc kubenswrapper[4760]: I0930 07:37:56.054149 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="a4ec1e48-4ef3-4cfc-9eb2-cd3a0d47e6dc" containerName="extract-utilities" Sep 30 07:37:56 crc kubenswrapper[4760]: E0930 07:37:56.054223 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="97af4546-9849-4500-a18e-994ec8158af0" containerName="extract-utilities" Sep 30 07:37:56 crc kubenswrapper[4760]: I0930 07:37:56.054285 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="97af4546-9849-4500-a18e-994ec8158af0" containerName="extract-utilities" Sep 30 07:37:56 crc kubenswrapper[4760]: E0930 07:37:56.054376 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="36f4e442-c66f-4b6b-b6b3-98c8fd676ba7" containerName="registry-server" Sep 30 07:37:56 crc kubenswrapper[4760]: I0930 07:37:56.054445 4760 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="36f4e442-c66f-4b6b-b6b3-98c8fd676ba7" containerName="registry-server" Sep 30 07:37:56 crc kubenswrapper[4760]: E0930 07:37:56.054516 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cc4f6325-5a2a-4f15-8ee9-860617a9d7ce" containerName="marketplace-operator" Sep 30 07:37:56 crc kubenswrapper[4760]: I0930 07:37:56.054653 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="cc4f6325-5a2a-4f15-8ee9-860617a9d7ce" containerName="marketplace-operator" Sep 30 07:37:56 crc kubenswrapper[4760]: I0930 07:37:56.054855 4760 memory_manager.go:354] "RemoveStaleState removing state" podUID="36f4e442-c66f-4b6b-b6b3-98c8fd676ba7" containerName="registry-server" Sep 30 07:37:56 crc kubenswrapper[4760]: I0930 07:37:56.054937 4760 memory_manager.go:354] "RemoveStaleState removing state" podUID="97af4546-9849-4500-a18e-994ec8158af0" containerName="registry-server" Sep 30 07:37:56 crc kubenswrapper[4760]: I0930 07:37:56.055005 4760 memory_manager.go:354] "RemoveStaleState removing state" podUID="a4ec1e48-4ef3-4cfc-9eb2-cd3a0d47e6dc" containerName="registry-server" Sep 30 07:37:56 crc kubenswrapper[4760]: I0930 07:37:56.055076 4760 memory_manager.go:354] "RemoveStaleState removing state" podUID="ffeadfaf-9c4c-4dce-99d5-36f0e42571d7" containerName="registry-server" Sep 30 07:37:56 crc kubenswrapper[4760]: I0930 07:37:56.055152 4760 memory_manager.go:354] "RemoveStaleState removing state" podUID="cc4f6325-5a2a-4f15-8ee9-860617a9d7ce" containerName="marketplace-operator" Sep 30 07:37:56 crc kubenswrapper[4760]: I0930 07:37:56.056089 4760 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-4k7l5" Sep 30 07:37:56 crc kubenswrapper[4760]: I0930 07:37:56.058543 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Sep 30 07:37:56 crc kubenswrapper[4760]: I0930 07:37:56.064550 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-4k7l5"] Sep 30 07:37:56 crc kubenswrapper[4760]: I0930 07:37:56.241002 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ck68d\" (UniqueName: \"kubernetes.io/projected/9b11c132-f36f-49dd-af15-c0d78004c669-kube-api-access-ck68d\") pod \"redhat-marketplace-4k7l5\" (UID: \"9b11c132-f36f-49dd-af15-c0d78004c669\") " pod="openshift-marketplace/redhat-marketplace-4k7l5" Sep 30 07:37:56 crc kubenswrapper[4760]: I0930 07:37:56.241059 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9b11c132-f36f-49dd-af15-c0d78004c669-utilities\") pod \"redhat-marketplace-4k7l5\" (UID: \"9b11c132-f36f-49dd-af15-c0d78004c669\") " pod="openshift-marketplace/redhat-marketplace-4k7l5" Sep 30 07:37:56 crc kubenswrapper[4760]: I0930 07:37:56.241110 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9b11c132-f36f-49dd-af15-c0d78004c669-catalog-content\") pod \"redhat-marketplace-4k7l5\" (UID: \"9b11c132-f36f-49dd-af15-c0d78004c669\") " pod="openshift-marketplace/redhat-marketplace-4k7l5" Sep 30 07:37:56 crc kubenswrapper[4760]: I0930 07:37:56.255655 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-cgrmz"] Sep 30 07:37:56 crc kubenswrapper[4760]: I0930 07:37:56.256952 4760 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-cgrmz" Sep 30 07:37:56 crc kubenswrapper[4760]: I0930 07:37:56.260760 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Sep 30 07:37:56 crc kubenswrapper[4760]: I0930 07:37:56.266411 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-cgrmz"] Sep 30 07:37:56 crc kubenswrapper[4760]: I0930 07:37:56.342431 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ck68d\" (UniqueName: \"kubernetes.io/projected/9b11c132-f36f-49dd-af15-c0d78004c669-kube-api-access-ck68d\") pod \"redhat-marketplace-4k7l5\" (UID: \"9b11c132-f36f-49dd-af15-c0d78004c669\") " pod="openshift-marketplace/redhat-marketplace-4k7l5" Sep 30 07:37:56 crc kubenswrapper[4760]: I0930 07:37:56.342524 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9b11c132-f36f-49dd-af15-c0d78004c669-utilities\") pod \"redhat-marketplace-4k7l5\" (UID: \"9b11c132-f36f-49dd-af15-c0d78004c669\") " pod="openshift-marketplace/redhat-marketplace-4k7l5" Sep 30 07:37:56 crc kubenswrapper[4760]: I0930 07:37:56.342571 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9b11c132-f36f-49dd-af15-c0d78004c669-catalog-content\") pod \"redhat-marketplace-4k7l5\" (UID: \"9b11c132-f36f-49dd-af15-c0d78004c669\") " pod="openshift-marketplace/redhat-marketplace-4k7l5" Sep 30 07:37:56 crc kubenswrapper[4760]: I0930 07:37:56.343030 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9b11c132-f36f-49dd-af15-c0d78004c669-utilities\") pod \"redhat-marketplace-4k7l5\" (UID: \"9b11c132-f36f-49dd-af15-c0d78004c669\") " pod="openshift-marketplace/redhat-marketplace-4k7l5" Sep 
30 07:37:56 crc kubenswrapper[4760]: I0930 07:37:56.343175 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9b11c132-f36f-49dd-af15-c0d78004c669-catalog-content\") pod \"redhat-marketplace-4k7l5\" (UID: \"9b11c132-f36f-49dd-af15-c0d78004c669\") " pod="openshift-marketplace/redhat-marketplace-4k7l5" Sep 30 07:37:56 crc kubenswrapper[4760]: I0930 07:37:56.373869 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ck68d\" (UniqueName: \"kubernetes.io/projected/9b11c132-f36f-49dd-af15-c0d78004c669-kube-api-access-ck68d\") pod \"redhat-marketplace-4k7l5\" (UID: \"9b11c132-f36f-49dd-af15-c0d78004c669\") " pod="openshift-marketplace/redhat-marketplace-4k7l5" Sep 30 07:37:56 crc kubenswrapper[4760]: I0930 07:37:56.388616 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-4k7l5" Sep 30 07:37:56 crc kubenswrapper[4760]: I0930 07:37:56.443518 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2af6d8e2-56f6-47ec-9ee3-fe00eeb022df-utilities\") pod \"redhat-operators-cgrmz\" (UID: \"2af6d8e2-56f6-47ec-9ee3-fe00eeb022df\") " pod="openshift-marketplace/redhat-operators-cgrmz" Sep 30 07:37:56 crc kubenswrapper[4760]: I0930 07:37:56.443578 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4gr5q\" (UniqueName: \"kubernetes.io/projected/2af6d8e2-56f6-47ec-9ee3-fe00eeb022df-kube-api-access-4gr5q\") pod \"redhat-operators-cgrmz\" (UID: \"2af6d8e2-56f6-47ec-9ee3-fe00eeb022df\") " pod="openshift-marketplace/redhat-operators-cgrmz" Sep 30 07:37:56 crc kubenswrapper[4760]: I0930 07:37:56.443665 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/2af6d8e2-56f6-47ec-9ee3-fe00eeb022df-catalog-content\") pod \"redhat-operators-cgrmz\" (UID: \"2af6d8e2-56f6-47ec-9ee3-fe00eeb022df\") " pod="openshift-marketplace/redhat-operators-cgrmz" Sep 30 07:37:56 crc kubenswrapper[4760]: I0930 07:37:56.545259 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2af6d8e2-56f6-47ec-9ee3-fe00eeb022df-catalog-content\") pod \"redhat-operators-cgrmz\" (UID: \"2af6d8e2-56f6-47ec-9ee3-fe00eeb022df\") " pod="openshift-marketplace/redhat-operators-cgrmz" Sep 30 07:37:56 crc kubenswrapper[4760]: I0930 07:37:56.545381 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2af6d8e2-56f6-47ec-9ee3-fe00eeb022df-utilities\") pod \"redhat-operators-cgrmz\" (UID: \"2af6d8e2-56f6-47ec-9ee3-fe00eeb022df\") " pod="openshift-marketplace/redhat-operators-cgrmz" Sep 30 07:37:56 crc kubenswrapper[4760]: I0930 07:37:56.545409 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4gr5q\" (UniqueName: \"kubernetes.io/projected/2af6d8e2-56f6-47ec-9ee3-fe00eeb022df-kube-api-access-4gr5q\") pod \"redhat-operators-cgrmz\" (UID: \"2af6d8e2-56f6-47ec-9ee3-fe00eeb022df\") " pod="openshift-marketplace/redhat-operators-cgrmz" Sep 30 07:37:56 crc kubenswrapper[4760]: I0930 07:37:56.545954 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2af6d8e2-56f6-47ec-9ee3-fe00eeb022df-utilities\") pod \"redhat-operators-cgrmz\" (UID: \"2af6d8e2-56f6-47ec-9ee3-fe00eeb022df\") " pod="openshift-marketplace/redhat-operators-cgrmz" Sep 30 07:37:56 crc kubenswrapper[4760]: I0930 07:37:56.546230 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/2af6d8e2-56f6-47ec-9ee3-fe00eeb022df-catalog-content\") pod \"redhat-operators-cgrmz\" (UID: \"2af6d8e2-56f6-47ec-9ee3-fe00eeb022df\") " pod="openshift-marketplace/redhat-operators-cgrmz" Sep 30 07:37:56 crc kubenswrapper[4760]: I0930 07:37:56.563408 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4gr5q\" (UniqueName: \"kubernetes.io/projected/2af6d8e2-56f6-47ec-9ee3-fe00eeb022df-kube-api-access-4gr5q\") pod \"redhat-operators-cgrmz\" (UID: \"2af6d8e2-56f6-47ec-9ee3-fe00eeb022df\") " pod="openshift-marketplace/redhat-operators-cgrmz" Sep 30 07:37:56 crc kubenswrapper[4760]: I0930 07:37:56.577952 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-cgrmz" Sep 30 07:37:56 crc kubenswrapper[4760]: I0930 07:37:56.593543 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-4k7l5"] Sep 30 07:37:56 crc kubenswrapper[4760]: W0930 07:37:56.601003 4760 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9b11c132_f36f_49dd_af15_c0d78004c669.slice/crio-713faa583f05b775f82630fde840dd27d692ca60d182ad3f12ee1c711f8a7676 WatchSource:0}: Error finding container 713faa583f05b775f82630fde840dd27d692ca60d182ad3f12ee1c711f8a7676: Status 404 returned error can't find the container with id 713faa583f05b775f82630fde840dd27d692ca60d182ad3f12ee1c711f8a7676 Sep 30 07:37:56 crc kubenswrapper[4760]: I0930 07:37:56.813740 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-cgrmz"] Sep 30 07:37:56 crc kubenswrapper[4760]: W0930 07:37:56.858518 4760 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2af6d8e2_56f6_47ec_9ee3_fe00eeb022df.slice/crio-8318aedf958a89dac028d109857752c2901b956dbf8e013922e0678652211a3c 
WatchSource:0}: Error finding container 8318aedf958a89dac028d109857752c2901b956dbf8e013922e0678652211a3c: Status 404 returned error can't find the container with id 8318aedf958a89dac028d109857752c2901b956dbf8e013922e0678652211a3c Sep 30 07:37:57 crc kubenswrapper[4760]: I0930 07:37:57.074847 4760 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="36f4e442-c66f-4b6b-b6b3-98c8fd676ba7" path="/var/lib/kubelet/pods/36f4e442-c66f-4b6b-b6b3-98c8fd676ba7/volumes" Sep 30 07:37:57 crc kubenswrapper[4760]: I0930 07:37:57.075959 4760 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="97af4546-9849-4500-a18e-994ec8158af0" path="/var/lib/kubelet/pods/97af4546-9849-4500-a18e-994ec8158af0/volumes" Sep 30 07:37:57 crc kubenswrapper[4760]: I0930 07:37:57.076954 4760 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a4ec1e48-4ef3-4cfc-9eb2-cd3a0d47e6dc" path="/var/lib/kubelet/pods/a4ec1e48-4ef3-4cfc-9eb2-cd3a0d47e6dc/volumes" Sep 30 07:37:57 crc kubenswrapper[4760]: I0930 07:37:57.078632 4760 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cc4f6325-5a2a-4f15-8ee9-860617a9d7ce" path="/var/lib/kubelet/pods/cc4f6325-5a2a-4f15-8ee9-860617a9d7ce/volumes" Sep 30 07:37:57 crc kubenswrapper[4760]: I0930 07:37:57.079226 4760 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ffeadfaf-9c4c-4dce-99d5-36f0e42571d7" path="/var/lib/kubelet/pods/ffeadfaf-9c4c-4dce-99d5-36f0e42571d7/volumes" Sep 30 07:37:57 crc kubenswrapper[4760]: I0930 07:37:57.214682 4760 generic.go:334] "Generic (PLEG): container finished" podID="2af6d8e2-56f6-47ec-9ee3-fe00eeb022df" containerID="29a0812669f375c2db99afff03027c85b88c05aa1042ba5f829e696f3f9475e1" exitCode=0 Sep 30 07:37:57 crc kubenswrapper[4760]: I0930 07:37:57.214780 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cgrmz" 
event={"ID":"2af6d8e2-56f6-47ec-9ee3-fe00eeb022df","Type":"ContainerDied","Data":"29a0812669f375c2db99afff03027c85b88c05aa1042ba5f829e696f3f9475e1"} Sep 30 07:37:57 crc kubenswrapper[4760]: I0930 07:37:57.215246 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cgrmz" event={"ID":"2af6d8e2-56f6-47ec-9ee3-fe00eeb022df","Type":"ContainerStarted","Data":"8318aedf958a89dac028d109857752c2901b956dbf8e013922e0678652211a3c"} Sep 30 07:37:57 crc kubenswrapper[4760]: I0930 07:37:57.217021 4760 generic.go:334] "Generic (PLEG): container finished" podID="9b11c132-f36f-49dd-af15-c0d78004c669" containerID="3873724cf54173fe01f2154d002a7003808b176855be178d4b94f72a69f0a5d9" exitCode=0 Sep 30 07:37:57 crc kubenswrapper[4760]: I0930 07:37:57.217645 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4k7l5" event={"ID":"9b11c132-f36f-49dd-af15-c0d78004c669","Type":"ContainerDied","Data":"3873724cf54173fe01f2154d002a7003808b176855be178d4b94f72a69f0a5d9"} Sep 30 07:37:57 crc kubenswrapper[4760]: I0930 07:37:57.217681 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4k7l5" event={"ID":"9b11c132-f36f-49dd-af15-c0d78004c669","Type":"ContainerStarted","Data":"713faa583f05b775f82630fde840dd27d692ca60d182ad3f12ee1c711f8a7676"} Sep 30 07:37:58 crc kubenswrapper[4760]: I0930 07:37:58.226696 4760 generic.go:334] "Generic (PLEG): container finished" podID="9b11c132-f36f-49dd-af15-c0d78004c669" containerID="38fccde862523f86fa927cc7cd49c9513993dee900fc7787e8b67cc8ac4a375b" exitCode=0 Sep 30 07:37:58 crc kubenswrapper[4760]: I0930 07:37:58.227006 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4k7l5" event={"ID":"9b11c132-f36f-49dd-af15-c0d78004c669","Type":"ContainerDied","Data":"38fccde862523f86fa927cc7cd49c9513993dee900fc7787e8b67cc8ac4a375b"} Sep 30 07:37:58 crc kubenswrapper[4760]: I0930 
07:37:58.456007 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-nkbp8"] Sep 30 07:37:58 crc kubenswrapper[4760]: I0930 07:37:58.458483 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-nkbp8" Sep 30 07:37:58 crc kubenswrapper[4760]: I0930 07:37:58.460872 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Sep 30 07:37:58 crc kubenswrapper[4760]: I0930 07:37:58.463783 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-nkbp8"] Sep 30 07:37:58 crc kubenswrapper[4760]: I0930 07:37:58.569379 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3f9a39d0-3377-49d8-b54f-6cfac198199f-utilities\") pod \"certified-operators-nkbp8\" (UID: \"3f9a39d0-3377-49d8-b54f-6cfac198199f\") " pod="openshift-marketplace/certified-operators-nkbp8" Sep 30 07:37:58 crc kubenswrapper[4760]: I0930 07:37:58.569476 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tddq2\" (UniqueName: \"kubernetes.io/projected/3f9a39d0-3377-49d8-b54f-6cfac198199f-kube-api-access-tddq2\") pod \"certified-operators-nkbp8\" (UID: \"3f9a39d0-3377-49d8-b54f-6cfac198199f\") " pod="openshift-marketplace/certified-operators-nkbp8" Sep 30 07:37:58 crc kubenswrapper[4760]: I0930 07:37:58.569535 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3f9a39d0-3377-49d8-b54f-6cfac198199f-catalog-content\") pod \"certified-operators-nkbp8\" (UID: \"3f9a39d0-3377-49d8-b54f-6cfac198199f\") " pod="openshift-marketplace/certified-operators-nkbp8" Sep 30 07:37:58 crc kubenswrapper[4760]: I0930 07:37:58.654561 4760 
kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-59dbp"] Sep 30 07:37:58 crc kubenswrapper[4760]: I0930 07:37:58.658626 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-59dbp" Sep 30 07:37:58 crc kubenswrapper[4760]: I0930 07:37:58.661402 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Sep 30 07:37:58 crc kubenswrapper[4760]: I0930 07:37:58.666293 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-59dbp"] Sep 30 07:37:58 crc kubenswrapper[4760]: I0930 07:37:58.670125 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tddq2\" (UniqueName: \"kubernetes.io/projected/3f9a39d0-3377-49d8-b54f-6cfac198199f-kube-api-access-tddq2\") pod \"certified-operators-nkbp8\" (UID: \"3f9a39d0-3377-49d8-b54f-6cfac198199f\") " pod="openshift-marketplace/certified-operators-nkbp8" Sep 30 07:37:58 crc kubenswrapper[4760]: I0930 07:37:58.670185 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3f9a39d0-3377-49d8-b54f-6cfac198199f-catalog-content\") pod \"certified-operators-nkbp8\" (UID: \"3f9a39d0-3377-49d8-b54f-6cfac198199f\") " pod="openshift-marketplace/certified-operators-nkbp8" Sep 30 07:37:58 crc kubenswrapper[4760]: I0930 07:37:58.670232 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3f9a39d0-3377-49d8-b54f-6cfac198199f-utilities\") pod \"certified-operators-nkbp8\" (UID: \"3f9a39d0-3377-49d8-b54f-6cfac198199f\") " pod="openshift-marketplace/certified-operators-nkbp8" Sep 30 07:37:58 crc kubenswrapper[4760]: I0930 07:37:58.670653 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3f9a39d0-3377-49d8-b54f-6cfac198199f-utilities\") pod \"certified-operators-nkbp8\" (UID: \"3f9a39d0-3377-49d8-b54f-6cfac198199f\") " pod="openshift-marketplace/certified-operators-nkbp8" Sep 30 07:37:58 crc kubenswrapper[4760]: I0930 07:37:58.670950 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3f9a39d0-3377-49d8-b54f-6cfac198199f-catalog-content\") pod \"certified-operators-nkbp8\" (UID: \"3f9a39d0-3377-49d8-b54f-6cfac198199f\") " pod="openshift-marketplace/certified-operators-nkbp8" Sep 30 07:37:58 crc kubenswrapper[4760]: I0930 07:37:58.702202 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tddq2\" (UniqueName: \"kubernetes.io/projected/3f9a39d0-3377-49d8-b54f-6cfac198199f-kube-api-access-tddq2\") pod \"certified-operators-nkbp8\" (UID: \"3f9a39d0-3377-49d8-b54f-6cfac198199f\") " pod="openshift-marketplace/certified-operators-nkbp8" Sep 30 07:37:58 crc kubenswrapper[4760]: I0930 07:37:58.771160 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qh2gl\" (UniqueName: \"kubernetes.io/projected/313bc92e-49cd-492e-939a-af5547c47e72-kube-api-access-qh2gl\") pod \"community-operators-59dbp\" (UID: \"313bc92e-49cd-492e-939a-af5547c47e72\") " pod="openshift-marketplace/community-operators-59dbp" Sep 30 07:37:58 crc kubenswrapper[4760]: I0930 07:37:58.771423 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/313bc92e-49cd-492e-939a-af5547c47e72-utilities\") pod \"community-operators-59dbp\" (UID: \"313bc92e-49cd-492e-939a-af5547c47e72\") " pod="openshift-marketplace/community-operators-59dbp" Sep 30 07:37:58 crc kubenswrapper[4760]: I0930 07:37:58.771491 4760 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/313bc92e-49cd-492e-939a-af5547c47e72-catalog-content\") pod \"community-operators-59dbp\" (UID: \"313bc92e-49cd-492e-939a-af5547c47e72\") " pod="openshift-marketplace/community-operators-59dbp" Sep 30 07:37:58 crc kubenswrapper[4760]: I0930 07:37:58.775379 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-nkbp8" Sep 30 07:37:58 crc kubenswrapper[4760]: I0930 07:37:58.873032 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qh2gl\" (UniqueName: \"kubernetes.io/projected/313bc92e-49cd-492e-939a-af5547c47e72-kube-api-access-qh2gl\") pod \"community-operators-59dbp\" (UID: \"313bc92e-49cd-492e-939a-af5547c47e72\") " pod="openshift-marketplace/community-operators-59dbp" Sep 30 07:37:58 crc kubenswrapper[4760]: I0930 07:37:58.873169 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/313bc92e-49cd-492e-939a-af5547c47e72-utilities\") pod \"community-operators-59dbp\" (UID: \"313bc92e-49cd-492e-939a-af5547c47e72\") " pod="openshift-marketplace/community-operators-59dbp" Sep 30 07:37:58 crc kubenswrapper[4760]: I0930 07:37:58.873215 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/313bc92e-49cd-492e-939a-af5547c47e72-catalog-content\") pod \"community-operators-59dbp\" (UID: \"313bc92e-49cd-492e-939a-af5547c47e72\") " pod="openshift-marketplace/community-operators-59dbp" Sep 30 07:37:58 crc kubenswrapper[4760]: I0930 07:37:58.874202 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/313bc92e-49cd-492e-939a-af5547c47e72-catalog-content\") pod \"community-operators-59dbp\" (UID: 
\"313bc92e-49cd-492e-939a-af5547c47e72\") " pod="openshift-marketplace/community-operators-59dbp" Sep 30 07:37:58 crc kubenswrapper[4760]: I0930 07:37:58.874681 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/313bc92e-49cd-492e-939a-af5547c47e72-utilities\") pod \"community-operators-59dbp\" (UID: \"313bc92e-49cd-492e-939a-af5547c47e72\") " pod="openshift-marketplace/community-operators-59dbp" Sep 30 07:37:58 crc kubenswrapper[4760]: I0930 07:37:58.896596 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qh2gl\" (UniqueName: \"kubernetes.io/projected/313bc92e-49cd-492e-939a-af5547c47e72-kube-api-access-qh2gl\") pod \"community-operators-59dbp\" (UID: \"313bc92e-49cd-492e-939a-af5547c47e72\") " pod="openshift-marketplace/community-operators-59dbp" Sep 30 07:37:59 crc kubenswrapper[4760]: I0930 07:37:59.043227 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-59dbp" Sep 30 07:37:59 crc kubenswrapper[4760]: I0930 07:37:59.260667 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4k7l5" event={"ID":"9b11c132-f36f-49dd-af15-c0d78004c669","Type":"ContainerStarted","Data":"d5e7cf418b2a903b0c511818ca835c394b8210dc849cbded4e24e65f0bba4b4a"} Sep 30 07:37:59 crc kubenswrapper[4760]: I0930 07:37:59.263590 4760 generic.go:334] "Generic (PLEG): container finished" podID="2af6d8e2-56f6-47ec-9ee3-fe00eeb022df" containerID="591c2b67bc37bc99a75d672749e39aa0d92bab52d997143c9e18ccae07c5f4cc" exitCode=0 Sep 30 07:37:59 crc kubenswrapper[4760]: I0930 07:37:59.263660 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cgrmz" event={"ID":"2af6d8e2-56f6-47ec-9ee3-fe00eeb022df","Type":"ContainerDied","Data":"591c2b67bc37bc99a75d672749e39aa0d92bab52d997143c9e18ccae07c5f4cc"} Sep 30 07:37:59 crc 
kubenswrapper[4760]: I0930 07:37:59.273664 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-nkbp8"] Sep 30 07:37:59 crc kubenswrapper[4760]: I0930 07:37:59.286277 4760 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-4k7l5" podStartSLOduration=1.70384141 podStartE2EDuration="3.286262216s" podCreationTimestamp="2025-09-30 07:37:56 +0000 UTC" firstStartedPulling="2025-09-30 07:37:57.221655204 +0000 UTC m=+262.864561616" lastFinishedPulling="2025-09-30 07:37:58.80407601 +0000 UTC m=+264.446982422" observedRunningTime="2025-09-30 07:37:59.284141132 +0000 UTC m=+264.927047544" watchObservedRunningTime="2025-09-30 07:37:59.286262216 +0000 UTC m=+264.929168628" Sep 30 07:37:59 crc kubenswrapper[4760]: I0930 07:37:59.429907 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-59dbp"] Sep 30 07:37:59 crc kubenswrapper[4760]: W0930 07:37:59.438905 4760 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod313bc92e_49cd_492e_939a_af5547c47e72.slice/crio-4af6057e7b37ce4966212a35993fab1fbb33388acbb3e536ed3623a0636c6354 WatchSource:0}: Error finding container 4af6057e7b37ce4966212a35993fab1fbb33388acbb3e536ed3623a0636c6354: Status 404 returned error can't find the container with id 4af6057e7b37ce4966212a35993fab1fbb33388acbb3e536ed3623a0636c6354 Sep 30 07:38:00 crc kubenswrapper[4760]: I0930 07:38:00.271431 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cgrmz" event={"ID":"2af6d8e2-56f6-47ec-9ee3-fe00eeb022df","Type":"ContainerStarted","Data":"9b222ed2e2b1d8cfd1367adce834c59a588acaccb02e0ea0ed88341371a25f26"} Sep 30 07:38:00 crc kubenswrapper[4760]: I0930 07:38:00.274267 4760 generic.go:334] "Generic (PLEG): container finished" podID="313bc92e-49cd-492e-939a-af5547c47e72" 
containerID="8b41c3bb402c4574beb6208209dd8060dc00323c8216eaf9f9a62f90d834b9f0" exitCode=0 Sep 30 07:38:00 crc kubenswrapper[4760]: I0930 07:38:00.274364 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-59dbp" event={"ID":"313bc92e-49cd-492e-939a-af5547c47e72","Type":"ContainerDied","Data":"8b41c3bb402c4574beb6208209dd8060dc00323c8216eaf9f9a62f90d834b9f0"} Sep 30 07:38:00 crc kubenswrapper[4760]: I0930 07:38:00.274449 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-59dbp" event={"ID":"313bc92e-49cd-492e-939a-af5547c47e72","Type":"ContainerStarted","Data":"4af6057e7b37ce4966212a35993fab1fbb33388acbb3e536ed3623a0636c6354"} Sep 30 07:38:00 crc kubenswrapper[4760]: I0930 07:38:00.277125 4760 generic.go:334] "Generic (PLEG): container finished" podID="3f9a39d0-3377-49d8-b54f-6cfac198199f" containerID="810f83df65b3d1f8c7ab50ee5c2c91c41a2dd26015d3dda281c2a2bba5c3e807" exitCode=0 Sep 30 07:38:00 crc kubenswrapper[4760]: I0930 07:38:00.277187 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-nkbp8" event={"ID":"3f9a39d0-3377-49d8-b54f-6cfac198199f","Type":"ContainerDied","Data":"810f83df65b3d1f8c7ab50ee5c2c91c41a2dd26015d3dda281c2a2bba5c3e807"} Sep 30 07:38:00 crc kubenswrapper[4760]: I0930 07:38:00.277216 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-nkbp8" event={"ID":"3f9a39d0-3377-49d8-b54f-6cfac198199f","Type":"ContainerStarted","Data":"e1345f92d23833f98cf4908c83b3e891314d6014ec418404408c2130895af4a4"} Sep 30 07:38:00 crc kubenswrapper[4760]: I0930 07:38:00.289017 4760 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-cgrmz" podStartSLOduration=1.7806509240000001 podStartE2EDuration="4.289001103s" podCreationTimestamp="2025-09-30 07:37:56 +0000 UTC" firstStartedPulling="2025-09-30 07:37:57.21641284 
+0000 UTC m=+262.859319252" lastFinishedPulling="2025-09-30 07:37:59.724763019 +0000 UTC m=+265.367669431" observedRunningTime="2025-09-30 07:38:00.287787422 +0000 UTC m=+265.930693864" watchObservedRunningTime="2025-09-30 07:38:00.289001103 +0000 UTC m=+265.931907515" Sep 30 07:38:01 crc kubenswrapper[4760]: I0930 07:38:01.286876 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-nkbp8" event={"ID":"3f9a39d0-3377-49d8-b54f-6cfac198199f","Type":"ContainerStarted","Data":"f5d2faabfb8f5644ed846db594a4b7c22de5057dbdf5ddd5f4f8cc1f6092f886"} Sep 30 07:38:02 crc kubenswrapper[4760]: I0930 07:38:02.292831 4760 generic.go:334] "Generic (PLEG): container finished" podID="3f9a39d0-3377-49d8-b54f-6cfac198199f" containerID="f5d2faabfb8f5644ed846db594a4b7c22de5057dbdf5ddd5f4f8cc1f6092f886" exitCode=0 Sep 30 07:38:02 crc kubenswrapper[4760]: I0930 07:38:02.292901 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-nkbp8" event={"ID":"3f9a39d0-3377-49d8-b54f-6cfac198199f","Type":"ContainerDied","Data":"f5d2faabfb8f5644ed846db594a4b7c22de5057dbdf5ddd5f4f8cc1f6092f886"} Sep 30 07:38:03 crc kubenswrapper[4760]: I0930 07:38:03.300520 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-nkbp8" event={"ID":"3f9a39d0-3377-49d8-b54f-6cfac198199f","Type":"ContainerStarted","Data":"d3815d9c73fbcef3fcfc1e583d2bf3a96294ac3b9a742f8125168a26dc6a7f95"} Sep 30 07:38:03 crc kubenswrapper[4760]: I0930 07:38:03.302068 4760 generic.go:334] "Generic (PLEG): container finished" podID="313bc92e-49cd-492e-939a-af5547c47e72" containerID="a4a0a5a865e904ced8f2ccdaed35358509fb83096ba4b2d529fe9267361ca84c" exitCode=0 Sep 30 07:38:03 crc kubenswrapper[4760]: I0930 07:38:03.302132 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-59dbp" 
event={"ID":"313bc92e-49cd-492e-939a-af5547c47e72","Type":"ContainerDied","Data":"a4a0a5a865e904ced8f2ccdaed35358509fb83096ba4b2d529fe9267361ca84c"} Sep 30 07:38:03 crc kubenswrapper[4760]: I0930 07:38:03.318086 4760 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-nkbp8" podStartSLOduration=2.924231496 podStartE2EDuration="5.318069419s" podCreationTimestamp="2025-09-30 07:37:58 +0000 UTC" firstStartedPulling="2025-09-30 07:38:00.278294121 +0000 UTC m=+265.921200563" lastFinishedPulling="2025-09-30 07:38:02.672132034 +0000 UTC m=+268.315038486" observedRunningTime="2025-09-30 07:38:03.316379436 +0000 UTC m=+268.959285858" watchObservedRunningTime="2025-09-30 07:38:03.318069419 +0000 UTC m=+268.960975841" Sep 30 07:38:04 crc kubenswrapper[4760]: I0930 07:38:04.308873 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-59dbp" event={"ID":"313bc92e-49cd-492e-939a-af5547c47e72","Type":"ContainerStarted","Data":"ee46946f94dd863d11ab2ba0ff9e647a4b7c63ae2415f7da4650bce5840d72d5"} Sep 30 07:38:04 crc kubenswrapper[4760]: I0930 07:38:04.324743 4760 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-59dbp" podStartSLOduration=2.937702838 podStartE2EDuration="6.324721016s" podCreationTimestamp="2025-09-30 07:37:58 +0000 UTC" firstStartedPulling="2025-09-30 07:38:00.275313555 +0000 UTC m=+265.918219967" lastFinishedPulling="2025-09-30 07:38:03.662331733 +0000 UTC m=+269.305238145" observedRunningTime="2025-09-30 07:38:04.323644549 +0000 UTC m=+269.966551001" watchObservedRunningTime="2025-09-30 07:38:04.324721016 +0000 UTC m=+269.967627428" Sep 30 07:38:06 crc kubenswrapper[4760]: I0930 07:38:06.389358 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-4k7l5" Sep 30 07:38:06 crc kubenswrapper[4760]: I0930 07:38:06.389423 4760 
kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-4k7l5" Sep 30 07:38:06 crc kubenswrapper[4760]: I0930 07:38:06.438164 4760 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-4k7l5" Sep 30 07:38:06 crc kubenswrapper[4760]: I0930 07:38:06.578792 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-cgrmz" Sep 30 07:38:06 crc kubenswrapper[4760]: I0930 07:38:06.579001 4760 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-cgrmz" Sep 30 07:38:06 crc kubenswrapper[4760]: I0930 07:38:06.644166 4760 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-cgrmz" Sep 30 07:38:07 crc kubenswrapper[4760]: I0930 07:38:07.359946 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-cgrmz" Sep 30 07:38:07 crc kubenswrapper[4760]: I0930 07:38:07.361476 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-4k7l5" Sep 30 07:38:08 crc kubenswrapper[4760]: I0930 07:38:08.775952 4760 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-nkbp8" Sep 30 07:38:08 crc kubenswrapper[4760]: I0930 07:38:08.776361 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-nkbp8" Sep 30 07:38:08 crc kubenswrapper[4760]: I0930 07:38:08.847095 4760 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-nkbp8" Sep 30 07:38:09 crc kubenswrapper[4760]: I0930 07:38:09.044085 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-marketplace/community-operators-59dbp" Sep 30 07:38:09 crc kubenswrapper[4760]: I0930 07:38:09.044125 4760 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-59dbp" Sep 30 07:38:09 crc kubenswrapper[4760]: I0930 07:38:09.101892 4760 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-59dbp" Sep 30 07:38:09 crc kubenswrapper[4760]: I0930 07:38:09.387771 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-nkbp8" Sep 30 07:38:09 crc kubenswrapper[4760]: I0930 07:38:09.388222 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-59dbp" Sep 30 07:39:19 crc kubenswrapper[4760]: I0930 07:39:19.113258 4760 patch_prober.go:28] interesting pod/machine-config-daemon-f2lrk container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 07:39:19 crc kubenswrapper[4760]: I0930 07:39:19.115618 4760 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-f2lrk" podUID="7a9c8270-6964-4886-87d0-227b05b76da4" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 30 07:39:49 crc kubenswrapper[4760]: I0930 07:39:49.112601 4760 patch_prober.go:28] interesting pod/machine-config-daemon-f2lrk container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 07:39:49 crc kubenswrapper[4760]: I0930 07:39:49.113339 4760 
prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-f2lrk" podUID="7a9c8270-6964-4886-87d0-227b05b76da4" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 30 07:40:11 crc kubenswrapper[4760]: I0930 07:40:11.443718 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-24r58"] Sep 30 07:40:11 crc kubenswrapper[4760]: I0930 07:40:11.444954 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-24r58" Sep 30 07:40:11 crc kubenswrapper[4760]: I0930 07:40:11.464046 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-24r58"] Sep 30 07:40:11 crc kubenswrapper[4760]: I0930 07:40:11.516582 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/87fd85db-fe31-40c0-8893-180659b5973a-registry-tls\") pod \"image-registry-66df7c8f76-24r58\" (UID: \"87fd85db-fe31-40c0-8893-180659b5973a\") " pod="openshift-image-registry/image-registry-66df7c8f76-24r58" Sep 30 07:40:11 crc kubenswrapper[4760]: I0930 07:40:11.516647 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-24r58\" (UID: \"87fd85db-fe31-40c0-8893-180659b5973a\") " pod="openshift-image-registry/image-registry-66df7c8f76-24r58" Sep 30 07:40:11 crc kubenswrapper[4760]: I0930 07:40:11.516671 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: 
\"kubernetes.io/projected/87fd85db-fe31-40c0-8893-180659b5973a-bound-sa-token\") pod \"image-registry-66df7c8f76-24r58\" (UID: \"87fd85db-fe31-40c0-8893-180659b5973a\") " pod="openshift-image-registry/image-registry-66df7c8f76-24r58" Sep 30 07:40:11 crc kubenswrapper[4760]: I0930 07:40:11.516694 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lcb9s\" (UniqueName: \"kubernetes.io/projected/87fd85db-fe31-40c0-8893-180659b5973a-kube-api-access-lcb9s\") pod \"image-registry-66df7c8f76-24r58\" (UID: \"87fd85db-fe31-40c0-8893-180659b5973a\") " pod="openshift-image-registry/image-registry-66df7c8f76-24r58" Sep 30 07:40:11 crc kubenswrapper[4760]: I0930 07:40:11.516721 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/87fd85db-fe31-40c0-8893-180659b5973a-registry-certificates\") pod \"image-registry-66df7c8f76-24r58\" (UID: \"87fd85db-fe31-40c0-8893-180659b5973a\") " pod="openshift-image-registry/image-registry-66df7c8f76-24r58" Sep 30 07:40:11 crc kubenswrapper[4760]: I0930 07:40:11.516742 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/87fd85db-fe31-40c0-8893-180659b5973a-installation-pull-secrets\") pod \"image-registry-66df7c8f76-24r58\" (UID: \"87fd85db-fe31-40c0-8893-180659b5973a\") " pod="openshift-image-registry/image-registry-66df7c8f76-24r58" Sep 30 07:40:11 crc kubenswrapper[4760]: I0930 07:40:11.516758 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/87fd85db-fe31-40c0-8893-180659b5973a-ca-trust-extracted\") pod \"image-registry-66df7c8f76-24r58\" (UID: \"87fd85db-fe31-40c0-8893-180659b5973a\") " 
pod="openshift-image-registry/image-registry-66df7c8f76-24r58" Sep 30 07:40:11 crc kubenswrapper[4760]: I0930 07:40:11.516775 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/87fd85db-fe31-40c0-8893-180659b5973a-trusted-ca\") pod \"image-registry-66df7c8f76-24r58\" (UID: \"87fd85db-fe31-40c0-8893-180659b5973a\") " pod="openshift-image-registry/image-registry-66df7c8f76-24r58" Sep 30 07:40:11 crc kubenswrapper[4760]: I0930 07:40:11.541162 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-24r58\" (UID: \"87fd85db-fe31-40c0-8893-180659b5973a\") " pod="openshift-image-registry/image-registry-66df7c8f76-24r58" Sep 30 07:40:11 crc kubenswrapper[4760]: I0930 07:40:11.618153 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/87fd85db-fe31-40c0-8893-180659b5973a-registry-certificates\") pod \"image-registry-66df7c8f76-24r58\" (UID: \"87fd85db-fe31-40c0-8893-180659b5973a\") " pod="openshift-image-registry/image-registry-66df7c8f76-24r58" Sep 30 07:40:11 crc kubenswrapper[4760]: I0930 07:40:11.618694 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/87fd85db-fe31-40c0-8893-180659b5973a-installation-pull-secrets\") pod \"image-registry-66df7c8f76-24r58\" (UID: \"87fd85db-fe31-40c0-8893-180659b5973a\") " pod="openshift-image-registry/image-registry-66df7c8f76-24r58" Sep 30 07:40:11 crc kubenswrapper[4760]: I0930 07:40:11.618748 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: 
\"kubernetes.io/empty-dir/87fd85db-fe31-40c0-8893-180659b5973a-ca-trust-extracted\") pod \"image-registry-66df7c8f76-24r58\" (UID: \"87fd85db-fe31-40c0-8893-180659b5973a\") " pod="openshift-image-registry/image-registry-66df7c8f76-24r58" Sep 30 07:40:11 crc kubenswrapper[4760]: I0930 07:40:11.618825 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/87fd85db-fe31-40c0-8893-180659b5973a-trusted-ca\") pod \"image-registry-66df7c8f76-24r58\" (UID: \"87fd85db-fe31-40c0-8893-180659b5973a\") " pod="openshift-image-registry/image-registry-66df7c8f76-24r58" Sep 30 07:40:11 crc kubenswrapper[4760]: I0930 07:40:11.618989 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/87fd85db-fe31-40c0-8893-180659b5973a-registry-tls\") pod \"image-registry-66df7c8f76-24r58\" (UID: \"87fd85db-fe31-40c0-8893-180659b5973a\") " pod="openshift-image-registry/image-registry-66df7c8f76-24r58" Sep 30 07:40:11 crc kubenswrapper[4760]: I0930 07:40:11.619052 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/87fd85db-fe31-40c0-8893-180659b5973a-bound-sa-token\") pod \"image-registry-66df7c8f76-24r58\" (UID: \"87fd85db-fe31-40c0-8893-180659b5973a\") " pod="openshift-image-registry/image-registry-66df7c8f76-24r58" Sep 30 07:40:11 crc kubenswrapper[4760]: I0930 07:40:11.619097 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lcb9s\" (UniqueName: \"kubernetes.io/projected/87fd85db-fe31-40c0-8893-180659b5973a-kube-api-access-lcb9s\") pod \"image-registry-66df7c8f76-24r58\" (UID: \"87fd85db-fe31-40c0-8893-180659b5973a\") " pod="openshift-image-registry/image-registry-66df7c8f76-24r58" Sep 30 07:40:11 crc kubenswrapper[4760]: I0930 07:40:11.619839 4760 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/87fd85db-fe31-40c0-8893-180659b5973a-registry-certificates\") pod \"image-registry-66df7c8f76-24r58\" (UID: \"87fd85db-fe31-40c0-8893-180659b5973a\") " pod="openshift-image-registry/image-registry-66df7c8f76-24r58" Sep 30 07:40:11 crc kubenswrapper[4760]: I0930 07:40:11.619964 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/87fd85db-fe31-40c0-8893-180659b5973a-ca-trust-extracted\") pod \"image-registry-66df7c8f76-24r58\" (UID: \"87fd85db-fe31-40c0-8893-180659b5973a\") " pod="openshift-image-registry/image-registry-66df7c8f76-24r58" Sep 30 07:40:11 crc kubenswrapper[4760]: I0930 07:40:11.620280 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/87fd85db-fe31-40c0-8893-180659b5973a-trusted-ca\") pod \"image-registry-66df7c8f76-24r58\" (UID: \"87fd85db-fe31-40c0-8893-180659b5973a\") " pod="openshift-image-registry/image-registry-66df7c8f76-24r58" Sep 30 07:40:11 crc kubenswrapper[4760]: I0930 07:40:11.628341 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/87fd85db-fe31-40c0-8893-180659b5973a-installation-pull-secrets\") pod \"image-registry-66df7c8f76-24r58\" (UID: \"87fd85db-fe31-40c0-8893-180659b5973a\") " pod="openshift-image-registry/image-registry-66df7c8f76-24r58" Sep 30 07:40:11 crc kubenswrapper[4760]: I0930 07:40:11.633063 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/87fd85db-fe31-40c0-8893-180659b5973a-registry-tls\") pod \"image-registry-66df7c8f76-24r58\" (UID: \"87fd85db-fe31-40c0-8893-180659b5973a\") " pod="openshift-image-registry/image-registry-66df7c8f76-24r58" Sep 30 07:40:11 crc kubenswrapper[4760]: I0930 07:40:11.636133 4760 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lcb9s\" (UniqueName: \"kubernetes.io/projected/87fd85db-fe31-40c0-8893-180659b5973a-kube-api-access-lcb9s\") pod \"image-registry-66df7c8f76-24r58\" (UID: \"87fd85db-fe31-40c0-8893-180659b5973a\") " pod="openshift-image-registry/image-registry-66df7c8f76-24r58" Sep 30 07:40:11 crc kubenswrapper[4760]: I0930 07:40:11.637602 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/87fd85db-fe31-40c0-8893-180659b5973a-bound-sa-token\") pod \"image-registry-66df7c8f76-24r58\" (UID: \"87fd85db-fe31-40c0-8893-180659b5973a\") " pod="openshift-image-registry/image-registry-66df7c8f76-24r58" Sep 30 07:40:11 crc kubenswrapper[4760]: I0930 07:40:11.762430 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-24r58" Sep 30 07:40:11 crc kubenswrapper[4760]: I0930 07:40:11.994388 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-24r58"] Sep 30 07:40:12 crc kubenswrapper[4760]: I0930 07:40:12.163156 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-24r58" event={"ID":"87fd85db-fe31-40c0-8893-180659b5973a","Type":"ContainerStarted","Data":"7fe69d809bf4298f844ff6a7748a91666943a33f0b28f03550d05e57577138d0"} Sep 30 07:40:12 crc kubenswrapper[4760]: I0930 07:40:12.163491 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-66df7c8f76-24r58" Sep 30 07:40:12 crc kubenswrapper[4760]: I0930 07:40:12.163506 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-24r58" event={"ID":"87fd85db-fe31-40c0-8893-180659b5973a","Type":"ContainerStarted","Data":"2658656a166a3a88cde340219f7e0957355931791b1016e52bd0b7314fed27ef"} Sep 
30 07:40:12 crc kubenswrapper[4760]: I0930 07:40:12.183032 4760 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-66df7c8f76-24r58" podStartSLOduration=1.1830100510000001 podStartE2EDuration="1.183010051s" podCreationTimestamp="2025-09-30 07:40:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 07:40:12.179099652 +0000 UTC m=+397.822006064" watchObservedRunningTime="2025-09-30 07:40:12.183010051 +0000 UTC m=+397.825916463" Sep 30 07:40:19 crc kubenswrapper[4760]: I0930 07:40:19.112986 4760 patch_prober.go:28] interesting pod/machine-config-daemon-f2lrk container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 07:40:19 crc kubenswrapper[4760]: I0930 07:40:19.113685 4760 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-f2lrk" podUID="7a9c8270-6964-4886-87d0-227b05b76da4" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 30 07:40:19 crc kubenswrapper[4760]: I0930 07:40:19.113771 4760 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-f2lrk" Sep 30 07:40:19 crc kubenswrapper[4760]: I0930 07:40:19.114563 4760 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"316186f24b90a3f80ce14b2c1f47627d59bb457c83e7ed3a00cd62894b2b866d"} pod="openshift-machine-config-operator/machine-config-daemon-f2lrk" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Sep 30 07:40:19 crc 
kubenswrapper[4760]: I0930 07:40:19.114644 4760 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-f2lrk" podUID="7a9c8270-6964-4886-87d0-227b05b76da4" containerName="machine-config-daemon" containerID="cri-o://316186f24b90a3f80ce14b2c1f47627d59bb457c83e7ed3a00cd62894b2b866d" gracePeriod=600 Sep 30 07:40:20 crc kubenswrapper[4760]: I0930 07:40:20.217380 4760 generic.go:334] "Generic (PLEG): container finished" podID="7a9c8270-6964-4886-87d0-227b05b76da4" containerID="316186f24b90a3f80ce14b2c1f47627d59bb457c83e7ed3a00cd62894b2b866d" exitCode=0 Sep 30 07:40:20 crc kubenswrapper[4760]: I0930 07:40:20.217451 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-f2lrk" event={"ID":"7a9c8270-6964-4886-87d0-227b05b76da4","Type":"ContainerDied","Data":"316186f24b90a3f80ce14b2c1f47627d59bb457c83e7ed3a00cd62894b2b866d"} Sep 30 07:40:20 crc kubenswrapper[4760]: I0930 07:40:20.217817 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-f2lrk" event={"ID":"7a9c8270-6964-4886-87d0-227b05b76da4","Type":"ContainerStarted","Data":"c14c5e22dc0508a193bdba7225efcdfbf417d8b9976aacad55c5d22c10bc7a92"} Sep 30 07:40:20 crc kubenswrapper[4760]: I0930 07:40:20.217843 4760 scope.go:117] "RemoveContainer" containerID="90d669a96085f6fb9b5c9507ff1d5c960bdeafa1d01d7c115b89788e14e155d7" Sep 30 07:40:31 crc kubenswrapper[4760]: I0930 07:40:31.770888 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-66df7c8f76-24r58" Sep 30 07:40:31 crc kubenswrapper[4760]: I0930 07:40:31.849704 4760 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-vgrsq"] Sep 30 07:40:56 crc kubenswrapper[4760]: I0930 07:40:56.891774 4760 kuberuntime_container.go:808] "Killing container with a grace 
period" pod="openshift-image-registry/image-registry-697d97f7c8-vgrsq" podUID="cd4422f1-0405-44b0-9256-fec03b6dc2f0" containerName="registry" containerID="cri-o://821a3622ecd7a40223be57e113504644b686b8fd664caaf9cea9bfa727166a49" gracePeriod=30 Sep 30 07:40:57 crc kubenswrapper[4760]: I0930 07:40:57.355487 4760 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-vgrsq" Sep 30 07:40:57 crc kubenswrapper[4760]: I0930 07:40:57.462528 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5r4xr\" (UniqueName: \"kubernetes.io/projected/cd4422f1-0405-44b0-9256-fec03b6dc2f0-kube-api-access-5r4xr\") pod \"cd4422f1-0405-44b0-9256-fec03b6dc2f0\" (UID: \"cd4422f1-0405-44b0-9256-fec03b6dc2f0\") " Sep 30 07:40:57 crc kubenswrapper[4760]: I0930 07:40:57.462617 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/cd4422f1-0405-44b0-9256-fec03b6dc2f0-registry-tls\") pod \"cd4422f1-0405-44b0-9256-fec03b6dc2f0\" (UID: \"cd4422f1-0405-44b0-9256-fec03b6dc2f0\") " Sep 30 07:40:57 crc kubenswrapper[4760]: I0930 07:40:57.463026 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-storage\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"cd4422f1-0405-44b0-9256-fec03b6dc2f0\" (UID: \"cd4422f1-0405-44b0-9256-fec03b6dc2f0\") " Sep 30 07:40:57 crc kubenswrapper[4760]: I0930 07:40:57.463076 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/cd4422f1-0405-44b0-9256-fec03b6dc2f0-ca-trust-extracted\") pod \"cd4422f1-0405-44b0-9256-fec03b6dc2f0\" (UID: \"cd4422f1-0405-44b0-9256-fec03b6dc2f0\") " Sep 30 07:40:57 crc kubenswrapper[4760]: I0930 07:40:57.463129 4760 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/cd4422f1-0405-44b0-9256-fec03b6dc2f0-installation-pull-secrets\") pod \"cd4422f1-0405-44b0-9256-fec03b6dc2f0\" (UID: \"cd4422f1-0405-44b0-9256-fec03b6dc2f0\") " Sep 30 07:40:57 crc kubenswrapper[4760]: I0930 07:40:57.463196 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/cd4422f1-0405-44b0-9256-fec03b6dc2f0-registry-certificates\") pod \"cd4422f1-0405-44b0-9256-fec03b6dc2f0\" (UID: \"cd4422f1-0405-44b0-9256-fec03b6dc2f0\") " Sep 30 07:40:57 crc kubenswrapper[4760]: I0930 07:40:57.463237 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/cd4422f1-0405-44b0-9256-fec03b6dc2f0-trusted-ca\") pod \"cd4422f1-0405-44b0-9256-fec03b6dc2f0\" (UID: \"cd4422f1-0405-44b0-9256-fec03b6dc2f0\") " Sep 30 07:40:57 crc kubenswrapper[4760]: I0930 07:40:57.463272 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/cd4422f1-0405-44b0-9256-fec03b6dc2f0-bound-sa-token\") pod \"cd4422f1-0405-44b0-9256-fec03b6dc2f0\" (UID: \"cd4422f1-0405-44b0-9256-fec03b6dc2f0\") " Sep 30 07:40:57 crc kubenswrapper[4760]: I0930 07:40:57.464796 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cd4422f1-0405-44b0-9256-fec03b6dc2f0-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "cd4422f1-0405-44b0-9256-fec03b6dc2f0" (UID: "cd4422f1-0405-44b0-9256-fec03b6dc2f0"). InnerVolumeSpecName "trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 07:40:57 crc kubenswrapper[4760]: I0930 07:40:57.464826 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cd4422f1-0405-44b0-9256-fec03b6dc2f0-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "cd4422f1-0405-44b0-9256-fec03b6dc2f0" (UID: "cd4422f1-0405-44b0-9256-fec03b6dc2f0"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 07:40:57 crc kubenswrapper[4760]: I0930 07:40:57.477408 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cd4422f1-0405-44b0-9256-fec03b6dc2f0-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "cd4422f1-0405-44b0-9256-fec03b6dc2f0" (UID: "cd4422f1-0405-44b0-9256-fec03b6dc2f0"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 07:40:57 crc kubenswrapper[4760]: I0930 07:40:57.477441 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd4422f1-0405-44b0-9256-fec03b6dc2f0-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "cd4422f1-0405-44b0-9256-fec03b6dc2f0" (UID: "cd4422f1-0405-44b0-9256-fec03b6dc2f0"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 07:40:57 crc kubenswrapper[4760]: I0930 07:40:57.477458 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd4422f1-0405-44b0-9256-fec03b6dc2f0-kube-api-access-5r4xr" (OuterVolumeSpecName: "kube-api-access-5r4xr") pod "cd4422f1-0405-44b0-9256-fec03b6dc2f0" (UID: "cd4422f1-0405-44b0-9256-fec03b6dc2f0"). InnerVolumeSpecName "kube-api-access-5r4xr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 07:40:57 crc kubenswrapper[4760]: I0930 07:40:57.477751 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd4422f1-0405-44b0-9256-fec03b6dc2f0-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "cd4422f1-0405-44b0-9256-fec03b6dc2f0" (UID: "cd4422f1-0405-44b0-9256-fec03b6dc2f0"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 07:40:57 crc kubenswrapper[4760]: I0930 07:40:57.483156 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "registry-storage") pod "cd4422f1-0405-44b0-9256-fec03b6dc2f0" (UID: "cd4422f1-0405-44b0-9256-fec03b6dc2f0"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue "" Sep 30 07:40:57 crc kubenswrapper[4760]: I0930 07:40:57.489192 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cd4422f1-0405-44b0-9256-fec03b6dc2f0-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "cd4422f1-0405-44b0-9256-fec03b6dc2f0" (UID: "cd4422f1-0405-44b0-9256-fec03b6dc2f0"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 07:40:57 crc kubenswrapper[4760]: I0930 07:40:57.490646 4760 generic.go:334] "Generic (PLEG): container finished" podID="cd4422f1-0405-44b0-9256-fec03b6dc2f0" containerID="821a3622ecd7a40223be57e113504644b686b8fd664caaf9cea9bfa727166a49" exitCode=0 Sep 30 07:40:57 crc kubenswrapper[4760]: I0930 07:40:57.490716 4760 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-vgrsq" Sep 30 07:40:57 crc kubenswrapper[4760]: I0930 07:40:57.490716 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-vgrsq" event={"ID":"cd4422f1-0405-44b0-9256-fec03b6dc2f0","Type":"ContainerDied","Data":"821a3622ecd7a40223be57e113504644b686b8fd664caaf9cea9bfa727166a49"} Sep 30 07:40:57 crc kubenswrapper[4760]: I0930 07:40:57.490868 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-vgrsq" event={"ID":"cd4422f1-0405-44b0-9256-fec03b6dc2f0","Type":"ContainerDied","Data":"7129f2f7b0843c528b48a3ea2f621f159c41dd2f894ff0b1002cfe8804b89434"} Sep 30 07:40:57 crc kubenswrapper[4760]: I0930 07:40:57.490911 4760 scope.go:117] "RemoveContainer" containerID="821a3622ecd7a40223be57e113504644b686b8fd664caaf9cea9bfa727166a49" Sep 30 07:40:57 crc kubenswrapper[4760]: I0930 07:40:57.542705 4760 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-vgrsq"] Sep 30 07:40:57 crc kubenswrapper[4760]: I0930 07:40:57.545562 4760 scope.go:117] "RemoveContainer" containerID="821a3622ecd7a40223be57e113504644b686b8fd664caaf9cea9bfa727166a49" Sep 30 07:40:57 crc kubenswrapper[4760]: E0930 07:40:57.546385 4760 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"821a3622ecd7a40223be57e113504644b686b8fd664caaf9cea9bfa727166a49\": container with ID starting with 821a3622ecd7a40223be57e113504644b686b8fd664caaf9cea9bfa727166a49 not found: ID does not exist" containerID="821a3622ecd7a40223be57e113504644b686b8fd664caaf9cea9bfa727166a49" Sep 30 07:40:57 crc kubenswrapper[4760]: I0930 07:40:57.546450 4760 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"821a3622ecd7a40223be57e113504644b686b8fd664caaf9cea9bfa727166a49"} err="failed to 
get container status \"821a3622ecd7a40223be57e113504644b686b8fd664caaf9cea9bfa727166a49\": rpc error: code = NotFound desc = could not find container \"821a3622ecd7a40223be57e113504644b686b8fd664caaf9cea9bfa727166a49\": container with ID starting with 821a3622ecd7a40223be57e113504644b686b8fd664caaf9cea9bfa727166a49 not found: ID does not exist" Sep 30 07:40:57 crc kubenswrapper[4760]: I0930 07:40:57.549374 4760 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-vgrsq"] Sep 30 07:40:57 crc kubenswrapper[4760]: I0930 07:40:57.565936 4760 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5r4xr\" (UniqueName: \"kubernetes.io/projected/cd4422f1-0405-44b0-9256-fec03b6dc2f0-kube-api-access-5r4xr\") on node \"crc\" DevicePath \"\"" Sep 30 07:40:57 crc kubenswrapper[4760]: I0930 07:40:57.565967 4760 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/cd4422f1-0405-44b0-9256-fec03b6dc2f0-registry-tls\") on node \"crc\" DevicePath \"\"" Sep 30 07:40:57 crc kubenswrapper[4760]: I0930 07:40:57.566013 4760 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/cd4422f1-0405-44b0-9256-fec03b6dc2f0-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Sep 30 07:40:57 crc kubenswrapper[4760]: I0930 07:40:57.566027 4760 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/cd4422f1-0405-44b0-9256-fec03b6dc2f0-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Sep 30 07:40:57 crc kubenswrapper[4760]: I0930 07:40:57.566041 4760 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/cd4422f1-0405-44b0-9256-fec03b6dc2f0-registry-certificates\") on node \"crc\" DevicePath \"\"" Sep 30 07:40:57 crc kubenswrapper[4760]: I0930 07:40:57.566053 4760 
reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/cd4422f1-0405-44b0-9256-fec03b6dc2f0-trusted-ca\") on node \"crc\" DevicePath \"\"" Sep 30 07:40:57 crc kubenswrapper[4760]: I0930 07:40:57.566063 4760 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/cd4422f1-0405-44b0-9256-fec03b6dc2f0-bound-sa-token\") on node \"crc\" DevicePath \"\"" Sep 30 07:40:59 crc kubenswrapper[4760]: I0930 07:40:59.086966 4760 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cd4422f1-0405-44b0-9256-fec03b6dc2f0" path="/var/lib/kubelet/pods/cd4422f1-0405-44b0-9256-fec03b6dc2f0/volumes" Sep 30 07:42:19 crc kubenswrapper[4760]: I0930 07:42:19.112742 4760 patch_prober.go:28] interesting pod/machine-config-daemon-f2lrk container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 07:42:19 crc kubenswrapper[4760]: I0930 07:42:19.113435 4760 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-f2lrk" podUID="7a9c8270-6964-4886-87d0-227b05b76da4" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 30 07:42:35 crc kubenswrapper[4760]: I0930 07:42:35.281105 4760 scope.go:117] "RemoveContainer" containerID="391940cdfb659af1e0e3bf92ae30609a677b891832f2bfdbc81853bde0e1d645" Sep 30 07:42:35 crc kubenswrapper[4760]: I0930 07:42:35.312518 4760 scope.go:117] "RemoveContainer" containerID="503d47c18b5bbd7eb35699ca96edf4c3989e91f6e3946b2c68e92e5594d1c2ab" Sep 30 07:42:35 crc kubenswrapper[4760]: I0930 07:42:35.358357 4760 scope.go:117] "RemoveContainer" containerID="10ade4ac36cd7b622e165a9e5f47a830f2068f65caec2b544f4f87c9a33f7476" Sep 
30 07:42:49 crc kubenswrapper[4760]: I0930 07:42:49.113556 4760 patch_prober.go:28] interesting pod/machine-config-daemon-f2lrk container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 07:42:49 crc kubenswrapper[4760]: I0930 07:42:49.114558 4760 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-f2lrk" podUID="7a9c8270-6964-4886-87d0-227b05b76da4" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 30 07:43:01 crc kubenswrapper[4760]: I0930 07:43:01.072965 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-cainjector-7f985d654d-9kkx2"] Sep 30 07:43:01 crc kubenswrapper[4760]: E0930 07:43:01.073727 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cd4422f1-0405-44b0-9256-fec03b6dc2f0" containerName="registry" Sep 30 07:43:01 crc kubenswrapper[4760]: I0930 07:43:01.073739 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="cd4422f1-0405-44b0-9256-fec03b6dc2f0" containerName="registry" Sep 30 07:43:01 crc kubenswrapper[4760]: I0930 07:43:01.073838 4760 memory_manager.go:354] "RemoveStaleState removing state" podUID="cd4422f1-0405-44b0-9256-fec03b6dc2f0" containerName="registry" Sep 30 07:43:01 crc kubenswrapper[4760]: I0930 07:43:01.074175 4760 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-cainjector-7f985d654d-9kkx2" Sep 30 07:43:01 crc kubenswrapper[4760]: I0930 07:43:01.075917 4760 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-cainjector-dockercfg-z55px" Sep 30 07:43:01 crc kubenswrapper[4760]: I0930 07:43:01.077030 4760 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"kube-root-ca.crt" Sep 30 07:43:01 crc kubenswrapper[4760]: I0930 07:43:01.077441 4760 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"openshift-service-ca.crt" Sep 30 07:43:01 crc kubenswrapper[4760]: I0930 07:43:01.081000 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-7f985d654d-9kkx2"] Sep 30 07:43:01 crc kubenswrapper[4760]: I0930 07:43:01.088178 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-5b446d88c5-7f49c"] Sep 30 07:43:01 crc kubenswrapper[4760]: I0930 07:43:01.088811 4760 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-5b446d88c5-7f49c" Sep 30 07:43:01 crc kubenswrapper[4760]: I0930 07:43:01.092383 4760 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-dockercfg-ppbzk" Sep 30 07:43:01 crc kubenswrapper[4760]: I0930 07:43:01.106367 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gnkv9\" (UniqueName: \"kubernetes.io/projected/6c1a0dc3-5f08-4216-99c7-ef1889df0775-kube-api-access-gnkv9\") pod \"cert-manager-5b446d88c5-7f49c\" (UID: \"6c1a0dc3-5f08-4216-99c7-ef1889df0775\") " pod="cert-manager/cert-manager-5b446d88c5-7f49c" Sep 30 07:43:01 crc kubenswrapper[4760]: I0930 07:43:01.106468 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hdk7j\" (UniqueName: \"kubernetes.io/projected/59941c26-7746-44e9-8453-21d64dbdb91b-kube-api-access-hdk7j\") pod \"cert-manager-cainjector-7f985d654d-9kkx2\" (UID: \"59941c26-7746-44e9-8453-21d64dbdb91b\") " pod="cert-manager/cert-manager-cainjector-7f985d654d-9kkx2" Sep 30 07:43:01 crc kubenswrapper[4760]: I0930 07:43:01.116631 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-5b446d88c5-7f49c"] Sep 30 07:43:01 crc kubenswrapper[4760]: I0930 07:43:01.122416 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-webhook-5655c58dd6-zptcg"] Sep 30 07:43:01 crc kubenswrapper[4760]: I0930 07:43:01.123048 4760 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-webhook-5655c58dd6-zptcg" Sep 30 07:43:01 crc kubenswrapper[4760]: I0930 07:43:01.129643 4760 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-webhook-dockercfg-4qjsd" Sep 30 07:43:01 crc kubenswrapper[4760]: I0930 07:43:01.133172 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-5655c58dd6-zptcg"] Sep 30 07:43:01 crc kubenswrapper[4760]: I0930 07:43:01.208065 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hdk7j\" (UniqueName: \"kubernetes.io/projected/59941c26-7746-44e9-8453-21d64dbdb91b-kube-api-access-hdk7j\") pod \"cert-manager-cainjector-7f985d654d-9kkx2\" (UID: \"59941c26-7746-44e9-8453-21d64dbdb91b\") " pod="cert-manager/cert-manager-cainjector-7f985d654d-9kkx2" Sep 30 07:43:01 crc kubenswrapper[4760]: I0930 07:43:01.208141 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mtk7d\" (UniqueName: \"kubernetes.io/projected/49a05254-e89a-4b7c-b128-0a50daab0f7d-kube-api-access-mtk7d\") pod \"cert-manager-webhook-5655c58dd6-zptcg\" (UID: \"49a05254-e89a-4b7c-b128-0a50daab0f7d\") " pod="cert-manager/cert-manager-webhook-5655c58dd6-zptcg" Sep 30 07:43:01 crc kubenswrapper[4760]: I0930 07:43:01.208187 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gnkv9\" (UniqueName: \"kubernetes.io/projected/6c1a0dc3-5f08-4216-99c7-ef1889df0775-kube-api-access-gnkv9\") pod \"cert-manager-5b446d88c5-7f49c\" (UID: \"6c1a0dc3-5f08-4216-99c7-ef1889df0775\") " pod="cert-manager/cert-manager-5b446d88c5-7f49c" Sep 30 07:43:01 crc kubenswrapper[4760]: I0930 07:43:01.226153 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hdk7j\" (UniqueName: \"kubernetes.io/projected/59941c26-7746-44e9-8453-21d64dbdb91b-kube-api-access-hdk7j\") pod 
\"cert-manager-cainjector-7f985d654d-9kkx2\" (UID: \"59941c26-7746-44e9-8453-21d64dbdb91b\") " pod="cert-manager/cert-manager-cainjector-7f985d654d-9kkx2" Sep 30 07:43:01 crc kubenswrapper[4760]: I0930 07:43:01.226272 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gnkv9\" (UniqueName: \"kubernetes.io/projected/6c1a0dc3-5f08-4216-99c7-ef1889df0775-kube-api-access-gnkv9\") pod \"cert-manager-5b446d88c5-7f49c\" (UID: \"6c1a0dc3-5f08-4216-99c7-ef1889df0775\") " pod="cert-manager/cert-manager-5b446d88c5-7f49c" Sep 30 07:43:01 crc kubenswrapper[4760]: I0930 07:43:01.309109 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mtk7d\" (UniqueName: \"kubernetes.io/projected/49a05254-e89a-4b7c-b128-0a50daab0f7d-kube-api-access-mtk7d\") pod \"cert-manager-webhook-5655c58dd6-zptcg\" (UID: \"49a05254-e89a-4b7c-b128-0a50daab0f7d\") " pod="cert-manager/cert-manager-webhook-5655c58dd6-zptcg" Sep 30 07:43:01 crc kubenswrapper[4760]: I0930 07:43:01.331378 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mtk7d\" (UniqueName: \"kubernetes.io/projected/49a05254-e89a-4b7c-b128-0a50daab0f7d-kube-api-access-mtk7d\") pod \"cert-manager-webhook-5655c58dd6-zptcg\" (UID: \"49a05254-e89a-4b7c-b128-0a50daab0f7d\") " pod="cert-manager/cert-manager-webhook-5655c58dd6-zptcg" Sep 30 07:43:01 crc kubenswrapper[4760]: I0930 07:43:01.412692 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-7f985d654d-9kkx2" Sep 30 07:43:01 crc kubenswrapper[4760]: I0930 07:43:01.422041 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-5b446d88c5-7f49c" Sep 30 07:43:01 crc kubenswrapper[4760]: I0930 07:43:01.441485 4760 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-webhook-5655c58dd6-zptcg" Sep 30 07:43:01 crc kubenswrapper[4760]: I0930 07:43:01.629074 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-7f985d654d-9kkx2"] Sep 30 07:43:01 crc kubenswrapper[4760]: I0930 07:43:01.636263 4760 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Sep 30 07:43:01 crc kubenswrapper[4760]: I0930 07:43:01.673900 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-5b446d88c5-7f49c"] Sep 30 07:43:01 crc kubenswrapper[4760]: W0930 07:43:01.677067 4760 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6c1a0dc3_5f08_4216_99c7_ef1889df0775.slice/crio-3e9a0abfb903cbf62b27882f6c339231b30338c4934d43d96b198b366be293c0 WatchSource:0}: Error finding container 3e9a0abfb903cbf62b27882f6c339231b30338c4934d43d96b198b366be293c0: Status 404 returned error can't find the container with id 3e9a0abfb903cbf62b27882f6c339231b30338c4934d43d96b198b366be293c0 Sep 30 07:43:01 crc kubenswrapper[4760]: I0930 07:43:01.702743 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-5655c58dd6-zptcg"] Sep 30 07:43:02 crc kubenswrapper[4760]: I0930 07:43:02.350655 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-5b446d88c5-7f49c" event={"ID":"6c1a0dc3-5f08-4216-99c7-ef1889df0775","Type":"ContainerStarted","Data":"3e9a0abfb903cbf62b27882f6c339231b30338c4934d43d96b198b366be293c0"} Sep 30 07:43:02 crc kubenswrapper[4760]: I0930 07:43:02.351732 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-7f985d654d-9kkx2" event={"ID":"59941c26-7746-44e9-8453-21d64dbdb91b","Type":"ContainerStarted","Data":"8bf9e090c6078977542dae4317ef01262d8a309e4a858ec63b4213a8045b51e5"} Sep 30 07:43:02 crc kubenswrapper[4760]: 
I0930 07:43:02.355723 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-5655c58dd6-zptcg" event={"ID":"49a05254-e89a-4b7c-b128-0a50daab0f7d","Type":"ContainerStarted","Data":"5e16d12d9a757a7361fa81a36db7946a888500a2d5a097ac2db2892f101982a2"} Sep 30 07:43:05 crc kubenswrapper[4760]: I0930 07:43:05.375419 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-5b446d88c5-7f49c" event={"ID":"6c1a0dc3-5f08-4216-99c7-ef1889df0775","Type":"ContainerStarted","Data":"b85b1832caa085f538f69e9174b0d5ab8bd89aed381d3e36162ff260f263f47c"} Sep 30 07:43:05 crc kubenswrapper[4760]: I0930 07:43:05.379982 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-7f985d654d-9kkx2" event={"ID":"59941c26-7746-44e9-8453-21d64dbdb91b","Type":"ContainerStarted","Data":"f2986bbae12dfd00ef63d63894f59e20f9844f44777796b8a288c8def057573f"} Sep 30 07:43:05 crc kubenswrapper[4760]: I0930 07:43:05.382124 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-5655c58dd6-zptcg" event={"ID":"49a05254-e89a-4b7c-b128-0a50daab0f7d","Type":"ContainerStarted","Data":"54984cf5c414891af9e3823aeb0845b5e393cea871075be5e503e856fdc95b6c"} Sep 30 07:43:05 crc kubenswrapper[4760]: I0930 07:43:05.382277 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="cert-manager/cert-manager-webhook-5655c58dd6-zptcg" Sep 30 07:43:05 crc kubenswrapper[4760]: I0930 07:43:05.393741 4760 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-5b446d88c5-7f49c" podStartSLOduration=0.933654141 podStartE2EDuration="4.393710934s" podCreationTimestamp="2025-09-30 07:43:01 +0000 UTC" firstStartedPulling="2025-09-30 07:43:01.679056972 +0000 UTC m=+567.321963384" lastFinishedPulling="2025-09-30 07:43:05.139113745 +0000 UTC m=+570.782020177" observedRunningTime="2025-09-30 07:43:05.393579351 +0000 UTC m=+571.036485773" 
watchObservedRunningTime="2025-09-30 07:43:05.393710934 +0000 UTC m=+571.036617356" Sep 30 07:43:05 crc kubenswrapper[4760]: I0930 07:43:05.418356 4760 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-cainjector-7f985d654d-9kkx2" podStartSLOduration=0.916476443 podStartE2EDuration="4.418335534s" podCreationTimestamp="2025-09-30 07:43:01 +0000 UTC" firstStartedPulling="2025-09-30 07:43:01.636070303 +0000 UTC m=+567.278976715" lastFinishedPulling="2025-09-30 07:43:05.137929394 +0000 UTC m=+570.780835806" observedRunningTime="2025-09-30 07:43:05.410324689 +0000 UTC m=+571.053231121" watchObservedRunningTime="2025-09-30 07:43:05.418335534 +0000 UTC m=+571.061241956" Sep 30 07:43:05 crc kubenswrapper[4760]: I0930 07:43:05.435529 4760 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-webhook-5655c58dd6-zptcg" podStartSLOduration=0.926073088 podStartE2EDuration="4.435509073s" podCreationTimestamp="2025-09-30 07:43:01 +0000 UTC" firstStartedPulling="2025-09-30 07:43:01.70869763 +0000 UTC m=+567.351604032" lastFinishedPulling="2025-09-30 07:43:05.218133595 +0000 UTC m=+570.861040017" observedRunningTime="2025-09-30 07:43:05.43502902 +0000 UTC m=+571.077935442" watchObservedRunningTime="2025-09-30 07:43:05.435509073 +0000 UTC m=+571.078415495" Sep 30 07:43:11 crc kubenswrapper[4760]: I0930 07:43:11.446341 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="cert-manager/cert-manager-webhook-5655c58dd6-zptcg" Sep 30 07:43:11 crc kubenswrapper[4760]: I0930 07:43:11.949266 4760 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-sspvl"] Sep 30 07:43:11 crc kubenswrapper[4760]: I0930 07:43:11.950024 4760 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-sspvl" podUID="2c4ca8ea-a714-40e5-9e10-080aef32237b" containerName="ovn-controller" 
containerID="cri-o://ef1f6b6ca1a2bbb2f9a166a7244c6411470be4655bc6b2a330fd604c3449d185" gracePeriod=30 Sep 30 07:43:11 crc kubenswrapper[4760]: I0930 07:43:11.950158 4760 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-sspvl" podUID="2c4ca8ea-a714-40e5-9e10-080aef32237b" containerName="kube-rbac-proxy-ovn-metrics" containerID="cri-o://9e64e8566880e723a6cc19ad788bd1cbb02eb639397aea4acd6e9920b034f379" gracePeriod=30 Sep 30 07:43:11 crc kubenswrapper[4760]: I0930 07:43:11.950127 4760 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-sspvl" podUID="2c4ca8ea-a714-40e5-9e10-080aef32237b" containerName="nbdb" containerID="cri-o://0b6c1fe1d21cc0d118e36d7f9d62fb639a19a42cd8c953700bcd5c289088cdf9" gracePeriod=30 Sep 30 07:43:11 crc kubenswrapper[4760]: I0930 07:43:11.950233 4760 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-sspvl" podUID="2c4ca8ea-a714-40e5-9e10-080aef32237b" containerName="kube-rbac-proxy-node" containerID="cri-o://04cb9ff2859a579a0bd97d932177b8c19b786666d0177669e93ccec38abf08d5" gracePeriod=30 Sep 30 07:43:11 crc kubenswrapper[4760]: I0930 07:43:11.950309 4760 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-sspvl" podUID="2c4ca8ea-a714-40e5-9e10-080aef32237b" containerName="northd" containerID="cri-o://bba81e98fd1a75081b7e6a3eaf28b88b4de6e47e1dc158ca72aa2f1c2107933e" gracePeriod=30 Sep 30 07:43:11 crc kubenswrapper[4760]: I0930 07:43:11.950362 4760 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-sspvl" podUID="2c4ca8ea-a714-40e5-9e10-080aef32237b" containerName="sbdb" containerID="cri-o://6e72f37942962598d925dfb305311c94c3ca920f93243bb6e97839add4a830a0" gracePeriod=30 Sep 30 07:43:11 crc kubenswrapper[4760]: I0930 07:43:11.950439 4760 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-sspvl" podUID="2c4ca8ea-a714-40e5-9e10-080aef32237b" containerName="ovn-acl-logging" containerID="cri-o://45d8c685d7b690ce81553e4d924934cf761087dd535145c0e8b5a933f2d61e45" gracePeriod=30 Sep 30 07:43:12 crc kubenswrapper[4760]: I0930 07:43:12.008209 4760 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-sspvl" podUID="2c4ca8ea-a714-40e5-9e10-080aef32237b" containerName="ovnkube-controller" containerID="cri-o://3d9205e4d676fa6a90c4cef893077a6bc91d50564b15f3b221e5827c2b3025ce" gracePeriod=30 Sep 30 07:43:12 crc kubenswrapper[4760]: I0930 07:43:12.298039 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-sspvl_2c4ca8ea-a714-40e5-9e10-080aef32237b/ovnkube-controller/3.log" Sep 30 07:43:12 crc kubenswrapper[4760]: I0930 07:43:12.301182 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-sspvl_2c4ca8ea-a714-40e5-9e10-080aef32237b/ovn-acl-logging/0.log" Sep 30 07:43:12 crc kubenswrapper[4760]: I0930 07:43:12.301755 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-sspvl_2c4ca8ea-a714-40e5-9e10-080aef32237b/ovn-controller/0.log" Sep 30 07:43:12 crc kubenswrapper[4760]: I0930 07:43:12.302387 4760 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-sspvl" Sep 30 07:43:12 crc kubenswrapper[4760]: I0930 07:43:12.359059 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/2c4ca8ea-a714-40e5-9e10-080aef32237b-host-kubelet\") pod \"2c4ca8ea-a714-40e5-9e10-080aef32237b\" (UID: \"2c4ca8ea-a714-40e5-9e10-080aef32237b\") " Sep 30 07:43:12 crc kubenswrapper[4760]: I0930 07:43:12.359137 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/2c4ca8ea-a714-40e5-9e10-080aef32237b-log-socket\") pod \"2c4ca8ea-a714-40e5-9e10-080aef32237b\" (UID: \"2c4ca8ea-a714-40e5-9e10-080aef32237b\") " Sep 30 07:43:12 crc kubenswrapper[4760]: I0930 07:43:12.359197 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/2c4ca8ea-a714-40e5-9e10-080aef32237b-ovnkube-config\") pod \"2c4ca8ea-a714-40e5-9e10-080aef32237b\" (UID: \"2c4ca8ea-a714-40e5-9e10-080aef32237b\") " Sep 30 07:43:12 crc kubenswrapper[4760]: I0930 07:43:12.359245 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/2c4ca8ea-a714-40e5-9e10-080aef32237b-host-cni-bin\") pod \"2c4ca8ea-a714-40e5-9e10-080aef32237b\" (UID: \"2c4ca8ea-a714-40e5-9e10-080aef32237b\") " Sep 30 07:43:12 crc kubenswrapper[4760]: I0930 07:43:12.359286 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/2c4ca8ea-a714-40e5-9e10-080aef32237b-ovn-node-metrics-cert\") pod \"2c4ca8ea-a714-40e5-9e10-080aef32237b\" (UID: \"2c4ca8ea-a714-40e5-9e10-080aef32237b\") " Sep 30 07:43:12 crc kubenswrapper[4760]: I0930 07:43:12.359288 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/host-path/2c4ca8ea-a714-40e5-9e10-080aef32237b-host-kubelet" (OuterVolumeSpecName: "host-kubelet") pod "2c4ca8ea-a714-40e5-9e10-080aef32237b" (UID: "2c4ca8ea-a714-40e5-9e10-080aef32237b"). InnerVolumeSpecName "host-kubelet". PluginName "kubernetes.io/host-path", VolumeGidValue "" Sep 30 07:43:12 crc kubenswrapper[4760]: I0930 07:43:12.359353 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/2c4ca8ea-a714-40e5-9e10-080aef32237b-etc-openvswitch\") pod \"2c4ca8ea-a714-40e5-9e10-080aef32237b\" (UID: \"2c4ca8ea-a714-40e5-9e10-080aef32237b\") " Sep 30 07:43:12 crc kubenswrapper[4760]: I0930 07:43:12.359404 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/2c4ca8ea-a714-40e5-9e10-080aef32237b-run-systemd\") pod \"2c4ca8ea-a714-40e5-9e10-080aef32237b\" (UID: \"2c4ca8ea-a714-40e5-9e10-080aef32237b\") " Sep 30 07:43:12 crc kubenswrapper[4760]: I0930 07:43:12.359457 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/2c4ca8ea-a714-40e5-9e10-080aef32237b-run-openvswitch\") pod \"2c4ca8ea-a714-40e5-9e10-080aef32237b\" (UID: \"2c4ca8ea-a714-40e5-9e10-080aef32237b\") " Sep 30 07:43:12 crc kubenswrapper[4760]: I0930 07:43:12.359503 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/2c4ca8ea-a714-40e5-9e10-080aef32237b-ovnkube-script-lib\") pod \"2c4ca8ea-a714-40e5-9e10-080aef32237b\" (UID: \"2c4ca8ea-a714-40e5-9e10-080aef32237b\") " Sep 30 07:43:12 crc kubenswrapper[4760]: I0930 07:43:12.359537 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/2c4ca8ea-a714-40e5-9e10-080aef32237b-host-run-netns\") pod 
\"2c4ca8ea-a714-40e5-9e10-080aef32237b\" (UID: \"2c4ca8ea-a714-40e5-9e10-080aef32237b\") " Sep 30 07:43:12 crc kubenswrapper[4760]: I0930 07:43:12.359573 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/2c4ca8ea-a714-40e5-9e10-080aef32237b-env-overrides\") pod \"2c4ca8ea-a714-40e5-9e10-080aef32237b\" (UID: \"2c4ca8ea-a714-40e5-9e10-080aef32237b\") " Sep 30 07:43:12 crc kubenswrapper[4760]: I0930 07:43:12.359606 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/2c4ca8ea-a714-40e5-9e10-080aef32237b-host-cni-netd\") pod \"2c4ca8ea-a714-40e5-9e10-080aef32237b\" (UID: \"2c4ca8ea-a714-40e5-9e10-080aef32237b\") " Sep 30 07:43:12 crc kubenswrapper[4760]: I0930 07:43:12.359665 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/2c4ca8ea-a714-40e5-9e10-080aef32237b-host-run-ovn-kubernetes\") pod \"2c4ca8ea-a714-40e5-9e10-080aef32237b\" (UID: \"2c4ca8ea-a714-40e5-9e10-080aef32237b\") " Sep 30 07:43:12 crc kubenswrapper[4760]: I0930 07:43:12.359723 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/2c4ca8ea-a714-40e5-9e10-080aef32237b-var-lib-openvswitch\") pod \"2c4ca8ea-a714-40e5-9e10-080aef32237b\" (UID: \"2c4ca8ea-a714-40e5-9e10-080aef32237b\") " Sep 30 07:43:12 crc kubenswrapper[4760]: I0930 07:43:12.359753 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/2c4ca8ea-a714-40e5-9e10-080aef32237b-node-log\") pod \"2c4ca8ea-a714-40e5-9e10-080aef32237b\" (UID: \"2c4ca8ea-a714-40e5-9e10-080aef32237b\") " Sep 30 07:43:12 crc kubenswrapper[4760]: I0930 07:43:12.359758 4760 operation_generator.go:803] UnmountVolume.TearDown 
succeeded for volume "kubernetes.io/host-path/2c4ca8ea-a714-40e5-9e10-080aef32237b-log-socket" (OuterVolumeSpecName: "log-socket") pod "2c4ca8ea-a714-40e5-9e10-080aef32237b" (UID: "2c4ca8ea-a714-40e5-9e10-080aef32237b"). InnerVolumeSpecName "log-socket". PluginName "kubernetes.io/host-path", VolumeGidValue "" Sep 30 07:43:12 crc kubenswrapper[4760]: I0930 07:43:12.359784 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/2c4ca8ea-a714-40e5-9e10-080aef32237b-host-slash\") pod \"2c4ca8ea-a714-40e5-9e10-080aef32237b\" (UID: \"2c4ca8ea-a714-40e5-9e10-080aef32237b\") " Sep 30 07:43:12 crc kubenswrapper[4760]: I0930 07:43:12.359831 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/2c4ca8ea-a714-40e5-9e10-080aef32237b-run-ovn\") pod \"2c4ca8ea-a714-40e5-9e10-080aef32237b\" (UID: \"2c4ca8ea-a714-40e5-9e10-080aef32237b\") " Sep 30 07:43:12 crc kubenswrapper[4760]: I0930 07:43:12.359868 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/2c4ca8ea-a714-40e5-9e10-080aef32237b-host-var-lib-cni-networks-ovn-kubernetes\") pod \"2c4ca8ea-a714-40e5-9e10-080aef32237b\" (UID: \"2c4ca8ea-a714-40e5-9e10-080aef32237b\") " Sep 30 07:43:12 crc kubenswrapper[4760]: I0930 07:43:12.359897 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/2c4ca8ea-a714-40e5-9e10-080aef32237b-host-run-ovn-kubernetes" (OuterVolumeSpecName: "host-run-ovn-kubernetes") pod "2c4ca8ea-a714-40e5-9e10-080aef32237b" (UID: "2c4ca8ea-a714-40e5-9e10-080aef32237b"). InnerVolumeSpecName "host-run-ovn-kubernetes". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Sep 30 07:43:12 crc kubenswrapper[4760]: I0930 07:43:12.359929 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lllfc\" (UniqueName: \"kubernetes.io/projected/2c4ca8ea-a714-40e5-9e10-080aef32237b-kube-api-access-lllfc\") pod \"2c4ca8ea-a714-40e5-9e10-080aef32237b\" (UID: \"2c4ca8ea-a714-40e5-9e10-080aef32237b\") " Sep 30 07:43:12 crc kubenswrapper[4760]: I0930 07:43:12.359947 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/2c4ca8ea-a714-40e5-9e10-080aef32237b-host-run-netns" (OuterVolumeSpecName: "host-run-netns") pod "2c4ca8ea-a714-40e5-9e10-080aef32237b" (UID: "2c4ca8ea-a714-40e5-9e10-080aef32237b"). InnerVolumeSpecName "host-run-netns". PluginName "kubernetes.io/host-path", VolumeGidValue "" Sep 30 07:43:12 crc kubenswrapper[4760]: I0930 07:43:12.359971 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/2c4ca8ea-a714-40e5-9e10-080aef32237b-systemd-units\") pod \"2c4ca8ea-a714-40e5-9e10-080aef32237b\" (UID: \"2c4ca8ea-a714-40e5-9e10-080aef32237b\") " Sep 30 07:43:12 crc kubenswrapper[4760]: I0930 07:43:12.360280 4760 reconciler_common.go:293] "Volume detached for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/2c4ca8ea-a714-40e5-9e10-080aef32237b-host-run-netns\") on node \"crc\" DevicePath \"\"" Sep 30 07:43:12 crc kubenswrapper[4760]: I0930 07:43:12.360392 4760 reconciler_common.go:293] "Volume detached for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/2c4ca8ea-a714-40e5-9e10-080aef32237b-host-run-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Sep 30 07:43:12 crc kubenswrapper[4760]: I0930 07:43:12.360433 4760 reconciler_common.go:293] "Volume detached for volume \"host-kubelet\" (UniqueName: 
\"kubernetes.io/host-path/2c4ca8ea-a714-40e5-9e10-080aef32237b-host-kubelet\") on node \"crc\" DevicePath \"\"" Sep 30 07:43:12 crc kubenswrapper[4760]: I0930 07:43:12.360456 4760 reconciler_common.go:293] "Volume detached for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/2c4ca8ea-a714-40e5-9e10-080aef32237b-log-socket\") on node \"crc\" DevicePath \"\"" Sep 30 07:43:12 crc kubenswrapper[4760]: I0930 07:43:12.359976 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/2c4ca8ea-a714-40e5-9e10-080aef32237b-host-cni-bin" (OuterVolumeSpecName: "host-cni-bin") pod "2c4ca8ea-a714-40e5-9e10-080aef32237b" (UID: "2c4ca8ea-a714-40e5-9e10-080aef32237b"). InnerVolumeSpecName "host-cni-bin". PluginName "kubernetes.io/host-path", VolumeGidValue "" Sep 30 07:43:12 crc kubenswrapper[4760]: I0930 07:43:12.359981 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/2c4ca8ea-a714-40e5-9e10-080aef32237b-host-cni-netd" (OuterVolumeSpecName: "host-cni-netd") pod "2c4ca8ea-a714-40e5-9e10-080aef32237b" (UID: "2c4ca8ea-a714-40e5-9e10-080aef32237b"). InnerVolumeSpecName "host-cni-netd". PluginName "kubernetes.io/host-path", VolumeGidValue "" Sep 30 07:43:12 crc kubenswrapper[4760]: I0930 07:43:12.360015 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2c4ca8ea-a714-40e5-9e10-080aef32237b-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "2c4ca8ea-a714-40e5-9e10-080aef32237b" (UID: "2c4ca8ea-a714-40e5-9e10-080aef32237b"). InnerVolumeSpecName "ovnkube-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 07:43:12 crc kubenswrapper[4760]: I0930 07:43:12.360492 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2c4ca8ea-a714-40e5-9e10-080aef32237b-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "2c4ca8ea-a714-40e5-9e10-080aef32237b" (UID: "2c4ca8ea-a714-40e5-9e10-080aef32237b"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 07:43:12 crc kubenswrapper[4760]: I0930 07:43:12.360033 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/2c4ca8ea-a714-40e5-9e10-080aef32237b-etc-openvswitch" (OuterVolumeSpecName: "etc-openvswitch") pod "2c4ca8ea-a714-40e5-9e10-080aef32237b" (UID: "2c4ca8ea-a714-40e5-9e10-080aef32237b"). InnerVolumeSpecName "etc-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Sep 30 07:43:12 crc kubenswrapper[4760]: I0930 07:43:12.360072 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/2c4ca8ea-a714-40e5-9e10-080aef32237b-run-openvswitch" (OuterVolumeSpecName: "run-openvswitch") pod "2c4ca8ea-a714-40e5-9e10-080aef32237b" (UID: "2c4ca8ea-a714-40e5-9e10-080aef32237b"). InnerVolumeSpecName "run-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Sep 30 07:43:12 crc kubenswrapper[4760]: I0930 07:43:12.360532 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2c4ca8ea-a714-40e5-9e10-080aef32237b-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "2c4ca8ea-a714-40e5-9e10-080aef32237b" (UID: "2c4ca8ea-a714-40e5-9e10-080aef32237b"). InnerVolumeSpecName "ovnkube-script-lib". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 07:43:12 crc kubenswrapper[4760]: I0930 07:43:12.360097 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/2c4ca8ea-a714-40e5-9e10-080aef32237b-run-ovn" (OuterVolumeSpecName: "run-ovn") pod "2c4ca8ea-a714-40e5-9e10-080aef32237b" (UID: "2c4ca8ea-a714-40e5-9e10-080aef32237b"). InnerVolumeSpecName "run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Sep 30 07:43:12 crc kubenswrapper[4760]: I0930 07:43:12.360115 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/2c4ca8ea-a714-40e5-9e10-080aef32237b-node-log" (OuterVolumeSpecName: "node-log") pod "2c4ca8ea-a714-40e5-9e10-080aef32237b" (UID: "2c4ca8ea-a714-40e5-9e10-080aef32237b"). InnerVolumeSpecName "node-log". PluginName "kubernetes.io/host-path", VolumeGidValue "" Sep 30 07:43:12 crc kubenswrapper[4760]: I0930 07:43:12.360127 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/2c4ca8ea-a714-40e5-9e10-080aef32237b-host-slash" (OuterVolumeSpecName: "host-slash") pod "2c4ca8ea-a714-40e5-9e10-080aef32237b" (UID: "2c4ca8ea-a714-40e5-9e10-080aef32237b"). InnerVolumeSpecName "host-slash". PluginName "kubernetes.io/host-path", VolumeGidValue "" Sep 30 07:43:12 crc kubenswrapper[4760]: I0930 07:43:12.360144 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/2c4ca8ea-a714-40e5-9e10-080aef32237b-var-lib-openvswitch" (OuterVolumeSpecName: "var-lib-openvswitch") pod "2c4ca8ea-a714-40e5-9e10-080aef32237b" (UID: "2c4ca8ea-a714-40e5-9e10-080aef32237b"). InnerVolumeSpecName "var-lib-openvswitch". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Sep 30 07:43:12 crc kubenswrapper[4760]: I0930 07:43:12.360188 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/2c4ca8ea-a714-40e5-9e10-080aef32237b-host-var-lib-cni-networks-ovn-kubernetes" (OuterVolumeSpecName: "host-var-lib-cni-networks-ovn-kubernetes") pod "2c4ca8ea-a714-40e5-9e10-080aef32237b" (UID: "2c4ca8ea-a714-40e5-9e10-080aef32237b"). InnerVolumeSpecName "host-var-lib-cni-networks-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue "" Sep 30 07:43:12 crc kubenswrapper[4760]: I0930 07:43:12.360229 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/2c4ca8ea-a714-40e5-9e10-080aef32237b-systemd-units" (OuterVolumeSpecName: "systemd-units") pod "2c4ca8ea-a714-40e5-9e10-080aef32237b" (UID: "2c4ca8ea-a714-40e5-9e10-080aef32237b"). InnerVolumeSpecName "systemd-units". PluginName "kubernetes.io/host-path", VolumeGidValue "" Sep 30 07:43:12 crc kubenswrapper[4760]: I0930 07:43:12.369150 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2c4ca8ea-a714-40e5-9e10-080aef32237b-kube-api-access-lllfc" (OuterVolumeSpecName: "kube-api-access-lllfc") pod "2c4ca8ea-a714-40e5-9e10-080aef32237b" (UID: "2c4ca8ea-a714-40e5-9e10-080aef32237b"). InnerVolumeSpecName "kube-api-access-lllfc". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 07:43:12 crc kubenswrapper[4760]: I0930 07:43:12.370652 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2c4ca8ea-a714-40e5-9e10-080aef32237b-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "2c4ca8ea-a714-40e5-9e10-080aef32237b" (UID: "2c4ca8ea-a714-40e5-9e10-080aef32237b"). InnerVolumeSpecName "ovn-node-metrics-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 07:43:12 crc kubenswrapper[4760]: I0930 07:43:12.372844 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-blxw5"] Sep 30 07:43:12 crc kubenswrapper[4760]: E0930 07:43:12.373068 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2c4ca8ea-a714-40e5-9e10-080aef32237b" containerName="ovnkube-controller" Sep 30 07:43:12 crc kubenswrapper[4760]: I0930 07:43:12.373096 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="2c4ca8ea-a714-40e5-9e10-080aef32237b" containerName="ovnkube-controller" Sep 30 07:43:12 crc kubenswrapper[4760]: E0930 07:43:12.373120 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2c4ca8ea-a714-40e5-9e10-080aef32237b" containerName="kube-rbac-proxy-ovn-metrics" Sep 30 07:43:12 crc kubenswrapper[4760]: I0930 07:43:12.373128 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="2c4ca8ea-a714-40e5-9e10-080aef32237b" containerName="kube-rbac-proxy-ovn-metrics" Sep 30 07:43:12 crc kubenswrapper[4760]: E0930 07:43:12.373142 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2c4ca8ea-a714-40e5-9e10-080aef32237b" containerName="ovn-acl-logging" Sep 30 07:43:12 crc kubenswrapper[4760]: I0930 07:43:12.373150 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="2c4ca8ea-a714-40e5-9e10-080aef32237b" containerName="ovn-acl-logging" Sep 30 07:43:12 crc kubenswrapper[4760]: E0930 07:43:12.373163 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2c4ca8ea-a714-40e5-9e10-080aef32237b" containerName="nbdb" Sep 30 07:43:12 crc kubenswrapper[4760]: I0930 07:43:12.373172 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="2c4ca8ea-a714-40e5-9e10-080aef32237b" containerName="nbdb" Sep 30 07:43:12 crc kubenswrapper[4760]: E0930 07:43:12.373184 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2c4ca8ea-a714-40e5-9e10-080aef32237b" 
containerName="kube-rbac-proxy-node" Sep 30 07:43:12 crc kubenswrapper[4760]: I0930 07:43:12.373192 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="2c4ca8ea-a714-40e5-9e10-080aef32237b" containerName="kube-rbac-proxy-node" Sep 30 07:43:12 crc kubenswrapper[4760]: E0930 07:43:12.373205 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2c4ca8ea-a714-40e5-9e10-080aef32237b" containerName="ovn-controller" Sep 30 07:43:12 crc kubenswrapper[4760]: I0930 07:43:12.373213 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="2c4ca8ea-a714-40e5-9e10-080aef32237b" containerName="ovn-controller" Sep 30 07:43:12 crc kubenswrapper[4760]: E0930 07:43:12.373222 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2c4ca8ea-a714-40e5-9e10-080aef32237b" containerName="ovnkube-controller" Sep 30 07:43:12 crc kubenswrapper[4760]: I0930 07:43:12.373229 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="2c4ca8ea-a714-40e5-9e10-080aef32237b" containerName="ovnkube-controller" Sep 30 07:43:12 crc kubenswrapper[4760]: E0930 07:43:12.373246 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2c4ca8ea-a714-40e5-9e10-080aef32237b" containerName="northd" Sep 30 07:43:12 crc kubenswrapper[4760]: I0930 07:43:12.373272 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="2c4ca8ea-a714-40e5-9e10-080aef32237b" containerName="northd" Sep 30 07:43:12 crc kubenswrapper[4760]: E0930 07:43:12.373285 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2c4ca8ea-a714-40e5-9e10-080aef32237b" containerName="sbdb" Sep 30 07:43:12 crc kubenswrapper[4760]: I0930 07:43:12.373294 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="2c4ca8ea-a714-40e5-9e10-080aef32237b" containerName="sbdb" Sep 30 07:43:12 crc kubenswrapper[4760]: E0930 07:43:12.373302 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2c4ca8ea-a714-40e5-9e10-080aef32237b" containerName="ovnkube-controller" Sep 
30 07:43:12 crc kubenswrapper[4760]: I0930 07:43:12.373332 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="2c4ca8ea-a714-40e5-9e10-080aef32237b" containerName="ovnkube-controller" Sep 30 07:43:12 crc kubenswrapper[4760]: E0930 07:43:12.373341 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2c4ca8ea-a714-40e5-9e10-080aef32237b" containerName="kubecfg-setup" Sep 30 07:43:12 crc kubenswrapper[4760]: I0930 07:43:12.373351 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="2c4ca8ea-a714-40e5-9e10-080aef32237b" containerName="kubecfg-setup" Sep 30 07:43:12 crc kubenswrapper[4760]: I0930 07:43:12.373465 4760 memory_manager.go:354] "RemoveStaleState removing state" podUID="2c4ca8ea-a714-40e5-9e10-080aef32237b" containerName="ovn-controller" Sep 30 07:43:12 crc kubenswrapper[4760]: I0930 07:43:12.373477 4760 memory_manager.go:354] "RemoveStaleState removing state" podUID="2c4ca8ea-a714-40e5-9e10-080aef32237b" containerName="ovnkube-controller" Sep 30 07:43:12 crc kubenswrapper[4760]: I0930 07:43:12.373487 4760 memory_manager.go:354] "RemoveStaleState removing state" podUID="2c4ca8ea-a714-40e5-9e10-080aef32237b" containerName="ovnkube-controller" Sep 30 07:43:12 crc kubenswrapper[4760]: I0930 07:43:12.373496 4760 memory_manager.go:354] "RemoveStaleState removing state" podUID="2c4ca8ea-a714-40e5-9e10-080aef32237b" containerName="sbdb" Sep 30 07:43:12 crc kubenswrapper[4760]: I0930 07:43:12.373509 4760 memory_manager.go:354] "RemoveStaleState removing state" podUID="2c4ca8ea-a714-40e5-9e10-080aef32237b" containerName="ovnkube-controller" Sep 30 07:43:12 crc kubenswrapper[4760]: I0930 07:43:12.373521 4760 memory_manager.go:354] "RemoveStaleState removing state" podUID="2c4ca8ea-a714-40e5-9e10-080aef32237b" containerName="ovnkube-controller" Sep 30 07:43:12 crc kubenswrapper[4760]: I0930 07:43:12.373531 4760 memory_manager.go:354] "RemoveStaleState removing state" podUID="2c4ca8ea-a714-40e5-9e10-080aef32237b" 
containerName="kube-rbac-proxy-node" Sep 30 07:43:12 crc kubenswrapper[4760]: I0930 07:43:12.373543 4760 memory_manager.go:354] "RemoveStaleState removing state" podUID="2c4ca8ea-a714-40e5-9e10-080aef32237b" containerName="northd" Sep 30 07:43:12 crc kubenswrapper[4760]: I0930 07:43:12.373556 4760 memory_manager.go:354] "RemoveStaleState removing state" podUID="2c4ca8ea-a714-40e5-9e10-080aef32237b" containerName="nbdb" Sep 30 07:43:12 crc kubenswrapper[4760]: I0930 07:43:12.373564 4760 memory_manager.go:354] "RemoveStaleState removing state" podUID="2c4ca8ea-a714-40e5-9e10-080aef32237b" containerName="ovn-acl-logging" Sep 30 07:43:12 crc kubenswrapper[4760]: I0930 07:43:12.373573 4760 memory_manager.go:354] "RemoveStaleState removing state" podUID="2c4ca8ea-a714-40e5-9e10-080aef32237b" containerName="kube-rbac-proxy-ovn-metrics" Sep 30 07:43:12 crc kubenswrapper[4760]: E0930 07:43:12.373699 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2c4ca8ea-a714-40e5-9e10-080aef32237b" containerName="ovnkube-controller" Sep 30 07:43:12 crc kubenswrapper[4760]: I0930 07:43:12.373710 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="2c4ca8ea-a714-40e5-9e10-080aef32237b" containerName="ovnkube-controller" Sep 30 07:43:12 crc kubenswrapper[4760]: E0930 07:43:12.373721 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2c4ca8ea-a714-40e5-9e10-080aef32237b" containerName="ovnkube-controller" Sep 30 07:43:12 crc kubenswrapper[4760]: I0930 07:43:12.373729 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="2c4ca8ea-a714-40e5-9e10-080aef32237b" containerName="ovnkube-controller" Sep 30 07:43:12 crc kubenswrapper[4760]: I0930 07:43:12.373846 4760 memory_manager.go:354] "RemoveStaleState removing state" podUID="2c4ca8ea-a714-40e5-9e10-080aef32237b" containerName="ovnkube-controller" Sep 30 07:43:12 crc kubenswrapper[4760]: I0930 07:43:12.377773 4760 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-blxw5" Sep 30 07:43:12 crc kubenswrapper[4760]: I0930 07:43:12.386975 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/2c4ca8ea-a714-40e5-9e10-080aef32237b-run-systemd" (OuterVolumeSpecName: "run-systemd") pod "2c4ca8ea-a714-40e5-9e10-080aef32237b" (UID: "2c4ca8ea-a714-40e5-9e10-080aef32237b"). InnerVolumeSpecName "run-systemd". PluginName "kubernetes.io/host-path", VolumeGidValue "" Sep 30 07:43:12 crc kubenswrapper[4760]: I0930 07:43:12.428027 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-sspvl_2c4ca8ea-a714-40e5-9e10-080aef32237b/ovnkube-controller/3.log" Sep 30 07:43:12 crc kubenswrapper[4760]: I0930 07:43:12.430114 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-sspvl_2c4ca8ea-a714-40e5-9e10-080aef32237b/ovn-acl-logging/0.log" Sep 30 07:43:12 crc kubenswrapper[4760]: I0930 07:43:12.430617 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-sspvl_2c4ca8ea-a714-40e5-9e10-080aef32237b/ovn-controller/0.log" Sep 30 07:43:12 crc kubenswrapper[4760]: I0930 07:43:12.431117 4760 generic.go:334] "Generic (PLEG): container finished" podID="2c4ca8ea-a714-40e5-9e10-080aef32237b" containerID="3d9205e4d676fa6a90c4cef893077a6bc91d50564b15f3b221e5827c2b3025ce" exitCode=0 Sep 30 07:43:12 crc kubenswrapper[4760]: I0930 07:43:12.431215 4760 generic.go:334] "Generic (PLEG): container finished" podID="2c4ca8ea-a714-40e5-9e10-080aef32237b" containerID="6e72f37942962598d925dfb305311c94c3ca920f93243bb6e97839add4a830a0" exitCode=0 Sep 30 07:43:12 crc kubenswrapper[4760]: I0930 07:43:12.431263 4760 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-sspvl" Sep 30 07:43:12 crc kubenswrapper[4760]: I0930 07:43:12.431267 4760 generic.go:334] "Generic (PLEG): container finished" podID="2c4ca8ea-a714-40e5-9e10-080aef32237b" containerID="0b6c1fe1d21cc0d118e36d7f9d62fb639a19a42cd8c953700bcd5c289088cdf9" exitCode=0 Sep 30 07:43:12 crc kubenswrapper[4760]: I0930 07:43:12.431373 4760 generic.go:334] "Generic (PLEG): container finished" podID="2c4ca8ea-a714-40e5-9e10-080aef32237b" containerID="bba81e98fd1a75081b7e6a3eaf28b88b4de6e47e1dc158ca72aa2f1c2107933e" exitCode=0 Sep 30 07:43:12 crc kubenswrapper[4760]: I0930 07:43:12.431409 4760 generic.go:334] "Generic (PLEG): container finished" podID="2c4ca8ea-a714-40e5-9e10-080aef32237b" containerID="9e64e8566880e723a6cc19ad788bd1cbb02eb639397aea4acd6e9920b034f379" exitCode=0 Sep 30 07:43:12 crc kubenswrapper[4760]: I0930 07:43:12.431185 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-sspvl" event={"ID":"2c4ca8ea-a714-40e5-9e10-080aef32237b","Type":"ContainerDied","Data":"3d9205e4d676fa6a90c4cef893077a6bc91d50564b15f3b221e5827c2b3025ce"} Sep 30 07:43:12 crc kubenswrapper[4760]: I0930 07:43:12.431450 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-sspvl" event={"ID":"2c4ca8ea-a714-40e5-9e10-080aef32237b","Type":"ContainerDied","Data":"6e72f37942962598d925dfb305311c94c3ca920f93243bb6e97839add4a830a0"} Sep 30 07:43:12 crc kubenswrapper[4760]: I0930 07:43:12.431469 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-sspvl" event={"ID":"2c4ca8ea-a714-40e5-9e10-080aef32237b","Type":"ContainerDied","Data":"0b6c1fe1d21cc0d118e36d7f9d62fb639a19a42cd8c953700bcd5c289088cdf9"} Sep 30 07:43:12 crc kubenswrapper[4760]: I0930 07:43:12.431422 4760 generic.go:334] "Generic (PLEG): container finished" podID="2c4ca8ea-a714-40e5-9e10-080aef32237b" 
containerID="04cb9ff2859a579a0bd97d932177b8c19b786666d0177669e93ccec38abf08d5" exitCode=0 Sep 30 07:43:12 crc kubenswrapper[4760]: I0930 07:43:12.431503 4760 generic.go:334] "Generic (PLEG): container finished" podID="2c4ca8ea-a714-40e5-9e10-080aef32237b" containerID="45d8c685d7b690ce81553e4d924934cf761087dd535145c0e8b5a933f2d61e45" exitCode=143 Sep 30 07:43:12 crc kubenswrapper[4760]: I0930 07:43:12.431507 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-sspvl" event={"ID":"2c4ca8ea-a714-40e5-9e10-080aef32237b","Type":"ContainerDied","Data":"bba81e98fd1a75081b7e6a3eaf28b88b4de6e47e1dc158ca72aa2f1c2107933e"} Sep 30 07:43:12 crc kubenswrapper[4760]: I0930 07:43:12.431529 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-sspvl" event={"ID":"2c4ca8ea-a714-40e5-9e10-080aef32237b","Type":"ContainerDied","Data":"9e64e8566880e723a6cc19ad788bd1cbb02eb639397aea4acd6e9920b034f379"} Sep 30 07:43:12 crc kubenswrapper[4760]: I0930 07:43:12.431513 4760 generic.go:334] "Generic (PLEG): container finished" podID="2c4ca8ea-a714-40e5-9e10-080aef32237b" containerID="ef1f6b6ca1a2bbb2f9a166a7244c6411470be4655bc6b2a330fd604c3449d185" exitCode=143 Sep 30 07:43:12 crc kubenswrapper[4760]: I0930 07:43:12.431542 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-sspvl" event={"ID":"2c4ca8ea-a714-40e5-9e10-080aef32237b","Type":"ContainerDied","Data":"04cb9ff2859a579a0bd97d932177b8c19b786666d0177669e93ccec38abf08d5"} Sep 30 07:43:12 crc kubenswrapper[4760]: I0930 07:43:12.431556 4760 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"9003b7a7d50bae8b7464fd7cbf7f18dcc7a60af1c5297a18e2f9f8482d161912"} Sep 30 07:43:12 crc kubenswrapper[4760]: I0930 07:43:12.431595 4760 pod_container_deletor.go:114] "Failed to issue the request to remove container" 
containerID={"Type":"cri-o","ID":"6e72f37942962598d925dfb305311c94c3ca920f93243bb6e97839add4a830a0"} Sep 30 07:43:12 crc kubenswrapper[4760]: I0930 07:43:12.431604 4760 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"0b6c1fe1d21cc0d118e36d7f9d62fb639a19a42cd8c953700bcd5c289088cdf9"} Sep 30 07:43:12 crc kubenswrapper[4760]: I0930 07:43:12.431611 4760 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"bba81e98fd1a75081b7e6a3eaf28b88b4de6e47e1dc158ca72aa2f1c2107933e"} Sep 30 07:43:12 crc kubenswrapper[4760]: I0930 07:43:12.431618 4760 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"9e64e8566880e723a6cc19ad788bd1cbb02eb639397aea4acd6e9920b034f379"} Sep 30 07:43:12 crc kubenswrapper[4760]: I0930 07:43:12.431626 4760 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"04cb9ff2859a579a0bd97d932177b8c19b786666d0177669e93ccec38abf08d5"} Sep 30 07:43:12 crc kubenswrapper[4760]: I0930 07:43:12.431633 4760 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"45d8c685d7b690ce81553e4d924934cf761087dd535145c0e8b5a933f2d61e45"} Sep 30 07:43:12 crc kubenswrapper[4760]: I0930 07:43:12.431641 4760 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"ef1f6b6ca1a2bbb2f9a166a7244c6411470be4655bc6b2a330fd604c3449d185"} Sep 30 07:43:12 crc kubenswrapper[4760]: I0930 07:43:12.431648 4760 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"1198d13e2814e4dc38c4411d751fba40440a5967d7ca04c855e34fe90f9a6535"} Sep 30 07:43:12 crc kubenswrapper[4760]: I0930 07:43:12.431684 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-ovn-kubernetes/ovnkube-node-sspvl" event={"ID":"2c4ca8ea-a714-40e5-9e10-080aef32237b","Type":"ContainerDied","Data":"45d8c685d7b690ce81553e4d924934cf761087dd535145c0e8b5a933f2d61e45"} Sep 30 07:43:12 crc kubenswrapper[4760]: I0930 07:43:12.431698 4760 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"3d9205e4d676fa6a90c4cef893077a6bc91d50564b15f3b221e5827c2b3025ce"} Sep 30 07:43:12 crc kubenswrapper[4760]: I0930 07:43:12.431706 4760 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"9003b7a7d50bae8b7464fd7cbf7f18dcc7a60af1c5297a18e2f9f8482d161912"} Sep 30 07:43:12 crc kubenswrapper[4760]: I0930 07:43:12.431713 4760 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"6e72f37942962598d925dfb305311c94c3ca920f93243bb6e97839add4a830a0"} Sep 30 07:43:12 crc kubenswrapper[4760]: I0930 07:43:12.431720 4760 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"0b6c1fe1d21cc0d118e36d7f9d62fb639a19a42cd8c953700bcd5c289088cdf9"} Sep 30 07:43:12 crc kubenswrapper[4760]: I0930 07:43:12.431727 4760 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"bba81e98fd1a75081b7e6a3eaf28b88b4de6e47e1dc158ca72aa2f1c2107933e"} Sep 30 07:43:12 crc kubenswrapper[4760]: I0930 07:43:12.431759 4760 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"9e64e8566880e723a6cc19ad788bd1cbb02eb639397aea4acd6e9920b034f379"} Sep 30 07:43:12 crc kubenswrapper[4760]: I0930 07:43:12.431769 4760 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"04cb9ff2859a579a0bd97d932177b8c19b786666d0177669e93ccec38abf08d5"} Sep 30 07:43:12 crc kubenswrapper[4760]: I0930 
07:43:12.431777 4760 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"45d8c685d7b690ce81553e4d924934cf761087dd535145c0e8b5a933f2d61e45"} Sep 30 07:43:12 crc kubenswrapper[4760]: I0930 07:43:12.431785 4760 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"ef1f6b6ca1a2bbb2f9a166a7244c6411470be4655bc6b2a330fd604c3449d185"} Sep 30 07:43:12 crc kubenswrapper[4760]: I0930 07:43:12.431792 4760 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"1198d13e2814e4dc38c4411d751fba40440a5967d7ca04c855e34fe90f9a6535"} Sep 30 07:43:12 crc kubenswrapper[4760]: I0930 07:43:12.431802 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-sspvl" event={"ID":"2c4ca8ea-a714-40e5-9e10-080aef32237b","Type":"ContainerDied","Data":"ef1f6b6ca1a2bbb2f9a166a7244c6411470be4655bc6b2a330fd604c3449d185"} Sep 30 07:43:12 crc kubenswrapper[4760]: I0930 07:43:12.431814 4760 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"3d9205e4d676fa6a90c4cef893077a6bc91d50564b15f3b221e5827c2b3025ce"} Sep 30 07:43:12 crc kubenswrapper[4760]: I0930 07:43:12.431848 4760 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"9003b7a7d50bae8b7464fd7cbf7f18dcc7a60af1c5297a18e2f9f8482d161912"} Sep 30 07:43:12 crc kubenswrapper[4760]: I0930 07:43:12.431855 4760 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"6e72f37942962598d925dfb305311c94c3ca920f93243bb6e97839add4a830a0"} Sep 30 07:43:12 crc kubenswrapper[4760]: I0930 07:43:12.431862 4760 pod_container_deletor.go:114] "Failed to issue the request to remove container" 
containerID={"Type":"cri-o","ID":"0b6c1fe1d21cc0d118e36d7f9d62fb639a19a42cd8c953700bcd5c289088cdf9"} Sep 30 07:43:12 crc kubenswrapper[4760]: I0930 07:43:12.431869 4760 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"bba81e98fd1a75081b7e6a3eaf28b88b4de6e47e1dc158ca72aa2f1c2107933e"} Sep 30 07:43:12 crc kubenswrapper[4760]: I0930 07:43:12.431876 4760 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"9e64e8566880e723a6cc19ad788bd1cbb02eb639397aea4acd6e9920b034f379"} Sep 30 07:43:12 crc kubenswrapper[4760]: I0930 07:43:12.431883 4760 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"04cb9ff2859a579a0bd97d932177b8c19b786666d0177669e93ccec38abf08d5"} Sep 30 07:43:12 crc kubenswrapper[4760]: I0930 07:43:12.431889 4760 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"45d8c685d7b690ce81553e4d924934cf761087dd535145c0e8b5a933f2d61e45"} Sep 30 07:43:12 crc kubenswrapper[4760]: I0930 07:43:12.431896 4760 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"ef1f6b6ca1a2bbb2f9a166a7244c6411470be4655bc6b2a330fd604c3449d185"} Sep 30 07:43:12 crc kubenswrapper[4760]: I0930 07:43:12.431902 4760 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"1198d13e2814e4dc38c4411d751fba40440a5967d7ca04c855e34fe90f9a6535"} Sep 30 07:43:12 crc kubenswrapper[4760]: I0930 07:43:12.431913 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-sspvl" event={"ID":"2c4ca8ea-a714-40e5-9e10-080aef32237b","Type":"ContainerDied","Data":"c846605e3915d29b64cf72abf9f7642ff689799c187453d57ca92d5f01aeea8e"} Sep 30 07:43:12 crc kubenswrapper[4760]: I0930 07:43:12.431925 4760 
pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"3d9205e4d676fa6a90c4cef893077a6bc91d50564b15f3b221e5827c2b3025ce"} Sep 30 07:43:12 crc kubenswrapper[4760]: I0930 07:43:12.431934 4760 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"9003b7a7d50bae8b7464fd7cbf7f18dcc7a60af1c5297a18e2f9f8482d161912"} Sep 30 07:43:12 crc kubenswrapper[4760]: I0930 07:43:12.431942 4760 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"6e72f37942962598d925dfb305311c94c3ca920f93243bb6e97839add4a830a0"} Sep 30 07:43:12 crc kubenswrapper[4760]: I0930 07:43:12.431949 4760 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"0b6c1fe1d21cc0d118e36d7f9d62fb639a19a42cd8c953700bcd5c289088cdf9"} Sep 30 07:43:12 crc kubenswrapper[4760]: I0930 07:43:12.431956 4760 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"bba81e98fd1a75081b7e6a3eaf28b88b4de6e47e1dc158ca72aa2f1c2107933e"} Sep 30 07:43:12 crc kubenswrapper[4760]: I0930 07:43:12.431963 4760 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"9e64e8566880e723a6cc19ad788bd1cbb02eb639397aea4acd6e9920b034f379"} Sep 30 07:43:12 crc kubenswrapper[4760]: I0930 07:43:12.431970 4760 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"04cb9ff2859a579a0bd97d932177b8c19b786666d0177669e93ccec38abf08d5"} Sep 30 07:43:12 crc kubenswrapper[4760]: I0930 07:43:12.431977 4760 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"45d8c685d7b690ce81553e4d924934cf761087dd535145c0e8b5a933f2d61e45"} Sep 30 07:43:12 crc kubenswrapper[4760]: I0930 07:43:12.431983 4760 
pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"ef1f6b6ca1a2bbb2f9a166a7244c6411470be4655bc6b2a330fd604c3449d185"} Sep 30 07:43:12 crc kubenswrapper[4760]: I0930 07:43:12.431990 4760 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"1198d13e2814e4dc38c4411d751fba40440a5967d7ca04c855e34fe90f9a6535"} Sep 30 07:43:12 crc kubenswrapper[4760]: I0930 07:43:12.431529 4760 scope.go:117] "RemoveContainer" containerID="3d9205e4d676fa6a90c4cef893077a6bc91d50564b15f3b221e5827c2b3025ce" Sep 30 07:43:12 crc kubenswrapper[4760]: I0930 07:43:12.435854 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-lvdpk_f50c364e-d22c-4fe5-a0aa-66f4e8d8b21e/kube-multus/2.log" Sep 30 07:43:12 crc kubenswrapper[4760]: I0930 07:43:12.436538 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-lvdpk_f50c364e-d22c-4fe5-a0aa-66f4e8d8b21e/kube-multus/1.log" Sep 30 07:43:12 crc kubenswrapper[4760]: I0930 07:43:12.436684 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-lvdpk" event={"ID":"f50c364e-d22c-4fe5-a0aa-66f4e8d8b21e","Type":"ContainerDied","Data":"2db52c47db3f1a41355726d96c0fc8510bc80589120da7e96b2b0af67aecea6a"} Sep 30 07:43:12 crc kubenswrapper[4760]: I0930 07:43:12.436720 4760 generic.go:334] "Generic (PLEG): container finished" podID="f50c364e-d22c-4fe5-a0aa-66f4e8d8b21e" containerID="2db52c47db3f1a41355726d96c0fc8510bc80589120da7e96b2b0af67aecea6a" exitCode=2 Sep 30 07:43:12 crc kubenswrapper[4760]: I0930 07:43:12.436733 4760 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"0b2f3cfbeb8083c685469a0d988253e1fd9c2403954dda3cc742b87225c82927"} Sep 30 07:43:12 crc kubenswrapper[4760]: I0930 07:43:12.437201 4760 scope.go:117] "RemoveContainer" 
containerID="2db52c47db3f1a41355726d96c0fc8510bc80589120da7e96b2b0af67aecea6a" Sep 30 07:43:12 crc kubenswrapper[4760]: E0930 07:43:12.437491 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-multus pod=multus-lvdpk_openshift-multus(f50c364e-d22c-4fe5-a0aa-66f4e8d8b21e)\"" pod="openshift-multus/multus-lvdpk" podUID="f50c364e-d22c-4fe5-a0aa-66f4e8d8b21e" Sep 30 07:43:12 crc kubenswrapper[4760]: I0930 07:43:12.458374 4760 scope.go:117] "RemoveContainer" containerID="9003b7a7d50bae8b7464fd7cbf7f18dcc7a60af1c5297a18e2f9f8482d161912" Sep 30 07:43:12 crc kubenswrapper[4760]: I0930 07:43:12.466536 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/f7889829-88ef-4d39-a409-e69a4c112178-host-run-netns\") pod \"ovnkube-node-blxw5\" (UID: \"f7889829-88ef-4d39-a409-e69a4c112178\") " pod="openshift-ovn-kubernetes/ovnkube-node-blxw5" Sep 30 07:43:12 crc kubenswrapper[4760]: I0930 07:43:12.466656 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/f7889829-88ef-4d39-a409-e69a4c112178-run-systemd\") pod \"ovnkube-node-blxw5\" (UID: \"f7889829-88ef-4d39-a409-e69a4c112178\") " pod="openshift-ovn-kubernetes/ovnkube-node-blxw5" Sep 30 07:43:12 crc kubenswrapper[4760]: I0930 07:43:12.466731 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f7889829-88ef-4d39-a409-e69a4c112178-host-run-ovn-kubernetes\") pod \"ovnkube-node-blxw5\" (UID: \"f7889829-88ef-4d39-a409-e69a4c112178\") " pod="openshift-ovn-kubernetes/ovnkube-node-blxw5" Sep 30 07:43:12 crc kubenswrapper[4760]: I0930 07:43:12.466883 4760 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/f7889829-88ef-4d39-a409-e69a4c112178-host-cni-netd\") pod \"ovnkube-node-blxw5\" (UID: \"f7889829-88ef-4d39-a409-e69a4c112178\") " pod="openshift-ovn-kubernetes/ovnkube-node-blxw5" Sep 30 07:43:12 crc kubenswrapper[4760]: I0930 07:43:12.467052 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b6wrq\" (UniqueName: \"kubernetes.io/projected/f7889829-88ef-4d39-a409-e69a4c112178-kube-api-access-b6wrq\") pod \"ovnkube-node-blxw5\" (UID: \"f7889829-88ef-4d39-a409-e69a4c112178\") " pod="openshift-ovn-kubernetes/ovnkube-node-blxw5" Sep 30 07:43:12 crc kubenswrapper[4760]: I0930 07:43:12.467207 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/f7889829-88ef-4d39-a409-e69a4c112178-env-overrides\") pod \"ovnkube-node-blxw5\" (UID: \"f7889829-88ef-4d39-a409-e69a4c112178\") " pod="openshift-ovn-kubernetes/ovnkube-node-blxw5" Sep 30 07:43:12 crc kubenswrapper[4760]: I0930 07:43:12.467452 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/f7889829-88ef-4d39-a409-e69a4c112178-host-kubelet\") pod \"ovnkube-node-blxw5\" (UID: \"f7889829-88ef-4d39-a409-e69a4c112178\") " pod="openshift-ovn-kubernetes/ovnkube-node-blxw5" Sep 30 07:43:12 crc kubenswrapper[4760]: I0930 07:43:12.467620 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/f7889829-88ef-4d39-a409-e69a4c112178-systemd-units\") pod \"ovnkube-node-blxw5\" (UID: \"f7889829-88ef-4d39-a409-e69a4c112178\") " pod="openshift-ovn-kubernetes/ovnkube-node-blxw5" Sep 30 07:43:12 crc kubenswrapper[4760]: I0930 07:43:12.467798 4760 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/f7889829-88ef-4d39-a409-e69a4c112178-ovn-node-metrics-cert\") pod \"ovnkube-node-blxw5\" (UID: \"f7889829-88ef-4d39-a409-e69a4c112178\") " pod="openshift-ovn-kubernetes/ovnkube-node-blxw5" Sep 30 07:43:12 crc kubenswrapper[4760]: I0930 07:43:12.467962 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/f7889829-88ef-4d39-a409-e69a4c112178-ovnkube-script-lib\") pod \"ovnkube-node-blxw5\" (UID: \"f7889829-88ef-4d39-a409-e69a4c112178\") " pod="openshift-ovn-kubernetes/ovnkube-node-blxw5" Sep 30 07:43:12 crc kubenswrapper[4760]: I0930 07:43:12.468173 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/f7889829-88ef-4d39-a409-e69a4c112178-run-ovn\") pod \"ovnkube-node-blxw5\" (UID: \"f7889829-88ef-4d39-a409-e69a4c112178\") " pod="openshift-ovn-kubernetes/ovnkube-node-blxw5" Sep 30 07:43:12 crc kubenswrapper[4760]: I0930 07:43:12.468400 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f7889829-88ef-4d39-a409-e69a4c112178-var-lib-openvswitch\") pod \"ovnkube-node-blxw5\" (UID: \"f7889829-88ef-4d39-a409-e69a4c112178\") " pod="openshift-ovn-kubernetes/ovnkube-node-blxw5" Sep 30 07:43:12 crc kubenswrapper[4760]: I0930 07:43:12.468600 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/f7889829-88ef-4d39-a409-e69a4c112178-log-socket\") pod \"ovnkube-node-blxw5\" (UID: \"f7889829-88ef-4d39-a409-e69a4c112178\") " pod="openshift-ovn-kubernetes/ovnkube-node-blxw5" Sep 30 07:43:12 crc kubenswrapper[4760]: 
I0930 07:43:12.468720 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/f7889829-88ef-4d39-a409-e69a4c112178-host-slash\") pod \"ovnkube-node-blxw5\" (UID: \"f7889829-88ef-4d39-a409-e69a4c112178\") " pod="openshift-ovn-kubernetes/ovnkube-node-blxw5" Sep 30 07:43:12 crc kubenswrapper[4760]: I0930 07:43:12.468858 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/f7889829-88ef-4d39-a409-e69a4c112178-host-cni-bin\") pod \"ovnkube-node-blxw5\" (UID: \"f7889829-88ef-4d39-a409-e69a4c112178\") " pod="openshift-ovn-kubernetes/ovnkube-node-blxw5" Sep 30 07:43:12 crc kubenswrapper[4760]: I0930 07:43:12.468959 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f7889829-88ef-4d39-a409-e69a4c112178-run-openvswitch\") pod \"ovnkube-node-blxw5\" (UID: \"f7889829-88ef-4d39-a409-e69a4c112178\") " pod="openshift-ovn-kubernetes/ovnkube-node-blxw5" Sep 30 07:43:12 crc kubenswrapper[4760]: I0930 07:43:12.469071 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f7889829-88ef-4d39-a409-e69a4c112178-etc-openvswitch\") pod \"ovnkube-node-blxw5\" (UID: \"f7889829-88ef-4d39-a409-e69a4c112178\") " pod="openshift-ovn-kubernetes/ovnkube-node-blxw5" Sep 30 07:43:12 crc kubenswrapper[4760]: I0930 07:43:12.469179 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/f7889829-88ef-4d39-a409-e69a4c112178-node-log\") pod \"ovnkube-node-blxw5\" (UID: \"f7889829-88ef-4d39-a409-e69a4c112178\") " pod="openshift-ovn-kubernetes/ovnkube-node-blxw5" Sep 30 07:43:12 crc kubenswrapper[4760]: 
I0930 07:43:12.469223 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/f7889829-88ef-4d39-a409-e69a4c112178-ovnkube-config\") pod \"ovnkube-node-blxw5\" (UID: \"f7889829-88ef-4d39-a409-e69a4c112178\") " pod="openshift-ovn-kubernetes/ovnkube-node-blxw5" Sep 30 07:43:12 crc kubenswrapper[4760]: I0930 07:43:12.469531 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f7889829-88ef-4d39-a409-e69a4c112178-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-blxw5\" (UID: \"f7889829-88ef-4d39-a409-e69a4c112178\") " pod="openshift-ovn-kubernetes/ovnkube-node-blxw5" Sep 30 07:43:12 crc kubenswrapper[4760]: I0930 07:43:12.471270 4760 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/2c4ca8ea-a714-40e5-9e10-080aef32237b-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Sep 30 07:43:12 crc kubenswrapper[4760]: I0930 07:43:12.471300 4760 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/2c4ca8ea-a714-40e5-9e10-080aef32237b-env-overrides\") on node \"crc\" DevicePath \"\"" Sep 30 07:43:12 crc kubenswrapper[4760]: I0930 07:43:12.471331 4760 reconciler_common.go:293] "Volume detached for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/2c4ca8ea-a714-40e5-9e10-080aef32237b-host-cni-netd\") on node \"crc\" DevicePath \"\"" Sep 30 07:43:12 crc kubenswrapper[4760]: I0930 07:43:12.471348 4760 reconciler_common.go:293] "Volume detached for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/2c4ca8ea-a714-40e5-9e10-080aef32237b-var-lib-openvswitch\") on node \"crc\" DevicePath \"\"" Sep 30 07:43:12 crc kubenswrapper[4760]: I0930 07:43:12.471366 4760 reconciler_common.go:293] 
"Volume detached for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/2c4ca8ea-a714-40e5-9e10-080aef32237b-node-log\") on node \"crc\" DevicePath \"\"" Sep 30 07:43:12 crc kubenswrapper[4760]: I0930 07:43:12.471378 4760 reconciler_common.go:293] "Volume detached for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/2c4ca8ea-a714-40e5-9e10-080aef32237b-host-slash\") on node \"crc\" DevicePath \"\"" Sep 30 07:43:12 crc kubenswrapper[4760]: I0930 07:43:12.471393 4760 reconciler_common.go:293] "Volume detached for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/2c4ca8ea-a714-40e5-9e10-080aef32237b-run-ovn\") on node \"crc\" DevicePath \"\"" Sep 30 07:43:12 crc kubenswrapper[4760]: I0930 07:43:12.471407 4760 reconciler_common.go:293] "Volume detached for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/2c4ca8ea-a714-40e5-9e10-080aef32237b-host-var-lib-cni-networks-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Sep 30 07:43:12 crc kubenswrapper[4760]: I0930 07:43:12.471425 4760 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lllfc\" (UniqueName: \"kubernetes.io/projected/2c4ca8ea-a714-40e5-9e10-080aef32237b-kube-api-access-lllfc\") on node \"crc\" DevicePath \"\"" Sep 30 07:43:12 crc kubenswrapper[4760]: I0930 07:43:12.471441 4760 reconciler_common.go:293] "Volume detached for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/2c4ca8ea-a714-40e5-9e10-080aef32237b-systemd-units\") on node \"crc\" DevicePath \"\"" Sep 30 07:43:12 crc kubenswrapper[4760]: I0930 07:43:12.471453 4760 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/2c4ca8ea-a714-40e5-9e10-080aef32237b-ovnkube-config\") on node \"crc\" DevicePath \"\"" Sep 30 07:43:12 crc kubenswrapper[4760]: I0930 07:43:12.471470 4760 reconciler_common.go:293] "Volume detached for volume \"host-cni-bin\" (UniqueName: 
\"kubernetes.io/host-path/2c4ca8ea-a714-40e5-9e10-080aef32237b-host-cni-bin\") on node \"crc\" DevicePath \"\"" Sep 30 07:43:12 crc kubenswrapper[4760]: I0930 07:43:12.471483 4760 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/2c4ca8ea-a714-40e5-9e10-080aef32237b-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Sep 30 07:43:12 crc kubenswrapper[4760]: I0930 07:43:12.471496 4760 reconciler_common.go:293] "Volume detached for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/2c4ca8ea-a714-40e5-9e10-080aef32237b-etc-openvswitch\") on node \"crc\" DevicePath \"\"" Sep 30 07:43:12 crc kubenswrapper[4760]: I0930 07:43:12.471508 4760 reconciler_common.go:293] "Volume detached for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/2c4ca8ea-a714-40e5-9e10-080aef32237b-run-systemd\") on node \"crc\" DevicePath \"\"" Sep 30 07:43:12 crc kubenswrapper[4760]: I0930 07:43:12.471525 4760 reconciler_common.go:293] "Volume detached for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/2c4ca8ea-a714-40e5-9e10-080aef32237b-run-openvswitch\") on node \"crc\" DevicePath \"\"" Sep 30 07:43:12 crc kubenswrapper[4760]: I0930 07:43:12.497039 4760 scope.go:117] "RemoveContainer" containerID="6e72f37942962598d925dfb305311c94c3ca920f93243bb6e97839add4a830a0" Sep 30 07:43:12 crc kubenswrapper[4760]: I0930 07:43:12.499839 4760 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-sspvl"] Sep 30 07:43:12 crc kubenswrapper[4760]: I0930 07:43:12.504434 4760 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-sspvl"] Sep 30 07:43:12 crc kubenswrapper[4760]: I0930 07:43:12.518558 4760 scope.go:117] "RemoveContainer" containerID="0b6c1fe1d21cc0d118e36d7f9d62fb639a19a42cd8c953700bcd5c289088cdf9" Sep 30 07:43:12 crc kubenswrapper[4760]: E0930 07:43:12.537185 4760 cadvisor_stats_provider.go:516] "Partial 
failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2c4ca8ea_a714_40e5_9e10_080aef32237b.slice/crio-c846605e3915d29b64cf72abf9f7642ff689799c187453d57ca92d5f01aeea8e\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2c4ca8ea_a714_40e5_9e10_080aef32237b.slice\": RecentStats: unable to find data in memory cache]" Sep 30 07:43:12 crc kubenswrapper[4760]: I0930 07:43:12.542412 4760 scope.go:117] "RemoveContainer" containerID="bba81e98fd1a75081b7e6a3eaf28b88b4de6e47e1dc158ca72aa2f1c2107933e" Sep 30 07:43:12 crc kubenswrapper[4760]: I0930 07:43:12.557868 4760 scope.go:117] "RemoveContainer" containerID="9e64e8566880e723a6cc19ad788bd1cbb02eb639397aea4acd6e9920b034f379" Sep 30 07:43:12 crc kubenswrapper[4760]: I0930 07:43:12.571748 4760 scope.go:117] "RemoveContainer" containerID="04cb9ff2859a579a0bd97d932177b8c19b786666d0177669e93ccec38abf08d5" Sep 30 07:43:12 crc kubenswrapper[4760]: I0930 07:43:12.572407 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/f7889829-88ef-4d39-a409-e69a4c112178-log-socket\") pod \"ovnkube-node-blxw5\" (UID: \"f7889829-88ef-4d39-a409-e69a4c112178\") " pod="openshift-ovn-kubernetes/ovnkube-node-blxw5" Sep 30 07:43:12 crc kubenswrapper[4760]: I0930 07:43:12.572455 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/f7889829-88ef-4d39-a409-e69a4c112178-host-slash\") pod \"ovnkube-node-blxw5\" (UID: \"f7889829-88ef-4d39-a409-e69a4c112178\") " pod="openshift-ovn-kubernetes/ovnkube-node-blxw5" Sep 30 07:43:12 crc kubenswrapper[4760]: I0930 07:43:12.572480 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: 
\"kubernetes.io/host-path/f7889829-88ef-4d39-a409-e69a4c112178-run-openvswitch\") pod \"ovnkube-node-blxw5\" (UID: \"f7889829-88ef-4d39-a409-e69a4c112178\") " pod="openshift-ovn-kubernetes/ovnkube-node-blxw5" Sep 30 07:43:12 crc kubenswrapper[4760]: I0930 07:43:12.572511 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/f7889829-88ef-4d39-a409-e69a4c112178-host-cni-bin\") pod \"ovnkube-node-blxw5\" (UID: \"f7889829-88ef-4d39-a409-e69a4c112178\") " pod="openshift-ovn-kubernetes/ovnkube-node-blxw5" Sep 30 07:43:12 crc kubenswrapper[4760]: I0930 07:43:12.572542 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f7889829-88ef-4d39-a409-e69a4c112178-etc-openvswitch\") pod \"ovnkube-node-blxw5\" (UID: \"f7889829-88ef-4d39-a409-e69a4c112178\") " pod="openshift-ovn-kubernetes/ovnkube-node-blxw5" Sep 30 07:43:12 crc kubenswrapper[4760]: I0930 07:43:12.572555 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/f7889829-88ef-4d39-a409-e69a4c112178-log-socket\") pod \"ovnkube-node-blxw5\" (UID: \"f7889829-88ef-4d39-a409-e69a4c112178\") " pod="openshift-ovn-kubernetes/ovnkube-node-blxw5" Sep 30 07:43:12 crc kubenswrapper[4760]: I0930 07:43:12.572564 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/f7889829-88ef-4d39-a409-e69a4c112178-node-log\") pod \"ovnkube-node-blxw5\" (UID: \"f7889829-88ef-4d39-a409-e69a4c112178\") " pod="openshift-ovn-kubernetes/ovnkube-node-blxw5" Sep 30 07:43:12 crc kubenswrapper[4760]: I0930 07:43:12.572610 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/f7889829-88ef-4d39-a409-e69a4c112178-node-log\") pod \"ovnkube-node-blxw5\" (UID: 
\"f7889829-88ef-4d39-a409-e69a4c112178\") " pod="openshift-ovn-kubernetes/ovnkube-node-blxw5" Sep 30 07:43:12 crc kubenswrapper[4760]: I0930 07:43:12.572625 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f7889829-88ef-4d39-a409-e69a4c112178-run-openvswitch\") pod \"ovnkube-node-blxw5\" (UID: \"f7889829-88ef-4d39-a409-e69a4c112178\") " pod="openshift-ovn-kubernetes/ovnkube-node-blxw5" Sep 30 07:43:12 crc kubenswrapper[4760]: I0930 07:43:12.572649 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/f7889829-88ef-4d39-a409-e69a4c112178-ovnkube-config\") pod \"ovnkube-node-blxw5\" (UID: \"f7889829-88ef-4d39-a409-e69a4c112178\") " pod="openshift-ovn-kubernetes/ovnkube-node-blxw5" Sep 30 07:43:12 crc kubenswrapper[4760]: I0930 07:43:12.572660 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/f7889829-88ef-4d39-a409-e69a4c112178-host-slash\") pod \"ovnkube-node-blxw5\" (UID: \"f7889829-88ef-4d39-a409-e69a4c112178\") " pod="openshift-ovn-kubernetes/ovnkube-node-blxw5" Sep 30 07:43:12 crc kubenswrapper[4760]: I0930 07:43:12.572662 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/f7889829-88ef-4d39-a409-e69a4c112178-host-cni-bin\") pod \"ovnkube-node-blxw5\" (UID: \"f7889829-88ef-4d39-a409-e69a4c112178\") " pod="openshift-ovn-kubernetes/ovnkube-node-blxw5" Sep 30 07:43:12 crc kubenswrapper[4760]: I0930 07:43:12.572687 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f7889829-88ef-4d39-a409-e69a4c112178-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-blxw5\" (UID: \"f7889829-88ef-4d39-a409-e69a4c112178\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-blxw5" Sep 30 07:43:12 crc kubenswrapper[4760]: I0930 07:43:12.572700 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f7889829-88ef-4d39-a409-e69a4c112178-etc-openvswitch\") pod \"ovnkube-node-blxw5\" (UID: \"f7889829-88ef-4d39-a409-e69a4c112178\") " pod="openshift-ovn-kubernetes/ovnkube-node-blxw5" Sep 30 07:43:12 crc kubenswrapper[4760]: I0930 07:43:12.572721 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/f7889829-88ef-4d39-a409-e69a4c112178-run-systemd\") pod \"ovnkube-node-blxw5\" (UID: \"f7889829-88ef-4d39-a409-e69a4c112178\") " pod="openshift-ovn-kubernetes/ovnkube-node-blxw5" Sep 30 07:43:12 crc kubenswrapper[4760]: I0930 07:43:12.572750 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f7889829-88ef-4d39-a409-e69a4c112178-host-run-ovn-kubernetes\") pod \"ovnkube-node-blxw5\" (UID: \"f7889829-88ef-4d39-a409-e69a4c112178\") " pod="openshift-ovn-kubernetes/ovnkube-node-blxw5" Sep 30 07:43:12 crc kubenswrapper[4760]: I0930 07:43:12.572782 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/f7889829-88ef-4d39-a409-e69a4c112178-host-run-netns\") pod \"ovnkube-node-blxw5\" (UID: \"f7889829-88ef-4d39-a409-e69a4c112178\") " pod="openshift-ovn-kubernetes/ovnkube-node-blxw5" Sep 30 07:43:12 crc kubenswrapper[4760]: I0930 07:43:12.572818 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f7889829-88ef-4d39-a409-e69a4c112178-host-run-ovn-kubernetes\") pod \"ovnkube-node-blxw5\" (UID: \"f7889829-88ef-4d39-a409-e69a4c112178\") " pod="openshift-ovn-kubernetes/ovnkube-node-blxw5" Sep 30 
07:43:12 crc kubenswrapper[4760]: I0930 07:43:12.572822 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/f7889829-88ef-4d39-a409-e69a4c112178-host-cni-netd\") pod \"ovnkube-node-blxw5\" (UID: \"f7889829-88ef-4d39-a409-e69a4c112178\") " pod="openshift-ovn-kubernetes/ovnkube-node-blxw5" Sep 30 07:43:12 crc kubenswrapper[4760]: I0930 07:43:12.572857 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/f7889829-88ef-4d39-a409-e69a4c112178-host-cni-netd\") pod \"ovnkube-node-blxw5\" (UID: \"f7889829-88ef-4d39-a409-e69a4c112178\") " pod="openshift-ovn-kubernetes/ovnkube-node-blxw5" Sep 30 07:43:12 crc kubenswrapper[4760]: I0930 07:43:12.572869 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/f7889829-88ef-4d39-a409-e69a4c112178-env-overrides\") pod \"ovnkube-node-blxw5\" (UID: \"f7889829-88ef-4d39-a409-e69a4c112178\") " pod="openshift-ovn-kubernetes/ovnkube-node-blxw5" Sep 30 07:43:12 crc kubenswrapper[4760]: I0930 07:43:12.572753 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f7889829-88ef-4d39-a409-e69a4c112178-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-blxw5\" (UID: \"f7889829-88ef-4d39-a409-e69a4c112178\") " pod="openshift-ovn-kubernetes/ovnkube-node-blxw5" Sep 30 07:43:12 crc kubenswrapper[4760]: I0930 07:43:12.572891 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b6wrq\" (UniqueName: \"kubernetes.io/projected/f7889829-88ef-4d39-a409-e69a4c112178-kube-api-access-b6wrq\") pod \"ovnkube-node-blxw5\" (UID: \"f7889829-88ef-4d39-a409-e69a4c112178\") " pod="openshift-ovn-kubernetes/ovnkube-node-blxw5" Sep 30 07:43:12 crc kubenswrapper[4760]: 
I0930 07:43:12.572918 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/f7889829-88ef-4d39-a409-e69a4c112178-host-run-netns\") pod \"ovnkube-node-blxw5\" (UID: \"f7889829-88ef-4d39-a409-e69a4c112178\") " pod="openshift-ovn-kubernetes/ovnkube-node-blxw5" Sep 30 07:43:12 crc kubenswrapper[4760]: I0930 07:43:12.572945 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/f7889829-88ef-4d39-a409-e69a4c112178-host-kubelet\") pod \"ovnkube-node-blxw5\" (UID: \"f7889829-88ef-4d39-a409-e69a4c112178\") " pod="openshift-ovn-kubernetes/ovnkube-node-blxw5" Sep 30 07:43:12 crc kubenswrapper[4760]: I0930 07:43:12.572976 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/f7889829-88ef-4d39-a409-e69a4c112178-systemd-units\") pod \"ovnkube-node-blxw5\" (UID: \"f7889829-88ef-4d39-a409-e69a4c112178\") " pod="openshift-ovn-kubernetes/ovnkube-node-blxw5" Sep 30 07:43:12 crc kubenswrapper[4760]: I0930 07:43:12.572999 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/f7889829-88ef-4d39-a409-e69a4c112178-ovn-node-metrics-cert\") pod \"ovnkube-node-blxw5\" (UID: \"f7889829-88ef-4d39-a409-e69a4c112178\") " pod="openshift-ovn-kubernetes/ovnkube-node-blxw5" Sep 30 07:43:12 crc kubenswrapper[4760]: I0930 07:43:12.573035 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/f7889829-88ef-4d39-a409-e69a4c112178-ovnkube-script-lib\") pod \"ovnkube-node-blxw5\" (UID: \"f7889829-88ef-4d39-a409-e69a4c112178\") " pod="openshift-ovn-kubernetes/ovnkube-node-blxw5" Sep 30 07:43:12 crc kubenswrapper[4760]: I0930 07:43:12.573059 4760 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f7889829-88ef-4d39-a409-e69a4c112178-var-lib-openvswitch\") pod \"ovnkube-node-blxw5\" (UID: \"f7889829-88ef-4d39-a409-e69a4c112178\") " pod="openshift-ovn-kubernetes/ovnkube-node-blxw5" Sep 30 07:43:12 crc kubenswrapper[4760]: I0930 07:43:12.573079 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/f7889829-88ef-4d39-a409-e69a4c112178-run-ovn\") pod \"ovnkube-node-blxw5\" (UID: \"f7889829-88ef-4d39-a409-e69a4c112178\") " pod="openshift-ovn-kubernetes/ovnkube-node-blxw5" Sep 30 07:43:12 crc kubenswrapper[4760]: I0930 07:43:12.573138 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/f7889829-88ef-4d39-a409-e69a4c112178-ovnkube-config\") pod \"ovnkube-node-blxw5\" (UID: \"f7889829-88ef-4d39-a409-e69a4c112178\") " pod="openshift-ovn-kubernetes/ovnkube-node-blxw5" Sep 30 07:43:12 crc kubenswrapper[4760]: I0930 07:43:12.573172 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/f7889829-88ef-4d39-a409-e69a4c112178-run-systemd\") pod \"ovnkube-node-blxw5\" (UID: \"f7889829-88ef-4d39-a409-e69a4c112178\") " pod="openshift-ovn-kubernetes/ovnkube-node-blxw5" Sep 30 07:43:12 crc kubenswrapper[4760]: I0930 07:43:12.573194 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/f7889829-88ef-4d39-a409-e69a4c112178-systemd-units\") pod \"ovnkube-node-blxw5\" (UID: \"f7889829-88ef-4d39-a409-e69a4c112178\") " pod="openshift-ovn-kubernetes/ovnkube-node-blxw5" Sep 30 07:43:12 crc kubenswrapper[4760]: I0930 07:43:12.573240 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: 
\"kubernetes.io/host-path/f7889829-88ef-4d39-a409-e69a4c112178-run-ovn\") pod \"ovnkube-node-blxw5\" (UID: \"f7889829-88ef-4d39-a409-e69a4c112178\") " pod="openshift-ovn-kubernetes/ovnkube-node-blxw5" Sep 30 07:43:12 crc kubenswrapper[4760]: I0930 07:43:12.573451 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/f7889829-88ef-4d39-a409-e69a4c112178-host-kubelet\") pod \"ovnkube-node-blxw5\" (UID: \"f7889829-88ef-4d39-a409-e69a4c112178\") " pod="openshift-ovn-kubernetes/ovnkube-node-blxw5" Sep 30 07:43:12 crc kubenswrapper[4760]: I0930 07:43:12.573478 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f7889829-88ef-4d39-a409-e69a4c112178-var-lib-openvswitch\") pod \"ovnkube-node-blxw5\" (UID: \"f7889829-88ef-4d39-a409-e69a4c112178\") " pod="openshift-ovn-kubernetes/ovnkube-node-blxw5" Sep 30 07:43:12 crc kubenswrapper[4760]: I0930 07:43:12.573661 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/f7889829-88ef-4d39-a409-e69a4c112178-env-overrides\") pod \"ovnkube-node-blxw5\" (UID: \"f7889829-88ef-4d39-a409-e69a4c112178\") " pod="openshift-ovn-kubernetes/ovnkube-node-blxw5" Sep 30 07:43:12 crc kubenswrapper[4760]: I0930 07:43:12.574925 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/f7889829-88ef-4d39-a409-e69a4c112178-ovnkube-script-lib\") pod \"ovnkube-node-blxw5\" (UID: \"f7889829-88ef-4d39-a409-e69a4c112178\") " pod="openshift-ovn-kubernetes/ovnkube-node-blxw5" Sep 30 07:43:12 crc kubenswrapper[4760]: I0930 07:43:12.576977 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/f7889829-88ef-4d39-a409-e69a4c112178-ovn-node-metrics-cert\") pod 
\"ovnkube-node-blxw5\" (UID: \"f7889829-88ef-4d39-a409-e69a4c112178\") " pod="openshift-ovn-kubernetes/ovnkube-node-blxw5" Sep 30 07:43:12 crc kubenswrapper[4760]: I0930 07:43:12.584869 4760 scope.go:117] "RemoveContainer" containerID="45d8c685d7b690ce81553e4d924934cf761087dd535145c0e8b5a933f2d61e45" Sep 30 07:43:12 crc kubenswrapper[4760]: I0930 07:43:12.590426 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b6wrq\" (UniqueName: \"kubernetes.io/projected/f7889829-88ef-4d39-a409-e69a4c112178-kube-api-access-b6wrq\") pod \"ovnkube-node-blxw5\" (UID: \"f7889829-88ef-4d39-a409-e69a4c112178\") " pod="openshift-ovn-kubernetes/ovnkube-node-blxw5" Sep 30 07:43:12 crc kubenswrapper[4760]: I0930 07:43:12.598113 4760 scope.go:117] "RemoveContainer" containerID="ef1f6b6ca1a2bbb2f9a166a7244c6411470be4655bc6b2a330fd604c3449d185" Sep 30 07:43:12 crc kubenswrapper[4760]: I0930 07:43:12.611957 4760 scope.go:117] "RemoveContainer" containerID="1198d13e2814e4dc38c4411d751fba40440a5967d7ca04c855e34fe90f9a6535" Sep 30 07:43:12 crc kubenswrapper[4760]: I0930 07:43:12.627567 4760 scope.go:117] "RemoveContainer" containerID="3d9205e4d676fa6a90c4cef893077a6bc91d50564b15f3b221e5827c2b3025ce" Sep 30 07:43:12 crc kubenswrapper[4760]: E0930 07:43:12.627808 4760 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3d9205e4d676fa6a90c4cef893077a6bc91d50564b15f3b221e5827c2b3025ce\": container with ID starting with 3d9205e4d676fa6a90c4cef893077a6bc91d50564b15f3b221e5827c2b3025ce not found: ID does not exist" containerID="3d9205e4d676fa6a90c4cef893077a6bc91d50564b15f3b221e5827c2b3025ce" Sep 30 07:43:12 crc kubenswrapper[4760]: I0930 07:43:12.627837 4760 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3d9205e4d676fa6a90c4cef893077a6bc91d50564b15f3b221e5827c2b3025ce"} err="failed to get container status 
\"3d9205e4d676fa6a90c4cef893077a6bc91d50564b15f3b221e5827c2b3025ce\": rpc error: code = NotFound desc = could not find container \"3d9205e4d676fa6a90c4cef893077a6bc91d50564b15f3b221e5827c2b3025ce\": container with ID starting with 3d9205e4d676fa6a90c4cef893077a6bc91d50564b15f3b221e5827c2b3025ce not found: ID does not exist" Sep 30 07:43:12 crc kubenswrapper[4760]: I0930 07:43:12.627860 4760 scope.go:117] "RemoveContainer" containerID="9003b7a7d50bae8b7464fd7cbf7f18dcc7a60af1c5297a18e2f9f8482d161912" Sep 30 07:43:12 crc kubenswrapper[4760]: E0930 07:43:12.628194 4760 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9003b7a7d50bae8b7464fd7cbf7f18dcc7a60af1c5297a18e2f9f8482d161912\": container with ID starting with 9003b7a7d50bae8b7464fd7cbf7f18dcc7a60af1c5297a18e2f9f8482d161912 not found: ID does not exist" containerID="9003b7a7d50bae8b7464fd7cbf7f18dcc7a60af1c5297a18e2f9f8482d161912" Sep 30 07:43:12 crc kubenswrapper[4760]: I0930 07:43:12.628337 4760 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9003b7a7d50bae8b7464fd7cbf7f18dcc7a60af1c5297a18e2f9f8482d161912"} err="failed to get container status \"9003b7a7d50bae8b7464fd7cbf7f18dcc7a60af1c5297a18e2f9f8482d161912\": rpc error: code = NotFound desc = could not find container \"9003b7a7d50bae8b7464fd7cbf7f18dcc7a60af1c5297a18e2f9f8482d161912\": container with ID starting with 9003b7a7d50bae8b7464fd7cbf7f18dcc7a60af1c5297a18e2f9f8482d161912 not found: ID does not exist" Sep 30 07:43:12 crc kubenswrapper[4760]: I0930 07:43:12.628445 4760 scope.go:117] "RemoveContainer" containerID="6e72f37942962598d925dfb305311c94c3ca920f93243bb6e97839add4a830a0" Sep 30 07:43:12 crc kubenswrapper[4760]: E0930 07:43:12.628851 4760 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"6e72f37942962598d925dfb305311c94c3ca920f93243bb6e97839add4a830a0\": container with ID starting with 6e72f37942962598d925dfb305311c94c3ca920f93243bb6e97839add4a830a0 not found: ID does not exist" containerID="6e72f37942962598d925dfb305311c94c3ca920f93243bb6e97839add4a830a0" Sep 30 07:43:12 crc kubenswrapper[4760]: I0930 07:43:12.628873 4760 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6e72f37942962598d925dfb305311c94c3ca920f93243bb6e97839add4a830a0"} err="failed to get container status \"6e72f37942962598d925dfb305311c94c3ca920f93243bb6e97839add4a830a0\": rpc error: code = NotFound desc = could not find container \"6e72f37942962598d925dfb305311c94c3ca920f93243bb6e97839add4a830a0\": container with ID starting with 6e72f37942962598d925dfb305311c94c3ca920f93243bb6e97839add4a830a0 not found: ID does not exist" Sep 30 07:43:12 crc kubenswrapper[4760]: I0930 07:43:12.628888 4760 scope.go:117] "RemoveContainer" containerID="0b6c1fe1d21cc0d118e36d7f9d62fb639a19a42cd8c953700bcd5c289088cdf9" Sep 30 07:43:12 crc kubenswrapper[4760]: E0930 07:43:12.629179 4760 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0b6c1fe1d21cc0d118e36d7f9d62fb639a19a42cd8c953700bcd5c289088cdf9\": container with ID starting with 0b6c1fe1d21cc0d118e36d7f9d62fb639a19a42cd8c953700bcd5c289088cdf9 not found: ID does not exist" containerID="0b6c1fe1d21cc0d118e36d7f9d62fb639a19a42cd8c953700bcd5c289088cdf9" Sep 30 07:43:12 crc kubenswrapper[4760]: I0930 07:43:12.629228 4760 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0b6c1fe1d21cc0d118e36d7f9d62fb639a19a42cd8c953700bcd5c289088cdf9"} err="failed to get container status \"0b6c1fe1d21cc0d118e36d7f9d62fb639a19a42cd8c953700bcd5c289088cdf9\": rpc error: code = NotFound desc = could not find container \"0b6c1fe1d21cc0d118e36d7f9d62fb639a19a42cd8c953700bcd5c289088cdf9\": container with ID 
starting with 0b6c1fe1d21cc0d118e36d7f9d62fb639a19a42cd8c953700bcd5c289088cdf9 not found: ID does not exist" Sep 30 07:43:12 crc kubenswrapper[4760]: I0930 07:43:12.629259 4760 scope.go:117] "RemoveContainer" containerID="bba81e98fd1a75081b7e6a3eaf28b88b4de6e47e1dc158ca72aa2f1c2107933e" Sep 30 07:43:12 crc kubenswrapper[4760]: E0930 07:43:12.629649 4760 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bba81e98fd1a75081b7e6a3eaf28b88b4de6e47e1dc158ca72aa2f1c2107933e\": container with ID starting with bba81e98fd1a75081b7e6a3eaf28b88b4de6e47e1dc158ca72aa2f1c2107933e not found: ID does not exist" containerID="bba81e98fd1a75081b7e6a3eaf28b88b4de6e47e1dc158ca72aa2f1c2107933e" Sep 30 07:43:12 crc kubenswrapper[4760]: I0930 07:43:12.629762 4760 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bba81e98fd1a75081b7e6a3eaf28b88b4de6e47e1dc158ca72aa2f1c2107933e"} err="failed to get container status \"bba81e98fd1a75081b7e6a3eaf28b88b4de6e47e1dc158ca72aa2f1c2107933e\": rpc error: code = NotFound desc = could not find container \"bba81e98fd1a75081b7e6a3eaf28b88b4de6e47e1dc158ca72aa2f1c2107933e\": container with ID starting with bba81e98fd1a75081b7e6a3eaf28b88b4de6e47e1dc158ca72aa2f1c2107933e not found: ID does not exist" Sep 30 07:43:12 crc kubenswrapper[4760]: I0930 07:43:12.629873 4760 scope.go:117] "RemoveContainer" containerID="9e64e8566880e723a6cc19ad788bd1cbb02eb639397aea4acd6e9920b034f379" Sep 30 07:43:12 crc kubenswrapper[4760]: E0930 07:43:12.630202 4760 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9e64e8566880e723a6cc19ad788bd1cbb02eb639397aea4acd6e9920b034f379\": container with ID starting with 9e64e8566880e723a6cc19ad788bd1cbb02eb639397aea4acd6e9920b034f379 not found: ID does not exist" containerID="9e64e8566880e723a6cc19ad788bd1cbb02eb639397aea4acd6e9920b034f379" Sep 30 
07:43:12 crc kubenswrapper[4760]: I0930 07:43:12.630224 4760 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9e64e8566880e723a6cc19ad788bd1cbb02eb639397aea4acd6e9920b034f379"} err="failed to get container status \"9e64e8566880e723a6cc19ad788bd1cbb02eb639397aea4acd6e9920b034f379\": rpc error: code = NotFound desc = could not find container \"9e64e8566880e723a6cc19ad788bd1cbb02eb639397aea4acd6e9920b034f379\": container with ID starting with 9e64e8566880e723a6cc19ad788bd1cbb02eb639397aea4acd6e9920b034f379 not found: ID does not exist" Sep 30 07:43:12 crc kubenswrapper[4760]: I0930 07:43:12.630239 4760 scope.go:117] "RemoveContainer" containerID="04cb9ff2859a579a0bd97d932177b8c19b786666d0177669e93ccec38abf08d5" Sep 30 07:43:12 crc kubenswrapper[4760]: E0930 07:43:12.630516 4760 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"04cb9ff2859a579a0bd97d932177b8c19b786666d0177669e93ccec38abf08d5\": container with ID starting with 04cb9ff2859a579a0bd97d932177b8c19b786666d0177669e93ccec38abf08d5 not found: ID does not exist" containerID="04cb9ff2859a579a0bd97d932177b8c19b786666d0177669e93ccec38abf08d5" Sep 30 07:43:12 crc kubenswrapper[4760]: I0930 07:43:12.630610 4760 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"04cb9ff2859a579a0bd97d932177b8c19b786666d0177669e93ccec38abf08d5"} err="failed to get container status \"04cb9ff2859a579a0bd97d932177b8c19b786666d0177669e93ccec38abf08d5\": rpc error: code = NotFound desc = could not find container \"04cb9ff2859a579a0bd97d932177b8c19b786666d0177669e93ccec38abf08d5\": container with ID starting with 04cb9ff2859a579a0bd97d932177b8c19b786666d0177669e93ccec38abf08d5 not found: ID does not exist" Sep 30 07:43:12 crc kubenswrapper[4760]: I0930 07:43:12.630695 4760 scope.go:117] "RemoveContainer" 
containerID="45d8c685d7b690ce81553e4d924934cf761087dd535145c0e8b5a933f2d61e45" Sep 30 07:43:12 crc kubenswrapper[4760]: E0930 07:43:12.631036 4760 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"45d8c685d7b690ce81553e4d924934cf761087dd535145c0e8b5a933f2d61e45\": container with ID starting with 45d8c685d7b690ce81553e4d924934cf761087dd535145c0e8b5a933f2d61e45 not found: ID does not exist" containerID="45d8c685d7b690ce81553e4d924934cf761087dd535145c0e8b5a933f2d61e45" Sep 30 07:43:12 crc kubenswrapper[4760]: I0930 07:43:12.631151 4760 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"45d8c685d7b690ce81553e4d924934cf761087dd535145c0e8b5a933f2d61e45"} err="failed to get container status \"45d8c685d7b690ce81553e4d924934cf761087dd535145c0e8b5a933f2d61e45\": rpc error: code = NotFound desc = could not find container \"45d8c685d7b690ce81553e4d924934cf761087dd535145c0e8b5a933f2d61e45\": container with ID starting with 45d8c685d7b690ce81553e4d924934cf761087dd535145c0e8b5a933f2d61e45 not found: ID does not exist" Sep 30 07:43:12 crc kubenswrapper[4760]: I0930 07:43:12.631249 4760 scope.go:117] "RemoveContainer" containerID="ef1f6b6ca1a2bbb2f9a166a7244c6411470be4655bc6b2a330fd604c3449d185" Sep 30 07:43:12 crc kubenswrapper[4760]: E0930 07:43:12.631732 4760 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ef1f6b6ca1a2bbb2f9a166a7244c6411470be4655bc6b2a330fd604c3449d185\": container with ID starting with ef1f6b6ca1a2bbb2f9a166a7244c6411470be4655bc6b2a330fd604c3449d185 not found: ID does not exist" containerID="ef1f6b6ca1a2bbb2f9a166a7244c6411470be4655bc6b2a330fd604c3449d185" Sep 30 07:43:12 crc kubenswrapper[4760]: I0930 07:43:12.631756 4760 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"ef1f6b6ca1a2bbb2f9a166a7244c6411470be4655bc6b2a330fd604c3449d185"} err="failed to get container status \"ef1f6b6ca1a2bbb2f9a166a7244c6411470be4655bc6b2a330fd604c3449d185\": rpc error: code = NotFound desc = could not find container \"ef1f6b6ca1a2bbb2f9a166a7244c6411470be4655bc6b2a330fd604c3449d185\": container with ID starting with ef1f6b6ca1a2bbb2f9a166a7244c6411470be4655bc6b2a330fd604c3449d185 not found: ID does not exist" Sep 30 07:43:12 crc kubenswrapper[4760]: I0930 07:43:12.631770 4760 scope.go:117] "RemoveContainer" containerID="1198d13e2814e4dc38c4411d751fba40440a5967d7ca04c855e34fe90f9a6535" Sep 30 07:43:12 crc kubenswrapper[4760]: E0930 07:43:12.632282 4760 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1198d13e2814e4dc38c4411d751fba40440a5967d7ca04c855e34fe90f9a6535\": container with ID starting with 1198d13e2814e4dc38c4411d751fba40440a5967d7ca04c855e34fe90f9a6535 not found: ID does not exist" containerID="1198d13e2814e4dc38c4411d751fba40440a5967d7ca04c855e34fe90f9a6535" Sep 30 07:43:12 crc kubenswrapper[4760]: I0930 07:43:12.632429 4760 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1198d13e2814e4dc38c4411d751fba40440a5967d7ca04c855e34fe90f9a6535"} err="failed to get container status \"1198d13e2814e4dc38c4411d751fba40440a5967d7ca04c855e34fe90f9a6535\": rpc error: code = NotFound desc = could not find container \"1198d13e2814e4dc38c4411d751fba40440a5967d7ca04c855e34fe90f9a6535\": container with ID starting with 1198d13e2814e4dc38c4411d751fba40440a5967d7ca04c855e34fe90f9a6535 not found: ID does not exist" Sep 30 07:43:12 crc kubenswrapper[4760]: I0930 07:43:12.632563 4760 scope.go:117] "RemoveContainer" containerID="3d9205e4d676fa6a90c4cef893077a6bc91d50564b15f3b221e5827c2b3025ce" Sep 30 07:43:12 crc kubenswrapper[4760]: I0930 07:43:12.632939 4760 pod_container_deletor.go:53] "DeleteContainer 
returned error" containerID={"Type":"cri-o","ID":"3d9205e4d676fa6a90c4cef893077a6bc91d50564b15f3b221e5827c2b3025ce"} err="failed to get container status \"3d9205e4d676fa6a90c4cef893077a6bc91d50564b15f3b221e5827c2b3025ce\": rpc error: code = NotFound desc = could not find container \"3d9205e4d676fa6a90c4cef893077a6bc91d50564b15f3b221e5827c2b3025ce\": container with ID starting with 3d9205e4d676fa6a90c4cef893077a6bc91d50564b15f3b221e5827c2b3025ce not found: ID does not exist" Sep 30 07:43:12 crc kubenswrapper[4760]: I0930 07:43:12.632957 4760 scope.go:117] "RemoveContainer" containerID="9003b7a7d50bae8b7464fd7cbf7f18dcc7a60af1c5297a18e2f9f8482d161912" Sep 30 07:43:12 crc kubenswrapper[4760]: I0930 07:43:12.633274 4760 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9003b7a7d50bae8b7464fd7cbf7f18dcc7a60af1c5297a18e2f9f8482d161912"} err="failed to get container status \"9003b7a7d50bae8b7464fd7cbf7f18dcc7a60af1c5297a18e2f9f8482d161912\": rpc error: code = NotFound desc = could not find container \"9003b7a7d50bae8b7464fd7cbf7f18dcc7a60af1c5297a18e2f9f8482d161912\": container with ID starting with 9003b7a7d50bae8b7464fd7cbf7f18dcc7a60af1c5297a18e2f9f8482d161912 not found: ID does not exist" Sep 30 07:43:12 crc kubenswrapper[4760]: I0930 07:43:12.633333 4760 scope.go:117] "RemoveContainer" containerID="6e72f37942962598d925dfb305311c94c3ca920f93243bb6e97839add4a830a0" Sep 30 07:43:12 crc kubenswrapper[4760]: I0930 07:43:12.633626 4760 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6e72f37942962598d925dfb305311c94c3ca920f93243bb6e97839add4a830a0"} err="failed to get container status \"6e72f37942962598d925dfb305311c94c3ca920f93243bb6e97839add4a830a0\": rpc error: code = NotFound desc = could not find container \"6e72f37942962598d925dfb305311c94c3ca920f93243bb6e97839add4a830a0\": container with ID starting with 6e72f37942962598d925dfb305311c94c3ca920f93243bb6e97839add4a830a0 not 
found: ID does not exist" Sep 30 07:43:12 crc kubenswrapper[4760]: I0930 07:43:12.633668 4760 scope.go:117] "RemoveContainer" containerID="0b6c1fe1d21cc0d118e36d7f9d62fb639a19a42cd8c953700bcd5c289088cdf9" Sep 30 07:43:12 crc kubenswrapper[4760]: I0930 07:43:12.633922 4760 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0b6c1fe1d21cc0d118e36d7f9d62fb639a19a42cd8c953700bcd5c289088cdf9"} err="failed to get container status \"0b6c1fe1d21cc0d118e36d7f9d62fb639a19a42cd8c953700bcd5c289088cdf9\": rpc error: code = NotFound desc = could not find container \"0b6c1fe1d21cc0d118e36d7f9d62fb639a19a42cd8c953700bcd5c289088cdf9\": container with ID starting with 0b6c1fe1d21cc0d118e36d7f9d62fb639a19a42cd8c953700bcd5c289088cdf9 not found: ID does not exist" Sep 30 07:43:12 crc kubenswrapper[4760]: I0930 07:43:12.633972 4760 scope.go:117] "RemoveContainer" containerID="bba81e98fd1a75081b7e6a3eaf28b88b4de6e47e1dc158ca72aa2f1c2107933e" Sep 30 07:43:12 crc kubenswrapper[4760]: I0930 07:43:12.634277 4760 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bba81e98fd1a75081b7e6a3eaf28b88b4de6e47e1dc158ca72aa2f1c2107933e"} err="failed to get container status \"bba81e98fd1a75081b7e6a3eaf28b88b4de6e47e1dc158ca72aa2f1c2107933e\": rpc error: code = NotFound desc = could not find container \"bba81e98fd1a75081b7e6a3eaf28b88b4de6e47e1dc158ca72aa2f1c2107933e\": container with ID starting with bba81e98fd1a75081b7e6a3eaf28b88b4de6e47e1dc158ca72aa2f1c2107933e not found: ID does not exist" Sep 30 07:43:12 crc kubenswrapper[4760]: I0930 07:43:12.634312 4760 scope.go:117] "RemoveContainer" containerID="9e64e8566880e723a6cc19ad788bd1cbb02eb639397aea4acd6e9920b034f379" Sep 30 07:43:12 crc kubenswrapper[4760]: I0930 07:43:12.634614 4760 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9e64e8566880e723a6cc19ad788bd1cbb02eb639397aea4acd6e9920b034f379"} err="failed to get 
container status \"9e64e8566880e723a6cc19ad788bd1cbb02eb639397aea4acd6e9920b034f379\": rpc error: code = NotFound desc = could not find container \"9e64e8566880e723a6cc19ad788bd1cbb02eb639397aea4acd6e9920b034f379\": container with ID starting with 9e64e8566880e723a6cc19ad788bd1cbb02eb639397aea4acd6e9920b034f379 not found: ID does not exist" Sep 30 07:43:12 crc kubenswrapper[4760]: I0930 07:43:12.634650 4760 scope.go:117] "RemoveContainer" containerID="04cb9ff2859a579a0bd97d932177b8c19b786666d0177669e93ccec38abf08d5" Sep 30 07:43:12 crc kubenswrapper[4760]: I0930 07:43:12.634928 4760 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"04cb9ff2859a579a0bd97d932177b8c19b786666d0177669e93ccec38abf08d5"} err="failed to get container status \"04cb9ff2859a579a0bd97d932177b8c19b786666d0177669e93ccec38abf08d5\": rpc error: code = NotFound desc = could not find container \"04cb9ff2859a579a0bd97d932177b8c19b786666d0177669e93ccec38abf08d5\": container with ID starting with 04cb9ff2859a579a0bd97d932177b8c19b786666d0177669e93ccec38abf08d5 not found: ID does not exist" Sep 30 07:43:12 crc kubenswrapper[4760]: I0930 07:43:12.634945 4760 scope.go:117] "RemoveContainer" containerID="45d8c685d7b690ce81553e4d924934cf761087dd535145c0e8b5a933f2d61e45" Sep 30 07:43:12 crc kubenswrapper[4760]: I0930 07:43:12.635177 4760 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"45d8c685d7b690ce81553e4d924934cf761087dd535145c0e8b5a933f2d61e45"} err="failed to get container status \"45d8c685d7b690ce81553e4d924934cf761087dd535145c0e8b5a933f2d61e45\": rpc error: code = NotFound desc = could not find container \"45d8c685d7b690ce81553e4d924934cf761087dd535145c0e8b5a933f2d61e45\": container with ID starting with 45d8c685d7b690ce81553e4d924934cf761087dd535145c0e8b5a933f2d61e45 not found: ID does not exist" Sep 30 07:43:12 crc kubenswrapper[4760]: I0930 07:43:12.635229 4760 scope.go:117] "RemoveContainer" 
containerID="ef1f6b6ca1a2bbb2f9a166a7244c6411470be4655bc6b2a330fd604c3449d185" Sep 30 07:43:12 crc kubenswrapper[4760]: I0930 07:43:12.635542 4760 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ef1f6b6ca1a2bbb2f9a166a7244c6411470be4655bc6b2a330fd604c3449d185"} err="failed to get container status \"ef1f6b6ca1a2bbb2f9a166a7244c6411470be4655bc6b2a330fd604c3449d185\": rpc error: code = NotFound desc = could not find container \"ef1f6b6ca1a2bbb2f9a166a7244c6411470be4655bc6b2a330fd604c3449d185\": container with ID starting with ef1f6b6ca1a2bbb2f9a166a7244c6411470be4655bc6b2a330fd604c3449d185 not found: ID does not exist" Sep 30 07:43:12 crc kubenswrapper[4760]: I0930 07:43:12.635555 4760 scope.go:117] "RemoveContainer" containerID="1198d13e2814e4dc38c4411d751fba40440a5967d7ca04c855e34fe90f9a6535" Sep 30 07:43:12 crc kubenswrapper[4760]: I0930 07:43:12.635797 4760 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1198d13e2814e4dc38c4411d751fba40440a5967d7ca04c855e34fe90f9a6535"} err="failed to get container status \"1198d13e2814e4dc38c4411d751fba40440a5967d7ca04c855e34fe90f9a6535\": rpc error: code = NotFound desc = could not find container \"1198d13e2814e4dc38c4411d751fba40440a5967d7ca04c855e34fe90f9a6535\": container with ID starting with 1198d13e2814e4dc38c4411d751fba40440a5967d7ca04c855e34fe90f9a6535 not found: ID does not exist" Sep 30 07:43:12 crc kubenswrapper[4760]: I0930 07:43:12.635814 4760 scope.go:117] "RemoveContainer" containerID="3d9205e4d676fa6a90c4cef893077a6bc91d50564b15f3b221e5827c2b3025ce" Sep 30 07:43:12 crc kubenswrapper[4760]: I0930 07:43:12.636042 4760 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3d9205e4d676fa6a90c4cef893077a6bc91d50564b15f3b221e5827c2b3025ce"} err="failed to get container status \"3d9205e4d676fa6a90c4cef893077a6bc91d50564b15f3b221e5827c2b3025ce\": rpc error: code = NotFound desc = could 
not find container \"3d9205e4d676fa6a90c4cef893077a6bc91d50564b15f3b221e5827c2b3025ce\": container with ID starting with 3d9205e4d676fa6a90c4cef893077a6bc91d50564b15f3b221e5827c2b3025ce not found: ID does not exist" Sep 30 07:43:12 crc kubenswrapper[4760]: I0930 07:43:12.636071 4760 scope.go:117] "RemoveContainer" containerID="9003b7a7d50bae8b7464fd7cbf7f18dcc7a60af1c5297a18e2f9f8482d161912" Sep 30 07:43:12 crc kubenswrapper[4760]: I0930 07:43:12.636403 4760 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9003b7a7d50bae8b7464fd7cbf7f18dcc7a60af1c5297a18e2f9f8482d161912"} err="failed to get container status \"9003b7a7d50bae8b7464fd7cbf7f18dcc7a60af1c5297a18e2f9f8482d161912\": rpc error: code = NotFound desc = could not find container \"9003b7a7d50bae8b7464fd7cbf7f18dcc7a60af1c5297a18e2f9f8482d161912\": container with ID starting with 9003b7a7d50bae8b7464fd7cbf7f18dcc7a60af1c5297a18e2f9f8482d161912 not found: ID does not exist" Sep 30 07:43:12 crc kubenswrapper[4760]: I0930 07:43:12.636437 4760 scope.go:117] "RemoveContainer" containerID="6e72f37942962598d925dfb305311c94c3ca920f93243bb6e97839add4a830a0" Sep 30 07:43:12 crc kubenswrapper[4760]: I0930 07:43:12.637018 4760 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6e72f37942962598d925dfb305311c94c3ca920f93243bb6e97839add4a830a0"} err="failed to get container status \"6e72f37942962598d925dfb305311c94c3ca920f93243bb6e97839add4a830a0\": rpc error: code = NotFound desc = could not find container \"6e72f37942962598d925dfb305311c94c3ca920f93243bb6e97839add4a830a0\": container with ID starting with 6e72f37942962598d925dfb305311c94c3ca920f93243bb6e97839add4a830a0 not found: ID does not exist" Sep 30 07:43:12 crc kubenswrapper[4760]: I0930 07:43:12.637083 4760 scope.go:117] "RemoveContainer" containerID="0b6c1fe1d21cc0d118e36d7f9d62fb639a19a42cd8c953700bcd5c289088cdf9" Sep 30 07:43:12 crc kubenswrapper[4760]: I0930 
07:43:12.637435 4760 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0b6c1fe1d21cc0d118e36d7f9d62fb639a19a42cd8c953700bcd5c289088cdf9"} err="failed to get container status \"0b6c1fe1d21cc0d118e36d7f9d62fb639a19a42cd8c953700bcd5c289088cdf9\": rpc error: code = NotFound desc = could not find container \"0b6c1fe1d21cc0d118e36d7f9d62fb639a19a42cd8c953700bcd5c289088cdf9\": container with ID starting with 0b6c1fe1d21cc0d118e36d7f9d62fb639a19a42cd8c953700bcd5c289088cdf9 not found: ID does not exist" Sep 30 07:43:12 crc kubenswrapper[4760]: I0930 07:43:12.637461 4760 scope.go:117] "RemoveContainer" containerID="bba81e98fd1a75081b7e6a3eaf28b88b4de6e47e1dc158ca72aa2f1c2107933e" Sep 30 07:43:12 crc kubenswrapper[4760]: I0930 07:43:12.637808 4760 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bba81e98fd1a75081b7e6a3eaf28b88b4de6e47e1dc158ca72aa2f1c2107933e"} err="failed to get container status \"bba81e98fd1a75081b7e6a3eaf28b88b4de6e47e1dc158ca72aa2f1c2107933e\": rpc error: code = NotFound desc = could not find container \"bba81e98fd1a75081b7e6a3eaf28b88b4de6e47e1dc158ca72aa2f1c2107933e\": container with ID starting with bba81e98fd1a75081b7e6a3eaf28b88b4de6e47e1dc158ca72aa2f1c2107933e not found: ID does not exist" Sep 30 07:43:12 crc kubenswrapper[4760]: I0930 07:43:12.637837 4760 scope.go:117] "RemoveContainer" containerID="9e64e8566880e723a6cc19ad788bd1cbb02eb639397aea4acd6e9920b034f379" Sep 30 07:43:12 crc kubenswrapper[4760]: I0930 07:43:12.638126 4760 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9e64e8566880e723a6cc19ad788bd1cbb02eb639397aea4acd6e9920b034f379"} err="failed to get container status \"9e64e8566880e723a6cc19ad788bd1cbb02eb639397aea4acd6e9920b034f379\": rpc error: code = NotFound desc = could not find container \"9e64e8566880e723a6cc19ad788bd1cbb02eb639397aea4acd6e9920b034f379\": container with ID starting with 
9e64e8566880e723a6cc19ad788bd1cbb02eb639397aea4acd6e9920b034f379 not found: ID does not exist" Sep 30 07:43:12 crc kubenswrapper[4760]: I0930 07:43:12.638149 4760 scope.go:117] "RemoveContainer" containerID="04cb9ff2859a579a0bd97d932177b8c19b786666d0177669e93ccec38abf08d5" Sep 30 07:43:12 crc kubenswrapper[4760]: I0930 07:43:12.638499 4760 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"04cb9ff2859a579a0bd97d932177b8c19b786666d0177669e93ccec38abf08d5"} err="failed to get container status \"04cb9ff2859a579a0bd97d932177b8c19b786666d0177669e93ccec38abf08d5\": rpc error: code = NotFound desc = could not find container \"04cb9ff2859a579a0bd97d932177b8c19b786666d0177669e93ccec38abf08d5\": container with ID starting with 04cb9ff2859a579a0bd97d932177b8c19b786666d0177669e93ccec38abf08d5 not found: ID does not exist" Sep 30 07:43:12 crc kubenswrapper[4760]: I0930 07:43:12.638542 4760 scope.go:117] "RemoveContainer" containerID="45d8c685d7b690ce81553e4d924934cf761087dd535145c0e8b5a933f2d61e45" Sep 30 07:43:12 crc kubenswrapper[4760]: I0930 07:43:12.638978 4760 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"45d8c685d7b690ce81553e4d924934cf761087dd535145c0e8b5a933f2d61e45"} err="failed to get container status \"45d8c685d7b690ce81553e4d924934cf761087dd535145c0e8b5a933f2d61e45\": rpc error: code = NotFound desc = could not find container \"45d8c685d7b690ce81553e4d924934cf761087dd535145c0e8b5a933f2d61e45\": container with ID starting with 45d8c685d7b690ce81553e4d924934cf761087dd535145c0e8b5a933f2d61e45 not found: ID does not exist" Sep 30 07:43:12 crc kubenswrapper[4760]: I0930 07:43:12.639016 4760 scope.go:117] "RemoveContainer" containerID="ef1f6b6ca1a2bbb2f9a166a7244c6411470be4655bc6b2a330fd604c3449d185" Sep 30 07:43:12 crc kubenswrapper[4760]: I0930 07:43:12.639358 4760 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"ef1f6b6ca1a2bbb2f9a166a7244c6411470be4655bc6b2a330fd604c3449d185"} err="failed to get container status \"ef1f6b6ca1a2bbb2f9a166a7244c6411470be4655bc6b2a330fd604c3449d185\": rpc error: code = NotFound desc = could not find container \"ef1f6b6ca1a2bbb2f9a166a7244c6411470be4655bc6b2a330fd604c3449d185\": container with ID starting with ef1f6b6ca1a2bbb2f9a166a7244c6411470be4655bc6b2a330fd604c3449d185 not found: ID does not exist" Sep 30 07:43:12 crc kubenswrapper[4760]: I0930 07:43:12.639379 4760 scope.go:117] "RemoveContainer" containerID="1198d13e2814e4dc38c4411d751fba40440a5967d7ca04c855e34fe90f9a6535" Sep 30 07:43:12 crc kubenswrapper[4760]: I0930 07:43:12.639656 4760 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1198d13e2814e4dc38c4411d751fba40440a5967d7ca04c855e34fe90f9a6535"} err="failed to get container status \"1198d13e2814e4dc38c4411d751fba40440a5967d7ca04c855e34fe90f9a6535\": rpc error: code = NotFound desc = could not find container \"1198d13e2814e4dc38c4411d751fba40440a5967d7ca04c855e34fe90f9a6535\": container with ID starting with 1198d13e2814e4dc38c4411d751fba40440a5967d7ca04c855e34fe90f9a6535 not found: ID does not exist" Sep 30 07:43:12 crc kubenswrapper[4760]: I0930 07:43:12.639694 4760 scope.go:117] "RemoveContainer" containerID="3d9205e4d676fa6a90c4cef893077a6bc91d50564b15f3b221e5827c2b3025ce" Sep 30 07:43:12 crc kubenswrapper[4760]: I0930 07:43:12.640030 4760 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3d9205e4d676fa6a90c4cef893077a6bc91d50564b15f3b221e5827c2b3025ce"} err="failed to get container status \"3d9205e4d676fa6a90c4cef893077a6bc91d50564b15f3b221e5827c2b3025ce\": rpc error: code = NotFound desc = could not find container \"3d9205e4d676fa6a90c4cef893077a6bc91d50564b15f3b221e5827c2b3025ce\": container with ID starting with 3d9205e4d676fa6a90c4cef893077a6bc91d50564b15f3b221e5827c2b3025ce not found: ID does not 
exist" Sep 30 07:43:12 crc kubenswrapper[4760]: I0930 07:43:12.640051 4760 scope.go:117] "RemoveContainer" containerID="9003b7a7d50bae8b7464fd7cbf7f18dcc7a60af1c5297a18e2f9f8482d161912" Sep 30 07:43:12 crc kubenswrapper[4760]: I0930 07:43:12.640353 4760 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9003b7a7d50bae8b7464fd7cbf7f18dcc7a60af1c5297a18e2f9f8482d161912"} err="failed to get container status \"9003b7a7d50bae8b7464fd7cbf7f18dcc7a60af1c5297a18e2f9f8482d161912\": rpc error: code = NotFound desc = could not find container \"9003b7a7d50bae8b7464fd7cbf7f18dcc7a60af1c5297a18e2f9f8482d161912\": container with ID starting with 9003b7a7d50bae8b7464fd7cbf7f18dcc7a60af1c5297a18e2f9f8482d161912 not found: ID does not exist" Sep 30 07:43:12 crc kubenswrapper[4760]: I0930 07:43:12.640380 4760 scope.go:117] "RemoveContainer" containerID="6e72f37942962598d925dfb305311c94c3ca920f93243bb6e97839add4a830a0" Sep 30 07:43:12 crc kubenswrapper[4760]: I0930 07:43:12.640712 4760 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6e72f37942962598d925dfb305311c94c3ca920f93243bb6e97839add4a830a0"} err="failed to get container status \"6e72f37942962598d925dfb305311c94c3ca920f93243bb6e97839add4a830a0\": rpc error: code = NotFound desc = could not find container \"6e72f37942962598d925dfb305311c94c3ca920f93243bb6e97839add4a830a0\": container with ID starting with 6e72f37942962598d925dfb305311c94c3ca920f93243bb6e97839add4a830a0 not found: ID does not exist" Sep 30 07:43:12 crc kubenswrapper[4760]: I0930 07:43:12.640729 4760 scope.go:117] "RemoveContainer" containerID="0b6c1fe1d21cc0d118e36d7f9d62fb639a19a42cd8c953700bcd5c289088cdf9" Sep 30 07:43:12 crc kubenswrapper[4760]: I0930 07:43:12.640978 4760 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0b6c1fe1d21cc0d118e36d7f9d62fb639a19a42cd8c953700bcd5c289088cdf9"} err="failed to get container status 
\"0b6c1fe1d21cc0d118e36d7f9d62fb639a19a42cd8c953700bcd5c289088cdf9\": rpc error: code = NotFound desc = could not find container \"0b6c1fe1d21cc0d118e36d7f9d62fb639a19a42cd8c953700bcd5c289088cdf9\": container with ID starting with 0b6c1fe1d21cc0d118e36d7f9d62fb639a19a42cd8c953700bcd5c289088cdf9 not found: ID does not exist" Sep 30 07:43:12 crc kubenswrapper[4760]: I0930 07:43:12.640994 4760 scope.go:117] "RemoveContainer" containerID="bba81e98fd1a75081b7e6a3eaf28b88b4de6e47e1dc158ca72aa2f1c2107933e" Sep 30 07:43:12 crc kubenswrapper[4760]: I0930 07:43:12.641258 4760 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bba81e98fd1a75081b7e6a3eaf28b88b4de6e47e1dc158ca72aa2f1c2107933e"} err="failed to get container status \"bba81e98fd1a75081b7e6a3eaf28b88b4de6e47e1dc158ca72aa2f1c2107933e\": rpc error: code = NotFound desc = could not find container \"bba81e98fd1a75081b7e6a3eaf28b88b4de6e47e1dc158ca72aa2f1c2107933e\": container with ID starting with bba81e98fd1a75081b7e6a3eaf28b88b4de6e47e1dc158ca72aa2f1c2107933e not found: ID does not exist" Sep 30 07:43:12 crc kubenswrapper[4760]: I0930 07:43:12.641277 4760 scope.go:117] "RemoveContainer" containerID="9e64e8566880e723a6cc19ad788bd1cbb02eb639397aea4acd6e9920b034f379" Sep 30 07:43:12 crc kubenswrapper[4760]: I0930 07:43:12.641565 4760 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9e64e8566880e723a6cc19ad788bd1cbb02eb639397aea4acd6e9920b034f379"} err="failed to get container status \"9e64e8566880e723a6cc19ad788bd1cbb02eb639397aea4acd6e9920b034f379\": rpc error: code = NotFound desc = could not find container \"9e64e8566880e723a6cc19ad788bd1cbb02eb639397aea4acd6e9920b034f379\": container with ID starting with 9e64e8566880e723a6cc19ad788bd1cbb02eb639397aea4acd6e9920b034f379 not found: ID does not exist" Sep 30 07:43:12 crc kubenswrapper[4760]: I0930 07:43:12.641593 4760 scope.go:117] "RemoveContainer" 
containerID="04cb9ff2859a579a0bd97d932177b8c19b786666d0177669e93ccec38abf08d5" Sep 30 07:43:12 crc kubenswrapper[4760]: I0930 07:43:12.641833 4760 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"04cb9ff2859a579a0bd97d932177b8c19b786666d0177669e93ccec38abf08d5"} err="failed to get container status \"04cb9ff2859a579a0bd97d932177b8c19b786666d0177669e93ccec38abf08d5\": rpc error: code = NotFound desc = could not find container \"04cb9ff2859a579a0bd97d932177b8c19b786666d0177669e93ccec38abf08d5\": container with ID starting with 04cb9ff2859a579a0bd97d932177b8c19b786666d0177669e93ccec38abf08d5 not found: ID does not exist" Sep 30 07:43:12 crc kubenswrapper[4760]: I0930 07:43:12.641852 4760 scope.go:117] "RemoveContainer" containerID="45d8c685d7b690ce81553e4d924934cf761087dd535145c0e8b5a933f2d61e45" Sep 30 07:43:12 crc kubenswrapper[4760]: I0930 07:43:12.642172 4760 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"45d8c685d7b690ce81553e4d924934cf761087dd535145c0e8b5a933f2d61e45"} err="failed to get container status \"45d8c685d7b690ce81553e4d924934cf761087dd535145c0e8b5a933f2d61e45\": rpc error: code = NotFound desc = could not find container \"45d8c685d7b690ce81553e4d924934cf761087dd535145c0e8b5a933f2d61e45\": container with ID starting with 45d8c685d7b690ce81553e4d924934cf761087dd535145c0e8b5a933f2d61e45 not found: ID does not exist" Sep 30 07:43:12 crc kubenswrapper[4760]: I0930 07:43:12.642190 4760 scope.go:117] "RemoveContainer" containerID="ef1f6b6ca1a2bbb2f9a166a7244c6411470be4655bc6b2a330fd604c3449d185" Sep 30 07:43:12 crc kubenswrapper[4760]: I0930 07:43:12.642465 4760 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ef1f6b6ca1a2bbb2f9a166a7244c6411470be4655bc6b2a330fd604c3449d185"} err="failed to get container status \"ef1f6b6ca1a2bbb2f9a166a7244c6411470be4655bc6b2a330fd604c3449d185\": rpc error: code = NotFound desc = could 
not find container \"ef1f6b6ca1a2bbb2f9a166a7244c6411470be4655bc6b2a330fd604c3449d185\": container with ID starting with ef1f6b6ca1a2bbb2f9a166a7244c6411470be4655bc6b2a330fd604c3449d185 not found: ID does not exist" Sep 30 07:43:12 crc kubenswrapper[4760]: I0930 07:43:12.642482 4760 scope.go:117] "RemoveContainer" containerID="1198d13e2814e4dc38c4411d751fba40440a5967d7ca04c855e34fe90f9a6535" Sep 30 07:43:12 crc kubenswrapper[4760]: I0930 07:43:12.642736 4760 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1198d13e2814e4dc38c4411d751fba40440a5967d7ca04c855e34fe90f9a6535"} err="failed to get container status \"1198d13e2814e4dc38c4411d751fba40440a5967d7ca04c855e34fe90f9a6535\": rpc error: code = NotFound desc = could not find container \"1198d13e2814e4dc38c4411d751fba40440a5967d7ca04c855e34fe90f9a6535\": container with ID starting with 1198d13e2814e4dc38c4411d751fba40440a5967d7ca04c855e34fe90f9a6535 not found: ID does not exist" Sep 30 07:43:12 crc kubenswrapper[4760]: I0930 07:43:12.642753 4760 scope.go:117] "RemoveContainer" containerID="3d9205e4d676fa6a90c4cef893077a6bc91d50564b15f3b221e5827c2b3025ce" Sep 30 07:43:12 crc kubenswrapper[4760]: I0930 07:43:12.642988 4760 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3d9205e4d676fa6a90c4cef893077a6bc91d50564b15f3b221e5827c2b3025ce"} err="failed to get container status \"3d9205e4d676fa6a90c4cef893077a6bc91d50564b15f3b221e5827c2b3025ce\": rpc error: code = NotFound desc = could not find container \"3d9205e4d676fa6a90c4cef893077a6bc91d50564b15f3b221e5827c2b3025ce\": container with ID starting with 3d9205e4d676fa6a90c4cef893077a6bc91d50564b15f3b221e5827c2b3025ce not found: ID does not exist" Sep 30 07:43:12 crc kubenswrapper[4760]: I0930 07:43:12.711281 4760 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-blxw5" Sep 30 07:43:13 crc kubenswrapper[4760]: I0930 07:43:13.078497 4760 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2c4ca8ea-a714-40e5-9e10-080aef32237b" path="/var/lib/kubelet/pods/2c4ca8ea-a714-40e5-9e10-080aef32237b/volumes" Sep 30 07:43:13 crc kubenswrapper[4760]: I0930 07:43:13.446645 4760 generic.go:334] "Generic (PLEG): container finished" podID="f7889829-88ef-4d39-a409-e69a4c112178" containerID="3166f9a1de567b2755d341d17323d33651ac1151fe149d9a65f424513322b5ee" exitCode=0 Sep 30 07:43:13 crc kubenswrapper[4760]: I0930 07:43:13.446717 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-blxw5" event={"ID":"f7889829-88ef-4d39-a409-e69a4c112178","Type":"ContainerDied","Data":"3166f9a1de567b2755d341d17323d33651ac1151fe149d9a65f424513322b5ee"} Sep 30 07:43:13 crc kubenswrapper[4760]: I0930 07:43:13.446753 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-blxw5" event={"ID":"f7889829-88ef-4d39-a409-e69a4c112178","Type":"ContainerStarted","Data":"5b943ba551ac5f5a9e6c23651e961f53e466aa92a608223066b1009bf7aa37e7"} Sep 30 07:43:14 crc kubenswrapper[4760]: I0930 07:43:14.465040 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-blxw5" event={"ID":"f7889829-88ef-4d39-a409-e69a4c112178","Type":"ContainerStarted","Data":"b2f992a31eb898343888a53547306180928548022d9383459033be76b98cf6c8"} Sep 30 07:43:14 crc kubenswrapper[4760]: I0930 07:43:14.465519 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-blxw5" event={"ID":"f7889829-88ef-4d39-a409-e69a4c112178","Type":"ContainerStarted","Data":"427303b49f6addcbe7a1d59c332b4da3b08890f996f711d22fb378f870cab33d"} Sep 30 07:43:14 crc kubenswrapper[4760]: I0930 07:43:14.465543 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-ovn-kubernetes/ovnkube-node-blxw5" event={"ID":"f7889829-88ef-4d39-a409-e69a4c112178","Type":"ContainerStarted","Data":"aea51b81090ed39bf32a13f64d578aa4b1dcb8a149a3b39f8c5d4e88a5ca8641"} Sep 30 07:43:14 crc kubenswrapper[4760]: I0930 07:43:14.465562 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-blxw5" event={"ID":"f7889829-88ef-4d39-a409-e69a4c112178","Type":"ContainerStarted","Data":"b48c1a6b0792adb5ae02be053dd51ffe3424f14258534a093c11313cefed8d64"} Sep 30 07:43:14 crc kubenswrapper[4760]: I0930 07:43:14.465579 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-blxw5" event={"ID":"f7889829-88ef-4d39-a409-e69a4c112178","Type":"ContainerStarted","Data":"10f583e1e3bb3c41509b364a0c3f5f60f28e8b6626c8ec90cc8ef5d1bde5d895"} Sep 30 07:43:14 crc kubenswrapper[4760]: I0930 07:43:14.465596 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-blxw5" event={"ID":"f7889829-88ef-4d39-a409-e69a4c112178","Type":"ContainerStarted","Data":"5ec4e7a4ba2a8fb9ccf2dd6cb62814bc275aca5a01ac3f3061c4298e492d41d4"} Sep 30 07:43:17 crc kubenswrapper[4760]: I0930 07:43:17.490899 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-blxw5" event={"ID":"f7889829-88ef-4d39-a409-e69a4c112178","Type":"ContainerStarted","Data":"68840c8e64687f81190c15d10915a63ea0a1d9bc1dc52ac7776b201db23bf680"} Sep 30 07:43:19 crc kubenswrapper[4760]: I0930 07:43:19.113624 4760 patch_prober.go:28] interesting pod/machine-config-daemon-f2lrk container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 07:43:19 crc kubenswrapper[4760]: I0930 07:43:19.114140 4760 prober.go:107] "Probe failed" probeType="Liveness" 
pod="openshift-machine-config-operator/machine-config-daemon-f2lrk" podUID="7a9c8270-6964-4886-87d0-227b05b76da4" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 30 07:43:19 crc kubenswrapper[4760]: I0930 07:43:19.114224 4760 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-f2lrk" Sep 30 07:43:19 crc kubenswrapper[4760]: I0930 07:43:19.115243 4760 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"c14c5e22dc0508a193bdba7225efcdfbf417d8b9976aacad55c5d22c10bc7a92"} pod="openshift-machine-config-operator/machine-config-daemon-f2lrk" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Sep 30 07:43:19 crc kubenswrapper[4760]: I0930 07:43:19.115422 4760 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-f2lrk" podUID="7a9c8270-6964-4886-87d0-227b05b76da4" containerName="machine-config-daemon" containerID="cri-o://c14c5e22dc0508a193bdba7225efcdfbf417d8b9976aacad55c5d22c10bc7a92" gracePeriod=600 Sep 30 07:43:19 crc kubenswrapper[4760]: I0930 07:43:19.503860 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-blxw5" event={"ID":"f7889829-88ef-4d39-a409-e69a4c112178","Type":"ContainerStarted","Data":"ccfe322f96dd5bd63e0ab16c403547beafa280577499754450358a7d34fa1a52"} Sep 30 07:43:19 crc kubenswrapper[4760]: I0930 07:43:19.504032 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-blxw5" Sep 30 07:43:19 crc kubenswrapper[4760]: I0930 07:43:19.504049 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-blxw5" Sep 30 
07:43:19 crc kubenswrapper[4760]: I0930 07:43:19.504059 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-blxw5" Sep 30 07:43:19 crc kubenswrapper[4760]: I0930 07:43:19.506903 4760 generic.go:334] "Generic (PLEG): container finished" podID="7a9c8270-6964-4886-87d0-227b05b76da4" containerID="c14c5e22dc0508a193bdba7225efcdfbf417d8b9976aacad55c5d22c10bc7a92" exitCode=0 Sep 30 07:43:19 crc kubenswrapper[4760]: I0930 07:43:19.506956 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-f2lrk" event={"ID":"7a9c8270-6964-4886-87d0-227b05b76da4","Type":"ContainerDied","Data":"c14c5e22dc0508a193bdba7225efcdfbf417d8b9976aacad55c5d22c10bc7a92"} Sep 30 07:43:19 crc kubenswrapper[4760]: I0930 07:43:19.507028 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-f2lrk" event={"ID":"7a9c8270-6964-4886-87d0-227b05b76da4","Type":"ContainerStarted","Data":"2ee85f916ed74821bb70e759d7116d1ced5e1cd63215791b862f6d48359d7b6c"} Sep 30 07:43:19 crc kubenswrapper[4760]: I0930 07:43:19.507063 4760 scope.go:117] "RemoveContainer" containerID="316186f24b90a3f80ce14b2c1f47627d59bb457c83e7ed3a00cd62894b2b866d" Sep 30 07:43:19 crc kubenswrapper[4760]: I0930 07:43:19.529972 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-blxw5" Sep 30 07:43:19 crc kubenswrapper[4760]: I0930 07:43:19.532837 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-blxw5" Sep 30 07:43:19 crc kubenswrapper[4760]: I0930 07:43:19.546749 4760 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-blxw5" podStartSLOduration=7.54673208 podStartE2EDuration="7.54673208s" podCreationTimestamp="2025-09-30 07:43:12 +0000 UTC" firstStartedPulling="0001-01-01 
00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 07:43:19.544695538 +0000 UTC m=+585.187601950" watchObservedRunningTime="2025-09-30 07:43:19.54673208 +0000 UTC m=+585.189638502" Sep 30 07:43:27 crc kubenswrapper[4760]: I0930 07:43:27.067410 4760 scope.go:117] "RemoveContainer" containerID="2db52c47db3f1a41355726d96c0fc8510bc80589120da7e96b2b0af67aecea6a" Sep 30 07:43:27 crc kubenswrapper[4760]: E0930 07:43:27.068234 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-multus pod=multus-lvdpk_openshift-multus(f50c364e-d22c-4fe5-a0aa-66f4e8d8b21e)\"" pod="openshift-multus/multus-lvdpk" podUID="f50c364e-d22c-4fe5-a0aa-66f4e8d8b21e" Sep 30 07:43:35 crc kubenswrapper[4760]: I0930 07:43:35.407720 4760 scope.go:117] "RemoveContainer" containerID="0b2f3cfbeb8083c685469a0d988253e1fd9c2403954dda3cc742b87225c82927" Sep 30 07:43:35 crc kubenswrapper[4760]: I0930 07:43:35.616569 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-lvdpk_f50c364e-d22c-4fe5-a0aa-66f4e8d8b21e/kube-multus/2.log" Sep 30 07:43:39 crc kubenswrapper[4760]: I0930 07:43:39.583568 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dzf76f"] Sep 30 07:43:39 crc kubenswrapper[4760]: I0930 07:43:39.586703 4760 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dzf76f" Sep 30 07:43:39 crc kubenswrapper[4760]: I0930 07:43:39.589646 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Sep 30 07:43:39 crc kubenswrapper[4760]: I0930 07:43:39.608042 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dzf76f"] Sep 30 07:43:39 crc kubenswrapper[4760]: I0930 07:43:39.658529 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/32cb0fa2-d830-4589-8379-418cf93913d5-util\") pod \"a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dzf76f\" (UID: \"32cb0fa2-d830-4589-8379-418cf93913d5\") " pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dzf76f" Sep 30 07:43:39 crc kubenswrapper[4760]: I0930 07:43:39.658623 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/32cb0fa2-d830-4589-8379-418cf93913d5-bundle\") pod \"a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dzf76f\" (UID: \"32cb0fa2-d830-4589-8379-418cf93913d5\") " pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dzf76f" Sep 30 07:43:39 crc kubenswrapper[4760]: I0930 07:43:39.658768 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-grnn8\" (UniqueName: \"kubernetes.io/projected/32cb0fa2-d830-4589-8379-418cf93913d5-kube-api-access-grnn8\") pod \"a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dzf76f\" (UID: \"32cb0fa2-d830-4589-8379-418cf93913d5\") " pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dzf76f" Sep 30 07:43:39 crc kubenswrapper[4760]: 
I0930 07:43:39.760198 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/32cb0fa2-d830-4589-8379-418cf93913d5-util\") pod \"a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dzf76f\" (UID: \"32cb0fa2-d830-4589-8379-418cf93913d5\") " pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dzf76f" Sep 30 07:43:39 crc kubenswrapper[4760]: I0930 07:43:39.760658 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/32cb0fa2-d830-4589-8379-418cf93913d5-bundle\") pod \"a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dzf76f\" (UID: \"32cb0fa2-d830-4589-8379-418cf93913d5\") " pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dzf76f" Sep 30 07:43:39 crc kubenswrapper[4760]: I0930 07:43:39.760988 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-grnn8\" (UniqueName: \"kubernetes.io/projected/32cb0fa2-d830-4589-8379-418cf93913d5-kube-api-access-grnn8\") pod \"a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dzf76f\" (UID: \"32cb0fa2-d830-4589-8379-418cf93913d5\") " pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dzf76f" Sep 30 07:43:39 crc kubenswrapper[4760]: I0930 07:43:39.761169 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/32cb0fa2-d830-4589-8379-418cf93913d5-util\") pod \"a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dzf76f\" (UID: \"32cb0fa2-d830-4589-8379-418cf93913d5\") " pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dzf76f" Sep 30 07:43:39 crc kubenswrapper[4760]: I0930 07:43:39.761455 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: 
\"kubernetes.io/empty-dir/32cb0fa2-d830-4589-8379-418cf93913d5-bundle\") pod \"a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dzf76f\" (UID: \"32cb0fa2-d830-4589-8379-418cf93913d5\") " pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dzf76f" Sep 30 07:43:39 crc kubenswrapper[4760]: I0930 07:43:39.795039 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-grnn8\" (UniqueName: \"kubernetes.io/projected/32cb0fa2-d830-4589-8379-418cf93913d5-kube-api-access-grnn8\") pod \"a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dzf76f\" (UID: \"32cb0fa2-d830-4589-8379-418cf93913d5\") " pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dzf76f" Sep 30 07:43:39 crc kubenswrapper[4760]: I0930 07:43:39.919070 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dzf76f" Sep 30 07:43:39 crc kubenswrapper[4760]: E0930 07:43:39.955376 4760 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dzf76f_openshift-marketplace_32cb0fa2-d830-4589-8379-418cf93913d5_0(705949cc3c492a13501ad9a767e96d09da7e2149316057d7a4f258130d7f00f7): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Sep 30 07:43:39 crc kubenswrapper[4760]: E0930 07:43:39.955529 4760 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dzf76f_openshift-marketplace_32cb0fa2-d830-4589-8379-418cf93913d5_0(705949cc3c492a13501ad9a767e96d09da7e2149316057d7a4f258130d7f00f7): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dzf76f" Sep 30 07:43:39 crc kubenswrapper[4760]: E0930 07:43:39.955572 4760 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dzf76f_openshift-marketplace_32cb0fa2-d830-4589-8379-418cf93913d5_0(705949cc3c492a13501ad9a767e96d09da7e2149316057d7a4f258130d7f00f7): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dzf76f" Sep 30 07:43:39 crc kubenswrapper[4760]: E0930 07:43:39.955711 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dzf76f_openshift-marketplace(32cb0fa2-d830-4589-8379-418cf93913d5)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dzf76f_openshift-marketplace(32cb0fa2-d830-4589-8379-418cf93913d5)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dzf76f_openshift-marketplace_32cb0fa2-d830-4589-8379-418cf93913d5_0(705949cc3c492a13501ad9a767e96d09da7e2149316057d7a4f258130d7f00f7): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dzf76f" podUID="32cb0fa2-d830-4589-8379-418cf93913d5" Sep 30 07:43:40 crc kubenswrapper[4760]: I0930 07:43:40.648580 4760 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dzf76f" Sep 30 07:43:40 crc kubenswrapper[4760]: I0930 07:43:40.649365 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dzf76f" Sep 30 07:43:40 crc kubenswrapper[4760]: E0930 07:43:40.686508 4760 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dzf76f_openshift-marketplace_32cb0fa2-d830-4589-8379-418cf93913d5_0(17ce0626442516e7f7fe28dcf14bc35cc8965410b9a76da7baaabf2b19b3337b): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Sep 30 07:43:40 crc kubenswrapper[4760]: E0930 07:43:40.686595 4760 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dzf76f_openshift-marketplace_32cb0fa2-d830-4589-8379-418cf93913d5_0(17ce0626442516e7f7fe28dcf14bc35cc8965410b9a76da7baaabf2b19b3337b): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dzf76f" Sep 30 07:43:40 crc kubenswrapper[4760]: E0930 07:43:40.686626 4760 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dzf76f_openshift-marketplace_32cb0fa2-d830-4589-8379-418cf93913d5_0(17ce0626442516e7f7fe28dcf14bc35cc8965410b9a76da7baaabf2b19b3337b): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dzf76f" Sep 30 07:43:40 crc kubenswrapper[4760]: E0930 07:43:40.686696 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dzf76f_openshift-marketplace(32cb0fa2-d830-4589-8379-418cf93913d5)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dzf76f_openshift-marketplace(32cb0fa2-d830-4589-8379-418cf93913d5)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dzf76f_openshift-marketplace_32cb0fa2-d830-4589-8379-418cf93913d5_0(17ce0626442516e7f7fe28dcf14bc35cc8965410b9a76da7baaabf2b19b3337b): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dzf76f" podUID="32cb0fa2-d830-4589-8379-418cf93913d5" Sep 30 07:43:42 crc kubenswrapper[4760]: I0930 07:43:42.066884 4760 scope.go:117] "RemoveContainer" containerID="2db52c47db3f1a41355726d96c0fc8510bc80589120da7e96b2b0af67aecea6a" Sep 30 07:43:42 crc kubenswrapper[4760]: I0930 07:43:42.664291 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-lvdpk_f50c364e-d22c-4fe5-a0aa-66f4e8d8b21e/kube-multus/2.log" Sep 30 07:43:42 crc kubenswrapper[4760]: I0930 07:43:42.664405 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-lvdpk" event={"ID":"f50c364e-d22c-4fe5-a0aa-66f4e8d8b21e","Type":"ContainerStarted","Data":"931e04e4d3d5face4b3a902855e88f19f619a084be2dbf145d29b87bcf208599"} Sep 30 07:43:42 crc kubenswrapper[4760]: I0930 07:43:42.746360 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-blxw5" Sep 30 07:43:55 crc kubenswrapper[4760]: 
I0930 07:43:55.066760 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dzf76f" Sep 30 07:43:55 crc kubenswrapper[4760]: I0930 07:43:55.071951 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dzf76f" Sep 30 07:43:55 crc kubenswrapper[4760]: I0930 07:43:55.352546 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dzf76f"] Sep 30 07:43:55 crc kubenswrapper[4760]: I0930 07:43:55.743688 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dzf76f" event={"ID":"32cb0fa2-d830-4589-8379-418cf93913d5","Type":"ContainerStarted","Data":"57c3b4aa762c6f342a6898a41e2c9c367763a4fe0f13f52136a47b89685844ea"} Sep 30 07:43:55 crc kubenswrapper[4760]: I0930 07:43:55.743757 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dzf76f" event={"ID":"32cb0fa2-d830-4589-8379-418cf93913d5","Type":"ContainerStarted","Data":"832f1c45626f63ca93d038f0c17f173673fe547fda06d46697838ecff9226047"} Sep 30 07:43:56 crc kubenswrapper[4760]: I0930 07:43:56.751250 4760 generic.go:334] "Generic (PLEG): container finished" podID="32cb0fa2-d830-4589-8379-418cf93913d5" containerID="57c3b4aa762c6f342a6898a41e2c9c367763a4fe0f13f52136a47b89685844ea" exitCode=0 Sep 30 07:43:56 crc kubenswrapper[4760]: I0930 07:43:56.751353 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dzf76f" event={"ID":"32cb0fa2-d830-4589-8379-418cf93913d5","Type":"ContainerDied","Data":"57c3b4aa762c6f342a6898a41e2c9c367763a4fe0f13f52136a47b89685844ea"} Sep 30 07:43:58 crc 
kubenswrapper[4760]: I0930 07:43:58.768667 4760 generic.go:334] "Generic (PLEG): container finished" podID="32cb0fa2-d830-4589-8379-418cf93913d5" containerID="b4c229e285a48115db724d6ae2d2f8ed5e0b7089cc20bfed13f879226f126beb" exitCode=0 Sep 30 07:43:58 crc kubenswrapper[4760]: I0930 07:43:58.768827 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dzf76f" event={"ID":"32cb0fa2-d830-4589-8379-418cf93913d5","Type":"ContainerDied","Data":"b4c229e285a48115db724d6ae2d2f8ed5e0b7089cc20bfed13f879226f126beb"} Sep 30 07:43:59 crc kubenswrapper[4760]: I0930 07:43:59.784212 4760 generic.go:334] "Generic (PLEG): container finished" podID="32cb0fa2-d830-4589-8379-418cf93913d5" containerID="e9fb7bc59f8a49231518d84af4d951f66c2bf6bbe5977681a9fb697d2b3fdbb9" exitCode=0 Sep 30 07:43:59 crc kubenswrapper[4760]: I0930 07:43:59.784343 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dzf76f" event={"ID":"32cb0fa2-d830-4589-8379-418cf93913d5","Type":"ContainerDied","Data":"e9fb7bc59f8a49231518d84af4d951f66c2bf6bbe5977681a9fb697d2b3fdbb9"} Sep 30 07:44:01 crc kubenswrapper[4760]: I0930 07:44:01.113751 4760 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dzf76f" Sep 30 07:44:01 crc kubenswrapper[4760]: I0930 07:44:01.262648 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/32cb0fa2-d830-4589-8379-418cf93913d5-bundle\") pod \"32cb0fa2-d830-4589-8379-418cf93913d5\" (UID: \"32cb0fa2-d830-4589-8379-418cf93913d5\") " Sep 30 07:44:01 crc kubenswrapper[4760]: I0930 07:44:01.263118 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/32cb0fa2-d830-4589-8379-418cf93913d5-util\") pod \"32cb0fa2-d830-4589-8379-418cf93913d5\" (UID: \"32cb0fa2-d830-4589-8379-418cf93913d5\") " Sep 30 07:44:01 crc kubenswrapper[4760]: I0930 07:44:01.263223 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-grnn8\" (UniqueName: \"kubernetes.io/projected/32cb0fa2-d830-4589-8379-418cf93913d5-kube-api-access-grnn8\") pod \"32cb0fa2-d830-4589-8379-418cf93913d5\" (UID: \"32cb0fa2-d830-4589-8379-418cf93913d5\") " Sep 30 07:44:01 crc kubenswrapper[4760]: I0930 07:44:01.265281 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/32cb0fa2-d830-4589-8379-418cf93913d5-bundle" (OuterVolumeSpecName: "bundle") pod "32cb0fa2-d830-4589-8379-418cf93913d5" (UID: "32cb0fa2-d830-4589-8379-418cf93913d5"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 07:44:01 crc kubenswrapper[4760]: I0930 07:44:01.280718 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/32cb0fa2-d830-4589-8379-418cf93913d5-kube-api-access-grnn8" (OuterVolumeSpecName: "kube-api-access-grnn8") pod "32cb0fa2-d830-4589-8379-418cf93913d5" (UID: "32cb0fa2-d830-4589-8379-418cf93913d5"). InnerVolumeSpecName "kube-api-access-grnn8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 07:44:01 crc kubenswrapper[4760]: I0930 07:44:01.289663 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/32cb0fa2-d830-4589-8379-418cf93913d5-util" (OuterVolumeSpecName: "util") pod "32cb0fa2-d830-4589-8379-418cf93913d5" (UID: "32cb0fa2-d830-4589-8379-418cf93913d5"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 07:44:01 crc kubenswrapper[4760]: I0930 07:44:01.364878 4760 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/32cb0fa2-d830-4589-8379-418cf93913d5-util\") on node \"crc\" DevicePath \"\"" Sep 30 07:44:01 crc kubenswrapper[4760]: I0930 07:44:01.364927 4760 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-grnn8\" (UniqueName: \"kubernetes.io/projected/32cb0fa2-d830-4589-8379-418cf93913d5-kube-api-access-grnn8\") on node \"crc\" DevicePath \"\"" Sep 30 07:44:01 crc kubenswrapper[4760]: I0930 07:44:01.364942 4760 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/32cb0fa2-d830-4589-8379-418cf93913d5-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 07:44:01 crc kubenswrapper[4760]: I0930 07:44:01.799881 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dzf76f" event={"ID":"32cb0fa2-d830-4589-8379-418cf93913d5","Type":"ContainerDied","Data":"832f1c45626f63ca93d038f0c17f173673fe547fda06d46697838ecff9226047"} Sep 30 07:44:01 crc kubenswrapper[4760]: I0930 07:44:01.799972 4760 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="832f1c45626f63ca93d038f0c17f173673fe547fda06d46697838ecff9226047" Sep 30 07:44:01 crc kubenswrapper[4760]: I0930 07:44:01.800058 4760 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dzf76f" Sep 30 07:44:12 crc kubenswrapper[4760]: I0930 07:44:12.114699 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/obo-prometheus-operator-7c8cf85677-gvl7c"] Sep 30 07:44:12 crc kubenswrapper[4760]: E0930 07:44:12.115455 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="32cb0fa2-d830-4589-8379-418cf93913d5" containerName="util" Sep 30 07:44:12 crc kubenswrapper[4760]: I0930 07:44:12.115470 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="32cb0fa2-d830-4589-8379-418cf93913d5" containerName="util" Sep 30 07:44:12 crc kubenswrapper[4760]: E0930 07:44:12.115490 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="32cb0fa2-d830-4589-8379-418cf93913d5" containerName="extract" Sep 30 07:44:12 crc kubenswrapper[4760]: I0930 07:44:12.115496 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="32cb0fa2-d830-4589-8379-418cf93913d5" containerName="extract" Sep 30 07:44:12 crc kubenswrapper[4760]: E0930 07:44:12.115574 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="32cb0fa2-d830-4589-8379-418cf93913d5" containerName="pull" Sep 30 07:44:12 crc kubenswrapper[4760]: I0930 07:44:12.115580 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="32cb0fa2-d830-4589-8379-418cf93913d5" containerName="pull" Sep 30 07:44:12 crc kubenswrapper[4760]: I0930 07:44:12.115691 4760 memory_manager.go:354] "RemoveStaleState removing state" podUID="32cb0fa2-d830-4589-8379-418cf93913d5" containerName="extract" Sep 30 07:44:12 crc kubenswrapper[4760]: I0930 07:44:12.116067 4760 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/obo-prometheus-operator-7c8cf85677-gvl7c" Sep 30 07:44:12 crc kubenswrapper[4760]: I0930 07:44:12.118260 4760 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operators"/"openshift-service-ca.crt" Sep 30 07:44:12 crc kubenswrapper[4760]: I0930 07:44:12.118391 4760 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operators"/"kube-root-ca.crt" Sep 30 07:44:12 crc kubenswrapper[4760]: I0930 07:44:12.118812 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"obo-prometheus-operator-dockercfg-h7m6n" Sep 30 07:44:12 crc kubenswrapper[4760]: I0930 07:44:12.132218 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-7c8cf85677-gvl7c"] Sep 30 07:44:12 crc kubenswrapper[4760]: I0930 07:44:12.242375 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-6899d445c8-k9fmb"] Sep 30 07:44:12 crc kubenswrapper[4760]: I0930 07:44:12.242997 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-6899d445c8-k9fmb" Sep 30 07:44:12 crc kubenswrapper[4760]: I0930 07:44:12.244562 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"obo-prometheus-operator-admission-webhook-dockercfg-97kct" Sep 30 07:44:12 crc kubenswrapper[4760]: I0930 07:44:12.244629 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"obo-prometheus-operator-admission-webhook-service-cert" Sep 30 07:44:12 crc kubenswrapper[4760]: I0930 07:44:12.271857 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-6899d445c8-g78wr"] Sep 30 07:44:12 crc kubenswrapper[4760]: I0930 07:44:12.272702 4760 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-6899d445c8-g78wr" Sep 30 07:44:12 crc kubenswrapper[4760]: I0930 07:44:12.275554 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-6899d445c8-k9fmb"] Sep 30 07:44:12 crc kubenswrapper[4760]: I0930 07:44:12.286191 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-6899d445c8-g78wr"] Sep 30 07:44:12 crc kubenswrapper[4760]: I0930 07:44:12.291810 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5l266\" (UniqueName: \"kubernetes.io/projected/c35951af-e973-4663-9db5-2c5ac164bbba-kube-api-access-5l266\") pod \"obo-prometheus-operator-7c8cf85677-gvl7c\" (UID: \"c35951af-e973-4663-9db5-2c5ac164bbba\") " pod="openshift-operators/obo-prometheus-operator-7c8cf85677-gvl7c" Sep 30 07:44:12 crc kubenswrapper[4760]: I0930 07:44:12.392684 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5l266\" (UniqueName: \"kubernetes.io/projected/c35951af-e973-4663-9db5-2c5ac164bbba-kube-api-access-5l266\") pod \"obo-prometheus-operator-7c8cf85677-gvl7c\" (UID: \"c35951af-e973-4663-9db5-2c5ac164bbba\") " pod="openshift-operators/obo-prometheus-operator-7c8cf85677-gvl7c" Sep 30 07:44:12 crc kubenswrapper[4760]: I0930 07:44:12.392757 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/790a604d-1726-4fc9-8e29-e30af2f26616-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-6899d445c8-k9fmb\" (UID: \"790a604d-1726-4fc9-8e29-e30af2f26616\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-6899d445c8-k9fmb" Sep 30 07:44:12 crc kubenswrapper[4760]: I0930 07:44:12.392786 4760 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/790a604d-1726-4fc9-8e29-e30af2f26616-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-6899d445c8-k9fmb\" (UID: \"790a604d-1726-4fc9-8e29-e30af2f26616\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-6899d445c8-k9fmb" Sep 30 07:44:12 crc kubenswrapper[4760]: I0930 07:44:12.392832 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/b7b18a96-fb82-48a3-a34e-ebea9ef3eb75-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-6899d445c8-g78wr\" (UID: \"b7b18a96-fb82-48a3-a34e-ebea9ef3eb75\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-6899d445c8-g78wr" Sep 30 07:44:12 crc kubenswrapper[4760]: I0930 07:44:12.392856 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/b7b18a96-fb82-48a3-a34e-ebea9ef3eb75-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-6899d445c8-g78wr\" (UID: \"b7b18a96-fb82-48a3-a34e-ebea9ef3eb75\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-6899d445c8-g78wr" Sep 30 07:44:12 crc kubenswrapper[4760]: I0930 07:44:12.422080 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5l266\" (UniqueName: \"kubernetes.io/projected/c35951af-e973-4663-9db5-2c5ac164bbba-kube-api-access-5l266\") pod \"obo-prometheus-operator-7c8cf85677-gvl7c\" (UID: \"c35951af-e973-4663-9db5-2c5ac164bbba\") " pod="openshift-operators/obo-prometheus-operator-7c8cf85677-gvl7c" Sep 30 07:44:12 crc kubenswrapper[4760]: I0930 07:44:12.437021 4760 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/obo-prometheus-operator-7c8cf85677-gvl7c" Sep 30 07:44:12 crc kubenswrapper[4760]: I0930 07:44:12.467665 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/observability-operator-cc5f78dfc-cpxd4"] Sep 30 07:44:12 crc kubenswrapper[4760]: I0930 07:44:12.468441 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/observability-operator-cc5f78dfc-cpxd4" Sep 30 07:44:12 crc kubenswrapper[4760]: I0930 07:44:12.473159 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"observability-operator-tls" Sep 30 07:44:12 crc kubenswrapper[4760]: I0930 07:44:12.473408 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"observability-operator-sa-dockercfg-lzmsn" Sep 30 07:44:12 crc kubenswrapper[4760]: I0930 07:44:12.483070 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/observability-operator-cc5f78dfc-cpxd4"] Sep 30 07:44:12 crc kubenswrapper[4760]: I0930 07:44:12.494329 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/b7b18a96-fb82-48a3-a34e-ebea9ef3eb75-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-6899d445c8-g78wr\" (UID: \"b7b18a96-fb82-48a3-a34e-ebea9ef3eb75\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-6899d445c8-g78wr" Sep 30 07:44:12 crc kubenswrapper[4760]: I0930 07:44:12.494433 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/790a604d-1726-4fc9-8e29-e30af2f26616-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-6899d445c8-k9fmb\" (UID: \"790a604d-1726-4fc9-8e29-e30af2f26616\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-6899d445c8-k9fmb" Sep 30 07:44:12 crc kubenswrapper[4760]: I0930 
07:44:12.494467 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/790a604d-1726-4fc9-8e29-e30af2f26616-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-6899d445c8-k9fmb\" (UID: \"790a604d-1726-4fc9-8e29-e30af2f26616\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-6899d445c8-k9fmb" Sep 30 07:44:12 crc kubenswrapper[4760]: I0930 07:44:12.494489 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-55r9l\" (UniqueName: \"kubernetes.io/projected/04a90715-31eb-49fb-9682-0a211630eede-kube-api-access-55r9l\") pod \"observability-operator-cc5f78dfc-cpxd4\" (UID: \"04a90715-31eb-49fb-9682-0a211630eede\") " pod="openshift-operators/observability-operator-cc5f78dfc-cpxd4" Sep 30 07:44:12 crc kubenswrapper[4760]: I0930 07:44:12.494521 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"observability-operator-tls\" (UniqueName: \"kubernetes.io/secret/04a90715-31eb-49fb-9682-0a211630eede-observability-operator-tls\") pod \"observability-operator-cc5f78dfc-cpxd4\" (UID: \"04a90715-31eb-49fb-9682-0a211630eede\") " pod="openshift-operators/observability-operator-cc5f78dfc-cpxd4" Sep 30 07:44:12 crc kubenswrapper[4760]: I0930 07:44:12.494546 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/b7b18a96-fb82-48a3-a34e-ebea9ef3eb75-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-6899d445c8-g78wr\" (UID: \"b7b18a96-fb82-48a3-a34e-ebea9ef3eb75\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-6899d445c8-g78wr" Sep 30 07:44:12 crc kubenswrapper[4760]: I0930 07:44:12.498973 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: 
\"kubernetes.io/secret/b7b18a96-fb82-48a3-a34e-ebea9ef3eb75-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-6899d445c8-g78wr\" (UID: \"b7b18a96-fb82-48a3-a34e-ebea9ef3eb75\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-6899d445c8-g78wr" Sep 30 07:44:12 crc kubenswrapper[4760]: I0930 07:44:12.499580 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/790a604d-1726-4fc9-8e29-e30af2f26616-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-6899d445c8-k9fmb\" (UID: \"790a604d-1726-4fc9-8e29-e30af2f26616\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-6899d445c8-k9fmb" Sep 30 07:44:12 crc kubenswrapper[4760]: I0930 07:44:12.499589 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/790a604d-1726-4fc9-8e29-e30af2f26616-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-6899d445c8-k9fmb\" (UID: \"790a604d-1726-4fc9-8e29-e30af2f26616\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-6899d445c8-k9fmb" Sep 30 07:44:12 crc kubenswrapper[4760]: I0930 07:44:12.503637 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/b7b18a96-fb82-48a3-a34e-ebea9ef3eb75-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-6899d445c8-g78wr\" (UID: \"b7b18a96-fb82-48a3-a34e-ebea9ef3eb75\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-6899d445c8-g78wr" Sep 30 07:44:12 crc kubenswrapper[4760]: I0930 07:44:12.563291 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-6899d445c8-k9fmb" Sep 30 07:44:12 crc kubenswrapper[4760]: I0930 07:44:12.588198 4760 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-6899d445c8-g78wr" Sep 30 07:44:12 crc kubenswrapper[4760]: I0930 07:44:12.595832 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-55r9l\" (UniqueName: \"kubernetes.io/projected/04a90715-31eb-49fb-9682-0a211630eede-kube-api-access-55r9l\") pod \"observability-operator-cc5f78dfc-cpxd4\" (UID: \"04a90715-31eb-49fb-9682-0a211630eede\") " pod="openshift-operators/observability-operator-cc5f78dfc-cpxd4" Sep 30 07:44:12 crc kubenswrapper[4760]: I0930 07:44:12.595969 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"observability-operator-tls\" (UniqueName: \"kubernetes.io/secret/04a90715-31eb-49fb-9682-0a211630eede-observability-operator-tls\") pod \"observability-operator-cc5f78dfc-cpxd4\" (UID: \"04a90715-31eb-49fb-9682-0a211630eede\") " pod="openshift-operators/observability-operator-cc5f78dfc-cpxd4" Sep 30 07:44:12 crc kubenswrapper[4760]: I0930 07:44:12.601832 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"observability-operator-tls\" (UniqueName: \"kubernetes.io/secret/04a90715-31eb-49fb-9682-0a211630eede-observability-operator-tls\") pod \"observability-operator-cc5f78dfc-cpxd4\" (UID: \"04a90715-31eb-49fb-9682-0a211630eede\") " pod="openshift-operators/observability-operator-cc5f78dfc-cpxd4" Sep 30 07:44:12 crc kubenswrapper[4760]: I0930 07:44:12.615421 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-55r9l\" (UniqueName: \"kubernetes.io/projected/04a90715-31eb-49fb-9682-0a211630eede-kube-api-access-55r9l\") pod \"observability-operator-cc5f78dfc-cpxd4\" (UID: \"04a90715-31eb-49fb-9682-0a211630eede\") " pod="openshift-operators/observability-operator-cc5f78dfc-cpxd4" Sep 30 07:44:12 crc kubenswrapper[4760]: I0930 07:44:12.726391 4760 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-operators/perses-operator-54bc95c9fb-4pgmn"] Sep 30 07:44:12 crc kubenswrapper[4760]: I0930 07:44:12.727555 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/perses-operator-54bc95c9fb-4pgmn" Sep 30 07:44:12 crc kubenswrapper[4760]: I0930 07:44:12.736942 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"perses-operator-dockercfg-zzwj8" Sep 30 07:44:12 crc kubenswrapper[4760]: I0930 07:44:12.740613 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/perses-operator-54bc95c9fb-4pgmn"] Sep 30 07:44:12 crc kubenswrapper[4760]: I0930 07:44:12.788623 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-7c8cf85677-gvl7c"] Sep 30 07:44:12 crc kubenswrapper[4760]: I0930 07:44:12.860898 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-7c8cf85677-gvl7c" event={"ID":"c35951af-e973-4663-9db5-2c5ac164bbba","Type":"ContainerStarted","Data":"19c56620ea02cad360d64c97290859598b40965cc70e9ce677a386c6bca9a60f"} Sep 30 07:44:12 crc kubenswrapper[4760]: I0930 07:44:12.867528 4760 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/observability-operator-cc5f78dfc-cpxd4" Sep 30 07:44:12 crc kubenswrapper[4760]: I0930 07:44:12.901219 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k7qx8\" (UniqueName: \"kubernetes.io/projected/fed44a9b-44ce-4650-b854-6c84c8536c57-kube-api-access-k7qx8\") pod \"perses-operator-54bc95c9fb-4pgmn\" (UID: \"fed44a9b-44ce-4650-b854-6c84c8536c57\") " pod="openshift-operators/perses-operator-54bc95c9fb-4pgmn" Sep 30 07:44:12 crc kubenswrapper[4760]: I0930 07:44:12.901276 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/fed44a9b-44ce-4650-b854-6c84c8536c57-openshift-service-ca\") pod \"perses-operator-54bc95c9fb-4pgmn\" (UID: \"fed44a9b-44ce-4650-b854-6c84c8536c57\") " pod="openshift-operators/perses-operator-54bc95c9fb-4pgmn" Sep 30 07:44:13 crc kubenswrapper[4760]: I0930 07:44:13.004839 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k7qx8\" (UniqueName: \"kubernetes.io/projected/fed44a9b-44ce-4650-b854-6c84c8536c57-kube-api-access-k7qx8\") pod \"perses-operator-54bc95c9fb-4pgmn\" (UID: \"fed44a9b-44ce-4650-b854-6c84c8536c57\") " pod="openshift-operators/perses-operator-54bc95c9fb-4pgmn" Sep 30 07:44:13 crc kubenswrapper[4760]: I0930 07:44:13.005172 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/fed44a9b-44ce-4650-b854-6c84c8536c57-openshift-service-ca\") pod \"perses-operator-54bc95c9fb-4pgmn\" (UID: \"fed44a9b-44ce-4650-b854-6c84c8536c57\") " pod="openshift-operators/perses-operator-54bc95c9fb-4pgmn" Sep 30 07:44:13 crc kubenswrapper[4760]: I0930 07:44:13.006011 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openshift-service-ca\" (UniqueName: 
\"kubernetes.io/configmap/fed44a9b-44ce-4650-b854-6c84c8536c57-openshift-service-ca\") pod \"perses-operator-54bc95c9fb-4pgmn\" (UID: \"fed44a9b-44ce-4650-b854-6c84c8536c57\") " pod="openshift-operators/perses-operator-54bc95c9fb-4pgmn" Sep 30 07:44:13 crc kubenswrapper[4760]: I0930 07:44:13.031889 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k7qx8\" (UniqueName: \"kubernetes.io/projected/fed44a9b-44ce-4650-b854-6c84c8536c57-kube-api-access-k7qx8\") pod \"perses-operator-54bc95c9fb-4pgmn\" (UID: \"fed44a9b-44ce-4650-b854-6c84c8536c57\") " pod="openshift-operators/perses-operator-54bc95c9fb-4pgmn" Sep 30 07:44:13 crc kubenswrapper[4760]: I0930 07:44:13.064700 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/perses-operator-54bc95c9fb-4pgmn" Sep 30 07:44:13 crc kubenswrapper[4760]: I0930 07:44:13.091618 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/observability-operator-cc5f78dfc-cpxd4"] Sep 30 07:44:13 crc kubenswrapper[4760]: W0930 07:44:13.096616 4760 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod04a90715_31eb_49fb_9682_0a211630eede.slice/crio-fc82881bf806123c8d33f91d95652202c387e9fa0bfcd836a3efa2f7bef3dadd WatchSource:0}: Error finding container fc82881bf806123c8d33f91d95652202c387e9fa0bfcd836a3efa2f7bef3dadd: Status 404 returned error can't find the container with id fc82881bf806123c8d33f91d95652202c387e9fa0bfcd836a3efa2f7bef3dadd Sep 30 07:44:13 crc kubenswrapper[4760]: I0930 07:44:13.111087 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-6899d445c8-k9fmb"] Sep 30 07:44:13 crc kubenswrapper[4760]: W0930 07:44:13.114921 4760 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod790a604d_1726_4fc9_8e29_e30af2f26616.slice/crio-62e201cbfb0115d2471df02590ceda562ca17eb892b33279d8cb23d176000ed0 WatchSource:0}: Error finding container 62e201cbfb0115d2471df02590ceda562ca17eb892b33279d8cb23d176000ed0: Status 404 returned error can't find the container with id 62e201cbfb0115d2471df02590ceda562ca17eb892b33279d8cb23d176000ed0 Sep 30 07:44:13 crc kubenswrapper[4760]: I0930 07:44:13.185702 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-6899d445c8-g78wr"] Sep 30 07:44:13 crc kubenswrapper[4760]: W0930 07:44:13.192847 4760 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb7b18a96_fb82_48a3_a34e_ebea9ef3eb75.slice/crio-b4cf3f588a59384fb8e70bf25a955dd255c88bd141b58e1b1b68ed12e9ce36cf WatchSource:0}: Error finding container b4cf3f588a59384fb8e70bf25a955dd255c88bd141b58e1b1b68ed12e9ce36cf: Status 404 returned error can't find the container with id b4cf3f588a59384fb8e70bf25a955dd255c88bd141b58e1b1b68ed12e9ce36cf Sep 30 07:44:13 crc kubenswrapper[4760]: I0930 07:44:13.253049 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/perses-operator-54bc95c9fb-4pgmn"] Sep 30 07:44:13 crc kubenswrapper[4760]: W0930 07:44:13.261802 4760 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfed44a9b_44ce_4650_b854_6c84c8536c57.slice/crio-3193b1d49b4268c0e09511b9c16d25155c849d99a0b2551b2cfa1c5a9aea45dc WatchSource:0}: Error finding container 3193b1d49b4268c0e09511b9c16d25155c849d99a0b2551b2cfa1c5a9aea45dc: Status 404 returned error can't find the container with id 3193b1d49b4268c0e09511b9c16d25155c849d99a0b2551b2cfa1c5a9aea45dc Sep 30 07:44:13 crc kubenswrapper[4760]: I0930 07:44:13.867361 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-operators/perses-operator-54bc95c9fb-4pgmn" event={"ID":"fed44a9b-44ce-4650-b854-6c84c8536c57","Type":"ContainerStarted","Data":"3193b1d49b4268c0e09511b9c16d25155c849d99a0b2551b2cfa1c5a9aea45dc"} Sep 30 07:44:13 crc kubenswrapper[4760]: I0930 07:44:13.869130 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-6899d445c8-g78wr" event={"ID":"b7b18a96-fb82-48a3-a34e-ebea9ef3eb75","Type":"ContainerStarted","Data":"b4cf3f588a59384fb8e70bf25a955dd255c88bd141b58e1b1b68ed12e9ce36cf"} Sep 30 07:44:13 crc kubenswrapper[4760]: I0930 07:44:13.870215 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/observability-operator-cc5f78dfc-cpxd4" event={"ID":"04a90715-31eb-49fb-9682-0a211630eede","Type":"ContainerStarted","Data":"fc82881bf806123c8d33f91d95652202c387e9fa0bfcd836a3efa2f7bef3dadd"} Sep 30 07:44:13 crc kubenswrapper[4760]: I0930 07:44:13.871246 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-6899d445c8-k9fmb" event={"ID":"790a604d-1726-4fc9-8e29-e30af2f26616","Type":"ContainerStarted","Data":"62e201cbfb0115d2471df02590ceda562ca17eb892b33279d8cb23d176000ed0"} Sep 30 07:44:25 crc kubenswrapper[4760]: I0930 07:44:25.969688 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/perses-operator-54bc95c9fb-4pgmn" event={"ID":"fed44a9b-44ce-4650-b854-6c84c8536c57","Type":"ContainerStarted","Data":"9a274d9a982cf9162850918384c93d75d7782d948849cc88fdcde5b5a879e201"} Sep 30 07:44:25 crc kubenswrapper[4760]: I0930 07:44:25.970188 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operators/perses-operator-54bc95c9fb-4pgmn" Sep 30 07:44:25 crc kubenswrapper[4760]: I0930 07:44:25.971411 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-6899d445c8-g78wr" 
event={"ID":"b7b18a96-fb82-48a3-a34e-ebea9ef3eb75","Type":"ContainerStarted","Data":"9e2832a5df0ab57b422ea12d266e4f8c70c4e143c6e0dd3314bc7b47fbecfb3a"} Sep 30 07:44:25 crc kubenswrapper[4760]: I0930 07:44:25.974730 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-7c8cf85677-gvl7c" event={"ID":"c35951af-e973-4663-9db5-2c5ac164bbba","Type":"ContainerStarted","Data":"09f68d614af5671188f2bc3f1952920a178171e354765779d97d076ab52c6362"} Sep 30 07:44:25 crc kubenswrapper[4760]: I0930 07:44:25.982713 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/observability-operator-cc5f78dfc-cpxd4" event={"ID":"04a90715-31eb-49fb-9682-0a211630eede","Type":"ContainerStarted","Data":"cb0151eeff9bcd3baad11605f23e03c6defd62e8331a8eb4e3f5160a358642d4"} Sep 30 07:44:25 crc kubenswrapper[4760]: I0930 07:44:25.983818 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operators/observability-operator-cc5f78dfc-cpxd4" Sep 30 07:44:25 crc kubenswrapper[4760]: I0930 07:44:25.989367 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operators/observability-operator-cc5f78dfc-cpxd4" Sep 30 07:44:25 crc kubenswrapper[4760]: I0930 07:44:25.990424 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-6899d445c8-k9fmb" event={"ID":"790a604d-1726-4fc9-8e29-e30af2f26616","Type":"ContainerStarted","Data":"11fa574b3fd36e4f62f695e53be53850a4535cec20b9a28b483061046858e61a"} Sep 30 07:44:25 crc kubenswrapper[4760]: I0930 07:44:25.994281 4760 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/perses-operator-54bc95c9fb-4pgmn" podStartSLOduration=2.011169066 podStartE2EDuration="13.994269036s" podCreationTimestamp="2025-09-30 07:44:12 +0000 UTC" firstStartedPulling="2025-09-30 07:44:13.263560846 +0000 UTC m=+638.906467258" 
lastFinishedPulling="2025-09-30 07:44:25.246660816 +0000 UTC m=+650.889567228" observedRunningTime="2025-09-30 07:44:25.993836165 +0000 UTC m=+651.636742597" watchObservedRunningTime="2025-09-30 07:44:25.994269036 +0000 UTC m=+651.637175448" Sep 30 07:44:26 crc kubenswrapper[4760]: I0930 07:44:26.016236 4760 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/obo-prometheus-operator-7c8cf85677-gvl7c" podStartSLOduration=1.6251927720000001 podStartE2EDuration="14.016222314s" podCreationTimestamp="2025-09-30 07:44:12 +0000 UTC" firstStartedPulling="2025-09-30 07:44:12.813367037 +0000 UTC m=+638.456273449" lastFinishedPulling="2025-09-30 07:44:25.204396579 +0000 UTC m=+650.847302991" observedRunningTime="2025-09-30 07:44:26.013813893 +0000 UTC m=+651.656720305" watchObservedRunningTime="2025-09-30 07:44:26.016222314 +0000 UTC m=+651.659128726" Sep 30 07:44:26 crc kubenswrapper[4760]: I0930 07:44:26.036327 4760 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/obo-prometheus-operator-admission-webhook-6899d445c8-g78wr" podStartSLOduration=2.033194015 podStartE2EDuration="14.036296335s" podCreationTimestamp="2025-09-30 07:44:12 +0000 UTC" firstStartedPulling="2025-09-30 07:44:13.200082049 +0000 UTC m=+638.842988461" lastFinishedPulling="2025-09-30 07:44:25.203184369 +0000 UTC m=+650.846090781" observedRunningTime="2025-09-30 07:44:26.03296335 +0000 UTC m=+651.675869782" watchObservedRunningTime="2025-09-30 07:44:26.036296335 +0000 UTC m=+651.679202747" Sep 30 07:44:26 crc kubenswrapper[4760]: I0930 07:44:26.106514 4760 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/observability-operator-cc5f78dfc-cpxd4" podStartSLOduration=1.9839114279999999 podStartE2EDuration="14.106493191s" podCreationTimestamp="2025-09-30 07:44:12 +0000 UTC" firstStartedPulling="2025-09-30 07:44:13.099556818 +0000 UTC m=+638.742463230" lastFinishedPulling="2025-09-30 
07:44:25.222138581 +0000 UTC m=+650.865044993" observedRunningTime="2025-09-30 07:44:26.101613717 +0000 UTC m=+651.744520149" watchObservedRunningTime="2025-09-30 07:44:26.106493191 +0000 UTC m=+651.749399613" Sep 30 07:44:26 crc kubenswrapper[4760]: I0930 07:44:26.124121 4760 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/obo-prometheus-operator-admission-webhook-6899d445c8-k9fmb" podStartSLOduration=2.054006833 podStartE2EDuration="14.124101399s" podCreationTimestamp="2025-09-30 07:44:12 +0000 UTC" firstStartedPulling="2025-09-30 07:44:13.117885315 +0000 UTC m=+638.760791727" lastFinishedPulling="2025-09-30 07:44:25.187979881 +0000 UTC m=+650.830886293" observedRunningTime="2025-09-30 07:44:26.122005116 +0000 UTC m=+651.764911548" watchObservedRunningTime="2025-09-30 07:44:26.124101399 +0000 UTC m=+651.767007821" Sep 30 07:44:33 crc kubenswrapper[4760]: I0930 07:44:33.076823 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operators/perses-operator-54bc95c9fb-4pgmn" Sep 30 07:44:50 crc kubenswrapper[4760]: I0930 07:44:50.868272 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bc9zdf8"] Sep 30 07:44:50 crc kubenswrapper[4760]: I0930 07:44:50.871590 4760 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bc9zdf8" Sep 30 07:44:50 crc kubenswrapper[4760]: I0930 07:44:50.873266 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Sep 30 07:44:50 crc kubenswrapper[4760]: I0930 07:44:50.879544 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bc9zdf8"] Sep 30 07:44:51 crc kubenswrapper[4760]: I0930 07:44:51.008134 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/309b7e9d-3273-4a4c-865d-9287bab3988f-util\") pod \"9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bc9zdf8\" (UID: \"309b7e9d-3273-4a4c-865d-9287bab3988f\") " pod="openshift-marketplace/9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bc9zdf8" Sep 30 07:44:51 crc kubenswrapper[4760]: I0930 07:44:51.008247 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/309b7e9d-3273-4a4c-865d-9287bab3988f-bundle\") pod \"9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bc9zdf8\" (UID: \"309b7e9d-3273-4a4c-865d-9287bab3988f\") " pod="openshift-marketplace/9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bc9zdf8" Sep 30 07:44:51 crc kubenswrapper[4760]: I0930 07:44:51.008460 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-972jn\" (UniqueName: \"kubernetes.io/projected/309b7e9d-3273-4a4c-865d-9287bab3988f-kube-api-access-972jn\") pod \"9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bc9zdf8\" (UID: \"309b7e9d-3273-4a4c-865d-9287bab3988f\") " pod="openshift-marketplace/9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bc9zdf8" Sep 30 07:44:51 crc kubenswrapper[4760]: 
I0930 07:44:51.110150 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/309b7e9d-3273-4a4c-865d-9287bab3988f-bundle\") pod \"9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bc9zdf8\" (UID: \"309b7e9d-3273-4a4c-865d-9287bab3988f\") " pod="openshift-marketplace/9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bc9zdf8" Sep 30 07:44:51 crc kubenswrapper[4760]: I0930 07:44:51.110219 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-972jn\" (UniqueName: \"kubernetes.io/projected/309b7e9d-3273-4a4c-865d-9287bab3988f-kube-api-access-972jn\") pod \"9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bc9zdf8\" (UID: \"309b7e9d-3273-4a4c-865d-9287bab3988f\") " pod="openshift-marketplace/9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bc9zdf8" Sep 30 07:44:51 crc kubenswrapper[4760]: I0930 07:44:51.110320 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/309b7e9d-3273-4a4c-865d-9287bab3988f-util\") pod \"9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bc9zdf8\" (UID: \"309b7e9d-3273-4a4c-865d-9287bab3988f\") " pod="openshift-marketplace/9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bc9zdf8" Sep 30 07:44:51 crc kubenswrapper[4760]: I0930 07:44:51.110858 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/309b7e9d-3273-4a4c-865d-9287bab3988f-util\") pod \"9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bc9zdf8\" (UID: \"309b7e9d-3273-4a4c-865d-9287bab3988f\") " pod="openshift-marketplace/9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bc9zdf8" Sep 30 07:44:51 crc kubenswrapper[4760]: I0930 07:44:51.111342 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: 
\"kubernetes.io/empty-dir/309b7e9d-3273-4a4c-865d-9287bab3988f-bundle\") pod \"9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bc9zdf8\" (UID: \"309b7e9d-3273-4a4c-865d-9287bab3988f\") " pod="openshift-marketplace/9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bc9zdf8" Sep 30 07:44:51 crc kubenswrapper[4760]: I0930 07:44:51.143688 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-972jn\" (UniqueName: \"kubernetes.io/projected/309b7e9d-3273-4a4c-865d-9287bab3988f-kube-api-access-972jn\") pod \"9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bc9zdf8\" (UID: \"309b7e9d-3273-4a4c-865d-9287bab3988f\") " pod="openshift-marketplace/9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bc9zdf8" Sep 30 07:44:51 crc kubenswrapper[4760]: I0930 07:44:51.198564 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bc9zdf8" Sep 30 07:44:51 crc kubenswrapper[4760]: I0930 07:44:51.431631 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bc9zdf8"] Sep 30 07:44:52 crc kubenswrapper[4760]: I0930 07:44:52.158761 4760 generic.go:334] "Generic (PLEG): container finished" podID="309b7e9d-3273-4a4c-865d-9287bab3988f" containerID="840d468e8157485d8ef3a48b3f1b1aa03216f1cac178f9609ccb7ef569fdf131" exitCode=0 Sep 30 07:44:52 crc kubenswrapper[4760]: I0930 07:44:52.158829 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bc9zdf8" event={"ID":"309b7e9d-3273-4a4c-865d-9287bab3988f","Type":"ContainerDied","Data":"840d468e8157485d8ef3a48b3f1b1aa03216f1cac178f9609ccb7ef569fdf131"} Sep 30 07:44:52 crc kubenswrapper[4760]: I0930 07:44:52.158894 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bc9zdf8" event={"ID":"309b7e9d-3273-4a4c-865d-9287bab3988f","Type":"ContainerStarted","Data":"a09c02cee6c28ce864ba34db0a9104b5f2ed57372c3e985a5cd0641541d9ac52"} Sep 30 07:44:54 crc kubenswrapper[4760]: I0930 07:44:54.176411 4760 generic.go:334] "Generic (PLEG): container finished" podID="309b7e9d-3273-4a4c-865d-9287bab3988f" containerID="a8babd20315517dd5030512b9dc977b9a56d3834266db2322fb0ec18d4c4a776" exitCode=0 Sep 30 07:44:54 crc kubenswrapper[4760]: I0930 07:44:54.176499 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bc9zdf8" event={"ID":"309b7e9d-3273-4a4c-865d-9287bab3988f","Type":"ContainerDied","Data":"a8babd20315517dd5030512b9dc977b9a56d3834266db2322fb0ec18d4c4a776"} Sep 30 07:44:55 crc kubenswrapper[4760]: I0930 07:44:55.184471 4760 generic.go:334] "Generic (PLEG): container finished" podID="309b7e9d-3273-4a4c-865d-9287bab3988f" containerID="399d9a4016d9ea9ece8053692c75ec978203bd304a873c5f1cd76ce337fc7ceb" exitCode=0 Sep 30 07:44:55 crc kubenswrapper[4760]: I0930 07:44:55.184516 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bc9zdf8" event={"ID":"309b7e9d-3273-4a4c-865d-9287bab3988f","Type":"ContainerDied","Data":"399d9a4016d9ea9ece8053692c75ec978203bd304a873c5f1cd76ce337fc7ceb"} Sep 30 07:44:56 crc kubenswrapper[4760]: I0930 07:44:56.499206 4760 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bc9zdf8" Sep 30 07:44:56 crc kubenswrapper[4760]: I0930 07:44:56.596097 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/309b7e9d-3273-4a4c-865d-9287bab3988f-bundle\") pod \"309b7e9d-3273-4a4c-865d-9287bab3988f\" (UID: \"309b7e9d-3273-4a4c-865d-9287bab3988f\") " Sep 30 07:44:56 crc kubenswrapper[4760]: I0930 07:44:56.596290 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/309b7e9d-3273-4a4c-865d-9287bab3988f-util\") pod \"309b7e9d-3273-4a4c-865d-9287bab3988f\" (UID: \"309b7e9d-3273-4a4c-865d-9287bab3988f\") " Sep 30 07:44:56 crc kubenswrapper[4760]: I0930 07:44:56.596346 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-972jn\" (UniqueName: \"kubernetes.io/projected/309b7e9d-3273-4a4c-865d-9287bab3988f-kube-api-access-972jn\") pod \"309b7e9d-3273-4a4c-865d-9287bab3988f\" (UID: \"309b7e9d-3273-4a4c-865d-9287bab3988f\") " Sep 30 07:44:56 crc kubenswrapper[4760]: I0930 07:44:56.596909 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/309b7e9d-3273-4a4c-865d-9287bab3988f-bundle" (OuterVolumeSpecName: "bundle") pod "309b7e9d-3273-4a4c-865d-9287bab3988f" (UID: "309b7e9d-3273-4a4c-865d-9287bab3988f"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 07:44:56 crc kubenswrapper[4760]: I0930 07:44:56.607470 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/309b7e9d-3273-4a4c-865d-9287bab3988f-kube-api-access-972jn" (OuterVolumeSpecName: "kube-api-access-972jn") pod "309b7e9d-3273-4a4c-865d-9287bab3988f" (UID: "309b7e9d-3273-4a4c-865d-9287bab3988f"). InnerVolumeSpecName "kube-api-access-972jn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 07:44:56 crc kubenswrapper[4760]: I0930 07:44:56.608958 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/309b7e9d-3273-4a4c-865d-9287bab3988f-util" (OuterVolumeSpecName: "util") pod "309b7e9d-3273-4a4c-865d-9287bab3988f" (UID: "309b7e9d-3273-4a4c-865d-9287bab3988f"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 07:44:56 crc kubenswrapper[4760]: I0930 07:44:56.697743 4760 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-972jn\" (UniqueName: \"kubernetes.io/projected/309b7e9d-3273-4a4c-865d-9287bab3988f-kube-api-access-972jn\") on node \"crc\" DevicePath \"\"" Sep 30 07:44:56 crc kubenswrapper[4760]: I0930 07:44:56.697792 4760 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/309b7e9d-3273-4a4c-865d-9287bab3988f-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 07:44:56 crc kubenswrapper[4760]: I0930 07:44:56.697808 4760 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/309b7e9d-3273-4a4c-865d-9287bab3988f-util\") on node \"crc\" DevicePath \"\"" Sep 30 07:44:57 crc kubenswrapper[4760]: I0930 07:44:57.200259 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bc9zdf8" event={"ID":"309b7e9d-3273-4a4c-865d-9287bab3988f","Type":"ContainerDied","Data":"a09c02cee6c28ce864ba34db0a9104b5f2ed57372c3e985a5cd0641541d9ac52"} Sep 30 07:44:57 crc kubenswrapper[4760]: I0930 07:44:57.200690 4760 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a09c02cee6c28ce864ba34db0a9104b5f2ed57372c3e985a5cd0641541d9ac52" Sep 30 07:44:57 crc kubenswrapper[4760]: I0930 07:44:57.200786 4760 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bc9zdf8" Sep 30 07:44:59 crc kubenswrapper[4760]: I0930 07:44:59.766684 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-operator-5d6f6cfd66-g68gd"] Sep 30 07:44:59 crc kubenswrapper[4760]: E0930 07:44:59.767225 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="309b7e9d-3273-4a4c-865d-9287bab3988f" containerName="util" Sep 30 07:44:59 crc kubenswrapper[4760]: I0930 07:44:59.767240 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="309b7e9d-3273-4a4c-865d-9287bab3988f" containerName="util" Sep 30 07:44:59 crc kubenswrapper[4760]: E0930 07:44:59.767263 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="309b7e9d-3273-4a4c-865d-9287bab3988f" containerName="pull" Sep 30 07:44:59 crc kubenswrapper[4760]: I0930 07:44:59.767271 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="309b7e9d-3273-4a4c-865d-9287bab3988f" containerName="pull" Sep 30 07:44:59 crc kubenswrapper[4760]: E0930 07:44:59.767282 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="309b7e9d-3273-4a4c-865d-9287bab3988f" containerName="extract" Sep 30 07:44:59 crc kubenswrapper[4760]: I0930 07:44:59.767290 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="309b7e9d-3273-4a4c-865d-9287bab3988f" containerName="extract" Sep 30 07:44:59 crc kubenswrapper[4760]: I0930 07:44:59.767422 4760 memory_manager.go:354] "RemoveStaleState removing state" podUID="309b7e9d-3273-4a4c-865d-9287bab3988f" containerName="extract" Sep 30 07:44:59 crc kubenswrapper[4760]: I0930 07:44:59.767879 4760 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-operator-5d6f6cfd66-g68gd" Sep 30 07:44:59 crc kubenswrapper[4760]: I0930 07:44:59.771233 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-operator-dockercfg-4slbj" Sep 30 07:44:59 crc kubenswrapper[4760]: I0930 07:44:59.771341 4760 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"kube-root-ca.crt" Sep 30 07:44:59 crc kubenswrapper[4760]: I0930 07:44:59.773491 4760 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"openshift-service-ca.crt" Sep 30 07:44:59 crc kubenswrapper[4760]: I0930 07:44:59.791135 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-5d6f6cfd66-g68gd"] Sep 30 07:44:59 crc kubenswrapper[4760]: I0930 07:44:59.937387 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zf94j\" (UniqueName: \"kubernetes.io/projected/40eb0a4f-5fde-42fa-a5c0-283ccab9a683-kube-api-access-zf94j\") pod \"nmstate-operator-5d6f6cfd66-g68gd\" (UID: \"40eb0a4f-5fde-42fa-a5c0-283ccab9a683\") " pod="openshift-nmstate/nmstate-operator-5d6f6cfd66-g68gd" Sep 30 07:45:00 crc kubenswrapper[4760]: I0930 07:45:00.038201 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zf94j\" (UniqueName: \"kubernetes.io/projected/40eb0a4f-5fde-42fa-a5c0-283ccab9a683-kube-api-access-zf94j\") pod \"nmstate-operator-5d6f6cfd66-g68gd\" (UID: \"40eb0a4f-5fde-42fa-a5c0-283ccab9a683\") " pod="openshift-nmstate/nmstate-operator-5d6f6cfd66-g68gd" Sep 30 07:45:00 crc kubenswrapper[4760]: I0930 07:45:00.075383 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zf94j\" (UniqueName: \"kubernetes.io/projected/40eb0a4f-5fde-42fa-a5c0-283ccab9a683-kube-api-access-zf94j\") pod \"nmstate-operator-5d6f6cfd66-g68gd\" (UID: 
\"40eb0a4f-5fde-42fa-a5c0-283ccab9a683\") " pod="openshift-nmstate/nmstate-operator-5d6f6cfd66-g68gd" Sep 30 07:45:00 crc kubenswrapper[4760]: I0930 07:45:00.084796 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-operator-5d6f6cfd66-g68gd" Sep 30 07:45:00 crc kubenswrapper[4760]: I0930 07:45:00.174350 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29320305-v49ds"] Sep 30 07:45:00 crc kubenswrapper[4760]: I0930 07:45:00.175117 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29320305-v49ds" Sep 30 07:45:00 crc kubenswrapper[4760]: I0930 07:45:00.177968 4760 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Sep 30 07:45:00 crc kubenswrapper[4760]: I0930 07:45:00.178149 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Sep 30 07:45:00 crc kubenswrapper[4760]: I0930 07:45:00.186861 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29320305-v49ds"] Sep 30 07:45:00 crc kubenswrapper[4760]: I0930 07:45:00.342279 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1b410270-7237-429f-bc0f-8d7986cef241-config-volume\") pod \"collect-profiles-29320305-v49ds\" (UID: \"1b410270-7237-429f-bc0f-8d7986cef241\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320305-v49ds" Sep 30 07:45:00 crc kubenswrapper[4760]: I0930 07:45:00.342588 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-flkh6\" (UniqueName: 
\"kubernetes.io/projected/1b410270-7237-429f-bc0f-8d7986cef241-kube-api-access-flkh6\") pod \"collect-profiles-29320305-v49ds\" (UID: \"1b410270-7237-429f-bc0f-8d7986cef241\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320305-v49ds" Sep 30 07:45:00 crc kubenswrapper[4760]: I0930 07:45:00.342634 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/1b410270-7237-429f-bc0f-8d7986cef241-secret-volume\") pod \"collect-profiles-29320305-v49ds\" (UID: \"1b410270-7237-429f-bc0f-8d7986cef241\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320305-v49ds" Sep 30 07:45:00 crc kubenswrapper[4760]: I0930 07:45:00.443731 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-flkh6\" (UniqueName: \"kubernetes.io/projected/1b410270-7237-429f-bc0f-8d7986cef241-kube-api-access-flkh6\") pod \"collect-profiles-29320305-v49ds\" (UID: \"1b410270-7237-429f-bc0f-8d7986cef241\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320305-v49ds" Sep 30 07:45:00 crc kubenswrapper[4760]: I0930 07:45:00.443782 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/1b410270-7237-429f-bc0f-8d7986cef241-secret-volume\") pod \"collect-profiles-29320305-v49ds\" (UID: \"1b410270-7237-429f-bc0f-8d7986cef241\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320305-v49ds" Sep 30 07:45:00 crc kubenswrapper[4760]: I0930 07:45:00.443864 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1b410270-7237-429f-bc0f-8d7986cef241-config-volume\") pod \"collect-profiles-29320305-v49ds\" (UID: \"1b410270-7237-429f-bc0f-8d7986cef241\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320305-v49ds" Sep 30 07:45:00 crc 
kubenswrapper[4760]: I0930 07:45:00.444759 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1b410270-7237-429f-bc0f-8d7986cef241-config-volume\") pod \"collect-profiles-29320305-v49ds\" (UID: \"1b410270-7237-429f-bc0f-8d7986cef241\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320305-v49ds" Sep 30 07:45:00 crc kubenswrapper[4760]: I0930 07:45:00.460383 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/1b410270-7237-429f-bc0f-8d7986cef241-secret-volume\") pod \"collect-profiles-29320305-v49ds\" (UID: \"1b410270-7237-429f-bc0f-8d7986cef241\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320305-v49ds" Sep 30 07:45:00 crc kubenswrapper[4760]: I0930 07:45:00.464748 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-flkh6\" (UniqueName: \"kubernetes.io/projected/1b410270-7237-429f-bc0f-8d7986cef241-kube-api-access-flkh6\") pod \"collect-profiles-29320305-v49ds\" (UID: \"1b410270-7237-429f-bc0f-8d7986cef241\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320305-v49ds" Sep 30 07:45:00 crc kubenswrapper[4760]: I0930 07:45:00.491232 4760 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29320305-v49ds" Sep 30 07:45:00 crc kubenswrapper[4760]: I0930 07:45:00.553677 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-5d6f6cfd66-g68gd"] Sep 30 07:45:00 crc kubenswrapper[4760]: I0930 07:45:00.686255 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29320305-v49ds"] Sep 30 07:45:01 crc kubenswrapper[4760]: I0930 07:45:01.225419 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-5d6f6cfd66-g68gd" event={"ID":"40eb0a4f-5fde-42fa-a5c0-283ccab9a683","Type":"ContainerStarted","Data":"a02478940f95760852fddfb26eb148c803e8e95f7292a0f311d6b1b74f33f488"} Sep 30 07:45:01 crc kubenswrapper[4760]: I0930 07:45:01.226912 4760 generic.go:334] "Generic (PLEG): container finished" podID="1b410270-7237-429f-bc0f-8d7986cef241" containerID="c3536e5adfdc0661a6265bb9e4fdb956c6f6a8bf2e616a07ec99bbdaa0f8e9af" exitCode=0 Sep 30 07:45:01 crc kubenswrapper[4760]: I0930 07:45:01.226964 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29320305-v49ds" event={"ID":"1b410270-7237-429f-bc0f-8d7986cef241","Type":"ContainerDied","Data":"c3536e5adfdc0661a6265bb9e4fdb956c6f6a8bf2e616a07ec99bbdaa0f8e9af"} Sep 30 07:45:01 crc kubenswrapper[4760]: I0930 07:45:01.227041 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29320305-v49ds" event={"ID":"1b410270-7237-429f-bc0f-8d7986cef241","Type":"ContainerStarted","Data":"f3068d92e9be06afcc50bc8bc26a27fc093c7d8eec20774ff1d01d465fd9e8b9"} Sep 30 07:45:02 crc kubenswrapper[4760]: I0930 07:45:02.704418 4760 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29320305-v49ds" Sep 30 07:45:02 crc kubenswrapper[4760]: I0930 07:45:02.870643 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-flkh6\" (UniqueName: \"kubernetes.io/projected/1b410270-7237-429f-bc0f-8d7986cef241-kube-api-access-flkh6\") pod \"1b410270-7237-429f-bc0f-8d7986cef241\" (UID: \"1b410270-7237-429f-bc0f-8d7986cef241\") " Sep 30 07:45:02 crc kubenswrapper[4760]: I0930 07:45:02.870941 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/1b410270-7237-429f-bc0f-8d7986cef241-secret-volume\") pod \"1b410270-7237-429f-bc0f-8d7986cef241\" (UID: \"1b410270-7237-429f-bc0f-8d7986cef241\") " Sep 30 07:45:02 crc kubenswrapper[4760]: I0930 07:45:02.871086 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1b410270-7237-429f-bc0f-8d7986cef241-config-volume\") pod \"1b410270-7237-429f-bc0f-8d7986cef241\" (UID: \"1b410270-7237-429f-bc0f-8d7986cef241\") " Sep 30 07:45:02 crc kubenswrapper[4760]: I0930 07:45:02.871602 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1b410270-7237-429f-bc0f-8d7986cef241-config-volume" (OuterVolumeSpecName: "config-volume") pod "1b410270-7237-429f-bc0f-8d7986cef241" (UID: "1b410270-7237-429f-bc0f-8d7986cef241"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 07:45:02 crc kubenswrapper[4760]: I0930 07:45:02.881532 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1b410270-7237-429f-bc0f-8d7986cef241-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "1b410270-7237-429f-bc0f-8d7986cef241" (UID: "1b410270-7237-429f-bc0f-8d7986cef241"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 07:45:02 crc kubenswrapper[4760]: I0930 07:45:02.881548 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1b410270-7237-429f-bc0f-8d7986cef241-kube-api-access-flkh6" (OuterVolumeSpecName: "kube-api-access-flkh6") pod "1b410270-7237-429f-bc0f-8d7986cef241" (UID: "1b410270-7237-429f-bc0f-8d7986cef241"). InnerVolumeSpecName "kube-api-access-flkh6". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 07:45:02 crc kubenswrapper[4760]: I0930 07:45:02.972437 4760 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/1b410270-7237-429f-bc0f-8d7986cef241-secret-volume\") on node \"crc\" DevicePath \"\"" Sep 30 07:45:02 crc kubenswrapper[4760]: I0930 07:45:02.972472 4760 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1b410270-7237-429f-bc0f-8d7986cef241-config-volume\") on node \"crc\" DevicePath \"\"" Sep 30 07:45:02 crc kubenswrapper[4760]: I0930 07:45:02.972481 4760 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-flkh6\" (UniqueName: \"kubernetes.io/projected/1b410270-7237-429f-bc0f-8d7986cef241-kube-api-access-flkh6\") on node \"crc\" DevicePath \"\"" Sep 30 07:45:03 crc kubenswrapper[4760]: I0930 07:45:03.243479 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29320305-v49ds" event={"ID":"1b410270-7237-429f-bc0f-8d7986cef241","Type":"ContainerDied","Data":"f3068d92e9be06afcc50bc8bc26a27fc093c7d8eec20774ff1d01d465fd9e8b9"} Sep 30 07:45:03 crc kubenswrapper[4760]: I0930 07:45:03.243560 4760 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f3068d92e9be06afcc50bc8bc26a27fc093c7d8eec20774ff1d01d465fd9e8b9" Sep 30 07:45:03 crc kubenswrapper[4760]: I0930 07:45:03.243558 4760 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29320305-v49ds" Sep 30 07:45:03 crc kubenswrapper[4760]: I0930 07:45:03.246670 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-5d6f6cfd66-g68gd" event={"ID":"40eb0a4f-5fde-42fa-a5c0-283ccab9a683","Type":"ContainerStarted","Data":"b494d070ecfac092f4190f2b52756dba073de47c35918da72d38107864e00410"} Sep 30 07:45:03 crc kubenswrapper[4760]: I0930 07:45:03.270064 4760 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-operator-5d6f6cfd66-g68gd" podStartSLOduration=2.110980061 podStartE2EDuration="4.270049789s" podCreationTimestamp="2025-09-30 07:44:59 +0000 UTC" firstStartedPulling="2025-09-30 07:45:00.564144976 +0000 UTC m=+686.207051378" lastFinishedPulling="2025-09-30 07:45:02.723214694 +0000 UTC m=+688.366121106" observedRunningTime="2025-09-30 07:45:03.267605177 +0000 UTC m=+688.910511599" watchObservedRunningTime="2025-09-30 07:45:03.270049789 +0000 UTC m=+688.912956201" Sep 30 07:45:04 crc kubenswrapper[4760]: I0930 07:45:04.233592 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-metrics-58fcddf996-lt9bl"] Sep 30 07:45:04 crc kubenswrapper[4760]: E0930 07:45:04.234050 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1b410270-7237-429f-bc0f-8d7986cef241" containerName="collect-profiles" Sep 30 07:45:04 crc kubenswrapper[4760]: I0930 07:45:04.234061 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="1b410270-7237-429f-bc0f-8d7986cef241" containerName="collect-profiles" Sep 30 07:45:04 crc kubenswrapper[4760]: I0930 07:45:04.234165 4760 memory_manager.go:354] "RemoveStaleState removing state" podUID="1b410270-7237-429f-bc0f-8d7986cef241" containerName="collect-profiles" Sep 30 07:45:04 crc kubenswrapper[4760]: I0930 07:45:04.234734 4760 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-metrics-58fcddf996-lt9bl" Sep 30 07:45:04 crc kubenswrapper[4760]: I0930 07:45:04.236762 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-handler-dockercfg-w6tv9" Sep 30 07:45:04 crc kubenswrapper[4760]: I0930 07:45:04.251225 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-58fcddf996-lt9bl"] Sep 30 07:45:04 crc kubenswrapper[4760]: I0930 07:45:04.260673 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-webhook-6d689559c5-p594d"] Sep 30 07:45:04 crc kubenswrapper[4760]: I0930 07:45:04.261739 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-webhook-6d689559c5-p594d" Sep 30 07:45:04 crc kubenswrapper[4760]: I0930 07:45:04.263735 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"openshift-nmstate-webhook" Sep 30 07:45:04 crc kubenswrapper[4760]: I0930 07:45:04.269600 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-handler-fqhzn"] Sep 30 07:45:04 crc kubenswrapper[4760]: I0930 07:45:04.270396 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-handler-fqhzn" Sep 30 07:45:04 crc kubenswrapper[4760]: I0930 07:45:04.274592 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-6d689559c5-p594d"] Sep 30 07:45:04 crc kubenswrapper[4760]: I0930 07:45:04.361620 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-console-plugin-864bb6dfb5-9kztb"] Sep 30 07:45:04 crc kubenswrapper[4760]: I0930 07:45:04.362802 4760 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-864bb6dfb5-9kztb" Sep 30 07:45:04 crc kubenswrapper[4760]: I0930 07:45:04.367504 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"plugin-serving-cert" Sep 30 07:45:04 crc kubenswrapper[4760]: I0930 07:45:04.367979 4760 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"nginx-conf" Sep 30 07:45:04 crc kubenswrapper[4760]: I0930 07:45:04.368223 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"default-dockercfg-8wq84" Sep 30 07:45:04 crc kubenswrapper[4760]: I0930 07:45:04.377495 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-864bb6dfb5-9kztb"] Sep 30 07:45:04 crc kubenswrapper[4760]: I0930 07:45:04.387873 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6lrl5\" (UniqueName: \"kubernetes.io/projected/9c04978e-fe0a-4324-b6ce-b9b6b70bf305-kube-api-access-6lrl5\") pod \"nmstate-handler-fqhzn\" (UID: \"9c04978e-fe0a-4324-b6ce-b9b6b70bf305\") " pod="openshift-nmstate/nmstate-handler-fqhzn" Sep 30 07:45:04 crc kubenswrapper[4760]: I0930 07:45:04.388199 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/9c04978e-fe0a-4324-b6ce-b9b6b70bf305-dbus-socket\") pod \"nmstate-handler-fqhzn\" (UID: \"9c04978e-fe0a-4324-b6ce-b9b6b70bf305\") " pod="openshift-nmstate/nmstate-handler-fqhzn" Sep 30 07:45:04 crc kubenswrapper[4760]: I0930 07:45:04.388350 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6q2x8\" (UniqueName: \"kubernetes.io/projected/5f924622-9974-450d-b3a1-bb5fc8100ad6-kube-api-access-6q2x8\") pod \"nmstate-metrics-58fcddf996-lt9bl\" (UID: \"5f924622-9974-450d-b3a1-bb5fc8100ad6\") " 
pod="openshift-nmstate/nmstate-metrics-58fcddf996-lt9bl" Sep 30 07:45:04 crc kubenswrapper[4760]: I0930 07:45:04.388457 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zn5vc\" (UniqueName: \"kubernetes.io/projected/759b23d3-847f-4d3a-9141-5c2cfad8664b-kube-api-access-zn5vc\") pod \"nmstate-webhook-6d689559c5-p594d\" (UID: \"759b23d3-847f-4d3a-9141-5c2cfad8664b\") " pod="openshift-nmstate/nmstate-webhook-6d689559c5-p594d" Sep 30 07:45:04 crc kubenswrapper[4760]: I0930 07:45:04.388555 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/9c04978e-fe0a-4324-b6ce-b9b6b70bf305-ovs-socket\") pod \"nmstate-handler-fqhzn\" (UID: \"9c04978e-fe0a-4324-b6ce-b9b6b70bf305\") " pod="openshift-nmstate/nmstate-handler-fqhzn" Sep 30 07:45:04 crc kubenswrapper[4760]: I0930 07:45:04.388683 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/9c04978e-fe0a-4324-b6ce-b9b6b70bf305-nmstate-lock\") pod \"nmstate-handler-fqhzn\" (UID: \"9c04978e-fe0a-4324-b6ce-b9b6b70bf305\") " pod="openshift-nmstate/nmstate-handler-fqhzn" Sep 30 07:45:04 crc kubenswrapper[4760]: I0930 07:45:04.388791 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/759b23d3-847f-4d3a-9141-5c2cfad8664b-tls-key-pair\") pod \"nmstate-webhook-6d689559c5-p594d\" (UID: \"759b23d3-847f-4d3a-9141-5c2cfad8664b\") " pod="openshift-nmstate/nmstate-webhook-6d689559c5-p594d" Sep 30 07:45:04 crc kubenswrapper[4760]: I0930 07:45:04.490725 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/9c04978e-fe0a-4324-b6ce-b9b6b70bf305-nmstate-lock\") pod \"nmstate-handler-fqhzn\" (UID: 
\"9c04978e-fe0a-4324-b6ce-b9b6b70bf305\") " pod="openshift-nmstate/nmstate-handler-fqhzn" Sep 30 07:45:04 crc kubenswrapper[4760]: I0930 07:45:04.491006 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/a755b893-8456-4ee5-88cd-6e38a665c659-plugin-serving-cert\") pod \"nmstate-console-plugin-864bb6dfb5-9kztb\" (UID: \"a755b893-8456-4ee5-88cd-6e38a665c659\") " pod="openshift-nmstate/nmstate-console-plugin-864bb6dfb5-9kztb" Sep 30 07:45:04 crc kubenswrapper[4760]: I0930 07:45:04.491121 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/759b23d3-847f-4d3a-9141-5c2cfad8664b-tls-key-pair\") pod \"nmstate-webhook-6d689559c5-p594d\" (UID: \"759b23d3-847f-4d3a-9141-5c2cfad8664b\") " pod="openshift-nmstate/nmstate-webhook-6d689559c5-p594d" Sep 30 07:45:04 crc kubenswrapper[4760]: I0930 07:45:04.491227 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/a755b893-8456-4ee5-88cd-6e38a665c659-nginx-conf\") pod \"nmstate-console-plugin-864bb6dfb5-9kztb\" (UID: \"a755b893-8456-4ee5-88cd-6e38a665c659\") " pod="openshift-nmstate/nmstate-console-plugin-864bb6dfb5-9kztb" Sep 30 07:45:04 crc kubenswrapper[4760]: I0930 07:45:04.491366 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6lrl5\" (UniqueName: \"kubernetes.io/projected/9c04978e-fe0a-4324-b6ce-b9b6b70bf305-kube-api-access-6lrl5\") pod \"nmstate-handler-fqhzn\" (UID: \"9c04978e-fe0a-4324-b6ce-b9b6b70bf305\") " pod="openshift-nmstate/nmstate-handler-fqhzn" Sep 30 07:45:04 crc kubenswrapper[4760]: I0930 07:45:04.491497 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dbus-socket\" (UniqueName: 
\"kubernetes.io/host-path/9c04978e-fe0a-4324-b6ce-b9b6b70bf305-dbus-socket\") pod \"nmstate-handler-fqhzn\" (UID: \"9c04978e-fe0a-4324-b6ce-b9b6b70bf305\") " pod="openshift-nmstate/nmstate-handler-fqhzn" Sep 30 07:45:04 crc kubenswrapper[4760]: I0930 07:45:04.491754 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6q2x8\" (UniqueName: \"kubernetes.io/projected/5f924622-9974-450d-b3a1-bb5fc8100ad6-kube-api-access-6q2x8\") pod \"nmstate-metrics-58fcddf996-lt9bl\" (UID: \"5f924622-9974-450d-b3a1-bb5fc8100ad6\") " pod="openshift-nmstate/nmstate-metrics-58fcddf996-lt9bl" Sep 30 07:45:04 crc kubenswrapper[4760]: I0930 07:45:04.491838 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fvjxz\" (UniqueName: \"kubernetes.io/projected/a755b893-8456-4ee5-88cd-6e38a665c659-kube-api-access-fvjxz\") pod \"nmstate-console-plugin-864bb6dfb5-9kztb\" (UID: \"a755b893-8456-4ee5-88cd-6e38a665c659\") " pod="openshift-nmstate/nmstate-console-plugin-864bb6dfb5-9kztb" Sep 30 07:45:04 crc kubenswrapper[4760]: I0930 07:45:04.491936 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zn5vc\" (UniqueName: \"kubernetes.io/projected/759b23d3-847f-4d3a-9141-5c2cfad8664b-kube-api-access-zn5vc\") pod \"nmstate-webhook-6d689559c5-p594d\" (UID: \"759b23d3-847f-4d3a-9141-5c2cfad8664b\") " pod="openshift-nmstate/nmstate-webhook-6d689559c5-p594d" Sep 30 07:45:04 crc kubenswrapper[4760]: I0930 07:45:04.492040 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/9c04978e-fe0a-4324-b6ce-b9b6b70bf305-ovs-socket\") pod \"nmstate-handler-fqhzn\" (UID: \"9c04978e-fe0a-4324-b6ce-b9b6b70bf305\") " pod="openshift-nmstate/nmstate-handler-fqhzn" Sep 30 07:45:04 crc kubenswrapper[4760]: I0930 07:45:04.492138 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/9c04978e-fe0a-4324-b6ce-b9b6b70bf305-ovs-socket\") pod \"nmstate-handler-fqhzn\" (UID: \"9c04978e-fe0a-4324-b6ce-b9b6b70bf305\") " pod="openshift-nmstate/nmstate-handler-fqhzn" Sep 30 07:45:04 crc kubenswrapper[4760]: I0930 07:45:04.491714 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/9c04978e-fe0a-4324-b6ce-b9b6b70bf305-dbus-socket\") pod \"nmstate-handler-fqhzn\" (UID: \"9c04978e-fe0a-4324-b6ce-b9b6b70bf305\") " pod="openshift-nmstate/nmstate-handler-fqhzn" Sep 30 07:45:04 crc kubenswrapper[4760]: I0930 07:45:04.490955 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/9c04978e-fe0a-4324-b6ce-b9b6b70bf305-nmstate-lock\") pod \"nmstate-handler-fqhzn\" (UID: \"9c04978e-fe0a-4324-b6ce-b9b6b70bf305\") " pod="openshift-nmstate/nmstate-handler-fqhzn" Sep 30 07:45:04 crc kubenswrapper[4760]: I0930 07:45:04.498767 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/759b23d3-847f-4d3a-9141-5c2cfad8664b-tls-key-pair\") pod \"nmstate-webhook-6d689559c5-p594d\" (UID: \"759b23d3-847f-4d3a-9141-5c2cfad8664b\") " pod="openshift-nmstate/nmstate-webhook-6d689559c5-p594d" Sep 30 07:45:04 crc kubenswrapper[4760]: I0930 07:45:04.508826 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zn5vc\" (UniqueName: \"kubernetes.io/projected/759b23d3-847f-4d3a-9141-5c2cfad8664b-kube-api-access-zn5vc\") pod \"nmstate-webhook-6d689559c5-p594d\" (UID: \"759b23d3-847f-4d3a-9141-5c2cfad8664b\") " pod="openshift-nmstate/nmstate-webhook-6d689559c5-p594d" Sep 30 07:45:04 crc kubenswrapper[4760]: I0930 07:45:04.515552 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6lrl5\" (UniqueName: 
\"kubernetes.io/projected/9c04978e-fe0a-4324-b6ce-b9b6b70bf305-kube-api-access-6lrl5\") pod \"nmstate-handler-fqhzn\" (UID: \"9c04978e-fe0a-4324-b6ce-b9b6b70bf305\") " pod="openshift-nmstate/nmstate-handler-fqhzn" Sep 30 07:45:04 crc kubenswrapper[4760]: I0930 07:45:04.518579 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6q2x8\" (UniqueName: \"kubernetes.io/projected/5f924622-9974-450d-b3a1-bb5fc8100ad6-kube-api-access-6q2x8\") pod \"nmstate-metrics-58fcddf996-lt9bl\" (UID: \"5f924622-9974-450d-b3a1-bb5fc8100ad6\") " pod="openshift-nmstate/nmstate-metrics-58fcddf996-lt9bl" Sep 30 07:45:04 crc kubenswrapper[4760]: I0930 07:45:04.550188 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-metrics-58fcddf996-lt9bl" Sep 30 07:45:04 crc kubenswrapper[4760]: I0930 07:45:04.566544 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-79476f9bb4-m7gq7"] Sep 30 07:45:04 crc kubenswrapper[4760]: I0930 07:45:04.567662 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-79476f9bb4-m7gq7" Sep 30 07:45:04 crc kubenswrapper[4760]: I0930 07:45:04.571674 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-79476f9bb4-m7gq7"] Sep 30 07:45:04 crc kubenswrapper[4760]: I0930 07:45:04.575966 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-webhook-6d689559c5-p594d" Sep 30 07:45:04 crc kubenswrapper[4760]: I0930 07:45:04.584883 4760 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-handler-fqhzn" Sep 30 07:45:04 crc kubenswrapper[4760]: I0930 07:45:04.593034 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fvjxz\" (UniqueName: \"kubernetes.io/projected/a755b893-8456-4ee5-88cd-6e38a665c659-kube-api-access-fvjxz\") pod \"nmstate-console-plugin-864bb6dfb5-9kztb\" (UID: \"a755b893-8456-4ee5-88cd-6e38a665c659\") " pod="openshift-nmstate/nmstate-console-plugin-864bb6dfb5-9kztb" Sep 30 07:45:04 crc kubenswrapper[4760]: I0930 07:45:04.593170 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/a755b893-8456-4ee5-88cd-6e38a665c659-plugin-serving-cert\") pod \"nmstate-console-plugin-864bb6dfb5-9kztb\" (UID: \"a755b893-8456-4ee5-88cd-6e38a665c659\") " pod="openshift-nmstate/nmstate-console-plugin-864bb6dfb5-9kztb" Sep 30 07:45:04 crc kubenswrapper[4760]: I0930 07:45:04.593245 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/a755b893-8456-4ee5-88cd-6e38a665c659-nginx-conf\") pod \"nmstate-console-plugin-864bb6dfb5-9kztb\" (UID: \"a755b893-8456-4ee5-88cd-6e38a665c659\") " pod="openshift-nmstate/nmstate-console-plugin-864bb6dfb5-9kztb" Sep 30 07:45:04 crc kubenswrapper[4760]: I0930 07:45:04.594739 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/a755b893-8456-4ee5-88cd-6e38a665c659-nginx-conf\") pod \"nmstate-console-plugin-864bb6dfb5-9kztb\" (UID: \"a755b893-8456-4ee5-88cd-6e38a665c659\") " pod="openshift-nmstate/nmstate-console-plugin-864bb6dfb5-9kztb" Sep 30 07:45:04 crc kubenswrapper[4760]: I0930 07:45:04.598673 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/a755b893-8456-4ee5-88cd-6e38a665c659-plugin-serving-cert\") 
pod \"nmstate-console-plugin-864bb6dfb5-9kztb\" (UID: \"a755b893-8456-4ee5-88cd-6e38a665c659\") " pod="openshift-nmstate/nmstate-console-plugin-864bb6dfb5-9kztb" Sep 30 07:45:04 crc kubenswrapper[4760]: I0930 07:45:04.612436 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fvjxz\" (UniqueName: \"kubernetes.io/projected/a755b893-8456-4ee5-88cd-6e38a665c659-kube-api-access-fvjxz\") pod \"nmstate-console-plugin-864bb6dfb5-9kztb\" (UID: \"a755b893-8456-4ee5-88cd-6e38a665c659\") " pod="openshift-nmstate/nmstate-console-plugin-864bb6dfb5-9kztb" Sep 30 07:45:04 crc kubenswrapper[4760]: I0930 07:45:04.683660 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-864bb6dfb5-9kztb" Sep 30 07:45:04 crc kubenswrapper[4760]: I0930 07:45:04.694950 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4pc2h\" (UniqueName: \"kubernetes.io/projected/bda985fd-1495-44a1-99b1-e554b9a2ab64-kube-api-access-4pc2h\") pod \"console-79476f9bb4-m7gq7\" (UID: \"bda985fd-1495-44a1-99b1-e554b9a2ab64\") " pod="openshift-console/console-79476f9bb4-m7gq7" Sep 30 07:45:04 crc kubenswrapper[4760]: I0930 07:45:04.694999 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/bda985fd-1495-44a1-99b1-e554b9a2ab64-service-ca\") pod \"console-79476f9bb4-m7gq7\" (UID: \"bda985fd-1495-44a1-99b1-e554b9a2ab64\") " pod="openshift-console/console-79476f9bb4-m7gq7" Sep 30 07:45:04 crc kubenswrapper[4760]: I0930 07:45:04.695025 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/bda985fd-1495-44a1-99b1-e554b9a2ab64-console-oauth-config\") pod \"console-79476f9bb4-m7gq7\" (UID: \"bda985fd-1495-44a1-99b1-e554b9a2ab64\") " 
pod="openshift-console/console-79476f9bb4-m7gq7" Sep 30 07:45:04 crc kubenswrapper[4760]: I0930 07:45:04.695204 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/bda985fd-1495-44a1-99b1-e554b9a2ab64-oauth-serving-cert\") pod \"console-79476f9bb4-m7gq7\" (UID: \"bda985fd-1495-44a1-99b1-e554b9a2ab64\") " pod="openshift-console/console-79476f9bb4-m7gq7" Sep 30 07:45:04 crc kubenswrapper[4760]: I0930 07:45:04.695259 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bda985fd-1495-44a1-99b1-e554b9a2ab64-trusted-ca-bundle\") pod \"console-79476f9bb4-m7gq7\" (UID: \"bda985fd-1495-44a1-99b1-e554b9a2ab64\") " pod="openshift-console/console-79476f9bb4-m7gq7" Sep 30 07:45:04 crc kubenswrapper[4760]: I0930 07:45:04.695361 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/bda985fd-1495-44a1-99b1-e554b9a2ab64-console-serving-cert\") pod \"console-79476f9bb4-m7gq7\" (UID: \"bda985fd-1495-44a1-99b1-e554b9a2ab64\") " pod="openshift-console/console-79476f9bb4-m7gq7" Sep 30 07:45:04 crc kubenswrapper[4760]: I0930 07:45:04.695386 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/bda985fd-1495-44a1-99b1-e554b9a2ab64-console-config\") pod \"console-79476f9bb4-m7gq7\" (UID: \"bda985fd-1495-44a1-99b1-e554b9a2ab64\") " pod="openshift-console/console-79476f9bb4-m7gq7" Sep 30 07:45:04 crc kubenswrapper[4760]: I0930 07:45:04.796636 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4pc2h\" (UniqueName: \"kubernetes.io/projected/bda985fd-1495-44a1-99b1-e554b9a2ab64-kube-api-access-4pc2h\") pod 
\"console-79476f9bb4-m7gq7\" (UID: \"bda985fd-1495-44a1-99b1-e554b9a2ab64\") " pod="openshift-console/console-79476f9bb4-m7gq7" Sep 30 07:45:04 crc kubenswrapper[4760]: I0930 07:45:04.796680 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/bda985fd-1495-44a1-99b1-e554b9a2ab64-service-ca\") pod \"console-79476f9bb4-m7gq7\" (UID: \"bda985fd-1495-44a1-99b1-e554b9a2ab64\") " pod="openshift-console/console-79476f9bb4-m7gq7" Sep 30 07:45:04 crc kubenswrapper[4760]: I0930 07:45:04.796712 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/bda985fd-1495-44a1-99b1-e554b9a2ab64-console-oauth-config\") pod \"console-79476f9bb4-m7gq7\" (UID: \"bda985fd-1495-44a1-99b1-e554b9a2ab64\") " pod="openshift-console/console-79476f9bb4-m7gq7" Sep 30 07:45:04 crc kubenswrapper[4760]: I0930 07:45:04.796763 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/bda985fd-1495-44a1-99b1-e554b9a2ab64-oauth-serving-cert\") pod \"console-79476f9bb4-m7gq7\" (UID: \"bda985fd-1495-44a1-99b1-e554b9a2ab64\") " pod="openshift-console/console-79476f9bb4-m7gq7" Sep 30 07:45:04 crc kubenswrapper[4760]: I0930 07:45:04.796790 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bda985fd-1495-44a1-99b1-e554b9a2ab64-trusted-ca-bundle\") pod \"console-79476f9bb4-m7gq7\" (UID: \"bda985fd-1495-44a1-99b1-e554b9a2ab64\") " pod="openshift-console/console-79476f9bb4-m7gq7" Sep 30 07:45:04 crc kubenswrapper[4760]: I0930 07:45:04.796834 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/bda985fd-1495-44a1-99b1-e554b9a2ab64-console-serving-cert\") pod \"console-79476f9bb4-m7gq7\" 
(UID: \"bda985fd-1495-44a1-99b1-e554b9a2ab64\") " pod="openshift-console/console-79476f9bb4-m7gq7" Sep 30 07:45:04 crc kubenswrapper[4760]: I0930 07:45:04.796853 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/bda985fd-1495-44a1-99b1-e554b9a2ab64-console-config\") pod \"console-79476f9bb4-m7gq7\" (UID: \"bda985fd-1495-44a1-99b1-e554b9a2ab64\") " pod="openshift-console/console-79476f9bb4-m7gq7" Sep 30 07:45:04 crc kubenswrapper[4760]: I0930 07:45:04.798478 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/bda985fd-1495-44a1-99b1-e554b9a2ab64-service-ca\") pod \"console-79476f9bb4-m7gq7\" (UID: \"bda985fd-1495-44a1-99b1-e554b9a2ab64\") " pod="openshift-console/console-79476f9bb4-m7gq7" Sep 30 07:45:04 crc kubenswrapper[4760]: I0930 07:45:04.798639 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/bda985fd-1495-44a1-99b1-e554b9a2ab64-oauth-serving-cert\") pod \"console-79476f9bb4-m7gq7\" (UID: \"bda985fd-1495-44a1-99b1-e554b9a2ab64\") " pod="openshift-console/console-79476f9bb4-m7gq7" Sep 30 07:45:04 crc kubenswrapper[4760]: I0930 07:45:04.798861 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bda985fd-1495-44a1-99b1-e554b9a2ab64-trusted-ca-bundle\") pod \"console-79476f9bb4-m7gq7\" (UID: \"bda985fd-1495-44a1-99b1-e554b9a2ab64\") " pod="openshift-console/console-79476f9bb4-m7gq7" Sep 30 07:45:04 crc kubenswrapper[4760]: I0930 07:45:04.799024 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/bda985fd-1495-44a1-99b1-e554b9a2ab64-console-config\") pod \"console-79476f9bb4-m7gq7\" (UID: \"bda985fd-1495-44a1-99b1-e554b9a2ab64\") " 
pod="openshift-console/console-79476f9bb4-m7gq7" Sep 30 07:45:04 crc kubenswrapper[4760]: I0930 07:45:04.800835 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/bda985fd-1495-44a1-99b1-e554b9a2ab64-console-serving-cert\") pod \"console-79476f9bb4-m7gq7\" (UID: \"bda985fd-1495-44a1-99b1-e554b9a2ab64\") " pod="openshift-console/console-79476f9bb4-m7gq7" Sep 30 07:45:04 crc kubenswrapper[4760]: I0930 07:45:04.801786 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/bda985fd-1495-44a1-99b1-e554b9a2ab64-console-oauth-config\") pod \"console-79476f9bb4-m7gq7\" (UID: \"bda985fd-1495-44a1-99b1-e554b9a2ab64\") " pod="openshift-console/console-79476f9bb4-m7gq7" Sep 30 07:45:04 crc kubenswrapper[4760]: I0930 07:45:04.816208 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4pc2h\" (UniqueName: \"kubernetes.io/projected/bda985fd-1495-44a1-99b1-e554b9a2ab64-kube-api-access-4pc2h\") pod \"console-79476f9bb4-m7gq7\" (UID: \"bda985fd-1495-44a1-99b1-e554b9a2ab64\") " pod="openshift-console/console-79476f9bb4-m7gq7" Sep 30 07:45:04 crc kubenswrapper[4760]: I0930 07:45:04.847343 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-6d689559c5-p594d"] Sep 30 07:45:04 crc kubenswrapper[4760]: I0930 07:45:04.910824 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-864bb6dfb5-9kztb"] Sep 30 07:45:04 crc kubenswrapper[4760]: I0930 07:45:04.912160 4760 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-79476f9bb4-m7gq7" Sep 30 07:45:05 crc kubenswrapper[4760]: I0930 07:45:05.003960 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-58fcddf996-lt9bl"] Sep 30 07:45:05 crc kubenswrapper[4760]: I0930 07:45:05.173713 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-79476f9bb4-m7gq7"] Sep 30 07:45:05 crc kubenswrapper[4760]: W0930 07:45:05.179426 4760 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbda985fd_1495_44a1_99b1_e554b9a2ab64.slice/crio-c217f1d245dfaf4e1275ce4deb5d8040fe8925ca7e99c1cc1ce1a82ea3a9e2c8 WatchSource:0}: Error finding container c217f1d245dfaf4e1275ce4deb5d8040fe8925ca7e99c1cc1ce1a82ea3a9e2c8: Status 404 returned error can't find the container with id c217f1d245dfaf4e1275ce4deb5d8040fe8925ca7e99c1cc1ce1a82ea3a9e2c8 Sep 30 07:45:05 crc kubenswrapper[4760]: I0930 07:45:05.257240 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-58fcddf996-lt9bl" event={"ID":"5f924622-9974-450d-b3a1-bb5fc8100ad6","Type":"ContainerStarted","Data":"9d6a58e62dba772cd2939c9d8a4923e0469dd0b4ccdc68ece5973b1cd303072e"} Sep 30 07:45:05 crc kubenswrapper[4760]: I0930 07:45:05.258360 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-fqhzn" event={"ID":"9c04978e-fe0a-4324-b6ce-b9b6b70bf305","Type":"ContainerStarted","Data":"f682721e1bd03cdb282666a9faba67278e1512e81c210f986f74a6ad8908f915"} Sep 30 07:45:05 crc kubenswrapper[4760]: I0930 07:45:05.259369 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-79476f9bb4-m7gq7" event={"ID":"bda985fd-1495-44a1-99b1-e554b9a2ab64","Type":"ContainerStarted","Data":"c217f1d245dfaf4e1275ce4deb5d8040fe8925ca7e99c1cc1ce1a82ea3a9e2c8"} Sep 30 07:45:05 crc kubenswrapper[4760]: I0930 07:45:05.260283 4760 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-6d689559c5-p594d" event={"ID":"759b23d3-847f-4d3a-9141-5c2cfad8664b","Type":"ContainerStarted","Data":"4d1bfaa2e7cde9bbb42f6cb93146000f125e18939996839dacfcceffc83e15cb"} Sep 30 07:45:05 crc kubenswrapper[4760]: I0930 07:45:05.261550 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-864bb6dfb5-9kztb" event={"ID":"a755b893-8456-4ee5-88cd-6e38a665c659","Type":"ContainerStarted","Data":"646666af891450f1e29b38c16f9980a409011901bd248031b8d724a836a65dc9"} Sep 30 07:45:06 crc kubenswrapper[4760]: I0930 07:45:06.272107 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-79476f9bb4-m7gq7" event={"ID":"bda985fd-1495-44a1-99b1-e554b9a2ab64","Type":"ContainerStarted","Data":"0f4048f8765a9cdcc45f525eb3e39e86d71b47cb5b304f77aede51d8b2728a8c"} Sep 30 07:45:06 crc kubenswrapper[4760]: I0930 07:45:06.294523 4760 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-79476f9bb4-m7gq7" podStartSLOduration=2.294497708 podStartE2EDuration="2.294497708s" podCreationTimestamp="2025-09-30 07:45:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 07:45:06.287994412 +0000 UTC m=+691.930900814" watchObservedRunningTime="2025-09-30 07:45:06.294497708 +0000 UTC m=+691.937404120" Sep 30 07:45:08 crc kubenswrapper[4760]: I0930 07:45:08.281960 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-6d689559c5-p594d" event={"ID":"759b23d3-847f-4d3a-9141-5c2cfad8664b","Type":"ContainerStarted","Data":"f9d9d5f962366c05611f369361d8b8bd416be49bb8beda84f29d251589a619cf"} Sep 30 07:45:08 crc kubenswrapper[4760]: I0930 07:45:08.282525 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-webhook-6d689559c5-p594d" 
Sep 30 07:45:08 crc kubenswrapper[4760]: I0930 07:45:08.283741 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-864bb6dfb5-9kztb" event={"ID":"a755b893-8456-4ee5-88cd-6e38a665c659","Type":"ContainerStarted","Data":"d32937b9e0cc472ff844a1da636089f7d786995ee14efe5b6411ff78c66f97b7"} Sep 30 07:45:08 crc kubenswrapper[4760]: I0930 07:45:08.288234 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-58fcddf996-lt9bl" event={"ID":"5f924622-9974-450d-b3a1-bb5fc8100ad6","Type":"ContainerStarted","Data":"f57916cf27486ffa899b0b7a64a8aa3e139901797da90f1c6796d026d867c646"} Sep 30 07:45:08 crc kubenswrapper[4760]: I0930 07:45:08.289281 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-fqhzn" event={"ID":"9c04978e-fe0a-4324-b6ce-b9b6b70bf305","Type":"ContainerStarted","Data":"6910ab6612cb4237f9d6aa4140fbaf8fd13099162b8536d5cce6a1a904261b34"} Sep 30 07:45:08 crc kubenswrapper[4760]: I0930 07:45:08.289635 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-handler-fqhzn" Sep 30 07:45:08 crc kubenswrapper[4760]: I0930 07:45:08.300393 4760 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-webhook-6d689559c5-p594d" podStartSLOduration=1.391361358 podStartE2EDuration="4.300374499s" podCreationTimestamp="2025-09-30 07:45:04 +0000 UTC" firstStartedPulling="2025-09-30 07:45:04.860213482 +0000 UTC m=+690.503119894" lastFinishedPulling="2025-09-30 07:45:07.769226613 +0000 UTC m=+693.412133035" observedRunningTime="2025-09-30 07:45:08.29846795 +0000 UTC m=+693.941374382" watchObservedRunningTime="2025-09-30 07:45:08.300374499 +0000 UTC m=+693.943280911" Sep 30 07:45:08 crc kubenswrapper[4760]: I0930 07:45:08.320251 4760 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-handler-fqhzn" 
podStartSLOduration=1.1837737050000001 podStartE2EDuration="4.320235404s" podCreationTimestamp="2025-09-30 07:45:04 +0000 UTC" firstStartedPulling="2025-09-30 07:45:04.636646653 +0000 UTC m=+690.279553065" lastFinishedPulling="2025-09-30 07:45:07.773108282 +0000 UTC m=+693.416014764" observedRunningTime="2025-09-30 07:45:08.31734034 +0000 UTC m=+693.960246752" watchObservedRunningTime="2025-09-30 07:45:08.320235404 +0000 UTC m=+693.963141816" Sep 30 07:45:08 crc kubenswrapper[4760]: I0930 07:45:08.333790 4760 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-console-plugin-864bb6dfb5-9kztb" podStartSLOduration=1.490470449 podStartE2EDuration="4.333758768s" podCreationTimestamp="2025-09-30 07:45:04 +0000 UTC" firstStartedPulling="2025-09-30 07:45:04.918416013 +0000 UTC m=+690.561322425" lastFinishedPulling="2025-09-30 07:45:07.761704322 +0000 UTC m=+693.404610744" observedRunningTime="2025-09-30 07:45:08.332373893 +0000 UTC m=+693.975280305" watchObservedRunningTime="2025-09-30 07:45:08.333758768 +0000 UTC m=+693.976665180" Sep 30 07:45:11 crc kubenswrapper[4760]: I0930 07:45:11.313252 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-58fcddf996-lt9bl" event={"ID":"5f924622-9974-450d-b3a1-bb5fc8100ad6","Type":"ContainerStarted","Data":"adeb99906db1f4174a505e10e5ff5b931cf8f5e74169da8ccc13679971fdfe71"} Sep 30 07:45:11 crc kubenswrapper[4760]: I0930 07:45:11.335242 4760 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-metrics-58fcddf996-lt9bl" podStartSLOduration=1.987386874 podStartE2EDuration="7.335214072s" podCreationTimestamp="2025-09-30 07:45:04 +0000 UTC" firstStartedPulling="2025-09-30 07:45:05.020116671 +0000 UTC m=+690.663023073" lastFinishedPulling="2025-09-30 07:45:10.367943819 +0000 UTC m=+696.010850271" observedRunningTime="2025-09-30 07:45:11.333213551 +0000 UTC m=+696.976119993" 
watchObservedRunningTime="2025-09-30 07:45:11.335214072 +0000 UTC m=+696.978120514" Sep 30 07:45:14 crc kubenswrapper[4760]: I0930 07:45:14.608163 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-handler-fqhzn" Sep 30 07:45:14 crc kubenswrapper[4760]: I0930 07:45:14.913260 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-79476f9bb4-m7gq7" Sep 30 07:45:14 crc kubenswrapper[4760]: I0930 07:45:14.913709 4760 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-79476f9bb4-m7gq7" Sep 30 07:45:14 crc kubenswrapper[4760]: I0930 07:45:14.922498 4760 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-79476f9bb4-m7gq7" Sep 30 07:45:15 crc kubenswrapper[4760]: I0930 07:45:15.347656 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-79476f9bb4-m7gq7" Sep 30 07:45:15 crc kubenswrapper[4760]: I0930 07:45:15.412898 4760 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-ntgr2"] Sep 30 07:45:19 crc kubenswrapper[4760]: I0930 07:45:19.113020 4760 patch_prober.go:28] interesting pod/machine-config-daemon-f2lrk container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 07:45:19 crc kubenswrapper[4760]: I0930 07:45:19.113929 4760 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-f2lrk" podUID="7a9c8270-6964-4886-87d0-227b05b76da4" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 30 07:45:24 crc kubenswrapper[4760]: I0930 07:45:24.589628 4760 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-webhook-6d689559c5-p594d" Sep 30 07:45:40 crc kubenswrapper[4760]: I0930 07:45:40.485470 4760 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-console/console-f9d7485db-ntgr2" podUID="08d362f3-5c04-45fe-9981-ada11b028f83" containerName="console" containerID="cri-o://e449778f7af6943855cd2606785aa2721d60c5eee2bb35b3fc070fa9165cc05c" gracePeriod=15 Sep 30 07:45:40 crc kubenswrapper[4760]: I0930 07:45:40.844188 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-ntgr2_08d362f3-5c04-45fe-9981-ada11b028f83/console/0.log" Sep 30 07:45:40 crc kubenswrapper[4760]: I0930 07:45:40.844514 4760 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-ntgr2" Sep 30 07:45:41 crc kubenswrapper[4760]: I0930 07:45:41.012833 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c7zxl\" (UniqueName: \"kubernetes.io/projected/08d362f3-5c04-45fe-9981-ada11b028f83-kube-api-access-c7zxl\") pod \"08d362f3-5c04-45fe-9981-ada11b028f83\" (UID: \"08d362f3-5c04-45fe-9981-ada11b028f83\") " Sep 30 07:45:41 crc kubenswrapper[4760]: I0930 07:45:41.012898 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/08d362f3-5c04-45fe-9981-ada11b028f83-service-ca\") pod \"08d362f3-5c04-45fe-9981-ada11b028f83\" (UID: \"08d362f3-5c04-45fe-9981-ada11b028f83\") " Sep 30 07:45:41 crc kubenswrapper[4760]: I0930 07:45:41.012940 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/08d362f3-5c04-45fe-9981-ada11b028f83-console-oauth-config\") pod \"08d362f3-5c04-45fe-9981-ada11b028f83\" (UID: \"08d362f3-5c04-45fe-9981-ada11b028f83\") " Sep 30 07:45:41 crc 
kubenswrapper[4760]: I0930 07:45:41.012997 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/08d362f3-5c04-45fe-9981-ada11b028f83-console-config\") pod \"08d362f3-5c04-45fe-9981-ada11b028f83\" (UID: \"08d362f3-5c04-45fe-9981-ada11b028f83\") " Sep 30 07:45:41 crc kubenswrapper[4760]: I0930 07:45:41.013022 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/08d362f3-5c04-45fe-9981-ada11b028f83-oauth-serving-cert\") pod \"08d362f3-5c04-45fe-9981-ada11b028f83\" (UID: \"08d362f3-5c04-45fe-9981-ada11b028f83\") " Sep 30 07:45:41 crc kubenswrapper[4760]: I0930 07:45:41.013984 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/08d362f3-5c04-45fe-9981-ada11b028f83-service-ca" (OuterVolumeSpecName: "service-ca") pod "08d362f3-5c04-45fe-9981-ada11b028f83" (UID: "08d362f3-5c04-45fe-9981-ada11b028f83"). InnerVolumeSpecName "service-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 07:45:41 crc kubenswrapper[4760]: I0930 07:45:41.013996 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/08d362f3-5c04-45fe-9981-ada11b028f83-console-serving-cert\") pod \"08d362f3-5c04-45fe-9981-ada11b028f83\" (UID: \"08d362f3-5c04-45fe-9981-ada11b028f83\") " Sep 30 07:45:41 crc kubenswrapper[4760]: I0930 07:45:41.014018 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/08d362f3-5c04-45fe-9981-ada11b028f83-trusted-ca-bundle\") pod \"08d362f3-5c04-45fe-9981-ada11b028f83\" (UID: \"08d362f3-5c04-45fe-9981-ada11b028f83\") " Sep 30 07:45:41 crc kubenswrapper[4760]: I0930 07:45:41.014011 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/08d362f3-5c04-45fe-9981-ada11b028f83-console-config" (OuterVolumeSpecName: "console-config") pod "08d362f3-5c04-45fe-9981-ada11b028f83" (UID: "08d362f3-5c04-45fe-9981-ada11b028f83"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 07:45:41 crc kubenswrapper[4760]: I0930 07:45:41.014081 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/08d362f3-5c04-45fe-9981-ada11b028f83-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "08d362f3-5c04-45fe-9981-ada11b028f83" (UID: "08d362f3-5c04-45fe-9981-ada11b028f83"). InnerVolumeSpecName "oauth-serving-cert". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 07:45:41 crc kubenswrapper[4760]: I0930 07:45:41.014512 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/08d362f3-5c04-45fe-9981-ada11b028f83-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "08d362f3-5c04-45fe-9981-ada11b028f83" (UID: "08d362f3-5c04-45fe-9981-ada11b028f83"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 07:45:41 crc kubenswrapper[4760]: I0930 07:45:41.014873 4760 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/08d362f3-5c04-45fe-9981-ada11b028f83-service-ca\") on node \"crc\" DevicePath \"\"" Sep 30 07:45:41 crc kubenswrapper[4760]: I0930 07:45:41.014909 4760 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/08d362f3-5c04-45fe-9981-ada11b028f83-console-config\") on node \"crc\" DevicePath \"\"" Sep 30 07:45:41 crc kubenswrapper[4760]: I0930 07:45:41.014932 4760 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/08d362f3-5c04-45fe-9981-ada11b028f83-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Sep 30 07:45:41 crc kubenswrapper[4760]: I0930 07:45:41.014952 4760 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/08d362f3-5c04-45fe-9981-ada11b028f83-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 07:45:41 crc kubenswrapper[4760]: I0930 07:45:41.019428 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/08d362f3-5c04-45fe-9981-ada11b028f83-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "08d362f3-5c04-45fe-9981-ada11b028f83" (UID: "08d362f3-5c04-45fe-9981-ada11b028f83"). InnerVolumeSpecName "console-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 07:45:41 crc kubenswrapper[4760]: I0930 07:45:41.019818 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/08d362f3-5c04-45fe-9981-ada11b028f83-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "08d362f3-5c04-45fe-9981-ada11b028f83" (UID: "08d362f3-5c04-45fe-9981-ada11b028f83"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 07:45:41 crc kubenswrapper[4760]: I0930 07:45:41.025954 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/08d362f3-5c04-45fe-9981-ada11b028f83-kube-api-access-c7zxl" (OuterVolumeSpecName: "kube-api-access-c7zxl") pod "08d362f3-5c04-45fe-9981-ada11b028f83" (UID: "08d362f3-5c04-45fe-9981-ada11b028f83"). InnerVolumeSpecName "kube-api-access-c7zxl". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 07:45:41 crc kubenswrapper[4760]: I0930 07:45:41.117083 4760 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/08d362f3-5c04-45fe-9981-ada11b028f83-console-oauth-config\") on node \"crc\" DevicePath \"\"" Sep 30 07:45:41 crc kubenswrapper[4760]: I0930 07:45:41.117123 4760 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/08d362f3-5c04-45fe-9981-ada11b028f83-console-serving-cert\") on node \"crc\" DevicePath \"\"" Sep 30 07:45:41 crc kubenswrapper[4760]: I0930 07:45:41.117138 4760 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c7zxl\" (UniqueName: \"kubernetes.io/projected/08d362f3-5c04-45fe-9981-ada11b028f83-kube-api-access-c7zxl\") on node \"crc\" DevicePath \"\"" Sep 30 07:45:41 crc kubenswrapper[4760]: I0930 07:45:41.528282 4760 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-console_console-f9d7485db-ntgr2_08d362f3-5c04-45fe-9981-ada11b028f83/console/0.log" Sep 30 07:45:41 crc kubenswrapper[4760]: I0930 07:45:41.528371 4760 generic.go:334] "Generic (PLEG): container finished" podID="08d362f3-5c04-45fe-9981-ada11b028f83" containerID="e449778f7af6943855cd2606785aa2721d60c5eee2bb35b3fc070fa9165cc05c" exitCode=2 Sep 30 07:45:41 crc kubenswrapper[4760]: I0930 07:45:41.528404 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-ntgr2" event={"ID":"08d362f3-5c04-45fe-9981-ada11b028f83","Type":"ContainerDied","Data":"e449778f7af6943855cd2606785aa2721d60c5eee2bb35b3fc070fa9165cc05c"} Sep 30 07:45:41 crc kubenswrapper[4760]: I0930 07:45:41.528434 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-ntgr2" event={"ID":"08d362f3-5c04-45fe-9981-ada11b028f83","Type":"ContainerDied","Data":"c70666f53dc49292dc221d6fc7071042cbe8f4b5eab3d617f6714de809f462ae"} Sep 30 07:45:41 crc kubenswrapper[4760]: I0930 07:45:41.528454 4760 scope.go:117] "RemoveContainer" containerID="e449778f7af6943855cd2606785aa2721d60c5eee2bb35b3fc070fa9165cc05c" Sep 30 07:45:41 crc kubenswrapper[4760]: I0930 07:45:41.528796 4760 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-f9d7485db-ntgr2" Sep 30 07:45:41 crc kubenswrapper[4760]: I0930 07:45:41.556362 4760 scope.go:117] "RemoveContainer" containerID="e449778f7af6943855cd2606785aa2721d60c5eee2bb35b3fc070fa9165cc05c" Sep 30 07:45:41 crc kubenswrapper[4760]: I0930 07:45:41.557273 4760 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-ntgr2"] Sep 30 07:45:41 crc kubenswrapper[4760]: E0930 07:45:41.557727 4760 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e449778f7af6943855cd2606785aa2721d60c5eee2bb35b3fc070fa9165cc05c\": container with ID starting with e449778f7af6943855cd2606785aa2721d60c5eee2bb35b3fc070fa9165cc05c not found: ID does not exist" containerID="e449778f7af6943855cd2606785aa2721d60c5eee2bb35b3fc070fa9165cc05c" Sep 30 07:45:41 crc kubenswrapper[4760]: I0930 07:45:41.557771 4760 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e449778f7af6943855cd2606785aa2721d60c5eee2bb35b3fc070fa9165cc05c"} err="failed to get container status \"e449778f7af6943855cd2606785aa2721d60c5eee2bb35b3fc070fa9165cc05c\": rpc error: code = NotFound desc = could not find container \"e449778f7af6943855cd2606785aa2721d60c5eee2bb35b3fc070fa9165cc05c\": container with ID starting with e449778f7af6943855cd2606785aa2721d60c5eee2bb35b3fc070fa9165cc05c not found: ID does not exist" Sep 30 07:45:41 crc kubenswrapper[4760]: I0930 07:45:41.561205 4760 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-f9d7485db-ntgr2"] Sep 30 07:45:41 crc kubenswrapper[4760]: I0930 07:45:41.607490 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d967k2mh"] Sep 30 07:45:41 crc kubenswrapper[4760]: E0930 07:45:41.607788 4760 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="08d362f3-5c04-45fe-9981-ada11b028f83" containerName="console" Sep 30 07:45:41 crc kubenswrapper[4760]: I0930 07:45:41.607812 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="08d362f3-5c04-45fe-9981-ada11b028f83" containerName="console" Sep 30 07:45:41 crc kubenswrapper[4760]: I0930 07:45:41.607907 4760 memory_manager.go:354] "RemoveStaleState removing state" podUID="08d362f3-5c04-45fe-9981-ada11b028f83" containerName="console" Sep 30 07:45:41 crc kubenswrapper[4760]: I0930 07:45:41.608636 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d967k2mh" Sep 30 07:45:41 crc kubenswrapper[4760]: I0930 07:45:41.611047 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Sep 30 07:45:41 crc kubenswrapper[4760]: I0930 07:45:41.619372 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d967k2mh"] Sep 30 07:45:41 crc kubenswrapper[4760]: I0930 07:45:41.724956 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lbxxm\" (UniqueName: \"kubernetes.io/projected/a0b17021-6ad1-473c-ba06-7d4ba8eb162a-kube-api-access-lbxxm\") pod \"f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d967k2mh\" (UID: \"a0b17021-6ad1-473c-ba06-7d4ba8eb162a\") " pod="openshift-marketplace/f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d967k2mh" Sep 30 07:45:41 crc kubenswrapper[4760]: I0930 07:45:41.725040 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/a0b17021-6ad1-473c-ba06-7d4ba8eb162a-util\") pod \"f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d967k2mh\" (UID: \"a0b17021-6ad1-473c-ba06-7d4ba8eb162a\") " 
pod="openshift-marketplace/f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d967k2mh" Sep 30 07:45:41 crc kubenswrapper[4760]: I0930 07:45:41.725123 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/a0b17021-6ad1-473c-ba06-7d4ba8eb162a-bundle\") pod \"f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d967k2mh\" (UID: \"a0b17021-6ad1-473c-ba06-7d4ba8eb162a\") " pod="openshift-marketplace/f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d967k2mh" Sep 30 07:45:41 crc kubenswrapper[4760]: I0930 07:45:41.826549 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lbxxm\" (UniqueName: \"kubernetes.io/projected/a0b17021-6ad1-473c-ba06-7d4ba8eb162a-kube-api-access-lbxxm\") pod \"f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d967k2mh\" (UID: \"a0b17021-6ad1-473c-ba06-7d4ba8eb162a\") " pod="openshift-marketplace/f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d967k2mh" Sep 30 07:45:41 crc kubenswrapper[4760]: I0930 07:45:41.826783 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/a0b17021-6ad1-473c-ba06-7d4ba8eb162a-util\") pod \"f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d967k2mh\" (UID: \"a0b17021-6ad1-473c-ba06-7d4ba8eb162a\") " pod="openshift-marketplace/f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d967k2mh" Sep 30 07:45:41 crc kubenswrapper[4760]: I0930 07:45:41.826917 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/a0b17021-6ad1-473c-ba06-7d4ba8eb162a-bundle\") pod \"f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d967k2mh\" (UID: \"a0b17021-6ad1-473c-ba06-7d4ba8eb162a\") " pod="openshift-marketplace/f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d967k2mh" Sep 30 07:45:41 crc kubenswrapper[4760]: 
I0930 07:45:41.827447 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/a0b17021-6ad1-473c-ba06-7d4ba8eb162a-util\") pod \"f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d967k2mh\" (UID: \"a0b17021-6ad1-473c-ba06-7d4ba8eb162a\") " pod="openshift-marketplace/f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d967k2mh" Sep 30 07:45:41 crc kubenswrapper[4760]: I0930 07:45:41.827482 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/a0b17021-6ad1-473c-ba06-7d4ba8eb162a-bundle\") pod \"f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d967k2mh\" (UID: \"a0b17021-6ad1-473c-ba06-7d4ba8eb162a\") " pod="openshift-marketplace/f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d967k2mh" Sep 30 07:45:41 crc kubenswrapper[4760]: I0930 07:45:41.852933 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lbxxm\" (UniqueName: \"kubernetes.io/projected/a0b17021-6ad1-473c-ba06-7d4ba8eb162a-kube-api-access-lbxxm\") pod \"f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d967k2mh\" (UID: \"a0b17021-6ad1-473c-ba06-7d4ba8eb162a\") " pod="openshift-marketplace/f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d967k2mh" Sep 30 07:45:41 crc kubenswrapper[4760]: I0930 07:45:41.922495 4760 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d967k2mh" Sep 30 07:45:42 crc kubenswrapper[4760]: I0930 07:45:42.314426 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d967k2mh"] Sep 30 07:45:42 crc kubenswrapper[4760]: I0930 07:45:42.536565 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d967k2mh" event={"ID":"a0b17021-6ad1-473c-ba06-7d4ba8eb162a","Type":"ContainerStarted","Data":"43ed39eebedd75af68c3950eaece303b811df55a81144c3fa20f95e06b76c025"} Sep 30 07:45:42 crc kubenswrapper[4760]: I0930 07:45:42.536611 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d967k2mh" event={"ID":"a0b17021-6ad1-473c-ba06-7d4ba8eb162a","Type":"ContainerStarted","Data":"f694f087160234a98d73245d37b8f99329bd17007ae1b26851cb13d2b9288f57"} Sep 30 07:45:43 crc kubenswrapper[4760]: I0930 07:45:43.080120 4760 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="08d362f3-5c04-45fe-9981-ada11b028f83" path="/var/lib/kubelet/pods/08d362f3-5c04-45fe-9981-ada11b028f83/volumes" Sep 30 07:45:43 crc kubenswrapper[4760]: I0930 07:45:43.546740 4760 generic.go:334] "Generic (PLEG): container finished" podID="a0b17021-6ad1-473c-ba06-7d4ba8eb162a" containerID="43ed39eebedd75af68c3950eaece303b811df55a81144c3fa20f95e06b76c025" exitCode=0 Sep 30 07:45:43 crc kubenswrapper[4760]: I0930 07:45:43.546863 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d967k2mh" event={"ID":"a0b17021-6ad1-473c-ba06-7d4ba8eb162a","Type":"ContainerDied","Data":"43ed39eebedd75af68c3950eaece303b811df55a81144c3fa20f95e06b76c025"} Sep 30 07:45:45 crc kubenswrapper[4760]: I0930 07:45:45.565997 4760 
generic.go:334] "Generic (PLEG): container finished" podID="a0b17021-6ad1-473c-ba06-7d4ba8eb162a" containerID="de6dcf402a27f995af1375f1f801347b534044be6bfa41ac8b0b6aebb9c160ec" exitCode=0 Sep 30 07:45:45 crc kubenswrapper[4760]: I0930 07:45:45.566105 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d967k2mh" event={"ID":"a0b17021-6ad1-473c-ba06-7d4ba8eb162a","Type":"ContainerDied","Data":"de6dcf402a27f995af1375f1f801347b534044be6bfa41ac8b0b6aebb9c160ec"} Sep 30 07:45:46 crc kubenswrapper[4760]: I0930 07:45:46.574012 4760 generic.go:334] "Generic (PLEG): container finished" podID="a0b17021-6ad1-473c-ba06-7d4ba8eb162a" containerID="ce4cd5a843d4b8a2b77214c891c15b3fe09e75d2b9a187a2941ffc170fcb9b1e" exitCode=0 Sep 30 07:45:46 crc kubenswrapper[4760]: I0930 07:45:46.574052 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d967k2mh" event={"ID":"a0b17021-6ad1-473c-ba06-7d4ba8eb162a","Type":"ContainerDied","Data":"ce4cd5a843d4b8a2b77214c891c15b3fe09e75d2b9a187a2941ffc170fcb9b1e"} Sep 30 07:45:47 crc kubenswrapper[4760]: I0930 07:45:47.895868 4760 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d967k2mh" Sep 30 07:45:48 crc kubenswrapper[4760]: I0930 07:45:48.024927 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/a0b17021-6ad1-473c-ba06-7d4ba8eb162a-util\") pod \"a0b17021-6ad1-473c-ba06-7d4ba8eb162a\" (UID: \"a0b17021-6ad1-473c-ba06-7d4ba8eb162a\") " Sep 30 07:45:48 crc kubenswrapper[4760]: I0930 07:45:48.025022 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/a0b17021-6ad1-473c-ba06-7d4ba8eb162a-bundle\") pod \"a0b17021-6ad1-473c-ba06-7d4ba8eb162a\" (UID: \"a0b17021-6ad1-473c-ba06-7d4ba8eb162a\") " Sep 30 07:45:48 crc kubenswrapper[4760]: I0930 07:45:48.025085 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lbxxm\" (UniqueName: \"kubernetes.io/projected/a0b17021-6ad1-473c-ba06-7d4ba8eb162a-kube-api-access-lbxxm\") pod \"a0b17021-6ad1-473c-ba06-7d4ba8eb162a\" (UID: \"a0b17021-6ad1-473c-ba06-7d4ba8eb162a\") " Sep 30 07:45:48 crc kubenswrapper[4760]: I0930 07:45:48.027943 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a0b17021-6ad1-473c-ba06-7d4ba8eb162a-bundle" (OuterVolumeSpecName: "bundle") pod "a0b17021-6ad1-473c-ba06-7d4ba8eb162a" (UID: "a0b17021-6ad1-473c-ba06-7d4ba8eb162a"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 07:45:48 crc kubenswrapper[4760]: I0930 07:45:48.033773 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0b17021-6ad1-473c-ba06-7d4ba8eb162a-kube-api-access-lbxxm" (OuterVolumeSpecName: "kube-api-access-lbxxm") pod "a0b17021-6ad1-473c-ba06-7d4ba8eb162a" (UID: "a0b17021-6ad1-473c-ba06-7d4ba8eb162a"). InnerVolumeSpecName "kube-api-access-lbxxm". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 07:45:48 crc kubenswrapper[4760]: I0930 07:45:48.053042 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a0b17021-6ad1-473c-ba06-7d4ba8eb162a-util" (OuterVolumeSpecName: "util") pod "a0b17021-6ad1-473c-ba06-7d4ba8eb162a" (UID: "a0b17021-6ad1-473c-ba06-7d4ba8eb162a"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 07:45:48 crc kubenswrapper[4760]: I0930 07:45:48.126779 4760 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/a0b17021-6ad1-473c-ba06-7d4ba8eb162a-util\") on node \"crc\" DevicePath \"\"" Sep 30 07:45:48 crc kubenswrapper[4760]: I0930 07:45:48.126848 4760 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/a0b17021-6ad1-473c-ba06-7d4ba8eb162a-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 07:45:48 crc kubenswrapper[4760]: I0930 07:45:48.126876 4760 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lbxxm\" (UniqueName: \"kubernetes.io/projected/a0b17021-6ad1-473c-ba06-7d4ba8eb162a-kube-api-access-lbxxm\") on node \"crc\" DevicePath \"\"" Sep 30 07:45:48 crc kubenswrapper[4760]: I0930 07:45:48.593735 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d967k2mh" event={"ID":"a0b17021-6ad1-473c-ba06-7d4ba8eb162a","Type":"ContainerDied","Data":"f694f087160234a98d73245d37b8f99329bd17007ae1b26851cb13d2b9288f57"} Sep 30 07:45:48 crc kubenswrapper[4760]: I0930 07:45:48.593787 4760 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f694f087160234a98d73245d37b8f99329bd17007ae1b26851cb13d2b9288f57" Sep 30 07:45:48 crc kubenswrapper[4760]: I0930 07:45:48.593898 4760 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d967k2mh" Sep 30 07:45:49 crc kubenswrapper[4760]: I0930 07:45:49.113229 4760 patch_prober.go:28] interesting pod/machine-config-daemon-f2lrk container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 07:45:49 crc kubenswrapper[4760]: I0930 07:45:49.113797 4760 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-f2lrk" podUID="7a9c8270-6964-4886-87d0-227b05b76da4" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 30 07:45:58 crc kubenswrapper[4760]: I0930 07:45:58.691765 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-controller-manager-7db464cf7c-k5lfr"] Sep 30 07:45:58 crc kubenswrapper[4760]: E0930 07:45:58.692373 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a0b17021-6ad1-473c-ba06-7d4ba8eb162a" containerName="extract" Sep 30 07:45:58 crc kubenswrapper[4760]: I0930 07:45:58.692385 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="a0b17021-6ad1-473c-ba06-7d4ba8eb162a" containerName="extract" Sep 30 07:45:58 crc kubenswrapper[4760]: E0930 07:45:58.692398 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a0b17021-6ad1-473c-ba06-7d4ba8eb162a" containerName="pull" Sep 30 07:45:58 crc kubenswrapper[4760]: I0930 07:45:58.692404 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="a0b17021-6ad1-473c-ba06-7d4ba8eb162a" containerName="pull" Sep 30 07:45:58 crc kubenswrapper[4760]: E0930 07:45:58.692417 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a0b17021-6ad1-473c-ba06-7d4ba8eb162a" 
containerName="util" Sep 30 07:45:58 crc kubenswrapper[4760]: I0930 07:45:58.692423 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="a0b17021-6ad1-473c-ba06-7d4ba8eb162a" containerName="util" Sep 30 07:45:58 crc kubenswrapper[4760]: I0930 07:45:58.692523 4760 memory_manager.go:354] "RemoveStaleState removing state" podUID="a0b17021-6ad1-473c-ba06-7d4ba8eb162a" containerName="extract" Sep 30 07:45:58 crc kubenswrapper[4760]: I0930 07:45:58.692870 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-7db464cf7c-k5lfr" Sep 30 07:45:58 crc kubenswrapper[4760]: I0930 07:45:58.696459 4760 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"manager-account-dockercfg-gv4mr" Sep 30 07:45:58 crc kubenswrapper[4760]: I0930 07:45:58.696861 4760 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"openshift-service-ca.crt" Sep 30 07:45:58 crc kubenswrapper[4760]: I0930 07:45:58.697072 4760 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"kube-root-ca.crt" Sep 30 07:45:58 crc kubenswrapper[4760]: I0930 07:45:58.697483 4760 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-controller-manager-service-cert" Sep 30 07:45:58 crc kubenswrapper[4760]: I0930 07:45:58.697713 4760 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-cert" Sep 30 07:45:58 crc kubenswrapper[4760]: I0930 07:45:58.723089 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-7db464cf7c-k5lfr"] Sep 30 07:45:58 crc kubenswrapper[4760]: I0930 07:45:58.759243 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/900aa033-c62f-42f8-a964-9d0e113eca21-webhook-cert\") pod 
\"metallb-operator-controller-manager-7db464cf7c-k5lfr\" (UID: \"900aa033-c62f-42f8-a964-9d0e113eca21\") " pod="metallb-system/metallb-operator-controller-manager-7db464cf7c-k5lfr" Sep 30 07:45:58 crc kubenswrapper[4760]: I0930 07:45:58.759339 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9f8jl\" (UniqueName: \"kubernetes.io/projected/900aa033-c62f-42f8-a964-9d0e113eca21-kube-api-access-9f8jl\") pod \"metallb-operator-controller-manager-7db464cf7c-k5lfr\" (UID: \"900aa033-c62f-42f8-a964-9d0e113eca21\") " pod="metallb-system/metallb-operator-controller-manager-7db464cf7c-k5lfr" Sep 30 07:45:58 crc kubenswrapper[4760]: I0930 07:45:58.759402 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/900aa033-c62f-42f8-a964-9d0e113eca21-apiservice-cert\") pod \"metallb-operator-controller-manager-7db464cf7c-k5lfr\" (UID: \"900aa033-c62f-42f8-a964-9d0e113eca21\") " pod="metallb-system/metallb-operator-controller-manager-7db464cf7c-k5lfr" Sep 30 07:45:58 crc kubenswrapper[4760]: I0930 07:45:58.860147 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/900aa033-c62f-42f8-a964-9d0e113eca21-webhook-cert\") pod \"metallb-operator-controller-manager-7db464cf7c-k5lfr\" (UID: \"900aa033-c62f-42f8-a964-9d0e113eca21\") " pod="metallb-system/metallb-operator-controller-manager-7db464cf7c-k5lfr" Sep 30 07:45:58 crc kubenswrapper[4760]: I0930 07:45:58.860220 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9f8jl\" (UniqueName: \"kubernetes.io/projected/900aa033-c62f-42f8-a964-9d0e113eca21-kube-api-access-9f8jl\") pod \"metallb-operator-controller-manager-7db464cf7c-k5lfr\" (UID: \"900aa033-c62f-42f8-a964-9d0e113eca21\") " 
pod="metallb-system/metallb-operator-controller-manager-7db464cf7c-k5lfr" Sep 30 07:45:58 crc kubenswrapper[4760]: I0930 07:45:58.860255 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/900aa033-c62f-42f8-a964-9d0e113eca21-apiservice-cert\") pod \"metallb-operator-controller-manager-7db464cf7c-k5lfr\" (UID: \"900aa033-c62f-42f8-a964-9d0e113eca21\") " pod="metallb-system/metallb-operator-controller-manager-7db464cf7c-k5lfr" Sep 30 07:45:58 crc kubenswrapper[4760]: I0930 07:45:58.865912 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/900aa033-c62f-42f8-a964-9d0e113eca21-webhook-cert\") pod \"metallb-operator-controller-manager-7db464cf7c-k5lfr\" (UID: \"900aa033-c62f-42f8-a964-9d0e113eca21\") " pod="metallb-system/metallb-operator-controller-manager-7db464cf7c-k5lfr" Sep 30 07:45:58 crc kubenswrapper[4760]: I0930 07:45:58.875695 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9f8jl\" (UniqueName: \"kubernetes.io/projected/900aa033-c62f-42f8-a964-9d0e113eca21-kube-api-access-9f8jl\") pod \"metallb-operator-controller-manager-7db464cf7c-k5lfr\" (UID: \"900aa033-c62f-42f8-a964-9d0e113eca21\") " pod="metallb-system/metallb-operator-controller-manager-7db464cf7c-k5lfr" Sep 30 07:45:58 crc kubenswrapper[4760]: I0930 07:45:58.881111 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/900aa033-c62f-42f8-a964-9d0e113eca21-apiservice-cert\") pod \"metallb-operator-controller-manager-7db464cf7c-k5lfr\" (UID: \"900aa033-c62f-42f8-a964-9d0e113eca21\") " pod="metallb-system/metallb-operator-controller-manager-7db464cf7c-k5lfr" Sep 30 07:45:59 crc kubenswrapper[4760]: I0930 07:45:59.017823 4760 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-7db464cf7c-k5lfr" Sep 30 07:45:59 crc kubenswrapper[4760]: I0930 07:45:59.046280 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-webhook-server-54c4f7bf85-ndtrq"] Sep 30 07:45:59 crc kubenswrapper[4760]: I0930 07:45:59.058590 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-54c4f7bf85-ndtrq" Sep 30 07:45:59 crc kubenswrapper[4760]: I0930 07:45:59.062493 4760 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert" Sep 30 07:45:59 crc kubenswrapper[4760]: I0930 07:45:59.062751 4760 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-service-cert" Sep 30 07:45:59 crc kubenswrapper[4760]: I0930 07:45:59.063045 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-54c4f7bf85-ndtrq"] Sep 30 07:45:59 crc kubenswrapper[4760]: I0930 07:45:59.067435 4760 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-dockercfg-8rlmh" Sep 30 07:45:59 crc kubenswrapper[4760]: I0930 07:45:59.163790 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/be9587c7-9bbb-48ad-867a-1830129f24b3-webhook-cert\") pod \"metallb-operator-webhook-server-54c4f7bf85-ndtrq\" (UID: \"be9587c7-9bbb-48ad-867a-1830129f24b3\") " pod="metallb-system/metallb-operator-webhook-server-54c4f7bf85-ndtrq" Sep 30 07:45:59 crc kubenswrapper[4760]: I0930 07:45:59.163990 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/be9587c7-9bbb-48ad-867a-1830129f24b3-apiservice-cert\") pod 
\"metallb-operator-webhook-server-54c4f7bf85-ndtrq\" (UID: \"be9587c7-9bbb-48ad-867a-1830129f24b3\") " pod="metallb-system/metallb-operator-webhook-server-54c4f7bf85-ndtrq" Sep 30 07:45:59 crc kubenswrapper[4760]: I0930 07:45:59.164255 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rd66w\" (UniqueName: \"kubernetes.io/projected/be9587c7-9bbb-48ad-867a-1830129f24b3-kube-api-access-rd66w\") pod \"metallb-operator-webhook-server-54c4f7bf85-ndtrq\" (UID: \"be9587c7-9bbb-48ad-867a-1830129f24b3\") " pod="metallb-system/metallb-operator-webhook-server-54c4f7bf85-ndtrq" Sep 30 07:45:59 crc kubenswrapper[4760]: I0930 07:45:59.265591 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/be9587c7-9bbb-48ad-867a-1830129f24b3-webhook-cert\") pod \"metallb-operator-webhook-server-54c4f7bf85-ndtrq\" (UID: \"be9587c7-9bbb-48ad-867a-1830129f24b3\") " pod="metallb-system/metallb-operator-webhook-server-54c4f7bf85-ndtrq" Sep 30 07:45:59 crc kubenswrapper[4760]: I0930 07:45:59.265639 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/be9587c7-9bbb-48ad-867a-1830129f24b3-apiservice-cert\") pod \"metallb-operator-webhook-server-54c4f7bf85-ndtrq\" (UID: \"be9587c7-9bbb-48ad-867a-1830129f24b3\") " pod="metallb-system/metallb-operator-webhook-server-54c4f7bf85-ndtrq" Sep 30 07:45:59 crc kubenswrapper[4760]: I0930 07:45:59.265702 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rd66w\" (UniqueName: \"kubernetes.io/projected/be9587c7-9bbb-48ad-867a-1830129f24b3-kube-api-access-rd66w\") pod \"metallb-operator-webhook-server-54c4f7bf85-ndtrq\" (UID: \"be9587c7-9bbb-48ad-867a-1830129f24b3\") " pod="metallb-system/metallb-operator-webhook-server-54c4f7bf85-ndtrq" Sep 30 07:45:59 crc kubenswrapper[4760]: I0930 
07:45:59.269992 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/be9587c7-9bbb-48ad-867a-1830129f24b3-webhook-cert\") pod \"metallb-operator-webhook-server-54c4f7bf85-ndtrq\" (UID: \"be9587c7-9bbb-48ad-867a-1830129f24b3\") " pod="metallb-system/metallb-operator-webhook-server-54c4f7bf85-ndtrq" Sep 30 07:45:59 crc kubenswrapper[4760]: I0930 07:45:59.270144 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/be9587c7-9bbb-48ad-867a-1830129f24b3-apiservice-cert\") pod \"metallb-operator-webhook-server-54c4f7bf85-ndtrq\" (UID: \"be9587c7-9bbb-48ad-867a-1830129f24b3\") " pod="metallb-system/metallb-operator-webhook-server-54c4f7bf85-ndtrq" Sep 30 07:45:59 crc kubenswrapper[4760]: I0930 07:45:59.281078 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rd66w\" (UniqueName: \"kubernetes.io/projected/be9587c7-9bbb-48ad-867a-1830129f24b3-kube-api-access-rd66w\") pod \"metallb-operator-webhook-server-54c4f7bf85-ndtrq\" (UID: \"be9587c7-9bbb-48ad-867a-1830129f24b3\") " pod="metallb-system/metallb-operator-webhook-server-54c4f7bf85-ndtrq" Sep 30 07:45:59 crc kubenswrapper[4760]: I0930 07:45:59.407051 4760 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-54c4f7bf85-ndtrq" Sep 30 07:45:59 crc kubenswrapper[4760]: I0930 07:45:59.509295 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-7db464cf7c-k5lfr"] Sep 30 07:45:59 crc kubenswrapper[4760]: W0930 07:45:59.526605 4760 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod900aa033_c62f_42f8_a964_9d0e113eca21.slice/crio-6b00bcda246a30dcdd2fb43c092b95873b347004c51412b70558093a5cc2371c WatchSource:0}: Error finding container 6b00bcda246a30dcdd2fb43c092b95873b347004c51412b70558093a5cc2371c: Status 404 returned error can't find the container with id 6b00bcda246a30dcdd2fb43c092b95873b347004c51412b70558093a5cc2371c Sep 30 07:45:59 crc kubenswrapper[4760]: I0930 07:45:59.618956 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-54c4f7bf85-ndtrq"] Sep 30 07:45:59 crc kubenswrapper[4760]: I0930 07:45:59.664775 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-54c4f7bf85-ndtrq" event={"ID":"be9587c7-9bbb-48ad-867a-1830129f24b3","Type":"ContainerStarted","Data":"1c56807b71037b6e0f0596865dbfa93fc0556cf233a63dbf709ec9e073387b21"} Sep 30 07:45:59 crc kubenswrapper[4760]: I0930 07:45:59.666148 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-7db464cf7c-k5lfr" event={"ID":"900aa033-c62f-42f8-a964-9d0e113eca21","Type":"ContainerStarted","Data":"6b00bcda246a30dcdd2fb43c092b95873b347004c51412b70558093a5cc2371c"} Sep 30 07:46:01 crc kubenswrapper[4760]: I0930 07:46:01.015150 4760 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-6cvdd"] Sep 30 07:46:01 crc kubenswrapper[4760]: I0930 07:46:01.015992 4760 kuberuntime_container.go:808] "Killing 
container with a grace period" pod="openshift-controller-manager/controller-manager-879f6c89f-6cvdd" podUID="fa5b63b3-2bc6-496c-8841-471e2f43021c" containerName="controller-manager" containerID="cri-o://acc2e1c979d3ac62e71f41efccb9e82ae2a4395ac33f98d2ef759d67ef452324" gracePeriod=30 Sep 30 07:46:01 crc kubenswrapper[4760]: I0930 07:46:01.084697 4760 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-m8vl7"] Sep 30 07:46:01 crc kubenswrapper[4760]: I0930 07:46:01.084949 4760 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-m8vl7" podUID="ff7ef449-b6c7-4e55-886d-b66dd8327e7b" containerName="route-controller-manager" containerID="cri-o://cc747737e43584279625d25fc7dc963aeac706b61e096c03ccadbd90231adbad" gracePeriod=30 Sep 30 07:46:01 crc kubenswrapper[4760]: I0930 07:46:01.486909 4760 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-6cvdd" Sep 30 07:46:01 crc kubenswrapper[4760]: I0930 07:46:01.595672 4760 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-m8vl7" Sep 30 07:46:01 crc kubenswrapper[4760]: I0930 07:46:01.601189 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/fa5b63b3-2bc6-496c-8841-471e2f43021c-client-ca\") pod \"fa5b63b3-2bc6-496c-8841-471e2f43021c\" (UID: \"fa5b63b3-2bc6-496c-8841-471e2f43021c\") " Sep 30 07:46:01 crc kubenswrapper[4760]: I0930 07:46:01.601335 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-djkdl\" (UniqueName: \"kubernetes.io/projected/fa5b63b3-2bc6-496c-8841-471e2f43021c-kube-api-access-djkdl\") pod \"fa5b63b3-2bc6-496c-8841-471e2f43021c\" (UID: \"fa5b63b3-2bc6-496c-8841-471e2f43021c\") " Sep 30 07:46:01 crc kubenswrapper[4760]: I0930 07:46:01.601416 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fa5b63b3-2bc6-496c-8841-471e2f43021c-serving-cert\") pod \"fa5b63b3-2bc6-496c-8841-471e2f43021c\" (UID: \"fa5b63b3-2bc6-496c-8841-471e2f43021c\") " Sep 30 07:46:01 crc kubenswrapper[4760]: I0930 07:46:01.601508 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fa5b63b3-2bc6-496c-8841-471e2f43021c-config\") pod \"fa5b63b3-2bc6-496c-8841-471e2f43021c\" (UID: \"fa5b63b3-2bc6-496c-8841-471e2f43021c\") " Sep 30 07:46:01 crc kubenswrapper[4760]: I0930 07:46:01.601546 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/fa5b63b3-2bc6-496c-8841-471e2f43021c-proxy-ca-bundles\") pod \"fa5b63b3-2bc6-496c-8841-471e2f43021c\" (UID: \"fa5b63b3-2bc6-496c-8841-471e2f43021c\") " Sep 30 07:46:01 crc kubenswrapper[4760]: I0930 07:46:01.602200 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for 
volume "kubernetes.io/configmap/fa5b63b3-2bc6-496c-8841-471e2f43021c-client-ca" (OuterVolumeSpecName: "client-ca") pod "fa5b63b3-2bc6-496c-8841-471e2f43021c" (UID: "fa5b63b3-2bc6-496c-8841-471e2f43021c"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 07:46:01 crc kubenswrapper[4760]: I0930 07:46:01.602357 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fa5b63b3-2bc6-496c-8841-471e2f43021c-config" (OuterVolumeSpecName: "config") pod "fa5b63b3-2bc6-496c-8841-471e2f43021c" (UID: "fa5b63b3-2bc6-496c-8841-471e2f43021c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 07:46:01 crc kubenswrapper[4760]: I0930 07:46:01.602567 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fa5b63b3-2bc6-496c-8841-471e2f43021c-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "fa5b63b3-2bc6-496c-8841-471e2f43021c" (UID: "fa5b63b3-2bc6-496c-8841-471e2f43021c"). InnerVolumeSpecName "proxy-ca-bundles". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 07:46:01 crc kubenswrapper[4760]: I0930 07:46:01.602678 4760 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fa5b63b3-2bc6-496c-8841-471e2f43021c-config\") on node \"crc\" DevicePath \"\"" Sep 30 07:46:01 crc kubenswrapper[4760]: I0930 07:46:01.602703 4760 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/fa5b63b3-2bc6-496c-8841-471e2f43021c-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Sep 30 07:46:01 crc kubenswrapper[4760]: I0930 07:46:01.602715 4760 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/fa5b63b3-2bc6-496c-8841-471e2f43021c-client-ca\") on node \"crc\" DevicePath \"\"" Sep 30 07:46:01 crc kubenswrapper[4760]: I0930 07:46:01.609374 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fa5b63b3-2bc6-496c-8841-471e2f43021c-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "fa5b63b3-2bc6-496c-8841-471e2f43021c" (UID: "fa5b63b3-2bc6-496c-8841-471e2f43021c"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 07:46:01 crc kubenswrapper[4760]: I0930 07:46:01.609635 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fa5b63b3-2bc6-496c-8841-471e2f43021c-kube-api-access-djkdl" (OuterVolumeSpecName: "kube-api-access-djkdl") pod "fa5b63b3-2bc6-496c-8841-471e2f43021c" (UID: "fa5b63b3-2bc6-496c-8841-471e2f43021c"). InnerVolumeSpecName "kube-api-access-djkdl". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 07:46:01 crc kubenswrapper[4760]: I0930 07:46:01.694934 4760 generic.go:334] "Generic (PLEG): container finished" podID="ff7ef449-b6c7-4e55-886d-b66dd8327e7b" containerID="cc747737e43584279625d25fc7dc963aeac706b61e096c03ccadbd90231adbad" exitCode=0 Sep 30 07:46:01 crc kubenswrapper[4760]: I0930 07:46:01.695054 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-m8vl7" event={"ID":"ff7ef449-b6c7-4e55-886d-b66dd8327e7b","Type":"ContainerDied","Data":"cc747737e43584279625d25fc7dc963aeac706b61e096c03ccadbd90231adbad"} Sep 30 07:46:01 crc kubenswrapper[4760]: I0930 07:46:01.695091 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-m8vl7" event={"ID":"ff7ef449-b6c7-4e55-886d-b66dd8327e7b","Type":"ContainerDied","Data":"f861ceb180324f3be1769729c36af6fbc0e81f22726b4f24d612254489bc51d0"} Sep 30 07:46:01 crc kubenswrapper[4760]: I0930 07:46:01.695112 4760 scope.go:117] "RemoveContainer" containerID="cc747737e43584279625d25fc7dc963aeac706b61e096c03ccadbd90231adbad" Sep 30 07:46:01 crc kubenswrapper[4760]: I0930 07:46:01.695276 4760 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-m8vl7" Sep 30 07:46:01 crc kubenswrapper[4760]: I0930 07:46:01.697072 4760 generic.go:334] "Generic (PLEG): container finished" podID="fa5b63b3-2bc6-496c-8841-471e2f43021c" containerID="acc2e1c979d3ac62e71f41efccb9e82ae2a4395ac33f98d2ef759d67ef452324" exitCode=0 Sep 30 07:46:01 crc kubenswrapper[4760]: I0930 07:46:01.697130 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-6cvdd" event={"ID":"fa5b63b3-2bc6-496c-8841-471e2f43021c","Type":"ContainerDied","Data":"acc2e1c979d3ac62e71f41efccb9e82ae2a4395ac33f98d2ef759d67ef452324"} Sep 30 07:46:01 crc kubenswrapper[4760]: I0930 07:46:01.697158 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-6cvdd" event={"ID":"fa5b63b3-2bc6-496c-8841-471e2f43021c","Type":"ContainerDied","Data":"f431cc1791f296a50a8a526b7bf95cb78b3ebb7c315c533a717ccd1e25ba54e4"} Sep 30 07:46:01 crc kubenswrapper[4760]: I0930 07:46:01.697237 4760 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-6cvdd" Sep 30 07:46:01 crc kubenswrapper[4760]: I0930 07:46:01.703479 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ff7ef449-b6c7-4e55-886d-b66dd8327e7b-serving-cert\") pod \"ff7ef449-b6c7-4e55-886d-b66dd8327e7b\" (UID: \"ff7ef449-b6c7-4e55-886d-b66dd8327e7b\") " Sep 30 07:46:01 crc kubenswrapper[4760]: I0930 07:46:01.703582 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ff7ef449-b6c7-4e55-886d-b66dd8327e7b-client-ca\") pod \"ff7ef449-b6c7-4e55-886d-b66dd8327e7b\" (UID: \"ff7ef449-b6c7-4e55-886d-b66dd8327e7b\") " Sep 30 07:46:01 crc kubenswrapper[4760]: I0930 07:46:01.703642 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n62bz\" (UniqueName: \"kubernetes.io/projected/ff7ef449-b6c7-4e55-886d-b66dd8327e7b-kube-api-access-n62bz\") pod \"ff7ef449-b6c7-4e55-886d-b66dd8327e7b\" (UID: \"ff7ef449-b6c7-4e55-886d-b66dd8327e7b\") " Sep 30 07:46:01 crc kubenswrapper[4760]: I0930 07:46:01.703694 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ff7ef449-b6c7-4e55-886d-b66dd8327e7b-config\") pod \"ff7ef449-b6c7-4e55-886d-b66dd8327e7b\" (UID: \"ff7ef449-b6c7-4e55-886d-b66dd8327e7b\") " Sep 30 07:46:01 crc kubenswrapper[4760]: I0930 07:46:01.703931 4760 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fa5b63b3-2bc6-496c-8841-471e2f43021c-serving-cert\") on node \"crc\" DevicePath \"\"" Sep 30 07:46:01 crc kubenswrapper[4760]: I0930 07:46:01.703950 4760 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-djkdl\" (UniqueName: 
\"kubernetes.io/projected/fa5b63b3-2bc6-496c-8841-471e2f43021c-kube-api-access-djkdl\") on node \"crc\" DevicePath \"\"" Sep 30 07:46:01 crc kubenswrapper[4760]: I0930 07:46:01.704760 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ff7ef449-b6c7-4e55-886d-b66dd8327e7b-client-ca" (OuterVolumeSpecName: "client-ca") pod "ff7ef449-b6c7-4e55-886d-b66dd8327e7b" (UID: "ff7ef449-b6c7-4e55-886d-b66dd8327e7b"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 07:46:01 crc kubenswrapper[4760]: I0930 07:46:01.704777 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ff7ef449-b6c7-4e55-886d-b66dd8327e7b-config" (OuterVolumeSpecName: "config") pod "ff7ef449-b6c7-4e55-886d-b66dd8327e7b" (UID: "ff7ef449-b6c7-4e55-886d-b66dd8327e7b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 07:46:01 crc kubenswrapper[4760]: I0930 07:46:01.707585 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ff7ef449-b6c7-4e55-886d-b66dd8327e7b-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "ff7ef449-b6c7-4e55-886d-b66dd8327e7b" (UID: "ff7ef449-b6c7-4e55-886d-b66dd8327e7b"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 07:46:01 crc kubenswrapper[4760]: I0930 07:46:01.708047 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ff7ef449-b6c7-4e55-886d-b66dd8327e7b-kube-api-access-n62bz" (OuterVolumeSpecName: "kube-api-access-n62bz") pod "ff7ef449-b6c7-4e55-886d-b66dd8327e7b" (UID: "ff7ef449-b6c7-4e55-886d-b66dd8327e7b"). InnerVolumeSpecName "kube-api-access-n62bz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 07:46:01 crc kubenswrapper[4760]: I0930 07:46:01.718223 4760 scope.go:117] "RemoveContainer" containerID="cc747737e43584279625d25fc7dc963aeac706b61e096c03ccadbd90231adbad" Sep 30 07:46:01 crc kubenswrapper[4760]: E0930 07:46:01.719862 4760 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cc747737e43584279625d25fc7dc963aeac706b61e096c03ccadbd90231adbad\": container with ID starting with cc747737e43584279625d25fc7dc963aeac706b61e096c03ccadbd90231adbad not found: ID does not exist" containerID="cc747737e43584279625d25fc7dc963aeac706b61e096c03ccadbd90231adbad" Sep 30 07:46:01 crc kubenswrapper[4760]: I0930 07:46:01.719909 4760 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cc747737e43584279625d25fc7dc963aeac706b61e096c03ccadbd90231adbad"} err="failed to get container status \"cc747737e43584279625d25fc7dc963aeac706b61e096c03ccadbd90231adbad\": rpc error: code = NotFound desc = could not find container \"cc747737e43584279625d25fc7dc963aeac706b61e096c03ccadbd90231adbad\": container with ID starting with cc747737e43584279625d25fc7dc963aeac706b61e096c03ccadbd90231adbad not found: ID does not exist" Sep 30 07:46:01 crc kubenswrapper[4760]: I0930 07:46:01.719937 4760 scope.go:117] "RemoveContainer" containerID="acc2e1c979d3ac62e71f41efccb9e82ae2a4395ac33f98d2ef759d67ef452324" Sep 30 07:46:01 crc kubenswrapper[4760]: I0930 07:46:01.727416 4760 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-6cvdd"] Sep 30 07:46:01 crc kubenswrapper[4760]: I0930 07:46:01.728613 4760 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-6cvdd"] Sep 30 07:46:01 crc kubenswrapper[4760]: I0930 07:46:01.735806 4760 scope.go:117] "RemoveContainer" 
containerID="acc2e1c979d3ac62e71f41efccb9e82ae2a4395ac33f98d2ef759d67ef452324" Sep 30 07:46:01 crc kubenswrapper[4760]: E0930 07:46:01.736105 4760 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"acc2e1c979d3ac62e71f41efccb9e82ae2a4395ac33f98d2ef759d67ef452324\": container with ID starting with acc2e1c979d3ac62e71f41efccb9e82ae2a4395ac33f98d2ef759d67ef452324 not found: ID does not exist" containerID="acc2e1c979d3ac62e71f41efccb9e82ae2a4395ac33f98d2ef759d67ef452324" Sep 30 07:46:01 crc kubenswrapper[4760]: I0930 07:46:01.736155 4760 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"acc2e1c979d3ac62e71f41efccb9e82ae2a4395ac33f98d2ef759d67ef452324"} err="failed to get container status \"acc2e1c979d3ac62e71f41efccb9e82ae2a4395ac33f98d2ef759d67ef452324\": rpc error: code = NotFound desc = could not find container \"acc2e1c979d3ac62e71f41efccb9e82ae2a4395ac33f98d2ef759d67ef452324\": container with ID starting with acc2e1c979d3ac62e71f41efccb9e82ae2a4395ac33f98d2ef759d67ef452324 not found: ID does not exist" Sep 30 07:46:01 crc kubenswrapper[4760]: I0930 07:46:01.805762 4760 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ff7ef449-b6c7-4e55-886d-b66dd8327e7b-client-ca\") on node \"crc\" DevicePath \"\"" Sep 30 07:46:01 crc kubenswrapper[4760]: I0930 07:46:01.805793 4760 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n62bz\" (UniqueName: \"kubernetes.io/projected/ff7ef449-b6c7-4e55-886d-b66dd8327e7b-kube-api-access-n62bz\") on node \"crc\" DevicePath \"\"" Sep 30 07:46:01 crc kubenswrapper[4760]: I0930 07:46:01.805808 4760 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ff7ef449-b6c7-4e55-886d-b66dd8327e7b-config\") on node \"crc\" DevicePath \"\"" Sep 30 07:46:01 crc kubenswrapper[4760]: I0930 07:46:01.805821 4760 
reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ff7ef449-b6c7-4e55-886d-b66dd8327e7b-serving-cert\") on node \"crc\" DevicePath \"\"" Sep 30 07:46:02 crc kubenswrapper[4760]: I0930 07:46:02.047827 4760 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-m8vl7"] Sep 30 07:46:02 crc kubenswrapper[4760]: I0930 07:46:02.058349 4760 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-m8vl7"] Sep 30 07:46:02 crc kubenswrapper[4760]: I0930 07:46:02.684401 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-8685dbd756-g2bzs"] Sep 30 07:46:02 crc kubenswrapper[4760]: E0930 07:46:02.684676 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fa5b63b3-2bc6-496c-8841-471e2f43021c" containerName="controller-manager" Sep 30 07:46:02 crc kubenswrapper[4760]: I0930 07:46:02.684691 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="fa5b63b3-2bc6-496c-8841-471e2f43021c" containerName="controller-manager" Sep 30 07:46:02 crc kubenswrapper[4760]: E0930 07:46:02.684710 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ff7ef449-b6c7-4e55-886d-b66dd8327e7b" containerName="route-controller-manager" Sep 30 07:46:02 crc kubenswrapper[4760]: I0930 07:46:02.684719 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="ff7ef449-b6c7-4e55-886d-b66dd8327e7b" containerName="route-controller-manager" Sep 30 07:46:02 crc kubenswrapper[4760]: I0930 07:46:02.684860 4760 memory_manager.go:354] "RemoveStaleState removing state" podUID="fa5b63b3-2bc6-496c-8841-471e2f43021c" containerName="controller-manager" Sep 30 07:46:02 crc kubenswrapper[4760]: I0930 07:46:02.684881 4760 memory_manager.go:354] "RemoveStaleState removing state" podUID="ff7ef449-b6c7-4e55-886d-b66dd8327e7b" 
containerName="route-controller-manager" Sep 30 07:46:02 crc kubenswrapper[4760]: I0930 07:46:02.685422 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-8685dbd756-g2bzs" Sep 30 07:46:02 crc kubenswrapper[4760]: I0930 07:46:02.689974 4760 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Sep 30 07:46:02 crc kubenswrapper[4760]: I0930 07:46:02.690006 4760 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Sep 30 07:46:02 crc kubenswrapper[4760]: I0930 07:46:02.690152 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Sep 30 07:46:02 crc kubenswrapper[4760]: I0930 07:46:02.690199 4760 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Sep 30 07:46:02 crc kubenswrapper[4760]: I0930 07:46:02.690273 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Sep 30 07:46:02 crc kubenswrapper[4760]: I0930 07:46:02.691534 4760 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Sep 30 07:46:02 crc kubenswrapper[4760]: I0930 07:46:02.699467 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-6844f7d76b-xkkxh"] Sep 30 07:46:02 crc kubenswrapper[4760]: I0930 07:46:02.701379 4760 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-6844f7d76b-xkkxh" Sep 30 07:46:02 crc kubenswrapper[4760]: I0930 07:46:02.703153 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Sep 30 07:46:02 crc kubenswrapper[4760]: I0930 07:46:02.703766 4760 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Sep 30 07:46:02 crc kubenswrapper[4760]: I0930 07:46:02.704001 4760 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Sep 30 07:46:02 crc kubenswrapper[4760]: I0930 07:46:02.704537 4760 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Sep 30 07:46:02 crc kubenswrapper[4760]: I0930 07:46:02.704743 4760 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Sep 30 07:46:02 crc kubenswrapper[4760]: I0930 07:46:02.707995 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Sep 30 07:46:02 crc kubenswrapper[4760]: I0930 07:46:02.708463 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-6844f7d76b-xkkxh"] Sep 30 07:46:02 crc kubenswrapper[4760]: I0930 07:46:02.710518 4760 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Sep 30 07:46:02 crc kubenswrapper[4760]: I0930 07:46:02.715093 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-8685dbd756-g2bzs"] Sep 30 07:46:02 crc kubenswrapper[4760]: I0930 07:46:02.818635 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/c91a114c-342a-4e26-95d9-aa1e021afa8d-config\") pod \"controller-manager-6844f7d76b-xkkxh\" (UID: \"c91a114c-342a-4e26-95d9-aa1e021afa8d\") " pod="openshift-controller-manager/controller-manager-6844f7d76b-xkkxh" Sep 30 07:46:02 crc kubenswrapper[4760]: I0930 07:46:02.819009 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a06a5e4c-e117-4ad2-b74b-09a7e5f832fc-serving-cert\") pod \"route-controller-manager-8685dbd756-g2bzs\" (UID: \"a06a5e4c-e117-4ad2-b74b-09a7e5f832fc\") " pod="openshift-route-controller-manager/route-controller-manager-8685dbd756-g2bzs" Sep 30 07:46:02 crc kubenswrapper[4760]: I0930 07:46:02.819037 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/c91a114c-342a-4e26-95d9-aa1e021afa8d-proxy-ca-bundles\") pod \"controller-manager-6844f7d76b-xkkxh\" (UID: \"c91a114c-342a-4e26-95d9-aa1e021afa8d\") " pod="openshift-controller-manager/controller-manager-6844f7d76b-xkkxh" Sep 30 07:46:02 crc kubenswrapper[4760]: I0930 07:46:02.819091 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5mc9h\" (UniqueName: \"kubernetes.io/projected/c91a114c-342a-4e26-95d9-aa1e021afa8d-kube-api-access-5mc9h\") pod \"controller-manager-6844f7d76b-xkkxh\" (UID: \"c91a114c-342a-4e26-95d9-aa1e021afa8d\") " pod="openshift-controller-manager/controller-manager-6844f7d76b-xkkxh" Sep 30 07:46:02 crc kubenswrapper[4760]: I0930 07:46:02.819116 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a06a5e4c-e117-4ad2-b74b-09a7e5f832fc-config\") pod \"route-controller-manager-8685dbd756-g2bzs\" (UID: \"a06a5e4c-e117-4ad2-b74b-09a7e5f832fc\") " 
pod="openshift-route-controller-manager/route-controller-manager-8685dbd756-g2bzs" Sep 30 07:46:02 crc kubenswrapper[4760]: I0930 07:46:02.819161 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c91a114c-342a-4e26-95d9-aa1e021afa8d-client-ca\") pod \"controller-manager-6844f7d76b-xkkxh\" (UID: \"c91a114c-342a-4e26-95d9-aa1e021afa8d\") " pod="openshift-controller-manager/controller-manager-6844f7d76b-xkkxh" Sep 30 07:46:02 crc kubenswrapper[4760]: I0930 07:46:02.819181 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a06a5e4c-e117-4ad2-b74b-09a7e5f832fc-client-ca\") pod \"route-controller-manager-8685dbd756-g2bzs\" (UID: \"a06a5e4c-e117-4ad2-b74b-09a7e5f832fc\") " pod="openshift-route-controller-manager/route-controller-manager-8685dbd756-g2bzs" Sep 30 07:46:02 crc kubenswrapper[4760]: I0930 07:46:02.819210 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nmfb8\" (UniqueName: \"kubernetes.io/projected/a06a5e4c-e117-4ad2-b74b-09a7e5f832fc-kube-api-access-nmfb8\") pod \"route-controller-manager-8685dbd756-g2bzs\" (UID: \"a06a5e4c-e117-4ad2-b74b-09a7e5f832fc\") " pod="openshift-route-controller-manager/route-controller-manager-8685dbd756-g2bzs" Sep 30 07:46:02 crc kubenswrapper[4760]: I0930 07:46:02.819235 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c91a114c-342a-4e26-95d9-aa1e021afa8d-serving-cert\") pod \"controller-manager-6844f7d76b-xkkxh\" (UID: \"c91a114c-342a-4e26-95d9-aa1e021afa8d\") " pod="openshift-controller-manager/controller-manager-6844f7d76b-xkkxh" Sep 30 07:46:02 crc kubenswrapper[4760]: I0930 07:46:02.919845 4760 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c91a114c-342a-4e26-95d9-aa1e021afa8d-config\") pod \"controller-manager-6844f7d76b-xkkxh\" (UID: \"c91a114c-342a-4e26-95d9-aa1e021afa8d\") " pod="openshift-controller-manager/controller-manager-6844f7d76b-xkkxh" Sep 30 07:46:02 crc kubenswrapper[4760]: I0930 07:46:02.919894 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a06a5e4c-e117-4ad2-b74b-09a7e5f832fc-serving-cert\") pod \"route-controller-manager-8685dbd756-g2bzs\" (UID: \"a06a5e4c-e117-4ad2-b74b-09a7e5f832fc\") " pod="openshift-route-controller-manager/route-controller-manager-8685dbd756-g2bzs" Sep 30 07:46:02 crc kubenswrapper[4760]: I0930 07:46:02.919913 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/c91a114c-342a-4e26-95d9-aa1e021afa8d-proxy-ca-bundles\") pod \"controller-manager-6844f7d76b-xkkxh\" (UID: \"c91a114c-342a-4e26-95d9-aa1e021afa8d\") " pod="openshift-controller-manager/controller-manager-6844f7d76b-xkkxh" Sep 30 07:46:02 crc kubenswrapper[4760]: I0930 07:46:02.919953 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5mc9h\" (UniqueName: \"kubernetes.io/projected/c91a114c-342a-4e26-95d9-aa1e021afa8d-kube-api-access-5mc9h\") pod \"controller-manager-6844f7d76b-xkkxh\" (UID: \"c91a114c-342a-4e26-95d9-aa1e021afa8d\") " pod="openshift-controller-manager/controller-manager-6844f7d76b-xkkxh" Sep 30 07:46:02 crc kubenswrapper[4760]: I0930 07:46:02.919975 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a06a5e4c-e117-4ad2-b74b-09a7e5f832fc-config\") pod \"route-controller-manager-8685dbd756-g2bzs\" (UID: \"a06a5e4c-e117-4ad2-b74b-09a7e5f832fc\") " pod="openshift-route-controller-manager/route-controller-manager-8685dbd756-g2bzs" 
Sep 30 07:46:02 crc kubenswrapper[4760]: I0930 07:46:02.920011 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c91a114c-342a-4e26-95d9-aa1e021afa8d-client-ca\") pod \"controller-manager-6844f7d76b-xkkxh\" (UID: \"c91a114c-342a-4e26-95d9-aa1e021afa8d\") " pod="openshift-controller-manager/controller-manager-6844f7d76b-xkkxh" Sep 30 07:46:02 crc kubenswrapper[4760]: I0930 07:46:02.920030 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a06a5e4c-e117-4ad2-b74b-09a7e5f832fc-client-ca\") pod \"route-controller-manager-8685dbd756-g2bzs\" (UID: \"a06a5e4c-e117-4ad2-b74b-09a7e5f832fc\") " pod="openshift-route-controller-manager/route-controller-manager-8685dbd756-g2bzs" Sep 30 07:46:02 crc kubenswrapper[4760]: I0930 07:46:02.920048 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nmfb8\" (UniqueName: \"kubernetes.io/projected/a06a5e4c-e117-4ad2-b74b-09a7e5f832fc-kube-api-access-nmfb8\") pod \"route-controller-manager-8685dbd756-g2bzs\" (UID: \"a06a5e4c-e117-4ad2-b74b-09a7e5f832fc\") " pod="openshift-route-controller-manager/route-controller-manager-8685dbd756-g2bzs" Sep 30 07:46:02 crc kubenswrapper[4760]: I0930 07:46:02.920064 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c91a114c-342a-4e26-95d9-aa1e021afa8d-serving-cert\") pod \"controller-manager-6844f7d76b-xkkxh\" (UID: \"c91a114c-342a-4e26-95d9-aa1e021afa8d\") " pod="openshift-controller-manager/controller-manager-6844f7d76b-xkkxh" Sep 30 07:46:02 crc kubenswrapper[4760]: I0930 07:46:02.922813 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c91a114c-342a-4e26-95d9-aa1e021afa8d-config\") pod \"controller-manager-6844f7d76b-xkkxh\" (UID: 
\"c91a114c-342a-4e26-95d9-aa1e021afa8d\") " pod="openshift-controller-manager/controller-manager-6844f7d76b-xkkxh" Sep 30 07:46:02 crc kubenswrapper[4760]: I0930 07:46:02.926145 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c91a114c-342a-4e26-95d9-aa1e021afa8d-client-ca\") pod \"controller-manager-6844f7d76b-xkkxh\" (UID: \"c91a114c-342a-4e26-95d9-aa1e021afa8d\") " pod="openshift-controller-manager/controller-manager-6844f7d76b-xkkxh" Sep 30 07:46:02 crc kubenswrapper[4760]: I0930 07:46:02.926976 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a06a5e4c-e117-4ad2-b74b-09a7e5f832fc-config\") pod \"route-controller-manager-8685dbd756-g2bzs\" (UID: \"a06a5e4c-e117-4ad2-b74b-09a7e5f832fc\") " pod="openshift-route-controller-manager/route-controller-manager-8685dbd756-g2bzs" Sep 30 07:46:02 crc kubenswrapper[4760]: I0930 07:46:02.927503 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a06a5e4c-e117-4ad2-b74b-09a7e5f832fc-client-ca\") pod \"route-controller-manager-8685dbd756-g2bzs\" (UID: \"a06a5e4c-e117-4ad2-b74b-09a7e5f832fc\") " pod="openshift-route-controller-manager/route-controller-manager-8685dbd756-g2bzs" Sep 30 07:46:02 crc kubenswrapper[4760]: I0930 07:46:02.930459 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c91a114c-342a-4e26-95d9-aa1e021afa8d-serving-cert\") pod \"controller-manager-6844f7d76b-xkkxh\" (UID: \"c91a114c-342a-4e26-95d9-aa1e021afa8d\") " pod="openshift-controller-manager/controller-manager-6844f7d76b-xkkxh" Sep 30 07:46:02 crc kubenswrapper[4760]: I0930 07:46:02.931985 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: 
\"kubernetes.io/configmap/c91a114c-342a-4e26-95d9-aa1e021afa8d-proxy-ca-bundles\") pod \"controller-manager-6844f7d76b-xkkxh\" (UID: \"c91a114c-342a-4e26-95d9-aa1e021afa8d\") " pod="openshift-controller-manager/controller-manager-6844f7d76b-xkkxh" Sep 30 07:46:02 crc kubenswrapper[4760]: I0930 07:46:02.932263 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a06a5e4c-e117-4ad2-b74b-09a7e5f832fc-serving-cert\") pod \"route-controller-manager-8685dbd756-g2bzs\" (UID: \"a06a5e4c-e117-4ad2-b74b-09a7e5f832fc\") " pod="openshift-route-controller-manager/route-controller-manager-8685dbd756-g2bzs" Sep 30 07:46:02 crc kubenswrapper[4760]: I0930 07:46:02.965651 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nmfb8\" (UniqueName: \"kubernetes.io/projected/a06a5e4c-e117-4ad2-b74b-09a7e5f832fc-kube-api-access-nmfb8\") pod \"route-controller-manager-8685dbd756-g2bzs\" (UID: \"a06a5e4c-e117-4ad2-b74b-09a7e5f832fc\") " pod="openshift-route-controller-manager/route-controller-manager-8685dbd756-g2bzs" Sep 30 07:46:02 crc kubenswrapper[4760]: I0930 07:46:02.971263 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5mc9h\" (UniqueName: \"kubernetes.io/projected/c91a114c-342a-4e26-95d9-aa1e021afa8d-kube-api-access-5mc9h\") pod \"controller-manager-6844f7d76b-xkkxh\" (UID: \"c91a114c-342a-4e26-95d9-aa1e021afa8d\") " pod="openshift-controller-manager/controller-manager-6844f7d76b-xkkxh" Sep 30 07:46:03 crc kubenswrapper[4760]: I0930 07:46:03.034800 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-8685dbd756-g2bzs" Sep 30 07:46:03 crc kubenswrapper[4760]: I0930 07:46:03.048752 4760 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-6844f7d76b-xkkxh" Sep 30 07:46:03 crc kubenswrapper[4760]: I0930 07:46:03.074198 4760 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fa5b63b3-2bc6-496c-8841-471e2f43021c" path="/var/lib/kubelet/pods/fa5b63b3-2bc6-496c-8841-471e2f43021c/volumes" Sep 30 07:46:03 crc kubenswrapper[4760]: I0930 07:46:03.074858 4760 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ff7ef449-b6c7-4e55-886d-b66dd8327e7b" path="/var/lib/kubelet/pods/ff7ef449-b6c7-4e55-886d-b66dd8327e7b/volumes" Sep 30 07:46:03 crc kubenswrapper[4760]: I0930 07:46:03.889034 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-6844f7d76b-xkkxh"] Sep 30 07:46:03 crc kubenswrapper[4760]: I0930 07:46:03.895546 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-8685dbd756-g2bzs"] Sep 30 07:46:05 crc kubenswrapper[4760]: I0930 07:46:05.726216 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-54c4f7bf85-ndtrq" event={"ID":"be9587c7-9bbb-48ad-867a-1830129f24b3","Type":"ContainerStarted","Data":"da26b12baf306d6f698bc21b444d8c70ccece9e1bdfe3e66b7b816767abade28"} Sep 30 07:46:05 crc kubenswrapper[4760]: I0930 07:46:05.726632 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-webhook-server-54c4f7bf85-ndtrq" Sep 30 07:46:05 crc kubenswrapper[4760]: I0930 07:46:05.728453 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6844f7d76b-xkkxh" event={"ID":"c91a114c-342a-4e26-95d9-aa1e021afa8d","Type":"ContainerStarted","Data":"07250dbccb0aa95f12165b21b0ca100d0616d1072ed9cbcf5de72967652609b8"} Sep 30 07:46:05 crc kubenswrapper[4760]: I0930 07:46:05.728490 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-controller-manager/controller-manager-6844f7d76b-xkkxh" event={"ID":"c91a114c-342a-4e26-95d9-aa1e021afa8d","Type":"ContainerStarted","Data":"63072d126b63c66ae4d60b1ac168eb03714599fae3523d331c1591b882319cc7"} Sep 30 07:46:05 crc kubenswrapper[4760]: I0930 07:46:05.728638 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-6844f7d76b-xkkxh" Sep 30 07:46:05 crc kubenswrapper[4760]: I0930 07:46:05.730284 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-7db464cf7c-k5lfr" event={"ID":"900aa033-c62f-42f8-a964-9d0e113eca21","Type":"ContainerStarted","Data":"67084c0b8fdd0bd02e7b96e002847536ce4c301de1539707201fba73fb076076"} Sep 30 07:46:05 crc kubenswrapper[4760]: I0930 07:46:05.730430 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-controller-manager-7db464cf7c-k5lfr" Sep 30 07:46:05 crc kubenswrapper[4760]: I0930 07:46:05.731770 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-8685dbd756-g2bzs" event={"ID":"a06a5e4c-e117-4ad2-b74b-09a7e5f832fc","Type":"ContainerStarted","Data":"acfdb48467eb13561167790d3b20ce3730bf32342978b8bf7027b23be72453e1"} Sep 30 07:46:05 crc kubenswrapper[4760]: I0930 07:46:05.731807 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-8685dbd756-g2bzs" event={"ID":"a06a5e4c-e117-4ad2-b74b-09a7e5f832fc","Type":"ContainerStarted","Data":"2a898857facde5eb398893dd2a1e30d33baa11f86028090bd09bfa9fcb77901d"} Sep 30 07:46:05 crc kubenswrapper[4760]: I0930 07:46:05.731984 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-8685dbd756-g2bzs" Sep 30 07:46:05 crc kubenswrapper[4760]: I0930 07:46:05.735875 4760 kubelet.go:2542] "SyncLoop 
(probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-6844f7d76b-xkkxh" Sep 30 07:46:05 crc kubenswrapper[4760]: I0930 07:46:05.753835 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-8685dbd756-g2bzs" Sep 30 07:46:05 crc kubenswrapper[4760]: I0930 07:46:05.754135 4760 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-webhook-server-54c4f7bf85-ndtrq" podStartSLOduration=0.999050622 podStartE2EDuration="6.754123925s" podCreationTimestamp="2025-09-30 07:45:59 +0000 UTC" firstStartedPulling="2025-09-30 07:45:59.630044177 +0000 UTC m=+745.272950589" lastFinishedPulling="2025-09-30 07:46:05.38511748 +0000 UTC m=+751.028023892" observedRunningTime="2025-09-30 07:46:05.750518873 +0000 UTC m=+751.393425285" watchObservedRunningTime="2025-09-30 07:46:05.754123925 +0000 UTC m=+751.397030337" Sep 30 07:46:05 crc kubenswrapper[4760]: I0930 07:46:05.825770 4760 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-controller-manager-7db464cf7c-k5lfr" podStartSLOduration=2.23817832 podStartE2EDuration="7.825751675s" podCreationTimestamp="2025-09-30 07:45:58 +0000 UTC" firstStartedPulling="2025-09-30 07:45:59.531005657 +0000 UTC m=+745.173912069" lastFinishedPulling="2025-09-30 07:46:05.118578992 +0000 UTC m=+750.761485424" observedRunningTime="2025-09-30 07:46:05.81888072 +0000 UTC m=+751.461787132" watchObservedRunningTime="2025-09-30 07:46:05.825751675 +0000 UTC m=+751.468658087" Sep 30 07:46:05 crc kubenswrapper[4760]: I0930 07:46:05.858061 4760 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-6844f7d76b-xkkxh" podStartSLOduration=4.85804571 podStartE2EDuration="4.85804571s" podCreationTimestamp="2025-09-30 07:46:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 
UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 07:46:05.845203572 +0000 UTC m=+751.488109974" watchObservedRunningTime="2025-09-30 07:46:05.85804571 +0000 UTC m=+751.500952122" Sep 30 07:46:05 crc kubenswrapper[4760]: I0930 07:46:05.915873 4760 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-8685dbd756-g2bzs" podStartSLOduration=4.915855776 podStartE2EDuration="4.915855776s" podCreationTimestamp="2025-09-30 07:46:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 07:46:05.913501996 +0000 UTC m=+751.556408408" watchObservedRunningTime="2025-09-30 07:46:05.915855776 +0000 UTC m=+751.558762188" Sep 30 07:46:10 crc kubenswrapper[4760]: I0930 07:46:10.312223 4760 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Sep 30 07:46:16 crc kubenswrapper[4760]: I0930 07:46:16.668892 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-rbb7q"] Sep 30 07:46:16 crc kubenswrapper[4760]: I0930 07:46:16.670807 4760 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-rbb7q" Sep 30 07:46:16 crc kubenswrapper[4760]: I0930 07:46:16.680472 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-rbb7q"] Sep 30 07:46:16 crc kubenswrapper[4760]: I0930 07:46:16.805366 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/efb37104-b575-43d3-9cc5-84f022a1c02a-utilities\") pod \"redhat-marketplace-rbb7q\" (UID: \"efb37104-b575-43d3-9cc5-84f022a1c02a\") " pod="openshift-marketplace/redhat-marketplace-rbb7q" Sep 30 07:46:16 crc kubenswrapper[4760]: I0930 07:46:16.805419 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h5c2d\" (UniqueName: \"kubernetes.io/projected/efb37104-b575-43d3-9cc5-84f022a1c02a-kube-api-access-h5c2d\") pod \"redhat-marketplace-rbb7q\" (UID: \"efb37104-b575-43d3-9cc5-84f022a1c02a\") " pod="openshift-marketplace/redhat-marketplace-rbb7q" Sep 30 07:46:16 crc kubenswrapper[4760]: I0930 07:46:16.805489 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/efb37104-b575-43d3-9cc5-84f022a1c02a-catalog-content\") pod \"redhat-marketplace-rbb7q\" (UID: \"efb37104-b575-43d3-9cc5-84f022a1c02a\") " pod="openshift-marketplace/redhat-marketplace-rbb7q" Sep 30 07:46:16 crc kubenswrapper[4760]: I0930 07:46:16.906646 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/efb37104-b575-43d3-9cc5-84f022a1c02a-catalog-content\") pod \"redhat-marketplace-rbb7q\" (UID: \"efb37104-b575-43d3-9cc5-84f022a1c02a\") " pod="openshift-marketplace/redhat-marketplace-rbb7q" Sep 30 07:46:16 crc kubenswrapper[4760]: I0930 07:46:16.906749 4760 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/efb37104-b575-43d3-9cc5-84f022a1c02a-utilities\") pod \"redhat-marketplace-rbb7q\" (UID: \"efb37104-b575-43d3-9cc5-84f022a1c02a\") " pod="openshift-marketplace/redhat-marketplace-rbb7q" Sep 30 07:46:16 crc kubenswrapper[4760]: I0930 07:46:16.906775 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h5c2d\" (UniqueName: \"kubernetes.io/projected/efb37104-b575-43d3-9cc5-84f022a1c02a-kube-api-access-h5c2d\") pod \"redhat-marketplace-rbb7q\" (UID: \"efb37104-b575-43d3-9cc5-84f022a1c02a\") " pod="openshift-marketplace/redhat-marketplace-rbb7q" Sep 30 07:46:16 crc kubenswrapper[4760]: I0930 07:46:16.907714 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/efb37104-b575-43d3-9cc5-84f022a1c02a-catalog-content\") pod \"redhat-marketplace-rbb7q\" (UID: \"efb37104-b575-43d3-9cc5-84f022a1c02a\") " pod="openshift-marketplace/redhat-marketplace-rbb7q" Sep 30 07:46:16 crc kubenswrapper[4760]: I0930 07:46:16.908032 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/efb37104-b575-43d3-9cc5-84f022a1c02a-utilities\") pod \"redhat-marketplace-rbb7q\" (UID: \"efb37104-b575-43d3-9cc5-84f022a1c02a\") " pod="openshift-marketplace/redhat-marketplace-rbb7q" Sep 30 07:46:16 crc kubenswrapper[4760]: I0930 07:46:16.928660 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h5c2d\" (UniqueName: \"kubernetes.io/projected/efb37104-b575-43d3-9cc5-84f022a1c02a-kube-api-access-h5c2d\") pod \"redhat-marketplace-rbb7q\" (UID: \"efb37104-b575-43d3-9cc5-84f022a1c02a\") " pod="openshift-marketplace/redhat-marketplace-rbb7q" Sep 30 07:46:17 crc kubenswrapper[4760]: I0930 07:46:17.031511 4760 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-rbb7q" Sep 30 07:46:17 crc kubenswrapper[4760]: I0930 07:46:17.468031 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-rbb7q"] Sep 30 07:46:17 crc kubenswrapper[4760]: W0930 07:46:17.475684 4760 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podefb37104_b575_43d3_9cc5_84f022a1c02a.slice/crio-006632047ae0f6a0d75f0a3fb70ed5842819cf8d97369f5f41a3c4435335713c WatchSource:0}: Error finding container 006632047ae0f6a0d75f0a3fb70ed5842819cf8d97369f5f41a3c4435335713c: Status 404 returned error can't find the container with id 006632047ae0f6a0d75f0a3fb70ed5842819cf8d97369f5f41a3c4435335713c Sep 30 07:46:17 crc kubenswrapper[4760]: I0930 07:46:17.800290 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rbb7q" event={"ID":"efb37104-b575-43d3-9cc5-84f022a1c02a","Type":"ContainerStarted","Data":"006632047ae0f6a0d75f0a3fb70ed5842819cf8d97369f5f41a3c4435335713c"} Sep 30 07:46:18 crc kubenswrapper[4760]: I0930 07:46:18.818591 4760 generic.go:334] "Generic (PLEG): container finished" podID="efb37104-b575-43d3-9cc5-84f022a1c02a" containerID="879daea9d9d10d3f476785f7d3bccfa6c36f2f6461d0271284520843c3a02d05" exitCode=0 Sep 30 07:46:18 crc kubenswrapper[4760]: I0930 07:46:18.818660 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rbb7q" event={"ID":"efb37104-b575-43d3-9cc5-84f022a1c02a","Type":"ContainerDied","Data":"879daea9d9d10d3f476785f7d3bccfa6c36f2f6461d0271284520843c3a02d05"} Sep 30 07:46:19 crc kubenswrapper[4760]: I0930 07:46:19.113621 4760 patch_prober.go:28] interesting pod/machine-config-daemon-f2lrk container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: 
connect: connection refused" start-of-body= Sep 30 07:46:19 crc kubenswrapper[4760]: I0930 07:46:19.114013 4760 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-f2lrk" podUID="7a9c8270-6964-4886-87d0-227b05b76da4" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 30 07:46:19 crc kubenswrapper[4760]: I0930 07:46:19.114058 4760 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-f2lrk" Sep 30 07:46:19 crc kubenswrapper[4760]: I0930 07:46:19.114856 4760 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"2ee85f916ed74821bb70e759d7116d1ced5e1cd63215791b862f6d48359d7b6c"} pod="openshift-machine-config-operator/machine-config-daemon-f2lrk" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Sep 30 07:46:19 crc kubenswrapper[4760]: I0930 07:46:19.114950 4760 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-f2lrk" podUID="7a9c8270-6964-4886-87d0-227b05b76da4" containerName="machine-config-daemon" containerID="cri-o://2ee85f916ed74821bb70e759d7116d1ced5e1cd63215791b862f6d48359d7b6c" gracePeriod=600 Sep 30 07:46:19 crc kubenswrapper[4760]: I0930 07:46:19.412636 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-webhook-server-54c4f7bf85-ndtrq" Sep 30 07:46:19 crc kubenswrapper[4760]: I0930 07:46:19.827994 4760 generic.go:334] "Generic (PLEG): container finished" podID="7a9c8270-6964-4886-87d0-227b05b76da4" containerID="2ee85f916ed74821bb70e759d7116d1ced5e1cd63215791b862f6d48359d7b6c" exitCode=0 Sep 30 07:46:19 crc kubenswrapper[4760]: I0930 07:46:19.828055 
4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-f2lrk" event={"ID":"7a9c8270-6964-4886-87d0-227b05b76da4","Type":"ContainerDied","Data":"2ee85f916ed74821bb70e759d7116d1ced5e1cd63215791b862f6d48359d7b6c"} Sep 30 07:46:19 crc kubenswrapper[4760]: I0930 07:46:19.828364 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-f2lrk" event={"ID":"7a9c8270-6964-4886-87d0-227b05b76da4","Type":"ContainerStarted","Data":"9ca37c299871442165175723aa160e44f88bbaa555a2a12583d4390e53262571"} Sep 30 07:46:19 crc kubenswrapper[4760]: I0930 07:46:19.828386 4760 scope.go:117] "RemoveContainer" containerID="c14c5e22dc0508a193bdba7225efcdfbf417d8b9976aacad55c5d22c10bc7a92" Sep 30 07:46:20 crc kubenswrapper[4760]: I0930 07:46:20.836891 4760 generic.go:334] "Generic (PLEG): container finished" podID="efb37104-b575-43d3-9cc5-84f022a1c02a" containerID="26da0995f1c952c6ac201d8beb2efccf05afcddf7eed19fd8dac3066f86e573c" exitCode=0 Sep 30 07:46:20 crc kubenswrapper[4760]: I0930 07:46:20.836993 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rbb7q" event={"ID":"efb37104-b575-43d3-9cc5-84f022a1c02a","Type":"ContainerDied","Data":"26da0995f1c952c6ac201d8beb2efccf05afcddf7eed19fd8dac3066f86e573c"} Sep 30 07:46:21 crc kubenswrapper[4760]: I0930 07:46:21.849903 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rbb7q" event={"ID":"efb37104-b575-43d3-9cc5-84f022a1c02a","Type":"ContainerStarted","Data":"d0c6678f0f1884d74cab596a35dcbd5260c80714017afd1429cb1d99edb9b160"} Sep 30 07:46:21 crc kubenswrapper[4760]: I0930 07:46:21.870908 4760 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-rbb7q" podStartSLOduration=3.223615645 podStartE2EDuration="5.8708899s" podCreationTimestamp="2025-09-30 07:46:16 +0000 UTC" 
firstStartedPulling="2025-09-30 07:46:18.820760446 +0000 UTC m=+764.463666858" lastFinishedPulling="2025-09-30 07:46:21.468034681 +0000 UTC m=+767.110941113" observedRunningTime="2025-09-30 07:46:21.869444613 +0000 UTC m=+767.512351035" watchObservedRunningTime="2025-09-30 07:46:21.8708899 +0000 UTC m=+767.513796312" Sep 30 07:46:27 crc kubenswrapper[4760]: I0930 07:46:27.032664 4760 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-rbb7q" Sep 30 07:46:27 crc kubenswrapper[4760]: I0930 07:46:27.033376 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-rbb7q" Sep 30 07:46:27 crc kubenswrapper[4760]: I0930 07:46:27.097842 4760 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-rbb7q" Sep 30 07:46:27 crc kubenswrapper[4760]: I0930 07:46:27.963617 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-rbb7q" Sep 30 07:46:28 crc kubenswrapper[4760]: I0930 07:46:28.021983 4760 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-rbb7q"] Sep 30 07:46:29 crc kubenswrapper[4760]: I0930 07:46:29.906992 4760 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-rbb7q" podUID="efb37104-b575-43d3-9cc5-84f022a1c02a" containerName="registry-server" containerID="cri-o://d0c6678f0f1884d74cab596a35dcbd5260c80714017afd1429cb1d99edb9b160" gracePeriod=2 Sep 30 07:46:30 crc kubenswrapper[4760]: I0930 07:46:30.496171 4760 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-rbb7q" Sep 30 07:46:30 crc kubenswrapper[4760]: I0930 07:46:30.523592 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h5c2d\" (UniqueName: \"kubernetes.io/projected/efb37104-b575-43d3-9cc5-84f022a1c02a-kube-api-access-h5c2d\") pod \"efb37104-b575-43d3-9cc5-84f022a1c02a\" (UID: \"efb37104-b575-43d3-9cc5-84f022a1c02a\") " Sep 30 07:46:30 crc kubenswrapper[4760]: I0930 07:46:30.523654 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/efb37104-b575-43d3-9cc5-84f022a1c02a-catalog-content\") pod \"efb37104-b575-43d3-9cc5-84f022a1c02a\" (UID: \"efb37104-b575-43d3-9cc5-84f022a1c02a\") " Sep 30 07:46:30 crc kubenswrapper[4760]: I0930 07:46:30.523682 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/efb37104-b575-43d3-9cc5-84f022a1c02a-utilities\") pod \"efb37104-b575-43d3-9cc5-84f022a1c02a\" (UID: \"efb37104-b575-43d3-9cc5-84f022a1c02a\") " Sep 30 07:46:30 crc kubenswrapper[4760]: I0930 07:46:30.524932 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/efb37104-b575-43d3-9cc5-84f022a1c02a-utilities" (OuterVolumeSpecName: "utilities") pod "efb37104-b575-43d3-9cc5-84f022a1c02a" (UID: "efb37104-b575-43d3-9cc5-84f022a1c02a"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 07:46:30 crc kubenswrapper[4760]: I0930 07:46:30.531382 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/efb37104-b575-43d3-9cc5-84f022a1c02a-kube-api-access-h5c2d" (OuterVolumeSpecName: "kube-api-access-h5c2d") pod "efb37104-b575-43d3-9cc5-84f022a1c02a" (UID: "efb37104-b575-43d3-9cc5-84f022a1c02a"). InnerVolumeSpecName "kube-api-access-h5c2d". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 07:46:30 crc kubenswrapper[4760]: I0930 07:46:30.548589 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/efb37104-b575-43d3-9cc5-84f022a1c02a-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "efb37104-b575-43d3-9cc5-84f022a1c02a" (UID: "efb37104-b575-43d3-9cc5-84f022a1c02a"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 07:46:30 crc kubenswrapper[4760]: I0930 07:46:30.624441 4760 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h5c2d\" (UniqueName: \"kubernetes.io/projected/efb37104-b575-43d3-9cc5-84f022a1c02a-kube-api-access-h5c2d\") on node \"crc\" DevicePath \"\"" Sep 30 07:46:30 crc kubenswrapper[4760]: I0930 07:46:30.624674 4760 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/efb37104-b575-43d3-9cc5-84f022a1c02a-catalog-content\") on node \"crc\" DevicePath \"\"" Sep 30 07:46:30 crc kubenswrapper[4760]: I0930 07:46:30.624794 4760 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/efb37104-b575-43d3-9cc5-84f022a1c02a-utilities\") on node \"crc\" DevicePath \"\"" Sep 30 07:46:30 crc kubenswrapper[4760]: I0930 07:46:30.754128 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-4jm6v"] Sep 30 07:46:30 crc kubenswrapper[4760]: E0930 07:46:30.754433 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="efb37104-b575-43d3-9cc5-84f022a1c02a" containerName="registry-server" Sep 30 07:46:30 crc kubenswrapper[4760]: I0930 07:46:30.754451 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="efb37104-b575-43d3-9cc5-84f022a1c02a" containerName="registry-server" Sep 30 07:46:30 crc kubenswrapper[4760]: E0930 07:46:30.754480 4760 cpu_manager.go:410] "RemoveStaleState: removing 
container" podUID="efb37104-b575-43d3-9cc5-84f022a1c02a" containerName="extract-utilities" Sep 30 07:46:30 crc kubenswrapper[4760]: I0930 07:46:30.754488 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="efb37104-b575-43d3-9cc5-84f022a1c02a" containerName="extract-utilities" Sep 30 07:46:30 crc kubenswrapper[4760]: E0930 07:46:30.754499 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="efb37104-b575-43d3-9cc5-84f022a1c02a" containerName="extract-content" Sep 30 07:46:30 crc kubenswrapper[4760]: I0930 07:46:30.754509 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="efb37104-b575-43d3-9cc5-84f022a1c02a" containerName="extract-content" Sep 30 07:46:30 crc kubenswrapper[4760]: I0930 07:46:30.754620 4760 memory_manager.go:354] "RemoveStaleState removing state" podUID="efb37104-b575-43d3-9cc5-84f022a1c02a" containerName="registry-server" Sep 30 07:46:30 crc kubenswrapper[4760]: I0930 07:46:30.755407 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-4jm6v" Sep 30 07:46:30 crc kubenswrapper[4760]: I0930 07:46:30.762087 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-4jm6v"] Sep 30 07:46:30 crc kubenswrapper[4760]: I0930 07:46:30.917677 4760 generic.go:334] "Generic (PLEG): container finished" podID="efb37104-b575-43d3-9cc5-84f022a1c02a" containerID="d0c6678f0f1884d74cab596a35dcbd5260c80714017afd1429cb1d99edb9b160" exitCode=0 Sep 30 07:46:30 crc kubenswrapper[4760]: I0930 07:46:30.918421 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rbb7q" event={"ID":"efb37104-b575-43d3-9cc5-84f022a1c02a","Type":"ContainerDied","Data":"d0c6678f0f1884d74cab596a35dcbd5260c80714017afd1429cb1d99edb9b160"} Sep 30 07:46:30 crc kubenswrapper[4760]: I0930 07:46:30.918473 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rbb7q" 
event={"ID":"efb37104-b575-43d3-9cc5-84f022a1c02a","Type":"ContainerDied","Data":"006632047ae0f6a0d75f0a3fb70ed5842819cf8d97369f5f41a3c4435335713c"} Sep 30 07:46:30 crc kubenswrapper[4760]: I0930 07:46:30.918498 4760 scope.go:117] "RemoveContainer" containerID="d0c6678f0f1884d74cab596a35dcbd5260c80714017afd1429cb1d99edb9b160" Sep 30 07:46:30 crc kubenswrapper[4760]: I0930 07:46:30.918664 4760 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-rbb7q" Sep 30 07:46:30 crc kubenswrapper[4760]: I0930 07:46:30.928719 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bd5cm\" (UniqueName: \"kubernetes.io/projected/1902c3f9-4bf7-428b-a316-6e17ea02e678-kube-api-access-bd5cm\") pod \"certified-operators-4jm6v\" (UID: \"1902c3f9-4bf7-428b-a316-6e17ea02e678\") " pod="openshift-marketplace/certified-operators-4jm6v" Sep 30 07:46:30 crc kubenswrapper[4760]: I0930 07:46:30.928819 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1902c3f9-4bf7-428b-a316-6e17ea02e678-utilities\") pod \"certified-operators-4jm6v\" (UID: \"1902c3f9-4bf7-428b-a316-6e17ea02e678\") " pod="openshift-marketplace/certified-operators-4jm6v" Sep 30 07:46:30 crc kubenswrapper[4760]: I0930 07:46:30.928894 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1902c3f9-4bf7-428b-a316-6e17ea02e678-catalog-content\") pod \"certified-operators-4jm6v\" (UID: \"1902c3f9-4bf7-428b-a316-6e17ea02e678\") " pod="openshift-marketplace/certified-operators-4jm6v" Sep 30 07:46:30 crc kubenswrapper[4760]: I0930 07:46:30.950489 4760 scope.go:117] "RemoveContainer" containerID="26da0995f1c952c6ac201d8beb2efccf05afcddf7eed19fd8dac3066f86e573c" Sep 30 07:46:30 crc kubenswrapper[4760]: I0930 
07:46:30.955565 4760 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-rbb7q"] Sep 30 07:46:30 crc kubenswrapper[4760]: I0930 07:46:30.961410 4760 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-rbb7q"] Sep 30 07:46:30 crc kubenswrapper[4760]: I0930 07:46:30.969577 4760 scope.go:117] "RemoveContainer" containerID="879daea9d9d10d3f476785f7d3bccfa6c36f2f6461d0271284520843c3a02d05" Sep 30 07:46:30 crc kubenswrapper[4760]: I0930 07:46:30.993177 4760 scope.go:117] "RemoveContainer" containerID="d0c6678f0f1884d74cab596a35dcbd5260c80714017afd1429cb1d99edb9b160" Sep 30 07:46:30 crc kubenswrapper[4760]: E0930 07:46:30.993869 4760 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d0c6678f0f1884d74cab596a35dcbd5260c80714017afd1429cb1d99edb9b160\": container with ID starting with d0c6678f0f1884d74cab596a35dcbd5260c80714017afd1429cb1d99edb9b160 not found: ID does not exist" containerID="d0c6678f0f1884d74cab596a35dcbd5260c80714017afd1429cb1d99edb9b160" Sep 30 07:46:30 crc kubenswrapper[4760]: I0930 07:46:30.993990 4760 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d0c6678f0f1884d74cab596a35dcbd5260c80714017afd1429cb1d99edb9b160"} err="failed to get container status \"d0c6678f0f1884d74cab596a35dcbd5260c80714017afd1429cb1d99edb9b160\": rpc error: code = NotFound desc = could not find container \"d0c6678f0f1884d74cab596a35dcbd5260c80714017afd1429cb1d99edb9b160\": container with ID starting with d0c6678f0f1884d74cab596a35dcbd5260c80714017afd1429cb1d99edb9b160 not found: ID does not exist" Sep 30 07:46:30 crc kubenswrapper[4760]: I0930 07:46:30.994114 4760 scope.go:117] "RemoveContainer" containerID="26da0995f1c952c6ac201d8beb2efccf05afcddf7eed19fd8dac3066f86e573c" Sep 30 07:46:30 crc kubenswrapper[4760]: E0930 07:46:30.994588 4760 log.go:32] "ContainerStatus from 
runtime service failed" err="rpc error: code = NotFound desc = could not find container \"26da0995f1c952c6ac201d8beb2efccf05afcddf7eed19fd8dac3066f86e573c\": container with ID starting with 26da0995f1c952c6ac201d8beb2efccf05afcddf7eed19fd8dac3066f86e573c not found: ID does not exist" containerID="26da0995f1c952c6ac201d8beb2efccf05afcddf7eed19fd8dac3066f86e573c" Sep 30 07:46:30 crc kubenswrapper[4760]: I0930 07:46:30.994615 4760 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"26da0995f1c952c6ac201d8beb2efccf05afcddf7eed19fd8dac3066f86e573c"} err="failed to get container status \"26da0995f1c952c6ac201d8beb2efccf05afcddf7eed19fd8dac3066f86e573c\": rpc error: code = NotFound desc = could not find container \"26da0995f1c952c6ac201d8beb2efccf05afcddf7eed19fd8dac3066f86e573c\": container with ID starting with 26da0995f1c952c6ac201d8beb2efccf05afcddf7eed19fd8dac3066f86e573c not found: ID does not exist" Sep 30 07:46:30 crc kubenswrapper[4760]: I0930 07:46:30.994634 4760 scope.go:117] "RemoveContainer" containerID="879daea9d9d10d3f476785f7d3bccfa6c36f2f6461d0271284520843c3a02d05" Sep 30 07:46:30 crc kubenswrapper[4760]: E0930 07:46:30.995171 4760 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"879daea9d9d10d3f476785f7d3bccfa6c36f2f6461d0271284520843c3a02d05\": container with ID starting with 879daea9d9d10d3f476785f7d3bccfa6c36f2f6461d0271284520843c3a02d05 not found: ID does not exist" containerID="879daea9d9d10d3f476785f7d3bccfa6c36f2f6461d0271284520843c3a02d05" Sep 30 07:46:30 crc kubenswrapper[4760]: I0930 07:46:30.995199 4760 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"879daea9d9d10d3f476785f7d3bccfa6c36f2f6461d0271284520843c3a02d05"} err="failed to get container status \"879daea9d9d10d3f476785f7d3bccfa6c36f2f6461d0271284520843c3a02d05\": rpc error: code = NotFound desc = could not find container 
\"879daea9d9d10d3f476785f7d3bccfa6c36f2f6461d0271284520843c3a02d05\": container with ID starting with 879daea9d9d10d3f476785f7d3bccfa6c36f2f6461d0271284520843c3a02d05 not found: ID does not exist" Sep 30 07:46:31 crc kubenswrapper[4760]: I0930 07:46:31.029891 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1902c3f9-4bf7-428b-a316-6e17ea02e678-catalog-content\") pod \"certified-operators-4jm6v\" (UID: \"1902c3f9-4bf7-428b-a316-6e17ea02e678\") " pod="openshift-marketplace/certified-operators-4jm6v" Sep 30 07:46:31 crc kubenswrapper[4760]: I0930 07:46:31.030292 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1902c3f9-4bf7-428b-a316-6e17ea02e678-catalog-content\") pod \"certified-operators-4jm6v\" (UID: \"1902c3f9-4bf7-428b-a316-6e17ea02e678\") " pod="openshift-marketplace/certified-operators-4jm6v" Sep 30 07:46:31 crc kubenswrapper[4760]: I0930 07:46:31.030425 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bd5cm\" (UniqueName: \"kubernetes.io/projected/1902c3f9-4bf7-428b-a316-6e17ea02e678-kube-api-access-bd5cm\") pod \"certified-operators-4jm6v\" (UID: \"1902c3f9-4bf7-428b-a316-6e17ea02e678\") " pod="openshift-marketplace/certified-operators-4jm6v" Sep 30 07:46:31 crc kubenswrapper[4760]: I0930 07:46:31.030554 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1902c3f9-4bf7-428b-a316-6e17ea02e678-utilities\") pod \"certified-operators-4jm6v\" (UID: \"1902c3f9-4bf7-428b-a316-6e17ea02e678\") " pod="openshift-marketplace/certified-operators-4jm6v" Sep 30 07:46:31 crc kubenswrapper[4760]: I0930 07:46:31.030874 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/1902c3f9-4bf7-428b-a316-6e17ea02e678-utilities\") pod \"certified-operators-4jm6v\" (UID: \"1902c3f9-4bf7-428b-a316-6e17ea02e678\") " pod="openshift-marketplace/certified-operators-4jm6v" Sep 30 07:46:31 crc kubenswrapper[4760]: I0930 07:46:31.048106 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bd5cm\" (UniqueName: \"kubernetes.io/projected/1902c3f9-4bf7-428b-a316-6e17ea02e678-kube-api-access-bd5cm\") pod \"certified-operators-4jm6v\" (UID: \"1902c3f9-4bf7-428b-a316-6e17ea02e678\") " pod="openshift-marketplace/certified-operators-4jm6v" Sep 30 07:46:31 crc kubenswrapper[4760]: I0930 07:46:31.074223 4760 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="efb37104-b575-43d3-9cc5-84f022a1c02a" path="/var/lib/kubelet/pods/efb37104-b575-43d3-9cc5-84f022a1c02a/volumes" Sep 30 07:46:31 crc kubenswrapper[4760]: I0930 07:46:31.088459 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-4jm6v" Sep 30 07:46:31 crc kubenswrapper[4760]: I0930 07:46:31.546805 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-4jm6v"] Sep 30 07:46:31 crc kubenswrapper[4760]: W0930 07:46:31.550906 4760 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1902c3f9_4bf7_428b_a316_6e17ea02e678.slice/crio-afca7bd4f2635368a749e0f1d0614b3425d46d0652f6f3e6a715edc8869d1fe1 WatchSource:0}: Error finding container afca7bd4f2635368a749e0f1d0614b3425d46d0652f6f3e6a715edc8869d1fe1: Status 404 returned error can't find the container with id afca7bd4f2635368a749e0f1d0614b3425d46d0652f6f3e6a715edc8869d1fe1 Sep 30 07:46:31 crc kubenswrapper[4760]: I0930 07:46:31.927047 4760 generic.go:334] "Generic (PLEG): container finished" podID="1902c3f9-4bf7-428b-a316-6e17ea02e678" 
containerID="74e47b3172cfc3413868fff4cb630f7ca41199389e6dfde08ebf6da7130770a0" exitCode=0 Sep 30 07:46:31 crc kubenswrapper[4760]: I0930 07:46:31.927091 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4jm6v" event={"ID":"1902c3f9-4bf7-428b-a316-6e17ea02e678","Type":"ContainerDied","Data":"74e47b3172cfc3413868fff4cb630f7ca41199389e6dfde08ebf6da7130770a0"} Sep 30 07:46:31 crc kubenswrapper[4760]: I0930 07:46:31.927114 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4jm6v" event={"ID":"1902c3f9-4bf7-428b-a316-6e17ea02e678","Type":"ContainerStarted","Data":"afca7bd4f2635368a749e0f1d0614b3425d46d0652f6f3e6a715edc8869d1fe1"} Sep 30 07:46:32 crc kubenswrapper[4760]: I0930 07:46:32.934969 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4jm6v" event={"ID":"1902c3f9-4bf7-428b-a316-6e17ea02e678","Type":"ContainerStarted","Data":"b97d3026d0bd0f23c2d47cd3406736eee73643d0f06e750320f855bc58e897c3"} Sep 30 07:46:33 crc kubenswrapper[4760]: I0930 07:46:33.947696 4760 generic.go:334] "Generic (PLEG): container finished" podID="1902c3f9-4bf7-428b-a316-6e17ea02e678" containerID="b97d3026d0bd0f23c2d47cd3406736eee73643d0f06e750320f855bc58e897c3" exitCode=0 Sep 30 07:46:33 crc kubenswrapper[4760]: I0930 07:46:33.947822 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4jm6v" event={"ID":"1902c3f9-4bf7-428b-a316-6e17ea02e678","Type":"ContainerDied","Data":"b97d3026d0bd0f23c2d47cd3406736eee73643d0f06e750320f855bc58e897c3"} Sep 30 07:46:34 crc kubenswrapper[4760]: I0930 07:46:34.959558 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4jm6v" event={"ID":"1902c3f9-4bf7-428b-a316-6e17ea02e678","Type":"ContainerStarted","Data":"7b10e56f05ac9c8f2ed07f53d2447797ca76ff8eccbbb5570388acc5c5badf05"} Sep 30 07:46:34 crc 
kubenswrapper[4760]: I0930 07:46:34.978426 4760 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-4jm6v" podStartSLOduration=2.524637956 podStartE2EDuration="4.978407016s" podCreationTimestamp="2025-09-30 07:46:30 +0000 UTC" firstStartedPulling="2025-09-30 07:46:31.929170432 +0000 UTC m=+777.572076844" lastFinishedPulling="2025-09-30 07:46:34.382939482 +0000 UTC m=+780.025845904" observedRunningTime="2025-09-30 07:46:34.977049612 +0000 UTC m=+780.619956024" watchObservedRunningTime="2025-09-30 07:46:34.978407016 +0000 UTC m=+780.621313428" Sep 30 07:46:39 crc kubenswrapper[4760]: I0930 07:46:39.021116 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-controller-manager-7db464cf7c-k5lfr" Sep 30 07:46:39 crc kubenswrapper[4760]: I0930 07:46:39.680744 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-tqbpr"] Sep 30 07:46:39 crc kubenswrapper[4760]: I0930 07:46:39.684083 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-webhook-server-5478bdb765-69f49"] Sep 30 07:46:39 crc kubenswrapper[4760]: I0930 07:46:39.684269 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-tqbpr" Sep 30 07:46:39 crc kubenswrapper[4760]: I0930 07:46:39.684912 4760 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-5478bdb765-69f49" Sep 30 07:46:39 crc kubenswrapper[4760]: I0930 07:46:39.688861 4760 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-webhook-server-cert" Sep 30 07:46:39 crc kubenswrapper[4760]: I0930 07:46:39.688903 4760 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-daemon-dockercfg-q2srp" Sep 30 07:46:39 crc kubenswrapper[4760]: I0930 07:46:39.689029 4760 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-certs-secret" Sep 30 07:46:39 crc kubenswrapper[4760]: I0930 07:46:39.689095 4760 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"frr-startup" Sep 30 07:46:39 crc kubenswrapper[4760]: I0930 07:46:39.701821 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-5478bdb765-69f49"] Sep 30 07:46:39 crc kubenswrapper[4760]: I0930 07:46:39.753031 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/speaker-ldvzg"] Sep 30 07:46:39 crc kubenswrapper[4760]: I0930 07:46:39.754056 4760 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/speaker-ldvzg" Sep 30 07:46:39 crc kubenswrapper[4760]: I0930 07:46:39.756588 4760 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-memberlist" Sep 30 07:46:39 crc kubenswrapper[4760]: I0930 07:46:39.756610 4760 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"metallb-excludel2" Sep 30 07:46:39 crc kubenswrapper[4760]: I0930 07:46:39.756607 4760 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-certs-secret" Sep 30 07:46:39 crc kubenswrapper[4760]: I0930 07:46:39.756728 4760 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-dockercfg-z7w8l" Sep 30 07:46:39 crc kubenswrapper[4760]: I0930 07:46:39.757930 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/213dbc76-320c-463f-8133-946be9ece565-metrics\") pod \"frr-k8s-tqbpr\" (UID: \"213dbc76-320c-463f-8133-946be9ece565\") " pod="metallb-system/frr-k8s-tqbpr" Sep 30 07:46:39 crc kubenswrapper[4760]: I0930 07:46:39.758031 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/e89684e7-e01f-4427-9479-999c5f101902-memberlist\") pod \"speaker-ldvzg\" (UID: \"e89684e7-e01f-4427-9479-999c5f101902\") " pod="metallb-system/speaker-ldvzg" Sep 30 07:46:39 crc kubenswrapper[4760]: I0930 07:46:39.758102 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/213dbc76-320c-463f-8133-946be9ece565-metrics-certs\") pod \"frr-k8s-tqbpr\" (UID: \"213dbc76-320c-463f-8133-946be9ece565\") " pod="metallb-system/frr-k8s-tqbpr" Sep 30 07:46:39 crc kubenswrapper[4760]: I0930 07:46:39.758130 4760 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ffa35c7d-6788-4902-8863-7346389154cd-cert\") pod \"frr-k8s-webhook-server-5478bdb765-69f49\" (UID: \"ffa35c7d-6788-4902-8863-7346389154cd\") " pod="metallb-system/frr-k8s-webhook-server-5478bdb765-69f49" Sep 30 07:46:39 crc kubenswrapper[4760]: I0930 07:46:39.758185 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/e89684e7-e01f-4427-9479-999c5f101902-metallb-excludel2\") pod \"speaker-ldvzg\" (UID: \"e89684e7-e01f-4427-9479-999c5f101902\") " pod="metallb-system/speaker-ldvzg" Sep 30 07:46:39 crc kubenswrapper[4760]: I0930 07:46:39.758211 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/213dbc76-320c-463f-8133-946be9ece565-reloader\") pod \"frr-k8s-tqbpr\" (UID: \"213dbc76-320c-463f-8133-946be9ece565\") " pod="metallb-system/frr-k8s-tqbpr" Sep 30 07:46:39 crc kubenswrapper[4760]: I0930 07:46:39.758279 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/213dbc76-320c-463f-8133-946be9ece565-frr-startup\") pod \"frr-k8s-tqbpr\" (UID: \"213dbc76-320c-463f-8133-946be9ece565\") " pod="metallb-system/frr-k8s-tqbpr" Sep 30 07:46:39 crc kubenswrapper[4760]: I0930 07:46:39.758358 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dspn7\" (UniqueName: \"kubernetes.io/projected/213dbc76-320c-463f-8133-946be9ece565-kube-api-access-dspn7\") pod \"frr-k8s-tqbpr\" (UID: \"213dbc76-320c-463f-8133-946be9ece565\") " pod="metallb-system/frr-k8s-tqbpr" Sep 30 07:46:39 crc kubenswrapper[4760]: I0930 07:46:39.758412 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/213dbc76-320c-463f-8133-946be9ece565-frr-sockets\") pod \"frr-k8s-tqbpr\" (UID: \"213dbc76-320c-463f-8133-946be9ece565\") " pod="metallb-system/frr-k8s-tqbpr" Sep 30 07:46:39 crc kubenswrapper[4760]: I0930 07:46:39.758466 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lmj4q\" (UniqueName: \"kubernetes.io/projected/e89684e7-e01f-4427-9479-999c5f101902-kube-api-access-lmj4q\") pod \"speaker-ldvzg\" (UID: \"e89684e7-e01f-4427-9479-999c5f101902\") " pod="metallb-system/speaker-ldvzg" Sep 30 07:46:39 crc kubenswrapper[4760]: I0930 07:46:39.758527 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-svmld\" (UniqueName: \"kubernetes.io/projected/ffa35c7d-6788-4902-8863-7346389154cd-kube-api-access-svmld\") pod \"frr-k8s-webhook-server-5478bdb765-69f49\" (UID: \"ffa35c7d-6788-4902-8863-7346389154cd\") " pod="metallb-system/frr-k8s-webhook-server-5478bdb765-69f49" Sep 30 07:46:39 crc kubenswrapper[4760]: I0930 07:46:39.758562 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e89684e7-e01f-4427-9479-999c5f101902-metrics-certs\") pod \"speaker-ldvzg\" (UID: \"e89684e7-e01f-4427-9479-999c5f101902\") " pod="metallb-system/speaker-ldvzg" Sep 30 07:46:39 crc kubenswrapper[4760]: I0930 07:46:39.758616 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/213dbc76-320c-463f-8133-946be9ece565-frr-conf\") pod \"frr-k8s-tqbpr\" (UID: \"213dbc76-320c-463f-8133-946be9ece565\") " pod="metallb-system/frr-k8s-tqbpr" Sep 30 07:46:39 crc kubenswrapper[4760]: I0930 07:46:39.772450 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/controller-5d688f5ffc-ldwsd"] Sep 30 
07:46:39 crc kubenswrapper[4760]: I0930 07:46:39.773651 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/controller-5d688f5ffc-ldwsd" Sep 30 07:46:39 crc kubenswrapper[4760]: I0930 07:46:39.775933 4760 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-certs-secret" Sep 30 07:46:39 crc kubenswrapper[4760]: I0930 07:46:39.785716 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-5d688f5ffc-ldwsd"] Sep 30 07:46:39 crc kubenswrapper[4760]: I0930 07:46:39.859390 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/213dbc76-320c-463f-8133-946be9ece565-metrics\") pod \"frr-k8s-tqbpr\" (UID: \"213dbc76-320c-463f-8133-946be9ece565\") " pod="metallb-system/frr-k8s-tqbpr" Sep 30 07:46:39 crc kubenswrapper[4760]: I0930 07:46:39.859441 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/e89684e7-e01f-4427-9479-999c5f101902-memberlist\") pod \"speaker-ldvzg\" (UID: \"e89684e7-e01f-4427-9479-999c5f101902\") " pod="metallb-system/speaker-ldvzg" Sep 30 07:46:39 crc kubenswrapper[4760]: I0930 07:46:39.859465 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/213dbc76-320c-463f-8133-946be9ece565-metrics-certs\") pod \"frr-k8s-tqbpr\" (UID: \"213dbc76-320c-463f-8133-946be9ece565\") " pod="metallb-system/frr-k8s-tqbpr" Sep 30 07:46:39 crc kubenswrapper[4760]: I0930 07:46:39.859497 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ffa35c7d-6788-4902-8863-7346389154cd-cert\") pod \"frr-k8s-webhook-server-5478bdb765-69f49\" (UID: \"ffa35c7d-6788-4902-8863-7346389154cd\") " pod="metallb-system/frr-k8s-webhook-server-5478bdb765-69f49" Sep 30 07:46:39 crc 
kubenswrapper[4760]: I0930 07:46:39.859515 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/e89684e7-e01f-4427-9479-999c5f101902-metallb-excludel2\") pod \"speaker-ldvzg\" (UID: \"e89684e7-e01f-4427-9479-999c5f101902\") " pod="metallb-system/speaker-ldvzg" Sep 30 07:46:39 crc kubenswrapper[4760]: I0930 07:46:39.859529 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/213dbc76-320c-463f-8133-946be9ece565-reloader\") pod \"frr-k8s-tqbpr\" (UID: \"213dbc76-320c-463f-8133-946be9ece565\") " pod="metallb-system/frr-k8s-tqbpr" Sep 30 07:46:39 crc kubenswrapper[4760]: I0930 07:46:39.859547 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/670bf4cb-7ea4-4ffb-af92-0f727878a518-metrics-certs\") pod \"controller-5d688f5ffc-ldwsd\" (UID: \"670bf4cb-7ea4-4ffb-af92-0f727878a518\") " pod="metallb-system/controller-5d688f5ffc-ldwsd" Sep 30 07:46:39 crc kubenswrapper[4760]: I0930 07:46:39.859571 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/213dbc76-320c-463f-8133-946be9ece565-frr-startup\") pod \"frr-k8s-tqbpr\" (UID: \"213dbc76-320c-463f-8133-946be9ece565\") " pod="metallb-system/frr-k8s-tqbpr" Sep 30 07:46:39 crc kubenswrapper[4760]: E0930 07:46:39.859582 4760 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Sep 30 07:46:39 crc kubenswrapper[4760]: I0930 07:46:39.859598 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dspn7\" (UniqueName: \"kubernetes.io/projected/213dbc76-320c-463f-8133-946be9ece565-kube-api-access-dspn7\") pod \"frr-k8s-tqbpr\" (UID: \"213dbc76-320c-463f-8133-946be9ece565\") " 
pod="metallb-system/frr-k8s-tqbpr" Sep 30 07:46:39 crc kubenswrapper[4760]: I0930 07:46:39.859623 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/213dbc76-320c-463f-8133-946be9ece565-frr-sockets\") pod \"frr-k8s-tqbpr\" (UID: \"213dbc76-320c-463f-8133-946be9ece565\") " pod="metallb-system/frr-k8s-tqbpr" Sep 30 07:46:39 crc kubenswrapper[4760]: E0930 07:46:39.859640 4760 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e89684e7-e01f-4427-9479-999c5f101902-memberlist podName:e89684e7-e01f-4427-9479-999c5f101902 nodeName:}" failed. No retries permitted until 2025-09-30 07:46:40.359621931 +0000 UTC m=+786.002528343 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/e89684e7-e01f-4427-9479-999c5f101902-memberlist") pod "speaker-ldvzg" (UID: "e89684e7-e01f-4427-9479-999c5f101902") : secret "metallb-memberlist" not found Sep 30 07:46:39 crc kubenswrapper[4760]: I0930 07:46:39.859657 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/670bf4cb-7ea4-4ffb-af92-0f727878a518-cert\") pod \"controller-5d688f5ffc-ldwsd\" (UID: \"670bf4cb-7ea4-4ffb-af92-0f727878a518\") " pod="metallb-system/controller-5d688f5ffc-ldwsd" Sep 30 07:46:39 crc kubenswrapper[4760]: I0930 07:46:39.859679 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lmj4q\" (UniqueName: \"kubernetes.io/projected/e89684e7-e01f-4427-9479-999c5f101902-kube-api-access-lmj4q\") pod \"speaker-ldvzg\" (UID: \"e89684e7-e01f-4427-9479-999c5f101902\") " pod="metallb-system/speaker-ldvzg" Sep 30 07:46:39 crc kubenswrapper[4760]: I0930 07:46:39.859708 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-svmld\" (UniqueName: 
\"kubernetes.io/projected/ffa35c7d-6788-4902-8863-7346389154cd-kube-api-access-svmld\") pod \"frr-k8s-webhook-server-5478bdb765-69f49\" (UID: \"ffa35c7d-6788-4902-8863-7346389154cd\") " pod="metallb-system/frr-k8s-webhook-server-5478bdb765-69f49" Sep 30 07:46:39 crc kubenswrapper[4760]: I0930 07:46:39.859738 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e89684e7-e01f-4427-9479-999c5f101902-metrics-certs\") pod \"speaker-ldvzg\" (UID: \"e89684e7-e01f-4427-9479-999c5f101902\") " pod="metallb-system/speaker-ldvzg" Sep 30 07:46:39 crc kubenswrapper[4760]: I0930 07:46:39.859756 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/213dbc76-320c-463f-8133-946be9ece565-frr-conf\") pod \"frr-k8s-tqbpr\" (UID: \"213dbc76-320c-463f-8133-946be9ece565\") " pod="metallb-system/frr-k8s-tqbpr" Sep 30 07:46:39 crc kubenswrapper[4760]: I0930 07:46:39.859787 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ctmrz\" (UniqueName: \"kubernetes.io/projected/670bf4cb-7ea4-4ffb-af92-0f727878a518-kube-api-access-ctmrz\") pod \"controller-5d688f5ffc-ldwsd\" (UID: \"670bf4cb-7ea4-4ffb-af92-0f727878a518\") " pod="metallb-system/controller-5d688f5ffc-ldwsd" Sep 30 07:46:39 crc kubenswrapper[4760]: I0930 07:46:39.860140 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/213dbc76-320c-463f-8133-946be9ece565-metrics\") pod \"frr-k8s-tqbpr\" (UID: \"213dbc76-320c-463f-8133-946be9ece565\") " pod="metallb-system/frr-k8s-tqbpr" Sep 30 07:46:39 crc kubenswrapper[4760]: I0930 07:46:39.860158 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/213dbc76-320c-463f-8133-946be9ece565-reloader\") pod \"frr-k8s-tqbpr\" (UID: 
\"213dbc76-320c-463f-8133-946be9ece565\") " pod="metallb-system/frr-k8s-tqbpr" Sep 30 07:46:39 crc kubenswrapper[4760]: I0930 07:46:39.860492 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/213dbc76-320c-463f-8133-946be9ece565-frr-sockets\") pod \"frr-k8s-tqbpr\" (UID: \"213dbc76-320c-463f-8133-946be9ece565\") " pod="metallb-system/frr-k8s-tqbpr" Sep 30 07:46:39 crc kubenswrapper[4760]: I0930 07:46:39.860563 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/e89684e7-e01f-4427-9479-999c5f101902-metallb-excludel2\") pod \"speaker-ldvzg\" (UID: \"e89684e7-e01f-4427-9479-999c5f101902\") " pod="metallb-system/speaker-ldvzg" Sep 30 07:46:39 crc kubenswrapper[4760]: I0930 07:46:39.860640 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/213dbc76-320c-463f-8133-946be9ece565-frr-conf\") pod \"frr-k8s-tqbpr\" (UID: \"213dbc76-320c-463f-8133-946be9ece565\") " pod="metallb-system/frr-k8s-tqbpr" Sep 30 07:46:39 crc kubenswrapper[4760]: I0930 07:46:39.860709 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/213dbc76-320c-463f-8133-946be9ece565-frr-startup\") pod \"frr-k8s-tqbpr\" (UID: \"213dbc76-320c-463f-8133-946be9ece565\") " pod="metallb-system/frr-k8s-tqbpr" Sep 30 07:46:39 crc kubenswrapper[4760]: I0930 07:46:39.865000 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e89684e7-e01f-4427-9479-999c5f101902-metrics-certs\") pod \"speaker-ldvzg\" (UID: \"e89684e7-e01f-4427-9479-999c5f101902\") " pod="metallb-system/speaker-ldvzg" Sep 30 07:46:39 crc kubenswrapper[4760]: I0930 07:46:39.868884 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: 
\"kubernetes.io/secret/ffa35c7d-6788-4902-8863-7346389154cd-cert\") pod \"frr-k8s-webhook-server-5478bdb765-69f49\" (UID: \"ffa35c7d-6788-4902-8863-7346389154cd\") " pod="metallb-system/frr-k8s-webhook-server-5478bdb765-69f49" Sep 30 07:46:39 crc kubenswrapper[4760]: I0930 07:46:39.878700 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/213dbc76-320c-463f-8133-946be9ece565-metrics-certs\") pod \"frr-k8s-tqbpr\" (UID: \"213dbc76-320c-463f-8133-946be9ece565\") " pod="metallb-system/frr-k8s-tqbpr" Sep 30 07:46:39 crc kubenswrapper[4760]: I0930 07:46:39.878747 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lmj4q\" (UniqueName: \"kubernetes.io/projected/e89684e7-e01f-4427-9479-999c5f101902-kube-api-access-lmj4q\") pod \"speaker-ldvzg\" (UID: \"e89684e7-e01f-4427-9479-999c5f101902\") " pod="metallb-system/speaker-ldvzg" Sep 30 07:46:39 crc kubenswrapper[4760]: I0930 07:46:39.885761 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dspn7\" (UniqueName: \"kubernetes.io/projected/213dbc76-320c-463f-8133-946be9ece565-kube-api-access-dspn7\") pod \"frr-k8s-tqbpr\" (UID: \"213dbc76-320c-463f-8133-946be9ece565\") " pod="metallb-system/frr-k8s-tqbpr" Sep 30 07:46:39 crc kubenswrapper[4760]: I0930 07:46:39.890011 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-svmld\" (UniqueName: \"kubernetes.io/projected/ffa35c7d-6788-4902-8863-7346389154cd-kube-api-access-svmld\") pod \"frr-k8s-webhook-server-5478bdb765-69f49\" (UID: \"ffa35c7d-6788-4902-8863-7346389154cd\") " pod="metallb-system/frr-k8s-webhook-server-5478bdb765-69f49" Sep 30 07:46:39 crc kubenswrapper[4760]: I0930 07:46:39.960657 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ctmrz\" (UniqueName: 
\"kubernetes.io/projected/670bf4cb-7ea4-4ffb-af92-0f727878a518-kube-api-access-ctmrz\") pod \"controller-5d688f5ffc-ldwsd\" (UID: \"670bf4cb-7ea4-4ffb-af92-0f727878a518\") " pod="metallb-system/controller-5d688f5ffc-ldwsd" Sep 30 07:46:39 crc kubenswrapper[4760]: I0930 07:46:39.960749 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/670bf4cb-7ea4-4ffb-af92-0f727878a518-metrics-certs\") pod \"controller-5d688f5ffc-ldwsd\" (UID: \"670bf4cb-7ea4-4ffb-af92-0f727878a518\") " pod="metallb-system/controller-5d688f5ffc-ldwsd" Sep 30 07:46:39 crc kubenswrapper[4760]: I0930 07:46:39.960803 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/670bf4cb-7ea4-4ffb-af92-0f727878a518-cert\") pod \"controller-5d688f5ffc-ldwsd\" (UID: \"670bf4cb-7ea4-4ffb-af92-0f727878a518\") " pod="metallb-system/controller-5d688f5ffc-ldwsd" Sep 30 07:46:39 crc kubenswrapper[4760]: I0930 07:46:39.962393 4760 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert" Sep 30 07:46:39 crc kubenswrapper[4760]: I0930 07:46:39.963892 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/670bf4cb-7ea4-4ffb-af92-0f727878a518-metrics-certs\") pod \"controller-5d688f5ffc-ldwsd\" (UID: \"670bf4cb-7ea4-4ffb-af92-0f727878a518\") " pod="metallb-system/controller-5d688f5ffc-ldwsd" Sep 30 07:46:39 crc kubenswrapper[4760]: I0930 07:46:39.974624 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/670bf4cb-7ea4-4ffb-af92-0f727878a518-cert\") pod \"controller-5d688f5ffc-ldwsd\" (UID: \"670bf4cb-7ea4-4ffb-af92-0f727878a518\") " pod="metallb-system/controller-5d688f5ffc-ldwsd" Sep 30 07:46:39 crc kubenswrapper[4760]: I0930 07:46:39.978618 4760 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-ctmrz\" (UniqueName: \"kubernetes.io/projected/670bf4cb-7ea4-4ffb-af92-0f727878a518-kube-api-access-ctmrz\") pod \"controller-5d688f5ffc-ldwsd\" (UID: \"670bf4cb-7ea4-4ffb-af92-0f727878a518\") " pod="metallb-system/controller-5d688f5ffc-ldwsd" Sep 30 07:46:40 crc kubenswrapper[4760]: I0930 07:46:40.001775 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-tqbpr" Sep 30 07:46:40 crc kubenswrapper[4760]: I0930 07:46:40.009710 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-5478bdb765-69f49" Sep 30 07:46:40 crc kubenswrapper[4760]: I0930 07:46:40.086799 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/controller-5d688f5ffc-ldwsd" Sep 30 07:46:40 crc kubenswrapper[4760]: I0930 07:46:40.332333 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-5d688f5ffc-ldwsd"] Sep 30 07:46:40 crc kubenswrapper[4760]: W0930 07:46:40.334328 4760 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod670bf4cb_7ea4_4ffb_af92_0f727878a518.slice/crio-613e1404f2b7b236a6394950c3139089a7bf0158a262afecaccd0055483b612b WatchSource:0}: Error finding container 613e1404f2b7b236a6394950c3139089a7bf0158a262afecaccd0055483b612b: Status 404 returned error can't find the container with id 613e1404f2b7b236a6394950c3139089a7bf0158a262afecaccd0055483b612b Sep 30 07:46:40 crc kubenswrapper[4760]: I0930 07:46:40.365811 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/e89684e7-e01f-4427-9479-999c5f101902-memberlist\") pod \"speaker-ldvzg\" (UID: \"e89684e7-e01f-4427-9479-999c5f101902\") " pod="metallb-system/speaker-ldvzg" Sep 30 07:46:40 crc kubenswrapper[4760]: E0930 07:46:40.366123 4760 
secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Sep 30 07:46:40 crc kubenswrapper[4760]: E0930 07:46:40.366213 4760 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e89684e7-e01f-4427-9479-999c5f101902-memberlist podName:e89684e7-e01f-4427-9479-999c5f101902 nodeName:}" failed. No retries permitted until 2025-09-30 07:46:41.366191944 +0000 UTC m=+787.009098356 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/e89684e7-e01f-4427-9479-999c5f101902-memberlist") pod "speaker-ldvzg" (UID: "e89684e7-e01f-4427-9479-999c5f101902") : secret "metallb-memberlist" not found Sep 30 07:46:40 crc kubenswrapper[4760]: I0930 07:46:40.456248 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-5478bdb765-69f49"] Sep 30 07:46:40 crc kubenswrapper[4760]: W0930 07:46:40.462581 4760 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podffa35c7d_6788_4902_8863_7346389154cd.slice/crio-4cb6b9bea789a19e726e6f407e35dc338d3671066c1ade8c1cce98ab2f40c066 WatchSource:0}: Error finding container 4cb6b9bea789a19e726e6f407e35dc338d3671066c1ade8c1cce98ab2f40c066: Status 404 returned error can't find the container with id 4cb6b9bea789a19e726e6f407e35dc338d3671066c1ade8c1cce98ab2f40c066 Sep 30 07:46:41 crc kubenswrapper[4760]: I0930 07:46:41.001188 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-5478bdb765-69f49" event={"ID":"ffa35c7d-6788-4902-8863-7346389154cd","Type":"ContainerStarted","Data":"4cb6b9bea789a19e726e6f407e35dc338d3671066c1ade8c1cce98ab2f40c066"} Sep 30 07:46:41 crc kubenswrapper[4760]: I0930 07:46:41.002953 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-tqbpr" 
event={"ID":"213dbc76-320c-463f-8133-946be9ece565","Type":"ContainerStarted","Data":"eea2c68711589706d1b5a766352672366b61d1cae13146249628618f65d86d8d"} Sep 30 07:46:41 crc kubenswrapper[4760]: I0930 07:46:41.005369 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-5d688f5ffc-ldwsd" event={"ID":"670bf4cb-7ea4-4ffb-af92-0f727878a518","Type":"ContainerStarted","Data":"689db26f1803e10f3ec75d8cd9cae2490b91e3866e10bc1e31168e5115ecaa26"} Sep 30 07:46:41 crc kubenswrapper[4760]: I0930 07:46:41.005423 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-5d688f5ffc-ldwsd" event={"ID":"670bf4cb-7ea4-4ffb-af92-0f727878a518","Type":"ContainerStarted","Data":"f7f188800dc10828ef2a5baf0496a9ba4e628936ae571376fbb7cd2bb0876245"} Sep 30 07:46:41 crc kubenswrapper[4760]: I0930 07:46:41.005440 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-5d688f5ffc-ldwsd" event={"ID":"670bf4cb-7ea4-4ffb-af92-0f727878a518","Type":"ContainerStarted","Data":"613e1404f2b7b236a6394950c3139089a7bf0158a262afecaccd0055483b612b"} Sep 30 07:46:41 crc kubenswrapper[4760]: I0930 07:46:41.005603 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/controller-5d688f5ffc-ldwsd" Sep 30 07:46:41 crc kubenswrapper[4760]: I0930 07:46:41.032101 4760 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/controller-5d688f5ffc-ldwsd" podStartSLOduration=2.032072858 podStartE2EDuration="2.032072858s" podCreationTimestamp="2025-09-30 07:46:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 07:46:41.024742611 +0000 UTC m=+786.667649063" watchObservedRunningTime="2025-09-30 07:46:41.032072858 +0000 UTC m=+786.674979310" Sep 30 07:46:41 crc kubenswrapper[4760]: I0930 07:46:41.089712 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-marketplace/certified-operators-4jm6v" Sep 30 07:46:41 crc kubenswrapper[4760]: I0930 07:46:41.089935 4760 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-4jm6v" Sep 30 07:46:41 crc kubenswrapper[4760]: I0930 07:46:41.159930 4760 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-4jm6v" Sep 30 07:46:41 crc kubenswrapper[4760]: I0930 07:46:41.385031 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/e89684e7-e01f-4427-9479-999c5f101902-memberlist\") pod \"speaker-ldvzg\" (UID: \"e89684e7-e01f-4427-9479-999c5f101902\") " pod="metallb-system/speaker-ldvzg" Sep 30 07:46:41 crc kubenswrapper[4760]: I0930 07:46:41.396086 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/e89684e7-e01f-4427-9479-999c5f101902-memberlist\") pod \"speaker-ldvzg\" (UID: \"e89684e7-e01f-4427-9479-999c5f101902\") " pod="metallb-system/speaker-ldvzg" Sep 30 07:46:41 crc kubenswrapper[4760]: I0930 07:46:41.573198 4760 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/speaker-ldvzg" Sep 30 07:46:42 crc kubenswrapper[4760]: I0930 07:46:42.018327 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-ldvzg" event={"ID":"e89684e7-e01f-4427-9479-999c5f101902","Type":"ContainerStarted","Data":"4e922a481d15a7c29c25afcc49e9cf5d84a99144b6f55365dee4c049e3e478ca"} Sep 30 07:46:42 crc kubenswrapper[4760]: I0930 07:46:42.018393 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-ldvzg" event={"ID":"e89684e7-e01f-4427-9479-999c5f101902","Type":"ContainerStarted","Data":"e1d780af853d187b35d8f8fe2ef789d8910fa7affb3c1049331e59c4b8802957"} Sep 30 07:46:42 crc kubenswrapper[4760]: I0930 07:46:42.152224 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-4jm6v" Sep 30 07:46:42 crc kubenswrapper[4760]: I0930 07:46:42.207601 4760 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-4jm6v"] Sep 30 07:46:43 crc kubenswrapper[4760]: I0930 07:46:43.026484 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-ldvzg" event={"ID":"e89684e7-e01f-4427-9479-999c5f101902","Type":"ContainerStarted","Data":"232494a6b7a06c88368d3da7d12f9af5709343287959cb99e0b6326be56fa242"} Sep 30 07:46:43 crc kubenswrapper[4760]: I0930 07:46:43.055392 4760 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/speaker-ldvzg" podStartSLOduration=4.055371924 podStartE2EDuration="4.055371924s" podCreationTimestamp="2025-09-30 07:46:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 07:46:43.046411935 +0000 UTC m=+788.689318347" watchObservedRunningTime="2025-09-30 07:46:43.055371924 +0000 UTC m=+788.698278336" Sep 30 07:46:44 crc kubenswrapper[4760]: I0930 07:46:44.049320 4760 kubelet.go:2542] "SyncLoop 
(probe)" probe="readiness" status="" pod="metallb-system/speaker-ldvzg" Sep 30 07:46:44 crc kubenswrapper[4760]: I0930 07:46:44.049431 4760 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-4jm6v" podUID="1902c3f9-4bf7-428b-a316-6e17ea02e678" containerName="registry-server" containerID="cri-o://7b10e56f05ac9c8f2ed07f53d2447797ca76ff8eccbbb5570388acc5c5badf05" gracePeriod=2 Sep 30 07:46:45 crc kubenswrapper[4760]: I0930 07:46:45.024536 4760 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-4jm6v" Sep 30 07:46:45 crc kubenswrapper[4760]: I0930 07:46:45.058273 4760 generic.go:334] "Generic (PLEG): container finished" podID="1902c3f9-4bf7-428b-a316-6e17ea02e678" containerID="7b10e56f05ac9c8f2ed07f53d2447797ca76ff8eccbbb5570388acc5c5badf05" exitCode=0 Sep 30 07:46:45 crc kubenswrapper[4760]: I0930 07:46:45.058610 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4jm6v" event={"ID":"1902c3f9-4bf7-428b-a316-6e17ea02e678","Type":"ContainerDied","Data":"7b10e56f05ac9c8f2ed07f53d2447797ca76ff8eccbbb5570388acc5c5badf05"} Sep 30 07:46:45 crc kubenswrapper[4760]: I0930 07:46:45.058662 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4jm6v" event={"ID":"1902c3f9-4bf7-428b-a316-6e17ea02e678","Type":"ContainerDied","Data":"afca7bd4f2635368a749e0f1d0614b3425d46d0652f6f3e6a715edc8869d1fe1"} Sep 30 07:46:45 crc kubenswrapper[4760]: I0930 07:46:45.058685 4760 scope.go:117] "RemoveContainer" containerID="7b10e56f05ac9c8f2ed07f53d2447797ca76ff8eccbbb5570388acc5c5badf05" Sep 30 07:46:45 crc kubenswrapper[4760]: I0930 07:46:45.058620 4760 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-4jm6v" Sep 30 07:46:45 crc kubenswrapper[4760]: I0930 07:46:45.088867 4760 scope.go:117] "RemoveContainer" containerID="b97d3026d0bd0f23c2d47cd3406736eee73643d0f06e750320f855bc58e897c3" Sep 30 07:46:45 crc kubenswrapper[4760]: I0930 07:46:45.109691 4760 scope.go:117] "RemoveContainer" containerID="74e47b3172cfc3413868fff4cb630f7ca41199389e6dfde08ebf6da7130770a0" Sep 30 07:46:45 crc kubenswrapper[4760]: I0930 07:46:45.134536 4760 scope.go:117] "RemoveContainer" containerID="7b10e56f05ac9c8f2ed07f53d2447797ca76ff8eccbbb5570388acc5c5badf05" Sep 30 07:46:45 crc kubenswrapper[4760]: E0930 07:46:45.135206 4760 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7b10e56f05ac9c8f2ed07f53d2447797ca76ff8eccbbb5570388acc5c5badf05\": container with ID starting with 7b10e56f05ac9c8f2ed07f53d2447797ca76ff8eccbbb5570388acc5c5badf05 not found: ID does not exist" containerID="7b10e56f05ac9c8f2ed07f53d2447797ca76ff8eccbbb5570388acc5c5badf05" Sep 30 07:46:45 crc kubenswrapper[4760]: I0930 07:46:45.135265 4760 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7b10e56f05ac9c8f2ed07f53d2447797ca76ff8eccbbb5570388acc5c5badf05"} err="failed to get container status \"7b10e56f05ac9c8f2ed07f53d2447797ca76ff8eccbbb5570388acc5c5badf05\": rpc error: code = NotFound desc = could not find container \"7b10e56f05ac9c8f2ed07f53d2447797ca76ff8eccbbb5570388acc5c5badf05\": container with ID starting with 7b10e56f05ac9c8f2ed07f53d2447797ca76ff8eccbbb5570388acc5c5badf05 not found: ID does not exist" Sep 30 07:46:45 crc kubenswrapper[4760]: I0930 07:46:45.135285 4760 scope.go:117] "RemoveContainer" containerID="b97d3026d0bd0f23c2d47cd3406736eee73643d0f06e750320f855bc58e897c3" Sep 30 07:46:45 crc kubenswrapper[4760]: E0930 07:46:45.135715 4760 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code 
= NotFound desc = could not find container \"b97d3026d0bd0f23c2d47cd3406736eee73643d0f06e750320f855bc58e897c3\": container with ID starting with b97d3026d0bd0f23c2d47cd3406736eee73643d0f06e750320f855bc58e897c3 not found: ID does not exist" containerID="b97d3026d0bd0f23c2d47cd3406736eee73643d0f06e750320f855bc58e897c3" Sep 30 07:46:45 crc kubenswrapper[4760]: I0930 07:46:45.135754 4760 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b97d3026d0bd0f23c2d47cd3406736eee73643d0f06e750320f855bc58e897c3"} err="failed to get container status \"b97d3026d0bd0f23c2d47cd3406736eee73643d0f06e750320f855bc58e897c3\": rpc error: code = NotFound desc = could not find container \"b97d3026d0bd0f23c2d47cd3406736eee73643d0f06e750320f855bc58e897c3\": container with ID starting with b97d3026d0bd0f23c2d47cd3406736eee73643d0f06e750320f855bc58e897c3 not found: ID does not exist" Sep 30 07:46:45 crc kubenswrapper[4760]: I0930 07:46:45.135782 4760 scope.go:117] "RemoveContainer" containerID="74e47b3172cfc3413868fff4cb630f7ca41199389e6dfde08ebf6da7130770a0" Sep 30 07:46:45 crc kubenswrapper[4760]: E0930 07:46:45.136217 4760 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"74e47b3172cfc3413868fff4cb630f7ca41199389e6dfde08ebf6da7130770a0\": container with ID starting with 74e47b3172cfc3413868fff4cb630f7ca41199389e6dfde08ebf6da7130770a0 not found: ID does not exist" containerID="74e47b3172cfc3413868fff4cb630f7ca41199389e6dfde08ebf6da7130770a0" Sep 30 07:46:45 crc kubenswrapper[4760]: I0930 07:46:45.136270 4760 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"74e47b3172cfc3413868fff4cb630f7ca41199389e6dfde08ebf6da7130770a0"} err="failed to get container status \"74e47b3172cfc3413868fff4cb630f7ca41199389e6dfde08ebf6da7130770a0\": rpc error: code = NotFound desc = could not find container 
\"74e47b3172cfc3413868fff4cb630f7ca41199389e6dfde08ebf6da7130770a0\": container with ID starting with 74e47b3172cfc3413868fff4cb630f7ca41199389e6dfde08ebf6da7130770a0 not found: ID does not exist" Sep 30 07:46:45 crc kubenswrapper[4760]: I0930 07:46:45.140421 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1902c3f9-4bf7-428b-a316-6e17ea02e678-catalog-content\") pod \"1902c3f9-4bf7-428b-a316-6e17ea02e678\" (UID: \"1902c3f9-4bf7-428b-a316-6e17ea02e678\") " Sep 30 07:46:45 crc kubenswrapper[4760]: I0930 07:46:45.140886 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bd5cm\" (UniqueName: \"kubernetes.io/projected/1902c3f9-4bf7-428b-a316-6e17ea02e678-kube-api-access-bd5cm\") pod \"1902c3f9-4bf7-428b-a316-6e17ea02e678\" (UID: \"1902c3f9-4bf7-428b-a316-6e17ea02e678\") " Sep 30 07:46:45 crc kubenswrapper[4760]: I0930 07:46:45.141553 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1902c3f9-4bf7-428b-a316-6e17ea02e678-utilities\") pod \"1902c3f9-4bf7-428b-a316-6e17ea02e678\" (UID: \"1902c3f9-4bf7-428b-a316-6e17ea02e678\") " Sep 30 07:46:45 crc kubenswrapper[4760]: I0930 07:46:45.142638 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1902c3f9-4bf7-428b-a316-6e17ea02e678-utilities" (OuterVolumeSpecName: "utilities") pod "1902c3f9-4bf7-428b-a316-6e17ea02e678" (UID: "1902c3f9-4bf7-428b-a316-6e17ea02e678"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 07:46:45 crc kubenswrapper[4760]: I0930 07:46:45.148863 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1902c3f9-4bf7-428b-a316-6e17ea02e678-kube-api-access-bd5cm" (OuterVolumeSpecName: "kube-api-access-bd5cm") pod "1902c3f9-4bf7-428b-a316-6e17ea02e678" (UID: "1902c3f9-4bf7-428b-a316-6e17ea02e678"). InnerVolumeSpecName "kube-api-access-bd5cm". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 07:46:45 crc kubenswrapper[4760]: I0930 07:46:45.187169 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1902c3f9-4bf7-428b-a316-6e17ea02e678-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1902c3f9-4bf7-428b-a316-6e17ea02e678" (UID: "1902c3f9-4bf7-428b-a316-6e17ea02e678"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 07:46:45 crc kubenswrapper[4760]: I0930 07:46:45.244060 4760 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bd5cm\" (UniqueName: \"kubernetes.io/projected/1902c3f9-4bf7-428b-a316-6e17ea02e678-kube-api-access-bd5cm\") on node \"crc\" DevicePath \"\"" Sep 30 07:46:45 crc kubenswrapper[4760]: I0930 07:46:45.244098 4760 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1902c3f9-4bf7-428b-a316-6e17ea02e678-utilities\") on node \"crc\" DevicePath \"\"" Sep 30 07:46:45 crc kubenswrapper[4760]: I0930 07:46:45.244109 4760 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1902c3f9-4bf7-428b-a316-6e17ea02e678-catalog-content\") on node \"crc\" DevicePath \"\"" Sep 30 07:46:45 crc kubenswrapper[4760]: I0930 07:46:45.402855 4760 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-4jm6v"] Sep 30 07:46:45 crc kubenswrapper[4760]: I0930 
07:46:45.409027 4760 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-4jm6v"] Sep 30 07:46:47 crc kubenswrapper[4760]: I0930 07:46:47.074128 4760 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1902c3f9-4bf7-428b-a316-6e17ea02e678" path="/var/lib/kubelet/pods/1902c3f9-4bf7-428b-a316-6e17ea02e678/volumes" Sep 30 07:46:48 crc kubenswrapper[4760]: I0930 07:46:48.084271 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-5478bdb765-69f49" event={"ID":"ffa35c7d-6788-4902-8863-7346389154cd","Type":"ContainerStarted","Data":"731e29488037a055749e2feb0a6facf9e78de8703215fcfaaec7760a44c39862"} Sep 30 07:46:48 crc kubenswrapper[4760]: I0930 07:46:48.084581 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-webhook-server-5478bdb765-69f49" Sep 30 07:46:48 crc kubenswrapper[4760]: I0930 07:46:48.088701 4760 generic.go:334] "Generic (PLEG): container finished" podID="213dbc76-320c-463f-8133-946be9ece565" containerID="139725726a2bfe85c1351bf2ca8791ab30a0e4c848284de80d8e48200182629b" exitCode=0 Sep 30 07:46:48 crc kubenswrapper[4760]: I0930 07:46:48.088741 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-tqbpr" event={"ID":"213dbc76-320c-463f-8133-946be9ece565","Type":"ContainerDied","Data":"139725726a2bfe85c1351bf2ca8791ab30a0e4c848284de80d8e48200182629b"} Sep 30 07:46:48 crc kubenswrapper[4760]: I0930 07:46:48.106278 4760 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-webhook-server-5478bdb765-69f49" podStartSLOduration=2.053902436 podStartE2EDuration="9.106258437s" podCreationTimestamp="2025-09-30 07:46:39 +0000 UTC" firstStartedPulling="2025-09-30 07:46:40.465288857 +0000 UTC m=+786.108195269" lastFinishedPulling="2025-09-30 07:46:47.517644818 +0000 UTC m=+793.160551270" observedRunningTime="2025-09-30 07:46:48.104722248 +0000 UTC 
m=+793.747628660" watchObservedRunningTime="2025-09-30 07:46:48.106258437 +0000 UTC m=+793.749164849" Sep 30 07:46:49 crc kubenswrapper[4760]: I0930 07:46:49.101546 4760 generic.go:334] "Generic (PLEG): container finished" podID="213dbc76-320c-463f-8133-946be9ece565" containerID="bf02174c25033a609782ebd9b3439e00d659f36ca7f074a961b2a54b8c93cb27" exitCode=0 Sep 30 07:46:49 crc kubenswrapper[4760]: I0930 07:46:49.101644 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-tqbpr" event={"ID":"213dbc76-320c-463f-8133-946be9ece565","Type":"ContainerDied","Data":"bf02174c25033a609782ebd9b3439e00d659f36ca7f074a961b2a54b8c93cb27"} Sep 30 07:46:50 crc kubenswrapper[4760]: I0930 07:46:50.096410 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/controller-5d688f5ffc-ldwsd" Sep 30 07:46:50 crc kubenswrapper[4760]: I0930 07:46:50.155354 4760 generic.go:334] "Generic (PLEG): container finished" podID="213dbc76-320c-463f-8133-946be9ece565" containerID="139c1fe201f16df0ce15c995a579e9b13c500486108e6a910ef61629bd117ca0" exitCode=0 Sep 30 07:46:50 crc kubenswrapper[4760]: I0930 07:46:50.155404 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-tqbpr" event={"ID":"213dbc76-320c-463f-8133-946be9ece565","Type":"ContainerDied","Data":"139c1fe201f16df0ce15c995a579e9b13c500486108e6a910ef61629bd117ca0"} Sep 30 07:46:51 crc kubenswrapper[4760]: I0930 07:46:51.167887 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-tqbpr" event={"ID":"213dbc76-320c-463f-8133-946be9ece565","Type":"ContainerStarted","Data":"28346582345d3ba32233c06bf798708a9f50c16bfaba8e44cd3a92a0131ff206"} Sep 30 07:46:51 crc kubenswrapper[4760]: I0930 07:46:51.168221 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-tqbpr" 
event={"ID":"213dbc76-320c-463f-8133-946be9ece565","Type":"ContainerStarted","Data":"a5d0463dbfca12a750f000c3bdaf21e09f48086bb9ad9aa7c47bb26aad4e6f86"} Sep 30 07:46:51 crc kubenswrapper[4760]: I0930 07:46:51.168269 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-tqbpr" event={"ID":"213dbc76-320c-463f-8133-946be9ece565","Type":"ContainerStarted","Data":"6fccca24f000673935cd1b55abda4b55ab93e4aca85c821edc5722af1569b62f"} Sep 30 07:46:51 crc kubenswrapper[4760]: I0930 07:46:51.168282 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-tqbpr" event={"ID":"213dbc76-320c-463f-8133-946be9ece565","Type":"ContainerStarted","Data":"7b41719bf66999fdd48fe58937f9fefcd2a7dcefe2c27541cdbc14214eb812f7"} Sep 30 07:46:51 crc kubenswrapper[4760]: I0930 07:46:51.168291 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-tqbpr" event={"ID":"213dbc76-320c-463f-8133-946be9ece565","Type":"ContainerStarted","Data":"cdd74cad50eab9bafb6d0b2892fee3b4280480f6e39e1144862ccd50beb77f42"} Sep 30 07:46:51 crc kubenswrapper[4760]: I0930 07:46:51.577825 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/speaker-ldvzg" Sep 30 07:46:52 crc kubenswrapper[4760]: I0930 07:46:52.181879 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-tqbpr" event={"ID":"213dbc76-320c-463f-8133-946be9ece565","Type":"ContainerStarted","Data":"5bc85e381975ebea744106b0a80f7996b8e805521821021e5702d2b7e632a3bb"} Sep 30 07:46:52 crc kubenswrapper[4760]: I0930 07:46:52.182284 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-tqbpr" Sep 30 07:46:52 crc kubenswrapper[4760]: I0930 07:46:52.222021 4760 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-tqbpr" podStartSLOduration=5.877578823 podStartE2EDuration="13.221999156s" podCreationTimestamp="2025-09-30 07:46:39 +0000 
UTC" firstStartedPulling="2025-09-30 07:46:40.153325746 +0000 UTC m=+785.796232158" lastFinishedPulling="2025-09-30 07:46:47.497746039 +0000 UTC m=+793.140652491" observedRunningTime="2025-09-30 07:46:52.221788521 +0000 UTC m=+797.864694983" watchObservedRunningTime="2025-09-30 07:46:52.221999156 +0000 UTC m=+797.864905578" Sep 30 07:46:54 crc kubenswrapper[4760]: I0930 07:46:54.590499 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-m87c5"] Sep 30 07:46:54 crc kubenswrapper[4760]: E0930 07:46:54.591158 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1902c3f9-4bf7-428b-a316-6e17ea02e678" containerName="extract-utilities" Sep 30 07:46:54 crc kubenswrapper[4760]: I0930 07:46:54.591176 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="1902c3f9-4bf7-428b-a316-6e17ea02e678" containerName="extract-utilities" Sep 30 07:46:54 crc kubenswrapper[4760]: E0930 07:46:54.591189 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1902c3f9-4bf7-428b-a316-6e17ea02e678" containerName="extract-content" Sep 30 07:46:54 crc kubenswrapper[4760]: I0930 07:46:54.591197 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="1902c3f9-4bf7-428b-a316-6e17ea02e678" containerName="extract-content" Sep 30 07:46:54 crc kubenswrapper[4760]: E0930 07:46:54.591206 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1902c3f9-4bf7-428b-a316-6e17ea02e678" containerName="registry-server" Sep 30 07:46:54 crc kubenswrapper[4760]: I0930 07:46:54.591213 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="1902c3f9-4bf7-428b-a316-6e17ea02e678" containerName="registry-server" Sep 30 07:46:54 crc kubenswrapper[4760]: I0930 07:46:54.591389 4760 memory_manager.go:354] "RemoveStaleState removing state" podUID="1902c3f9-4bf7-428b-a316-6e17ea02e678" containerName="registry-server" Sep 30 07:46:54 crc kubenswrapper[4760]: I0930 07:46:54.591909 4760 util.go:30] "No sandbox for pod 
can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-m87c5" Sep 30 07:46:54 crc kubenswrapper[4760]: I0930 07:46:54.596039 4760 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"kube-root-ca.crt" Sep 30 07:46:54 crc kubenswrapper[4760]: I0930 07:46:54.596381 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-index-dockercfg-l8bn5" Sep 30 07:46:54 crc kubenswrapper[4760]: I0930 07:46:54.596533 4760 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"openshift-service-ca.crt" Sep 30 07:46:54 crc kubenswrapper[4760]: I0930 07:46:54.623477 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-m87c5"] Sep 30 07:46:54 crc kubenswrapper[4760]: I0930 07:46:54.774195 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5zwdw\" (UniqueName: \"kubernetes.io/projected/bccce8da-6083-4bba-bdb1-543b356faaa5-kube-api-access-5zwdw\") pod \"openstack-operator-index-m87c5\" (UID: \"bccce8da-6083-4bba-bdb1-543b356faaa5\") " pod="openstack-operators/openstack-operator-index-m87c5" Sep 30 07:46:54 crc kubenswrapper[4760]: I0930 07:46:54.875416 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5zwdw\" (UniqueName: \"kubernetes.io/projected/bccce8da-6083-4bba-bdb1-543b356faaa5-kube-api-access-5zwdw\") pod \"openstack-operator-index-m87c5\" (UID: \"bccce8da-6083-4bba-bdb1-543b356faaa5\") " pod="openstack-operators/openstack-operator-index-m87c5" Sep 30 07:46:54 crc kubenswrapper[4760]: I0930 07:46:54.897970 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5zwdw\" (UniqueName: \"kubernetes.io/projected/bccce8da-6083-4bba-bdb1-543b356faaa5-kube-api-access-5zwdw\") pod \"openstack-operator-index-m87c5\" (UID: 
\"bccce8da-6083-4bba-bdb1-543b356faaa5\") " pod="openstack-operators/openstack-operator-index-m87c5" Sep 30 07:46:54 crc kubenswrapper[4760]: I0930 07:46:54.909851 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-m87c5" Sep 30 07:46:55 crc kubenswrapper[4760]: I0930 07:46:55.002915 4760 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="metallb-system/frr-k8s-tqbpr" Sep 30 07:46:55 crc kubenswrapper[4760]: I0930 07:46:55.050595 4760 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="metallb-system/frr-k8s-tqbpr" Sep 30 07:46:55 crc kubenswrapper[4760]: I0930 07:46:55.383689 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-m87c5"] Sep 30 07:46:55 crc kubenswrapper[4760]: W0930 07:46:55.395525 4760 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbccce8da_6083_4bba_bdb1_543b356faaa5.slice/crio-6b0af8e97569d9ce9b718b87f13060c67711316bb9ce6b1728b00fdd77cb3804 WatchSource:0}: Error finding container 6b0af8e97569d9ce9b718b87f13060c67711316bb9ce6b1728b00fdd77cb3804: Status 404 returned error can't find the container with id 6b0af8e97569d9ce9b718b87f13060c67711316bb9ce6b1728b00fdd77cb3804 Sep 30 07:46:56 crc kubenswrapper[4760]: I0930 07:46:56.211282 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-m87c5" event={"ID":"bccce8da-6083-4bba-bdb1-543b356faaa5","Type":"ContainerStarted","Data":"6b0af8e97569d9ce9b718b87f13060c67711316bb9ce6b1728b00fdd77cb3804"} Sep 30 07:46:57 crc kubenswrapper[4760]: I0930 07:46:57.977032 4760 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-index-m87c5"] Sep 30 07:46:58 crc kubenswrapper[4760]: I0930 07:46:58.583342 4760 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack-operators/openstack-operator-index-hpsst"] Sep 30 07:46:58 crc kubenswrapper[4760]: I0930 07:46:58.584373 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-hpsst" Sep 30 07:46:58 crc kubenswrapper[4760]: I0930 07:46:58.592793 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-hpsst"] Sep 30 07:46:58 crc kubenswrapper[4760]: I0930 07:46:58.740699 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l9cxm\" (UniqueName: \"kubernetes.io/projected/c076d5bf-bc69-4e76-b891-4d5c4387d68c-kube-api-access-l9cxm\") pod \"openstack-operator-index-hpsst\" (UID: \"c076d5bf-bc69-4e76-b891-4d5c4387d68c\") " pod="openstack-operators/openstack-operator-index-hpsst" Sep 30 07:46:58 crc kubenswrapper[4760]: I0930 07:46:58.842526 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l9cxm\" (UniqueName: \"kubernetes.io/projected/c076d5bf-bc69-4e76-b891-4d5c4387d68c-kube-api-access-l9cxm\") pod \"openstack-operator-index-hpsst\" (UID: \"c076d5bf-bc69-4e76-b891-4d5c4387d68c\") " pod="openstack-operators/openstack-operator-index-hpsst" Sep 30 07:46:58 crc kubenswrapper[4760]: I0930 07:46:58.878738 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l9cxm\" (UniqueName: \"kubernetes.io/projected/c076d5bf-bc69-4e76-b891-4d5c4387d68c-kube-api-access-l9cxm\") pod \"openstack-operator-index-hpsst\" (UID: \"c076d5bf-bc69-4e76-b891-4d5c4387d68c\") " pod="openstack-operators/openstack-operator-index-hpsst" Sep 30 07:46:58 crc kubenswrapper[4760]: I0930 07:46:58.928429 4760 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-hpsst" Sep 30 07:47:00 crc kubenswrapper[4760]: I0930 07:47:00.005131 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-tqbpr" Sep 30 07:47:00 crc kubenswrapper[4760]: I0930 07:47:00.017858 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-webhook-server-5478bdb765-69f49" Sep 30 07:47:00 crc kubenswrapper[4760]: I0930 07:47:00.569846 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-hpsst"] Sep 30 07:47:00 crc kubenswrapper[4760]: W0930 07:47:00.575848 4760 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc076d5bf_bc69_4e76_b891_4d5c4387d68c.slice/crio-43a73fb8386e76330f2ec502a32c71cfb6d22c361119e5dbfe1aeb2e1b5766b6 WatchSource:0}: Error finding container 43a73fb8386e76330f2ec502a32c71cfb6d22c361119e5dbfe1aeb2e1b5766b6: Status 404 returned error can't find the container with id 43a73fb8386e76330f2ec502a32c71cfb6d22c361119e5dbfe1aeb2e1b5766b6 Sep 30 07:47:01 crc kubenswrapper[4760]: I0930 07:47:01.245632 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-hpsst" event={"ID":"c076d5bf-bc69-4e76-b891-4d5c4387d68c","Type":"ContainerStarted","Data":"a70377a2f9573c4bc219f5e937f694088351e540d868bc91ffb146ec2f44f0a9"} Sep 30 07:47:01 crc kubenswrapper[4760]: I0930 07:47:01.246048 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-hpsst" event={"ID":"c076d5bf-bc69-4e76-b891-4d5c4387d68c","Type":"ContainerStarted","Data":"43a73fb8386e76330f2ec502a32c71cfb6d22c361119e5dbfe1aeb2e1b5766b6"} Sep 30 07:47:01 crc kubenswrapper[4760]: I0930 07:47:01.247564 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-m87c5" 
event={"ID":"bccce8da-6083-4bba-bdb1-543b356faaa5","Type":"ContainerStarted","Data":"9d6309df839e253c959247bfde738ca5b998286f580f996baffc51c8a0826f9b"} Sep 30 07:47:01 crc kubenswrapper[4760]: I0930 07:47:01.247757 4760 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/openstack-operator-index-m87c5" podUID="bccce8da-6083-4bba-bdb1-543b356faaa5" containerName="registry-server" containerID="cri-o://9d6309df839e253c959247bfde738ca5b998286f580f996baffc51c8a0826f9b" gracePeriod=2 Sep 30 07:47:01 crc kubenswrapper[4760]: I0930 07:47:01.267128 4760 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-hpsst" podStartSLOduration=3.1970509209999998 podStartE2EDuration="3.267104991s" podCreationTimestamp="2025-09-30 07:46:58 +0000 UTC" firstStartedPulling="2025-09-30 07:47:00.582006066 +0000 UTC m=+806.224912488" lastFinishedPulling="2025-09-30 07:47:00.652060136 +0000 UTC m=+806.294966558" observedRunningTime="2025-09-30 07:47:01.26316116 +0000 UTC m=+806.906067612" watchObservedRunningTime="2025-09-30 07:47:01.267104991 +0000 UTC m=+806.910011443" Sep 30 07:47:01 crc kubenswrapper[4760]: I0930 07:47:01.286238 4760 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-m87c5" podStartSLOduration=2.341097281 podStartE2EDuration="7.286208329s" podCreationTimestamp="2025-09-30 07:46:54 +0000 UTC" firstStartedPulling="2025-09-30 07:46:55.396940747 +0000 UTC m=+801.039847159" lastFinishedPulling="2025-09-30 07:47:00.342051795 +0000 UTC m=+805.984958207" observedRunningTime="2025-09-30 07:47:01.279742444 +0000 UTC m=+806.922648866" watchObservedRunningTime="2025-09-30 07:47:01.286208329 +0000 UTC m=+806.929114801" Sep 30 07:47:01 crc kubenswrapper[4760]: I0930 07:47:01.661232 4760 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-m87c5" Sep 30 07:47:01 crc kubenswrapper[4760]: I0930 07:47:01.789735 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5zwdw\" (UniqueName: \"kubernetes.io/projected/bccce8da-6083-4bba-bdb1-543b356faaa5-kube-api-access-5zwdw\") pod \"bccce8da-6083-4bba-bdb1-543b356faaa5\" (UID: \"bccce8da-6083-4bba-bdb1-543b356faaa5\") " Sep 30 07:47:01 crc kubenswrapper[4760]: I0930 07:47:01.797546 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bccce8da-6083-4bba-bdb1-543b356faaa5-kube-api-access-5zwdw" (OuterVolumeSpecName: "kube-api-access-5zwdw") pod "bccce8da-6083-4bba-bdb1-543b356faaa5" (UID: "bccce8da-6083-4bba-bdb1-543b356faaa5"). InnerVolumeSpecName "kube-api-access-5zwdw". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 07:47:01 crc kubenswrapper[4760]: I0930 07:47:01.892529 4760 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5zwdw\" (UniqueName: \"kubernetes.io/projected/bccce8da-6083-4bba-bdb1-543b356faaa5-kube-api-access-5zwdw\") on node \"crc\" DevicePath \"\"" Sep 30 07:47:02 crc kubenswrapper[4760]: I0930 07:47:02.256119 4760 generic.go:334] "Generic (PLEG): container finished" podID="bccce8da-6083-4bba-bdb1-543b356faaa5" containerID="9d6309df839e253c959247bfde738ca5b998286f580f996baffc51c8a0826f9b" exitCode=0 Sep 30 07:47:02 crc kubenswrapper[4760]: I0930 07:47:02.256170 4760 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-m87c5" Sep 30 07:47:02 crc kubenswrapper[4760]: I0930 07:47:02.256221 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-m87c5" event={"ID":"bccce8da-6083-4bba-bdb1-543b356faaa5","Type":"ContainerDied","Data":"9d6309df839e253c959247bfde738ca5b998286f580f996baffc51c8a0826f9b"} Sep 30 07:47:02 crc kubenswrapper[4760]: I0930 07:47:02.256280 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-m87c5" event={"ID":"bccce8da-6083-4bba-bdb1-543b356faaa5","Type":"ContainerDied","Data":"6b0af8e97569d9ce9b718b87f13060c67711316bb9ce6b1728b00fdd77cb3804"} Sep 30 07:47:02 crc kubenswrapper[4760]: I0930 07:47:02.256315 4760 scope.go:117] "RemoveContainer" containerID="9d6309df839e253c959247bfde738ca5b998286f580f996baffc51c8a0826f9b" Sep 30 07:47:02 crc kubenswrapper[4760]: I0930 07:47:02.277024 4760 scope.go:117] "RemoveContainer" containerID="9d6309df839e253c959247bfde738ca5b998286f580f996baffc51c8a0826f9b" Sep 30 07:47:02 crc kubenswrapper[4760]: E0930 07:47:02.277462 4760 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9d6309df839e253c959247bfde738ca5b998286f580f996baffc51c8a0826f9b\": container with ID starting with 9d6309df839e253c959247bfde738ca5b998286f580f996baffc51c8a0826f9b not found: ID does not exist" containerID="9d6309df839e253c959247bfde738ca5b998286f580f996baffc51c8a0826f9b" Sep 30 07:47:02 crc kubenswrapper[4760]: I0930 07:47:02.277507 4760 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9d6309df839e253c959247bfde738ca5b998286f580f996baffc51c8a0826f9b"} err="failed to get container status \"9d6309df839e253c959247bfde738ca5b998286f580f996baffc51c8a0826f9b\": rpc error: code = NotFound desc = could not find container 
\"9d6309df839e253c959247bfde738ca5b998286f580f996baffc51c8a0826f9b\": container with ID starting with 9d6309df839e253c959247bfde738ca5b998286f580f996baffc51c8a0826f9b not found: ID does not exist" Sep 30 07:47:02 crc kubenswrapper[4760]: I0930 07:47:02.289534 4760 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-index-m87c5"] Sep 30 07:47:02 crc kubenswrapper[4760]: I0930 07:47:02.294056 4760 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/openstack-operator-index-m87c5"] Sep 30 07:47:02 crc kubenswrapper[4760]: I0930 07:47:02.788739 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-cqx2d"] Sep 30 07:47:02 crc kubenswrapper[4760]: E0930 07:47:02.789008 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bccce8da-6083-4bba-bdb1-543b356faaa5" containerName="registry-server" Sep 30 07:47:02 crc kubenswrapper[4760]: I0930 07:47:02.789038 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="bccce8da-6083-4bba-bdb1-543b356faaa5" containerName="registry-server" Sep 30 07:47:02 crc kubenswrapper[4760]: I0930 07:47:02.789151 4760 memory_manager.go:354] "RemoveStaleState removing state" podUID="bccce8da-6083-4bba-bdb1-543b356faaa5" containerName="registry-server" Sep 30 07:47:02 crc kubenswrapper[4760]: I0930 07:47:02.789949 4760 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-cqx2d" Sep 30 07:47:02 crc kubenswrapper[4760]: I0930 07:47:02.805571 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-cqx2d"] Sep 30 07:47:02 crc kubenswrapper[4760]: I0930 07:47:02.906621 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ed7893e4-b7e0-452d-8ddb-aca13a9114de-catalog-content\") pod \"community-operators-cqx2d\" (UID: \"ed7893e4-b7e0-452d-8ddb-aca13a9114de\") " pod="openshift-marketplace/community-operators-cqx2d" Sep 30 07:47:02 crc kubenswrapper[4760]: I0930 07:47:02.907084 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-42m9n\" (UniqueName: \"kubernetes.io/projected/ed7893e4-b7e0-452d-8ddb-aca13a9114de-kube-api-access-42m9n\") pod \"community-operators-cqx2d\" (UID: \"ed7893e4-b7e0-452d-8ddb-aca13a9114de\") " pod="openshift-marketplace/community-operators-cqx2d" Sep 30 07:47:02 crc kubenswrapper[4760]: I0930 07:47:02.907143 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ed7893e4-b7e0-452d-8ddb-aca13a9114de-utilities\") pod \"community-operators-cqx2d\" (UID: \"ed7893e4-b7e0-452d-8ddb-aca13a9114de\") " pod="openshift-marketplace/community-operators-cqx2d" Sep 30 07:47:03 crc kubenswrapper[4760]: I0930 07:47:03.008548 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-42m9n\" (UniqueName: \"kubernetes.io/projected/ed7893e4-b7e0-452d-8ddb-aca13a9114de-kube-api-access-42m9n\") pod \"community-operators-cqx2d\" (UID: \"ed7893e4-b7e0-452d-8ddb-aca13a9114de\") " pod="openshift-marketplace/community-operators-cqx2d" Sep 30 07:47:03 crc kubenswrapper[4760]: I0930 07:47:03.008616 4760 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ed7893e4-b7e0-452d-8ddb-aca13a9114de-utilities\") pod \"community-operators-cqx2d\" (UID: \"ed7893e4-b7e0-452d-8ddb-aca13a9114de\") " pod="openshift-marketplace/community-operators-cqx2d" Sep 30 07:47:03 crc kubenswrapper[4760]: I0930 07:47:03.008645 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ed7893e4-b7e0-452d-8ddb-aca13a9114de-catalog-content\") pod \"community-operators-cqx2d\" (UID: \"ed7893e4-b7e0-452d-8ddb-aca13a9114de\") " pod="openshift-marketplace/community-operators-cqx2d" Sep 30 07:47:03 crc kubenswrapper[4760]: I0930 07:47:03.009268 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ed7893e4-b7e0-452d-8ddb-aca13a9114de-utilities\") pod \"community-operators-cqx2d\" (UID: \"ed7893e4-b7e0-452d-8ddb-aca13a9114de\") " pod="openshift-marketplace/community-operators-cqx2d" Sep 30 07:47:03 crc kubenswrapper[4760]: I0930 07:47:03.009362 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ed7893e4-b7e0-452d-8ddb-aca13a9114de-catalog-content\") pod \"community-operators-cqx2d\" (UID: \"ed7893e4-b7e0-452d-8ddb-aca13a9114de\") " pod="openshift-marketplace/community-operators-cqx2d" Sep 30 07:47:03 crc kubenswrapper[4760]: I0930 07:47:03.031242 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-42m9n\" (UniqueName: \"kubernetes.io/projected/ed7893e4-b7e0-452d-8ddb-aca13a9114de-kube-api-access-42m9n\") pod \"community-operators-cqx2d\" (UID: \"ed7893e4-b7e0-452d-8ddb-aca13a9114de\") " pod="openshift-marketplace/community-operators-cqx2d" Sep 30 07:47:03 crc kubenswrapper[4760]: I0930 07:47:03.076884 4760 kubelet_volumes.go:163] "Cleaned up orphaned pod 
volumes dir" podUID="bccce8da-6083-4bba-bdb1-543b356faaa5" path="/var/lib/kubelet/pods/bccce8da-6083-4bba-bdb1-543b356faaa5/volumes" Sep 30 07:47:03 crc kubenswrapper[4760]: I0930 07:47:03.114793 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-cqx2d" Sep 30 07:47:03 crc kubenswrapper[4760]: I0930 07:47:03.580267 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-cqx2d"] Sep 30 07:47:04 crc kubenswrapper[4760]: I0930 07:47:04.278687 4760 generic.go:334] "Generic (PLEG): container finished" podID="ed7893e4-b7e0-452d-8ddb-aca13a9114de" containerID="3c9f371f1bd02de194f6905563a686134426ecaa93d602287e258a4d42bb1fd3" exitCode=0 Sep 30 07:47:04 crc kubenswrapper[4760]: I0930 07:47:04.278724 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-cqx2d" event={"ID":"ed7893e4-b7e0-452d-8ddb-aca13a9114de","Type":"ContainerDied","Data":"3c9f371f1bd02de194f6905563a686134426ecaa93d602287e258a4d42bb1fd3"} Sep 30 07:47:04 crc kubenswrapper[4760]: I0930 07:47:04.278747 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-cqx2d" event={"ID":"ed7893e4-b7e0-452d-8ddb-aca13a9114de","Type":"ContainerStarted","Data":"801fb6340647528c6ee33187d8eac3cce6381f67af19982c71b69f7e4ce32322"} Sep 30 07:47:05 crc kubenswrapper[4760]: I0930 07:47:05.285225 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-cqx2d" event={"ID":"ed7893e4-b7e0-452d-8ddb-aca13a9114de","Type":"ContainerStarted","Data":"c7b23d934373821e1e2272720725125bd9a2db84a503cd477a1b9c80340dc6a5"} Sep 30 07:47:06 crc kubenswrapper[4760]: I0930 07:47:06.303646 4760 generic.go:334] "Generic (PLEG): container finished" podID="ed7893e4-b7e0-452d-8ddb-aca13a9114de" containerID="c7b23d934373821e1e2272720725125bd9a2db84a503cd477a1b9c80340dc6a5" exitCode=0 Sep 30 07:47:06 crc 
kubenswrapper[4760]: I0930 07:47:06.303769 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-cqx2d" event={"ID":"ed7893e4-b7e0-452d-8ddb-aca13a9114de","Type":"ContainerDied","Data":"c7b23d934373821e1e2272720725125bd9a2db84a503cd477a1b9c80340dc6a5"} Sep 30 07:47:07 crc kubenswrapper[4760]: I0930 07:47:07.312946 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-cqx2d" event={"ID":"ed7893e4-b7e0-452d-8ddb-aca13a9114de","Type":"ContainerStarted","Data":"58d03625fa9da7c89b76b83021d982095f0a5ee129885cbd365a30db0ea027a4"} Sep 30 07:47:07 crc kubenswrapper[4760]: I0930 07:47:07.333027 4760 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-cqx2d" podStartSLOduration=2.902590709 podStartE2EDuration="5.333007127s" podCreationTimestamp="2025-09-30 07:47:02 +0000 UTC" firstStartedPulling="2025-09-30 07:47:04.280002001 +0000 UTC m=+809.922908413" lastFinishedPulling="2025-09-30 07:47:06.710418409 +0000 UTC m=+812.353324831" observedRunningTime="2025-09-30 07:47:07.331536799 +0000 UTC m=+812.974443211" watchObservedRunningTime="2025-09-30 07:47:07.333007127 +0000 UTC m=+812.975913539" Sep 30 07:47:08 crc kubenswrapper[4760]: I0930 07:47:08.928599 4760 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-operators/openstack-operator-index-hpsst" Sep 30 07:47:08 crc kubenswrapper[4760]: I0930 07:47:08.929467 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-index-hpsst" Sep 30 07:47:08 crc kubenswrapper[4760]: I0930 07:47:08.974804 4760 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-operators/openstack-operator-index-hpsst" Sep 30 07:47:09 crc kubenswrapper[4760]: I0930 07:47:09.361628 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack-operators/openstack-operator-index-hpsst" Sep 30 07:47:10 crc kubenswrapper[4760]: I0930 07:47:10.830706 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/2be1a55d13f98b28512c5bf715627f2108773b920b5e68da320e70991fjxzrh"] Sep 30 07:47:10 crc kubenswrapper[4760]: I0930 07:47:10.833501 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/2be1a55d13f98b28512c5bf715627f2108773b920b5e68da320e70991fjxzrh" Sep 30 07:47:10 crc kubenswrapper[4760]: I0930 07:47:10.836050 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"default-dockercfg-kpgl5" Sep 30 07:47:10 crc kubenswrapper[4760]: I0930 07:47:10.845320 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/2be1a55d13f98b28512c5bf715627f2108773b920b5e68da320e70991fjxzrh"] Sep 30 07:47:10 crc kubenswrapper[4760]: I0930 07:47:10.917162 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/59175508-6983-4846-a813-05181244346d-bundle\") pod \"2be1a55d13f98b28512c5bf715627f2108773b920b5e68da320e70991fjxzrh\" (UID: \"59175508-6983-4846-a813-05181244346d\") " pod="openstack-operators/2be1a55d13f98b28512c5bf715627f2108773b920b5e68da320e70991fjxzrh" Sep 30 07:47:10 crc kubenswrapper[4760]: I0930 07:47:10.917237 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k4h72\" (UniqueName: \"kubernetes.io/projected/59175508-6983-4846-a813-05181244346d-kube-api-access-k4h72\") pod \"2be1a55d13f98b28512c5bf715627f2108773b920b5e68da320e70991fjxzrh\" (UID: \"59175508-6983-4846-a813-05181244346d\") " pod="openstack-operators/2be1a55d13f98b28512c5bf715627f2108773b920b5e68da320e70991fjxzrh" Sep 30 07:47:10 crc kubenswrapper[4760]: I0930 07:47:10.917657 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/59175508-6983-4846-a813-05181244346d-util\") pod \"2be1a55d13f98b28512c5bf715627f2108773b920b5e68da320e70991fjxzrh\" (UID: \"59175508-6983-4846-a813-05181244346d\") " pod="openstack-operators/2be1a55d13f98b28512c5bf715627f2108773b920b5e68da320e70991fjxzrh" Sep 30 07:47:11 crc kubenswrapper[4760]: I0930 07:47:11.018486 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/59175508-6983-4846-a813-05181244346d-util\") pod \"2be1a55d13f98b28512c5bf715627f2108773b920b5e68da320e70991fjxzrh\" (UID: \"59175508-6983-4846-a813-05181244346d\") " pod="openstack-operators/2be1a55d13f98b28512c5bf715627f2108773b920b5e68da320e70991fjxzrh" Sep 30 07:47:11 crc kubenswrapper[4760]: I0930 07:47:11.018610 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/59175508-6983-4846-a813-05181244346d-bundle\") pod \"2be1a55d13f98b28512c5bf715627f2108773b920b5e68da320e70991fjxzrh\" (UID: \"59175508-6983-4846-a813-05181244346d\") " pod="openstack-operators/2be1a55d13f98b28512c5bf715627f2108773b920b5e68da320e70991fjxzrh" Sep 30 07:47:11 crc kubenswrapper[4760]: I0930 07:47:11.018658 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k4h72\" (UniqueName: \"kubernetes.io/projected/59175508-6983-4846-a813-05181244346d-kube-api-access-k4h72\") pod \"2be1a55d13f98b28512c5bf715627f2108773b920b5e68da320e70991fjxzrh\" (UID: \"59175508-6983-4846-a813-05181244346d\") " pod="openstack-operators/2be1a55d13f98b28512c5bf715627f2108773b920b5e68da320e70991fjxzrh" Sep 30 07:47:11 crc kubenswrapper[4760]: I0930 07:47:11.019591 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/59175508-6983-4846-a813-05181244346d-util\") pod 
\"2be1a55d13f98b28512c5bf715627f2108773b920b5e68da320e70991fjxzrh\" (UID: \"59175508-6983-4846-a813-05181244346d\") " pod="openstack-operators/2be1a55d13f98b28512c5bf715627f2108773b920b5e68da320e70991fjxzrh" Sep 30 07:47:11 crc kubenswrapper[4760]: I0930 07:47:11.019690 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/59175508-6983-4846-a813-05181244346d-bundle\") pod \"2be1a55d13f98b28512c5bf715627f2108773b920b5e68da320e70991fjxzrh\" (UID: \"59175508-6983-4846-a813-05181244346d\") " pod="openstack-operators/2be1a55d13f98b28512c5bf715627f2108773b920b5e68da320e70991fjxzrh" Sep 30 07:47:11 crc kubenswrapper[4760]: I0930 07:47:11.054112 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k4h72\" (UniqueName: \"kubernetes.io/projected/59175508-6983-4846-a813-05181244346d-kube-api-access-k4h72\") pod \"2be1a55d13f98b28512c5bf715627f2108773b920b5e68da320e70991fjxzrh\" (UID: \"59175508-6983-4846-a813-05181244346d\") " pod="openstack-operators/2be1a55d13f98b28512c5bf715627f2108773b920b5e68da320e70991fjxzrh" Sep 30 07:47:11 crc kubenswrapper[4760]: I0930 07:47:11.168197 4760 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/2be1a55d13f98b28512c5bf715627f2108773b920b5e68da320e70991fjxzrh" Sep 30 07:47:11 crc kubenswrapper[4760]: I0930 07:47:11.644046 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/2be1a55d13f98b28512c5bf715627f2108773b920b5e68da320e70991fjxzrh"] Sep 30 07:47:11 crc kubenswrapper[4760]: W0930 07:47:11.647967 4760 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod59175508_6983_4846_a813_05181244346d.slice/crio-8289a718a383020cf6ee4b260bb80a584ce35dbe4ba898d7ad1e455754ad0925 WatchSource:0}: Error finding container 8289a718a383020cf6ee4b260bb80a584ce35dbe4ba898d7ad1e455754ad0925: Status 404 returned error can't find the container with id 8289a718a383020cf6ee4b260bb80a584ce35dbe4ba898d7ad1e455754ad0925 Sep 30 07:47:12 crc kubenswrapper[4760]: I0930 07:47:12.346984 4760 generic.go:334] "Generic (PLEG): container finished" podID="59175508-6983-4846-a813-05181244346d" containerID="86d6eb7aa5a132e7aacc436c9c174c23b8979d43f1700551ef5cbcb49d02d8c1" exitCode=0 Sep 30 07:47:12 crc kubenswrapper[4760]: I0930 07:47:12.347049 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/2be1a55d13f98b28512c5bf715627f2108773b920b5e68da320e70991fjxzrh" event={"ID":"59175508-6983-4846-a813-05181244346d","Type":"ContainerDied","Data":"86d6eb7aa5a132e7aacc436c9c174c23b8979d43f1700551ef5cbcb49d02d8c1"} Sep 30 07:47:12 crc kubenswrapper[4760]: I0930 07:47:12.347105 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/2be1a55d13f98b28512c5bf715627f2108773b920b5e68da320e70991fjxzrh" event={"ID":"59175508-6983-4846-a813-05181244346d","Type":"ContainerStarted","Data":"8289a718a383020cf6ee4b260bb80a584ce35dbe4ba898d7ad1e455754ad0925"} Sep 30 07:47:13 crc kubenswrapper[4760]: I0930 07:47:13.115150 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-marketplace/community-operators-cqx2d" Sep 30 07:47:13 crc kubenswrapper[4760]: I0930 07:47:13.115727 4760 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-cqx2d" Sep 30 07:47:13 crc kubenswrapper[4760]: I0930 07:47:13.170153 4760 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-cqx2d" Sep 30 07:47:13 crc kubenswrapper[4760]: I0930 07:47:13.354416 4760 generic.go:334] "Generic (PLEG): container finished" podID="59175508-6983-4846-a813-05181244346d" containerID="e4baae1ceb2917d9d2073cd00bc2d0f114c6d85a05a518ec3bae739024120ada" exitCode=0 Sep 30 07:47:13 crc kubenswrapper[4760]: I0930 07:47:13.354480 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/2be1a55d13f98b28512c5bf715627f2108773b920b5e68da320e70991fjxzrh" event={"ID":"59175508-6983-4846-a813-05181244346d","Type":"ContainerDied","Data":"e4baae1ceb2917d9d2073cd00bc2d0f114c6d85a05a518ec3bae739024120ada"} Sep 30 07:47:13 crc kubenswrapper[4760]: I0930 07:47:13.413125 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-cqx2d" Sep 30 07:47:14 crc kubenswrapper[4760]: I0930 07:47:14.365217 4760 generic.go:334] "Generic (PLEG): container finished" podID="59175508-6983-4846-a813-05181244346d" containerID="30a3a0df63297cda5469891459892e9c10acc224a02874c29aa4a298844c7953" exitCode=0 Sep 30 07:47:14 crc kubenswrapper[4760]: I0930 07:47:14.365352 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/2be1a55d13f98b28512c5bf715627f2108773b920b5e68da320e70991fjxzrh" event={"ID":"59175508-6983-4846-a813-05181244346d","Type":"ContainerDied","Data":"30a3a0df63297cda5469891459892e9c10acc224a02874c29aa4a298844c7953"} Sep 30 07:47:15 crc kubenswrapper[4760]: I0930 07:47:15.765565 4760 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/2be1a55d13f98b28512c5bf715627f2108773b920b5e68da320e70991fjxzrh" Sep 30 07:47:15 crc kubenswrapper[4760]: I0930 07:47:15.889502 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/59175508-6983-4846-a813-05181244346d-bundle\") pod \"59175508-6983-4846-a813-05181244346d\" (UID: \"59175508-6983-4846-a813-05181244346d\") " Sep 30 07:47:15 crc kubenswrapper[4760]: I0930 07:47:15.889800 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k4h72\" (UniqueName: \"kubernetes.io/projected/59175508-6983-4846-a813-05181244346d-kube-api-access-k4h72\") pod \"59175508-6983-4846-a813-05181244346d\" (UID: \"59175508-6983-4846-a813-05181244346d\") " Sep 30 07:47:15 crc kubenswrapper[4760]: I0930 07:47:15.889951 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/59175508-6983-4846-a813-05181244346d-util\") pod \"59175508-6983-4846-a813-05181244346d\" (UID: \"59175508-6983-4846-a813-05181244346d\") " Sep 30 07:47:15 crc kubenswrapper[4760]: I0930 07:47:15.891207 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/59175508-6983-4846-a813-05181244346d-bundle" (OuterVolumeSpecName: "bundle") pod "59175508-6983-4846-a813-05181244346d" (UID: "59175508-6983-4846-a813-05181244346d"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 07:47:15 crc kubenswrapper[4760]: I0930 07:47:15.898399 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/59175508-6983-4846-a813-05181244346d-kube-api-access-k4h72" (OuterVolumeSpecName: "kube-api-access-k4h72") pod "59175508-6983-4846-a813-05181244346d" (UID: "59175508-6983-4846-a813-05181244346d"). InnerVolumeSpecName "kube-api-access-k4h72". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 07:47:15 crc kubenswrapper[4760]: I0930 07:47:15.905926 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/59175508-6983-4846-a813-05181244346d-util" (OuterVolumeSpecName: "util") pod "59175508-6983-4846-a813-05181244346d" (UID: "59175508-6983-4846-a813-05181244346d"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 07:47:15 crc kubenswrapper[4760]: I0930 07:47:15.990894 4760 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/59175508-6983-4846-a813-05181244346d-util\") on node \"crc\" DevicePath \"\"" Sep 30 07:47:15 crc kubenswrapper[4760]: I0930 07:47:15.990923 4760 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/59175508-6983-4846-a813-05181244346d-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 07:47:15 crc kubenswrapper[4760]: I0930 07:47:15.990935 4760 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k4h72\" (UniqueName: \"kubernetes.io/projected/59175508-6983-4846-a813-05181244346d-kube-api-access-k4h72\") on node \"crc\" DevicePath \"\"" Sep 30 07:47:16 crc kubenswrapper[4760]: I0930 07:47:16.385734 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/2be1a55d13f98b28512c5bf715627f2108773b920b5e68da320e70991fjxzrh" event={"ID":"59175508-6983-4846-a813-05181244346d","Type":"ContainerDied","Data":"8289a718a383020cf6ee4b260bb80a584ce35dbe4ba898d7ad1e455754ad0925"} Sep 30 07:47:16 crc kubenswrapper[4760]: I0930 07:47:16.385782 4760 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8289a718a383020cf6ee4b260bb80a584ce35dbe4ba898d7ad1e455754ad0925" Sep 30 07:47:16 crc kubenswrapper[4760]: I0930 07:47:16.386490 4760 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/2be1a55d13f98b28512c5bf715627f2108773b920b5e68da320e70991fjxzrh" Sep 30 07:47:16 crc kubenswrapper[4760]: I0930 07:47:16.775101 4760 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-cqx2d"] Sep 30 07:47:16 crc kubenswrapper[4760]: I0930 07:47:16.775811 4760 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-cqx2d" podUID="ed7893e4-b7e0-452d-8ddb-aca13a9114de" containerName="registry-server" containerID="cri-o://58d03625fa9da7c89b76b83021d982095f0a5ee129885cbd365a30db0ea027a4" gracePeriod=2 Sep 30 07:47:17 crc kubenswrapper[4760]: I0930 07:47:17.234179 4760 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-cqx2d" Sep 30 07:47:17 crc kubenswrapper[4760]: I0930 07:47:17.308203 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ed7893e4-b7e0-452d-8ddb-aca13a9114de-utilities\") pod \"ed7893e4-b7e0-452d-8ddb-aca13a9114de\" (UID: \"ed7893e4-b7e0-452d-8ddb-aca13a9114de\") " Sep 30 07:47:17 crc kubenswrapper[4760]: I0930 07:47:17.308327 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ed7893e4-b7e0-452d-8ddb-aca13a9114de-catalog-content\") pod \"ed7893e4-b7e0-452d-8ddb-aca13a9114de\" (UID: \"ed7893e4-b7e0-452d-8ddb-aca13a9114de\") " Sep 30 07:47:17 crc kubenswrapper[4760]: I0930 07:47:17.308408 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-42m9n\" (UniqueName: \"kubernetes.io/projected/ed7893e4-b7e0-452d-8ddb-aca13a9114de-kube-api-access-42m9n\") pod \"ed7893e4-b7e0-452d-8ddb-aca13a9114de\" (UID: \"ed7893e4-b7e0-452d-8ddb-aca13a9114de\") " Sep 30 07:47:17 crc kubenswrapper[4760]: I0930 07:47:17.309917 4760 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ed7893e4-b7e0-452d-8ddb-aca13a9114de-utilities" (OuterVolumeSpecName: "utilities") pod "ed7893e4-b7e0-452d-8ddb-aca13a9114de" (UID: "ed7893e4-b7e0-452d-8ddb-aca13a9114de"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 07:47:17 crc kubenswrapper[4760]: I0930 07:47:17.317416 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ed7893e4-b7e0-452d-8ddb-aca13a9114de-kube-api-access-42m9n" (OuterVolumeSpecName: "kube-api-access-42m9n") pod "ed7893e4-b7e0-452d-8ddb-aca13a9114de" (UID: "ed7893e4-b7e0-452d-8ddb-aca13a9114de"). InnerVolumeSpecName "kube-api-access-42m9n". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 07:47:17 crc kubenswrapper[4760]: I0930 07:47:17.366740 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ed7893e4-b7e0-452d-8ddb-aca13a9114de-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "ed7893e4-b7e0-452d-8ddb-aca13a9114de" (UID: "ed7893e4-b7e0-452d-8ddb-aca13a9114de"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 07:47:17 crc kubenswrapper[4760]: I0930 07:47:17.398626 4760 generic.go:334] "Generic (PLEG): container finished" podID="ed7893e4-b7e0-452d-8ddb-aca13a9114de" containerID="58d03625fa9da7c89b76b83021d982095f0a5ee129885cbd365a30db0ea027a4" exitCode=0 Sep 30 07:47:17 crc kubenswrapper[4760]: I0930 07:47:17.398673 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-cqx2d" event={"ID":"ed7893e4-b7e0-452d-8ddb-aca13a9114de","Type":"ContainerDied","Data":"58d03625fa9da7c89b76b83021d982095f0a5ee129885cbd365a30db0ea027a4"} Sep 30 07:47:17 crc kubenswrapper[4760]: I0930 07:47:17.398699 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-cqx2d" event={"ID":"ed7893e4-b7e0-452d-8ddb-aca13a9114de","Type":"ContainerDied","Data":"801fb6340647528c6ee33187d8eac3cce6381f67af19982c71b69f7e4ce32322"} Sep 30 07:47:17 crc kubenswrapper[4760]: I0930 07:47:17.398714 4760 scope.go:117] "RemoveContainer" containerID="58d03625fa9da7c89b76b83021d982095f0a5ee129885cbd365a30db0ea027a4" Sep 30 07:47:17 crc kubenswrapper[4760]: I0930 07:47:17.398752 4760 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-cqx2d" Sep 30 07:47:17 crc kubenswrapper[4760]: I0930 07:47:17.410424 4760 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ed7893e4-b7e0-452d-8ddb-aca13a9114de-utilities\") on node \"crc\" DevicePath \"\"" Sep 30 07:47:17 crc kubenswrapper[4760]: I0930 07:47:17.410478 4760 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ed7893e4-b7e0-452d-8ddb-aca13a9114de-catalog-content\") on node \"crc\" DevicePath \"\"" Sep 30 07:47:17 crc kubenswrapper[4760]: I0930 07:47:17.410499 4760 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-42m9n\" (UniqueName: \"kubernetes.io/projected/ed7893e4-b7e0-452d-8ddb-aca13a9114de-kube-api-access-42m9n\") on node \"crc\" DevicePath \"\"" Sep 30 07:47:17 crc kubenswrapper[4760]: I0930 07:47:17.437555 4760 scope.go:117] "RemoveContainer" containerID="c7b23d934373821e1e2272720725125bd9a2db84a503cd477a1b9c80340dc6a5" Sep 30 07:47:17 crc kubenswrapper[4760]: I0930 07:47:17.457928 4760 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-cqx2d"] Sep 30 07:47:17 crc kubenswrapper[4760]: I0930 07:47:17.463676 4760 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-cqx2d"] Sep 30 07:47:17 crc kubenswrapper[4760]: I0930 07:47:17.486060 4760 scope.go:117] "RemoveContainer" containerID="3c9f371f1bd02de194f6905563a686134426ecaa93d602287e258a4d42bb1fd3" Sep 30 07:47:17 crc kubenswrapper[4760]: I0930 07:47:17.508556 4760 scope.go:117] "RemoveContainer" containerID="58d03625fa9da7c89b76b83021d982095f0a5ee129885cbd365a30db0ea027a4" Sep 30 07:47:17 crc kubenswrapper[4760]: E0930 07:47:17.509537 4760 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"58d03625fa9da7c89b76b83021d982095f0a5ee129885cbd365a30db0ea027a4\": container with ID starting with 58d03625fa9da7c89b76b83021d982095f0a5ee129885cbd365a30db0ea027a4 not found: ID does not exist" containerID="58d03625fa9da7c89b76b83021d982095f0a5ee129885cbd365a30db0ea027a4" Sep 30 07:47:17 crc kubenswrapper[4760]: I0930 07:47:17.509606 4760 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"58d03625fa9da7c89b76b83021d982095f0a5ee129885cbd365a30db0ea027a4"} err="failed to get container status \"58d03625fa9da7c89b76b83021d982095f0a5ee129885cbd365a30db0ea027a4\": rpc error: code = NotFound desc = could not find container \"58d03625fa9da7c89b76b83021d982095f0a5ee129885cbd365a30db0ea027a4\": container with ID starting with 58d03625fa9da7c89b76b83021d982095f0a5ee129885cbd365a30db0ea027a4 not found: ID does not exist" Sep 30 07:47:17 crc kubenswrapper[4760]: I0930 07:47:17.509641 4760 scope.go:117] "RemoveContainer" containerID="c7b23d934373821e1e2272720725125bd9a2db84a503cd477a1b9c80340dc6a5" Sep 30 07:47:17 crc kubenswrapper[4760]: E0930 07:47:17.510161 4760 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c7b23d934373821e1e2272720725125bd9a2db84a503cd477a1b9c80340dc6a5\": container with ID starting with c7b23d934373821e1e2272720725125bd9a2db84a503cd477a1b9c80340dc6a5 not found: ID does not exist" containerID="c7b23d934373821e1e2272720725125bd9a2db84a503cd477a1b9c80340dc6a5" Sep 30 07:47:17 crc kubenswrapper[4760]: I0930 07:47:17.510229 4760 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c7b23d934373821e1e2272720725125bd9a2db84a503cd477a1b9c80340dc6a5"} err="failed to get container status \"c7b23d934373821e1e2272720725125bd9a2db84a503cd477a1b9c80340dc6a5\": rpc error: code = NotFound desc = could not find container \"c7b23d934373821e1e2272720725125bd9a2db84a503cd477a1b9c80340dc6a5\": container with ID 
starting with c7b23d934373821e1e2272720725125bd9a2db84a503cd477a1b9c80340dc6a5 not found: ID does not exist" Sep 30 07:47:17 crc kubenswrapper[4760]: I0930 07:47:17.510273 4760 scope.go:117] "RemoveContainer" containerID="3c9f371f1bd02de194f6905563a686134426ecaa93d602287e258a4d42bb1fd3" Sep 30 07:47:17 crc kubenswrapper[4760]: E0930 07:47:17.510824 4760 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3c9f371f1bd02de194f6905563a686134426ecaa93d602287e258a4d42bb1fd3\": container with ID starting with 3c9f371f1bd02de194f6905563a686134426ecaa93d602287e258a4d42bb1fd3 not found: ID does not exist" containerID="3c9f371f1bd02de194f6905563a686134426ecaa93d602287e258a4d42bb1fd3" Sep 30 07:47:17 crc kubenswrapper[4760]: I0930 07:47:17.510852 4760 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3c9f371f1bd02de194f6905563a686134426ecaa93d602287e258a4d42bb1fd3"} err="failed to get container status \"3c9f371f1bd02de194f6905563a686134426ecaa93d602287e258a4d42bb1fd3\": rpc error: code = NotFound desc = could not find container \"3c9f371f1bd02de194f6905563a686134426ecaa93d602287e258a4d42bb1fd3\": container with ID starting with 3c9f371f1bd02de194f6905563a686134426ecaa93d602287e258a4d42bb1fd3 not found: ID does not exist" Sep 30 07:47:19 crc kubenswrapper[4760]: I0930 07:47:19.080528 4760 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ed7893e4-b7e0-452d-8ddb-aca13a9114de" path="/var/lib/kubelet/pods/ed7893e4-b7e0-452d-8ddb-aca13a9114de/volumes" Sep 30 07:47:22 crc kubenswrapper[4760]: I0930 07:47:22.516661 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-operator-799b749c5f-bxgqk"] Sep 30 07:47:22 crc kubenswrapper[4760]: E0930 07:47:22.517177 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="59175508-6983-4846-a813-05181244346d" containerName="util" Sep 30 
07:47:22 crc kubenswrapper[4760]: I0930 07:47:22.517188 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="59175508-6983-4846-a813-05181244346d" containerName="util" Sep 30 07:47:22 crc kubenswrapper[4760]: E0930 07:47:22.517197 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="59175508-6983-4846-a813-05181244346d" containerName="extract" Sep 30 07:47:22 crc kubenswrapper[4760]: I0930 07:47:22.517204 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="59175508-6983-4846-a813-05181244346d" containerName="extract" Sep 30 07:47:22 crc kubenswrapper[4760]: E0930 07:47:22.517226 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ed7893e4-b7e0-452d-8ddb-aca13a9114de" containerName="registry-server" Sep 30 07:47:22 crc kubenswrapper[4760]: I0930 07:47:22.517234 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="ed7893e4-b7e0-452d-8ddb-aca13a9114de" containerName="registry-server" Sep 30 07:47:22 crc kubenswrapper[4760]: E0930 07:47:22.517240 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="59175508-6983-4846-a813-05181244346d" containerName="pull" Sep 30 07:47:22 crc kubenswrapper[4760]: I0930 07:47:22.517246 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="59175508-6983-4846-a813-05181244346d" containerName="pull" Sep 30 07:47:22 crc kubenswrapper[4760]: E0930 07:47:22.517256 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ed7893e4-b7e0-452d-8ddb-aca13a9114de" containerName="extract-utilities" Sep 30 07:47:22 crc kubenswrapper[4760]: I0930 07:47:22.517262 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="ed7893e4-b7e0-452d-8ddb-aca13a9114de" containerName="extract-utilities" Sep 30 07:47:22 crc kubenswrapper[4760]: E0930 07:47:22.517273 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ed7893e4-b7e0-452d-8ddb-aca13a9114de" containerName="extract-content" Sep 30 07:47:22 crc kubenswrapper[4760]: I0930 07:47:22.517279 
4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="ed7893e4-b7e0-452d-8ddb-aca13a9114de" containerName="extract-content" Sep 30 07:47:22 crc kubenswrapper[4760]: I0930 07:47:22.517390 4760 memory_manager.go:354] "RemoveStaleState removing state" podUID="59175508-6983-4846-a813-05181244346d" containerName="extract" Sep 30 07:47:22 crc kubenswrapper[4760]: I0930 07:47:22.517409 4760 memory_manager.go:354] "RemoveStaleState removing state" podUID="ed7893e4-b7e0-452d-8ddb-aca13a9114de" containerName="registry-server" Sep 30 07:47:22 crc kubenswrapper[4760]: I0930 07:47:22.518092 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-operator-799b749c5f-bxgqk" Sep 30 07:47:22 crc kubenswrapper[4760]: I0930 07:47:22.522566 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-operator-dockercfg-rtd2c" Sep 30 07:47:22 crc kubenswrapper[4760]: I0930 07:47:22.540846 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-operator-799b749c5f-bxgqk"] Sep 30 07:47:22 crc kubenswrapper[4760]: I0930 07:47:22.706635 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7vw5p\" (UniqueName: \"kubernetes.io/projected/ecfaa27d-a6ba-432f-8a63-80706fcdf76a-kube-api-access-7vw5p\") pod \"openstack-operator-controller-operator-799b749c5f-bxgqk\" (UID: \"ecfaa27d-a6ba-432f-8a63-80706fcdf76a\") " pod="openstack-operators/openstack-operator-controller-operator-799b749c5f-bxgqk" Sep 30 07:47:22 crc kubenswrapper[4760]: I0930 07:47:22.807588 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7vw5p\" (UniqueName: \"kubernetes.io/projected/ecfaa27d-a6ba-432f-8a63-80706fcdf76a-kube-api-access-7vw5p\") pod \"openstack-operator-controller-operator-799b749c5f-bxgqk\" (UID: 
\"ecfaa27d-a6ba-432f-8a63-80706fcdf76a\") " pod="openstack-operators/openstack-operator-controller-operator-799b749c5f-bxgqk" Sep 30 07:47:22 crc kubenswrapper[4760]: I0930 07:47:22.839298 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7vw5p\" (UniqueName: \"kubernetes.io/projected/ecfaa27d-a6ba-432f-8a63-80706fcdf76a-kube-api-access-7vw5p\") pod \"openstack-operator-controller-operator-799b749c5f-bxgqk\" (UID: \"ecfaa27d-a6ba-432f-8a63-80706fcdf76a\") " pod="openstack-operators/openstack-operator-controller-operator-799b749c5f-bxgqk" Sep 30 07:47:23 crc kubenswrapper[4760]: I0930 07:47:23.133560 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-operator-799b749c5f-bxgqk" Sep 30 07:47:23 crc kubenswrapper[4760]: I0930 07:47:23.608356 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-operator-799b749c5f-bxgqk"] Sep 30 07:47:24 crc kubenswrapper[4760]: I0930 07:47:24.451212 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-operator-799b749c5f-bxgqk" event={"ID":"ecfaa27d-a6ba-432f-8a63-80706fcdf76a","Type":"ContainerStarted","Data":"c9564377d33ece24ace0e827e71a52080b61c137c0d2e510d3d413bef83f8298"} Sep 30 07:47:27 crc kubenswrapper[4760]: I0930 07:47:27.473867 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-operator-799b749c5f-bxgqk" event={"ID":"ecfaa27d-a6ba-432f-8a63-80706fcdf76a","Type":"ContainerStarted","Data":"ce9bbf25ac8c3fe54f75af79ce1812e9a626779a4334c0617034b498acfdd795"} Sep 30 07:47:30 crc kubenswrapper[4760]: I0930 07:47:30.504220 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-operator-799b749c5f-bxgqk" 
event={"ID":"ecfaa27d-a6ba-432f-8a63-80706fcdf76a","Type":"ContainerStarted","Data":"2a12ee8cccc22f2d602954c3c61b195d9c389605ec10053620b1bd6f0cdc2815"} Sep 30 07:47:30 crc kubenswrapper[4760]: I0930 07:47:30.504916 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-operator-799b749c5f-bxgqk" Sep 30 07:47:30 crc kubenswrapper[4760]: I0930 07:47:30.552589 4760 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-operator-799b749c5f-bxgqk" podStartSLOduration=2.6125067509999997 podStartE2EDuration="8.552562831s" podCreationTimestamp="2025-09-30 07:47:22 +0000 UTC" firstStartedPulling="2025-09-30 07:47:23.620850244 +0000 UTC m=+829.263756666" lastFinishedPulling="2025-09-30 07:47:29.560906324 +0000 UTC m=+835.203812746" observedRunningTime="2025-09-30 07:47:30.541703754 +0000 UTC m=+836.184610226" watchObservedRunningTime="2025-09-30 07:47:30.552562831 +0000 UTC m=+836.195469283" Sep 30 07:47:33 crc kubenswrapper[4760]: I0930 07:47:33.137166 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-operator-799b749c5f-bxgqk" Sep 30 07:47:50 crc kubenswrapper[4760]: I0930 07:47:50.405253 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/barbican-operator-controller-manager-6ff8b75857-n528w"] Sep 30 07:47:50 crc kubenswrapper[4760]: I0930 07:47:50.406954 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-6ff8b75857-n528w" Sep 30 07:47:50 crc kubenswrapper[4760]: I0930 07:47:50.411838 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/cinder-operator-controller-manager-644bddb6d8-s4vf9"] Sep 30 07:47:50 crc kubenswrapper[4760]: I0930 07:47:50.413365 4760 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-644bddb6d8-s4vf9" Sep 30 07:47:50 crc kubenswrapper[4760]: I0930 07:47:50.413579 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"barbican-operator-controller-manager-dockercfg-hxrkz" Sep 30 07:47:50 crc kubenswrapper[4760]: I0930 07:47:50.415393 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"cinder-operator-controller-manager-dockercfg-ghqdw" Sep 30 07:47:50 crc kubenswrapper[4760]: I0930 07:47:50.416698 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-6ff8b75857-n528w"] Sep 30 07:47:50 crc kubenswrapper[4760]: I0930 07:47:50.449373 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-644bddb6d8-s4vf9"] Sep 30 07:47:50 crc kubenswrapper[4760]: I0930 07:47:50.454192 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/designate-operator-controller-manager-84f4f7b77b-2zwsf"] Sep 30 07:47:50 crc kubenswrapper[4760]: I0930 07:47:50.455504 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-84f4f7b77b-2zwsf" Sep 30 07:47:50 crc kubenswrapper[4760]: I0930 07:47:50.458444 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"designate-operator-controller-manager-dockercfg-5qxdv" Sep 30 07:47:50 crc kubenswrapper[4760]: I0930 07:47:50.460606 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/glance-operator-controller-manager-84958c4d49-mcrx5"] Sep 30 07:47:50 crc kubenswrapper[4760]: I0930 07:47:50.461582 4760 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-84958c4d49-mcrx5" Sep 30 07:47:50 crc kubenswrapper[4760]: I0930 07:47:50.463841 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"glance-operator-controller-manager-dockercfg-4vchk" Sep 30 07:47:50 crc kubenswrapper[4760]: I0930 07:47:50.480796 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-84f4f7b77b-2zwsf"] Sep 30 07:47:50 crc kubenswrapper[4760]: I0930 07:47:50.481003 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rfjjt\" (UniqueName: \"kubernetes.io/projected/7fdb76d3-726a-416a-9b64-df2d6a67d88a-kube-api-access-rfjjt\") pod \"cinder-operator-controller-manager-644bddb6d8-s4vf9\" (UID: \"7fdb76d3-726a-416a-9b64-df2d6a67d88a\") " pod="openstack-operators/cinder-operator-controller-manager-644bddb6d8-s4vf9" Sep 30 07:47:50 crc kubenswrapper[4760]: I0930 07:47:50.481045 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8lpxs\" (UniqueName: \"kubernetes.io/projected/90fe11d3-6b6b-46c3-9833-d68d080144b9-kube-api-access-8lpxs\") pod \"barbican-operator-controller-manager-6ff8b75857-n528w\" (UID: \"90fe11d3-6b6b-46c3-9833-d68d080144b9\") " pod="openstack-operators/barbican-operator-controller-manager-6ff8b75857-n528w" Sep 30 07:47:50 crc kubenswrapper[4760]: I0930 07:47:50.481099 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jw8lf\" (UniqueName: \"kubernetes.io/projected/0673130a-0175-41b4-a8d8-188c7a39caa0-kube-api-access-jw8lf\") pod \"designate-operator-controller-manager-84f4f7b77b-2zwsf\" (UID: \"0673130a-0175-41b4-a8d8-188c7a39caa0\") " pod="openstack-operators/designate-operator-controller-manager-84f4f7b77b-2zwsf" Sep 30 07:47:50 crc kubenswrapper[4760]: I0930 
07:47:50.481167 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dl7ft\" (UniqueName: \"kubernetes.io/projected/93eb25ad-5a9d-4044-ba79-8869b28787dd-kube-api-access-dl7ft\") pod \"glance-operator-controller-manager-84958c4d49-mcrx5\" (UID: \"93eb25ad-5a9d-4044-ba79-8869b28787dd\") " pod="openstack-operators/glance-operator-controller-manager-84958c4d49-mcrx5" Sep 30 07:47:50 crc kubenswrapper[4760]: I0930 07:47:50.492219 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-84958c4d49-mcrx5"] Sep 30 07:47:50 crc kubenswrapper[4760]: I0930 07:47:50.495833 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/heat-operator-controller-manager-5d889d78cf-7brm5"] Sep 30 07:47:50 crc kubenswrapper[4760]: I0930 07:47:50.496879 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-5d889d78cf-7brm5" Sep 30 07:47:50 crc kubenswrapper[4760]: I0930 07:47:50.498887 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"heat-operator-controller-manager-dockercfg-6zjdq" Sep 30 07:47:50 crc kubenswrapper[4760]: I0930 07:47:50.500393 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-5d889d78cf-7brm5"] Sep 30 07:47:50 crc kubenswrapper[4760]: I0930 07:47:50.519829 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/horizon-operator-controller-manager-9f4696d94-j5vp7"] Sep 30 07:47:50 crc kubenswrapper[4760]: I0930 07:47:50.521064 4760 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-9f4696d94-j5vp7" Sep 30 07:47:50 crc kubenswrapper[4760]: I0930 07:47:50.533095 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"horizon-operator-controller-manager-dockercfg-dzlvd" Sep 30 07:47:50 crc kubenswrapper[4760]: I0930 07:47:50.563394 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-9f4696d94-j5vp7"] Sep 30 07:47:50 crc kubenswrapper[4760]: I0930 07:47:50.583041 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5xzwl\" (UniqueName: \"kubernetes.io/projected/28a5f605-2c82-4747-8b3d-2704804e81ec-kube-api-access-5xzwl\") pod \"horizon-operator-controller-manager-9f4696d94-j5vp7\" (UID: \"28a5f605-2c82-4747-8b3d-2704804e81ec\") " pod="openstack-operators/horizon-operator-controller-manager-9f4696d94-j5vp7" Sep 30 07:47:50 crc kubenswrapper[4760]: I0930 07:47:50.583121 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rfjjt\" (UniqueName: \"kubernetes.io/projected/7fdb76d3-726a-416a-9b64-df2d6a67d88a-kube-api-access-rfjjt\") pod \"cinder-operator-controller-manager-644bddb6d8-s4vf9\" (UID: \"7fdb76d3-726a-416a-9b64-df2d6a67d88a\") " pod="openstack-operators/cinder-operator-controller-manager-644bddb6d8-s4vf9" Sep 30 07:47:50 crc kubenswrapper[4760]: I0930 07:47:50.583155 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8lpxs\" (UniqueName: \"kubernetes.io/projected/90fe11d3-6b6b-46c3-9833-d68d080144b9-kube-api-access-8lpxs\") pod \"barbican-operator-controller-manager-6ff8b75857-n528w\" (UID: \"90fe11d3-6b6b-46c3-9833-d68d080144b9\") " pod="openstack-operators/barbican-operator-controller-manager-6ff8b75857-n528w" Sep 30 07:47:50 crc kubenswrapper[4760]: I0930 07:47:50.583224 4760 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m8s9g\" (UniqueName: \"kubernetes.io/projected/fff7432b-8ea3-4b35-8726-640f02bd8d58-kube-api-access-m8s9g\") pod \"heat-operator-controller-manager-5d889d78cf-7brm5\" (UID: \"fff7432b-8ea3-4b35-8726-640f02bd8d58\") " pod="openstack-operators/heat-operator-controller-manager-5d889d78cf-7brm5" Sep 30 07:47:50 crc kubenswrapper[4760]: I0930 07:47:50.583266 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jw8lf\" (UniqueName: \"kubernetes.io/projected/0673130a-0175-41b4-a8d8-188c7a39caa0-kube-api-access-jw8lf\") pod \"designate-operator-controller-manager-84f4f7b77b-2zwsf\" (UID: \"0673130a-0175-41b4-a8d8-188c7a39caa0\") " pod="openstack-operators/designate-operator-controller-manager-84f4f7b77b-2zwsf" Sep 30 07:47:50 crc kubenswrapper[4760]: I0930 07:47:50.583335 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dl7ft\" (UniqueName: \"kubernetes.io/projected/93eb25ad-5a9d-4044-ba79-8869b28787dd-kube-api-access-dl7ft\") pod \"glance-operator-controller-manager-84958c4d49-mcrx5\" (UID: \"93eb25ad-5a9d-4044-ba79-8869b28787dd\") " pod="openstack-operators/glance-operator-controller-manager-84958c4d49-mcrx5" Sep 30 07:47:50 crc kubenswrapper[4760]: I0930 07:47:50.650680 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8lpxs\" (UniqueName: \"kubernetes.io/projected/90fe11d3-6b6b-46c3-9833-d68d080144b9-kube-api-access-8lpxs\") pod \"barbican-operator-controller-manager-6ff8b75857-n528w\" (UID: \"90fe11d3-6b6b-46c3-9833-d68d080144b9\") " pod="openstack-operators/barbican-operator-controller-manager-6ff8b75857-n528w" Sep 30 07:47:50 crc kubenswrapper[4760]: I0930 07:47:50.650721 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jw8lf\" (UniqueName: 
\"kubernetes.io/projected/0673130a-0175-41b4-a8d8-188c7a39caa0-kube-api-access-jw8lf\") pod \"designate-operator-controller-manager-84f4f7b77b-2zwsf\" (UID: \"0673130a-0175-41b4-a8d8-188c7a39caa0\") " pod="openstack-operators/designate-operator-controller-manager-84f4f7b77b-2zwsf" Sep 30 07:47:50 crc kubenswrapper[4760]: I0930 07:47:50.650924 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rfjjt\" (UniqueName: \"kubernetes.io/projected/7fdb76d3-726a-416a-9b64-df2d6a67d88a-kube-api-access-rfjjt\") pod \"cinder-operator-controller-manager-644bddb6d8-s4vf9\" (UID: \"7fdb76d3-726a-416a-9b64-df2d6a67d88a\") " pod="openstack-operators/cinder-operator-controller-manager-644bddb6d8-s4vf9" Sep 30 07:47:50 crc kubenswrapper[4760]: I0930 07:47:50.653390 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/infra-operator-controller-manager-7d857cc749-nqdfs"] Sep 30 07:47:50 crc kubenswrapper[4760]: I0930 07:47:50.654478 4760 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-7d857cc749-nqdfs" Sep 30 07:47:50 crc kubenswrapper[4760]: I0930 07:47:50.656894 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-webhook-server-cert" Sep 30 07:47:50 crc kubenswrapper[4760]: I0930 07:47:50.657372 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-controller-manager-dockercfg-5lmnk" Sep 30 07:47:50 crc kubenswrapper[4760]: I0930 07:47:50.657845 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dl7ft\" (UniqueName: \"kubernetes.io/projected/93eb25ad-5a9d-4044-ba79-8869b28787dd-kube-api-access-dl7ft\") pod \"glance-operator-controller-manager-84958c4d49-mcrx5\" (UID: \"93eb25ad-5a9d-4044-ba79-8869b28787dd\") " pod="openstack-operators/glance-operator-controller-manager-84958c4d49-mcrx5" Sep 30 07:47:50 crc kubenswrapper[4760]: I0930 07:47:50.661363 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-7d857cc749-nqdfs"] Sep 30 07:47:50 crc kubenswrapper[4760]: I0930 07:47:50.681964 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ironic-operator-controller-manager-7975b88857-7d78g"] Sep 30 07:47:50 crc kubenswrapper[4760]: I0930 07:47:50.683129 4760 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-7975b88857-7d78g" Sep 30 07:47:50 crc kubenswrapper[4760]: I0930 07:47:50.683839 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/18445f5d-fbcd-4bdb-9b9b-ed15e8a6b75b-cert\") pod \"infra-operator-controller-manager-7d857cc749-nqdfs\" (UID: \"18445f5d-fbcd-4bdb-9b9b-ed15e8a6b75b\") " pod="openstack-operators/infra-operator-controller-manager-7d857cc749-nqdfs" Sep 30 07:47:50 crc kubenswrapper[4760]: I0930 07:47:50.683960 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5xzwl\" (UniqueName: \"kubernetes.io/projected/28a5f605-2c82-4747-8b3d-2704804e81ec-kube-api-access-5xzwl\") pod \"horizon-operator-controller-manager-9f4696d94-j5vp7\" (UID: \"28a5f605-2c82-4747-8b3d-2704804e81ec\") " pod="openstack-operators/horizon-operator-controller-manager-9f4696d94-j5vp7" Sep 30 07:47:50 crc kubenswrapper[4760]: I0930 07:47:50.684055 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m8s9g\" (UniqueName: \"kubernetes.io/projected/fff7432b-8ea3-4b35-8726-640f02bd8d58-kube-api-access-m8s9g\") pod \"heat-operator-controller-manager-5d889d78cf-7brm5\" (UID: \"fff7432b-8ea3-4b35-8726-640f02bd8d58\") " pod="openstack-operators/heat-operator-controller-manager-5d889d78cf-7brm5" Sep 30 07:47:50 crc kubenswrapper[4760]: I0930 07:47:50.684136 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rrrv4\" (UniqueName: \"kubernetes.io/projected/18445f5d-fbcd-4bdb-9b9b-ed15e8a6b75b-kube-api-access-rrrv4\") pod \"infra-operator-controller-manager-7d857cc749-nqdfs\" (UID: \"18445f5d-fbcd-4bdb-9b9b-ed15e8a6b75b\") " pod="openstack-operators/infra-operator-controller-manager-7d857cc749-nqdfs" Sep 30 07:47:50 crc kubenswrapper[4760]: I0930 07:47:50.692726 4760 
reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ironic-operator-controller-manager-dockercfg-zdg4n" Sep 30 07:47:50 crc kubenswrapper[4760]: I0930 07:47:50.694176 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/keystone-operator-controller-manager-5bd55b4bff-rvrn9"] Sep 30 07:47:50 crc kubenswrapper[4760]: I0930 07:47:50.698378 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-5bd55b4bff-rvrn9" Sep 30 07:47:50 crc kubenswrapper[4760]: I0930 07:47:50.705753 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"keystone-operator-controller-manager-dockercfg-gth69" Sep 30 07:47:50 crc kubenswrapper[4760]: I0930 07:47:50.715741 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-7975b88857-7d78g"] Sep 30 07:47:50 crc kubenswrapper[4760]: I0930 07:47:50.728175 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-5bd55b4bff-rvrn9"] Sep 30 07:47:50 crc kubenswrapper[4760]: I0930 07:47:50.731208 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m8s9g\" (UniqueName: \"kubernetes.io/projected/fff7432b-8ea3-4b35-8726-640f02bd8d58-kube-api-access-m8s9g\") pod \"heat-operator-controller-manager-5d889d78cf-7brm5\" (UID: \"fff7432b-8ea3-4b35-8726-640f02bd8d58\") " pod="openstack-operators/heat-operator-controller-manager-5d889d78cf-7brm5" Sep 30 07:47:50 crc kubenswrapper[4760]: I0930 07:47:50.734846 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5xzwl\" (UniqueName: \"kubernetes.io/projected/28a5f605-2c82-4747-8b3d-2704804e81ec-kube-api-access-5xzwl\") pod \"horizon-operator-controller-manager-9f4696d94-j5vp7\" (UID: \"28a5f605-2c82-4747-8b3d-2704804e81ec\") " 
pod="openstack-operators/horizon-operator-controller-manager-9f4696d94-j5vp7" Sep 30 07:47:50 crc kubenswrapper[4760]: I0930 07:47:50.738789 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-6ff8b75857-n528w" Sep 30 07:47:50 crc kubenswrapper[4760]: I0930 07:47:50.750763 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/manila-operator-controller-manager-6d68dbc695-5k4x2"] Sep 30 07:47:50 crc kubenswrapper[4760]: I0930 07:47:50.751934 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-6d68dbc695-5k4x2" Sep 30 07:47:50 crc kubenswrapper[4760]: I0930 07:47:50.756746 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"manila-operator-controller-manager-dockercfg-w9nk6" Sep 30 07:47:50 crc kubenswrapper[4760]: I0930 07:47:50.759715 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-644bddb6d8-s4vf9" Sep 30 07:47:50 crc kubenswrapper[4760]: I0930 07:47:50.778504 4760 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-84f4f7b77b-2zwsf" Sep 30 07:47:50 crc kubenswrapper[4760]: I0930 07:47:50.785663 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j62s6\" (UniqueName: \"kubernetes.io/projected/27eb71ec-2145-426e-86fd-f31166b969e8-kube-api-access-j62s6\") pod \"keystone-operator-controller-manager-5bd55b4bff-rvrn9\" (UID: \"27eb71ec-2145-426e-86fd-f31166b969e8\") " pod="openstack-operators/keystone-operator-controller-manager-5bd55b4bff-rvrn9" Sep 30 07:47:50 crc kubenswrapper[4760]: I0930 07:47:50.785732 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/18445f5d-fbcd-4bdb-9b9b-ed15e8a6b75b-cert\") pod \"infra-operator-controller-manager-7d857cc749-nqdfs\" (UID: \"18445f5d-fbcd-4bdb-9b9b-ed15e8a6b75b\") " pod="openstack-operators/infra-operator-controller-manager-7d857cc749-nqdfs" Sep 30 07:47:50 crc kubenswrapper[4760]: I0930 07:47:50.785762 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-68kkb\" (UniqueName: \"kubernetes.io/projected/626c03a1-0630-42af-a1c4-af6e2c3584a5-kube-api-access-68kkb\") pod \"ironic-operator-controller-manager-7975b88857-7d78g\" (UID: \"626c03a1-0630-42af-a1c4-af6e2c3584a5\") " pod="openstack-operators/ironic-operator-controller-manager-7975b88857-7d78g" Sep 30 07:47:50 crc kubenswrapper[4760]: I0930 07:47:50.785815 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rrrv4\" (UniqueName: \"kubernetes.io/projected/18445f5d-fbcd-4bdb-9b9b-ed15e8a6b75b-kube-api-access-rrrv4\") pod \"infra-operator-controller-manager-7d857cc749-nqdfs\" (UID: \"18445f5d-fbcd-4bdb-9b9b-ed15e8a6b75b\") " pod="openstack-operators/infra-operator-controller-manager-7d857cc749-nqdfs" Sep 30 07:47:50 crc kubenswrapper[4760]: I0930 
07:47:50.785845 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-64b7d\" (UniqueName: \"kubernetes.io/projected/26e8229c-cd7b-4eab-a36c-e94d5a367224-kube-api-access-64b7d\") pod \"manila-operator-controller-manager-6d68dbc695-5k4x2\" (UID: \"26e8229c-cd7b-4eab-a36c-e94d5a367224\") " pod="openstack-operators/manila-operator-controller-manager-6d68dbc695-5k4x2" Sep 30 07:47:50 crc kubenswrapper[4760]: E0930 07:47:50.785949 4760 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Sep 30 07:47:50 crc kubenswrapper[4760]: E0930 07:47:50.786018 4760 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/18445f5d-fbcd-4bdb-9b9b-ed15e8a6b75b-cert podName:18445f5d-fbcd-4bdb-9b9b-ed15e8a6b75b nodeName:}" failed. No retries permitted until 2025-09-30 07:47:51.285998799 +0000 UTC m=+856.928905211 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/18445f5d-fbcd-4bdb-9b9b-ed15e8a6b75b-cert") pod "infra-operator-controller-manager-7d857cc749-nqdfs" (UID: "18445f5d-fbcd-4bdb-9b9b-ed15e8a6b75b") : secret "infra-operator-webhook-server-cert" not found Sep 30 07:47:50 crc kubenswrapper[4760]: I0930 07:47:50.788882 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/neutron-operator-controller-manager-64d7b59854-gnfhj"] Sep 30 07:47:50 crc kubenswrapper[4760]: I0930 07:47:50.790019 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-64d7b59854-gnfhj" Sep 30 07:47:50 crc kubenswrapper[4760]: I0930 07:47:50.791024 4760 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-84958c4d49-mcrx5" Sep 30 07:47:50 crc kubenswrapper[4760]: I0930 07:47:50.792458 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"neutron-operator-controller-manager-dockercfg-7cxv9" Sep 30 07:47:50 crc kubenswrapper[4760]: I0930 07:47:50.795554 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-88c7-69mqs"] Sep 30 07:47:50 crc kubenswrapper[4760]: I0930 07:47:50.796618 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-88c7-69mqs" Sep 30 07:47:50 crc kubenswrapper[4760]: I0930 07:47:50.797847 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"mariadb-operator-controller-manager-dockercfg-vqzkt" Sep 30 07:47:50 crc kubenswrapper[4760]: I0930 07:47:50.803145 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rrrv4\" (UniqueName: \"kubernetes.io/projected/18445f5d-fbcd-4bdb-9b9b-ed15e8a6b75b-kube-api-access-rrrv4\") pod \"infra-operator-controller-manager-7d857cc749-nqdfs\" (UID: \"18445f5d-fbcd-4bdb-9b9b-ed15e8a6b75b\") " pod="openstack-operators/infra-operator-controller-manager-7d857cc749-nqdfs" Sep 30 07:47:50 crc kubenswrapper[4760]: I0930 07:47:50.805408 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-6d68dbc695-5k4x2"] Sep 30 07:47:50 crc kubenswrapper[4760]: I0930 07:47:50.813968 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/nova-operator-controller-manager-c7c776c96-95qms"] Sep 30 07:47:50 crc kubenswrapper[4760]: I0930 07:47:50.815123 4760 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-c7c776c96-95qms" Sep 30 07:47:50 crc kubenswrapper[4760]: I0930 07:47:50.818635 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/octavia-operator-controller-manager-76fcc6dc7c-p2qg2"] Sep 30 07:47:50 crc kubenswrapper[4760]: I0930 07:47:50.819675 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-76fcc6dc7c-p2qg2" Sep 30 07:47:50 crc kubenswrapper[4760]: I0930 07:47:50.820210 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"nova-operator-controller-manager-dockercfg-m9xvl" Sep 30 07:47:50 crc kubenswrapper[4760]: I0930 07:47:50.821909 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"octavia-operator-controller-manager-dockercfg-wwflx" Sep 30 07:47:50 crc kubenswrapper[4760]: I0930 07:47:50.822378 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-c7c776c96-95qms"] Sep 30 07:47:50 crc kubenswrapper[4760]: I0930 07:47:50.829254 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-64d7b59854-gnfhj"] Sep 30 07:47:50 crc kubenswrapper[4760]: I0930 07:47:50.835674 4760 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-5d889d78cf-7brm5" Sep 30 07:47:50 crc kubenswrapper[4760]: I0930 07:47:50.836081 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-76fcc6dc7c-p2qg2"] Sep 30 07:47:50 crc kubenswrapper[4760]: I0930 07:47:50.841746 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-88c7-69mqs"] Sep 30 07:47:50 crc kubenswrapper[4760]: I0930 07:47:50.847722 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-6d776955-qdj9t"] Sep 30 07:47:50 crc kubenswrapper[4760]: I0930 07:47:50.849014 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-6d776955-qdj9t" Sep 30 07:47:50 crc kubenswrapper[4760]: I0930 07:47:50.851586 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-webhook-server-cert" Sep 30 07:47:50 crc kubenswrapper[4760]: I0930 07:47:50.852007 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-controller-manager-dockercfg-pnr88" Sep 30 07:47:50 crc kubenswrapper[4760]: I0930 07:47:50.855124 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-6d776955-qdj9t"] Sep 30 07:47:50 crc kubenswrapper[4760]: I0930 07:47:50.858372 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ovn-operator-controller-manager-9976ff44c-64gg8"] Sep 30 07:47:50 crc kubenswrapper[4760]: I0930 07:47:50.859487 4760 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-9976ff44c-64gg8" Sep 30 07:47:50 crc kubenswrapper[4760]: I0930 07:47:50.861699 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ovn-operator-controller-manager-dockercfg-vflsc" Sep 30 07:47:50 crc kubenswrapper[4760]: I0930 07:47:50.866417 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/placement-operator-controller-manager-589c58c6c-mzwdq"] Sep 30 07:47:50 crc kubenswrapper[4760]: I0930 07:47:50.867431 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-589c58c6c-mzwdq" Sep 30 07:47:50 crc kubenswrapper[4760]: I0930 07:47:50.869152 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"placement-operator-controller-manager-dockercfg-hbjn7" Sep 30 07:47:50 crc kubenswrapper[4760]: I0930 07:47:50.871295 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-589c58c6c-mzwdq"] Sep 30 07:47:50 crc kubenswrapper[4760]: I0930 07:47:50.876444 4760 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-9f4696d94-j5vp7" Sep 30 07:47:50 crc kubenswrapper[4760]: I0930 07:47:50.879798 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/swift-operator-controller-manager-bc7dc7bd9-kbsdn"] Sep 30 07:47:50 crc kubenswrapper[4760]: I0930 07:47:50.889185 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-68kkb\" (UniqueName: \"kubernetes.io/projected/626c03a1-0630-42af-a1c4-af6e2c3584a5-kube-api-access-68kkb\") pod \"ironic-operator-controller-manager-7975b88857-7d78g\" (UID: \"626c03a1-0630-42af-a1c4-af6e2c3584a5\") " pod="openstack-operators/ironic-operator-controller-manager-7975b88857-7d78g" Sep 30 07:47:50 crc kubenswrapper[4760]: I0930 07:47:50.889282 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-64b7d\" (UniqueName: \"kubernetes.io/projected/26e8229c-cd7b-4eab-a36c-e94d5a367224-kube-api-access-64b7d\") pod \"manila-operator-controller-manager-6d68dbc695-5k4x2\" (UID: \"26e8229c-cd7b-4eab-a36c-e94d5a367224\") " pod="openstack-operators/manila-operator-controller-manager-6d68dbc695-5k4x2" Sep 30 07:47:50 crc kubenswrapper[4760]: I0930 07:47:50.889334 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j62s6\" (UniqueName: \"kubernetes.io/projected/27eb71ec-2145-426e-86fd-f31166b969e8-kube-api-access-j62s6\") pod \"keystone-operator-controller-manager-5bd55b4bff-rvrn9\" (UID: \"27eb71ec-2145-426e-86fd-f31166b969e8\") " pod="openstack-operators/keystone-operator-controller-manager-5bd55b4bff-rvrn9" Sep 30 07:47:50 crc kubenswrapper[4760]: I0930 07:47:50.910700 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-9976ff44c-64gg8"] Sep 30 07:47:50 crc kubenswrapper[4760]: I0930 07:47:50.910736 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack-operators/swift-operator-controller-manager-bc7dc7bd9-kbsdn"] Sep 30 07:47:50 crc kubenswrapper[4760]: I0930 07:47:50.910811 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-bc7dc7bd9-kbsdn" Sep 30 07:47:50 crc kubenswrapper[4760]: I0930 07:47:50.923521 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"swift-operator-controller-manager-dockercfg-kk2xn" Sep 30 07:47:50 crc kubenswrapper[4760]: I0930 07:47:50.924085 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-b8d54b5d7-wwngp"] Sep 30 07:47:50 crc kubenswrapper[4760]: I0930 07:47:50.927937 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-b8d54b5d7-wwngp" Sep 30 07:47:50 crc kubenswrapper[4760]: I0930 07:47:50.929579 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j62s6\" (UniqueName: \"kubernetes.io/projected/27eb71ec-2145-426e-86fd-f31166b969e8-kube-api-access-j62s6\") pod \"keystone-operator-controller-manager-5bd55b4bff-rvrn9\" (UID: \"27eb71ec-2145-426e-86fd-f31166b969e8\") " pod="openstack-operators/keystone-operator-controller-manager-5bd55b4bff-rvrn9" Sep 30 07:47:50 crc kubenswrapper[4760]: I0930 07:47:50.930994 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-64b7d\" (UniqueName: \"kubernetes.io/projected/26e8229c-cd7b-4eab-a36c-e94d5a367224-kube-api-access-64b7d\") pod \"manila-operator-controller-manager-6d68dbc695-5k4x2\" (UID: \"26e8229c-cd7b-4eab-a36c-e94d5a367224\") " pod="openstack-operators/manila-operator-controller-manager-6d68dbc695-5k4x2" Sep 30 07:47:50 crc kubenswrapper[4760]: I0930 07:47:50.932313 4760 reflector.go:368] Caches populated for *v1.Secret from 
object-"openstack-operators"/"telemetry-operator-controller-manager-dockercfg-ftsgt" Sep 30 07:47:50 crc kubenswrapper[4760]: I0930 07:47:50.934917 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-68kkb\" (UniqueName: \"kubernetes.io/projected/626c03a1-0630-42af-a1c4-af6e2c3584a5-kube-api-access-68kkb\") pod \"ironic-operator-controller-manager-7975b88857-7d78g\" (UID: \"626c03a1-0630-42af-a1c4-af6e2c3584a5\") " pod="openstack-operators/ironic-operator-controller-manager-7975b88857-7d78g" Sep 30 07:47:50 crc kubenswrapper[4760]: I0930 07:47:50.953395 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-b8d54b5d7-wwngp"] Sep 30 07:47:50 crc kubenswrapper[4760]: I0930 07:47:50.990626 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-26j82\" (UniqueName: \"kubernetes.io/projected/7fac6c59-9344-46b8-b4ce-30b80c6a8b53-kube-api-access-26j82\") pod \"openstack-baremetal-operator-controller-manager-6d776955-qdj9t\" (UID: \"7fac6c59-9344-46b8-b4ce-30b80c6a8b53\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-6d776955-qdj9t" Sep 30 07:47:50 crc kubenswrapper[4760]: I0930 07:47:50.990667 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kvdq7\" (UniqueName: \"kubernetes.io/projected/070b883a-da84-454e-a2d3-cc43fbf5251a-kube-api-access-kvdq7\") pod \"neutron-operator-controller-manager-64d7b59854-gnfhj\" (UID: \"070b883a-da84-454e-a2d3-cc43fbf5251a\") " pod="openstack-operators/neutron-operator-controller-manager-64d7b59854-gnfhj" Sep 30 07:47:50 crc kubenswrapper[4760]: I0930 07:47:50.990710 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d2z9c\" (UniqueName: 
\"kubernetes.io/projected/8bc774be-38eb-4c0a-9c02-fb39c645cc28-kube-api-access-d2z9c\") pod \"ovn-operator-controller-manager-9976ff44c-64gg8\" (UID: \"8bc774be-38eb-4c0a-9c02-fb39c645cc28\") " pod="openstack-operators/ovn-operator-controller-manager-9976ff44c-64gg8" Sep 30 07:47:50 crc kubenswrapper[4760]: I0930 07:47:50.990727 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7fac6c59-9344-46b8-b4ce-30b80c6a8b53-cert\") pod \"openstack-baremetal-operator-controller-manager-6d776955-qdj9t\" (UID: \"7fac6c59-9344-46b8-b4ce-30b80c6a8b53\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-6d776955-qdj9t" Sep 30 07:47:50 crc kubenswrapper[4760]: I0930 07:47:50.990766 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hgpjb\" (UniqueName: \"kubernetes.io/projected/000713c9-22e2-4251-b81d-e1d47a48184e-kube-api-access-hgpjb\") pod \"placement-operator-controller-manager-589c58c6c-mzwdq\" (UID: \"000713c9-22e2-4251-b81d-e1d47a48184e\") " pod="openstack-operators/placement-operator-controller-manager-589c58c6c-mzwdq" Sep 30 07:47:50 crc kubenswrapper[4760]: I0930 07:47:50.990795 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rv7m6\" (UniqueName: \"kubernetes.io/projected/c0be2186-ebe8-4634-942e-fcf6f5c0fdf6-kube-api-access-rv7m6\") pod \"nova-operator-controller-manager-c7c776c96-95qms\" (UID: \"c0be2186-ebe8-4634-942e-fcf6f5c0fdf6\") " pod="openstack-operators/nova-operator-controller-manager-c7c776c96-95qms" Sep 30 07:47:50 crc kubenswrapper[4760]: I0930 07:47:50.990809 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rr7d8\" (UniqueName: \"kubernetes.io/projected/c4bab529-6936-4f18-b4c9-4d8202e1cf6a-kube-api-access-rr7d8\") pod 
\"mariadb-operator-controller-manager-88c7-69mqs\" (UID: \"c4bab529-6936-4f18-b4c9-4d8202e1cf6a\") " pod="openstack-operators/mariadb-operator-controller-manager-88c7-69mqs" Sep 30 07:47:50 crc kubenswrapper[4760]: I0930 07:47:50.990829 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-th2df\" (UniqueName: \"kubernetes.io/projected/be15e869-eae3-4164-a9b3-ba2d16238186-kube-api-access-th2df\") pod \"octavia-operator-controller-manager-76fcc6dc7c-p2qg2\" (UID: \"be15e869-eae3-4164-a9b3-ba2d16238186\") " pod="openstack-operators/octavia-operator-controller-manager-76fcc6dc7c-p2qg2" Sep 30 07:47:51 crc kubenswrapper[4760]: I0930 07:47:51.112251 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-7975b88857-7d78g" Sep 30 07:47:51 crc kubenswrapper[4760]: I0930 07:47:51.116719 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d2z9c\" (UniqueName: \"kubernetes.io/projected/8bc774be-38eb-4c0a-9c02-fb39c645cc28-kube-api-access-d2z9c\") pod \"ovn-operator-controller-manager-9976ff44c-64gg8\" (UID: \"8bc774be-38eb-4c0a-9c02-fb39c645cc28\") " pod="openstack-operators/ovn-operator-controller-manager-9976ff44c-64gg8" Sep 30 07:47:51 crc kubenswrapper[4760]: I0930 07:47:51.121677 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7fac6c59-9344-46b8-b4ce-30b80c6a8b53-cert\") pod \"openstack-baremetal-operator-controller-manager-6d776955-qdj9t\" (UID: \"7fac6c59-9344-46b8-b4ce-30b80c6a8b53\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-6d776955-qdj9t" Sep 30 07:47:51 crc kubenswrapper[4760]: I0930 07:47:51.121721 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zl79h\" (UniqueName: 
\"kubernetes.io/projected/7a562d30-ce00-4dca-9792-6687cf729825-kube-api-access-zl79h\") pod \"swift-operator-controller-manager-bc7dc7bd9-kbsdn\" (UID: \"7a562d30-ce00-4dca-9792-6687cf729825\") " pod="openstack-operators/swift-operator-controller-manager-bc7dc7bd9-kbsdn" Sep 30 07:47:51 crc kubenswrapper[4760]: I0930 07:47:51.121765 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b4q9l\" (UniqueName: \"kubernetes.io/projected/31832467-ab15-475b-a71b-7263e64cdff9-kube-api-access-b4q9l\") pod \"telemetry-operator-controller-manager-b8d54b5d7-wwngp\" (UID: \"31832467-ab15-475b-a71b-7263e64cdff9\") " pod="openstack-operators/telemetry-operator-controller-manager-b8d54b5d7-wwngp" Sep 30 07:47:51 crc kubenswrapper[4760]: I0930 07:47:51.121816 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hgpjb\" (UniqueName: \"kubernetes.io/projected/000713c9-22e2-4251-b81d-e1d47a48184e-kube-api-access-hgpjb\") pod \"placement-operator-controller-manager-589c58c6c-mzwdq\" (UID: \"000713c9-22e2-4251-b81d-e1d47a48184e\") " pod="openstack-operators/placement-operator-controller-manager-589c58c6c-mzwdq" Sep 30 07:47:51 crc kubenswrapper[4760]: I0930 07:47:51.121868 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rv7m6\" (UniqueName: \"kubernetes.io/projected/c0be2186-ebe8-4634-942e-fcf6f5c0fdf6-kube-api-access-rv7m6\") pod \"nova-operator-controller-manager-c7c776c96-95qms\" (UID: \"c0be2186-ebe8-4634-942e-fcf6f5c0fdf6\") " pod="openstack-operators/nova-operator-controller-manager-c7c776c96-95qms" Sep 30 07:47:51 crc kubenswrapper[4760]: I0930 07:47:51.121888 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rr7d8\" (UniqueName: \"kubernetes.io/projected/c4bab529-6936-4f18-b4c9-4d8202e1cf6a-kube-api-access-rr7d8\") pod \"mariadb-operator-controller-manager-88c7-69mqs\" 
(UID: \"c4bab529-6936-4f18-b4c9-4d8202e1cf6a\") " pod="openstack-operators/mariadb-operator-controller-manager-88c7-69mqs" Sep 30 07:47:51 crc kubenswrapper[4760]: I0930 07:47:51.121921 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-th2df\" (UniqueName: \"kubernetes.io/projected/be15e869-eae3-4164-a9b3-ba2d16238186-kube-api-access-th2df\") pod \"octavia-operator-controller-manager-76fcc6dc7c-p2qg2\" (UID: \"be15e869-eae3-4164-a9b3-ba2d16238186\") " pod="openstack-operators/octavia-operator-controller-manager-76fcc6dc7c-p2qg2" Sep 30 07:47:51 crc kubenswrapper[4760]: I0930 07:47:51.121986 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-26j82\" (UniqueName: \"kubernetes.io/projected/7fac6c59-9344-46b8-b4ce-30b80c6a8b53-kube-api-access-26j82\") pod \"openstack-baremetal-operator-controller-manager-6d776955-qdj9t\" (UID: \"7fac6c59-9344-46b8-b4ce-30b80c6a8b53\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-6d776955-qdj9t" Sep 30 07:47:51 crc kubenswrapper[4760]: I0930 07:47:51.122025 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kvdq7\" (UniqueName: \"kubernetes.io/projected/070b883a-da84-454e-a2d3-cc43fbf5251a-kube-api-access-kvdq7\") pod \"neutron-operator-controller-manager-64d7b59854-gnfhj\" (UID: \"070b883a-da84-454e-a2d3-cc43fbf5251a\") " pod="openstack-operators/neutron-operator-controller-manager-64d7b59854-gnfhj" Sep 30 07:47:51 crc kubenswrapper[4760]: I0930 07:47:51.117903 4760 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-6d68dbc695-5k4x2" Sep 30 07:47:51 crc kubenswrapper[4760]: E0930 07:47:51.122844 4760 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Sep 30 07:47:51 crc kubenswrapper[4760]: E0930 07:47:51.122914 4760 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7fac6c59-9344-46b8-b4ce-30b80c6a8b53-cert podName:7fac6c59-9344-46b8-b4ce-30b80c6a8b53 nodeName:}" failed. No retries permitted until 2025-09-30 07:47:51.622889571 +0000 UTC m=+857.265795983 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/7fac6c59-9344-46b8-b4ce-30b80c6a8b53-cert") pod "openstack-baremetal-operator-controller-manager-6d776955-qdj9t" (UID: "7fac6c59-9344-46b8-b4ce-30b80c6a8b53") : secret "openstack-baremetal-operator-webhook-server-cert" not found Sep 30 07:47:51 crc kubenswrapper[4760]: I0930 07:47:51.118028 4760 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-5bd55b4bff-rvrn9" Sep 30 07:47:51 crc kubenswrapper[4760]: I0930 07:47:51.155261 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hgpjb\" (UniqueName: \"kubernetes.io/projected/000713c9-22e2-4251-b81d-e1d47a48184e-kube-api-access-hgpjb\") pod \"placement-operator-controller-manager-589c58c6c-mzwdq\" (UID: \"000713c9-22e2-4251-b81d-e1d47a48184e\") " pod="openstack-operators/placement-operator-controller-manager-589c58c6c-mzwdq" Sep 30 07:47:51 crc kubenswrapper[4760]: I0930 07:47:51.159898 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d2z9c\" (UniqueName: \"kubernetes.io/projected/8bc774be-38eb-4c0a-9c02-fb39c645cc28-kube-api-access-d2z9c\") pod \"ovn-operator-controller-manager-9976ff44c-64gg8\" (UID: \"8bc774be-38eb-4c0a-9c02-fb39c645cc28\") " pod="openstack-operators/ovn-operator-controller-manager-9976ff44c-64gg8" Sep 30 07:47:51 crc kubenswrapper[4760]: I0930 07:47:51.178706 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-th2df\" (UniqueName: \"kubernetes.io/projected/be15e869-eae3-4164-a9b3-ba2d16238186-kube-api-access-th2df\") pod \"octavia-operator-controller-manager-76fcc6dc7c-p2qg2\" (UID: \"be15e869-eae3-4164-a9b3-ba2d16238186\") " pod="openstack-operators/octavia-operator-controller-manager-76fcc6dc7c-p2qg2" Sep 30 07:47:51 crc kubenswrapper[4760]: I0930 07:47:51.179380 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-26j82\" (UniqueName: \"kubernetes.io/projected/7fac6c59-9344-46b8-b4ce-30b80c6a8b53-kube-api-access-26j82\") pod \"openstack-baremetal-operator-controller-manager-6d776955-qdj9t\" (UID: \"7fac6c59-9344-46b8-b4ce-30b80c6a8b53\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-6d776955-qdj9t" Sep 30 07:47:51 crc kubenswrapper[4760]: I0930 07:47:51.181509 4760 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rv7m6\" (UniqueName: \"kubernetes.io/projected/c0be2186-ebe8-4634-942e-fcf6f5c0fdf6-kube-api-access-rv7m6\") pod \"nova-operator-controller-manager-c7c776c96-95qms\" (UID: \"c0be2186-ebe8-4634-942e-fcf6f5c0fdf6\") " pod="openstack-operators/nova-operator-controller-manager-c7c776c96-95qms" Sep 30 07:47:51 crc kubenswrapper[4760]: I0930 07:47:51.183882 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kvdq7\" (UniqueName: \"kubernetes.io/projected/070b883a-da84-454e-a2d3-cc43fbf5251a-kube-api-access-kvdq7\") pod \"neutron-operator-controller-manager-64d7b59854-gnfhj\" (UID: \"070b883a-da84-454e-a2d3-cc43fbf5251a\") " pod="openstack-operators/neutron-operator-controller-manager-64d7b59854-gnfhj" Sep 30 07:47:51 crc kubenswrapper[4760]: I0930 07:47:51.191836 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rr7d8\" (UniqueName: \"kubernetes.io/projected/c4bab529-6936-4f18-b4c9-4d8202e1cf6a-kube-api-access-rr7d8\") pod \"mariadb-operator-controller-manager-88c7-69mqs\" (UID: \"c4bab529-6936-4f18-b4c9-4d8202e1cf6a\") " pod="openstack-operators/mariadb-operator-controller-manager-88c7-69mqs" Sep 30 07:47:51 crc kubenswrapper[4760]: I0930 07:47:51.209356 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/test-operator-controller-manager-f66b554c6-l4g6m"] Sep 30 07:47:51 crc kubenswrapper[4760]: I0930 07:47:51.210284 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-f66b554c6-l4g6m"] Sep 30 07:47:51 crc kubenswrapper[4760]: I0930 07:47:51.210322 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/watcher-operator-controller-manager-6c459b467f-xxhnp"] Sep 30 07:47:51 crc kubenswrapper[4760]: I0930 07:47:51.211898 4760 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-6c459b467f-xxhnp" Sep 30 07:47:51 crc kubenswrapper[4760]: I0930 07:47:51.212411 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/test-operator-controller-manager-f66b554c6-l4g6m" Sep 30 07:47:51 crc kubenswrapper[4760]: I0930 07:47:51.214823 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"test-operator-controller-manager-dockercfg-tjzpn" Sep 30 07:47:51 crc kubenswrapper[4760]: I0930 07:47:51.215026 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"watcher-operator-controller-manager-dockercfg-w9xxl" Sep 30 07:47:51 crc kubenswrapper[4760]: I0930 07:47:51.217890 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-6c459b467f-xxhnp"] Sep 30 07:47:51 crc kubenswrapper[4760]: I0930 07:47:51.222560 4760 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-76fcc6dc7c-p2qg2" Sep 30 07:47:51 crc kubenswrapper[4760]: I0930 07:47:51.222790 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dfsb5\" (UniqueName: \"kubernetes.io/projected/cd75a50f-b3a1-4bef-ac18-e574ef6815ec-kube-api-access-dfsb5\") pod \"test-operator-controller-manager-f66b554c6-l4g6m\" (UID: \"cd75a50f-b3a1-4bef-ac18-e574ef6815ec\") " pod="openstack-operators/test-operator-controller-manager-f66b554c6-l4g6m" Sep 30 07:47:51 crc kubenswrapper[4760]: I0930 07:47:51.222844 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zl79h\" (UniqueName: \"kubernetes.io/projected/7a562d30-ce00-4dca-9792-6687cf729825-kube-api-access-zl79h\") pod \"swift-operator-controller-manager-bc7dc7bd9-kbsdn\" (UID: \"7a562d30-ce00-4dca-9792-6687cf729825\") " pod="openstack-operators/swift-operator-controller-manager-bc7dc7bd9-kbsdn" Sep 30 07:47:51 crc kubenswrapper[4760]: I0930 07:47:51.222867 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b4q9l\" (UniqueName: \"kubernetes.io/projected/31832467-ab15-475b-a71b-7263e64cdff9-kube-api-access-b4q9l\") pod \"telemetry-operator-controller-manager-b8d54b5d7-wwngp\" (UID: \"31832467-ab15-475b-a71b-7263e64cdff9\") " pod="openstack-operators/telemetry-operator-controller-manager-b8d54b5d7-wwngp" Sep 30 07:47:51 crc kubenswrapper[4760]: I0930 07:47:51.222903 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c8ql8\" (UniqueName: \"kubernetes.io/projected/08bd2560-a223-4d1d-abf6-cf3686f1ded2-kube-api-access-c8ql8\") pod \"watcher-operator-controller-manager-6c459b467f-xxhnp\" (UID: \"08bd2560-a223-4d1d-abf6-cf3686f1ded2\") " pod="openstack-operators/watcher-operator-controller-manager-6c459b467f-xxhnp" Sep 30 07:47:51 crc 
kubenswrapper[4760]: I0930 07:47:51.236217 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-9976ff44c-64gg8" Sep 30 07:47:51 crc kubenswrapper[4760]: I0930 07:47:51.242094 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b4q9l\" (UniqueName: \"kubernetes.io/projected/31832467-ab15-475b-a71b-7263e64cdff9-kube-api-access-b4q9l\") pod \"telemetry-operator-controller-manager-b8d54b5d7-wwngp\" (UID: \"31832467-ab15-475b-a71b-7263e64cdff9\") " pod="openstack-operators/telemetry-operator-controller-manager-b8d54b5d7-wwngp" Sep 30 07:47:51 crc kubenswrapper[4760]: I0930 07:47:51.245976 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zl79h\" (UniqueName: \"kubernetes.io/projected/7a562d30-ce00-4dca-9792-6687cf729825-kube-api-access-zl79h\") pod \"swift-operator-controller-manager-bc7dc7bd9-kbsdn\" (UID: \"7a562d30-ce00-4dca-9792-6687cf729825\") " pod="openstack-operators/swift-operator-controller-manager-bc7dc7bd9-kbsdn" Sep 30 07:47:51 crc kubenswrapper[4760]: I0930 07:47:51.266633 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-manager-df6bd9948-rjq2r"] Sep 30 07:47:51 crc kubenswrapper[4760]: I0930 07:47:51.267832 4760 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-df6bd9948-rjq2r" Sep 30 07:47:51 crc kubenswrapper[4760]: I0930 07:47:51.276543 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"webhook-server-cert" Sep 30 07:47:51 crc kubenswrapper[4760]: I0930 07:47:51.276714 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-manager-dockercfg-6ts8p" Sep 30 07:47:51 crc kubenswrapper[4760]: I0930 07:47:51.295582 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-589c58c6c-mzwdq" Sep 30 07:47:51 crc kubenswrapper[4760]: I0930 07:47:51.297811 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-df6bd9948-rjq2r"] Sep 30 07:47:51 crc kubenswrapper[4760]: I0930 07:47:51.305028 4760 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-bc7dc7bd9-kbsdn" Sep 30 07:47:51 crc kubenswrapper[4760]: I0930 07:47:51.326454 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rlbqr\" (UniqueName: \"kubernetes.io/projected/fe420b73-f7ff-40e5-8b63-475e61942e3d-kube-api-access-rlbqr\") pod \"openstack-operator-controller-manager-df6bd9948-rjq2r\" (UID: \"fe420b73-f7ff-40e5-8b63-475e61942e3d\") " pod="openstack-operators/openstack-operator-controller-manager-df6bd9948-rjq2r" Sep 30 07:47:51 crc kubenswrapper[4760]: I0930 07:47:51.326561 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/18445f5d-fbcd-4bdb-9b9b-ed15e8a6b75b-cert\") pod \"infra-operator-controller-manager-7d857cc749-nqdfs\" (UID: \"18445f5d-fbcd-4bdb-9b9b-ed15e8a6b75b\") " pod="openstack-operators/infra-operator-controller-manager-7d857cc749-nqdfs" Sep 30 07:47:51 crc kubenswrapper[4760]: I0930 07:47:51.326593 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dfsb5\" (UniqueName: \"kubernetes.io/projected/cd75a50f-b3a1-4bef-ac18-e574ef6815ec-kube-api-access-dfsb5\") pod \"test-operator-controller-manager-f66b554c6-l4g6m\" (UID: \"cd75a50f-b3a1-4bef-ac18-e574ef6815ec\") " pod="openstack-operators/test-operator-controller-manager-f66b554c6-l4g6m" Sep 30 07:47:51 crc kubenswrapper[4760]: I0930 07:47:51.326680 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/fe420b73-f7ff-40e5-8b63-475e61942e3d-cert\") pod \"openstack-operator-controller-manager-df6bd9948-rjq2r\" (UID: \"fe420b73-f7ff-40e5-8b63-475e61942e3d\") " pod="openstack-operators/openstack-operator-controller-manager-df6bd9948-rjq2r" Sep 30 07:47:51 crc kubenswrapper[4760]: I0930 07:47:51.326703 4760 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-c8ql8\" (UniqueName: \"kubernetes.io/projected/08bd2560-a223-4d1d-abf6-cf3686f1ded2-kube-api-access-c8ql8\") pod \"watcher-operator-controller-manager-6c459b467f-xxhnp\" (UID: \"08bd2560-a223-4d1d-abf6-cf3686f1ded2\") " pod="openstack-operators/watcher-operator-controller-manager-6c459b467f-xxhnp" Sep 30 07:47:51 crc kubenswrapper[4760]: I0930 07:47:51.336446 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-79d8469568-whhn2"] Sep 30 07:47:51 crc kubenswrapper[4760]: I0930 07:47:51.337402 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-79d8469568-whhn2" Sep 30 07:47:51 crc kubenswrapper[4760]: I0930 07:47:51.338839 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/18445f5d-fbcd-4bdb-9b9b-ed15e8a6b75b-cert\") pod \"infra-operator-controller-manager-7d857cc749-nqdfs\" (UID: \"18445f5d-fbcd-4bdb-9b9b-ed15e8a6b75b\") " pod="openstack-operators/infra-operator-controller-manager-7d857cc749-nqdfs" Sep 30 07:47:51 crc kubenswrapper[4760]: I0930 07:47:51.340872 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"rabbitmq-cluster-operator-controller-manager-dockercfg-ln4wr" Sep 30 07:47:51 crc kubenswrapper[4760]: I0930 07:47:51.341560 4760 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-b8d54b5d7-wwngp" Sep 30 07:47:51 crc kubenswrapper[4760]: I0930 07:47:51.347801 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c8ql8\" (UniqueName: \"kubernetes.io/projected/08bd2560-a223-4d1d-abf6-cf3686f1ded2-kube-api-access-c8ql8\") pod \"watcher-operator-controller-manager-6c459b467f-xxhnp\" (UID: \"08bd2560-a223-4d1d-abf6-cf3686f1ded2\") " pod="openstack-operators/watcher-operator-controller-manager-6c459b467f-xxhnp" Sep 30 07:47:51 crc kubenswrapper[4760]: I0930 07:47:51.356381 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-79d8469568-whhn2"] Sep 30 07:47:51 crc kubenswrapper[4760]: I0930 07:47:51.372801 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dfsb5\" (UniqueName: \"kubernetes.io/projected/cd75a50f-b3a1-4bef-ac18-e574ef6815ec-kube-api-access-dfsb5\") pod \"test-operator-controller-manager-f66b554c6-l4g6m\" (UID: \"cd75a50f-b3a1-4bef-ac18-e574ef6815ec\") " pod="openstack-operators/test-operator-controller-manager-f66b554c6-l4g6m" Sep 30 07:47:51 crc kubenswrapper[4760]: I0930 07:47:51.427865 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z5pxt\" (UniqueName: \"kubernetes.io/projected/f7777c80-60ad-47c2-a76a-002f99b89d61-kube-api-access-z5pxt\") pod \"rabbitmq-cluster-operator-manager-79d8469568-whhn2\" (UID: \"f7777c80-60ad-47c2-a76a-002f99b89d61\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-79d8469568-whhn2" Sep 30 07:47:51 crc kubenswrapper[4760]: I0930 07:47:51.428184 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/fe420b73-f7ff-40e5-8b63-475e61942e3d-cert\") pod \"openstack-operator-controller-manager-df6bd9948-rjq2r\" (UID: 
\"fe420b73-f7ff-40e5-8b63-475e61942e3d\") " pod="openstack-operators/openstack-operator-controller-manager-df6bd9948-rjq2r" Sep 30 07:47:51 crc kubenswrapper[4760]: I0930 07:47:51.428229 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rlbqr\" (UniqueName: \"kubernetes.io/projected/fe420b73-f7ff-40e5-8b63-475e61942e3d-kube-api-access-rlbqr\") pod \"openstack-operator-controller-manager-df6bd9948-rjq2r\" (UID: \"fe420b73-f7ff-40e5-8b63-475e61942e3d\") " pod="openstack-operators/openstack-operator-controller-manager-df6bd9948-rjq2r" Sep 30 07:47:51 crc kubenswrapper[4760]: E0930 07:47:51.428660 4760 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Sep 30 07:47:51 crc kubenswrapper[4760]: E0930 07:47:51.428735 4760 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fe420b73-f7ff-40e5-8b63-475e61942e3d-cert podName:fe420b73-f7ff-40e5-8b63-475e61942e3d nodeName:}" failed. No retries permitted until 2025-09-30 07:47:51.928714478 +0000 UTC m=+857.571620890 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/fe420b73-f7ff-40e5-8b63-475e61942e3d-cert") pod "openstack-operator-controller-manager-df6bd9948-rjq2r" (UID: "fe420b73-f7ff-40e5-8b63-475e61942e3d") : secret "webhook-server-cert" not found Sep 30 07:47:51 crc kubenswrapper[4760]: I0930 07:47:51.435943 4760 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-64d7b59854-gnfhj" Sep 30 07:47:51 crc kubenswrapper[4760]: I0930 07:47:51.448793 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rlbqr\" (UniqueName: \"kubernetes.io/projected/fe420b73-f7ff-40e5-8b63-475e61942e3d-kube-api-access-rlbqr\") pod \"openstack-operator-controller-manager-df6bd9948-rjq2r\" (UID: \"fe420b73-f7ff-40e5-8b63-475e61942e3d\") " pod="openstack-operators/openstack-operator-controller-manager-df6bd9948-rjq2r" Sep 30 07:47:51 crc kubenswrapper[4760]: I0930 07:47:51.459188 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-88c7-69mqs" Sep 30 07:47:51 crc kubenswrapper[4760]: I0930 07:47:51.465458 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-c7c776c96-95qms" Sep 30 07:47:51 crc kubenswrapper[4760]: I0930 07:47:51.529941 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z5pxt\" (UniqueName: \"kubernetes.io/projected/f7777c80-60ad-47c2-a76a-002f99b89d61-kube-api-access-z5pxt\") pod \"rabbitmq-cluster-operator-manager-79d8469568-whhn2\" (UID: \"f7777c80-60ad-47c2-a76a-002f99b89d61\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-79d8469568-whhn2" Sep 30 07:47:51 crc kubenswrapper[4760]: I0930 07:47:51.544208 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z5pxt\" (UniqueName: \"kubernetes.io/projected/f7777c80-60ad-47c2-a76a-002f99b89d61-kube-api-access-z5pxt\") pod \"rabbitmq-cluster-operator-manager-79d8469568-whhn2\" (UID: \"f7777c80-60ad-47c2-a76a-002f99b89d61\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-79d8469568-whhn2" Sep 30 07:47:51 crc kubenswrapper[4760]: I0930 07:47:51.559644 4760 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-6c459b467f-xxhnp" Sep 30 07:47:51 crc kubenswrapper[4760]: I0930 07:47:51.576449 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/test-operator-controller-manager-f66b554c6-l4g6m" Sep 30 07:47:51 crc kubenswrapper[4760]: I0930 07:47:51.606959 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-7d857cc749-nqdfs" Sep 30 07:47:51 crc kubenswrapper[4760]: I0930 07:47:51.653702 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7fac6c59-9344-46b8-b4ce-30b80c6a8b53-cert\") pod \"openstack-baremetal-operator-controller-manager-6d776955-qdj9t\" (UID: \"7fac6c59-9344-46b8-b4ce-30b80c6a8b53\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-6d776955-qdj9t" Sep 30 07:47:51 crc kubenswrapper[4760]: I0930 07:47:51.658495 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-644bddb6d8-s4vf9"] Sep 30 07:47:51 crc kubenswrapper[4760]: I0930 07:47:51.658778 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7fac6c59-9344-46b8-b4ce-30b80c6a8b53-cert\") pod \"openstack-baremetal-operator-controller-manager-6d776955-qdj9t\" (UID: \"7fac6c59-9344-46b8-b4ce-30b80c6a8b53\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-6d776955-qdj9t" Sep 30 07:47:51 crc kubenswrapper[4760]: I0930 07:47:51.682613 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-79d8469568-whhn2" Sep 30 07:47:51 crc kubenswrapper[4760]: I0930 07:47:51.822458 4760 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-6d776955-qdj9t" Sep 30 07:47:51 crc kubenswrapper[4760]: W0930 07:47:51.825822 4760 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0673130a_0175_41b4_a8d8_188c7a39caa0.slice/crio-026d384e161b8fba63e410e219026324c73e9935adf8abee498e7e9291ab3a0c WatchSource:0}: Error finding container 026d384e161b8fba63e410e219026324c73e9935adf8abee498e7e9291ab3a0c: Status 404 returned error can't find the container with id 026d384e161b8fba63e410e219026324c73e9935adf8abee498e7e9291ab3a0c Sep 30 07:47:51 crc kubenswrapper[4760]: I0930 07:47:51.827613 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-84f4f7b77b-2zwsf"] Sep 30 07:47:51 crc kubenswrapper[4760]: I0930 07:47:51.837673 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-6ff8b75857-n528w"] Sep 30 07:47:51 crc kubenswrapper[4760]: I0930 07:47:51.863933 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-5d889d78cf-7brm5"] Sep 30 07:47:51 crc kubenswrapper[4760]: I0930 07:47:51.869102 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-9f4696d94-j5vp7"] Sep 30 07:47:51 crc kubenswrapper[4760]: W0930 07:47:51.882199 4760 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod90fe11d3_6b6b_46c3_9833_d68d080144b9.slice/crio-0f1681307e7a35571ac7ae0f71446f38b279cf3225197518f68bd65207819573 WatchSource:0}: Error finding container 0f1681307e7a35571ac7ae0f71446f38b279cf3225197518f68bd65207819573: Status 404 returned error can't find the container with id 0f1681307e7a35571ac7ae0f71446f38b279cf3225197518f68bd65207819573 Sep 30 07:47:51 
crc kubenswrapper[4760]: I0930 07:47:51.958210 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/fe420b73-f7ff-40e5-8b63-475e61942e3d-cert\") pod \"openstack-operator-controller-manager-df6bd9948-rjq2r\" (UID: \"fe420b73-f7ff-40e5-8b63-475e61942e3d\") " pod="openstack-operators/openstack-operator-controller-manager-df6bd9948-rjq2r" Sep 30 07:47:51 crc kubenswrapper[4760]: E0930 07:47:51.958392 4760 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Sep 30 07:47:51 crc kubenswrapper[4760]: E0930 07:47:51.958513 4760 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fe420b73-f7ff-40e5-8b63-475e61942e3d-cert podName:fe420b73-f7ff-40e5-8b63-475e61942e3d nodeName:}" failed. No retries permitted until 2025-09-30 07:47:52.958492799 +0000 UTC m=+858.601399211 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/fe420b73-f7ff-40e5-8b63-475e61942e3d-cert") pod "openstack-operator-controller-manager-df6bd9948-rjq2r" (UID: "fe420b73-f7ff-40e5-8b63-475e61942e3d") : secret "webhook-server-cert" not found Sep 30 07:47:52 crc kubenswrapper[4760]: I0930 07:47:52.229249 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-84958c4d49-mcrx5"] Sep 30 07:47:52 crc kubenswrapper[4760]: W0930 07:47:52.234215 4760 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod93eb25ad_5a9d_4044_ba79_8869b28787dd.slice/crio-c0faa3e982d3a02d6d7c6705016cf93afea6a9383198b5f62e9a88866e5b0957 WatchSource:0}: Error finding container c0faa3e982d3a02d6d7c6705016cf93afea6a9383198b5f62e9a88866e5b0957: Status 404 returned error can't find the container with id c0faa3e982d3a02d6d7c6705016cf93afea6a9383198b5f62e9a88866e5b0957 Sep 30 07:47:52 crc 
kubenswrapper[4760]: I0930 07:47:52.243263 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-7975b88857-7d78g"] Sep 30 07:47:52 crc kubenswrapper[4760]: I0930 07:47:52.251561 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-9976ff44c-64gg8"] Sep 30 07:47:52 crc kubenswrapper[4760]: I0930 07:47:52.256583 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-5bd55b4bff-rvrn9"] Sep 30 07:47:52 crc kubenswrapper[4760]: W0930 07:47:52.259037 4760 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8bc774be_38eb_4c0a_9c02_fb39c645cc28.slice/crio-918458567fd00ddf2cde3162d6c324934f42a699bdeba510923cbdcf75c435cb WatchSource:0}: Error finding container 918458567fd00ddf2cde3162d6c324934f42a699bdeba510923cbdcf75c435cb: Status 404 returned error can't find the container with id 918458567fd00ddf2cde3162d6c324934f42a699bdeba510923cbdcf75c435cb Sep 30 07:47:52 crc kubenswrapper[4760]: I0930 07:47:52.261069 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-6d68dbc695-5k4x2"] Sep 30 07:47:52 crc kubenswrapper[4760]: I0930 07:47:52.266493 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-76fcc6dc7c-p2qg2"] Sep 30 07:47:52 crc kubenswrapper[4760]: W0930 07:47:52.283959 4760 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbe15e869_eae3_4164_a9b3_ba2d16238186.slice/crio-ead1cecdca90620798482e862f56378b4a9733e061d6e51f2003e85348c9cef9 WatchSource:0}: Error finding container ead1cecdca90620798482e862f56378b4a9733e061d6e51f2003e85348c9cef9: Status 404 returned error can't find the container with id 
ead1cecdca90620798482e862f56378b4a9733e061d6e51f2003e85348c9cef9 Sep 30 07:47:52 crc kubenswrapper[4760]: I0930 07:47:52.285190 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-bc7dc7bd9-kbsdn"] Sep 30 07:47:52 crc kubenswrapper[4760]: I0930 07:47:52.291919 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-589c58c6c-mzwdq"] Sep 30 07:47:52 crc kubenswrapper[4760]: W0930 07:47:52.296929 4760 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7a562d30_ce00_4dca_9792_6687cf729825.slice/crio-37a6f308e8f1b936f2dca39bdca44ec0eb608c0b65f7b3d94d673f585b6b6876 WatchSource:0}: Error finding container 37a6f308e8f1b936f2dca39bdca44ec0eb608c0b65f7b3d94d673f585b6b6876: Status 404 returned error can't find the container with id 37a6f308e8f1b936f2dca39bdca44ec0eb608c0b65f7b3d94d673f585b6b6876 Sep 30 07:47:52 crc kubenswrapper[4760]: E0930 07:47:52.304228 4760 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/nova-operator@sha256:057de94f9afa340adc34f9b25f8007d9cd2ba71bc8b5d77aac522add53b7caef,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 --leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-rv7m6,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod nova-operator-controller-manager-c7c776c96-95qms_openstack-operators(c0be2186-ebe8-4634-942e-fcf6f5c0fdf6): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Sep 30 07:47:52 crc kubenswrapper[4760]: E0930 07:47:52.308190 4760 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/placement-operator@sha256:a6b3408d79df6b6d4a467e49defaa4a9d9c088c94d0605a4fee0030c9ccc84d2,Command:[/manager],Args:[--health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080 --leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-hgpjb,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
placement-operator-controller-manager-589c58c6c-mzwdq_openstack-operators(000713c9-22e2-4251-b81d-e1d47a48184e): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Sep 30 07:47:52 crc kubenswrapper[4760]: I0930 07:47:52.313636 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-b8d54b5d7-wwngp"] Sep 30 07:47:52 crc kubenswrapper[4760]: E0930 07:47:52.316251 4760 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/swift-operator@sha256:3c6f7d737e0196ec302f44354228d783ad3b210a75703dda3b39c15c01a67e8c,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 --leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-zl79h,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod swift-operator-controller-manager-bc7dc7bd9-kbsdn_openstack-operators(7a562d30-ce00-4dca-9792-6687cf729825): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Sep 30 07:47:52 crc kubenswrapper[4760]: I0930 07:47:52.326044 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-c7c776c96-95qms"] Sep 30 07:47:52 crc kubenswrapper[4760]: E0930 07:47:52.590104 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/nova-operator-controller-manager-c7c776c96-95qms" podUID="c0be2186-ebe8-4634-942e-fcf6f5c0fdf6" Sep 30 07:47:52 crc kubenswrapper[4760]: E0930 07:47:52.627085 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/placement-operator-controller-manager-589c58c6c-mzwdq" podUID="000713c9-22e2-4251-b81d-e1d47a48184e" Sep 30 07:47:52 crc kubenswrapper[4760]: E0930 07:47:52.628863 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/swift-operator-controller-manager-bc7dc7bd9-kbsdn" podUID="7a562d30-ce00-4dca-9792-6687cf729825" Sep 30 07:47:52 crc kubenswrapper[4760]: I0930 07:47:52.662474 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-6c459b467f-xxhnp"] Sep 30 07:47:52 crc kubenswrapper[4760]: I0930 07:47:52.669232 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-f66b554c6-l4g6m"] Sep 30 07:47:52 crc kubenswrapper[4760]: I0930 07:47:52.679682 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-88c7-69mqs"] Sep 30 07:47:52 crc kubenswrapper[4760]: I0930 07:47:52.679738 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-6d776955-qdj9t"] Sep 30 07:47:52 crc kubenswrapper[4760]: I0930 07:47:52.681481 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-6ff8b75857-n528w" event={"ID":"90fe11d3-6b6b-46c3-9833-d68d080144b9","Type":"ContainerStarted","Data":"0f1681307e7a35571ac7ae0f71446f38b279cf3225197518f68bd65207819573"} Sep 30 07:47:52 crc kubenswrapper[4760]: I0930 07:47:52.687259 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-9f4696d94-j5vp7" event={"ID":"28a5f605-2c82-4747-8b3d-2704804e81ec","Type":"ContainerStarted","Data":"3812939170e55565a8a7047278d43afcefbf3dc06e5a0ea1467aee3a968f0fd7"} Sep 30 07:47:52 crc kubenswrapper[4760]: I0930 07:47:52.690702 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-644bddb6d8-s4vf9" 
event={"ID":"7fdb76d3-726a-416a-9b64-df2d6a67d88a","Type":"ContainerStarted","Data":"d3dbe8c9d30dc7dc67bde5758ffebf926fb3deb6691de8dfbac148e575b0f1bf"} Sep 30 07:47:52 crc kubenswrapper[4760]: I0930 07:47:52.693232 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-79d8469568-whhn2"] Sep 30 07:47:52 crc kubenswrapper[4760]: I0930 07:47:52.700118 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-64d7b59854-gnfhj"] Sep 30 07:47:52 crc kubenswrapper[4760]: I0930 07:47:52.701461 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-7d857cc749-nqdfs"] Sep 30 07:47:52 crc kubenswrapper[4760]: I0930 07:47:52.705226 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-589c58c6c-mzwdq" event={"ID":"000713c9-22e2-4251-b81d-e1d47a48184e","Type":"ContainerStarted","Data":"7a9b0f93af79063da8ab378128ebf269b83286eb1e91a269cf6ac9d6b73e862f"} Sep 30 07:47:52 crc kubenswrapper[4760]: I0930 07:47:52.705262 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-589c58c6c-mzwdq" event={"ID":"000713c9-22e2-4251-b81d-e1d47a48184e","Type":"ContainerStarted","Data":"ef79358f3385de1bc8b017738ba4873f7289cf5b98cb864f98a697728d4456ea"} Sep 30 07:47:52 crc kubenswrapper[4760]: I0930 07:47:52.713536 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-6d68dbc695-5k4x2" event={"ID":"26e8229c-cd7b-4eab-a36c-e94d5a367224","Type":"ContainerStarted","Data":"9f10a47957f86dff6cb6ce45ee4f0af45eed3fe7b791e6db576c1a7f82b84642"} Sep 30 07:47:52 crc kubenswrapper[4760]: I0930 07:47:52.721625 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-b8d54b5d7-wwngp" 
event={"ID":"31832467-ab15-475b-a71b-7263e64cdff9","Type":"ContainerStarted","Data":"3504487c74905a47f2653d61faa3631d69e6375f1c8c7a99f68e80078d06dcd7"} Sep 30 07:47:52 crc kubenswrapper[4760]: E0930 07:47:52.721794 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/placement-operator@sha256:a6b3408d79df6b6d4a467e49defaa4a9d9c088c94d0605a4fee0030c9ccc84d2\\\"\"" pod="openstack-operators/placement-operator-controller-manager-589c58c6c-mzwdq" podUID="000713c9-22e2-4251-b81d-e1d47a48184e" Sep 30 07:47:52 crc kubenswrapper[4760]: E0930 07:47:52.725659 4760 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:38.102.83.110:5001/openstack-k8s-operators/watcher-operator:3cc704e75032a4df37a2e5631a4d128900fc5c34,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 --leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-c8ql8,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod watcher-operator-controller-manager-6c459b467f-xxhnp_openstack-operators(08bd2560-a223-4d1d-abf6-cf3686f1ded2): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Sep 30 07:47:52 crc kubenswrapper[4760]: E0930 07:47:52.733649 4760 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/openstack-baremetal-operator@sha256:e3f947e9034a951620a76eaf41ceec95eefcef0eacb251b10993d6820d5e1af6,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 
--leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:true,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_AGENT_IMAGE_URL_DEFAULT,Value:quay.io/openstack-k8s-operators/openstack-baremetal-operator-agent:latest,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_ANSIBLEEE_IMAGE_URL_DEFAULT,Value:quay.io/openstack-k8s-operators/openstack-ansibleee-runner:latest,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_AODH_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-aodh-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_AODH_EVALUATOR_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-aodh-evaluator:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_AODH_LISTENER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-aodh-listener:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_AODH_NOTIFIER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-aodh-notifier:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_APACHE_IMAGE_URL_DEFAULT,Value:registry.redhat.io/ubi9/httpd-24:latest,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_BARBICAN_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-barbican-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_BARBICAN_KEYSTONE_LISTENER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-barbican-keystone-listener:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_BARBICAN_WORKER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-barbican-worker:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CEILOMETER_CENTRAL_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ceilometer-central:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CEILOMETER_COMPUTE_IMAGE_URL_D
EFAULT,Value:quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CEILOMETER_IPMI_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ceilometer-ipmi:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CEILOMETER_MYSQLD_EXPORTER_IMAGE_URL_DEFAULT,Value:quay.io/prometheus/mysqld-exporter:v0.15.1,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CEILOMETER_NOTIFICATION_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ceilometer-notification:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CEILOMETER_SGCORE_IMAGE_URL_DEFAULT,Value:quay.io/openstack-k8s-operators/sg-core:latest,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CINDER_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CINDER_BACKUP_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-cinder-backup:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CINDER_SCHEDULER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-cinder-scheduler:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CINDER_VOLUME_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-cinder-volume:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_DESIGNATE_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-designate-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_DESIGNATE_BACKENDBIND9_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-designate-backend-bind9:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_DESIGNATE_CENTRAL_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-designate-central:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_DESIGNATE_MDNS_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-designate-mdns:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_I
MAGE_DESIGNATE_PRODUCER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-designate-producer:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_DESIGNATE_UNBOUND_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-unbound:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_DESIGNATE_WORKER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-designate-worker:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_FRR_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-frr:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_ISCSID_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-iscsid:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_KEPLER_IMAGE_URL_DEFAULT,Value:quay.io/sustainable_computing_io/kepler:release-0.7.12,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_LOGROTATE_CROND_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-cron:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_MULTIPATHD_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-multipathd:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_NEUTRON_DHCP_AGENT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_NEUTRON_METADATA_AGENT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_NEUTRON_OVN_AGENT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-neutron-ovn-agent:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_NEUTRON_SRIOV_AGENT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-neutron-sriov-agent:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_NODE_EXPORTER_IMAGE_URL_DEFAULT,Value:quay.io/prometheus/node-exporter:v1.5.0,ValueFrom:nil,}
,EnvVar{Name:RELATED_IMAGE_EDPM_OVN_BGP_AGENT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ovn-bgp-agent:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_PODMAN_EXPORTER_IMAGE_URL_DEFAULT,Value:quay.io/navidys/prometheus-podman-exporter:v1.10.1,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_GLANCE_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-glance-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_HEAT_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-heat-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_HEAT_CFNAPI_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-heat-api-cfn:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_HEAT_ENGINE_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-heat-engine:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_HORIZON_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-horizon:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_INFRA_MEMCACHED_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-memcached:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_INFRA_REDIS_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-redis:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_IRONIC_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ironic-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_IRONIC_CONDUCTOR_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ironic-conductor:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_IRONIC_INSPECTOR_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ironic-inspector:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_IRONIC_NEUTRON_AGENT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ironic-neutron-agent:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMA
GE_IRONIC_PXE_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ironic-pxe:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_IRONIC_PYTHON_AGENT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/ironic-python-agent:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_KEYSTONE_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-keystone:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_KSM_IMAGE_URL_DEFAULT,Value:registry.k8s.io/kube-state-metrics/kube-state-metrics:v2.15.0,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_MANILA_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-manila-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_MANILA_SCHEDULER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-manila-scheduler:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_MANILA_SHARE_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-manila-share:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_MARIADB_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-mariadb:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_NET_UTILS_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-netutils:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_NEUTRON_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_NOVA_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-nova-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_NOVA_COMPUTE_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_NOVA_CONDUCTOR_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-nova-conductor:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_NOVA_NOVNC_IMAGE_URL_DEFAULT,Value:quay.io/podifi
ed-antelope-centos9/openstack-nova-novncproxy:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_NOVA_SCHEDULER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-nova-scheduler:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OCTAVIA_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-octavia-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OCTAVIA_HEALTHMANAGER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-octavia-health-manager:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OCTAVIA_HOUSEKEEPING_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-octavia-housekeeping:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OCTAVIA_RSYSLOG_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-rsyslog:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OCTAVIA_WORKER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-octavia-worker:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OPENSTACK_CLIENT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-openstackclient:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OPENSTACK_LIGHTSPEED_IMAGE_URL_DEFAULT,Value:quay.io/openstack-lightspeed/rag-content:os-docs-2024.2,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OPENSTACK_MUST_GATHER_DEFAULT,Value:quay.io/openstack-k8s-operators/openstack-must-gather:latest,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OPENSTACK_NETWORK_EXPORTER_IMAGE_URL_DEFAULT,Value:quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OS_CONTAINER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/edpm-hardened-uefi:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OVN_CONTROLLER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OVN_CONTROLLER_OVS_IM
AGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ovn-base:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OVN_NB_DBCLUSTER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ovn-nb-db-server:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OVN_NORTHD_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ovn-northd:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OVN_SB_DBCLUSTER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ovn-sb-db-server:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_PLACEMENT_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-placement-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_RABBITMQ_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-rabbitmq:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_SWIFT_ACCOUNT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-swift-account:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_SWIFT_CONTAINER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-swift-container:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_SWIFT_OBJECT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-swift-object:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_SWIFT_PROXY_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-swift-proxy-server:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_TEST_TEMPEST_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-tempest-all:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_WATCHER_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-master-centos9/openstack-watcher-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_WATCHER_APPLIER_IMAGE_URL_DEFAULT,Value:quay.io/podified-master-centos9/openstack-watcher-applier:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_WATCHER_DECI
SION_ENGINE_IMAGE_URL_DEFAULT,Value:quay.io/podified-master-centos9/openstack-watcher-decision-engine:current-podified,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:cert,ReadOnly:true,MountPath:/tmp/k8s-webhook-server/serving-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-26j82,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
openstack-baremetal-operator-controller-manager-6d776955-qdj9t_openstack-operators(7fac6c59-9344-46b8-b4ce-30b80c6a8b53): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Sep 30 07:47:52 crc kubenswrapper[4760]: I0930 07:47:52.734293 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-84f4f7b77b-2zwsf" event={"ID":"0673130a-0175-41b4-a8d8-188c7a39caa0","Type":"ContainerStarted","Data":"026d384e161b8fba63e410e219026324c73e9935adf8abee498e7e9291ab3a0c"} Sep 30 07:47:52 crc kubenswrapper[4760]: E0930 07:47:52.739737 4760 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/neutron-operator@sha256:485df5c7813cdf4cf21f48ec48c8e3e4962fee6a1ae4c64f7af127d5ab346a10,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 --leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-kvdq7,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod neutron-operator-controller-manager-64d7b59854-gnfhj_openstack-operators(070b883a-da84-454e-a2d3-cc43fbf5251a): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Sep 30 07:47:52 crc kubenswrapper[4760]: I0930 07:47:52.740334 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-7975b88857-7d78g" event={"ID":"626c03a1-0630-42af-a1c4-af6e2c3584a5","Type":"ContainerStarted","Data":"4d3d4cf978d1f6e281194a141bb1c1df8db5c5151ee3833f1c0f5076b3971733"} Sep 30 07:47:52 crc kubenswrapper[4760]: I0930 07:47:52.742089 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-bc7dc7bd9-kbsdn" event={"ID":"7a562d30-ce00-4dca-9792-6687cf729825","Type":"ContainerStarted","Data":"7626a1e715154254f326a49395cdbdfe5dcdb65c5b8802d935c7c360e4af2365"} Sep 30 07:47:52 
crc kubenswrapper[4760]: I0930 07:47:52.742130 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-bc7dc7bd9-kbsdn" event={"ID":"7a562d30-ce00-4dca-9792-6687cf729825","Type":"ContainerStarted","Data":"37a6f308e8f1b936f2dca39bdca44ec0eb608c0b65f7b3d94d673f585b6b6876"} Sep 30 07:47:52 crc kubenswrapper[4760]: E0930 07:47:52.743429 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/swift-operator@sha256:3c6f7d737e0196ec302f44354228d783ad3b210a75703dda3b39c15c01a67e8c\\\"\"" pod="openstack-operators/swift-operator-controller-manager-bc7dc7bd9-kbsdn" podUID="7a562d30-ce00-4dca-9792-6687cf729825" Sep 30 07:47:52 crc kubenswrapper[4760]: W0930 07:47:52.743585 4760 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf7777c80_60ad_47c2_a76a_002f99b89d61.slice/crio-105908459a8c3b64a8b36e76ffb59e640f2cb4e4ae70000cefd1b081602f58f6 WatchSource:0}: Error finding container 105908459a8c3b64a8b36e76ffb59e640f2cb4e4ae70000cefd1b081602f58f6: Status 404 returned error can't find the container with id 105908459a8c3b64a8b36e76ffb59e640f2cb4e4ae70000cefd1b081602f58f6 Sep 30 07:47:52 crc kubenswrapper[4760]: I0930 07:47:52.745282 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-84958c4d49-mcrx5" event={"ID":"93eb25ad-5a9d-4044-ba79-8869b28787dd","Type":"ContainerStarted","Data":"c0faa3e982d3a02d6d7c6705016cf93afea6a9383198b5f62e9a88866e5b0957"} Sep 30 07:47:52 crc kubenswrapper[4760]: I0930 07:47:52.749731 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-9976ff44c-64gg8" 
event={"ID":"8bc774be-38eb-4c0a-9c02-fb39c645cc28","Type":"ContainerStarted","Data":"918458567fd00ddf2cde3162d6c324934f42a699bdeba510923cbdcf75c435cb"} Sep 30 07:47:52 crc kubenswrapper[4760]: I0930 07:47:52.753731 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-c7c776c96-95qms" event={"ID":"c0be2186-ebe8-4634-942e-fcf6f5c0fdf6","Type":"ContainerStarted","Data":"a2a3ac020a585e23efb491bdaba68c1dadd32846b4630510b53af667b21b28c9"} Sep 30 07:47:52 crc kubenswrapper[4760]: I0930 07:47:52.753767 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-c7c776c96-95qms" event={"ID":"c0be2186-ebe8-4634-942e-fcf6f5c0fdf6","Type":"ContainerStarted","Data":"fe2d085571c46e25e6633189333b8a3fe6b90707dbc6140fd728d88f52b8e536"} Sep 30 07:47:52 crc kubenswrapper[4760]: E0930 07:47:52.755899 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/nova-operator@sha256:057de94f9afa340adc34f9b25f8007d9cd2ba71bc8b5d77aac522add53b7caef\\\"\"" pod="openstack-operators/nova-operator-controller-manager-c7c776c96-95qms" podUID="c0be2186-ebe8-4634-942e-fcf6f5c0fdf6" Sep 30 07:47:52 crc kubenswrapper[4760]: I0930 07:47:52.763380 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-5bd55b4bff-rvrn9" event={"ID":"27eb71ec-2145-426e-86fd-f31166b969e8","Type":"ContainerStarted","Data":"6643542ab1337d00c76a2678a94d87f94a8f1e257837cda3bc93a2f989978dbd"} Sep 30 07:47:52 crc kubenswrapper[4760]: I0930 07:47:52.767079 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-76fcc6dc7c-p2qg2" 
event={"ID":"be15e869-eae3-4164-a9b3-ba2d16238186","Type":"ContainerStarted","Data":"ead1cecdca90620798482e862f56378b4a9733e061d6e51f2003e85348c9cef9"} Sep 30 07:47:52 crc kubenswrapper[4760]: I0930 07:47:52.768938 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-5d889d78cf-7brm5" event={"ID":"fff7432b-8ea3-4b35-8726-640f02bd8d58","Type":"ContainerStarted","Data":"c16f25ec0cb521d3f06c9561ab03e9aa2a62e9f327d609d97c7ba80c01a41ca2"} Sep 30 07:47:52 crc kubenswrapper[4760]: E0930 07:47:52.770158 4760 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/infra-operator@sha256:de99ad053f95f132f62b38335b2e8bf22fc28acbd441c3814764d63b63ef755f,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 --leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{600 -3} {} 600m DecimalSI},memory: {{2147483648 0} {} 2Gi BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{536870912 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:cert,ReadOnly:true,MountPath:/tmp/k8s-webhook-server/serving-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-rrrv4,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod infra-operator-controller-manager-7d857cc749-nqdfs_openstack-operators(18445f5d-fbcd-4bdb-9b9b-ed15e8a6b75b): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Sep 30 07:47:52 crc kubenswrapper[4760]: E0930 07:47:52.960895 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/openstack-baremetal-operator-controller-manager-6d776955-qdj9t" podUID="7fac6c59-9344-46b8-b4ce-30b80c6a8b53" Sep 30 07:47:52 crc kubenswrapper[4760]: E0930 07:47:52.968605 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/infra-operator-controller-manager-7d857cc749-nqdfs" podUID="18445f5d-fbcd-4bdb-9b9b-ed15e8a6b75b" Sep 30 07:47:52 crc 
kubenswrapper[4760]: E0930 07:47:52.968641 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/watcher-operator-controller-manager-6c459b467f-xxhnp" podUID="08bd2560-a223-4d1d-abf6-cf3686f1ded2" Sep 30 07:47:52 crc kubenswrapper[4760]: I0930 07:47:52.970602 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/fe420b73-f7ff-40e5-8b63-475e61942e3d-cert\") pod \"openstack-operator-controller-manager-df6bd9948-rjq2r\" (UID: \"fe420b73-f7ff-40e5-8b63-475e61942e3d\") " pod="openstack-operators/openstack-operator-controller-manager-df6bd9948-rjq2r" Sep 30 07:47:52 crc kubenswrapper[4760]: I0930 07:47:52.977168 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/fe420b73-f7ff-40e5-8b63-475e61942e3d-cert\") pod \"openstack-operator-controller-manager-df6bd9948-rjq2r\" (UID: \"fe420b73-f7ff-40e5-8b63-475e61942e3d\") " pod="openstack-operators/openstack-operator-controller-manager-df6bd9948-rjq2r" Sep 30 07:47:53 crc kubenswrapper[4760]: E0930 07:47:53.001618 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/neutron-operator-controller-manager-64d7b59854-gnfhj" podUID="070b883a-da84-454e-a2d3-cc43fbf5251a" Sep 30 07:47:53 crc kubenswrapper[4760]: I0930 07:47:53.105876 4760 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-df6bd9948-rjq2r" Sep 30 07:47:53 crc kubenswrapper[4760]: I0930 07:47:53.688827 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-df6bd9948-rjq2r"] Sep 30 07:47:53 crc kubenswrapper[4760]: I0930 07:47:53.787665 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-6d776955-qdj9t" event={"ID":"7fac6c59-9344-46b8-b4ce-30b80c6a8b53","Type":"ContainerStarted","Data":"fd9a986dce0c1640ca699ab7ebe96889b30bf5daba4fae51e6f1ac1f02f8a2e1"} Sep 30 07:47:53 crc kubenswrapper[4760]: I0930 07:47:53.787704 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-6d776955-qdj9t" event={"ID":"7fac6c59-9344-46b8-b4ce-30b80c6a8b53","Type":"ContainerStarted","Data":"651c28c5ce013a561545691ecb7444d9f6c81263069704b35144593d0f951558"} Sep 30 07:47:53 crc kubenswrapper[4760]: E0930 07:47:53.788928 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/openstack-baremetal-operator@sha256:e3f947e9034a951620a76eaf41ceec95eefcef0eacb251b10993d6820d5e1af6\\\"\"" pod="openstack-operators/openstack-baremetal-operator-controller-manager-6d776955-qdj9t" podUID="7fac6c59-9344-46b8-b4ce-30b80c6a8b53" Sep 30 07:47:53 crc kubenswrapper[4760]: I0930 07:47:53.800949 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-df6bd9948-rjq2r" event={"ID":"fe420b73-f7ff-40e5-8b63-475e61942e3d","Type":"ContainerStarted","Data":"aa005677713c9e4cf26f1c643b316e31bcf15323acd2634e25d92480309b3ae3"} Sep 30 07:47:53 crc kubenswrapper[4760]: I0930 07:47:53.803350 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/watcher-operator-controller-manager-6c459b467f-xxhnp" event={"ID":"08bd2560-a223-4d1d-abf6-cf3686f1ded2","Type":"ContainerStarted","Data":"2f73182ba70e9e6a2f90c297a3f50ab1c4011ea667d39443cde6ba71b5ae5dc1"} Sep 30 07:47:53 crc kubenswrapper[4760]: I0930 07:47:53.803373 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-6c459b467f-xxhnp" event={"ID":"08bd2560-a223-4d1d-abf6-cf3686f1ded2","Type":"ContainerStarted","Data":"a9b5f9a803d6f32bca5ef44587455e14e1b045d5c1b4c8bd150bc34304322ace"} Sep 30 07:47:53 crc kubenswrapper[4760]: E0930 07:47:53.805823 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"38.102.83.110:5001/openstack-k8s-operators/watcher-operator:3cc704e75032a4df37a2e5631a4d128900fc5c34\\\"\"" pod="openstack-operators/watcher-operator-controller-manager-6c459b467f-xxhnp" podUID="08bd2560-a223-4d1d-abf6-cf3686f1ded2" Sep 30 07:47:53 crc kubenswrapper[4760]: I0930 07:47:53.806097 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-f66b554c6-l4g6m" event={"ID":"cd75a50f-b3a1-4bef-ac18-e574ef6815ec","Type":"ContainerStarted","Data":"b5800fdfb556e6af48d7990094ae5c74b27e7f7a7ef7085222283fbf3654085d"} Sep 30 07:47:53 crc kubenswrapper[4760]: I0930 07:47:53.809552 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-64d7b59854-gnfhj" event={"ID":"070b883a-da84-454e-a2d3-cc43fbf5251a","Type":"ContainerStarted","Data":"4b439598c609f2e33b8d46d7397c83f287c9329b7329567f6e3316b6919dee5a"} Sep 30 07:47:53 crc kubenswrapper[4760]: I0930 07:47:53.809582 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-64d7b59854-gnfhj" 
event={"ID":"070b883a-da84-454e-a2d3-cc43fbf5251a","Type":"ContainerStarted","Data":"ed079ef3dfe0538d3607877f27985951f2d006192c4e90059cc539697408cb9e"} Sep 30 07:47:53 crc kubenswrapper[4760]: E0930 07:47:53.810917 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/neutron-operator@sha256:485df5c7813cdf4cf21f48ec48c8e3e4962fee6a1ae4c64f7af127d5ab346a10\\\"\"" pod="openstack-operators/neutron-operator-controller-manager-64d7b59854-gnfhj" podUID="070b883a-da84-454e-a2d3-cc43fbf5251a" Sep 30 07:47:53 crc kubenswrapper[4760]: I0930 07:47:53.812329 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-79d8469568-whhn2" event={"ID":"f7777c80-60ad-47c2-a76a-002f99b89d61","Type":"ContainerStarted","Data":"105908459a8c3b64a8b36e76ffb59e640f2cb4e4ae70000cefd1b081602f58f6"} Sep 30 07:47:53 crc kubenswrapper[4760]: I0930 07:47:53.827619 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-7d857cc749-nqdfs" event={"ID":"18445f5d-fbcd-4bdb-9b9b-ed15e8a6b75b","Type":"ContainerStarted","Data":"e3bfaddb90491161c6d386772251c8efb3871b529d9fe4e5a6f4339d26ead9df"} Sep 30 07:47:53 crc kubenswrapper[4760]: I0930 07:47:53.827671 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-7d857cc749-nqdfs" event={"ID":"18445f5d-fbcd-4bdb-9b9b-ed15e8a6b75b","Type":"ContainerStarted","Data":"904e5e9c24b0a5b9f0f381eb8550bdc69a2e6c92145c044cfa06f3d6951c3f8c"} Sep 30 07:47:53 crc kubenswrapper[4760]: E0930 07:47:53.832904 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image 
\\\"quay.io/openstack-k8s-operators/infra-operator@sha256:de99ad053f95f132f62b38335b2e8bf22fc28acbd441c3814764d63b63ef755f\\\"\"" pod="openstack-operators/infra-operator-controller-manager-7d857cc749-nqdfs" podUID="18445f5d-fbcd-4bdb-9b9b-ed15e8a6b75b" Sep 30 07:47:53 crc kubenswrapper[4760]: I0930 07:47:53.841069 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-88c7-69mqs" event={"ID":"c4bab529-6936-4f18-b4c9-4d8202e1cf6a","Type":"ContainerStarted","Data":"de39cfa98d8c8d786f6ceac92d9923613d8bae292a95a571973a42d782df28e3"} Sep 30 07:47:53 crc kubenswrapper[4760]: E0930 07:47:53.844266 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/nova-operator@sha256:057de94f9afa340adc34f9b25f8007d9cd2ba71bc8b5d77aac522add53b7caef\\\"\"" pod="openstack-operators/nova-operator-controller-manager-c7c776c96-95qms" podUID="c0be2186-ebe8-4634-942e-fcf6f5c0fdf6" Sep 30 07:47:53 crc kubenswrapper[4760]: E0930 07:47:53.845040 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/swift-operator@sha256:3c6f7d737e0196ec302f44354228d783ad3b210a75703dda3b39c15c01a67e8c\\\"\"" pod="openstack-operators/swift-operator-controller-manager-bc7dc7bd9-kbsdn" podUID="7a562d30-ce00-4dca-9792-6687cf729825" Sep 30 07:47:53 crc kubenswrapper[4760]: E0930 07:47:53.848185 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/placement-operator@sha256:a6b3408d79df6b6d4a467e49defaa4a9d9c088c94d0605a4fee0030c9ccc84d2\\\"\"" pod="openstack-operators/placement-operator-controller-manager-589c58c6c-mzwdq" 
podUID="000713c9-22e2-4251-b81d-e1d47a48184e" Sep 30 07:47:54 crc kubenswrapper[4760]: I0930 07:47:54.847714 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-df6bd9948-rjq2r" event={"ID":"fe420b73-f7ff-40e5-8b63-475e61942e3d","Type":"ContainerStarted","Data":"bd2608d5d8bbbc383ca7440a71c2756f8e748cc1aa0e684284661ea666f9a4a2"} Sep 30 07:47:54 crc kubenswrapper[4760]: E0930 07:47:54.850507 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/openstack-baremetal-operator@sha256:e3f947e9034a951620a76eaf41ceec95eefcef0eacb251b10993d6820d5e1af6\\\"\"" pod="openstack-operators/openstack-baremetal-operator-controller-manager-6d776955-qdj9t" podUID="7fac6c59-9344-46b8-b4ce-30b80c6a8b53" Sep 30 07:47:54 crc kubenswrapper[4760]: E0930 07:47:54.850832 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/infra-operator@sha256:de99ad053f95f132f62b38335b2e8bf22fc28acbd441c3814764d63b63ef755f\\\"\"" pod="openstack-operators/infra-operator-controller-manager-7d857cc749-nqdfs" podUID="18445f5d-fbcd-4bdb-9b9b-ed15e8a6b75b" Sep 30 07:47:54 crc kubenswrapper[4760]: E0930 07:47:54.850879 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/neutron-operator@sha256:485df5c7813cdf4cf21f48ec48c8e3e4962fee6a1ae4c64f7af127d5ab346a10\\\"\"" pod="openstack-operators/neutron-operator-controller-manager-64d7b59854-gnfhj" podUID="070b883a-da84-454e-a2d3-cc43fbf5251a" Sep 30 07:47:54 crc kubenswrapper[4760]: E0930 07:47:54.850916 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"38.102.83.110:5001/openstack-k8s-operators/watcher-operator:3cc704e75032a4df37a2e5631a4d128900fc5c34\\\"\"" pod="openstack-operators/watcher-operator-controller-manager-6c459b467f-xxhnp" podUID="08bd2560-a223-4d1d-abf6-cf3686f1ded2" Sep 30 07:48:03 crc kubenswrapper[4760]: I0930 07:48:03.928866 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-7975b88857-7d78g" event={"ID":"626c03a1-0630-42af-a1c4-af6e2c3584a5","Type":"ContainerStarted","Data":"b89b8927d619fa90ecc0aec5f2f25b5c71bf792d7b1cfdab1f76291ba89fa38f"} Sep 30 07:48:03 crc kubenswrapper[4760]: I0930 07:48:03.951830 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-644bddb6d8-s4vf9" event={"ID":"7fdb76d3-726a-416a-9b64-df2d6a67d88a","Type":"ContainerStarted","Data":"604d99715d59584cc86736852730bb56868a56ae852d3b2a4a545ab533162362"} Sep 30 07:48:03 crc kubenswrapper[4760]: I0930 07:48:03.957728 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-84958c4d49-mcrx5" event={"ID":"93eb25ad-5a9d-4044-ba79-8869b28787dd","Type":"ContainerStarted","Data":"08ea6d7782f397de0d13fae8409a356e4b930509f06d218433c6c55a9720f964"} Sep 30 07:48:03 crc kubenswrapper[4760]: I0930 07:48:03.961118 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-b8d54b5d7-wwngp" event={"ID":"31832467-ab15-475b-a71b-7263e64cdff9","Type":"ContainerStarted","Data":"4bd0f6606e85604be124dfe114c026f119e622a11f38da30bb7920428cbf5fa7"} Sep 30 07:48:03 crc kubenswrapper[4760]: I0930 07:48:03.966027 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-5d889d78cf-7brm5" 
event={"ID":"fff7432b-8ea3-4b35-8726-640f02bd8d58","Type":"ContainerStarted","Data":"121718765df23c3222fc78755621f170461492d846c31840101a19b587dda210"} Sep 30 07:48:03 crc kubenswrapper[4760]: I0930 07:48:03.969369 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-6ff8b75857-n528w" event={"ID":"90fe11d3-6b6b-46c3-9833-d68d080144b9","Type":"ContainerStarted","Data":"900e35e33cadff2125a891f52bc1a2b0a2c751696b9378802082575114cb5466"} Sep 30 07:48:03 crc kubenswrapper[4760]: I0930 07:48:03.982326 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-f66b554c6-l4g6m" event={"ID":"cd75a50f-b3a1-4bef-ac18-e574ef6815ec","Type":"ContainerStarted","Data":"f6a654121fd2983b71a7b9be775667f4e73a6406fa5be327900217ec7054cd70"} Sep 30 07:48:03 crc kubenswrapper[4760]: I0930 07:48:03.984141 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-84f4f7b77b-2zwsf" event={"ID":"0673130a-0175-41b4-a8d8-188c7a39caa0","Type":"ContainerStarted","Data":"2c4b62a00d7aaf2236bc76d64347e8dab8f3bb9664cca05827b9ff8186a21bbb"} Sep 30 07:48:03 crc kubenswrapper[4760]: I0930 07:48:03.991692 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-88c7-69mqs" event={"ID":"c4bab529-6936-4f18-b4c9-4d8202e1cf6a","Type":"ContainerStarted","Data":"7a109904ac73c15fe4b80528d078204bb9a636d212613c44e1579d772942f8a2"} Sep 30 07:48:03 crc kubenswrapper[4760]: I0930 07:48:03.997966 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-df6bd9948-rjq2r" event={"ID":"fe420b73-f7ff-40e5-8b63-475e61942e3d","Type":"ContainerStarted","Data":"b0ea1a226eef454c42e417affc5e6d8b4a8358c4dcc06daf0963f754fcc1b904"} Sep 30 07:48:03 crc kubenswrapper[4760]: I0930 07:48:03.998579 4760 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-manager-df6bd9948-rjq2r" Sep 30 07:48:04 crc kubenswrapper[4760]: I0930 07:48:04.007599 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-manager-df6bd9948-rjq2r" Sep 30 07:48:04 crc kubenswrapper[4760]: I0930 07:48:04.074461 4760 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-manager-df6bd9948-rjq2r" podStartSLOduration=13.074440871 podStartE2EDuration="13.074440871s" podCreationTimestamp="2025-09-30 07:47:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 07:48:04.066692303 +0000 UTC m=+869.709598715" watchObservedRunningTime="2025-09-30 07:48:04.074440871 +0000 UTC m=+869.717347283" Sep 30 07:48:05 crc kubenswrapper[4760]: I0930 07:48:05.004916 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-6ff8b75857-n528w" event={"ID":"90fe11d3-6b6b-46c3-9833-d68d080144b9","Type":"ContainerStarted","Data":"08378ec75a7dc2c3b9f6293f3b92107efc7e2ee6e643383390af00a6e3614548"} Sep 30 07:48:05 crc kubenswrapper[4760]: I0930 07:48:05.005137 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/barbican-operator-controller-manager-6ff8b75857-n528w" Sep 30 07:48:05 crc kubenswrapper[4760]: I0930 07:48:05.006827 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-f66b554c6-l4g6m" event={"ID":"cd75a50f-b3a1-4bef-ac18-e574ef6815ec","Type":"ContainerStarted","Data":"2d9ad07d09a2d7d5524ed34d96ab67eeabfbcae9c153bdfe12ab9e64b4380a16"} Sep 30 07:48:05 crc kubenswrapper[4760]: I0930 07:48:05.006895 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack-operators/test-operator-controller-manager-f66b554c6-l4g6m" Sep 30 07:48:05 crc kubenswrapper[4760]: I0930 07:48:05.008513 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-5bd55b4bff-rvrn9" event={"ID":"27eb71ec-2145-426e-86fd-f31166b969e8","Type":"ContainerStarted","Data":"7f09c1e002dca124c7ceb28c793d47319bfe723a088900371f3db06798d22ae8"} Sep 30 07:48:05 crc kubenswrapper[4760]: I0930 07:48:05.008539 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-5bd55b4bff-rvrn9" event={"ID":"27eb71ec-2145-426e-86fd-f31166b969e8","Type":"ContainerStarted","Data":"d52fc6ed0ea592cb46fbae5e052a49954e15f8dc4963bdca7b1903c1d5f82748"} Sep 30 07:48:05 crc kubenswrapper[4760]: I0930 07:48:05.008629 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/keystone-operator-controller-manager-5bd55b4bff-rvrn9" Sep 30 07:48:05 crc kubenswrapper[4760]: I0930 07:48:05.010794 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-7975b88857-7d78g" event={"ID":"626c03a1-0630-42af-a1c4-af6e2c3584a5","Type":"ContainerStarted","Data":"1514770e5b98bb98fffd15db4c424784395fd95f70820faf8e01e4bc145b13f3"} Sep 30 07:48:05 crc kubenswrapper[4760]: I0930 07:48:05.010907 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ironic-operator-controller-manager-7975b88857-7d78g" Sep 30 07:48:05 crc kubenswrapper[4760]: I0930 07:48:05.012477 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-b8d54b5d7-wwngp" event={"ID":"31832467-ab15-475b-a71b-7263e64cdff9","Type":"ContainerStarted","Data":"142d14339e5f2cf27e1e4c13ad8a0c30624772a41839efa2597eb3398b77ceac"} Sep 30 07:48:05 crc kubenswrapper[4760]: I0930 07:48:05.012603 4760 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="" pod="openstack-operators/telemetry-operator-controller-manager-b8d54b5d7-wwngp" Sep 30 07:48:05 crc kubenswrapper[4760]: I0930 07:48:05.013929 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-9976ff44c-64gg8" event={"ID":"8bc774be-38eb-4c0a-9c02-fb39c645cc28","Type":"ContainerStarted","Data":"185d01c5363a0fc0cd012c92f3c673b70a0873023dca01f5ccd83b2157f4524a"} Sep 30 07:48:05 crc kubenswrapper[4760]: I0930 07:48:05.013983 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-9976ff44c-64gg8" event={"ID":"8bc774be-38eb-4c0a-9c02-fb39c645cc28","Type":"ContainerStarted","Data":"aeb99240815a60e9f9a1537af81f1a15cc7b4e80ef1d0d92c4c24232b15f0a4a"} Sep 30 07:48:05 crc kubenswrapper[4760]: I0930 07:48:05.014496 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ovn-operator-controller-manager-9976ff44c-64gg8" Sep 30 07:48:05 crc kubenswrapper[4760]: I0930 07:48:05.015999 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-84958c4d49-mcrx5" event={"ID":"93eb25ad-5a9d-4044-ba79-8869b28787dd","Type":"ContainerStarted","Data":"5660003d2bb2177b625acf892f2dccc59e729584365935c1f5ab95bd94f6522e"} Sep 30 07:48:05 crc kubenswrapper[4760]: I0930 07:48:05.016391 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/glance-operator-controller-manager-84958c4d49-mcrx5" Sep 30 07:48:05 crc kubenswrapper[4760]: I0930 07:48:05.017907 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-9f4696d94-j5vp7" event={"ID":"28a5f605-2c82-4747-8b3d-2704804e81ec","Type":"ContainerStarted","Data":"ef5eda109c5bfc412f93cbc72fe81fbfb81bc59587198a26fdc9d6b1f73ba330"} Sep 30 07:48:05 crc kubenswrapper[4760]: I0930 07:48:05.017944 4760 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-9f4696d94-j5vp7" event={"ID":"28a5f605-2c82-4747-8b3d-2704804e81ec","Type":"ContainerStarted","Data":"bbdd49b9b84304f0309045c15e7287a16b0ba1a16b32ec992296b56d9e348e5c"} Sep 30 07:48:05 crc kubenswrapper[4760]: I0930 07:48:05.018331 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/horizon-operator-controller-manager-9f4696d94-j5vp7" Sep 30 07:48:05 crc kubenswrapper[4760]: I0930 07:48:05.019770 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-5d889d78cf-7brm5" event={"ID":"fff7432b-8ea3-4b35-8726-640f02bd8d58","Type":"ContainerStarted","Data":"8aa11740186d5a61fea2af180979b6e00972dc65b6a37670ad58bcb1d487bf23"} Sep 30 07:48:05 crc kubenswrapper[4760]: I0930 07:48:05.020267 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/heat-operator-controller-manager-5d889d78cf-7brm5" Sep 30 07:48:05 crc kubenswrapper[4760]: I0930 07:48:05.021863 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-84f4f7b77b-2zwsf" event={"ID":"0673130a-0175-41b4-a8d8-188c7a39caa0","Type":"ContainerStarted","Data":"16e15b4970babe6ffa45fd889914b5233c7866ac1fac68e2a85ea6e051cddf68"} Sep 30 07:48:05 crc kubenswrapper[4760]: I0930 07:48:05.021992 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/designate-operator-controller-manager-84f4f7b77b-2zwsf" Sep 30 07:48:05 crc kubenswrapper[4760]: I0930 07:48:05.023377 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-88c7-69mqs" event={"ID":"c4bab529-6936-4f18-b4c9-4d8202e1cf6a","Type":"ContainerStarted","Data":"842eeb26cb824879b9365bb7dd679259426c8b44509a64d312cc19cb13eb238c"} Sep 30 07:48:05 crc kubenswrapper[4760]: I0930 07:48:05.023994 
4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/mariadb-operator-controller-manager-88c7-69mqs" Sep 30 07:48:05 crc kubenswrapper[4760]: I0930 07:48:05.025399 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-6d68dbc695-5k4x2" event={"ID":"26e8229c-cd7b-4eab-a36c-e94d5a367224","Type":"ContainerStarted","Data":"eeb3cec44db41ebd43342285a4dca814b0cc4a3b247ee91b3cbd2ce22eb8edd5"} Sep 30 07:48:05 crc kubenswrapper[4760]: I0930 07:48:05.025433 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-6d68dbc695-5k4x2" event={"ID":"26e8229c-cd7b-4eab-a36c-e94d5a367224","Type":"ContainerStarted","Data":"12e7e30785b52dc67b8f12c8f531bcd147f53d4995607d5d8e90ecc88f463d21"} Sep 30 07:48:05 crc kubenswrapper[4760]: I0930 07:48:05.025859 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/manila-operator-controller-manager-6d68dbc695-5k4x2" Sep 30 07:48:05 crc kubenswrapper[4760]: I0930 07:48:05.027989 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-79d8469568-whhn2" event={"ID":"f7777c80-60ad-47c2-a76a-002f99b89d61","Type":"ContainerStarted","Data":"b7d4b545e8833b6176c74d72728cb0d7dfdbcc4d453e0d1994afe7f060c76823"} Sep 30 07:48:05 crc kubenswrapper[4760]: I0930 07:48:05.029779 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-76fcc6dc7c-p2qg2" event={"ID":"be15e869-eae3-4164-a9b3-ba2d16238186","Type":"ContainerStarted","Data":"d796a0d7dd2074ef6742784dc4d1a76311db25849c05d35c1719e3f186ab0045"} Sep 30 07:48:05 crc kubenswrapper[4760]: I0930 07:48:05.029811 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-76fcc6dc7c-p2qg2" 
event={"ID":"be15e869-eae3-4164-a9b3-ba2d16238186","Type":"ContainerStarted","Data":"4972915f40f2d5bfe9242898daceed8e32ae640ade786475e06ce015575b5db0"} Sep 30 07:48:05 crc kubenswrapper[4760]: I0930 07:48:05.030332 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/octavia-operator-controller-manager-76fcc6dc7c-p2qg2" Sep 30 07:48:05 crc kubenswrapper[4760]: I0930 07:48:05.033530 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-644bddb6d8-s4vf9" event={"ID":"7fdb76d3-726a-416a-9b64-df2d6a67d88a","Type":"ContainerStarted","Data":"4cde2acbc3848f9e2378f02ced9fd1d0ddbbedb44919653db336d6169bf1e4a3"} Sep 30 07:48:05 crc kubenswrapper[4760]: I0930 07:48:05.033558 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/cinder-operator-controller-manager-644bddb6d8-s4vf9" Sep 30 07:48:05 crc kubenswrapper[4760]: I0930 07:48:05.063253 4760 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/barbican-operator-controller-manager-6ff8b75857-n528w" podStartSLOduration=3.718457796 podStartE2EDuration="15.063237075s" podCreationTimestamp="2025-09-30 07:47:50 +0000 UTC" firstStartedPulling="2025-09-30 07:47:51.883874932 +0000 UTC m=+857.526781334" lastFinishedPulling="2025-09-30 07:48:03.228654201 +0000 UTC m=+868.871560613" observedRunningTime="2025-09-30 07:48:05.042575687 +0000 UTC m=+870.685482099" watchObservedRunningTime="2025-09-30 07:48:05.063237075 +0000 UTC m=+870.706143487" Sep 30 07:48:05 crc kubenswrapper[4760]: I0930 07:48:05.063465 4760 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ironic-operator-controller-manager-7975b88857-7d78g" podStartSLOduration=4.085047396 podStartE2EDuration="15.063461291s" podCreationTimestamp="2025-09-30 07:47:50 +0000 UTC" firstStartedPulling="2025-09-30 07:47:52.239933653 +0000 UTC m=+857.882840075" 
lastFinishedPulling="2025-09-30 07:48:03.218347558 +0000 UTC m=+868.861253970" observedRunningTime="2025-09-30 07:48:05.058199066 +0000 UTC m=+870.701105478" watchObservedRunningTime="2025-09-30 07:48:05.063461291 +0000 UTC m=+870.706367703" Sep 30 07:48:05 crc kubenswrapper[4760]: I0930 07:48:05.075012 4760 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Sep 30 07:48:05 crc kubenswrapper[4760]: I0930 07:48:05.078337 4760 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/manila-operator-controller-manager-6d68dbc695-5k4x2" podStartSLOduration=4.129158903 podStartE2EDuration="15.0783192s" podCreationTimestamp="2025-09-30 07:47:50 +0000 UTC" firstStartedPulling="2025-09-30 07:47:52.279089124 +0000 UTC m=+857.921995546" lastFinishedPulling="2025-09-30 07:48:03.228249421 +0000 UTC m=+868.871155843" observedRunningTime="2025-09-30 07:48:05.074529594 +0000 UTC m=+870.717436006" watchObservedRunningTime="2025-09-30 07:48:05.0783192 +0000 UTC m=+870.721225612" Sep 30 07:48:05 crc kubenswrapper[4760]: I0930 07:48:05.099608 4760 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/glance-operator-controller-manager-84958c4d49-mcrx5" podStartSLOduration=4.111255536 podStartE2EDuration="15.099590984s" podCreationTimestamp="2025-09-30 07:47:50 +0000 UTC" firstStartedPulling="2025-09-30 07:47:52.239579264 +0000 UTC m=+857.882485696" lastFinishedPulling="2025-09-30 07:48:03.227914722 +0000 UTC m=+868.870821144" observedRunningTime="2025-09-30 07:48:05.098178528 +0000 UTC m=+870.741084940" watchObservedRunningTime="2025-09-30 07:48:05.099590984 +0000 UTC m=+870.742497396" Sep 30 07:48:05 crc kubenswrapper[4760]: I0930 07:48:05.122561 4760 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/keystone-operator-controller-manager-5bd55b4bff-rvrn9" podStartSLOduration=4.167656657 podStartE2EDuration="15.122541661s" 
podCreationTimestamp="2025-09-30 07:47:50 +0000 UTC" firstStartedPulling="2025-09-30 07:47:52.273033719 +0000 UTC m=+857.915940131" lastFinishedPulling="2025-09-30 07:48:03.227918723 +0000 UTC m=+868.870825135" observedRunningTime="2025-09-30 07:48:05.117714417 +0000 UTC m=+870.760620839" watchObservedRunningTime="2025-09-30 07:48:05.122541661 +0000 UTC m=+870.765448073" Sep 30 07:48:05 crc kubenswrapper[4760]: I0930 07:48:05.137825 4760 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/designate-operator-controller-manager-84f4f7b77b-2zwsf" podStartSLOduration=3.759210187 podStartE2EDuration="15.137808471s" podCreationTimestamp="2025-09-30 07:47:50 +0000 UTC" firstStartedPulling="2025-09-30 07:47:51.839744194 +0000 UTC m=+857.482650606" lastFinishedPulling="2025-09-30 07:48:03.218342488 +0000 UTC m=+868.861248890" observedRunningTime="2025-09-30 07:48:05.134969248 +0000 UTC m=+870.777875660" watchObservedRunningTime="2025-09-30 07:48:05.137808471 +0000 UTC m=+870.780714883" Sep 30 07:48:05 crc kubenswrapper[4760]: I0930 07:48:05.165318 4760 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/horizon-operator-controller-manager-9f4696d94-j5vp7" podStartSLOduration=3.827934014 podStartE2EDuration="15.165283933s" podCreationTimestamp="2025-09-30 07:47:50 +0000 UTC" firstStartedPulling="2025-09-30 07:47:51.891325313 +0000 UTC m=+857.534231725" lastFinishedPulling="2025-09-30 07:48:03.228675232 +0000 UTC m=+868.871581644" observedRunningTime="2025-09-30 07:48:05.161401484 +0000 UTC m=+870.804307896" watchObservedRunningTime="2025-09-30 07:48:05.165283933 +0000 UTC m=+870.808190345" Sep 30 07:48:05 crc kubenswrapper[4760]: I0930 07:48:05.167414 4760 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/mariadb-operator-controller-manager-88c7-69mqs" podStartSLOduration=4.670898971 podStartE2EDuration="15.167404358s" podCreationTimestamp="2025-09-30 
07:47:50 +0000 UTC" firstStartedPulling="2025-09-30 07:47:52.721840651 +0000 UTC m=+858.364747063" lastFinishedPulling="2025-09-30 07:48:03.218346028 +0000 UTC m=+868.861252450" observedRunningTime="2025-09-30 07:48:05.149218523 +0000 UTC m=+870.792124935" watchObservedRunningTime="2025-09-30 07:48:05.167404358 +0000 UTC m=+870.810310770" Sep 30 07:48:05 crc kubenswrapper[4760]: I0930 07:48:05.198099 4760 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/test-operator-controller-manager-f66b554c6-l4g6m" podStartSLOduration=4.674930275 podStartE2EDuration="15.198079002s" podCreationTimestamp="2025-09-30 07:47:50 +0000 UTC" firstStartedPulling="2025-09-30 07:47:52.704716464 +0000 UTC m=+858.347622876" lastFinishedPulling="2025-09-30 07:48:03.227865191 +0000 UTC m=+868.870771603" observedRunningTime="2025-09-30 07:48:05.197773384 +0000 UTC m=+870.840679796" watchObservedRunningTime="2025-09-30 07:48:05.198079002 +0000 UTC m=+870.840985414" Sep 30 07:48:05 crc kubenswrapper[4760]: I0930 07:48:05.227624 4760 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ovn-operator-controller-manager-9976ff44c-64gg8" podStartSLOduration=4.2608833 podStartE2EDuration="15.227609276s" podCreationTimestamp="2025-09-30 07:47:50 +0000 UTC" firstStartedPulling="2025-09-30 07:47:52.262001457 +0000 UTC m=+857.904907869" lastFinishedPulling="2025-09-30 07:48:03.228727433 +0000 UTC m=+868.871633845" observedRunningTime="2025-09-30 07:48:05.221627684 +0000 UTC m=+870.864534096" watchObservedRunningTime="2025-09-30 07:48:05.227609276 +0000 UTC m=+870.870515688" Sep 30 07:48:05 crc kubenswrapper[4760]: I0930 07:48:05.248947 4760 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/telemetry-operator-controller-manager-b8d54b5d7-wwngp" podStartSLOduration=4.323209203 podStartE2EDuration="15.248929531s" podCreationTimestamp="2025-09-30 07:47:50 +0000 UTC" 
firstStartedPulling="2025-09-30 07:47:52.302455891 +0000 UTC m=+857.945362303" lastFinishedPulling="2025-09-30 07:48:03.228176219 +0000 UTC m=+868.871082631" observedRunningTime="2025-09-30 07:48:05.248329696 +0000 UTC m=+870.891236108" watchObservedRunningTime="2025-09-30 07:48:05.248929531 +0000 UTC m=+870.891835943" Sep 30 07:48:05 crc kubenswrapper[4760]: I0930 07:48:05.270388 4760 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/cinder-operator-controller-manager-644bddb6d8-s4vf9" podStartSLOduration=3.78359913 podStartE2EDuration="15.270168074s" podCreationTimestamp="2025-09-30 07:47:50 +0000 UTC" firstStartedPulling="2025-09-30 07:47:51.741326278 +0000 UTC m=+857.384232690" lastFinishedPulling="2025-09-30 07:48:03.227895222 +0000 UTC m=+868.870801634" observedRunningTime="2025-09-30 07:48:05.26765865 +0000 UTC m=+870.910565072" watchObservedRunningTime="2025-09-30 07:48:05.270168074 +0000 UTC m=+870.913074486" Sep 30 07:48:05 crc kubenswrapper[4760]: I0930 07:48:05.287154 4760 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/heat-operator-controller-manager-5d889d78cf-7brm5" podStartSLOduration=3.947965162 podStartE2EDuration="15.287138048s" podCreationTimestamp="2025-09-30 07:47:50 +0000 UTC" firstStartedPulling="2025-09-30 07:47:51.890737458 +0000 UTC m=+857.533643870" lastFinishedPulling="2025-09-30 07:48:03.229910344 +0000 UTC m=+868.872816756" observedRunningTime="2025-09-30 07:48:05.283114315 +0000 UTC m=+870.926020727" watchObservedRunningTime="2025-09-30 07:48:05.287138048 +0000 UTC m=+870.930044460" Sep 30 07:48:05 crc kubenswrapper[4760]: I0930 07:48:05.309584 4760 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/octavia-operator-controller-manager-76fcc6dc7c-p2qg2" podStartSLOduration=4.375254874 podStartE2EDuration="15.309567381s" podCreationTimestamp="2025-09-30 07:47:50 +0000 UTC" firstStartedPulling="2025-09-30 
07:47:52.294249812 +0000 UTC m=+857.937156244" lastFinishedPulling="2025-09-30 07:48:03.228562339 +0000 UTC m=+868.871468751" observedRunningTime="2025-09-30 07:48:05.305894837 +0000 UTC m=+870.948801249" watchObservedRunningTime="2025-09-30 07:48:05.309567381 +0000 UTC m=+870.952473793" Sep 30 07:48:05 crc kubenswrapper[4760]: I0930 07:48:05.328897 4760 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/rabbitmq-cluster-operator-manager-79d8469568-whhn2" podStartSLOduration=3.764678398 podStartE2EDuration="14.328878885s" podCreationTimestamp="2025-09-30 07:47:51 +0000 UTC" firstStartedPulling="2025-09-30 07:47:52.746596264 +0000 UTC m=+858.389502676" lastFinishedPulling="2025-09-30 07:48:03.310796751 +0000 UTC m=+868.953703163" observedRunningTime="2025-09-30 07:48:05.32478188 +0000 UTC m=+870.967688292" watchObservedRunningTime="2025-09-30 07:48:05.328878885 +0000 UTC m=+870.971785307" Sep 30 07:48:07 crc kubenswrapper[4760]: I0930 07:48:07.048910 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-bc7dc7bd9-kbsdn" event={"ID":"7a562d30-ce00-4dca-9792-6687cf729825","Type":"ContainerStarted","Data":"a250bcf92f3aefe73092ba32d732c66764cd048cd160e56d5244ac24cbee80e7"} Sep 30 07:48:07 crc kubenswrapper[4760]: I0930 07:48:07.050542 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/swift-operator-controller-manager-bc7dc7bd9-kbsdn" Sep 30 07:48:09 crc kubenswrapper[4760]: I0930 07:48:09.077205 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-6c459b467f-xxhnp" event={"ID":"08bd2560-a223-4d1d-abf6-cf3686f1ded2","Type":"ContainerStarted","Data":"cee4aada95347eb41a5d338a92cc8fea75281d7ace6275a9bacfaa23eb02fb32"} Sep 30 07:48:09 crc kubenswrapper[4760]: I0930 07:48:09.077547 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/placement-operator-controller-manager-589c58c6c-mzwdq" event={"ID":"000713c9-22e2-4251-b81d-e1d47a48184e","Type":"ContainerStarted","Data":"109ca53a68099ceffb19ca06a8f691a498ad6c0ca2bd30e764f746f0644dab45"} Sep 30 07:48:09 crc kubenswrapper[4760]: I0930 07:48:09.078675 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/watcher-operator-controller-manager-6c459b467f-xxhnp" Sep 30 07:48:09 crc kubenswrapper[4760]: I0930 07:48:09.078701 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/placement-operator-controller-manager-589c58c6c-mzwdq" Sep 30 07:48:09 crc kubenswrapper[4760]: I0930 07:48:09.095557 4760 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/swift-operator-controller-manager-bc7dc7bd9-kbsdn" podStartSLOduration=4.573759147 podStartE2EDuration="19.095543044s" podCreationTimestamp="2025-09-30 07:47:50 +0000 UTC" firstStartedPulling="2025-09-30 07:47:52.316015238 +0000 UTC m=+857.958921650" lastFinishedPulling="2025-09-30 07:48:06.837799095 +0000 UTC m=+872.480705547" observedRunningTime="2025-09-30 07:48:07.07391674 +0000 UTC m=+872.716823162" watchObservedRunningTime="2025-09-30 07:48:09.095543044 +0000 UTC m=+874.738449456" Sep 30 07:48:09 crc kubenswrapper[4760]: I0930 07:48:09.096866 4760 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/placement-operator-controller-manager-589c58c6c-mzwdq" podStartSLOduration=3.043881423 podStartE2EDuration="19.096860508s" podCreationTimestamp="2025-09-30 07:47:50 +0000 UTC" firstStartedPulling="2025-09-30 07:47:52.308043374 +0000 UTC m=+857.950949786" lastFinishedPulling="2025-09-30 07:48:08.361022459 +0000 UTC m=+874.003928871" observedRunningTime="2025-09-30 07:48:09.093225695 +0000 UTC m=+874.736132107" watchObservedRunningTime="2025-09-30 07:48:09.096860508 +0000 UTC m=+874.739766920" Sep 30 07:48:09 crc 
kubenswrapper[4760]: I0930 07:48:09.115923 4760 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/watcher-operator-controller-manager-6c459b467f-xxhnp" podStartSLOduration=3.423979498 podStartE2EDuration="19.115910844s" podCreationTimestamp="2025-09-30 07:47:50 +0000 UTC" firstStartedPulling="2025-09-30 07:47:52.721963044 +0000 UTC m=+858.364869456" lastFinishedPulling="2025-09-30 07:48:08.41389437 +0000 UTC m=+874.056800802" observedRunningTime="2025-09-30 07:48:09.112725223 +0000 UTC m=+874.755631645" watchObservedRunningTime="2025-09-30 07:48:09.115910844 +0000 UTC m=+874.758817256" Sep 30 07:48:10 crc kubenswrapper[4760]: I0930 07:48:10.742164 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/barbican-operator-controller-manager-6ff8b75857-n528w" Sep 30 07:48:10 crc kubenswrapper[4760]: I0930 07:48:10.765529 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/cinder-operator-controller-manager-644bddb6d8-s4vf9" Sep 30 07:48:10 crc kubenswrapper[4760]: I0930 07:48:10.783025 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/designate-operator-controller-manager-84f4f7b77b-2zwsf" Sep 30 07:48:10 crc kubenswrapper[4760]: I0930 07:48:10.795533 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/glance-operator-controller-manager-84958c4d49-mcrx5" Sep 30 07:48:10 crc kubenswrapper[4760]: I0930 07:48:10.839912 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/heat-operator-controller-manager-5d889d78cf-7brm5" Sep 30 07:48:10 crc kubenswrapper[4760]: I0930 07:48:10.884287 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/horizon-operator-controller-manager-9f4696d94-j5vp7" Sep 30 07:48:11 crc kubenswrapper[4760]: I0930 07:48:11.120263 4760 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/keystone-operator-controller-manager-5bd55b4bff-rvrn9" Sep 30 07:48:11 crc kubenswrapper[4760]: I0930 07:48:11.120814 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/manila-operator-controller-manager-6d68dbc695-5k4x2" Sep 30 07:48:11 crc kubenswrapper[4760]: I0930 07:48:11.121583 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ironic-operator-controller-manager-7975b88857-7d78g" Sep 30 07:48:11 crc kubenswrapper[4760]: I0930 07:48:11.225378 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/octavia-operator-controller-manager-76fcc6dc7c-p2qg2" Sep 30 07:48:11 crc kubenswrapper[4760]: I0930 07:48:11.244420 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ovn-operator-controller-manager-9976ff44c-64gg8" Sep 30 07:48:11 crc kubenswrapper[4760]: I0930 07:48:11.345423 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/telemetry-operator-controller-manager-b8d54b5d7-wwngp" Sep 30 07:48:11 crc kubenswrapper[4760]: I0930 07:48:11.461931 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/mariadb-operator-controller-manager-88c7-69mqs" Sep 30 07:48:11 crc kubenswrapper[4760]: I0930 07:48:11.579529 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/test-operator-controller-manager-f66b554c6-l4g6m" Sep 30 07:48:12 crc kubenswrapper[4760]: I0930 07:48:12.104159 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-6d776955-qdj9t" 
event={"ID":"7fac6c59-9344-46b8-b4ce-30b80c6a8b53","Type":"ContainerStarted","Data":"c26eeb6f3f8d9d084c4d390d0d1ec8e85b484ff4ce059c76fd4aba72fc92240c"} Sep 30 07:48:12 crc kubenswrapper[4760]: I0930 07:48:12.104464 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-baremetal-operator-controller-manager-6d776955-qdj9t" Sep 30 07:48:12 crc kubenswrapper[4760]: I0930 07:48:12.107633 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-64d7b59854-gnfhj" event={"ID":"070b883a-da84-454e-a2d3-cc43fbf5251a","Type":"ContainerStarted","Data":"48bfe7268d24b5014dab2ac5862e7399a6c828e1afdea5bdbef65f19823354b8"} Sep 30 07:48:12 crc kubenswrapper[4760]: I0930 07:48:12.107893 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/neutron-operator-controller-manager-64d7b59854-gnfhj" Sep 30 07:48:12 crc kubenswrapper[4760]: I0930 07:48:12.111022 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-c7c776c96-95qms" event={"ID":"c0be2186-ebe8-4634-942e-fcf6f5c0fdf6","Type":"ContainerStarted","Data":"fc189aadecf5a5b42279a02924722d4d92455f2fa5ee9dcd97cf431b70f07875"} Sep 30 07:48:12 crc kubenswrapper[4760]: I0930 07:48:12.111248 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/nova-operator-controller-manager-c7c776c96-95qms" Sep 30 07:48:12 crc kubenswrapper[4760]: I0930 07:48:12.114684 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-7d857cc749-nqdfs" event={"ID":"18445f5d-fbcd-4bdb-9b9b-ed15e8a6b75b","Type":"ContainerStarted","Data":"6bc1deea2ab2d3144bceb2c5ae4683b72ea049d602b05503a612d85b12c65cfa"} Sep 30 07:48:12 crc kubenswrapper[4760]: I0930 07:48:12.115220 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack-operators/infra-operator-controller-manager-7d857cc749-nqdfs" Sep 30 07:48:12 crc kubenswrapper[4760]: I0930 07:48:12.151900 4760 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-baremetal-operator-controller-manager-6d776955-qdj9t" podStartSLOduration=3.957076564 podStartE2EDuration="22.151875975s" podCreationTimestamp="2025-09-30 07:47:50 +0000 UTC" firstStartedPulling="2025-09-30 07:47:52.733080549 +0000 UTC m=+858.375986961" lastFinishedPulling="2025-09-30 07:48:10.92787996 +0000 UTC m=+876.570786372" observedRunningTime="2025-09-30 07:48:12.14229783 +0000 UTC m=+877.785204302" watchObservedRunningTime="2025-09-30 07:48:12.151875975 +0000 UTC m=+877.794782427" Sep 30 07:48:12 crc kubenswrapper[4760]: I0930 07:48:12.163921 4760 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/nova-operator-controller-manager-c7c776c96-95qms" podStartSLOduration=3.529982936 podStartE2EDuration="22.163894902s" podCreationTimestamp="2025-09-30 07:47:50 +0000 UTC" firstStartedPulling="2025-09-30 07:47:52.303744364 +0000 UTC m=+857.946650776" lastFinishedPulling="2025-09-30 07:48:10.93765632 +0000 UTC m=+876.580562742" observedRunningTime="2025-09-30 07:48:12.163226665 +0000 UTC m=+877.806133137" watchObservedRunningTime="2025-09-30 07:48:12.163894902 +0000 UTC m=+877.806801364" Sep 30 07:48:12 crc kubenswrapper[4760]: I0930 07:48:12.209895 4760 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/neutron-operator-controller-manager-64d7b59854-gnfhj" podStartSLOduration=4.022168848 podStartE2EDuration="22.209876148s" podCreationTimestamp="2025-09-30 07:47:50 +0000 UTC" firstStartedPulling="2025-09-30 07:47:52.739421501 +0000 UTC m=+858.382327913" lastFinishedPulling="2025-09-30 07:48:10.927128801 +0000 UTC m=+876.570035213" observedRunningTime="2025-09-30 07:48:12.184691754 +0000 UTC m=+877.827598206" 
watchObservedRunningTime="2025-09-30 07:48:12.209876148 +0000 UTC m=+877.852782570" Sep 30 07:48:12 crc kubenswrapper[4760]: I0930 07:48:12.210855 4760 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/infra-operator-controller-manager-7d857cc749-nqdfs" podStartSLOduration=4.054449112 podStartE2EDuration="22.210846682s" podCreationTimestamp="2025-09-30 07:47:50 +0000 UTC" firstStartedPulling="2025-09-30 07:47:52.769997392 +0000 UTC m=+858.412903804" lastFinishedPulling="2025-09-30 07:48:10.926394972 +0000 UTC m=+876.569301374" observedRunningTime="2025-09-30 07:48:12.205500786 +0000 UTC m=+877.848407208" watchObservedRunningTime="2025-09-30 07:48:12.210846682 +0000 UTC m=+877.853753114" Sep 30 07:48:19 crc kubenswrapper[4760]: I0930 07:48:19.113196 4760 patch_prober.go:28] interesting pod/machine-config-daemon-f2lrk container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 07:48:19 crc kubenswrapper[4760]: I0930 07:48:19.113708 4760 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-f2lrk" podUID="7a9c8270-6964-4886-87d0-227b05b76da4" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 30 07:48:21 crc kubenswrapper[4760]: I0930 07:48:21.299623 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/placement-operator-controller-manager-589c58c6c-mzwdq" Sep 30 07:48:21 crc kubenswrapper[4760]: I0930 07:48:21.309392 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/swift-operator-controller-manager-bc7dc7bd9-kbsdn" Sep 30 07:48:21 crc kubenswrapper[4760]: I0930 07:48:21.326917 4760 
kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-pkcmt"]
Sep 30 07:48:21 crc kubenswrapper[4760]: I0930 07:48:21.330887 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-pkcmt"
Sep 30 07:48:21 crc kubenswrapper[4760]: I0930 07:48:21.336674 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pjnx5\" (UniqueName: \"kubernetes.io/projected/41ed6688-dfb5-4dfa-8972-34f5fea18550-kube-api-access-pjnx5\") pod \"redhat-operators-pkcmt\" (UID: \"41ed6688-dfb5-4dfa-8972-34f5fea18550\") " pod="openshift-marketplace/redhat-operators-pkcmt"
Sep 30 07:48:21 crc kubenswrapper[4760]: I0930 07:48:21.336939 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/41ed6688-dfb5-4dfa-8972-34f5fea18550-utilities\") pod \"redhat-operators-pkcmt\" (UID: \"41ed6688-dfb5-4dfa-8972-34f5fea18550\") " pod="openshift-marketplace/redhat-operators-pkcmt"
Sep 30 07:48:21 crc kubenswrapper[4760]: I0930 07:48:21.337067 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/41ed6688-dfb5-4dfa-8972-34f5fea18550-catalog-content\") pod \"redhat-operators-pkcmt\" (UID: \"41ed6688-dfb5-4dfa-8972-34f5fea18550\") " pod="openshift-marketplace/redhat-operators-pkcmt"
Sep 30 07:48:21 crc kubenswrapper[4760]: I0930 07:48:21.358446 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-pkcmt"]
Sep 30 07:48:21 crc kubenswrapper[4760]: I0930 07:48:21.438093 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pjnx5\" (UniqueName: \"kubernetes.io/projected/41ed6688-dfb5-4dfa-8972-34f5fea18550-kube-api-access-pjnx5\") pod \"redhat-operators-pkcmt\" (UID: \"41ed6688-dfb5-4dfa-8972-34f5fea18550\") " pod="openshift-marketplace/redhat-operators-pkcmt"
Sep 30 07:48:21 crc kubenswrapper[4760]: I0930 07:48:21.438445 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/41ed6688-dfb5-4dfa-8972-34f5fea18550-utilities\") pod \"redhat-operators-pkcmt\" (UID: \"41ed6688-dfb5-4dfa-8972-34f5fea18550\") " pod="openshift-marketplace/redhat-operators-pkcmt"
Sep 30 07:48:21 crc kubenswrapper[4760]: I0930 07:48:21.438576 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/41ed6688-dfb5-4dfa-8972-34f5fea18550-catalog-content\") pod \"redhat-operators-pkcmt\" (UID: \"41ed6688-dfb5-4dfa-8972-34f5fea18550\") " pod="openshift-marketplace/redhat-operators-pkcmt"
Sep 30 07:48:21 crc kubenswrapper[4760]: I0930 07:48:21.439204 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/41ed6688-dfb5-4dfa-8972-34f5fea18550-utilities\") pod \"redhat-operators-pkcmt\" (UID: \"41ed6688-dfb5-4dfa-8972-34f5fea18550\") " pod="openshift-marketplace/redhat-operators-pkcmt"
Sep 30 07:48:21 crc kubenswrapper[4760]: I0930 07:48:21.440393 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/41ed6688-dfb5-4dfa-8972-34f5fea18550-catalog-content\") pod \"redhat-operators-pkcmt\" (UID: \"41ed6688-dfb5-4dfa-8972-34f5fea18550\") " pod="openshift-marketplace/redhat-operators-pkcmt"
Sep 30 07:48:21 crc kubenswrapper[4760]: I0930 07:48:21.446369 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/neutron-operator-controller-manager-64d7b59854-gnfhj"
Sep 30 07:48:21 crc kubenswrapper[4760]: I0930 07:48:21.459329 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pjnx5\" (UniqueName: \"kubernetes.io/projected/41ed6688-dfb5-4dfa-8972-34f5fea18550-kube-api-access-pjnx5\") pod \"redhat-operators-pkcmt\" (UID: \"41ed6688-dfb5-4dfa-8972-34f5fea18550\") " pod="openshift-marketplace/redhat-operators-pkcmt"
Sep 30 07:48:21 crc kubenswrapper[4760]: I0930 07:48:21.472449 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/nova-operator-controller-manager-c7c776c96-95qms"
Sep 30 07:48:21 crc kubenswrapper[4760]: I0930 07:48:21.562999 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/watcher-operator-controller-manager-6c459b467f-xxhnp"
Sep 30 07:48:21 crc kubenswrapper[4760]: I0930 07:48:21.613241 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/infra-operator-controller-manager-7d857cc749-nqdfs"
Sep 30 07:48:21 crc kubenswrapper[4760]: I0930 07:48:21.657695 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-pkcmt"
Sep 30 07:48:21 crc kubenswrapper[4760]: I0930 07:48:21.831786 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-baremetal-operator-controller-manager-6d776955-qdj9t"
Sep 30 07:48:21 crc kubenswrapper[4760]: I0930 07:48:21.926408 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-pkcmt"]
Sep 30 07:48:21 crc kubenswrapper[4760]: W0930 07:48:21.931576 4760 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod41ed6688_dfb5_4dfa_8972_34f5fea18550.slice/crio-929544a983e977c4f5a50699cb9cd663d90a8009a9cf246cbbe38abfea988b29 WatchSource:0}: Error finding container 929544a983e977c4f5a50699cb9cd663d90a8009a9cf246cbbe38abfea988b29: Status 404 returned error can't find the container with id 929544a983e977c4f5a50699cb9cd663d90a8009a9cf246cbbe38abfea988b29
Sep 30 07:48:22 crc kubenswrapper[4760]: I0930 07:48:22.203836 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-pkcmt" event={"ID":"41ed6688-dfb5-4dfa-8972-34f5fea18550","Type":"ContainerStarted","Data":"929544a983e977c4f5a50699cb9cd663d90a8009a9cf246cbbe38abfea988b29"}
Sep 30 07:48:28 crc kubenswrapper[4760]: I0930 07:48:28.262734 4760 generic.go:334] "Generic (PLEG): container finished" podID="41ed6688-dfb5-4dfa-8972-34f5fea18550" containerID="b979f0e0ff859a4b2434fbec37384b18a5cb6f716386096a1c2f8126ce5c81a4" exitCode=0
Sep 30 07:48:28 crc kubenswrapper[4760]: I0930 07:48:28.262891 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-pkcmt" event={"ID":"41ed6688-dfb5-4dfa-8972-34f5fea18550","Type":"ContainerDied","Data":"b979f0e0ff859a4b2434fbec37384b18a5cb6f716386096a1c2f8126ce5c81a4"}
Sep 30 07:48:31 crc kubenswrapper[4760]: I0930 07:48:31.297162 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-pkcmt" event={"ID":"41ed6688-dfb5-4dfa-8972-34f5fea18550","Type":"ContainerStarted","Data":"91f5380988f8a308a41a1d65a218814af87f051b89114fb3c83b079187b8c6cf"}
Sep 30 07:48:32 crc kubenswrapper[4760]: I0930 07:48:32.305904 4760 generic.go:334] "Generic (PLEG): container finished" podID="41ed6688-dfb5-4dfa-8972-34f5fea18550" containerID="91f5380988f8a308a41a1d65a218814af87f051b89114fb3c83b079187b8c6cf" exitCode=0
Sep 30 07:48:32 crc kubenswrapper[4760]: I0930 07:48:32.305970 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-pkcmt" event={"ID":"41ed6688-dfb5-4dfa-8972-34f5fea18550","Type":"ContainerDied","Data":"91f5380988f8a308a41a1d65a218814af87f051b89114fb3c83b079187b8c6cf"}
Sep 30 07:48:33 crc kubenswrapper[4760]: I0930 07:48:33.316075 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-pkcmt" event={"ID":"41ed6688-dfb5-4dfa-8972-34f5fea18550","Type":"ContainerStarted","Data":"ea0e4aa04e4dadd59b96f36f1213b57d576ec0cea244fd16ac3e98f90b577291"}
Sep 30 07:48:33 crc kubenswrapper[4760]: I0930 07:48:33.349163 4760 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-pkcmt" podStartSLOduration=7.890889305 podStartE2EDuration="12.349142771s" podCreationTimestamp="2025-09-30 07:48:21 +0000 UTC" firstStartedPulling="2025-09-30 07:48:28.265902301 +0000 UTC m=+893.908808713" lastFinishedPulling="2025-09-30 07:48:32.724155747 +0000 UTC m=+898.367062179" observedRunningTime="2025-09-30 07:48:33.345904308 +0000 UTC m=+898.988810730" watchObservedRunningTime="2025-09-30 07:48:33.349142771 +0000 UTC m=+898.992049183"
Sep 30 07:48:40 crc kubenswrapper[4760]: I0930 07:48:40.008805 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-8t257"]
Sep 30 07:48:40 crc kubenswrapper[4760]: I0930 07:48:40.014485 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-8t257"
Sep 30 07:48:40 crc kubenswrapper[4760]: I0930 07:48:40.019664 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dnsmasq-dns-dockercfg-ksv25"
Sep 30 07:48:40 crc kubenswrapper[4760]: I0930 07:48:40.019939 4760 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"kube-root-ca.crt"
Sep 30 07:48:40 crc kubenswrapper[4760]: I0930 07:48:40.020050 4760 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openshift-service-ca.crt"
Sep 30 07:48:40 crc kubenswrapper[4760]: I0930 07:48:40.020199 4760 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns"
Sep 30 07:48:40 crc kubenswrapper[4760]: I0930 07:48:40.034054 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-8t257"]
Sep 30 07:48:40 crc kubenswrapper[4760]: I0930 07:48:40.079671 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-bzpjr"]
Sep 30 07:48:40 crc kubenswrapper[4760]: I0930 07:48:40.081155 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-bzpjr"
Sep 30 07:48:40 crc kubenswrapper[4760]: I0930 07:48:40.085638 4760 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-svc"
Sep 30 07:48:40 crc kubenswrapper[4760]: I0930 07:48:40.088516 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-bzpjr"]
Sep 30 07:48:40 crc kubenswrapper[4760]: I0930 07:48:40.123611 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vzpxk\" (UniqueName: \"kubernetes.io/projected/5318712b-bf53-4f94-9abd-4e1b97e8311a-kube-api-access-vzpxk\") pod \"dnsmasq-dns-78dd6ddcc-bzpjr\" (UID: \"5318712b-bf53-4f94-9abd-4e1b97e8311a\") " pod="openstack/dnsmasq-dns-78dd6ddcc-bzpjr"
Sep 30 07:48:40 crc kubenswrapper[4760]: I0930 07:48:40.123683 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5318712b-bf53-4f94-9abd-4e1b97e8311a-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-bzpjr\" (UID: \"5318712b-bf53-4f94-9abd-4e1b97e8311a\") " pod="openstack/dnsmasq-dns-78dd6ddcc-bzpjr"
Sep 30 07:48:40 crc kubenswrapper[4760]: I0930 07:48:40.123734 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5318712b-bf53-4f94-9abd-4e1b97e8311a-config\") pod \"dnsmasq-dns-78dd6ddcc-bzpjr\" (UID: \"5318712b-bf53-4f94-9abd-4e1b97e8311a\") " pod="openstack/dnsmasq-dns-78dd6ddcc-bzpjr"
Sep 30 07:48:40 crc kubenswrapper[4760]: I0930 07:48:40.123810 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/230224f8-bc7e-491d-8545-ce88622bd97d-config\") pod \"dnsmasq-dns-675f4bcbfc-8t257\" (UID: \"230224f8-bc7e-491d-8545-ce88622bd97d\") " pod="openstack/dnsmasq-dns-675f4bcbfc-8t257"
Sep 30 07:48:40 crc kubenswrapper[4760]: I0930 07:48:40.123837 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t25zg\" (UniqueName: \"kubernetes.io/projected/230224f8-bc7e-491d-8545-ce88622bd97d-kube-api-access-t25zg\") pod \"dnsmasq-dns-675f4bcbfc-8t257\" (UID: \"230224f8-bc7e-491d-8545-ce88622bd97d\") " pod="openstack/dnsmasq-dns-675f4bcbfc-8t257"
Sep 30 07:48:40 crc kubenswrapper[4760]: I0930 07:48:40.224634 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5318712b-bf53-4f94-9abd-4e1b97e8311a-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-bzpjr\" (UID: \"5318712b-bf53-4f94-9abd-4e1b97e8311a\") " pod="openstack/dnsmasq-dns-78dd6ddcc-bzpjr"
Sep 30 07:48:40 crc kubenswrapper[4760]: I0930 07:48:40.224715 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5318712b-bf53-4f94-9abd-4e1b97e8311a-config\") pod \"dnsmasq-dns-78dd6ddcc-bzpjr\" (UID: \"5318712b-bf53-4f94-9abd-4e1b97e8311a\") " pod="openstack/dnsmasq-dns-78dd6ddcc-bzpjr"
Sep 30 07:48:40 crc kubenswrapper[4760]: I0930 07:48:40.224803 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/230224f8-bc7e-491d-8545-ce88622bd97d-config\") pod \"dnsmasq-dns-675f4bcbfc-8t257\" (UID: \"230224f8-bc7e-491d-8545-ce88622bd97d\") " pod="openstack/dnsmasq-dns-675f4bcbfc-8t257"
Sep 30 07:48:40 crc kubenswrapper[4760]: I0930 07:48:40.224823 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t25zg\" (UniqueName: \"kubernetes.io/projected/230224f8-bc7e-491d-8545-ce88622bd97d-kube-api-access-t25zg\") pod \"dnsmasq-dns-675f4bcbfc-8t257\" (UID: \"230224f8-bc7e-491d-8545-ce88622bd97d\") " pod="openstack/dnsmasq-dns-675f4bcbfc-8t257"
Sep 30 07:48:40 crc kubenswrapper[4760]: I0930 07:48:40.224857 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vzpxk\" (UniqueName: \"kubernetes.io/projected/5318712b-bf53-4f94-9abd-4e1b97e8311a-kube-api-access-vzpxk\") pod \"dnsmasq-dns-78dd6ddcc-bzpjr\" (UID: \"5318712b-bf53-4f94-9abd-4e1b97e8311a\") " pod="openstack/dnsmasq-dns-78dd6ddcc-bzpjr"
Sep 30 07:48:40 crc kubenswrapper[4760]: I0930 07:48:40.225717 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5318712b-bf53-4f94-9abd-4e1b97e8311a-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-bzpjr\" (UID: \"5318712b-bf53-4f94-9abd-4e1b97e8311a\") " pod="openstack/dnsmasq-dns-78dd6ddcc-bzpjr"
Sep 30 07:48:40 crc kubenswrapper[4760]: I0930 07:48:40.225767 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5318712b-bf53-4f94-9abd-4e1b97e8311a-config\") pod \"dnsmasq-dns-78dd6ddcc-bzpjr\" (UID: \"5318712b-bf53-4f94-9abd-4e1b97e8311a\") " pod="openstack/dnsmasq-dns-78dd6ddcc-bzpjr"
Sep 30 07:48:40 crc kubenswrapper[4760]: I0930 07:48:40.225878 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/230224f8-bc7e-491d-8545-ce88622bd97d-config\") pod \"dnsmasq-dns-675f4bcbfc-8t257\" (UID: \"230224f8-bc7e-491d-8545-ce88622bd97d\") " pod="openstack/dnsmasq-dns-675f4bcbfc-8t257"
Sep 30 07:48:40 crc kubenswrapper[4760]: I0930 07:48:40.244615 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vzpxk\" (UniqueName: \"kubernetes.io/projected/5318712b-bf53-4f94-9abd-4e1b97e8311a-kube-api-access-vzpxk\") pod \"dnsmasq-dns-78dd6ddcc-bzpjr\" (UID: \"5318712b-bf53-4f94-9abd-4e1b97e8311a\") " pod="openstack/dnsmasq-dns-78dd6ddcc-bzpjr"
Sep 30 07:48:40 crc kubenswrapper[4760]: I0930 07:48:40.251665 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t25zg\" (UniqueName: \"kubernetes.io/projected/230224f8-bc7e-491d-8545-ce88622bd97d-kube-api-access-t25zg\") pod \"dnsmasq-dns-675f4bcbfc-8t257\" (UID: \"230224f8-bc7e-491d-8545-ce88622bd97d\") " pod="openstack/dnsmasq-dns-675f4bcbfc-8t257"
Sep 30 07:48:40 crc kubenswrapper[4760]: I0930 07:48:40.334488 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-8t257"
Sep 30 07:48:40 crc kubenswrapper[4760]: I0930 07:48:40.397958 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-bzpjr"
Sep 30 07:48:40 crc kubenswrapper[4760]: I0930 07:48:40.879830 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-8t257"]
Sep 30 07:48:40 crc kubenswrapper[4760]: W0930 07:48:40.880417 4760 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod230224f8_bc7e_491d_8545_ce88622bd97d.slice/crio-2b6e2744bf9f83f053d60f0888131a4aeb692782d2d01f5af824a40c9aff0f47 WatchSource:0}: Error finding container 2b6e2744bf9f83f053d60f0888131a4aeb692782d2d01f5af824a40c9aff0f47: Status 404 returned error can't find the container with id 2b6e2744bf9f83f053d60f0888131a4aeb692782d2d01f5af824a40c9aff0f47
Sep 30 07:48:40 crc kubenswrapper[4760]: I0930 07:48:40.913806 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-bzpjr"]
Sep 30 07:48:40 crc kubenswrapper[4760]: W0930 07:48:40.921870 4760 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5318712b_bf53_4f94_9abd_4e1b97e8311a.slice/crio-02c0b2e3dc96917c787723f8e2725cb62524ed236e5478c97dc1d1bbc5e56dba WatchSource:0}: Error finding container 02c0b2e3dc96917c787723f8e2725cb62524ed236e5478c97dc1d1bbc5e56dba: Status 404 returned error can't find the container with id 02c0b2e3dc96917c787723f8e2725cb62524ed236e5478c97dc1d1bbc5e56dba
Sep 30 07:48:41 crc kubenswrapper[4760]: I0930 07:48:41.384048 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-675f4bcbfc-8t257" event={"ID":"230224f8-bc7e-491d-8545-ce88622bd97d","Type":"ContainerStarted","Data":"2b6e2744bf9f83f053d60f0888131a4aeb692782d2d01f5af824a40c9aff0f47"}
Sep 30 07:48:41 crc kubenswrapper[4760]: I0930 07:48:41.387106 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78dd6ddcc-bzpjr" event={"ID":"5318712b-bf53-4f94-9abd-4e1b97e8311a","Type":"ContainerStarted","Data":"02c0b2e3dc96917c787723f8e2725cb62524ed236e5478c97dc1d1bbc5e56dba"}
Sep 30 07:48:41 crc kubenswrapper[4760]: I0930 07:48:41.658208 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-pkcmt"
Sep 30 07:48:41 crc kubenswrapper[4760]: I0930 07:48:41.658539 4760 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-pkcmt"
Sep 30 07:48:41 crc kubenswrapper[4760]: I0930 07:48:41.768273 4760 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-pkcmt"
Sep 30 07:48:42 crc kubenswrapper[4760]: I0930 07:48:42.440980 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-pkcmt"
Sep 30 07:48:42 crc kubenswrapper[4760]: I0930 07:48:42.486738 4760 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-pkcmt"]
Sep 30 07:48:43 crc kubenswrapper[4760]: I0930 07:48:43.363683 4760 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-8t257"]
Sep 30 07:48:43 crc kubenswrapper[4760]: I0930 07:48:43.389675 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-ccxrw"]
Sep 30 07:48:43 crc kubenswrapper[4760]: I0930 07:48:43.391593 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-666b6646f7-ccxrw"
Sep 30 07:48:43 crc kubenswrapper[4760]: I0930 07:48:43.400754 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-ccxrw"]
Sep 30 07:48:43 crc kubenswrapper[4760]: I0930 07:48:43.580068 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4f6db3f0-59be-48d1-a2a7-8302680b2f4c-config\") pod \"dnsmasq-dns-666b6646f7-ccxrw\" (UID: \"4f6db3f0-59be-48d1-a2a7-8302680b2f4c\") " pod="openstack/dnsmasq-dns-666b6646f7-ccxrw"
Sep 30 07:48:43 crc kubenswrapper[4760]: I0930 07:48:43.580557 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4f6db3f0-59be-48d1-a2a7-8302680b2f4c-dns-svc\") pod \"dnsmasq-dns-666b6646f7-ccxrw\" (UID: \"4f6db3f0-59be-48d1-a2a7-8302680b2f4c\") " pod="openstack/dnsmasq-dns-666b6646f7-ccxrw"
Sep 30 07:48:43 crc kubenswrapper[4760]: I0930 07:48:43.580611 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xd6qt\" (UniqueName: \"kubernetes.io/projected/4f6db3f0-59be-48d1-a2a7-8302680b2f4c-kube-api-access-xd6qt\") pod \"dnsmasq-dns-666b6646f7-ccxrw\" (UID: \"4f6db3f0-59be-48d1-a2a7-8302680b2f4c\") " pod="openstack/dnsmasq-dns-666b6646f7-ccxrw"
Sep 30 07:48:43 crc kubenswrapper[4760]: I0930 07:48:43.682089 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4f6db3f0-59be-48d1-a2a7-8302680b2f4c-dns-svc\") pod \"dnsmasq-dns-666b6646f7-ccxrw\" (UID: \"4f6db3f0-59be-48d1-a2a7-8302680b2f4c\") " pod="openstack/dnsmasq-dns-666b6646f7-ccxrw"
Sep 30 07:48:43 crc kubenswrapper[4760]: I0930 07:48:43.682133 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xd6qt\" (UniqueName: \"kubernetes.io/projected/4f6db3f0-59be-48d1-a2a7-8302680b2f4c-kube-api-access-xd6qt\") pod \"dnsmasq-dns-666b6646f7-ccxrw\" (UID: \"4f6db3f0-59be-48d1-a2a7-8302680b2f4c\") " pod="openstack/dnsmasq-dns-666b6646f7-ccxrw"
Sep 30 07:48:43 crc kubenswrapper[4760]: I0930 07:48:43.682213 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4f6db3f0-59be-48d1-a2a7-8302680b2f4c-config\") pod \"dnsmasq-dns-666b6646f7-ccxrw\" (UID: \"4f6db3f0-59be-48d1-a2a7-8302680b2f4c\") " pod="openstack/dnsmasq-dns-666b6646f7-ccxrw"
Sep 30 07:48:43 crc kubenswrapper[4760]: I0930 07:48:43.683194 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4f6db3f0-59be-48d1-a2a7-8302680b2f4c-dns-svc\") pod \"dnsmasq-dns-666b6646f7-ccxrw\" (UID: \"4f6db3f0-59be-48d1-a2a7-8302680b2f4c\") " pod="openstack/dnsmasq-dns-666b6646f7-ccxrw"
Sep 30 07:48:43 crc kubenswrapper[4760]: I0930 07:48:43.683450 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4f6db3f0-59be-48d1-a2a7-8302680b2f4c-config\") pod \"dnsmasq-dns-666b6646f7-ccxrw\" (UID: \"4f6db3f0-59be-48d1-a2a7-8302680b2f4c\") " pod="openstack/dnsmasq-dns-666b6646f7-ccxrw"
Sep 30 07:48:43 crc kubenswrapper[4760]: I0930 07:48:43.684385 4760 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-bzpjr"]
Sep 30 07:48:43 crc kubenswrapper[4760]: I0930 07:48:43.707421 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xd6qt\" (UniqueName: \"kubernetes.io/projected/4f6db3f0-59be-48d1-a2a7-8302680b2f4c-kube-api-access-xd6qt\") pod \"dnsmasq-dns-666b6646f7-ccxrw\" (UID: \"4f6db3f0-59be-48d1-a2a7-8302680b2f4c\") " pod="openstack/dnsmasq-dns-666b6646f7-ccxrw"
Sep 30 07:48:43 crc kubenswrapper[4760]: I0930 07:48:43.716048 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-9fxn6"]
Sep 30 07:48:43 crc kubenswrapper[4760]: I0930 07:48:43.716269 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-666b6646f7-ccxrw"
Sep 30 07:48:43 crc kubenswrapper[4760]: I0930 07:48:43.717904 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-9fxn6"
Sep 30 07:48:43 crc kubenswrapper[4760]: I0930 07:48:43.728465 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-9fxn6"]
Sep 30 07:48:43 crc kubenswrapper[4760]: I0930 07:48:43.784611 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d6258f1d-30e6-4da1-b567-e37509041f6f-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-9fxn6\" (UID: \"d6258f1d-30e6-4da1-b567-e37509041f6f\") " pod="openstack/dnsmasq-dns-57d769cc4f-9fxn6"
Sep 30 07:48:43 crc kubenswrapper[4760]: I0930 07:48:43.784691 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d6258f1d-30e6-4da1-b567-e37509041f6f-config\") pod \"dnsmasq-dns-57d769cc4f-9fxn6\" (UID: \"d6258f1d-30e6-4da1-b567-e37509041f6f\") " pod="openstack/dnsmasq-dns-57d769cc4f-9fxn6"
Sep 30 07:48:43 crc kubenswrapper[4760]: I0930 07:48:43.784746 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lhlxx\" (UniqueName: \"kubernetes.io/projected/d6258f1d-30e6-4da1-b567-e37509041f6f-kube-api-access-lhlxx\") pod \"dnsmasq-dns-57d769cc4f-9fxn6\" (UID: \"d6258f1d-30e6-4da1-b567-e37509041f6f\") " pod="openstack/dnsmasq-dns-57d769cc4f-9fxn6"
Sep 30 07:48:43 crc kubenswrapper[4760]: I0930 07:48:43.885580 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lhlxx\" (UniqueName: \"kubernetes.io/projected/d6258f1d-30e6-4da1-b567-e37509041f6f-kube-api-access-lhlxx\") pod \"dnsmasq-dns-57d769cc4f-9fxn6\" (UID: \"d6258f1d-30e6-4da1-b567-e37509041f6f\") " pod="openstack/dnsmasq-dns-57d769cc4f-9fxn6"
Sep 30 07:48:43 crc kubenswrapper[4760]: I0930 07:48:43.885696 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d6258f1d-30e6-4da1-b567-e37509041f6f-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-9fxn6\" (UID: \"d6258f1d-30e6-4da1-b567-e37509041f6f\") " pod="openstack/dnsmasq-dns-57d769cc4f-9fxn6"
Sep 30 07:48:43 crc kubenswrapper[4760]: I0930 07:48:43.885745 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d6258f1d-30e6-4da1-b567-e37509041f6f-config\") pod \"dnsmasq-dns-57d769cc4f-9fxn6\" (UID: \"d6258f1d-30e6-4da1-b567-e37509041f6f\") " pod="openstack/dnsmasq-dns-57d769cc4f-9fxn6"
Sep 30 07:48:43 crc kubenswrapper[4760]: I0930 07:48:43.886765 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d6258f1d-30e6-4da1-b567-e37509041f6f-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-9fxn6\" (UID: \"d6258f1d-30e6-4da1-b567-e37509041f6f\") " pod="openstack/dnsmasq-dns-57d769cc4f-9fxn6"
Sep 30 07:48:43 crc kubenswrapper[4760]: I0930 07:48:43.889941 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d6258f1d-30e6-4da1-b567-e37509041f6f-config\") pod \"dnsmasq-dns-57d769cc4f-9fxn6\" (UID: \"d6258f1d-30e6-4da1-b567-e37509041f6f\") " pod="openstack/dnsmasq-dns-57d769cc4f-9fxn6"
Sep 30 07:48:43 crc kubenswrapper[4760]: I0930 07:48:43.904323 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lhlxx\" (UniqueName: \"kubernetes.io/projected/d6258f1d-30e6-4da1-b567-e37509041f6f-kube-api-access-lhlxx\") pod \"dnsmasq-dns-57d769cc4f-9fxn6\" (UID: \"d6258f1d-30e6-4da1-b567-e37509041f6f\") " pod="openstack/dnsmasq-dns-57d769cc4f-9fxn6"
Sep 30 07:48:44 crc kubenswrapper[4760]: I0930 07:48:44.054139 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-9fxn6"
Sep 30 07:48:44 crc kubenswrapper[4760]: I0930 07:48:44.427445 4760 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-pkcmt" podUID="41ed6688-dfb5-4dfa-8972-34f5fea18550" containerName="registry-server" containerID="cri-o://ea0e4aa04e4dadd59b96f36f1213b57d576ec0cea244fd16ac3e98f90b577291" gracePeriod=2
Sep 30 07:48:44 crc kubenswrapper[4760]: I0930 07:48:44.562519 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"]
Sep 30 07:48:44 crc kubenswrapper[4760]: I0930 07:48:44.563818 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0"
Sep 30 07:48:44 crc kubenswrapper[4760]: I0930 07:48:44.568778 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-ptvxd"
Sep 30 07:48:44 crc kubenswrapper[4760]: I0930 07:48:44.568799 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user"
Sep 30 07:48:44 crc kubenswrapper[4760]: I0930 07:48:44.569035 4760 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data"
Sep 30 07:48:44 crc kubenswrapper[4760]: I0930 07:48:44.569144 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie"
Sep 30 07:48:44 crc kubenswrapper[4760]: I0930 07:48:44.569249 4760 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf"
Sep 30 07:48:44 crc kubenswrapper[4760]: W0930 07:48:44.569377 4760 reflector.go:561] object-"openstack"/"cert-rabbitmq-svc": failed to list *v1.Secret: secrets "cert-rabbitmq-svc" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the namespace "openstack": no relationship found between node 'crc' and this object
Sep 30 07:48:44 crc kubenswrapper[4760]: I0930 07:48:44.569401 4760 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf"
Sep 30 07:48:44 crc kubenswrapper[4760]: E0930 07:48:44.569409 4760 reflector.go:158] "Unhandled Error" err="object-\"openstack\"/\"cert-rabbitmq-svc\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"cert-rabbitmq-svc\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openstack\": no relationship found between node 'crc' and this object" logger="UnhandledError"
Sep 30 07:48:44 crc kubenswrapper[4760]: I0930 07:48:44.586475 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"]
Sep 30 07:48:44 crc kubenswrapper[4760]: I0930 07:48:44.700964 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/82b71e6c-ab34-447e-87e0-a95a9f070efe-server-conf\") pod \"rabbitmq-server-0\" (UID: \"82b71e6c-ab34-447e-87e0-a95a9f070efe\") " pod="openstack/rabbitmq-server-0"
Sep 30 07:48:44 crc kubenswrapper[4760]: I0930 07:48:44.701010 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/82b71e6c-ab34-447e-87e0-a95a9f070efe-pod-info\") pod \"rabbitmq-server-0\" (UID: \"82b71e6c-ab34-447e-87e0-a95a9f070efe\") " pod="openstack/rabbitmq-server-0"
Sep 30 07:48:44 crc kubenswrapper[4760]: I0930 07:48:44.701032 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"rabbitmq-server-0\" (UID: \"82b71e6c-ab34-447e-87e0-a95a9f070efe\") " pod="openstack/rabbitmq-server-0"
Sep 30 07:48:44 crc kubenswrapper[4760]: I0930 07:48:44.701058 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/82b71e6c-ab34-447e-87e0-a95a9f070efe-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"82b71e6c-ab34-447e-87e0-a95a9f070efe\") " pod="openstack/rabbitmq-server-0"
Sep 30 07:48:44 crc kubenswrapper[4760]: I0930 07:48:44.701075 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/82b71e6c-ab34-447e-87e0-a95a9f070efe-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"82b71e6c-ab34-447e-87e0-a95a9f070efe\") " pod="openstack/rabbitmq-server-0"
Sep 30 07:48:44 crc kubenswrapper[4760]: I0930 07:48:44.701232 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/82b71e6c-ab34-447e-87e0-a95a9f070efe-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"82b71e6c-ab34-447e-87e0-a95a9f070efe\") " pod="openstack/rabbitmq-server-0"
Sep 30 07:48:44 crc kubenswrapper[4760]: I0930 07:48:44.701267 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/82b71e6c-ab34-447e-87e0-a95a9f070efe-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"82b71e6c-ab34-447e-87e0-a95a9f070efe\") " pod="openstack/rabbitmq-server-0"
Sep 30 07:48:44 crc kubenswrapper[4760]: I0930 07:48:44.701352 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/82b71e6c-ab34-447e-87e0-a95a9f070efe-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"82b71e6c-ab34-447e-87e0-a95a9f070efe\") " pod="openstack/rabbitmq-server-0"
Sep 30 07:48:44 crc kubenswrapper[4760]: I0930 07:48:44.701417 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/82b71e6c-ab34-447e-87e0-a95a9f070efe-config-data\") pod \"rabbitmq-server-0\" (UID: \"82b71e6c-ab34-447e-87e0-a95a9f070efe\") " pod="openstack/rabbitmq-server-0"
Sep 30 07:48:44 crc kubenswrapper[4760]: I0930 07:48:44.701512 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/82b71e6c-ab34-447e-87e0-a95a9f070efe-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"82b71e6c-ab34-447e-87e0-a95a9f070efe\") " pod="openstack/rabbitmq-server-0"
Sep 30 07:48:44 crc kubenswrapper[4760]: I0930 07:48:44.701556 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g6zbk\" (UniqueName: \"kubernetes.io/projected/82b71e6c-ab34-447e-87e0-a95a9f070efe-kube-api-access-g6zbk\") pod \"rabbitmq-server-0\" (UID: \"82b71e6c-ab34-447e-87e0-a95a9f070efe\") " pod="openstack/rabbitmq-server-0"
Sep 30 07:48:44 crc kubenswrapper[4760]: I0930 07:48:44.803057 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/82b71e6c-ab34-447e-87e0-a95a9f070efe-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"82b71e6c-ab34-447e-87e0-a95a9f070efe\") " pod="openstack/rabbitmq-server-0"
Sep 30 07:48:44 crc kubenswrapper[4760]: I0930 07:48:44.803127 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/82b71e6c-ab34-447e-87e0-a95a9f070efe-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"82b71e6c-ab34-447e-87e0-a95a9f070efe\") " pod="openstack/rabbitmq-server-0"
Sep 30 07:48:44 crc kubenswrapper[4760]: I0930 07:48:44.803149 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/82b71e6c-ab34-447e-87e0-a95a9f070efe-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"82b71e6c-ab34-447e-87e0-a95a9f070efe\") " pod="openstack/rabbitmq-server-0"
Sep 30 07:48:44 crc kubenswrapper[4760]: I0930 07:48:44.803171 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/82b71e6c-ab34-447e-87e0-a95a9f070efe-config-data\") pod \"rabbitmq-server-0\" (UID: \"82b71e6c-ab34-447e-87e0-a95a9f070efe\") " pod="openstack/rabbitmq-server-0"
Sep 30 07:48:44 crc kubenswrapper[4760]: I0930 07:48:44.803209 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/82b71e6c-ab34-447e-87e0-a95a9f070efe-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"82b71e6c-ab34-447e-87e0-a95a9f070efe\") " pod="openstack/rabbitmq-server-0"
Sep 30 07:48:44 crc kubenswrapper[4760]: I0930 07:48:44.803229 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g6zbk\" (UniqueName: \"kubernetes.io/projected/82b71e6c-ab34-447e-87e0-a95a9f070efe-kube-api-access-g6zbk\") pod \"rabbitmq-server-0\" (UID: \"82b71e6c-ab34-447e-87e0-a95a9f070efe\") " pod="openstack/rabbitmq-server-0"
Sep 30 07:48:44 crc kubenswrapper[4760]: I0930 07:48:44.803264 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/82b71e6c-ab34-447e-87e0-a95a9f070efe-server-conf\") pod \"rabbitmq-server-0\" (UID: \"82b71e6c-ab34-447e-87e0-a95a9f070efe\") " pod="openstack/rabbitmq-server-0"
Sep 30 07:48:44 crc kubenswrapper[4760]: I0930 07:48:44.803284 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/82b71e6c-ab34-447e-87e0-a95a9f070efe-pod-info\") pod \"rabbitmq-server-0\" (UID: \"82b71e6c-ab34-447e-87e0-a95a9f070efe\") " pod="openstack/rabbitmq-server-0"
Sep 30 07:48:44 crc kubenswrapper[4760]: I0930 07:48:44.803322 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"rabbitmq-server-0\" (UID: \"82b71e6c-ab34-447e-87e0-a95a9f070efe\") " pod="openstack/rabbitmq-server-0"
Sep 30 07:48:44 crc kubenswrapper[4760]: I0930 07:48:44.803348 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/82b71e6c-ab34-447e-87e0-a95a9f070efe-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"82b71e6c-ab34-447e-87e0-a95a9f070efe\") " pod="openstack/rabbitmq-server-0"
Sep 30 07:48:44 crc kubenswrapper[4760]: I0930 07:48:44.803364 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/82b71e6c-ab34-447e-87e0-a95a9f070efe-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"82b71e6c-ab34-447e-87e0-a95a9f070efe\") " pod="openstack/rabbitmq-server-0"
Sep 30 07:48:44 crc kubenswrapper[4760]: I0930 07:48:44.803778 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/82b71e6c-ab34-447e-87e0-a95a9f070efe-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"82b71e6c-ab34-447e-87e0-a95a9f070efe\") " pod="openstack/rabbitmq-server-0"
Sep 30 07:48:44 crc kubenswrapper[4760]: I0930 07:48:44.804447 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/82b71e6c-ab34-447e-87e0-a95a9f070efe-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"82b71e6c-ab34-447e-87e0-a95a9f070efe\") " pod="openstack/rabbitmq-server-0"
Sep
30 07:48:44 crc kubenswrapper[4760]: I0930 07:48:44.804577 4760 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"rabbitmq-server-0\" (UID: \"82b71e6c-ab34-447e-87e0-a95a9f070efe\") device mount path \"/mnt/openstack/pv03\"" pod="openstack/rabbitmq-server-0" Sep 30 07:48:44 crc kubenswrapper[4760]: I0930 07:48:44.805034 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/82b71e6c-ab34-447e-87e0-a95a9f070efe-config-data\") pod \"rabbitmq-server-0\" (UID: \"82b71e6c-ab34-447e-87e0-a95a9f070efe\") " pod="openstack/rabbitmq-server-0" Sep 30 07:48:44 crc kubenswrapper[4760]: I0930 07:48:44.805631 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/82b71e6c-ab34-447e-87e0-a95a9f070efe-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"82b71e6c-ab34-447e-87e0-a95a9f070efe\") " pod="openstack/rabbitmq-server-0" Sep 30 07:48:44 crc kubenswrapper[4760]: I0930 07:48:44.807246 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/82b71e6c-ab34-447e-87e0-a95a9f070efe-server-conf\") pod \"rabbitmq-server-0\" (UID: \"82b71e6c-ab34-447e-87e0-a95a9f070efe\") " pod="openstack/rabbitmq-server-0" Sep 30 07:48:44 crc kubenswrapper[4760]: I0930 07:48:44.808328 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/82b71e6c-ab34-447e-87e0-a95a9f070efe-pod-info\") pod \"rabbitmq-server-0\" (UID: \"82b71e6c-ab34-447e-87e0-a95a9f070efe\") " pod="openstack/rabbitmq-server-0" Sep 30 07:48:44 crc kubenswrapper[4760]: I0930 07:48:44.809435 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: 
\"kubernetes.io/projected/82b71e6c-ab34-447e-87e0-a95a9f070efe-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"82b71e6c-ab34-447e-87e0-a95a9f070efe\") " pod="openstack/rabbitmq-server-0" Sep 30 07:48:44 crc kubenswrapper[4760]: I0930 07:48:44.813202 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/82b71e6c-ab34-447e-87e0-a95a9f070efe-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"82b71e6c-ab34-447e-87e0-a95a9f070efe\") " pod="openstack/rabbitmq-server-0" Sep 30 07:48:44 crc kubenswrapper[4760]: I0930 07:48:44.825411 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g6zbk\" (UniqueName: \"kubernetes.io/projected/82b71e6c-ab34-447e-87e0-a95a9f070efe-kube-api-access-g6zbk\") pod \"rabbitmq-server-0\" (UID: \"82b71e6c-ab34-447e-87e0-a95a9f070efe\") " pod="openstack/rabbitmq-server-0" Sep 30 07:48:44 crc kubenswrapper[4760]: I0930 07:48:44.825681 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"rabbitmq-server-0\" (UID: \"82b71e6c-ab34-447e-87e0-a95a9f070efe\") " pod="openstack/rabbitmq-server-0" Sep 30 07:48:44 crc kubenswrapper[4760]: I0930 07:48:44.840023 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Sep 30 07:48:44 crc kubenswrapper[4760]: I0930 07:48:44.843706 4760 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Sep 30 07:48:44 crc kubenswrapper[4760]: I0930 07:48:44.855088 4760 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Sep 30 07:48:44 crc kubenswrapper[4760]: I0930 07:48:44.855201 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Sep 30 07:48:44 crc kubenswrapper[4760]: I0930 07:48:44.855228 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-mbsbc" Sep 30 07:48:44 crc kubenswrapper[4760]: I0930 07:48:44.855418 4760 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Sep 30 07:48:44 crc kubenswrapper[4760]: I0930 07:48:44.855496 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Sep 30 07:48:44 crc kubenswrapper[4760]: I0930 07:48:44.855669 4760 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data" Sep 30 07:48:44 crc kubenswrapper[4760]: I0930 07:48:44.855782 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc" Sep 30 07:48:44 crc kubenswrapper[4760]: I0930 07:48:44.858990 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Sep 30 07:48:44 crc kubenswrapper[4760]: I0930 07:48:44.904213 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/888bbd15-0d32-47ca-9f81-94eaf8f3c4df-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"888bbd15-0d32-47ca-9f81-94eaf8f3c4df\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 07:48:44 crc kubenswrapper[4760]: I0930 07:48:44.904257 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"pod-info\" (UniqueName: \"kubernetes.io/downward-api/888bbd15-0d32-47ca-9f81-94eaf8f3c4df-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"888bbd15-0d32-47ca-9f81-94eaf8f3c4df\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 07:48:44 crc kubenswrapper[4760]: I0930 07:48:44.904314 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/888bbd15-0d32-47ca-9f81-94eaf8f3c4df-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"888bbd15-0d32-47ca-9f81-94eaf8f3c4df\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 07:48:44 crc kubenswrapper[4760]: I0930 07:48:44.904354 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/888bbd15-0d32-47ca-9f81-94eaf8f3c4df-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"888bbd15-0d32-47ca-9f81-94eaf8f3c4df\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 07:48:44 crc kubenswrapper[4760]: I0930 07:48:44.904387 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/888bbd15-0d32-47ca-9f81-94eaf8f3c4df-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"888bbd15-0d32-47ca-9f81-94eaf8f3c4df\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 07:48:44 crc kubenswrapper[4760]: I0930 07:48:44.904414 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/888bbd15-0d32-47ca-9f81-94eaf8f3c4df-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"888bbd15-0d32-47ca-9f81-94eaf8f3c4df\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 07:48:44 crc kubenswrapper[4760]: I0930 07:48:44.904456 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"888bbd15-0d32-47ca-9f81-94eaf8f3c4df\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 07:48:44 crc kubenswrapper[4760]: I0930 07:48:44.904479 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/888bbd15-0d32-47ca-9f81-94eaf8f3c4df-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"888bbd15-0d32-47ca-9f81-94eaf8f3c4df\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 07:48:44 crc kubenswrapper[4760]: I0930 07:48:44.904533 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/888bbd15-0d32-47ca-9f81-94eaf8f3c4df-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"888bbd15-0d32-47ca-9f81-94eaf8f3c4df\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 07:48:44 crc kubenswrapper[4760]: I0930 07:48:44.904723 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lcjvh\" (UniqueName: \"kubernetes.io/projected/888bbd15-0d32-47ca-9f81-94eaf8f3c4df-kube-api-access-lcjvh\") pod \"rabbitmq-cell1-server-0\" (UID: \"888bbd15-0d32-47ca-9f81-94eaf8f3c4df\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 07:48:44 crc kubenswrapper[4760]: I0930 07:48:44.904773 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/888bbd15-0d32-47ca-9f81-94eaf8f3c4df-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"888bbd15-0d32-47ca-9f81-94eaf8f3c4df\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 07:48:45 crc kubenswrapper[4760]: I0930 07:48:45.006286 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: 
\"kubernetes.io/empty-dir/888bbd15-0d32-47ca-9f81-94eaf8f3c4df-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"888bbd15-0d32-47ca-9f81-94eaf8f3c4df\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 07:48:45 crc kubenswrapper[4760]: I0930 07:48:45.006354 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/888bbd15-0d32-47ca-9f81-94eaf8f3c4df-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"888bbd15-0d32-47ca-9f81-94eaf8f3c4df\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 07:48:45 crc kubenswrapper[4760]: I0930 07:48:45.006412 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"888bbd15-0d32-47ca-9f81-94eaf8f3c4df\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 07:48:45 crc kubenswrapper[4760]: I0930 07:48:45.006441 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/888bbd15-0d32-47ca-9f81-94eaf8f3c4df-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"888bbd15-0d32-47ca-9f81-94eaf8f3c4df\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 07:48:45 crc kubenswrapper[4760]: I0930 07:48:45.006460 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/888bbd15-0d32-47ca-9f81-94eaf8f3c4df-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"888bbd15-0d32-47ca-9f81-94eaf8f3c4df\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 07:48:45 crc kubenswrapper[4760]: I0930 07:48:45.006490 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lcjvh\" (UniqueName: \"kubernetes.io/projected/888bbd15-0d32-47ca-9f81-94eaf8f3c4df-kube-api-access-lcjvh\") pod \"rabbitmq-cell1-server-0\" 
(UID: \"888bbd15-0d32-47ca-9f81-94eaf8f3c4df\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 07:48:45 crc kubenswrapper[4760]: I0930 07:48:45.006510 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/888bbd15-0d32-47ca-9f81-94eaf8f3c4df-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"888bbd15-0d32-47ca-9f81-94eaf8f3c4df\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 07:48:45 crc kubenswrapper[4760]: I0930 07:48:45.006547 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/888bbd15-0d32-47ca-9f81-94eaf8f3c4df-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"888bbd15-0d32-47ca-9f81-94eaf8f3c4df\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 07:48:45 crc kubenswrapper[4760]: I0930 07:48:45.006568 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/888bbd15-0d32-47ca-9f81-94eaf8f3c4df-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"888bbd15-0d32-47ca-9f81-94eaf8f3c4df\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 07:48:45 crc kubenswrapper[4760]: I0930 07:48:45.006611 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/888bbd15-0d32-47ca-9f81-94eaf8f3c4df-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"888bbd15-0d32-47ca-9f81-94eaf8f3c4df\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 07:48:45 crc kubenswrapper[4760]: I0930 07:48:45.006646 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/888bbd15-0d32-47ca-9f81-94eaf8f3c4df-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"888bbd15-0d32-47ca-9f81-94eaf8f3c4df\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 07:48:45 crc 
kubenswrapper[4760]: I0930 07:48:45.006688 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/888bbd15-0d32-47ca-9f81-94eaf8f3c4df-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"888bbd15-0d32-47ca-9f81-94eaf8f3c4df\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 07:48:45 crc kubenswrapper[4760]: I0930 07:48:45.007042 4760 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"888bbd15-0d32-47ca-9f81-94eaf8f3c4df\") device mount path \"/mnt/openstack/pv12\"" pod="openstack/rabbitmq-cell1-server-0" Sep 30 07:48:45 crc kubenswrapper[4760]: I0930 07:48:45.010114 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/888bbd15-0d32-47ca-9f81-94eaf8f3c4df-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"888bbd15-0d32-47ca-9f81-94eaf8f3c4df\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 07:48:45 crc kubenswrapper[4760]: I0930 07:48:45.011134 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/888bbd15-0d32-47ca-9f81-94eaf8f3c4df-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"888bbd15-0d32-47ca-9f81-94eaf8f3c4df\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 07:48:45 crc kubenswrapper[4760]: I0930 07:48:45.011961 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/888bbd15-0d32-47ca-9f81-94eaf8f3c4df-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"888bbd15-0d32-47ca-9f81-94eaf8f3c4df\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 07:48:45 crc kubenswrapper[4760]: I0930 07:48:45.012508 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/888bbd15-0d32-47ca-9f81-94eaf8f3c4df-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"888bbd15-0d32-47ca-9f81-94eaf8f3c4df\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 07:48:45 crc kubenswrapper[4760]: I0930 07:48:45.012522 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/888bbd15-0d32-47ca-9f81-94eaf8f3c4df-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"888bbd15-0d32-47ca-9f81-94eaf8f3c4df\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 07:48:45 crc kubenswrapper[4760]: I0930 07:48:45.012990 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/888bbd15-0d32-47ca-9f81-94eaf8f3c4df-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"888bbd15-0d32-47ca-9f81-94eaf8f3c4df\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 07:48:45 crc kubenswrapper[4760]: I0930 07:48:45.013500 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/888bbd15-0d32-47ca-9f81-94eaf8f3c4df-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"888bbd15-0d32-47ca-9f81-94eaf8f3c4df\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 07:48:45 crc kubenswrapper[4760]: I0930 07:48:45.014971 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/888bbd15-0d32-47ca-9f81-94eaf8f3c4df-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"888bbd15-0d32-47ca-9f81-94eaf8f3c4df\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 07:48:45 crc kubenswrapper[4760]: I0930 07:48:45.027240 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lcjvh\" (UniqueName: \"kubernetes.io/projected/888bbd15-0d32-47ca-9f81-94eaf8f3c4df-kube-api-access-lcjvh\") pod 
\"rabbitmq-cell1-server-0\" (UID: \"888bbd15-0d32-47ca-9f81-94eaf8f3c4df\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 07:48:45 crc kubenswrapper[4760]: I0930 07:48:45.027487 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"888bbd15-0d32-47ca-9f81-94eaf8f3c4df\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 07:48:45 crc kubenswrapper[4760]: I0930 07:48:45.198882 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Sep 30 07:48:45 crc kubenswrapper[4760]: I0930 07:48:45.444889 4760 generic.go:334] "Generic (PLEG): container finished" podID="41ed6688-dfb5-4dfa-8972-34f5fea18550" containerID="ea0e4aa04e4dadd59b96f36f1213b57d576ec0cea244fd16ac3e98f90b577291" exitCode=0 Sep 30 07:48:45 crc kubenswrapper[4760]: I0930 07:48:45.444922 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-pkcmt" event={"ID":"41ed6688-dfb5-4dfa-8972-34f5fea18550","Type":"ContainerDied","Data":"ea0e4aa04e4dadd59b96f36f1213b57d576ec0cea244fd16ac3e98f90b577291"} Sep 30 07:48:45 crc kubenswrapper[4760]: I0930 07:48:45.789410 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc" Sep 30 07:48:45 crc kubenswrapper[4760]: I0930 07:48:45.806991 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/82b71e6c-ab34-447e-87e0-a95a9f070efe-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"82b71e6c-ab34-447e-87e0-a95a9f070efe\") " pod="openstack/rabbitmq-server-0" Sep 30 07:48:46 crc kubenswrapper[4760]: I0930 07:48:46.082858 4760 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Sep 30 07:48:46 crc kubenswrapper[4760]: I0930 07:48:46.651968 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-galera-0"] Sep 30 07:48:46 crc kubenswrapper[4760]: I0930 07:48:46.663214 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-galera-0" Sep 30 07:48:46 crc kubenswrapper[4760]: I0930 07:48:46.663262 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Sep 30 07:48:46 crc kubenswrapper[4760]: I0930 07:48:46.666095 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-svc" Sep 30 07:48:46 crc kubenswrapper[4760]: I0930 07:48:46.666227 4760 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config-data" Sep 30 07:48:46 crc kubenswrapper[4760]: I0930 07:48:46.666234 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Sep 30 07:48:46 crc kubenswrapper[4760]: I0930 07:48:46.667843 4760 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-scripts" Sep 30 07:48:46 crc kubenswrapper[4760]: I0930 07:48:46.668042 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-dockercfg-8fzf4" Sep 30 07:48:46 crc kubenswrapper[4760]: I0930 07:48:46.683312 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"combined-ca-bundle" Sep 30 07:48:46 crc kubenswrapper[4760]: I0930 07:48:46.746248 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/641818bf-a81e-4654-a8f7-c8d06fbefc6c-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"641818bf-a81e-4654-a8f7-c8d06fbefc6c\") " pod="openstack/openstack-galera-0" Sep 30 07:48:46 crc kubenswrapper[4760]: I0930 07:48:46.746329 4760 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/641818bf-a81e-4654-a8f7-c8d06fbefc6c-secrets\") pod \"openstack-galera-0\" (UID: \"641818bf-a81e-4654-a8f7-c8d06fbefc6c\") " pod="openstack/openstack-galera-0" Sep 30 07:48:46 crc kubenswrapper[4760]: I0930 07:48:46.746351 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/641818bf-a81e-4654-a8f7-c8d06fbefc6c-operator-scripts\") pod \"openstack-galera-0\" (UID: \"641818bf-a81e-4654-a8f7-c8d06fbefc6c\") " pod="openstack/openstack-galera-0" Sep 30 07:48:46 crc kubenswrapper[4760]: I0930 07:48:46.746367 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/641818bf-a81e-4654-a8f7-c8d06fbefc6c-config-data-generated\") pod \"openstack-galera-0\" (UID: \"641818bf-a81e-4654-a8f7-c8d06fbefc6c\") " pod="openstack/openstack-galera-0" Sep 30 07:48:46 crc kubenswrapper[4760]: I0930 07:48:46.746382 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/641818bf-a81e-4654-a8f7-c8d06fbefc6c-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"641818bf-a81e-4654-a8f7-c8d06fbefc6c\") " pod="openstack/openstack-galera-0" Sep 30 07:48:46 crc kubenswrapper[4760]: I0930 07:48:46.746405 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lgvf7\" (UniqueName: \"kubernetes.io/projected/641818bf-a81e-4654-a8f7-c8d06fbefc6c-kube-api-access-lgvf7\") pod \"openstack-galera-0\" (UID: \"641818bf-a81e-4654-a8f7-c8d06fbefc6c\") " pod="openstack/openstack-galera-0" Sep 30 07:48:46 crc kubenswrapper[4760]: I0930 07:48:46.746432 4760 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/641818bf-a81e-4654-a8f7-c8d06fbefc6c-kolla-config\") pod \"openstack-galera-0\" (UID: \"641818bf-a81e-4654-a8f7-c8d06fbefc6c\") " pod="openstack/openstack-galera-0" Sep 30 07:48:46 crc kubenswrapper[4760]: I0930 07:48:46.746464 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/641818bf-a81e-4654-a8f7-c8d06fbefc6c-config-data-default\") pod \"openstack-galera-0\" (UID: \"641818bf-a81e-4654-a8f7-c8d06fbefc6c\") " pod="openstack/openstack-galera-0" Sep 30 07:48:46 crc kubenswrapper[4760]: I0930 07:48:46.746486 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"openstack-galera-0\" (UID: \"641818bf-a81e-4654-a8f7-c8d06fbefc6c\") " pod="openstack/openstack-galera-0" Sep 30 07:48:46 crc kubenswrapper[4760]: I0930 07:48:46.849184 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/641818bf-a81e-4654-a8f7-c8d06fbefc6c-config-data-generated\") pod \"openstack-galera-0\" (UID: \"641818bf-a81e-4654-a8f7-c8d06fbefc6c\") " pod="openstack/openstack-galera-0" Sep 30 07:48:46 crc kubenswrapper[4760]: I0930 07:48:46.849242 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/641818bf-a81e-4654-a8f7-c8d06fbefc6c-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"641818bf-a81e-4654-a8f7-c8d06fbefc6c\") " pod="openstack/openstack-galera-0" Sep 30 07:48:46 crc kubenswrapper[4760]: I0930 07:48:46.849283 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lgvf7\" (UniqueName: 
\"kubernetes.io/projected/641818bf-a81e-4654-a8f7-c8d06fbefc6c-kube-api-access-lgvf7\") pod \"openstack-galera-0\" (UID: \"641818bf-a81e-4654-a8f7-c8d06fbefc6c\") " pod="openstack/openstack-galera-0"
Sep 30 07:48:46 crc kubenswrapper[4760]: I0930 07:48:46.849343 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/641818bf-a81e-4654-a8f7-c8d06fbefc6c-kolla-config\") pod \"openstack-galera-0\" (UID: \"641818bf-a81e-4654-a8f7-c8d06fbefc6c\") " pod="openstack/openstack-galera-0"
Sep 30 07:48:46 crc kubenswrapper[4760]: I0930 07:48:46.849466 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/641818bf-a81e-4654-a8f7-c8d06fbefc6c-config-data-default\") pod \"openstack-galera-0\" (UID: \"641818bf-a81e-4654-a8f7-c8d06fbefc6c\") " pod="openstack/openstack-galera-0"
Sep 30 07:48:46 crc kubenswrapper[4760]: I0930 07:48:46.849505 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"openstack-galera-0\" (UID: \"641818bf-a81e-4654-a8f7-c8d06fbefc6c\") " pod="openstack/openstack-galera-0"
Sep 30 07:48:46 crc kubenswrapper[4760]: I0930 07:48:46.849552 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/641818bf-a81e-4654-a8f7-c8d06fbefc6c-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"641818bf-a81e-4654-a8f7-c8d06fbefc6c\") " pod="openstack/openstack-galera-0"
Sep 30 07:48:46 crc kubenswrapper[4760]: I0930 07:48:46.849604 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/641818bf-a81e-4654-a8f7-c8d06fbefc6c-secrets\") pod \"openstack-galera-0\" (UID: \"641818bf-a81e-4654-a8f7-c8d06fbefc6c\") " pod="openstack/openstack-galera-0"
Sep 30 07:48:46 crc kubenswrapper[4760]: I0930 07:48:46.849633 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/641818bf-a81e-4654-a8f7-c8d06fbefc6c-operator-scripts\") pod \"openstack-galera-0\" (UID: \"641818bf-a81e-4654-a8f7-c8d06fbefc6c\") " pod="openstack/openstack-galera-0"
Sep 30 07:48:46 crc kubenswrapper[4760]: I0930 07:48:46.851148 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/641818bf-a81e-4654-a8f7-c8d06fbefc6c-config-data-generated\") pod \"openstack-galera-0\" (UID: \"641818bf-a81e-4654-a8f7-c8d06fbefc6c\") " pod="openstack/openstack-galera-0"
Sep 30 07:48:46 crc kubenswrapper[4760]: I0930 07:48:46.851395 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/641818bf-a81e-4654-a8f7-c8d06fbefc6c-kolla-config\") pod \"openstack-galera-0\" (UID: \"641818bf-a81e-4654-a8f7-c8d06fbefc6c\") " pod="openstack/openstack-galera-0"
Sep 30 07:48:46 crc kubenswrapper[4760]: I0930 07:48:46.851757 4760 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"openstack-galera-0\" (UID: \"641818bf-a81e-4654-a8f7-c8d06fbefc6c\") device mount path \"/mnt/openstack/pv11\"" pod="openstack/openstack-galera-0"
Sep 30 07:48:46 crc kubenswrapper[4760]: I0930 07:48:46.851775 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/641818bf-a81e-4654-a8f7-c8d06fbefc6c-config-data-default\") pod \"openstack-galera-0\" (UID: \"641818bf-a81e-4654-a8f7-c8d06fbefc6c\") " pod="openstack/openstack-galera-0"
Sep 30 07:48:46 crc kubenswrapper[4760]: I0930 07:48:46.853657 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/641818bf-a81e-4654-a8f7-c8d06fbefc6c-operator-scripts\") pod \"openstack-galera-0\" (UID: \"641818bf-a81e-4654-a8f7-c8d06fbefc6c\") " pod="openstack/openstack-galera-0"
Sep 30 07:48:46 crc kubenswrapper[4760]: I0930 07:48:46.858812 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/641818bf-a81e-4654-a8f7-c8d06fbefc6c-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"641818bf-a81e-4654-a8f7-c8d06fbefc6c\") " pod="openstack/openstack-galera-0"
Sep 30 07:48:46 crc kubenswrapper[4760]: I0930 07:48:46.859046 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/641818bf-a81e-4654-a8f7-c8d06fbefc6c-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"641818bf-a81e-4654-a8f7-c8d06fbefc6c\") " pod="openstack/openstack-galera-0"
Sep 30 07:48:46 crc kubenswrapper[4760]: I0930 07:48:46.877320 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/641818bf-a81e-4654-a8f7-c8d06fbefc6c-secrets\") pod \"openstack-galera-0\" (UID: \"641818bf-a81e-4654-a8f7-c8d06fbefc6c\") " pod="openstack/openstack-galera-0"
Sep 30 07:48:46 crc kubenswrapper[4760]: I0930 07:48:46.878156 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"openstack-galera-0\" (UID: \"641818bf-a81e-4654-a8f7-c8d06fbefc6c\") " pod="openstack/openstack-galera-0"
Sep 30 07:48:46 crc kubenswrapper[4760]: I0930 07:48:46.887493 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lgvf7\" (UniqueName: \"kubernetes.io/projected/641818bf-a81e-4654-a8f7-c8d06fbefc6c-kube-api-access-lgvf7\") pod \"openstack-galera-0\" (UID: \"641818bf-a81e-4654-a8f7-c8d06fbefc6c\") " pod="openstack/openstack-galera-0"
Sep 30 07:48:46 crc kubenswrapper[4760]: I0930 07:48:46.990722 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-galera-0"
Sep 30 07:48:47 crc kubenswrapper[4760]: I0930 07:48:47.352760 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-cell1-galera-0"]
Sep 30 07:48:47 crc kubenswrapper[4760]: I0930 07:48:47.354244 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0"
Sep 30 07:48:47 crc kubenswrapper[4760]: I0930 07:48:47.357151 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-cell1-dockercfg-bb2jq"
Sep 30 07:48:47 crc kubenswrapper[4760]: I0930 07:48:47.357465 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-cell1-svc"
Sep 30 07:48:47 crc kubenswrapper[4760]: I0930 07:48:47.357517 4760 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-config-data"
Sep 30 07:48:47 crc kubenswrapper[4760]: I0930 07:48:47.373777 4760 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-scripts"
Sep 30 07:48:47 crc kubenswrapper[4760]: I0930 07:48:47.378214 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"]
Sep 30 07:48:47 crc kubenswrapper[4760]: I0930 07:48:47.456803 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fvkf8\" (UniqueName: \"kubernetes.io/projected/894abb89-f647-4143-904c-88b5108982cd-kube-api-access-fvkf8\") pod \"openstack-cell1-galera-0\" (UID: \"894abb89-f647-4143-904c-88b5108982cd\") " pod="openstack/openstack-cell1-galera-0"
Sep 30 07:48:47 crc kubenswrapper[4760]: I0930 07:48:47.456864 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"openstack-cell1-galera-0\" (UID: \"894abb89-f647-4143-904c-88b5108982cd\") " pod="openstack/openstack-cell1-galera-0"
Sep 30 07:48:47 crc kubenswrapper[4760]: I0930 07:48:47.456987 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/894abb89-f647-4143-904c-88b5108982cd-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"894abb89-f647-4143-904c-88b5108982cd\") " pod="openstack/openstack-cell1-galera-0"
Sep 30 07:48:47 crc kubenswrapper[4760]: I0930 07:48:47.457090 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/894abb89-f647-4143-904c-88b5108982cd-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"894abb89-f647-4143-904c-88b5108982cd\") " pod="openstack/openstack-cell1-galera-0"
Sep 30 07:48:47 crc kubenswrapper[4760]: I0930 07:48:47.457143 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/894abb89-f647-4143-904c-88b5108982cd-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"894abb89-f647-4143-904c-88b5108982cd\") " pod="openstack/openstack-cell1-galera-0"
Sep 30 07:48:47 crc kubenswrapper[4760]: I0930 07:48:47.457168 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/894abb89-f647-4143-904c-88b5108982cd-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"894abb89-f647-4143-904c-88b5108982cd\") " pod="openstack/openstack-cell1-galera-0"
Sep 30 07:48:47 crc kubenswrapper[4760]: I0930 07:48:47.457342 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/894abb89-f647-4143-904c-88b5108982cd-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"894abb89-f647-4143-904c-88b5108982cd\") " pod="openstack/openstack-cell1-galera-0"
Sep 30 07:48:47 crc kubenswrapper[4760]: I0930 07:48:47.457401 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/894abb89-f647-4143-904c-88b5108982cd-secrets\") pod \"openstack-cell1-galera-0\" (UID: \"894abb89-f647-4143-904c-88b5108982cd\") " pod="openstack/openstack-cell1-galera-0"
Sep 30 07:48:47 crc kubenswrapper[4760]: I0930 07:48:47.457460 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/894abb89-f647-4143-904c-88b5108982cd-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"894abb89-f647-4143-904c-88b5108982cd\") " pod="openstack/openstack-cell1-galera-0"
Sep 30 07:48:47 crc kubenswrapper[4760]: I0930 07:48:47.558625 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fvkf8\" (UniqueName: \"kubernetes.io/projected/894abb89-f647-4143-904c-88b5108982cd-kube-api-access-fvkf8\") pod \"openstack-cell1-galera-0\" (UID: \"894abb89-f647-4143-904c-88b5108982cd\") " pod="openstack/openstack-cell1-galera-0"
Sep 30 07:48:47 crc kubenswrapper[4760]: I0930 07:48:47.558705 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"openstack-cell1-galera-0\" (UID: \"894abb89-f647-4143-904c-88b5108982cd\") " pod="openstack/openstack-cell1-galera-0"
Sep 30 07:48:47 crc kubenswrapper[4760]: I0930 07:48:47.558736 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/894abb89-f647-4143-904c-88b5108982cd-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"894abb89-f647-4143-904c-88b5108982cd\") " pod="openstack/openstack-cell1-galera-0"
Sep 30 07:48:47 crc kubenswrapper[4760]: I0930 07:48:47.558765 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/894abb89-f647-4143-904c-88b5108982cd-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"894abb89-f647-4143-904c-88b5108982cd\") " pod="openstack/openstack-cell1-galera-0"
Sep 30 07:48:47 crc kubenswrapper[4760]: I0930 07:48:47.558789 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/894abb89-f647-4143-904c-88b5108982cd-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"894abb89-f647-4143-904c-88b5108982cd\") " pod="openstack/openstack-cell1-galera-0"
Sep 30 07:48:47 crc kubenswrapper[4760]: I0930 07:48:47.558806 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/894abb89-f647-4143-904c-88b5108982cd-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"894abb89-f647-4143-904c-88b5108982cd\") " pod="openstack/openstack-cell1-galera-0"
Sep 30 07:48:47 crc kubenswrapper[4760]: I0930 07:48:47.558853 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/894abb89-f647-4143-904c-88b5108982cd-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"894abb89-f647-4143-904c-88b5108982cd\") " pod="openstack/openstack-cell1-galera-0"
Sep 30 07:48:47 crc kubenswrapper[4760]: I0930 07:48:47.558888 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/894abb89-f647-4143-904c-88b5108982cd-secrets\") pod \"openstack-cell1-galera-0\" (UID: \"894abb89-f647-4143-904c-88b5108982cd\") " pod="openstack/openstack-cell1-galera-0"
Sep 30 07:48:47 crc kubenswrapper[4760]: I0930 07:48:47.558930 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/894abb89-f647-4143-904c-88b5108982cd-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"894abb89-f647-4143-904c-88b5108982cd\") " pod="openstack/openstack-cell1-galera-0"
Sep 30 07:48:47 crc kubenswrapper[4760]: I0930 07:48:47.559478 4760 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"openstack-cell1-galera-0\" (UID: \"894abb89-f647-4143-904c-88b5108982cd\") device mount path \"/mnt/openstack/pv05\"" pod="openstack/openstack-cell1-galera-0"
Sep 30 07:48:47 crc kubenswrapper[4760]: I0930 07:48:47.559740 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/894abb89-f647-4143-904c-88b5108982cd-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"894abb89-f647-4143-904c-88b5108982cd\") " pod="openstack/openstack-cell1-galera-0"
Sep 30 07:48:47 crc kubenswrapper[4760]: I0930 07:48:47.559741 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/894abb89-f647-4143-904c-88b5108982cd-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"894abb89-f647-4143-904c-88b5108982cd\") " pod="openstack/openstack-cell1-galera-0"
Sep 30 07:48:47 crc kubenswrapper[4760]: I0930 07:48:47.560134 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/894abb89-f647-4143-904c-88b5108982cd-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"894abb89-f647-4143-904c-88b5108982cd\") " pod="openstack/openstack-cell1-galera-0"
Sep 30 07:48:47 crc kubenswrapper[4760]: I0930 07:48:47.560450 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/894abb89-f647-4143-904c-88b5108982cd-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"894abb89-f647-4143-904c-88b5108982cd\") " pod="openstack/openstack-cell1-galera-0"
Sep 30 07:48:47 crc kubenswrapper[4760]: I0930 07:48:47.566110 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/894abb89-f647-4143-904c-88b5108982cd-secrets\") pod \"openstack-cell1-galera-0\" (UID: \"894abb89-f647-4143-904c-88b5108982cd\") " pod="openstack/openstack-cell1-galera-0"
Sep 30 07:48:47 crc kubenswrapper[4760]: I0930 07:48:47.570026 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/894abb89-f647-4143-904c-88b5108982cd-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"894abb89-f647-4143-904c-88b5108982cd\") " pod="openstack/openstack-cell1-galera-0"
Sep 30 07:48:47 crc kubenswrapper[4760]: I0930 07:48:47.577003 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/894abb89-f647-4143-904c-88b5108982cd-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"894abb89-f647-4143-904c-88b5108982cd\") " pod="openstack/openstack-cell1-galera-0"
Sep 30 07:48:47 crc kubenswrapper[4760]: I0930 07:48:47.595051 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fvkf8\" (UniqueName: \"kubernetes.io/projected/894abb89-f647-4143-904c-88b5108982cd-kube-api-access-fvkf8\") pod \"openstack-cell1-galera-0\" (UID: \"894abb89-f647-4143-904c-88b5108982cd\") " pod="openstack/openstack-cell1-galera-0"
Sep 30 07:48:47 crc kubenswrapper[4760]: I0930 07:48:47.595613 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"openstack-cell1-galera-0\" (UID: \"894abb89-f647-4143-904c-88b5108982cd\") " pod="openstack/openstack-cell1-galera-0"
Sep 30 07:48:47 crc kubenswrapper[4760]: I0930 07:48:47.689404 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0"
Sep 30 07:48:47 crc kubenswrapper[4760]: I0930 07:48:47.892767 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/memcached-0"]
Sep 30 07:48:47 crc kubenswrapper[4760]: I0930 07:48:47.895013 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0"
Sep 30 07:48:47 crc kubenswrapper[4760]: I0930 07:48:47.900917 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-memcached-svc"
Sep 30 07:48:47 crc kubenswrapper[4760]: I0930 07:48:47.901075 4760 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"memcached-config-data"
Sep 30 07:48:47 crc kubenswrapper[4760]: I0930 07:48:47.901254 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"memcached-memcached-dockercfg-9kvr7"
Sep 30 07:48:47 crc kubenswrapper[4760]: I0930 07:48:47.915078 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"]
Sep 30 07:48:47 crc kubenswrapper[4760]: I0930 07:48:47.965845 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/598c1476-b9fa-48c1-a346-80e23448d00f-config-data\") pod \"memcached-0\" (UID: \"598c1476-b9fa-48c1-a346-80e23448d00f\") " pod="openstack/memcached-0"
Sep 30 07:48:47 crc kubenswrapper[4760]: I0930 07:48:47.965918 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/598c1476-b9fa-48c1-a346-80e23448d00f-kolla-config\") pod \"memcached-0\" (UID: \"598c1476-b9fa-48c1-a346-80e23448d00f\") " pod="openstack/memcached-0"
Sep 30 07:48:47 crc kubenswrapper[4760]: I0930 07:48:47.965947 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/598c1476-b9fa-48c1-a346-80e23448d00f-memcached-tls-certs\") pod \"memcached-0\" (UID: \"598c1476-b9fa-48c1-a346-80e23448d00f\") " pod="openstack/memcached-0"
Sep 30 07:48:47 crc kubenswrapper[4760]: I0930 07:48:47.965973 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/598c1476-b9fa-48c1-a346-80e23448d00f-combined-ca-bundle\") pod \"memcached-0\" (UID: \"598c1476-b9fa-48c1-a346-80e23448d00f\") " pod="openstack/memcached-0"
Sep 30 07:48:47 crc kubenswrapper[4760]: I0930 07:48:47.965993 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8d74f\" (UniqueName: \"kubernetes.io/projected/598c1476-b9fa-48c1-a346-80e23448d00f-kube-api-access-8d74f\") pod \"memcached-0\" (UID: \"598c1476-b9fa-48c1-a346-80e23448d00f\") " pod="openstack/memcached-0"
Sep 30 07:48:48 crc kubenswrapper[4760]: I0930 07:48:48.066836 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/598c1476-b9fa-48c1-a346-80e23448d00f-config-data\") pod \"memcached-0\" (UID: \"598c1476-b9fa-48c1-a346-80e23448d00f\") " pod="openstack/memcached-0"
Sep 30 07:48:48 crc kubenswrapper[4760]: I0930 07:48:48.066909 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/598c1476-b9fa-48c1-a346-80e23448d00f-kolla-config\") pod \"memcached-0\" (UID: \"598c1476-b9fa-48c1-a346-80e23448d00f\") " pod="openstack/memcached-0"
Sep 30 07:48:48 crc kubenswrapper[4760]: I0930 07:48:48.066942 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/598c1476-b9fa-48c1-a346-80e23448d00f-memcached-tls-certs\") pod \"memcached-0\" (UID: \"598c1476-b9fa-48c1-a346-80e23448d00f\") " pod="openstack/memcached-0"
Sep 30 07:48:48 crc kubenswrapper[4760]: I0930 07:48:48.066969 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/598c1476-b9fa-48c1-a346-80e23448d00f-combined-ca-bundle\") pod \"memcached-0\" (UID: \"598c1476-b9fa-48c1-a346-80e23448d00f\") " pod="openstack/memcached-0"
Sep 30 07:48:48 crc kubenswrapper[4760]: I0930 07:48:48.066987 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8d74f\" (UniqueName: \"kubernetes.io/projected/598c1476-b9fa-48c1-a346-80e23448d00f-kube-api-access-8d74f\") pod \"memcached-0\" (UID: \"598c1476-b9fa-48c1-a346-80e23448d00f\") " pod="openstack/memcached-0"
Sep 30 07:48:48 crc kubenswrapper[4760]: I0930 07:48:48.067900 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/598c1476-b9fa-48c1-a346-80e23448d00f-kolla-config\") pod \"memcached-0\" (UID: \"598c1476-b9fa-48c1-a346-80e23448d00f\") " pod="openstack/memcached-0"
Sep 30 07:48:48 crc kubenswrapper[4760]: I0930 07:48:48.068428 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/598c1476-b9fa-48c1-a346-80e23448d00f-config-data\") pod \"memcached-0\" (UID: \"598c1476-b9fa-48c1-a346-80e23448d00f\") " pod="openstack/memcached-0"
Sep 30 07:48:48 crc kubenswrapper[4760]: I0930 07:48:48.087041 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/598c1476-b9fa-48c1-a346-80e23448d00f-combined-ca-bundle\") pod \"memcached-0\" (UID: \"598c1476-b9fa-48c1-a346-80e23448d00f\") " pod="openstack/memcached-0"
Sep 30 07:48:48 crc kubenswrapper[4760]: I0930 07:48:48.087127 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/598c1476-b9fa-48c1-a346-80e23448d00f-memcached-tls-certs\") pod \"memcached-0\" (UID: \"598c1476-b9fa-48c1-a346-80e23448d00f\") " pod="openstack/memcached-0"
Sep 30 07:48:48 crc kubenswrapper[4760]: I0930 07:48:48.089835 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8d74f\" (UniqueName: \"kubernetes.io/projected/598c1476-b9fa-48c1-a346-80e23448d00f-kube-api-access-8d74f\") pod \"memcached-0\" (UID: \"598c1476-b9fa-48c1-a346-80e23448d00f\") " pod="openstack/memcached-0"
Sep 30 07:48:48 crc kubenswrapper[4760]: I0930 07:48:48.215961 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0"
Sep 30 07:48:49 crc kubenswrapper[4760]: I0930 07:48:49.113394 4760 patch_prober.go:28] interesting pod/machine-config-daemon-f2lrk container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Sep 30 07:48:49 crc kubenswrapper[4760]: I0930 07:48:49.113726 4760 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-f2lrk" podUID="7a9c8270-6964-4886-87d0-227b05b76da4" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Sep 30 07:48:49 crc kubenswrapper[4760]: I0930 07:48:49.506350 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"]
Sep 30 07:48:49 crc kubenswrapper[4760]: I0930 07:48:49.507289 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0"
Sep 30 07:48:49 crc kubenswrapper[4760]: I0930 07:48:49.509356 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-ceilometer-dockercfg-72cnt"
Sep 30 07:48:49 crc kubenswrapper[4760]: I0930 07:48:49.525707 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"]
Sep 30 07:48:49 crc kubenswrapper[4760]: I0930 07:48:49.594169 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mxnfd\" (UniqueName: \"kubernetes.io/projected/23bad0af-c21e-4ba1-bc39-39c48f0fea56-kube-api-access-mxnfd\") pod \"kube-state-metrics-0\" (UID: \"23bad0af-c21e-4ba1-bc39-39c48f0fea56\") " pod="openstack/kube-state-metrics-0"
Sep 30 07:48:49 crc kubenswrapper[4760]: I0930 07:48:49.695365 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mxnfd\" (UniqueName: \"kubernetes.io/projected/23bad0af-c21e-4ba1-bc39-39c48f0fea56-kube-api-access-mxnfd\") pod \"kube-state-metrics-0\" (UID: \"23bad0af-c21e-4ba1-bc39-39c48f0fea56\") " pod="openstack/kube-state-metrics-0"
Sep 30 07:48:49 crc kubenswrapper[4760]: I0930 07:48:49.696406 4760 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-pkcmt"
Sep 30 07:48:49 crc kubenswrapper[4760]: I0930 07:48:49.758826 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mxnfd\" (UniqueName: \"kubernetes.io/projected/23bad0af-c21e-4ba1-bc39-39c48f0fea56-kube-api-access-mxnfd\") pod \"kube-state-metrics-0\" (UID: \"23bad0af-c21e-4ba1-bc39-39c48f0fea56\") " pod="openstack/kube-state-metrics-0"
Sep 30 07:48:49 crc kubenswrapper[4760]: I0930 07:48:49.799107 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/41ed6688-dfb5-4dfa-8972-34f5fea18550-utilities\") pod \"41ed6688-dfb5-4dfa-8972-34f5fea18550\" (UID: \"41ed6688-dfb5-4dfa-8972-34f5fea18550\") "
Sep 30 07:48:49 crc kubenswrapper[4760]: I0930 07:48:49.799159 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/41ed6688-dfb5-4dfa-8972-34f5fea18550-catalog-content\") pod \"41ed6688-dfb5-4dfa-8972-34f5fea18550\" (UID: \"41ed6688-dfb5-4dfa-8972-34f5fea18550\") "
Sep 30 07:48:49 crc kubenswrapper[4760]: I0930 07:48:49.799272 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pjnx5\" (UniqueName: \"kubernetes.io/projected/41ed6688-dfb5-4dfa-8972-34f5fea18550-kube-api-access-pjnx5\") pod \"41ed6688-dfb5-4dfa-8972-34f5fea18550\" (UID: \"41ed6688-dfb5-4dfa-8972-34f5fea18550\") "
Sep 30 07:48:49 crc kubenswrapper[4760]: I0930 07:48:49.800103 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/41ed6688-dfb5-4dfa-8972-34f5fea18550-utilities" (OuterVolumeSpecName: "utilities") pod "41ed6688-dfb5-4dfa-8972-34f5fea18550" (UID: "41ed6688-dfb5-4dfa-8972-34f5fea18550"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Sep 30 07:48:49 crc kubenswrapper[4760]: I0930 07:48:49.800414 4760 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/41ed6688-dfb5-4dfa-8972-34f5fea18550-utilities\") on node \"crc\" DevicePath \"\""
Sep 30 07:48:49 crc kubenswrapper[4760]: I0930 07:48:49.811509 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/41ed6688-dfb5-4dfa-8972-34f5fea18550-kube-api-access-pjnx5" (OuterVolumeSpecName: "kube-api-access-pjnx5") pod "41ed6688-dfb5-4dfa-8972-34f5fea18550" (UID: "41ed6688-dfb5-4dfa-8972-34f5fea18550"). InnerVolumeSpecName "kube-api-access-pjnx5". PluginName "kubernetes.io/projected", VolumeGidValue ""
Sep 30 07:48:49 crc kubenswrapper[4760]: I0930 07:48:49.826597 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0"
Sep 30 07:48:49 crc kubenswrapper[4760]: I0930 07:48:49.904468 4760 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pjnx5\" (UniqueName: \"kubernetes.io/projected/41ed6688-dfb5-4dfa-8972-34f5fea18550-kube-api-access-pjnx5\") on node \"crc\" DevicePath \"\""
Sep 30 07:48:49 crc kubenswrapper[4760]: I0930 07:48:49.979902 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/41ed6688-dfb5-4dfa-8972-34f5fea18550-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "41ed6688-dfb5-4dfa-8972-34f5fea18550" (UID: "41ed6688-dfb5-4dfa-8972-34f5fea18550"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Sep 30 07:48:50 crc kubenswrapper[4760]: I0930 07:48:50.006126 4760 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/41ed6688-dfb5-4dfa-8972-34f5fea18550-catalog-content\") on node \"crc\" DevicePath \"\""
Sep 30 07:48:50 crc kubenswrapper[4760]: I0930 07:48:50.321393 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-ccxrw"]
Sep 30 07:48:50 crc kubenswrapper[4760]: I0930 07:48:50.481779 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-pkcmt" event={"ID":"41ed6688-dfb5-4dfa-8972-34f5fea18550","Type":"ContainerDied","Data":"929544a983e977c4f5a50699cb9cd663d90a8009a9cf246cbbe38abfea988b29"}
Sep 30 07:48:50 crc kubenswrapper[4760]: I0930 07:48:50.481821 4760 scope.go:117] "RemoveContainer" containerID="ea0e4aa04e4dadd59b96f36f1213b57d576ec0cea244fd16ac3e98f90b577291"
Sep 30 07:48:50 crc kubenswrapper[4760]: I0930 07:48:50.481943 4760 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-pkcmt"
Sep 30 07:48:50 crc kubenswrapper[4760]: I0930 07:48:50.513823 4760 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-pkcmt"]
Sep 30 07:48:50 crc kubenswrapper[4760]: I0930 07:48:50.519139 4760 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-pkcmt"]
Sep 30 07:48:50 crc kubenswrapper[4760]: I0930 07:48:50.889089 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/prometheus-metric-storage-0"]
Sep 30 07:48:50 crc kubenswrapper[4760]: E0930 07:48:50.889403 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="41ed6688-dfb5-4dfa-8972-34f5fea18550" containerName="extract-utilities"
Sep 30 07:48:50 crc kubenswrapper[4760]: I0930 07:48:50.889418 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="41ed6688-dfb5-4dfa-8972-34f5fea18550" containerName="extract-utilities"
Sep 30 07:48:50 crc kubenswrapper[4760]: E0930 07:48:50.889441 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="41ed6688-dfb5-4dfa-8972-34f5fea18550" containerName="registry-server"
Sep 30 07:48:50 crc kubenswrapper[4760]: I0930 07:48:50.889449 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="41ed6688-dfb5-4dfa-8972-34f5fea18550" containerName="registry-server"
Sep 30 07:48:50 crc kubenswrapper[4760]: E0930 07:48:50.889461 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="41ed6688-dfb5-4dfa-8972-34f5fea18550" containerName="extract-content"
Sep 30 07:48:50 crc kubenswrapper[4760]: I0930 07:48:50.889466 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="41ed6688-dfb5-4dfa-8972-34f5fea18550" containerName="extract-content"
Sep 30 07:48:50 crc kubenswrapper[4760]: I0930 07:48:50.889617 4760 memory_manager.go:354] "RemoveStaleState removing state" podUID="41ed6688-dfb5-4dfa-8972-34f5fea18550" containerName="registry-server"
Sep 30 07:48:50 crc kubenswrapper[4760]: I0930 07:48:50.890853 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/prometheus-metric-storage-0"
Sep 30 07:48:50 crc kubenswrapper[4760]: I0930 07:48:50.893377 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage"
Sep 30 07:48:50 crc kubenswrapper[4760]: I0930 07:48:50.893532 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-web-config"
Sep 30 07:48:50 crc kubenswrapper[4760]: I0930 07:48:50.893671 4760 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-0"
Sep 30 07:48:50 crc kubenswrapper[4760]: I0930 07:48:50.894802 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-thanos-prometheus-http-client-file"
Sep 30 07:48:50 crc kubenswrapper[4760]: I0930 07:48:50.897820 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"metric-storage-prometheus-dockercfg-pr2wc"
Sep 30 07:48:50 crc kubenswrapper[4760]: I0930 07:48:50.908614 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-tls-assets-0"
Sep 30 07:48:50 crc kubenswrapper[4760]: I0930 07:48:50.909030 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/prometheus-metric-storage-0"]
Sep 30 07:48:51 crc kubenswrapper[4760]: I0930 07:48:51.021109 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/007888b6-d5c6-410a-955a-ed78adf759bd-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"007888b6-d5c6-410a-955a-ed78adf759bd\") " pod="openstack/prometheus-metric-storage-0"
Sep 30 07:48:51 crc kubenswrapper[4760]: I0930 07:48:51.021152 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/007888b6-d5c6-410a-955a-ed78adf759bd-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"007888b6-d5c6-410a-955a-ed78adf759bd\") " pod="openstack/prometheus-metric-storage-0"
Sep 30 07:48:51 crc kubenswrapper[4760]: I0930 07:48:51.021172 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/007888b6-d5c6-410a-955a-ed78adf759bd-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"007888b6-d5c6-410a-955a-ed78adf759bd\") " pod="openstack/prometheus-metric-storage-0"
Sep 30 07:48:51 crc kubenswrapper[4760]: I0930 07:48:51.021240 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-09238572-8d9b-4684-8ed8-661e43a35d9a\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-09238572-8d9b-4684-8ed8-661e43a35d9a\") pod \"prometheus-metric-storage-0\" (UID: \"007888b6-d5c6-410a-955a-ed78adf759bd\") " pod="openstack/prometheus-metric-storage-0"
Sep 30 07:48:51 crc kubenswrapper[4760]: I0930 07:48:51.021259 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mzlmc\" (UniqueName: \"kubernetes.io/projected/007888b6-d5c6-410a-955a-ed78adf759bd-kube-api-access-mzlmc\") pod \"prometheus-metric-storage-0\" (UID: \"007888b6-d5c6-410a-955a-ed78adf759bd\") " pod="openstack/prometheus-metric-storage-0"
Sep 30 07:48:51 crc kubenswrapper[4760]: I0930 07:48:51.021280 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/007888b6-d5c6-410a-955a-ed78adf759bd-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"007888b6-d5c6-410a-955a-ed78adf759bd\") " pod="openstack/prometheus-metric-storage-0"
Sep 30 07:48:51 crc kubenswrapper[4760]: I0930 07:48:51.021342 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/007888b6-d5c6-410a-955a-ed78adf759bd-config\") pod \"prometheus-metric-storage-0\" (UID: \"007888b6-d5c6-410a-955a-ed78adf759bd\") " pod="openstack/prometheus-metric-storage-0"
Sep 30 07:48:51 crc kubenswrapper[4760]: I0930 07:48:51.021366 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/007888b6-d5c6-410a-955a-ed78adf759bd-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"007888b6-d5c6-410a-955a-ed78adf759bd\") " pod="openstack/prometheus-metric-storage-0"
Sep 30 07:48:51 crc kubenswrapper[4760]: I0930 07:48:51.077423 4760 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="41ed6688-dfb5-4dfa-8972-34f5fea18550" path="/var/lib/kubelet/pods/41ed6688-dfb5-4dfa-8972-34f5fea18550/volumes"
Sep 30 07:48:51 crc kubenswrapper[4760]: I0930 07:48:51.124326 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/007888b6-d5c6-410a-955a-ed78adf759bd-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"007888b6-d5c6-410a-955a-ed78adf759bd\") " pod="openstack/prometheus-metric-storage-0"
Sep 30 07:48:51 crc kubenswrapper[4760]: I0930 07:48:51.124440 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/007888b6-d5c6-410a-955a-ed78adf759bd-config\") pod \"prometheus-metric-storage-0\" (UID: \"007888b6-d5c6-410a-955a-ed78adf759bd\") " pod="openstack/prometheus-metric-storage-0"
Sep 30 07:48:51 crc kubenswrapper[4760]: I0930 07:48:51.124487 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/007888b6-d5c6-410a-955a-ed78adf759bd-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"007888b6-d5c6-410a-955a-ed78adf759bd\") " pod="openstack/prometheus-metric-storage-0"
Sep 30 07:48:51 crc kubenswrapper[4760]: I0930 07:48:51.124526 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/007888b6-d5c6-410a-955a-ed78adf759bd-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"007888b6-d5c6-410a-955a-ed78adf759bd\") " pod="openstack/prometheus-metric-storage-0"
Sep 30 07:48:51 crc kubenswrapper[4760]: I0930 07:48:51.124550 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/007888b6-d5c6-410a-955a-ed78adf759bd-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"007888b6-d5c6-410a-955a-ed78adf759bd\") " pod="openstack/prometheus-metric-storage-0"
Sep 30 07:48:51 crc kubenswrapper[4760]: I0930 07:48:51.124568 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/007888b6-d5c6-410a-955a-ed78adf759bd-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"007888b6-d5c6-410a-955a-ed78adf759bd\") " pod="openstack/prometheus-metric-storage-0"
Sep 30 07:48:51 crc kubenswrapper[4760]: I0930 07:48:51.124649 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-09238572-8d9b-4684-8ed8-661e43a35d9a\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-09238572-8d9b-4684-8ed8-661e43a35d9a\") pod \"prometheus-metric-storage-0\" (UID: \"007888b6-d5c6-410a-955a-ed78adf759bd\") " pod="openstack/prometheus-metric-storage-0"
Sep 30 07:48:51 crc kubenswrapper[4760]: I0930 07:48:51.124677 4760 reconciler_common.go:218] "operationExecutor.MountVolume
started for volume \"kube-api-access-mzlmc\" (UniqueName: \"kubernetes.io/projected/007888b6-d5c6-410a-955a-ed78adf759bd-kube-api-access-mzlmc\") pod \"prometheus-metric-storage-0\" (UID: \"007888b6-d5c6-410a-955a-ed78adf759bd\") " pod="openstack/prometheus-metric-storage-0" Sep 30 07:48:51 crc kubenswrapper[4760]: I0930 07:48:51.126187 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/007888b6-d5c6-410a-955a-ed78adf759bd-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"007888b6-d5c6-410a-955a-ed78adf759bd\") " pod="openstack/prometheus-metric-storage-0" Sep 30 07:48:51 crc kubenswrapper[4760]: I0930 07:48:51.128587 4760 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Sep 30 07:48:51 crc kubenswrapper[4760]: I0930 07:48:51.128794 4760 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-09238572-8d9b-4684-8ed8-661e43a35d9a\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-09238572-8d9b-4684-8ed8-661e43a35d9a\") pod \"prometheus-metric-storage-0\" (UID: \"007888b6-d5c6-410a-955a-ed78adf759bd\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/15f1bcf6ef2a65343cb29c53094f20376472cc1b8d5a343d6a63d664da0c3f7a/globalmount\"" pod="openstack/prometheus-metric-storage-0" Sep 30 07:48:51 crc kubenswrapper[4760]: I0930 07:48:51.129425 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/007888b6-d5c6-410a-955a-ed78adf759bd-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"007888b6-d5c6-410a-955a-ed78adf759bd\") " pod="openstack/prometheus-metric-storage-0" Sep 30 07:48:51 crc kubenswrapper[4760]: I0930 07:48:51.130019 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"config\" (UniqueName: \"kubernetes.io/secret/007888b6-d5c6-410a-955a-ed78adf759bd-config\") pod \"prometheus-metric-storage-0\" (UID: \"007888b6-d5c6-410a-955a-ed78adf759bd\") " pod="openstack/prometheus-metric-storage-0" Sep 30 07:48:51 crc kubenswrapper[4760]: I0930 07:48:51.130075 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/007888b6-d5c6-410a-955a-ed78adf759bd-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"007888b6-d5c6-410a-955a-ed78adf759bd\") " pod="openstack/prometheus-metric-storage-0" Sep 30 07:48:51 crc kubenswrapper[4760]: I0930 07:48:51.130229 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/007888b6-d5c6-410a-955a-ed78adf759bd-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"007888b6-d5c6-410a-955a-ed78adf759bd\") " pod="openstack/prometheus-metric-storage-0" Sep 30 07:48:51 crc kubenswrapper[4760]: I0930 07:48:51.131982 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/007888b6-d5c6-410a-955a-ed78adf759bd-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"007888b6-d5c6-410a-955a-ed78adf759bd\") " pod="openstack/prometheus-metric-storage-0" Sep 30 07:48:51 crc kubenswrapper[4760]: I0930 07:48:51.145016 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mzlmc\" (UniqueName: \"kubernetes.io/projected/007888b6-d5c6-410a-955a-ed78adf759bd-kube-api-access-mzlmc\") pod \"prometheus-metric-storage-0\" (UID: \"007888b6-d5c6-410a-955a-ed78adf759bd\") " pod="openstack/prometheus-metric-storage-0" Sep 30 07:48:51 crc kubenswrapper[4760]: I0930 07:48:51.176793 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-09238572-8d9b-4684-8ed8-661e43a35d9a\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-09238572-8d9b-4684-8ed8-661e43a35d9a\") pod \"prometheus-metric-storage-0\" (UID: \"007888b6-d5c6-410a-955a-ed78adf759bd\") " pod="openstack/prometheus-metric-storage-0" Sep 30 07:48:51 crc kubenswrapper[4760]: I0930 07:48:51.203826 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/prometheus-metric-storage-0" Sep 30 07:48:53 crc kubenswrapper[4760]: I0930 07:48:53.839352 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-sb-0"] Sep 30 07:48:53 crc kubenswrapper[4760]: I0930 07:48:53.840952 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-0" Sep 30 07:48:53 crc kubenswrapper[4760]: I0930 07:48:53.843269 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovn-metrics" Sep 30 07:48:53 crc kubenswrapper[4760]: I0930 07:48:53.843378 4760 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-scripts" Sep 30 07:48:53 crc kubenswrapper[4760]: I0930 07:48:53.843402 4760 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-config" Sep 30 07:48:53 crc kubenswrapper[4760]: I0930 07:48:53.844750 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-sb-dockercfg-5599d" Sep 30 07:48:53 crc kubenswrapper[4760]: I0930 07:48:53.845912 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-sb-ovndbs" Sep 30 07:48:53 crc kubenswrapper[4760]: I0930 07:48:53.854134 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Sep 30 07:48:53 crc kubenswrapper[4760]: I0930 07:48:53.874412 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/d2f7adda-d2ed-4c87-8e63-64e344155305-config\") pod \"ovsdbserver-sb-0\" (UID: \"d2f7adda-d2ed-4c87-8e63-64e344155305\") " pod="openstack/ovsdbserver-sb-0" Sep 30 07:48:53 crc kubenswrapper[4760]: I0930 07:48:53.874455 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f72dr\" (UniqueName: \"kubernetes.io/projected/d2f7adda-d2ed-4c87-8e63-64e344155305-kube-api-access-f72dr\") pod \"ovsdbserver-sb-0\" (UID: \"d2f7adda-d2ed-4c87-8e63-64e344155305\") " pod="openstack/ovsdbserver-sb-0" Sep 30 07:48:53 crc kubenswrapper[4760]: I0930 07:48:53.874488 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"ovsdbserver-sb-0\" (UID: \"d2f7adda-d2ed-4c87-8e63-64e344155305\") " pod="openstack/ovsdbserver-sb-0" Sep 30 07:48:53 crc kubenswrapper[4760]: I0930 07:48:53.874508 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d2f7adda-d2ed-4c87-8e63-64e344155305-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"d2f7adda-d2ed-4c87-8e63-64e344155305\") " pod="openstack/ovsdbserver-sb-0" Sep 30 07:48:53 crc kubenswrapper[4760]: I0930 07:48:53.874533 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/d2f7adda-d2ed-4c87-8e63-64e344155305-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"d2f7adda-d2ed-4c87-8e63-64e344155305\") " pod="openstack/ovsdbserver-sb-0" Sep 30 07:48:53 crc kubenswrapper[4760]: I0930 07:48:53.874574 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/d2f7adda-d2ed-4c87-8e63-64e344155305-ovsdb-rundir\") pod 
\"ovsdbserver-sb-0\" (UID: \"d2f7adda-d2ed-4c87-8e63-64e344155305\") " pod="openstack/ovsdbserver-sb-0" Sep 30 07:48:53 crc kubenswrapper[4760]: I0930 07:48:53.874589 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d2f7adda-d2ed-4c87-8e63-64e344155305-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"d2f7adda-d2ed-4c87-8e63-64e344155305\") " pod="openstack/ovsdbserver-sb-0" Sep 30 07:48:53 crc kubenswrapper[4760]: I0930 07:48:53.874606 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/d2f7adda-d2ed-4c87-8e63-64e344155305-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"d2f7adda-d2ed-4c87-8e63-64e344155305\") " pod="openstack/ovsdbserver-sb-0" Sep 30 07:48:53 crc kubenswrapper[4760]: I0930 07:48:53.975257 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d2f7adda-d2ed-4c87-8e63-64e344155305-config\") pod \"ovsdbserver-sb-0\" (UID: \"d2f7adda-d2ed-4c87-8e63-64e344155305\") " pod="openstack/ovsdbserver-sb-0" Sep 30 07:48:53 crc kubenswrapper[4760]: I0930 07:48:53.975325 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f72dr\" (UniqueName: \"kubernetes.io/projected/d2f7adda-d2ed-4c87-8e63-64e344155305-kube-api-access-f72dr\") pod \"ovsdbserver-sb-0\" (UID: \"d2f7adda-d2ed-4c87-8e63-64e344155305\") " pod="openstack/ovsdbserver-sb-0" Sep 30 07:48:53 crc kubenswrapper[4760]: I0930 07:48:53.975369 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"ovsdbserver-sb-0\" (UID: \"d2f7adda-d2ed-4c87-8e63-64e344155305\") " pod="openstack/ovsdbserver-sb-0" Sep 30 07:48:53 crc 
kubenswrapper[4760]: I0930 07:48:53.975394 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d2f7adda-d2ed-4c87-8e63-64e344155305-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"d2f7adda-d2ed-4c87-8e63-64e344155305\") " pod="openstack/ovsdbserver-sb-0" Sep 30 07:48:53 crc kubenswrapper[4760]: I0930 07:48:53.975428 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/d2f7adda-d2ed-4c87-8e63-64e344155305-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"d2f7adda-d2ed-4c87-8e63-64e344155305\") " pod="openstack/ovsdbserver-sb-0" Sep 30 07:48:53 crc kubenswrapper[4760]: I0930 07:48:53.975480 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/d2f7adda-d2ed-4c87-8e63-64e344155305-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"d2f7adda-d2ed-4c87-8e63-64e344155305\") " pod="openstack/ovsdbserver-sb-0" Sep 30 07:48:53 crc kubenswrapper[4760]: I0930 07:48:53.975502 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d2f7adda-d2ed-4c87-8e63-64e344155305-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"d2f7adda-d2ed-4c87-8e63-64e344155305\") " pod="openstack/ovsdbserver-sb-0" Sep 30 07:48:53 crc kubenswrapper[4760]: I0930 07:48:53.975521 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/d2f7adda-d2ed-4c87-8e63-64e344155305-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"d2f7adda-d2ed-4c87-8e63-64e344155305\") " pod="openstack/ovsdbserver-sb-0" Sep 30 07:48:53 crc kubenswrapper[4760]: I0930 07:48:53.975851 4760 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage01-crc\" 
(UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"ovsdbserver-sb-0\" (UID: \"d2f7adda-d2ed-4c87-8e63-64e344155305\") device mount path \"/mnt/openstack/pv01\"" pod="openstack/ovsdbserver-sb-0" Sep 30 07:48:53 crc kubenswrapper[4760]: I0930 07:48:53.976186 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/d2f7adda-d2ed-4c87-8e63-64e344155305-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"d2f7adda-d2ed-4c87-8e63-64e344155305\") " pod="openstack/ovsdbserver-sb-0" Sep 30 07:48:53 crc kubenswrapper[4760]: I0930 07:48:53.977027 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d2f7adda-d2ed-4c87-8e63-64e344155305-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"d2f7adda-d2ed-4c87-8e63-64e344155305\") " pod="openstack/ovsdbserver-sb-0" Sep 30 07:48:53 crc kubenswrapper[4760]: I0930 07:48:53.981100 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/d2f7adda-d2ed-4c87-8e63-64e344155305-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"d2f7adda-d2ed-4c87-8e63-64e344155305\") " pod="openstack/ovsdbserver-sb-0" Sep 30 07:48:53 crc kubenswrapper[4760]: I0930 07:48:53.981827 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d2f7adda-d2ed-4c87-8e63-64e344155305-config\") pod \"ovsdbserver-sb-0\" (UID: \"d2f7adda-d2ed-4c87-8e63-64e344155305\") " pod="openstack/ovsdbserver-sb-0" Sep 30 07:48:53 crc kubenswrapper[4760]: I0930 07:48:53.983227 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d2f7adda-d2ed-4c87-8e63-64e344155305-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"d2f7adda-d2ed-4c87-8e63-64e344155305\") " pod="openstack/ovsdbserver-sb-0" Sep 30 
07:48:53 crc kubenswrapper[4760]: I0930 07:48:53.983619 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/d2f7adda-d2ed-4c87-8e63-64e344155305-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"d2f7adda-d2ed-4c87-8e63-64e344155305\") " pod="openstack/ovsdbserver-sb-0" Sep 30 07:48:53 crc kubenswrapper[4760]: I0930 07:48:53.998720 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"ovsdbserver-sb-0\" (UID: \"d2f7adda-d2ed-4c87-8e63-64e344155305\") " pod="openstack/ovsdbserver-sb-0" Sep 30 07:48:54 crc kubenswrapper[4760]: I0930 07:48:54.010220 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f72dr\" (UniqueName: \"kubernetes.io/projected/d2f7adda-d2ed-4c87-8e63-64e344155305-kube-api-access-f72dr\") pod \"ovsdbserver-sb-0\" (UID: \"d2f7adda-d2ed-4c87-8e63-64e344155305\") " pod="openstack/ovsdbserver-sb-0" Sep 30 07:48:54 crc kubenswrapper[4760]: I0930 07:48:54.161069 4760 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-sb-0" Sep 30 07:48:54 crc kubenswrapper[4760]: I0930 07:48:54.291557 4760 scope.go:117] "RemoveContainer" containerID="91f5380988f8a308a41a1d65a218814af87f051b89114fb3c83b079187b8c6cf" Sep 30 07:48:54 crc kubenswrapper[4760]: I0930 07:48:54.522597 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666b6646f7-ccxrw" event={"ID":"4f6db3f0-59be-48d1-a2a7-8302680b2f4c","Type":"ContainerStarted","Data":"8834c44b95935ffd49c473721082dbfb8c1154c6357bfc08919770e948bcfb03"} Sep 30 07:48:54 crc kubenswrapper[4760]: I0930 07:48:54.663264 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-9fxn6"] Sep 30 07:48:54 crc kubenswrapper[4760]: I0930 07:48:54.686573 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-56wgh"] Sep 30 07:48:54 crc kubenswrapper[4760]: I0930 07:48:54.687506 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-56wgh" Sep 30 07:48:54 crc kubenswrapper[4760]: I0930 07:48:54.693700 4760 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-scripts" Sep 30 07:48:54 crc kubenswrapper[4760]: I0930 07:48:54.693703 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovncontroller-ovndbs" Sep 30 07:48:54 crc kubenswrapper[4760]: I0930 07:48:54.701113 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-56wgh"] Sep 30 07:48:54 crc kubenswrapper[4760]: I0930 07:48:54.703776 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncontroller-ovncontroller-dockercfg-5gm5c" Sep 30 07:48:54 crc kubenswrapper[4760]: I0930 07:48:54.734916 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-ovs-bwrv9"] Sep 30 07:48:54 crc kubenswrapper[4760]: I0930 07:48:54.744128 4760 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-ovs-bwrv9" Sep 30 07:48:54 crc kubenswrapper[4760]: I0930 07:48:54.767008 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-bwrv9"] Sep 30 07:48:54 crc kubenswrapper[4760]: I0930 07:48:54.794906 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/159ee554-1b62-4fe3-95c6-e64ab0c58b2d-ovn-controller-tls-certs\") pod \"ovn-controller-56wgh\" (UID: \"159ee554-1b62-4fe3-95c6-e64ab0c58b2d\") " pod="openstack/ovn-controller-56wgh" Sep 30 07:48:54 crc kubenswrapper[4760]: I0930 07:48:54.794955 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/159ee554-1b62-4fe3-95c6-e64ab0c58b2d-var-run-ovn\") pod \"ovn-controller-56wgh\" (UID: \"159ee554-1b62-4fe3-95c6-e64ab0c58b2d\") " pod="openstack/ovn-controller-56wgh" Sep 30 07:48:54 crc kubenswrapper[4760]: I0930 07:48:54.794993 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/8735bd7c-231f-47df-a404-b8cab84f0d7b-etc-ovs\") pod \"ovn-controller-ovs-bwrv9\" (UID: \"8735bd7c-231f-47df-a404-b8cab84f0d7b\") " pod="openstack/ovn-controller-ovs-bwrv9" Sep 30 07:48:54 crc kubenswrapper[4760]: I0930 07:48:54.795095 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-47vmt\" (UniqueName: \"kubernetes.io/projected/159ee554-1b62-4fe3-95c6-e64ab0c58b2d-kube-api-access-47vmt\") pod \"ovn-controller-56wgh\" (UID: \"159ee554-1b62-4fe3-95c6-e64ab0c58b2d\") " pod="openstack/ovn-controller-56wgh" Sep 30 07:48:54 crc kubenswrapper[4760]: I0930 07:48:54.795198 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib\" 
(UniqueName: \"kubernetes.io/host-path/8735bd7c-231f-47df-a404-b8cab84f0d7b-var-lib\") pod \"ovn-controller-ovs-bwrv9\" (UID: \"8735bd7c-231f-47df-a404-b8cab84f0d7b\") " pod="openstack/ovn-controller-ovs-bwrv9" Sep 30 07:48:54 crc kubenswrapper[4760]: I0930 07:48:54.795280 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8735bd7c-231f-47df-a404-b8cab84f0d7b-scripts\") pod \"ovn-controller-ovs-bwrv9\" (UID: \"8735bd7c-231f-47df-a404-b8cab84f0d7b\") " pod="openstack/ovn-controller-ovs-bwrv9" Sep 30 07:48:54 crc kubenswrapper[4760]: I0930 07:48:54.795402 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/8735bd7c-231f-47df-a404-b8cab84f0d7b-var-run\") pod \"ovn-controller-ovs-bwrv9\" (UID: \"8735bd7c-231f-47df-a404-b8cab84f0d7b\") " pod="openstack/ovn-controller-ovs-bwrv9" Sep 30 07:48:54 crc kubenswrapper[4760]: I0930 07:48:54.795439 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/159ee554-1b62-4fe3-95c6-e64ab0c58b2d-scripts\") pod \"ovn-controller-56wgh\" (UID: \"159ee554-1b62-4fe3-95c6-e64ab0c58b2d\") " pod="openstack/ovn-controller-56wgh" Sep 30 07:48:54 crc kubenswrapper[4760]: I0930 07:48:54.795493 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/8735bd7c-231f-47df-a404-b8cab84f0d7b-var-log\") pod \"ovn-controller-ovs-bwrv9\" (UID: \"8735bd7c-231f-47df-a404-b8cab84f0d7b\") " pod="openstack/ovn-controller-ovs-bwrv9" Sep 30 07:48:54 crc kubenswrapper[4760]: I0930 07:48:54.795530 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/159ee554-1b62-4fe3-95c6-e64ab0c58b2d-combined-ca-bundle\") pod \"ovn-controller-56wgh\" (UID: \"159ee554-1b62-4fe3-95c6-e64ab0c58b2d\") " pod="openstack/ovn-controller-56wgh" Sep 30 07:48:54 crc kubenswrapper[4760]: I0930 07:48:54.795609 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/159ee554-1b62-4fe3-95c6-e64ab0c58b2d-var-run\") pod \"ovn-controller-56wgh\" (UID: \"159ee554-1b62-4fe3-95c6-e64ab0c58b2d\") " pod="openstack/ovn-controller-56wgh" Sep 30 07:48:54 crc kubenswrapper[4760]: I0930 07:48:54.795680 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n4xx4\" (UniqueName: \"kubernetes.io/projected/8735bd7c-231f-47df-a404-b8cab84f0d7b-kube-api-access-n4xx4\") pod \"ovn-controller-ovs-bwrv9\" (UID: \"8735bd7c-231f-47df-a404-b8cab84f0d7b\") " pod="openstack/ovn-controller-ovs-bwrv9" Sep 30 07:48:54 crc kubenswrapper[4760]: I0930 07:48:54.795730 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/159ee554-1b62-4fe3-95c6-e64ab0c58b2d-var-log-ovn\") pod \"ovn-controller-56wgh\" (UID: \"159ee554-1b62-4fe3-95c6-e64ab0c58b2d\") " pod="openstack/ovn-controller-56wgh" Sep 30 07:48:54 crc kubenswrapper[4760]: I0930 07:48:54.897759 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/8735bd7c-231f-47df-a404-b8cab84f0d7b-var-run\") pod \"ovn-controller-ovs-bwrv9\" (UID: \"8735bd7c-231f-47df-a404-b8cab84f0d7b\") " pod="openstack/ovn-controller-ovs-bwrv9" Sep 30 07:48:54 crc kubenswrapper[4760]: I0930 07:48:54.897830 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/159ee554-1b62-4fe3-95c6-e64ab0c58b2d-scripts\") pod 
\"ovn-controller-56wgh\" (UID: \"159ee554-1b62-4fe3-95c6-e64ab0c58b2d\") " pod="openstack/ovn-controller-56wgh" Sep 30 07:48:54 crc kubenswrapper[4760]: I0930 07:48:54.897894 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/8735bd7c-231f-47df-a404-b8cab84f0d7b-var-log\") pod \"ovn-controller-ovs-bwrv9\" (UID: \"8735bd7c-231f-47df-a404-b8cab84f0d7b\") " pod="openstack/ovn-controller-ovs-bwrv9" Sep 30 07:48:54 crc kubenswrapper[4760]: I0930 07:48:54.897927 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/159ee554-1b62-4fe3-95c6-e64ab0c58b2d-combined-ca-bundle\") pod \"ovn-controller-56wgh\" (UID: \"159ee554-1b62-4fe3-95c6-e64ab0c58b2d\") " pod="openstack/ovn-controller-56wgh" Sep 30 07:48:54 crc kubenswrapper[4760]: I0930 07:48:54.897978 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/159ee554-1b62-4fe3-95c6-e64ab0c58b2d-var-run\") pod \"ovn-controller-56wgh\" (UID: \"159ee554-1b62-4fe3-95c6-e64ab0c58b2d\") " pod="openstack/ovn-controller-56wgh" Sep 30 07:48:54 crc kubenswrapper[4760]: I0930 07:48:54.898027 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n4xx4\" (UniqueName: \"kubernetes.io/projected/8735bd7c-231f-47df-a404-b8cab84f0d7b-kube-api-access-n4xx4\") pod \"ovn-controller-ovs-bwrv9\" (UID: \"8735bd7c-231f-47df-a404-b8cab84f0d7b\") " pod="openstack/ovn-controller-ovs-bwrv9" Sep 30 07:48:54 crc kubenswrapper[4760]: I0930 07:48:54.898068 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/159ee554-1b62-4fe3-95c6-e64ab0c58b2d-var-log-ovn\") pod \"ovn-controller-56wgh\" (UID: \"159ee554-1b62-4fe3-95c6-e64ab0c58b2d\") " pod="openstack/ovn-controller-56wgh" Sep 30 07:48:54 crc 
kubenswrapper[4760]: I0930 07:48:54.898127 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/159ee554-1b62-4fe3-95c6-e64ab0c58b2d-ovn-controller-tls-certs\") pod \"ovn-controller-56wgh\" (UID: \"159ee554-1b62-4fe3-95c6-e64ab0c58b2d\") " pod="openstack/ovn-controller-56wgh" Sep 30 07:48:54 crc kubenswrapper[4760]: I0930 07:48:54.898163 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/159ee554-1b62-4fe3-95c6-e64ab0c58b2d-var-run-ovn\") pod \"ovn-controller-56wgh\" (UID: \"159ee554-1b62-4fe3-95c6-e64ab0c58b2d\") " pod="openstack/ovn-controller-56wgh" Sep 30 07:48:54 crc kubenswrapper[4760]: I0930 07:48:54.898214 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/8735bd7c-231f-47df-a404-b8cab84f0d7b-etc-ovs\") pod \"ovn-controller-ovs-bwrv9\" (UID: \"8735bd7c-231f-47df-a404-b8cab84f0d7b\") " pod="openstack/ovn-controller-ovs-bwrv9" Sep 30 07:48:54 crc kubenswrapper[4760]: I0930 07:48:54.898242 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-47vmt\" (UniqueName: \"kubernetes.io/projected/159ee554-1b62-4fe3-95c6-e64ab0c58b2d-kube-api-access-47vmt\") pod \"ovn-controller-56wgh\" (UID: \"159ee554-1b62-4fe3-95c6-e64ab0c58b2d\") " pod="openstack/ovn-controller-56wgh" Sep 30 07:48:54 crc kubenswrapper[4760]: I0930 07:48:54.898294 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/8735bd7c-231f-47df-a404-b8cab84f0d7b-var-lib\") pod \"ovn-controller-ovs-bwrv9\" (UID: \"8735bd7c-231f-47df-a404-b8cab84f0d7b\") " pod="openstack/ovn-controller-ovs-bwrv9" Sep 30 07:48:54 crc kubenswrapper[4760]: I0930 07:48:54.898359 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8735bd7c-231f-47df-a404-b8cab84f0d7b-scripts\") pod \"ovn-controller-ovs-bwrv9\" (UID: \"8735bd7c-231f-47df-a404-b8cab84f0d7b\") " pod="openstack/ovn-controller-ovs-bwrv9" Sep 30 07:48:54 crc kubenswrapper[4760]: I0930 07:48:54.898382 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/8735bd7c-231f-47df-a404-b8cab84f0d7b-var-run\") pod \"ovn-controller-ovs-bwrv9\" (UID: \"8735bd7c-231f-47df-a404-b8cab84f0d7b\") " pod="openstack/ovn-controller-ovs-bwrv9" Sep 30 07:48:54 crc kubenswrapper[4760]: I0930 07:48:54.898507 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/159ee554-1b62-4fe3-95c6-e64ab0c58b2d-var-run\") pod \"ovn-controller-56wgh\" (UID: \"159ee554-1b62-4fe3-95c6-e64ab0c58b2d\") " pod="openstack/ovn-controller-56wgh" Sep 30 07:48:54 crc kubenswrapper[4760]: I0930 07:48:54.899400 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/8735bd7c-231f-47df-a404-b8cab84f0d7b-etc-ovs\") pod \"ovn-controller-ovs-bwrv9\" (UID: \"8735bd7c-231f-47df-a404-b8cab84f0d7b\") " pod="openstack/ovn-controller-ovs-bwrv9" Sep 30 07:48:54 crc kubenswrapper[4760]: I0930 07:48:54.899885 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/8735bd7c-231f-47df-a404-b8cab84f0d7b-var-lib\") pod \"ovn-controller-ovs-bwrv9\" (UID: \"8735bd7c-231f-47df-a404-b8cab84f0d7b\") " pod="openstack/ovn-controller-ovs-bwrv9" Sep 30 07:48:54 crc kubenswrapper[4760]: I0930 07:48:54.900003 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/159ee554-1b62-4fe3-95c6-e64ab0c58b2d-var-run-ovn\") pod \"ovn-controller-56wgh\" (UID: \"159ee554-1b62-4fe3-95c6-e64ab0c58b2d\") " 
pod="openstack/ovn-controller-56wgh" Sep 30 07:48:54 crc kubenswrapper[4760]: I0930 07:48:54.900135 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/8735bd7c-231f-47df-a404-b8cab84f0d7b-var-log\") pod \"ovn-controller-ovs-bwrv9\" (UID: \"8735bd7c-231f-47df-a404-b8cab84f0d7b\") " pod="openstack/ovn-controller-ovs-bwrv9" Sep 30 07:48:54 crc kubenswrapper[4760]: I0930 07:48:54.900353 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/159ee554-1b62-4fe3-95c6-e64ab0c58b2d-var-log-ovn\") pod \"ovn-controller-56wgh\" (UID: \"159ee554-1b62-4fe3-95c6-e64ab0c58b2d\") " pod="openstack/ovn-controller-56wgh" Sep 30 07:48:54 crc kubenswrapper[4760]: I0930 07:48:54.900569 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/159ee554-1b62-4fe3-95c6-e64ab0c58b2d-scripts\") pod \"ovn-controller-56wgh\" (UID: \"159ee554-1b62-4fe3-95c6-e64ab0c58b2d\") " pod="openstack/ovn-controller-56wgh" Sep 30 07:48:54 crc kubenswrapper[4760]: I0930 07:48:54.903022 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/159ee554-1b62-4fe3-95c6-e64ab0c58b2d-ovn-controller-tls-certs\") pod \"ovn-controller-56wgh\" (UID: \"159ee554-1b62-4fe3-95c6-e64ab0c58b2d\") " pod="openstack/ovn-controller-56wgh" Sep 30 07:48:54 crc kubenswrapper[4760]: I0930 07:48:54.904111 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8735bd7c-231f-47df-a404-b8cab84f0d7b-scripts\") pod \"ovn-controller-ovs-bwrv9\" (UID: \"8735bd7c-231f-47df-a404-b8cab84f0d7b\") " pod="openstack/ovn-controller-ovs-bwrv9" Sep 30 07:48:54 crc kubenswrapper[4760]: I0930 07:48:54.904236 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/159ee554-1b62-4fe3-95c6-e64ab0c58b2d-combined-ca-bundle\") pod \"ovn-controller-56wgh\" (UID: \"159ee554-1b62-4fe3-95c6-e64ab0c58b2d\") " pod="openstack/ovn-controller-56wgh" Sep 30 07:48:54 crc kubenswrapper[4760]: I0930 07:48:54.919800 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n4xx4\" (UniqueName: \"kubernetes.io/projected/8735bd7c-231f-47df-a404-b8cab84f0d7b-kube-api-access-n4xx4\") pod \"ovn-controller-ovs-bwrv9\" (UID: \"8735bd7c-231f-47df-a404-b8cab84f0d7b\") " pod="openstack/ovn-controller-ovs-bwrv9" Sep 30 07:48:54 crc kubenswrapper[4760]: I0930 07:48:54.922529 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-47vmt\" (UniqueName: \"kubernetes.io/projected/159ee554-1b62-4fe3-95c6-e64ab0c58b2d-kube-api-access-47vmt\") pod \"ovn-controller-56wgh\" (UID: \"159ee554-1b62-4fe3-95c6-e64ab0c58b2d\") " pod="openstack/ovn-controller-56wgh" Sep 30 07:48:55 crc kubenswrapper[4760]: I0930 07:48:55.005442 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-56wgh" Sep 30 07:48:55 crc kubenswrapper[4760]: I0930 07:48:55.073026 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-ovs-bwrv9" Sep 30 07:48:56 crc kubenswrapper[4760]: I0930 07:48:56.927549 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-nb-0"] Sep 30 07:48:56 crc kubenswrapper[4760]: I0930 07:48:56.929281 4760 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-nb-0" Sep 30 07:48:56 crc kubenswrapper[4760]: I0930 07:48:56.931900 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-nb-ovndbs" Sep 30 07:48:56 crc kubenswrapper[4760]: I0930 07:48:56.932278 4760 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-config" Sep 30 07:48:56 crc kubenswrapper[4760]: I0930 07:48:56.932523 4760 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-scripts" Sep 30 07:48:56 crc kubenswrapper[4760]: I0930 07:48:56.932730 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-nb-dockercfg-nqf97" Sep 30 07:48:56 crc kubenswrapper[4760]: I0930 07:48:56.937867 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Sep 30 07:48:57 crc kubenswrapper[4760]: I0930 07:48:57.040051 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/d25b3b00-98d6-4bfc-8218-9ea7319e1c60-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"d25b3b00-98d6-4bfc-8218-9ea7319e1c60\") " pod="openstack/ovsdbserver-nb-0" Sep 30 07:48:57 crc kubenswrapper[4760]: I0930 07:48:57.040154 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7gkpw\" (UniqueName: \"kubernetes.io/projected/d25b3b00-98d6-4bfc-8218-9ea7319e1c60-kube-api-access-7gkpw\") pod \"ovsdbserver-nb-0\" (UID: \"d25b3b00-98d6-4bfc-8218-9ea7319e1c60\") " pod="openstack/ovsdbserver-nb-0" Sep 30 07:48:57 crc kubenswrapper[4760]: I0930 07:48:57.040192 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/d25b3b00-98d6-4bfc-8218-9ea7319e1c60-ovsdb-rundir\") pod 
\"ovsdbserver-nb-0\" (UID: \"d25b3b00-98d6-4bfc-8218-9ea7319e1c60\") " pod="openstack/ovsdbserver-nb-0" Sep 30 07:48:57 crc kubenswrapper[4760]: I0930 07:48:57.040214 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"ovsdbserver-nb-0\" (UID: \"d25b3b00-98d6-4bfc-8218-9ea7319e1c60\") " pod="openstack/ovsdbserver-nb-0" Sep 30 07:48:57 crc kubenswrapper[4760]: I0930 07:48:57.040230 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d25b3b00-98d6-4bfc-8218-9ea7319e1c60-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"d25b3b00-98d6-4bfc-8218-9ea7319e1c60\") " pod="openstack/ovsdbserver-nb-0" Sep 30 07:48:57 crc kubenswrapper[4760]: I0930 07:48:57.040243 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d25b3b00-98d6-4bfc-8218-9ea7319e1c60-config\") pod \"ovsdbserver-nb-0\" (UID: \"d25b3b00-98d6-4bfc-8218-9ea7319e1c60\") " pod="openstack/ovsdbserver-nb-0" Sep 30 07:48:57 crc kubenswrapper[4760]: I0930 07:48:57.040571 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/d25b3b00-98d6-4bfc-8218-9ea7319e1c60-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"d25b3b00-98d6-4bfc-8218-9ea7319e1c60\") " pod="openstack/ovsdbserver-nb-0" Sep 30 07:48:57 crc kubenswrapper[4760]: I0930 07:48:57.040619 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d25b3b00-98d6-4bfc-8218-9ea7319e1c60-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"d25b3b00-98d6-4bfc-8218-9ea7319e1c60\") " pod="openstack/ovsdbserver-nb-0" Sep 30 
07:48:57 crc kubenswrapper[4760]: I0930 07:48:57.141406 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/d25b3b00-98d6-4bfc-8218-9ea7319e1c60-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"d25b3b00-98d6-4bfc-8218-9ea7319e1c60\") " pod="openstack/ovsdbserver-nb-0" Sep 30 07:48:57 crc kubenswrapper[4760]: I0930 07:48:57.141702 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d25b3b00-98d6-4bfc-8218-9ea7319e1c60-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"d25b3b00-98d6-4bfc-8218-9ea7319e1c60\") " pod="openstack/ovsdbserver-nb-0" Sep 30 07:48:57 crc kubenswrapper[4760]: I0930 07:48:57.141752 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/d25b3b00-98d6-4bfc-8218-9ea7319e1c60-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"d25b3b00-98d6-4bfc-8218-9ea7319e1c60\") " pod="openstack/ovsdbserver-nb-0" Sep 30 07:48:57 crc kubenswrapper[4760]: I0930 07:48:57.141785 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7gkpw\" (UniqueName: \"kubernetes.io/projected/d25b3b00-98d6-4bfc-8218-9ea7319e1c60-kube-api-access-7gkpw\") pod \"ovsdbserver-nb-0\" (UID: \"d25b3b00-98d6-4bfc-8218-9ea7319e1c60\") " pod="openstack/ovsdbserver-nb-0" Sep 30 07:48:57 crc kubenswrapper[4760]: I0930 07:48:57.141806 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/d25b3b00-98d6-4bfc-8218-9ea7319e1c60-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"d25b3b00-98d6-4bfc-8218-9ea7319e1c60\") " pod="openstack/ovsdbserver-nb-0" Sep 30 07:48:57 crc kubenswrapper[4760]: I0930 07:48:57.141827 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"ovsdbserver-nb-0\" (UID: \"d25b3b00-98d6-4bfc-8218-9ea7319e1c60\") " pod="openstack/ovsdbserver-nb-0" Sep 30 07:48:57 crc kubenswrapper[4760]: I0930 07:48:57.141844 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d25b3b00-98d6-4bfc-8218-9ea7319e1c60-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"d25b3b00-98d6-4bfc-8218-9ea7319e1c60\") " pod="openstack/ovsdbserver-nb-0" Sep 30 07:48:57 crc kubenswrapper[4760]: I0930 07:48:57.141862 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d25b3b00-98d6-4bfc-8218-9ea7319e1c60-config\") pod \"ovsdbserver-nb-0\" (UID: \"d25b3b00-98d6-4bfc-8218-9ea7319e1c60\") " pod="openstack/ovsdbserver-nb-0" Sep 30 07:48:57 crc kubenswrapper[4760]: I0930 07:48:57.142208 4760 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"ovsdbserver-nb-0\" (UID: \"d25b3b00-98d6-4bfc-8218-9ea7319e1c60\") device mount path \"/mnt/openstack/pv02\"" pod="openstack/ovsdbserver-nb-0" Sep 30 07:48:57 crc kubenswrapper[4760]: I0930 07:48:57.142788 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/d25b3b00-98d6-4bfc-8218-9ea7319e1c60-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"d25b3b00-98d6-4bfc-8218-9ea7319e1c60\") " pod="openstack/ovsdbserver-nb-0" Sep 30 07:48:57 crc kubenswrapper[4760]: I0930 07:48:57.142886 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d25b3b00-98d6-4bfc-8218-9ea7319e1c60-config\") pod \"ovsdbserver-nb-0\" (UID: \"d25b3b00-98d6-4bfc-8218-9ea7319e1c60\") " pod="openstack/ovsdbserver-nb-0" Sep 30 
07:48:57 crc kubenswrapper[4760]: I0930 07:48:57.143809 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d25b3b00-98d6-4bfc-8218-9ea7319e1c60-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"d25b3b00-98d6-4bfc-8218-9ea7319e1c60\") " pod="openstack/ovsdbserver-nb-0" Sep 30 07:48:57 crc kubenswrapper[4760]: I0930 07:48:57.148155 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/d25b3b00-98d6-4bfc-8218-9ea7319e1c60-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"d25b3b00-98d6-4bfc-8218-9ea7319e1c60\") " pod="openstack/ovsdbserver-nb-0" Sep 30 07:48:57 crc kubenswrapper[4760]: I0930 07:48:57.148828 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d25b3b00-98d6-4bfc-8218-9ea7319e1c60-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"d25b3b00-98d6-4bfc-8218-9ea7319e1c60\") " pod="openstack/ovsdbserver-nb-0" Sep 30 07:48:57 crc kubenswrapper[4760]: I0930 07:48:57.153219 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/d25b3b00-98d6-4bfc-8218-9ea7319e1c60-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"d25b3b00-98d6-4bfc-8218-9ea7319e1c60\") " pod="openstack/ovsdbserver-nb-0" Sep 30 07:48:57 crc kubenswrapper[4760]: I0930 07:48:57.159029 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7gkpw\" (UniqueName: \"kubernetes.io/projected/d25b3b00-98d6-4bfc-8218-9ea7319e1c60-kube-api-access-7gkpw\") pod \"ovsdbserver-nb-0\" (UID: \"d25b3b00-98d6-4bfc-8218-9ea7319e1c60\") " pod="openstack/ovsdbserver-nb-0" Sep 30 07:48:57 crc kubenswrapper[4760]: I0930 07:48:57.166053 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage02-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage02-crc\") pod \"ovsdbserver-nb-0\" (UID: \"d25b3b00-98d6-4bfc-8218-9ea7319e1c60\") " pod="openstack/ovsdbserver-nb-0" Sep 30 07:48:57 crc kubenswrapper[4760]: I0930 07:48:57.251353 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0" Sep 30 07:48:59 crc kubenswrapper[4760]: I0930 07:48:59.484340 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Sep 30 07:48:59 crc kubenswrapper[4760]: I0930 07:48:59.574487 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-9fxn6" event={"ID":"d6258f1d-30e6-4da1-b567-e37509041f6f","Type":"ContainerStarted","Data":"cd9fab081c150991be83f649bf7d0c1dde39f44100fbc7d5e5eaa469eed638c8"} Sep 30 07:48:59 crc kubenswrapper[4760]: I0930 07:48:59.823416 4760 scope.go:117] "RemoveContainer" containerID="b979f0e0ff859a4b2434fbec37384b18a5cb6f716386096a1c2f8126ce5c81a4" Sep 30 07:48:59 crc kubenswrapper[4760]: E0930 07:48:59.866877 4760 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Sep 30 07:48:59 crc kubenswrapper[4760]: E0930 07:48:59.867094 4760 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries 
--test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:nffh5bdhf4h5f8h79h55h77h58fh56dh7bh6fh578hbch55dh68h56bhd9h65dh57ch658hc9h566h666h688h58h65dh684h5d7h6ch575h5d6h88q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-t25zg,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-675f4bcbfc-8t257_openstack(230224f8-bc7e-491d-8545-ce88622bd97d): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Sep 30 07:48:59 crc kubenswrapper[4760]: E0930 07:48:59.868243 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" 
pod="openstack/dnsmasq-dns-675f4bcbfc-8t257" podUID="230224f8-bc7e-491d-8545-ce88622bd97d" Sep 30 07:48:59 crc kubenswrapper[4760]: E0930 07:48:59.911418 4760 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Sep 30 07:48:59 crc kubenswrapper[4760]: E0930 07:48:59.911591 4760 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries --test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:ndfhb5h667h568h584h5f9h58dh565h664h587h597h577h64bh5c4h66fh647hbdh68ch5c5h68dh686h5f7h64hd7hc6h55fh57bh98h57fh87h5fh57fq,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-vzpxk,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePul
lPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-78dd6ddcc-bzpjr_openstack(5318712b-bf53-4f94-9abd-4e1b97e8311a): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Sep 30 07:48:59 crc kubenswrapper[4760]: E0930 07:48:59.913360 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-78dd6ddcc-bzpjr" podUID="5318712b-bf53-4f94-9abd-4e1b97e8311a" Sep 30 07:49:00 crc kubenswrapper[4760]: I0930 07:49:00.600013 4760 generic.go:334] "Generic (PLEG): container finished" podID="d6258f1d-30e6-4da1-b567-e37509041f6f" containerID="68e5cf3102c89318e7520f6ebb8202a77c277703d76133534cc202e1d139e965" exitCode=0 Sep 30 07:49:00 crc kubenswrapper[4760]: I0930 07:49:00.600520 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-9fxn6" event={"ID":"d6258f1d-30e6-4da1-b567-e37509041f6f","Type":"ContainerDied","Data":"68e5cf3102c89318e7520f6ebb8202a77c277703d76133534cc202e1d139e965"} Sep 30 07:49:00 crc kubenswrapper[4760]: I0930 07:49:00.602086 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" 
event={"ID":"888bbd15-0d32-47ca-9f81-94eaf8f3c4df","Type":"ContainerStarted","Data":"0a065cdb91978c59cd10d49cd43725b135a52e29266c8c13fece95c6c52b40e2"} Sep 30 07:49:00 crc kubenswrapper[4760]: I0930 07:49:00.611499 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Sep 30 07:49:00 crc kubenswrapper[4760]: I0930 07:49:00.615683 4760 generic.go:334] "Generic (PLEG): container finished" podID="4f6db3f0-59be-48d1-a2a7-8302680b2f4c" containerID="e323429a8d8d8892374a70e4060ab0156b081244923f13eaa8c8c233d118f092" exitCode=0 Sep 30 07:49:00 crc kubenswrapper[4760]: I0930 07:49:00.616650 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666b6646f7-ccxrw" event={"ID":"4f6db3f0-59be-48d1-a2a7-8302680b2f4c","Type":"ContainerDied","Data":"e323429a8d8d8892374a70e4060ab0156b081244923f13eaa8c8c233d118f092"} Sep 30 07:49:00 crc kubenswrapper[4760]: I0930 07:49:00.632664 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Sep 30 07:49:00 crc kubenswrapper[4760]: I0930 07:49:00.637598 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Sep 30 07:49:00 crc kubenswrapper[4760]: I0930 07:49:00.650659 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Sep 30 07:49:00 crc kubenswrapper[4760]: I0930 07:49:00.666399 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-56wgh"] Sep 30 07:49:00 crc kubenswrapper[4760]: I0930 07:49:00.671037 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Sep 30 07:49:00 crc kubenswrapper[4760]: I0930 07:49:00.683245 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/prometheus-metric-storage-0"] Sep 30 07:49:00 crc kubenswrapper[4760]: I0930 07:49:00.809129 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Sep 30 07:49:00 crc 
kubenswrapper[4760]: I0930 07:49:00.933224 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Sep 30 07:49:01 crc kubenswrapper[4760]: I0930 07:49:01.052066 4760 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-8t257" Sep 30 07:49:01 crc kubenswrapper[4760]: I0930 07:49:01.149773 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/230224f8-bc7e-491d-8545-ce88622bd97d-config\") pod \"230224f8-bc7e-491d-8545-ce88622bd97d\" (UID: \"230224f8-bc7e-491d-8545-ce88622bd97d\") " Sep 30 07:49:01 crc kubenswrapper[4760]: I0930 07:49:01.149890 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t25zg\" (UniqueName: \"kubernetes.io/projected/230224f8-bc7e-491d-8545-ce88622bd97d-kube-api-access-t25zg\") pod \"230224f8-bc7e-491d-8545-ce88622bd97d\" (UID: \"230224f8-bc7e-491d-8545-ce88622bd97d\") " Sep 30 07:49:01 crc kubenswrapper[4760]: I0930 07:49:01.150290 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/230224f8-bc7e-491d-8545-ce88622bd97d-config" (OuterVolumeSpecName: "config") pod "230224f8-bc7e-491d-8545-ce88622bd97d" (UID: "230224f8-bc7e-491d-8545-ce88622bd97d"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 07:49:01 crc kubenswrapper[4760]: I0930 07:49:01.151082 4760 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/230224f8-bc7e-491d-8545-ce88622bd97d-config\") on node \"crc\" DevicePath \"\"" Sep 30 07:49:01 crc kubenswrapper[4760]: I0930 07:49:01.194868 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/230224f8-bc7e-491d-8545-ce88622bd97d-kube-api-access-t25zg" (OuterVolumeSpecName: "kube-api-access-t25zg") pod "230224f8-bc7e-491d-8545-ce88622bd97d" (UID: "230224f8-bc7e-491d-8545-ce88622bd97d"). InnerVolumeSpecName "kube-api-access-t25zg". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 07:49:01 crc kubenswrapper[4760]: I0930 07:49:01.232711 4760 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-bzpjr" Sep 30 07:49:01 crc kubenswrapper[4760]: I0930 07:49:01.253033 4760 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t25zg\" (UniqueName: \"kubernetes.io/projected/230224f8-bc7e-491d-8545-ce88622bd97d-kube-api-access-t25zg\") on node \"crc\" DevicePath \"\"" Sep 30 07:49:01 crc kubenswrapper[4760]: I0930 07:49:01.354059 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5318712b-bf53-4f94-9abd-4e1b97e8311a-dns-svc\") pod \"5318712b-bf53-4f94-9abd-4e1b97e8311a\" (UID: \"5318712b-bf53-4f94-9abd-4e1b97e8311a\") " Sep 30 07:49:01 crc kubenswrapper[4760]: I0930 07:49:01.354357 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5318712b-bf53-4f94-9abd-4e1b97e8311a-config\") pod \"5318712b-bf53-4f94-9abd-4e1b97e8311a\" (UID: \"5318712b-bf53-4f94-9abd-4e1b97e8311a\") " Sep 30 07:49:01 crc kubenswrapper[4760]: I0930 07:49:01.354539 4760 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vzpxk\" (UniqueName: \"kubernetes.io/projected/5318712b-bf53-4f94-9abd-4e1b97e8311a-kube-api-access-vzpxk\") pod \"5318712b-bf53-4f94-9abd-4e1b97e8311a\" (UID: \"5318712b-bf53-4f94-9abd-4e1b97e8311a\") " Sep 30 07:49:01 crc kubenswrapper[4760]: I0930 07:49:01.354821 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5318712b-bf53-4f94-9abd-4e1b97e8311a-config" (OuterVolumeSpecName: "config") pod "5318712b-bf53-4f94-9abd-4e1b97e8311a" (UID: "5318712b-bf53-4f94-9abd-4e1b97e8311a"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 07:49:01 crc kubenswrapper[4760]: I0930 07:49:01.354816 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5318712b-bf53-4f94-9abd-4e1b97e8311a-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "5318712b-bf53-4f94-9abd-4e1b97e8311a" (UID: "5318712b-bf53-4f94-9abd-4e1b97e8311a"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 07:49:01 crc kubenswrapper[4760]: I0930 07:49:01.358340 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5318712b-bf53-4f94-9abd-4e1b97e8311a-kube-api-access-vzpxk" (OuterVolumeSpecName: "kube-api-access-vzpxk") pod "5318712b-bf53-4f94-9abd-4e1b97e8311a" (UID: "5318712b-bf53-4f94-9abd-4e1b97e8311a"). InnerVolumeSpecName "kube-api-access-vzpxk". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 07:49:01 crc kubenswrapper[4760]: I0930 07:49:01.456371 4760 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5318712b-bf53-4f94-9abd-4e1b97e8311a-config\") on node \"crc\" DevicePath \"\"" Sep 30 07:49:01 crc kubenswrapper[4760]: I0930 07:49:01.456406 4760 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vzpxk\" (UniqueName: \"kubernetes.io/projected/5318712b-bf53-4f94-9abd-4e1b97e8311a-kube-api-access-vzpxk\") on node \"crc\" DevicePath \"\"" Sep 30 07:49:01 crc kubenswrapper[4760]: I0930 07:49:01.456416 4760 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5318712b-bf53-4f94-9abd-4e1b97e8311a-dns-svc\") on node \"crc\" DevicePath \"\"" Sep 30 07:49:01 crc kubenswrapper[4760]: I0930 07:49:01.623721 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"894abb89-f647-4143-904c-88b5108982cd","Type":"ContainerStarted","Data":"7ad6672314869fd567bb9aa50a9e93409cf818d954872d1a877eabd6bed2a92d"} Sep 30 07:49:01 crc kubenswrapper[4760]: I0930 07:49:01.625096 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"82b71e6c-ab34-447e-87e0-a95a9f070efe","Type":"ContainerStarted","Data":"1773a95e636378477ff3ac5f0e1367a058d156fbd8e076ab13e602f4bbee54e8"} Sep 30 07:49:01 crc kubenswrapper[4760]: I0930 07:49:01.626900 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"d2f7adda-d2ed-4c87-8e63-64e344155305","Type":"ContainerStarted","Data":"04c841c56158414eafb923f7a114ad9de99c28702e28e750f71991138e5e931a"} Sep 30 07:49:01 crc kubenswrapper[4760]: I0930 07:49:01.628948 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-56wgh" 
event={"ID":"159ee554-1b62-4fe3-95c6-e64ab0c58b2d","Type":"ContainerStarted","Data":"10a7c118dbf5fd6b8fab554381aa16b93060c48cda37bcbb6e8658f0621fc05d"}
Sep 30 07:49:01 crc kubenswrapper[4760]: I0930 07:49:01.631145 4760 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-8t257"
Sep 30 07:49:01 crc kubenswrapper[4760]: I0930 07:49:01.631152 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-675f4bcbfc-8t257" event={"ID":"230224f8-bc7e-491d-8545-ce88622bd97d","Type":"ContainerDied","Data":"2b6e2744bf9f83f053d60f0888131a4aeb692782d2d01f5af824a40c9aff0f47"}
Sep 30 07:49:01 crc kubenswrapper[4760]: I0930 07:49:01.644227 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"23bad0af-c21e-4ba1-bc39-39c48f0fea56","Type":"ContainerStarted","Data":"a7978639ab6b98dc70f658a0441f8e021e96ea6fab411af0b2ccce15d232ff62"}
Sep 30 07:49:01 crc kubenswrapper[4760]: I0930 07:49:01.649170 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666b6646f7-ccxrw" event={"ID":"4f6db3f0-59be-48d1-a2a7-8302680b2f4c","Type":"ContainerStarted","Data":"ecf9525bb497f231e4d7ae0676cdeb79bfbac57485616397ac969dc578b52c48"}
Sep 30 07:49:01 crc kubenswrapper[4760]: I0930 07:49:01.649329 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-666b6646f7-ccxrw"
Sep 30 07:49:01 crc kubenswrapper[4760]: I0930 07:49:01.650442 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"d25b3b00-98d6-4bfc-8218-9ea7319e1c60","Type":"ContainerStarted","Data":"1ff0d78cc0402656eb51ec4d2c6d2ccb9d7379dc774d119a336e3dba0cd001e3"}
Sep 30 07:49:01 crc kubenswrapper[4760]: I0930 07:49:01.651711 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78dd6ddcc-bzpjr" event={"ID":"5318712b-bf53-4f94-9abd-4e1b97e8311a","Type":"ContainerDied","Data":"02c0b2e3dc96917c787723f8e2725cb62524ed236e5478c97dc1d1bbc5e56dba"}
Sep 30 07:49:01 crc kubenswrapper[4760]: I0930 07:49:01.651767 4760 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-bzpjr"
Sep 30 07:49:01 crc kubenswrapper[4760]: I0930 07:49:01.653340 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"641818bf-a81e-4654-a8f7-c8d06fbefc6c","Type":"ContainerStarted","Data":"8a1b661f06bbeb86c6044350193942e474e372a6be3c130f92083a9931064031"}
Sep 30 07:49:01 crc kubenswrapper[4760]: I0930 07:49:01.655769 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-9fxn6" event={"ID":"d6258f1d-30e6-4da1-b567-e37509041f6f","Type":"ContainerStarted","Data":"ff7c0164e34617614a75b6a09d049e542f7bbaf9f3e1fc30287f9c2bfb0ea044"}
Sep 30 07:49:01 crc kubenswrapper[4760]: I0930 07:49:01.655863 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-57d769cc4f-9fxn6"
Sep 30 07:49:01 crc kubenswrapper[4760]: I0930 07:49:01.657035 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"598c1476-b9fa-48c1-a346-80e23448d00f","Type":"ContainerStarted","Data":"3d5b6ae4dcdea4099a722cf87fcaf0a56e380e2374e12d61e273dfd0a432a215"}
Sep 30 07:49:01 crc kubenswrapper[4760]: I0930 07:49:01.665702 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"007888b6-d5c6-410a-955a-ed78adf759bd","Type":"ContainerStarted","Data":"7b3066eb4677a9d3e7086f1578421e5208f678caca29d8e696ff03175933c15b"}
Sep 30 07:49:01 crc kubenswrapper[4760]: I0930 07:49:01.685691 4760 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-8t257"]
Sep 30 07:49:01 crc kubenswrapper[4760]: I0930 07:49:01.693046 4760 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-8t257"]
Sep 30 07:49:01 crc kubenswrapper[4760]: I0930 07:49:01.699101 4760 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-57d769cc4f-9fxn6" podStartSLOduration=17.656653939999998 podStartE2EDuration="18.699089344s" podCreationTimestamp="2025-09-30 07:48:43 +0000 UTC" firstStartedPulling="2025-09-30 07:48:58.967477622 +0000 UTC m=+924.610384034" lastFinishedPulling="2025-09-30 07:49:00.009913026 +0000 UTC m=+925.652819438" observedRunningTime="2025-09-30 07:49:01.691260954 +0000 UTC m=+927.334167366" watchObservedRunningTime="2025-09-30 07:49:01.699089344 +0000 UTC m=+927.341995746"
Sep 30 07:49:01 crc kubenswrapper[4760]: I0930 07:49:01.712693 4760 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-666b6646f7-ccxrw" podStartSLOduration=12.966781271 podStartE2EDuration="18.71267772s" podCreationTimestamp="2025-09-30 07:48:43 +0000 UTC" firstStartedPulling="2025-09-30 07:48:54.255607353 +0000 UTC m=+919.898513765" lastFinishedPulling="2025-09-30 07:49:00.001503792 +0000 UTC m=+925.644410214" observedRunningTime="2025-09-30 07:49:01.711584112 +0000 UTC m=+927.354490524" watchObservedRunningTime="2025-09-30 07:49:01.71267772 +0000 UTC m=+927.355584132"
Sep 30 07:49:01 crc kubenswrapper[4760]: I0930 07:49:01.743438 4760 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-bzpjr"]
Sep 30 07:49:01 crc kubenswrapper[4760]: I0930 07:49:01.745140 4760 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-bzpjr"]
Sep 30 07:49:01 crc kubenswrapper[4760]: I0930 07:49:01.869747 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-bwrv9"]
Sep 30 07:49:03 crc kubenswrapper[4760]: I0930 07:49:03.076067 4760 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="230224f8-bc7e-491d-8545-ce88622bd97d" path="/var/lib/kubelet/pods/230224f8-bc7e-491d-8545-ce88622bd97d/volumes"
Sep 30 07:49:03 crc kubenswrapper[4760]: I0930 07:49:03.076489 4760 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5318712b-bf53-4f94-9abd-4e1b97e8311a" path="/var/lib/kubelet/pods/5318712b-bf53-4f94-9abd-4e1b97e8311a/volumes"
Sep 30 07:49:03 crc kubenswrapper[4760]: W0930 07:49:03.382550 4760 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8735bd7c_231f_47df_a404_b8cab84f0d7b.slice/crio-19cd80779c50ba6070d17f25e862ce48607ddc71795c55745feb10b87f5a86a1 WatchSource:0}: Error finding container 19cd80779c50ba6070d17f25e862ce48607ddc71795c55745feb10b87f5a86a1: Status 404 returned error can't find the container with id 19cd80779c50ba6070d17f25e862ce48607ddc71795c55745feb10b87f5a86a1
Sep 30 07:49:03 crc kubenswrapper[4760]: I0930 07:49:03.680540 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-bwrv9" event={"ID":"8735bd7c-231f-47df-a404-b8cab84f0d7b","Type":"ContainerStarted","Data":"19cd80779c50ba6070d17f25e862ce48607ddc71795c55745feb10b87f5a86a1"}
Sep 30 07:49:08 crc kubenswrapper[4760]: I0930 07:49:08.718743 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-666b6646f7-ccxrw"
Sep 30 07:49:09 crc kubenswrapper[4760]: I0930 07:49:09.056213 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-57d769cc4f-9fxn6"
Sep 30 07:49:09 crc kubenswrapper[4760]: I0930 07:49:09.105764 4760 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-ccxrw"]
Sep 30 07:49:09 crc kubenswrapper[4760]: I0930 07:49:09.724091 4760 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-666b6646f7-ccxrw" podUID="4f6db3f0-59be-48d1-a2a7-8302680b2f4c" containerName="dnsmasq-dns" containerID="cri-o://ecf9525bb497f231e4d7ae0676cdeb79bfbac57485616397ac969dc578b52c48" gracePeriod=10
Sep 30 07:49:10 crc kubenswrapper[4760]: I0930 07:49:10.351159 4760 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-666b6646f7-ccxrw"
Sep 30 07:49:10 crc kubenswrapper[4760]: I0930 07:49:10.472364 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4f6db3f0-59be-48d1-a2a7-8302680b2f4c-config\") pod \"4f6db3f0-59be-48d1-a2a7-8302680b2f4c\" (UID: \"4f6db3f0-59be-48d1-a2a7-8302680b2f4c\") "
Sep 30 07:49:10 crc kubenswrapper[4760]: I0930 07:49:10.472593 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4f6db3f0-59be-48d1-a2a7-8302680b2f4c-dns-svc\") pod \"4f6db3f0-59be-48d1-a2a7-8302680b2f4c\" (UID: \"4f6db3f0-59be-48d1-a2a7-8302680b2f4c\") "
Sep 30 07:49:10 crc kubenswrapper[4760]: I0930 07:49:10.472721 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xd6qt\" (UniqueName: \"kubernetes.io/projected/4f6db3f0-59be-48d1-a2a7-8302680b2f4c-kube-api-access-xd6qt\") pod \"4f6db3f0-59be-48d1-a2a7-8302680b2f4c\" (UID: \"4f6db3f0-59be-48d1-a2a7-8302680b2f4c\") "
Sep 30 07:49:10 crc kubenswrapper[4760]: I0930 07:49:10.537708 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4f6db3f0-59be-48d1-a2a7-8302680b2f4c-kube-api-access-xd6qt" (OuterVolumeSpecName: "kube-api-access-xd6qt") pod "4f6db3f0-59be-48d1-a2a7-8302680b2f4c" (UID: "4f6db3f0-59be-48d1-a2a7-8302680b2f4c"). InnerVolumeSpecName "kube-api-access-xd6qt". PluginName "kubernetes.io/projected", VolumeGidValue ""
Sep 30 07:49:10 crc kubenswrapper[4760]: I0930 07:49:10.575283 4760 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xd6qt\" (UniqueName: \"kubernetes.io/projected/4f6db3f0-59be-48d1-a2a7-8302680b2f4c-kube-api-access-xd6qt\") on node \"crc\" DevicePath \"\""
Sep 30 07:49:10 crc kubenswrapper[4760]: I0930 07:49:10.674801 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4f6db3f0-59be-48d1-a2a7-8302680b2f4c-config" (OuterVolumeSpecName: "config") pod "4f6db3f0-59be-48d1-a2a7-8302680b2f4c" (UID: "4f6db3f0-59be-48d1-a2a7-8302680b2f4c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Sep 30 07:49:10 crc kubenswrapper[4760]: I0930 07:49:10.678462 4760 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4f6db3f0-59be-48d1-a2a7-8302680b2f4c-config\") on node \"crc\" DevicePath \"\""
Sep 30 07:49:10 crc kubenswrapper[4760]: I0930 07:49:10.731804 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"d25b3b00-98d6-4bfc-8218-9ea7319e1c60","Type":"ContainerStarted","Data":"1d1b739938c1e5093964ec633e02c60c67a0847c3627d5b1a3d73abc56dc349f"}
Sep 30 07:49:10 crc kubenswrapper[4760]: I0930 07:49:10.733947 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"d2f7adda-d2ed-4c87-8e63-64e344155305","Type":"ContainerStarted","Data":"2b3b0b7c21f7ec6b8e3a9da00a261cfb8a5926fd1d80c607a1275393288fbddc"}
Sep 30 07:49:10 crc kubenswrapper[4760]: I0930 07:49:10.735709 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-bwrv9" event={"ID":"8735bd7c-231f-47df-a404-b8cab84f0d7b","Type":"ContainerStarted","Data":"bb5c783116409821c8d036cd4624b907aaab1ab8f5f4d0871fa459f26babe319"}
Sep 30 07:49:10 crc kubenswrapper[4760]: I0930 07:49:10.737603 4760 generic.go:334] "Generic (PLEG): container finished" podID="4f6db3f0-59be-48d1-a2a7-8302680b2f4c" containerID="ecf9525bb497f231e4d7ae0676cdeb79bfbac57485616397ac969dc578b52c48" exitCode=0
Sep 30 07:49:10 crc kubenswrapper[4760]: I0930 07:49:10.737648 4760 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-666b6646f7-ccxrw"
Sep 30 07:49:10 crc kubenswrapper[4760]: I0930 07:49:10.737677 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666b6646f7-ccxrw" event={"ID":"4f6db3f0-59be-48d1-a2a7-8302680b2f4c","Type":"ContainerDied","Data":"ecf9525bb497f231e4d7ae0676cdeb79bfbac57485616397ac969dc578b52c48"}
Sep 30 07:49:10 crc kubenswrapper[4760]: I0930 07:49:10.737711 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666b6646f7-ccxrw" event={"ID":"4f6db3f0-59be-48d1-a2a7-8302680b2f4c","Type":"ContainerDied","Data":"8834c44b95935ffd49c473721082dbfb8c1154c6357bfc08919770e948bcfb03"}
Sep 30 07:49:10 crc kubenswrapper[4760]: I0930 07:49:10.737737 4760 scope.go:117] "RemoveContainer" containerID="ecf9525bb497f231e4d7ae0676cdeb79bfbac57485616397ac969dc578b52c48"
Sep 30 07:49:10 crc kubenswrapper[4760]: I0930 07:49:10.742037 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"82b71e6c-ab34-447e-87e0-a95a9f070efe","Type":"ContainerStarted","Data":"6b118dc533d1475f2056129842cbda4e9708447c504c86b0c22d38fcd2a5b9b2"}
Sep 30 07:49:10 crc kubenswrapper[4760]: I0930 07:49:10.748629 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"641818bf-a81e-4654-a8f7-c8d06fbefc6c","Type":"ContainerStarted","Data":"4d4406eaa181c26bb7e05fb6765d4c05c8f631bab44cc919a27bffd7201e1be6"}
Sep 30 07:49:10 crc kubenswrapper[4760]: I0930 07:49:10.750588 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"598c1476-b9fa-48c1-a346-80e23448d00f","Type":"ContainerStarted","Data":"03212bd3449533002b5d0f55dd9e277ab96ffe5938587f2e5be2ce5d85d87ebe"}
Sep 30 07:49:10 crc kubenswrapper[4760]: I0930 07:49:10.750724 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/memcached-0"
Sep 30 07:49:10 crc kubenswrapper[4760]: I0930 07:49:10.752544 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"894abb89-f647-4143-904c-88b5108982cd","Type":"ContainerStarted","Data":"d01c95707eb74c03adda0c5fdca708634feb4b5d0e2e7f8860a1044ea8810361"}
Sep 30 07:49:10 crc kubenswrapper[4760]: I0930 07:49:10.754322 4760 scope.go:117] "RemoveContainer" containerID="e323429a8d8d8892374a70e4060ab0156b081244923f13eaa8c8c233d118f092"
Sep 30 07:49:10 crc kubenswrapper[4760]: I0930 07:49:10.757335 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"888bbd15-0d32-47ca-9f81-94eaf8f3c4df","Type":"ContainerStarted","Data":"039b885c04db48743019cb8fd332d719e0236e3f107ba92ef62a4f193fe33d92"}
Sep 30 07:49:10 crc kubenswrapper[4760]: I0930 07:49:10.775625 4760 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/memcached-0" podStartSLOduration=15.615941534 podStartE2EDuration="23.775610869s" podCreationTimestamp="2025-09-30 07:48:47 +0000 UTC" firstStartedPulling="2025-09-30 07:49:00.654512175 +0000 UTC m=+926.297418587" lastFinishedPulling="2025-09-30 07:49:08.8141815 +0000 UTC m=+934.457087922" observedRunningTime="2025-09-30 07:49:10.771695729 +0000 UTC m=+936.414602141" watchObservedRunningTime="2025-09-30 07:49:10.775610869 +0000 UTC m=+936.418517281"
Sep 30 07:49:10 crc kubenswrapper[4760]: I0930 07:49:10.799875 4760 scope.go:117] "RemoveContainer" containerID="ecf9525bb497f231e4d7ae0676cdeb79bfbac57485616397ac969dc578b52c48"
Sep 30 07:49:10 crc kubenswrapper[4760]: E0930 07:49:10.800563 4760 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ecf9525bb497f231e4d7ae0676cdeb79bfbac57485616397ac969dc578b52c48\": container with ID starting with ecf9525bb497f231e4d7ae0676cdeb79bfbac57485616397ac969dc578b52c48 not found: ID does not exist" containerID="ecf9525bb497f231e4d7ae0676cdeb79bfbac57485616397ac969dc578b52c48"
Sep 30 07:49:10 crc kubenswrapper[4760]: I0930 07:49:10.800608 4760 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ecf9525bb497f231e4d7ae0676cdeb79bfbac57485616397ac969dc578b52c48"} err="failed to get container status \"ecf9525bb497f231e4d7ae0676cdeb79bfbac57485616397ac969dc578b52c48\": rpc error: code = NotFound desc = could not find container \"ecf9525bb497f231e4d7ae0676cdeb79bfbac57485616397ac969dc578b52c48\": container with ID starting with ecf9525bb497f231e4d7ae0676cdeb79bfbac57485616397ac969dc578b52c48 not found: ID does not exist"
Sep 30 07:49:10 crc kubenswrapper[4760]: I0930 07:49:10.800636 4760 scope.go:117] "RemoveContainer" containerID="e323429a8d8d8892374a70e4060ab0156b081244923f13eaa8c8c233d118f092"
Sep 30 07:49:10 crc kubenswrapper[4760]: E0930 07:49:10.800907 4760 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e323429a8d8d8892374a70e4060ab0156b081244923f13eaa8c8c233d118f092\": container with ID starting with e323429a8d8d8892374a70e4060ab0156b081244923f13eaa8c8c233d118f092 not found: ID does not exist" containerID="e323429a8d8d8892374a70e4060ab0156b081244923f13eaa8c8c233d118f092"
Sep 30 07:49:10 crc kubenswrapper[4760]: I0930 07:49:10.800940 4760 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e323429a8d8d8892374a70e4060ab0156b081244923f13eaa8c8c233d118f092"} err="failed to get container status \"e323429a8d8d8892374a70e4060ab0156b081244923f13eaa8c8c233d118f092\": rpc error: code = NotFound desc = could not find container \"e323429a8d8d8892374a70e4060ab0156b081244923f13eaa8c8c233d118f092\": container with ID starting with e323429a8d8d8892374a70e4060ab0156b081244923f13eaa8c8c233d118f092 not found: ID does not exist"
Sep 30 07:49:11 crc kubenswrapper[4760]: I0930 07:49:11.189255 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4f6db3f0-59be-48d1-a2a7-8302680b2f4c-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "4f6db3f0-59be-48d1-a2a7-8302680b2f4c" (UID: "4f6db3f0-59be-48d1-a2a7-8302680b2f4c"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Sep 30 07:49:11 crc kubenswrapper[4760]: I0930 07:49:11.287983 4760 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4f6db3f0-59be-48d1-a2a7-8302680b2f4c-dns-svc\") on node \"crc\" DevicePath \"\""
Sep 30 07:49:11 crc kubenswrapper[4760]: I0930 07:49:11.365616 4760 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-ccxrw"]
Sep 30 07:49:11 crc kubenswrapper[4760]: I0930 07:49:11.374016 4760 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-ccxrw"]
Sep 30 07:49:11 crc kubenswrapper[4760]: I0930 07:49:11.770248 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-56wgh" event={"ID":"159ee554-1b62-4fe3-95c6-e64ab0c58b2d","Type":"ContainerStarted","Data":"325411fe81ead240e0b4f3e24e12365f2520a51984764f53694842b91230692f"}
Sep 30 07:49:11 crc kubenswrapper[4760]: I0930 07:49:11.770367 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-56wgh"
Sep 30 07:49:11 crc kubenswrapper[4760]: I0930 07:49:11.773548 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"23bad0af-c21e-4ba1-bc39-39c48f0fea56","Type":"ContainerStarted","Data":"9c0afca8ae3aeebf6add76b7fc525a9757fa0cf9f077654066903473a02b7265"}
Sep 30 07:49:11 crc kubenswrapper[4760]: I0930 07:49:11.773673 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0"
Sep 30 07:49:11 crc kubenswrapper[4760]: I0930 07:49:11.775258 4760 generic.go:334] "Generic (PLEG): container finished" podID="8735bd7c-231f-47df-a404-b8cab84f0d7b" containerID="bb5c783116409821c8d036cd4624b907aaab1ab8f5f4d0871fa459f26babe319" exitCode=0
Sep 30 07:49:11 crc kubenswrapper[4760]: I0930 07:49:11.775342 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-bwrv9" event={"ID":"8735bd7c-231f-47df-a404-b8cab84f0d7b","Type":"ContainerDied","Data":"bb5c783116409821c8d036cd4624b907aaab1ab8f5f4d0871fa459f26babe319"}
Sep 30 07:49:11 crc kubenswrapper[4760]: I0930 07:49:11.805746 4760 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-56wgh" podStartSLOduration=9.253061353 podStartE2EDuration="17.805725049s" podCreationTimestamp="2025-09-30 07:48:54 +0000 UTC" firstStartedPulling="2025-09-30 07:49:00.705348081 +0000 UTC m=+926.348254493" lastFinishedPulling="2025-09-30 07:49:09.258011777 +0000 UTC m=+934.900918189" observedRunningTime="2025-09-30 07:49:11.788071589 +0000 UTC m=+937.430978011" watchObservedRunningTime="2025-09-30 07:49:11.805725049 +0000 UTC m=+937.448631461"
Sep 30 07:49:11 crc kubenswrapper[4760]: I0930 07:49:11.832123 4760 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=13.565684683 podStartE2EDuration="22.832099441s" podCreationTimestamp="2025-09-30 07:48:49 +0000 UTC" firstStartedPulling="2025-09-30 07:49:00.733501609 +0000 UTC m=+926.376408021" lastFinishedPulling="2025-09-30 07:49:09.999916367 +0000 UTC m=+935.642822779" observedRunningTime="2025-09-30 07:49:11.824906828 +0000 UTC m=+937.467813240" watchObservedRunningTime="2025-09-30 07:49:11.832099441 +0000 UTC m=+937.475005863"
Sep 30 07:49:12 crc kubenswrapper[4760]: I0930 07:49:12.785842 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"007888b6-d5c6-410a-955a-ed78adf759bd","Type":"ContainerStarted","Data":"872fb383f092e1b7cf31bdcc5e21d2fae1b56e52665f0a314c9ddddbad403eef"}
Sep 30 07:49:12 crc kubenswrapper[4760]: I0930 07:49:12.790122 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-bwrv9" event={"ID":"8735bd7c-231f-47df-a404-b8cab84f0d7b","Type":"ContainerStarted","Data":"9ef9a636ca475a64155ceeff66a788fc80cc5c0932a6e3cf44d9e970b2e4d4bd"}
Sep 30 07:49:12 crc kubenswrapper[4760]: I0930 07:49:12.790150 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-bwrv9"
Sep 30 07:49:12 crc kubenswrapper[4760]: I0930 07:49:12.790160 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-bwrv9" event={"ID":"8735bd7c-231f-47df-a404-b8cab84f0d7b","Type":"ContainerStarted","Data":"0713f2393e54189977759d0d7211f24596a85dd7c4ee790c0018d59de025520a"}
Sep 30 07:49:12 crc kubenswrapper[4760]: I0930 07:49:12.790500 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-bwrv9"
Sep 30 07:49:13 crc kubenswrapper[4760]: I0930 07:49:13.087251 4760 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4f6db3f0-59be-48d1-a2a7-8302680b2f4c" path="/var/lib/kubelet/pods/4f6db3f0-59be-48d1-a2a7-8302680b2f4c/volumes"
Sep 30 07:49:14 crc kubenswrapper[4760]: I0930 07:49:14.808351 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"d2f7adda-d2ed-4c87-8e63-64e344155305","Type":"ContainerStarted","Data":"a720f4c84320384cd71444dac541b725ba296c75e696d4b127f066d5fec54aea"}
Sep 30 07:49:14 crc kubenswrapper[4760]: I0930 07:49:14.810985 4760 generic.go:334] "Generic (PLEG): container finished" podID="641818bf-a81e-4654-a8f7-c8d06fbefc6c" containerID="4d4406eaa181c26bb7e05fb6765d4c05c8f631bab44cc919a27bffd7201e1be6" exitCode=0
Sep 30 07:49:14 crc kubenswrapper[4760]: I0930 07:49:14.811049 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"641818bf-a81e-4654-a8f7-c8d06fbefc6c","Type":"ContainerDied","Data":"4d4406eaa181c26bb7e05fb6765d4c05c8f631bab44cc919a27bffd7201e1be6"}
Sep 30 07:49:14 crc kubenswrapper[4760]: I0930 07:49:14.815390 4760 generic.go:334] "Generic (PLEG): container finished" podID="894abb89-f647-4143-904c-88b5108982cd" containerID="d01c95707eb74c03adda0c5fdca708634feb4b5d0e2e7f8860a1044ea8810361" exitCode=0
Sep 30 07:49:14 crc kubenswrapper[4760]: I0930 07:49:14.815485 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"894abb89-f647-4143-904c-88b5108982cd","Type":"ContainerDied","Data":"d01c95707eb74c03adda0c5fdca708634feb4b5d0e2e7f8860a1044ea8810361"}
Sep 30 07:49:14 crc kubenswrapper[4760]: I0930 07:49:14.819023 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"d25b3b00-98d6-4bfc-8218-9ea7319e1c60","Type":"ContainerStarted","Data":"0573da35d3fcb2c1b288612ab067d20d156a7d4bea643ab0e20b7dbc334d2379"}
Sep 30 07:49:14 crc kubenswrapper[4760]: I0930 07:49:14.837380 4760 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-sb-0" podStartSLOduration=9.472020693 podStartE2EDuration="22.83735713s" podCreationTimestamp="2025-09-30 07:48:52 +0000 UTC" firstStartedPulling="2025-09-30 07:49:00.835984763 +0000 UTC m=+926.478891175" lastFinishedPulling="2025-09-30 07:49:14.2013212 +0000 UTC m=+939.844227612" observedRunningTime="2025-09-30 07:49:14.833740608 +0000 UTC m=+940.476647020" watchObservedRunningTime="2025-09-30 07:49:14.83735713 +0000 UTC m=+940.480263552"
Sep 30 07:49:14 crc kubenswrapper[4760]: I0930 07:49:14.841669 4760 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-ovs-bwrv9" podStartSLOduration=15.302063411 podStartE2EDuration="20.84165039s" podCreationTimestamp="2025-09-30 07:48:54 +0000 UTC" firstStartedPulling="2025-09-30 07:49:03.384860823 +0000 UTC m=+929.027767235" lastFinishedPulling="2025-09-30 07:49:08.924447802 +0000 UTC m=+934.567354214" observedRunningTime="2025-09-30 07:49:12.84064312 +0000 UTC m=+938.483549522" watchObservedRunningTime="2025-09-30 07:49:14.84165039 +0000 UTC m=+940.484556812"
Sep 30 07:49:15 crc kubenswrapper[4760]: I0930 07:49:15.161367 4760 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-sb-0"
Sep 30 07:49:15 crc kubenswrapper[4760]: I0930 07:49:15.222448 4760 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-sb-0"
Sep 30 07:49:15 crc kubenswrapper[4760]: I0930 07:49:15.245633 4760 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-nb-0" podStartSLOduration=6.982637305 podStartE2EDuration="20.245617582s" podCreationTimestamp="2025-09-30 07:48:55 +0000 UTC" firstStartedPulling="2025-09-30 07:49:00.947986799 +0000 UTC m=+926.590893211" lastFinishedPulling="2025-09-30 07:49:14.210967066 +0000 UTC m=+939.853873488" observedRunningTime="2025-09-30 07:49:14.917284118 +0000 UTC m=+940.560190550" watchObservedRunningTime="2025-09-30 07:49:15.245617582 +0000 UTC m=+940.888523994"
Sep 30 07:49:15 crc kubenswrapper[4760]: I0930 07:49:15.252273 4760 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-nb-0"
Sep 30 07:49:15 crc kubenswrapper[4760]: I0930 07:49:15.299517 4760 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-nb-0"
Sep 30 07:49:15 crc kubenswrapper[4760]: I0930 07:49:15.832918 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"894abb89-f647-4143-904c-88b5108982cd","Type":"ContainerStarted","Data":"4564c02f729697043eda2140a93ee24ec0e2c4ce19cd3ea1252b2877ab4ce641"}
Sep 30 07:49:15 crc kubenswrapper[4760]: I0930 07:49:15.836367 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"641818bf-a81e-4654-a8f7-c8d06fbefc6c","Type":"ContainerStarted","Data":"1a9171ce6e3bf21fa6b79bc0e2a81e319b0384074735dc5ba2dbb2862ebbf19a"}
Sep 30 07:49:15 crc kubenswrapper[4760]: I0930 07:49:15.837257 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-sb-0"
Sep 30 07:49:15 crc kubenswrapper[4760]: I0930 07:49:15.837370 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-nb-0"
Sep 30 07:49:15 crc kubenswrapper[4760]: I0930 07:49:15.874818 4760 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-cell1-galera-0" podStartSLOduration=20.858355864 podStartE2EDuration="29.874786497s" podCreationTimestamp="2025-09-30 07:48:46 +0000 UTC" firstStartedPulling="2025-09-30 07:49:00.707605379 +0000 UTC m=+926.350511781" lastFinishedPulling="2025-09-30 07:49:09.724035972 +0000 UTC m=+935.366942414" observedRunningTime="2025-09-30 07:49:15.858970093 +0000 UTC m=+941.501876545" watchObservedRunningTime="2025-09-30 07:49:15.874786497 +0000 UTC m=+941.517692949"
Sep 30 07:49:15 crc kubenswrapper[4760]: I0930 07:49:15.888334 4760 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-galera-0" podStartSLOduration=22.764635624 podStartE2EDuration="30.888291851s" podCreationTimestamp="2025-09-30 07:48:45 +0000 UTC" firstStartedPulling="2025-09-30 07:49:00.701693208 +0000 UTC m=+926.344599620" lastFinishedPulling="2025-09-30 07:49:08.825349425 +0000 UTC m=+934.468255847" observedRunningTime="2025-09-30 07:49:15.888286311 +0000 UTC m=+941.531192803" watchObservedRunningTime="2025-09-30 07:49:15.888291851 +0000 UTC m=+941.531198273"
Sep 30 07:49:15 crc kubenswrapper[4760]: I0930 07:49:15.908582 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-nb-0"
Sep 30 07:49:15 crc kubenswrapper[4760]: I0930 07:49:15.918413 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-sb-0"
Sep 30 07:49:16 crc kubenswrapper[4760]: I0930 07:49:16.175023 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5bf47b49b7-hg9bf"]
Sep 30 07:49:16 crc kubenswrapper[4760]: E0930 07:49:16.175578 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4f6db3f0-59be-48d1-a2a7-8302680b2f4c" containerName="init"
Sep 30 07:49:16 crc kubenswrapper[4760]: I0930 07:49:16.175593 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="4f6db3f0-59be-48d1-a2a7-8302680b2f4c" containerName="init"
Sep 30 07:49:16 crc kubenswrapper[4760]: E0930 07:49:16.175607 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4f6db3f0-59be-48d1-a2a7-8302680b2f4c" containerName="dnsmasq-dns"
Sep 30 07:49:16 crc kubenswrapper[4760]: I0930 07:49:16.175613 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="4f6db3f0-59be-48d1-a2a7-8302680b2f4c" containerName="dnsmasq-dns"
Sep 30 07:49:16 crc kubenswrapper[4760]: I0930 07:49:16.175748 4760 memory_manager.go:354] "RemoveStaleState removing state" podUID="4f6db3f0-59be-48d1-a2a7-8302680b2f4c" containerName="dnsmasq-dns"
Sep 30 07:49:16 crc kubenswrapper[4760]: I0930 07:49:16.176541 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5bf47b49b7-hg9bf"
Sep 30 07:49:16 crc kubenswrapper[4760]: I0930 07:49:16.179664 4760 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-nb"
Sep 30 07:49:16 crc kubenswrapper[4760]: I0930 07:49:16.229976 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5bf47b49b7-hg9bf"]
Sep 30 07:49:16 crc kubenswrapper[4760]: I0930 07:49:16.238450 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-metrics-k6hgt"]
Sep 30 07:49:16 crc kubenswrapper[4760]: I0930 07:49:16.239353 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-metrics-k6hgt"
Sep 30 07:49:16 crc kubenswrapper[4760]: I0930 07:49:16.241660 4760 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-metrics-config"
Sep 30 07:49:16 crc kubenswrapper[4760]: I0930 07:49:16.253781 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-k6hgt"]
Sep 30 07:49:16 crc kubenswrapper[4760]: I0930 07:49:16.283744 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/17fc6d13-b2dc-4732-abc0-99662d7ce38f-dns-svc\") pod \"dnsmasq-dns-5bf47b49b7-hg9bf\" (UID: \"17fc6d13-b2dc-4732-abc0-99662d7ce38f\") " pod="openstack/dnsmasq-dns-5bf47b49b7-hg9bf"
Sep 30 07:49:16 crc kubenswrapper[4760]: I0930 07:49:16.283810 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-79sqs\" (UniqueName: \"kubernetes.io/projected/83918139-1a35-439f-8f7c-cd46d6e21064-kube-api-access-79sqs\") pod \"ovn-controller-metrics-k6hgt\" (UID: \"83918139-1a35-439f-8f7c-cd46d6e21064\") " pod="openstack/ovn-controller-metrics-k6hgt"
Sep 30 07:49:16 crc kubenswrapper[4760]: I0930 07:49:16.283865 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9ncsh\" (UniqueName: \"kubernetes.io/projected/17fc6d13-b2dc-4732-abc0-99662d7ce38f-kube-api-access-9ncsh\") pod \"dnsmasq-dns-5bf47b49b7-hg9bf\" (UID: \"17fc6d13-b2dc-4732-abc0-99662d7ce38f\") " pod="openstack/dnsmasq-dns-5bf47b49b7-hg9bf"
Sep 30 07:49:16 crc kubenswrapper[4760]: I0930 07:49:16.283893 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/83918139-1a35-439f-8f7c-cd46d6e21064-ovs-rundir\") pod \"ovn-controller-metrics-k6hgt\" (UID: \"83918139-1a35-439f-8f7c-cd46d6e21064\") " pod="openstack/ovn-controller-metrics-k6hgt"
Sep 30 07:49:16 crc kubenswrapper[4760]: I0930 07:49:16.283922 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/83918139-1a35-439f-8f7c-cd46d6e21064-combined-ca-bundle\") pod \"ovn-controller-metrics-k6hgt\" (UID: \"83918139-1a35-439f-8f7c-cd46d6e21064\") " pod="openstack/ovn-controller-metrics-k6hgt"
Sep 30 07:49:16 crc kubenswrapper[4760]: I0930 07:49:16.283985 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/17fc6d13-b2dc-4732-abc0-99662d7ce38f-config\") pod \"dnsmasq-dns-5bf47b49b7-hg9bf\" (UID: \"17fc6d13-b2dc-4732-abc0-99662d7ce38f\") " pod="openstack/dnsmasq-dns-5bf47b49b7-hg9bf"
Sep 30 07:49:16 crc kubenswrapper[4760]: I0930 07:49:16.284013 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/17fc6d13-b2dc-4732-abc0-99662d7ce38f-ovsdbserver-nb\") pod \"dnsmasq-dns-5bf47b49b7-hg9bf\" (UID: \"17fc6d13-b2dc-4732-abc0-99662d7ce38f\") " pod="openstack/dnsmasq-dns-5bf47b49b7-hg9bf"
Sep 30 07:49:16 crc kubenswrapper[4760]: I0930 07:49:16.284035 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/83918139-1a35-439f-8f7c-cd46d6e21064-ovn-rundir\") pod \"ovn-controller-metrics-k6hgt\" (UID: \"83918139-1a35-439f-8f7c-cd46d6e21064\") " pod="openstack/ovn-controller-metrics-k6hgt"
Sep 30 07:49:16 crc kubenswrapper[4760]: I0930 07:49:16.284057 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/83918139-1a35-439f-8f7c-cd46d6e21064-config\") pod \"ovn-controller-metrics-k6hgt\" (UID: \"83918139-1a35-439f-8f7c-cd46d6e21064\") " pod="openstack/ovn-controller-metrics-k6hgt"
Sep 30 07:49:16 crc kubenswrapper[4760]: I0930 07:49:16.284093 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/83918139-1a35-439f-8f7c-cd46d6e21064-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-k6hgt\" (UID: \"83918139-1a35-439f-8f7c-cd46d6e21064\") " pod="openstack/ovn-controller-metrics-k6hgt"
Sep 30 07:49:16 crc kubenswrapper[4760]: I0930 07:49:16.341604 4760 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5bf47b49b7-hg9bf"]
Sep 30 07:49:16 crc kubenswrapper[4760]: E0930 07:49:16.342153 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[config dns-svc kube-api-access-9ncsh ovsdbserver-nb], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openstack/dnsmasq-dns-5bf47b49b7-hg9bf" podUID="17fc6d13-b2dc-4732-abc0-99662d7ce38f"
Sep 30 07:49:16 crc kubenswrapper[4760]: I0930 07:49:16.359983 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-8554648995-dr58s"]
Sep 30 07:49:16 crc kubenswrapper[4760]: I0930 07:49:16.361310 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-8554648995-dr58s"
Sep 30 07:49:16 crc kubenswrapper[4760]: I0930 07:49:16.365566 4760 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-sb"
Sep 30 07:49:16 crc kubenswrapper[4760]: I0930 07:49:16.383656 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-8554648995-dr58s"]
Sep 30 07:49:16 crc kubenswrapper[4760]: I0930 07:49:16.386247 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9ncsh\" (UniqueName: \"kubernetes.io/projected/17fc6d13-b2dc-4732-abc0-99662d7ce38f-kube-api-access-9ncsh\") pod \"dnsmasq-dns-5bf47b49b7-hg9bf\" (UID: \"17fc6d13-b2dc-4732-abc0-99662d7ce38f\") " pod="openstack/dnsmasq-dns-5bf47b49b7-hg9bf"
Sep 30 07:49:16 crc kubenswrapper[4760]: I0930 07:49:16.386281 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/83918139-1a35-439f-8f7c-cd46d6e21064-ovs-rundir\") pod \"ovn-controller-metrics-k6hgt\" (UID: \"83918139-1a35-439f-8f7c-cd46d6e21064\") " pod="openstack/ovn-controller-metrics-k6hgt"
Sep 30 07:49:16 crc kubenswrapper[4760]: I0930 07:49:16.386315 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/83918139-1a35-439f-8f7c-cd46d6e21064-combined-ca-bundle\") pod \"ovn-controller-metrics-k6hgt\" (UID: \"83918139-1a35-439f-8f7c-cd46d6e21064\") " pod="openstack/ovn-controller-metrics-k6hgt"
Sep 30 07:49:16 crc kubenswrapper[4760]: I0930 07:49:16.386569 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/83918139-1a35-439f-8f7c-cd46d6e21064-ovs-rundir\") pod \"ovn-controller-metrics-k6hgt\" (UID: \"83918139-1a35-439f-8f7c-cd46d6e21064\") " pod="openstack/ovn-controller-metrics-k6hgt"
Sep 30 07:49:16 crc kubenswrapper[4760]: I0930 07:49:16.386629 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/17fc6d13-b2dc-4732-abc0-99662d7ce38f-config\") pod \"dnsmasq-dns-5bf47b49b7-hg9bf\" (UID: \"17fc6d13-b2dc-4732-abc0-99662d7ce38f\") " pod="openstack/dnsmasq-dns-5bf47b49b7-hg9bf"
Sep 30 07:49:16 crc kubenswrapper[4760]: I0930 07:49:16.386655 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/17fc6d13-b2dc-4732-abc0-99662d7ce38f-ovsdbserver-nb\") pod \"dnsmasq-dns-5bf47b49b7-hg9bf\" (UID: \"17fc6d13-b2dc-4732-abc0-99662d7ce38f\") " pod="openstack/dnsmasq-dns-5bf47b49b7-hg9bf"
Sep 30 07:49:16 crc kubenswrapper[4760]: I0930 07:49:16.386690 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6a2f3f1f-f213-44fe-b1ba-9f35efbe17ad-dns-svc\") pod \"dnsmasq-dns-8554648995-dr58s\" (UID: \"6a2f3f1f-f213-44fe-b1ba-9f35efbe17ad\") " pod="openstack/dnsmasq-dns-8554648995-dr58s"
Sep 30 07:49:16 crc kubenswrapper[4760]: I0930 07:49:16.386710 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/83918139-1a35-439f-8f7c-cd46d6e21064-ovn-rundir\") pod \"ovn-controller-metrics-k6hgt\" (UID: \"83918139-1a35-439f-8f7c-cd46d6e21064\") " pod="openstack/ovn-controller-metrics-k6hgt"
Sep 30 07:49:16 crc kubenswrapper[4760]: I0930 07:49:16.386727 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/83918139-1a35-439f-8f7c-cd46d6e21064-config\") pod \"ovn-controller-metrics-k6hgt\" (UID: \"83918139-1a35-439f-8f7c-cd46d6e21064\") " pod="openstack/ovn-controller-metrics-k6hgt"
Sep 30 07:49:16 crc kubenswrapper[4760]: I0930 07:49:16.386746 4760 reconciler_common.go:245]
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6a2f3f1f-f213-44fe-b1ba-9f35efbe17ad-ovsdbserver-nb\") pod \"dnsmasq-dns-8554648995-dr58s\" (UID: \"6a2f3f1f-f213-44fe-b1ba-9f35efbe17ad\") " pod="openstack/dnsmasq-dns-8554648995-dr58s" Sep 30 07:49:16 crc kubenswrapper[4760]: I0930 07:49:16.386789 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/83918139-1a35-439f-8f7c-cd46d6e21064-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-k6hgt\" (UID: \"83918139-1a35-439f-8f7c-cd46d6e21064\") " pod="openstack/ovn-controller-metrics-k6hgt" Sep 30 07:49:16 crc kubenswrapper[4760]: I0930 07:49:16.386817 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4qmgh\" (UniqueName: \"kubernetes.io/projected/6a2f3f1f-f213-44fe-b1ba-9f35efbe17ad-kube-api-access-4qmgh\") pod \"dnsmasq-dns-8554648995-dr58s\" (UID: \"6a2f3f1f-f213-44fe-b1ba-9f35efbe17ad\") " pod="openstack/dnsmasq-dns-8554648995-dr58s" Sep 30 07:49:16 crc kubenswrapper[4760]: I0930 07:49:16.386853 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6a2f3f1f-f213-44fe-b1ba-9f35efbe17ad-config\") pod \"dnsmasq-dns-8554648995-dr58s\" (UID: \"6a2f3f1f-f213-44fe-b1ba-9f35efbe17ad\") " pod="openstack/dnsmasq-dns-8554648995-dr58s" Sep 30 07:49:16 crc kubenswrapper[4760]: I0930 07:49:16.386921 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/17fc6d13-b2dc-4732-abc0-99662d7ce38f-dns-svc\") pod \"dnsmasq-dns-5bf47b49b7-hg9bf\" (UID: \"17fc6d13-b2dc-4732-abc0-99662d7ce38f\") " pod="openstack/dnsmasq-dns-5bf47b49b7-hg9bf" Sep 30 07:49:16 crc kubenswrapper[4760]: I0930 07:49:16.387617 4760 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/83918139-1a35-439f-8f7c-cd46d6e21064-config\") pod \"ovn-controller-metrics-k6hgt\" (UID: \"83918139-1a35-439f-8f7c-cd46d6e21064\") " pod="openstack/ovn-controller-metrics-k6hgt" Sep 30 07:49:16 crc kubenswrapper[4760]: I0930 07:49:16.388213 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/17fc6d13-b2dc-4732-abc0-99662d7ce38f-config\") pod \"dnsmasq-dns-5bf47b49b7-hg9bf\" (UID: \"17fc6d13-b2dc-4732-abc0-99662d7ce38f\") " pod="openstack/dnsmasq-dns-5bf47b49b7-hg9bf" Sep 30 07:49:16 crc kubenswrapper[4760]: I0930 07:49:16.393389 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/17fc6d13-b2dc-4732-abc0-99662d7ce38f-ovsdbserver-nb\") pod \"dnsmasq-dns-5bf47b49b7-hg9bf\" (UID: \"17fc6d13-b2dc-4732-abc0-99662d7ce38f\") " pod="openstack/dnsmasq-dns-5bf47b49b7-hg9bf" Sep 30 07:49:16 crc kubenswrapper[4760]: I0930 07:49:16.393619 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/83918139-1a35-439f-8f7c-cd46d6e21064-ovn-rundir\") pod \"ovn-controller-metrics-k6hgt\" (UID: \"83918139-1a35-439f-8f7c-cd46d6e21064\") " pod="openstack/ovn-controller-metrics-k6hgt" Sep 30 07:49:16 crc kubenswrapper[4760]: I0930 07:49:16.395118 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/83918139-1a35-439f-8f7c-cd46d6e21064-combined-ca-bundle\") pod \"ovn-controller-metrics-k6hgt\" (UID: \"83918139-1a35-439f-8f7c-cd46d6e21064\") " pod="openstack/ovn-controller-metrics-k6hgt" Sep 30 07:49:16 crc kubenswrapper[4760]: I0930 07:49:16.400090 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-79sqs\" (UniqueName: 
\"kubernetes.io/projected/83918139-1a35-439f-8f7c-cd46d6e21064-kube-api-access-79sqs\") pod \"ovn-controller-metrics-k6hgt\" (UID: \"83918139-1a35-439f-8f7c-cd46d6e21064\") " pod="openstack/ovn-controller-metrics-k6hgt" Sep 30 07:49:16 crc kubenswrapper[4760]: I0930 07:49:16.400221 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6a2f3f1f-f213-44fe-b1ba-9f35efbe17ad-ovsdbserver-sb\") pod \"dnsmasq-dns-8554648995-dr58s\" (UID: \"6a2f3f1f-f213-44fe-b1ba-9f35efbe17ad\") " pod="openstack/dnsmasq-dns-8554648995-dr58s" Sep 30 07:49:16 crc kubenswrapper[4760]: I0930 07:49:16.403826 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/83918139-1a35-439f-8f7c-cd46d6e21064-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-k6hgt\" (UID: \"83918139-1a35-439f-8f7c-cd46d6e21064\") " pod="openstack/ovn-controller-metrics-k6hgt" Sep 30 07:49:16 crc kubenswrapper[4760]: I0930 07:49:16.405828 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/17fc6d13-b2dc-4732-abc0-99662d7ce38f-dns-svc\") pod \"dnsmasq-dns-5bf47b49b7-hg9bf\" (UID: \"17fc6d13-b2dc-4732-abc0-99662d7ce38f\") " pod="openstack/dnsmasq-dns-5bf47b49b7-hg9bf" Sep 30 07:49:16 crc kubenswrapper[4760]: I0930 07:49:16.408335 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-northd-0"] Sep 30 07:49:16 crc kubenswrapper[4760]: I0930 07:49:16.409565 4760 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-northd-0" Sep 30 07:49:16 crc kubenswrapper[4760]: I0930 07:49:16.433403 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovnnorthd-ovnnorthd-dockercfg-b87j9" Sep 30 07:49:16 crc kubenswrapper[4760]: I0930 07:49:16.434031 4760 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-config" Sep 30 07:49:16 crc kubenswrapper[4760]: I0930 07:49:16.434281 4760 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-scripts" Sep 30 07:49:16 crc kubenswrapper[4760]: I0930 07:49:16.437824 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9ncsh\" (UniqueName: \"kubernetes.io/projected/17fc6d13-b2dc-4732-abc0-99662d7ce38f-kube-api-access-9ncsh\") pod \"dnsmasq-dns-5bf47b49b7-hg9bf\" (UID: \"17fc6d13-b2dc-4732-abc0-99662d7ce38f\") " pod="openstack/dnsmasq-dns-5bf47b49b7-hg9bf" Sep 30 07:49:16 crc kubenswrapper[4760]: I0930 07:49:16.452031 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovnnorthd-ovndbs" Sep 30 07:49:16 crc kubenswrapper[4760]: I0930 07:49:16.476701 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-79sqs\" (UniqueName: \"kubernetes.io/projected/83918139-1a35-439f-8f7c-cd46d6e21064-kube-api-access-79sqs\") pod \"ovn-controller-metrics-k6hgt\" (UID: \"83918139-1a35-439f-8f7c-cd46d6e21064\") " pod="openstack/ovn-controller-metrics-k6hgt" Sep 30 07:49:16 crc kubenswrapper[4760]: I0930 07:49:16.489136 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Sep 30 07:49:16 crc kubenswrapper[4760]: I0930 07:49:16.506801 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/96ee48d1-c16e-4367-9159-0f9ddaf5e66a-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: 
\"96ee48d1-c16e-4367-9159-0f9ddaf5e66a\") " pod="openstack/ovn-northd-0" Sep 30 07:49:16 crc kubenswrapper[4760]: I0930 07:49:16.506872 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7vf26\" (UniqueName: \"kubernetes.io/projected/96ee48d1-c16e-4367-9159-0f9ddaf5e66a-kube-api-access-7vf26\") pod \"ovn-northd-0\" (UID: \"96ee48d1-c16e-4367-9159-0f9ddaf5e66a\") " pod="openstack/ovn-northd-0" Sep 30 07:49:16 crc kubenswrapper[4760]: I0930 07:49:16.506922 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6a2f3f1f-f213-44fe-b1ba-9f35efbe17ad-ovsdbserver-sb\") pod \"dnsmasq-dns-8554648995-dr58s\" (UID: \"6a2f3f1f-f213-44fe-b1ba-9f35efbe17ad\") " pod="openstack/dnsmasq-dns-8554648995-dr58s" Sep 30 07:49:16 crc kubenswrapper[4760]: I0930 07:49:16.506963 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/96ee48d1-c16e-4367-9159-0f9ddaf5e66a-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"96ee48d1-c16e-4367-9159-0f9ddaf5e66a\") " pod="openstack/ovn-northd-0" Sep 30 07:49:16 crc kubenswrapper[4760]: I0930 07:49:16.506984 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6a2f3f1f-f213-44fe-b1ba-9f35efbe17ad-dns-svc\") pod \"dnsmasq-dns-8554648995-dr58s\" (UID: \"6a2f3f1f-f213-44fe-b1ba-9f35efbe17ad\") " pod="openstack/dnsmasq-dns-8554648995-dr58s" Sep 30 07:49:16 crc kubenswrapper[4760]: I0930 07:49:16.507002 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/96ee48d1-c16e-4367-9159-0f9ddaf5e66a-scripts\") pod \"ovn-northd-0\" (UID: \"96ee48d1-c16e-4367-9159-0f9ddaf5e66a\") " pod="openstack/ovn-northd-0" Sep 30 07:49:16 crc 
kubenswrapper[4760]: I0930 07:49:16.507016 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6a2f3f1f-f213-44fe-b1ba-9f35efbe17ad-ovsdbserver-nb\") pod \"dnsmasq-dns-8554648995-dr58s\" (UID: \"6a2f3f1f-f213-44fe-b1ba-9f35efbe17ad\") " pod="openstack/dnsmasq-dns-8554648995-dr58s" Sep 30 07:49:16 crc kubenswrapper[4760]: I0930 07:49:16.507041 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/96ee48d1-c16e-4367-9159-0f9ddaf5e66a-config\") pod \"ovn-northd-0\" (UID: \"96ee48d1-c16e-4367-9159-0f9ddaf5e66a\") " pod="openstack/ovn-northd-0" Sep 30 07:49:16 crc kubenswrapper[4760]: I0930 07:49:16.507064 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4qmgh\" (UniqueName: \"kubernetes.io/projected/6a2f3f1f-f213-44fe-b1ba-9f35efbe17ad-kube-api-access-4qmgh\") pod \"dnsmasq-dns-8554648995-dr58s\" (UID: \"6a2f3f1f-f213-44fe-b1ba-9f35efbe17ad\") " pod="openstack/dnsmasq-dns-8554648995-dr58s" Sep 30 07:49:16 crc kubenswrapper[4760]: I0930 07:49:16.507083 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/96ee48d1-c16e-4367-9159-0f9ddaf5e66a-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"96ee48d1-c16e-4367-9159-0f9ddaf5e66a\") " pod="openstack/ovn-northd-0" Sep 30 07:49:16 crc kubenswrapper[4760]: I0930 07:49:16.507102 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6a2f3f1f-f213-44fe-b1ba-9f35efbe17ad-config\") pod \"dnsmasq-dns-8554648995-dr58s\" (UID: \"6a2f3f1f-f213-44fe-b1ba-9f35efbe17ad\") " pod="openstack/dnsmasq-dns-8554648995-dr58s" Sep 30 07:49:16 crc kubenswrapper[4760]: I0930 07:49:16.507117 4760 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/96ee48d1-c16e-4367-9159-0f9ddaf5e66a-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"96ee48d1-c16e-4367-9159-0f9ddaf5e66a\") " pod="openstack/ovn-northd-0" Sep 30 07:49:16 crc kubenswrapper[4760]: I0930 07:49:16.507958 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6a2f3f1f-f213-44fe-b1ba-9f35efbe17ad-ovsdbserver-sb\") pod \"dnsmasq-dns-8554648995-dr58s\" (UID: \"6a2f3f1f-f213-44fe-b1ba-9f35efbe17ad\") " pod="openstack/dnsmasq-dns-8554648995-dr58s" Sep 30 07:49:16 crc kubenswrapper[4760]: I0930 07:49:16.508732 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6a2f3f1f-f213-44fe-b1ba-9f35efbe17ad-ovsdbserver-nb\") pod \"dnsmasq-dns-8554648995-dr58s\" (UID: \"6a2f3f1f-f213-44fe-b1ba-9f35efbe17ad\") " pod="openstack/dnsmasq-dns-8554648995-dr58s" Sep 30 07:49:16 crc kubenswrapper[4760]: I0930 07:49:16.513886 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6a2f3f1f-f213-44fe-b1ba-9f35efbe17ad-dns-svc\") pod \"dnsmasq-dns-8554648995-dr58s\" (UID: \"6a2f3f1f-f213-44fe-b1ba-9f35efbe17ad\") " pod="openstack/dnsmasq-dns-8554648995-dr58s" Sep 30 07:49:16 crc kubenswrapper[4760]: I0930 07:49:16.513904 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6a2f3f1f-f213-44fe-b1ba-9f35efbe17ad-config\") pod \"dnsmasq-dns-8554648995-dr58s\" (UID: \"6a2f3f1f-f213-44fe-b1ba-9f35efbe17ad\") " pod="openstack/dnsmasq-dns-8554648995-dr58s" Sep 30 07:49:16 crc kubenswrapper[4760]: I0930 07:49:16.558097 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4qmgh\" (UniqueName: 
\"kubernetes.io/projected/6a2f3f1f-f213-44fe-b1ba-9f35efbe17ad-kube-api-access-4qmgh\") pod \"dnsmasq-dns-8554648995-dr58s\" (UID: \"6a2f3f1f-f213-44fe-b1ba-9f35efbe17ad\") " pod="openstack/dnsmasq-dns-8554648995-dr58s" Sep 30 07:49:16 crc kubenswrapper[4760]: I0930 07:49:16.560295 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-metrics-k6hgt" Sep 30 07:49:16 crc kubenswrapper[4760]: I0930 07:49:16.608094 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/96ee48d1-c16e-4367-9159-0f9ddaf5e66a-scripts\") pod \"ovn-northd-0\" (UID: \"96ee48d1-c16e-4367-9159-0f9ddaf5e66a\") " pod="openstack/ovn-northd-0" Sep 30 07:49:16 crc kubenswrapper[4760]: I0930 07:49:16.608411 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/96ee48d1-c16e-4367-9159-0f9ddaf5e66a-config\") pod \"ovn-northd-0\" (UID: \"96ee48d1-c16e-4367-9159-0f9ddaf5e66a\") " pod="openstack/ovn-northd-0" Sep 30 07:49:16 crc kubenswrapper[4760]: I0930 07:49:16.608448 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/96ee48d1-c16e-4367-9159-0f9ddaf5e66a-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"96ee48d1-c16e-4367-9159-0f9ddaf5e66a\") " pod="openstack/ovn-northd-0" Sep 30 07:49:16 crc kubenswrapper[4760]: I0930 07:49:16.608463 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/96ee48d1-c16e-4367-9159-0f9ddaf5e66a-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"96ee48d1-c16e-4367-9159-0f9ddaf5e66a\") " pod="openstack/ovn-northd-0" Sep 30 07:49:16 crc kubenswrapper[4760]: I0930 07:49:16.608492 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/96ee48d1-c16e-4367-9159-0f9ddaf5e66a-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"96ee48d1-c16e-4367-9159-0f9ddaf5e66a\") " pod="openstack/ovn-northd-0" Sep 30 07:49:16 crc kubenswrapper[4760]: I0930 07:49:16.608530 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7vf26\" (UniqueName: \"kubernetes.io/projected/96ee48d1-c16e-4367-9159-0f9ddaf5e66a-kube-api-access-7vf26\") pod \"ovn-northd-0\" (UID: \"96ee48d1-c16e-4367-9159-0f9ddaf5e66a\") " pod="openstack/ovn-northd-0" Sep 30 07:49:16 crc kubenswrapper[4760]: I0930 07:49:16.608588 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/96ee48d1-c16e-4367-9159-0f9ddaf5e66a-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"96ee48d1-c16e-4367-9159-0f9ddaf5e66a\") " pod="openstack/ovn-northd-0" Sep 30 07:49:16 crc kubenswrapper[4760]: I0930 07:49:16.609082 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/96ee48d1-c16e-4367-9159-0f9ddaf5e66a-scripts\") pod \"ovn-northd-0\" (UID: \"96ee48d1-c16e-4367-9159-0f9ddaf5e66a\") " pod="openstack/ovn-northd-0" Sep 30 07:49:16 crc kubenswrapper[4760]: I0930 07:49:16.609394 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/96ee48d1-c16e-4367-9159-0f9ddaf5e66a-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"96ee48d1-c16e-4367-9159-0f9ddaf5e66a\") " pod="openstack/ovn-northd-0" Sep 30 07:49:16 crc kubenswrapper[4760]: I0930 07:49:16.609536 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/96ee48d1-c16e-4367-9159-0f9ddaf5e66a-config\") pod \"ovn-northd-0\" (UID: \"96ee48d1-c16e-4367-9159-0f9ddaf5e66a\") " pod="openstack/ovn-northd-0" Sep 30 07:49:16 crc kubenswrapper[4760]: I0930 07:49:16.618133 
4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/96ee48d1-c16e-4367-9159-0f9ddaf5e66a-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"96ee48d1-c16e-4367-9159-0f9ddaf5e66a\") " pod="openstack/ovn-northd-0" Sep 30 07:49:16 crc kubenswrapper[4760]: I0930 07:49:16.618814 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/96ee48d1-c16e-4367-9159-0f9ddaf5e66a-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"96ee48d1-c16e-4367-9159-0f9ddaf5e66a\") " pod="openstack/ovn-northd-0" Sep 30 07:49:16 crc kubenswrapper[4760]: I0930 07:49:16.631953 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/96ee48d1-c16e-4367-9159-0f9ddaf5e66a-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"96ee48d1-c16e-4367-9159-0f9ddaf5e66a\") " pod="openstack/ovn-northd-0" Sep 30 07:49:16 crc kubenswrapper[4760]: I0930 07:49:16.637215 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7vf26\" (UniqueName: \"kubernetes.io/projected/96ee48d1-c16e-4367-9159-0f9ddaf5e66a-kube-api-access-7vf26\") pod \"ovn-northd-0\" (UID: \"96ee48d1-c16e-4367-9159-0f9ddaf5e66a\") " pod="openstack/ovn-northd-0" Sep 30 07:49:16 crc kubenswrapper[4760]: I0930 07:49:16.695099 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-8554648995-dr58s" Sep 30 07:49:16 crc kubenswrapper[4760]: I0930 07:49:16.843548 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5bf47b49b7-hg9bf" Sep 30 07:49:16 crc kubenswrapper[4760]: I0930 07:49:16.861109 4760 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5bf47b49b7-hg9bf" Sep 30 07:49:16 crc kubenswrapper[4760]: I0930 07:49:16.861341 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0" Sep 30 07:49:16 crc kubenswrapper[4760]: I0930 07:49:16.912852 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/17fc6d13-b2dc-4732-abc0-99662d7ce38f-config\") pod \"17fc6d13-b2dc-4732-abc0-99662d7ce38f\" (UID: \"17fc6d13-b2dc-4732-abc0-99662d7ce38f\") " Sep 30 07:49:16 crc kubenswrapper[4760]: I0930 07:49:16.913172 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/17fc6d13-b2dc-4732-abc0-99662d7ce38f-dns-svc\") pod \"17fc6d13-b2dc-4732-abc0-99662d7ce38f\" (UID: \"17fc6d13-b2dc-4732-abc0-99662d7ce38f\") " Sep 30 07:49:16 crc kubenswrapper[4760]: I0930 07:49:16.913192 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9ncsh\" (UniqueName: \"kubernetes.io/projected/17fc6d13-b2dc-4732-abc0-99662d7ce38f-kube-api-access-9ncsh\") pod \"17fc6d13-b2dc-4732-abc0-99662d7ce38f\" (UID: \"17fc6d13-b2dc-4732-abc0-99662d7ce38f\") " Sep 30 07:49:16 crc kubenswrapper[4760]: I0930 07:49:16.913235 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/17fc6d13-b2dc-4732-abc0-99662d7ce38f-ovsdbserver-nb\") pod \"17fc6d13-b2dc-4732-abc0-99662d7ce38f\" (UID: \"17fc6d13-b2dc-4732-abc0-99662d7ce38f\") " Sep 30 07:49:16 crc kubenswrapper[4760]: I0930 07:49:16.913784 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/17fc6d13-b2dc-4732-abc0-99662d7ce38f-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "17fc6d13-b2dc-4732-abc0-99662d7ce38f" (UID: "17fc6d13-b2dc-4732-abc0-99662d7ce38f"). 
InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 07:49:16 crc kubenswrapper[4760]: I0930 07:49:16.914180 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/17fc6d13-b2dc-4732-abc0-99662d7ce38f-config" (OuterVolumeSpecName: "config") pod "17fc6d13-b2dc-4732-abc0-99662d7ce38f" (UID: "17fc6d13-b2dc-4732-abc0-99662d7ce38f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 07:49:16 crc kubenswrapper[4760]: I0930 07:49:16.914254 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/17fc6d13-b2dc-4732-abc0-99662d7ce38f-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "17fc6d13-b2dc-4732-abc0-99662d7ce38f" (UID: "17fc6d13-b2dc-4732-abc0-99662d7ce38f"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 07:49:16 crc kubenswrapper[4760]: I0930 07:49:16.915619 4760 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/17fc6d13-b2dc-4732-abc0-99662d7ce38f-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Sep 30 07:49:16 crc kubenswrapper[4760]: I0930 07:49:16.915634 4760 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/17fc6d13-b2dc-4732-abc0-99662d7ce38f-config\") on node \"crc\" DevicePath \"\"" Sep 30 07:49:16 crc kubenswrapper[4760]: I0930 07:49:16.915644 4760 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/17fc6d13-b2dc-4732-abc0-99662d7ce38f-dns-svc\") on node \"crc\" DevicePath \"\"" Sep 30 07:49:16 crc kubenswrapper[4760]: I0930 07:49:16.918599 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/17fc6d13-b2dc-4732-abc0-99662d7ce38f-kube-api-access-9ncsh" (OuterVolumeSpecName: "kube-api-access-9ncsh") pod 
"17fc6d13-b2dc-4732-abc0-99662d7ce38f" (UID: "17fc6d13-b2dc-4732-abc0-99662d7ce38f"). InnerVolumeSpecName "kube-api-access-9ncsh". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 07:49:16 crc kubenswrapper[4760]: I0930 07:49:16.991528 4760 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-galera-0" Sep 30 07:49:16 crc kubenswrapper[4760]: I0930 07:49:16.991560 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-galera-0" Sep 30 07:49:17 crc kubenswrapper[4760]: I0930 07:49:17.017236 4760 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9ncsh\" (UniqueName: \"kubernetes.io/projected/17fc6d13-b2dc-4732-abc0-99662d7ce38f-kube-api-access-9ncsh\") on node \"crc\" DevicePath \"\"" Sep 30 07:49:17 crc kubenswrapper[4760]: I0930 07:49:17.091146 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-k6hgt"] Sep 30 07:49:17 crc kubenswrapper[4760]: I0930 07:49:17.214548 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-8554648995-dr58s"] Sep 30 07:49:17 crc kubenswrapper[4760]: I0930 07:49:17.362569 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Sep 30 07:49:17 crc kubenswrapper[4760]: W0930 07:49:17.373491 4760 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod96ee48d1_c16e_4367_9159_0f9ddaf5e66a.slice/crio-ad9ee1f92205ba1096caa0dcaf8642a103abc2d2ecd3f25ca39d04c566b55a71 WatchSource:0}: Error finding container ad9ee1f92205ba1096caa0dcaf8642a103abc2d2ecd3f25ca39d04c566b55a71: Status 404 returned error can't find the container with id ad9ee1f92205ba1096caa0dcaf8642a103abc2d2ecd3f25ca39d04c566b55a71 Sep 30 07:49:17 crc kubenswrapper[4760]: I0930 07:49:17.690521 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-cell1-galera-0" 
Sep 30 07:49:17 crc kubenswrapper[4760]: I0930 07:49:17.690816 4760 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-cell1-galera-0" Sep 30 07:49:17 crc kubenswrapper[4760]: I0930 07:49:17.863533 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-k6hgt" event={"ID":"83918139-1a35-439f-8f7c-cd46d6e21064","Type":"ContainerStarted","Data":"1328cb699a21dcc95ebfb30920c755c23f13628b2cf768c7c0feeba0dfca599f"} Sep 30 07:49:17 crc kubenswrapper[4760]: I0930 07:49:17.863583 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-k6hgt" event={"ID":"83918139-1a35-439f-8f7c-cd46d6e21064","Type":"ContainerStarted","Data":"b034dff85a9e68ef7fd78bc8c573e62952f46a623c0f9260f21e72bc7f4166b1"} Sep 30 07:49:17 crc kubenswrapper[4760]: I0930 07:49:17.867793 4760 generic.go:334] "Generic (PLEG): container finished" podID="6a2f3f1f-f213-44fe-b1ba-9f35efbe17ad" containerID="48be93d0fea046f3d8ada9837f967d2d4455be9eb009f74eef391a84dce05a7a" exitCode=0 Sep 30 07:49:17 crc kubenswrapper[4760]: I0930 07:49:17.867856 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8554648995-dr58s" event={"ID":"6a2f3f1f-f213-44fe-b1ba-9f35efbe17ad","Type":"ContainerDied","Data":"48be93d0fea046f3d8ada9837f967d2d4455be9eb009f74eef391a84dce05a7a"} Sep 30 07:49:17 crc kubenswrapper[4760]: I0930 07:49:17.867877 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8554648995-dr58s" event={"ID":"6a2f3f1f-f213-44fe-b1ba-9f35efbe17ad","Type":"ContainerStarted","Data":"4ca6c3db63b637663caee10eed5308e42612df1611a7c4425b739e9e9ed18188"} Sep 30 07:49:17 crc kubenswrapper[4760]: I0930 07:49:17.871525 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"96ee48d1-c16e-4367-9159-0f9ddaf5e66a","Type":"ContainerStarted","Data":"ad9ee1f92205ba1096caa0dcaf8642a103abc2d2ecd3f25ca39d04c566b55a71"} Sep 30 
07:49:17 crc kubenswrapper[4760]: I0930 07:49:17.871688 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5bf47b49b7-hg9bf" Sep 30 07:49:17 crc kubenswrapper[4760]: I0930 07:49:17.887167 4760 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-metrics-k6hgt" podStartSLOduration=1.887139734 podStartE2EDuration="1.887139734s" podCreationTimestamp="2025-09-30 07:49:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 07:49:17.886613631 +0000 UTC m=+943.529520043" watchObservedRunningTime="2025-09-30 07:49:17.887139734 +0000 UTC m=+943.530046186" Sep 30 07:49:17 crc kubenswrapper[4760]: I0930 07:49:17.963161 4760 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5bf47b49b7-hg9bf"] Sep 30 07:49:17 crc kubenswrapper[4760]: I0930 07:49:17.980877 4760 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5bf47b49b7-hg9bf"] Sep 30 07:49:18 crc kubenswrapper[4760]: I0930 07:49:18.217041 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/memcached-0" Sep 30 07:49:18 crc kubenswrapper[4760]: I0930 07:49:18.884770 4760 generic.go:334] "Generic (PLEG): container finished" podID="007888b6-d5c6-410a-955a-ed78adf759bd" containerID="872fb383f092e1b7cf31bdcc5e21d2fae1b56e52665f0a314c9ddddbad403eef" exitCode=0 Sep 30 07:49:18 crc kubenswrapper[4760]: I0930 07:49:18.885416 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"007888b6-d5c6-410a-955a-ed78adf759bd","Type":"ContainerDied","Data":"872fb383f092e1b7cf31bdcc5e21d2fae1b56e52665f0a314c9ddddbad403eef"} Sep 30 07:49:18 crc kubenswrapper[4760]: I0930 07:49:18.901148 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8554648995-dr58s" 
event={"ID":"6a2f3f1f-f213-44fe-b1ba-9f35efbe17ad","Type":"ContainerStarted","Data":"b2bb2029864022dfad325772a095e2328e30a0a3c9ffd58158203cedd2b8bef6"} Sep 30 07:49:18 crc kubenswrapper[4760]: I0930 07:49:18.974138 4760 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-8554648995-dr58s" podStartSLOduration=2.974118854 podStartE2EDuration="2.974118854s" podCreationTimestamp="2025-09-30 07:49:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 07:49:18.967548107 +0000 UTC m=+944.610454509" watchObservedRunningTime="2025-09-30 07:49:18.974118854 +0000 UTC m=+944.617025266" Sep 30 07:49:19 crc kubenswrapper[4760]: I0930 07:49:19.082762 4760 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="17fc6d13-b2dc-4732-abc0-99662d7ce38f" path="/var/lib/kubelet/pods/17fc6d13-b2dc-4732-abc0-99662d7ce38f/volumes" Sep 30 07:49:19 crc kubenswrapper[4760]: I0930 07:49:19.113202 4760 patch_prober.go:28] interesting pod/machine-config-daemon-f2lrk container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 07:49:19 crc kubenswrapper[4760]: I0930 07:49:19.113680 4760 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-f2lrk" podUID="7a9c8270-6964-4886-87d0-227b05b76da4" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 30 07:49:19 crc kubenswrapper[4760]: I0930 07:49:19.113737 4760 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-f2lrk" Sep 30 07:49:19 crc kubenswrapper[4760]: I0930 07:49:19.114595 4760 
kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"9ca37c299871442165175723aa160e44f88bbaa555a2a12583d4390e53262571"} pod="openshift-machine-config-operator/machine-config-daemon-f2lrk" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Sep 30 07:49:19 crc kubenswrapper[4760]: I0930 07:49:19.114656 4760 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-f2lrk" podUID="7a9c8270-6964-4886-87d0-227b05b76da4" containerName="machine-config-daemon" containerID="cri-o://9ca37c299871442165175723aa160e44f88bbaa555a2a12583d4390e53262571" gracePeriod=600 Sep 30 07:49:19 crc kubenswrapper[4760]: I0930 07:49:19.817729 4760 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-8554648995-dr58s"] Sep 30 07:49:19 crc kubenswrapper[4760]: I0930 07:49:19.842492 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-b8fbc5445-mgg5f"] Sep 30 07:49:19 crc kubenswrapper[4760]: I0930 07:49:19.843692 4760 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-b8fbc5445-mgg5f" Sep 30 07:49:19 crc kubenswrapper[4760]: I0930 07:49:19.860440 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Sep 30 07:49:19 crc kubenswrapper[4760]: I0930 07:49:19.871503 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-b8fbc5445-mgg5f"] Sep 30 07:49:19 crc kubenswrapper[4760]: I0930 07:49:19.913066 4760 generic.go:334] "Generic (PLEG): container finished" podID="7a9c8270-6964-4886-87d0-227b05b76da4" containerID="9ca37c299871442165175723aa160e44f88bbaa555a2a12583d4390e53262571" exitCode=0 Sep 30 07:49:19 crc kubenswrapper[4760]: I0930 07:49:19.914351 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-f2lrk" event={"ID":"7a9c8270-6964-4886-87d0-227b05b76da4","Type":"ContainerDied","Data":"9ca37c299871442165175723aa160e44f88bbaa555a2a12583d4390e53262571"} Sep 30 07:49:19 crc kubenswrapper[4760]: I0930 07:49:19.914488 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-f2lrk" event={"ID":"7a9c8270-6964-4886-87d0-227b05b76da4","Type":"ContainerStarted","Data":"e08631326d3db4f9e31ecd2756775d73d9783f49875cc3f66b5e516f36754f34"} Sep 30 07:49:19 crc kubenswrapper[4760]: I0930 07:49:19.914729 4760 scope.go:117] "RemoveContainer" containerID="2ee85f916ed74821bb70e759d7116d1ced5e1cd63215791b862f6d48359d7b6c" Sep 30 07:49:19 crc kubenswrapper[4760]: I0930 07:49:19.917473 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"96ee48d1-c16e-4367-9159-0f9ddaf5e66a","Type":"ContainerStarted","Data":"a41ee430afb4a4ae4b1c1a75b657ea479f873e5f16183d6a36c9f9507c4eb857"} Sep 30 07:49:19 crc kubenswrapper[4760]: I0930 07:49:19.917574 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" 
event={"ID":"96ee48d1-c16e-4367-9159-0f9ddaf5e66a","Type":"ContainerStarted","Data":"07fcbdb99c9a0462c708a61cbdb278742a66f8f1505ad59ffac0f74114a42d6b"} Sep 30 07:49:19 crc kubenswrapper[4760]: I0930 07:49:19.917638 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-8554648995-dr58s" Sep 30 07:49:19 crc kubenswrapper[4760]: I0930 07:49:19.917695 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-northd-0" Sep 30 07:49:19 crc kubenswrapper[4760]: I0930 07:49:19.980057 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2fbea500-bbb8-4691-817f-e06a76aba29a-dns-svc\") pod \"dnsmasq-dns-b8fbc5445-mgg5f\" (UID: \"2fbea500-bbb8-4691-817f-e06a76aba29a\") " pod="openstack/dnsmasq-dns-b8fbc5445-mgg5f" Sep 30 07:49:19 crc kubenswrapper[4760]: I0930 07:49:19.980355 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2fbea500-bbb8-4691-817f-e06a76aba29a-ovsdbserver-nb\") pod \"dnsmasq-dns-b8fbc5445-mgg5f\" (UID: \"2fbea500-bbb8-4691-817f-e06a76aba29a\") " pod="openstack/dnsmasq-dns-b8fbc5445-mgg5f" Sep 30 07:49:19 crc kubenswrapper[4760]: I0930 07:49:19.980463 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2fbea500-bbb8-4691-817f-e06a76aba29a-config\") pod \"dnsmasq-dns-b8fbc5445-mgg5f\" (UID: \"2fbea500-bbb8-4691-817f-e06a76aba29a\") " pod="openstack/dnsmasq-dns-b8fbc5445-mgg5f" Sep 30 07:49:19 crc kubenswrapper[4760]: I0930 07:49:19.980482 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2fbea500-bbb8-4691-817f-e06a76aba29a-ovsdbserver-sb\") pod \"dnsmasq-dns-b8fbc5445-mgg5f\" (UID: 
\"2fbea500-bbb8-4691-817f-e06a76aba29a\") " pod="openstack/dnsmasq-dns-b8fbc5445-mgg5f" Sep 30 07:49:19 crc kubenswrapper[4760]: I0930 07:49:19.980520 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r6rxv\" (UniqueName: \"kubernetes.io/projected/2fbea500-bbb8-4691-817f-e06a76aba29a-kube-api-access-r6rxv\") pod \"dnsmasq-dns-b8fbc5445-mgg5f\" (UID: \"2fbea500-bbb8-4691-817f-e06a76aba29a\") " pod="openstack/dnsmasq-dns-b8fbc5445-mgg5f" Sep 30 07:49:19 crc kubenswrapper[4760]: I0930 07:49:19.984351 4760 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-northd-0" podStartSLOduration=2.792674206 podStartE2EDuration="3.984333846s" podCreationTimestamp="2025-09-30 07:49:16 +0000 UTC" firstStartedPulling="2025-09-30 07:49:17.375607259 +0000 UTC m=+943.018513671" lastFinishedPulling="2025-09-30 07:49:18.567266899 +0000 UTC m=+944.210173311" observedRunningTime="2025-09-30 07:49:19.977926822 +0000 UTC m=+945.620833234" watchObservedRunningTime="2025-09-30 07:49:19.984333846 +0000 UTC m=+945.627240258" Sep 30 07:49:20 crc kubenswrapper[4760]: I0930 07:49:20.081359 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r6rxv\" (UniqueName: \"kubernetes.io/projected/2fbea500-bbb8-4691-817f-e06a76aba29a-kube-api-access-r6rxv\") pod \"dnsmasq-dns-b8fbc5445-mgg5f\" (UID: \"2fbea500-bbb8-4691-817f-e06a76aba29a\") " pod="openstack/dnsmasq-dns-b8fbc5445-mgg5f" Sep 30 07:49:20 crc kubenswrapper[4760]: I0930 07:49:20.081436 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2fbea500-bbb8-4691-817f-e06a76aba29a-dns-svc\") pod \"dnsmasq-dns-b8fbc5445-mgg5f\" (UID: \"2fbea500-bbb8-4691-817f-e06a76aba29a\") " pod="openstack/dnsmasq-dns-b8fbc5445-mgg5f" Sep 30 07:49:20 crc kubenswrapper[4760]: I0930 07:49:20.081521 4760 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2fbea500-bbb8-4691-817f-e06a76aba29a-ovsdbserver-nb\") pod \"dnsmasq-dns-b8fbc5445-mgg5f\" (UID: \"2fbea500-bbb8-4691-817f-e06a76aba29a\") " pod="openstack/dnsmasq-dns-b8fbc5445-mgg5f" Sep 30 07:49:20 crc kubenswrapper[4760]: I0930 07:49:20.081682 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2fbea500-bbb8-4691-817f-e06a76aba29a-config\") pod \"dnsmasq-dns-b8fbc5445-mgg5f\" (UID: \"2fbea500-bbb8-4691-817f-e06a76aba29a\") " pod="openstack/dnsmasq-dns-b8fbc5445-mgg5f" Sep 30 07:49:20 crc kubenswrapper[4760]: I0930 07:49:20.081703 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2fbea500-bbb8-4691-817f-e06a76aba29a-ovsdbserver-sb\") pod \"dnsmasq-dns-b8fbc5445-mgg5f\" (UID: \"2fbea500-bbb8-4691-817f-e06a76aba29a\") " pod="openstack/dnsmasq-dns-b8fbc5445-mgg5f" Sep 30 07:49:20 crc kubenswrapper[4760]: I0930 07:49:20.083368 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2fbea500-bbb8-4691-817f-e06a76aba29a-dns-svc\") pod \"dnsmasq-dns-b8fbc5445-mgg5f\" (UID: \"2fbea500-bbb8-4691-817f-e06a76aba29a\") " pod="openstack/dnsmasq-dns-b8fbc5445-mgg5f" Sep 30 07:49:20 crc kubenswrapper[4760]: I0930 07:49:20.084617 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2fbea500-bbb8-4691-817f-e06a76aba29a-ovsdbserver-nb\") pod \"dnsmasq-dns-b8fbc5445-mgg5f\" (UID: \"2fbea500-bbb8-4691-817f-e06a76aba29a\") " pod="openstack/dnsmasq-dns-b8fbc5445-mgg5f" Sep 30 07:49:20 crc kubenswrapper[4760]: I0930 07:49:20.085519 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: 
\"kubernetes.io/configmap/2fbea500-bbb8-4691-817f-e06a76aba29a-ovsdbserver-sb\") pod \"dnsmasq-dns-b8fbc5445-mgg5f\" (UID: \"2fbea500-bbb8-4691-817f-e06a76aba29a\") " pod="openstack/dnsmasq-dns-b8fbc5445-mgg5f" Sep 30 07:49:20 crc kubenswrapper[4760]: I0930 07:49:20.085540 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2fbea500-bbb8-4691-817f-e06a76aba29a-config\") pod \"dnsmasq-dns-b8fbc5445-mgg5f\" (UID: \"2fbea500-bbb8-4691-817f-e06a76aba29a\") " pod="openstack/dnsmasq-dns-b8fbc5445-mgg5f" Sep 30 07:49:20 crc kubenswrapper[4760]: I0930 07:49:20.118427 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r6rxv\" (UniqueName: \"kubernetes.io/projected/2fbea500-bbb8-4691-817f-e06a76aba29a-kube-api-access-r6rxv\") pod \"dnsmasq-dns-b8fbc5445-mgg5f\" (UID: \"2fbea500-bbb8-4691-817f-e06a76aba29a\") " pod="openstack/dnsmasq-dns-b8fbc5445-mgg5f" Sep 30 07:49:20 crc kubenswrapper[4760]: I0930 07:49:20.171537 4760 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-b8fbc5445-mgg5f" Sep 30 07:49:20 crc kubenswrapper[4760]: I0930 07:49:20.688142 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-b8fbc5445-mgg5f"] Sep 30 07:49:20 crc kubenswrapper[4760]: W0930 07:49:20.688366 4760 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2fbea500_bbb8_4691_817f_e06a76aba29a.slice/crio-41d35864dd4a35ded697e9ebbf23c62afd4fbb0ac262bc640c4e25a8817780cd WatchSource:0}: Error finding container 41d35864dd4a35ded697e9ebbf23c62afd4fbb0ac262bc640c4e25a8817780cd: Status 404 returned error can't find the container with id 41d35864dd4a35ded697e9ebbf23c62afd4fbb0ac262bc640c4e25a8817780cd Sep 30 07:49:20 crc kubenswrapper[4760]: I0930 07:49:20.926756 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-b8fbc5445-mgg5f" event={"ID":"2fbea500-bbb8-4691-817f-e06a76aba29a","Type":"ContainerStarted","Data":"41d35864dd4a35ded697e9ebbf23c62afd4fbb0ac262bc640c4e25a8817780cd"} Sep 30 07:49:20 crc kubenswrapper[4760]: I0930 07:49:20.927751 4760 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-8554648995-dr58s" podUID="6a2f3f1f-f213-44fe-b1ba-9f35efbe17ad" containerName="dnsmasq-dns" containerID="cri-o://b2bb2029864022dfad325772a095e2328e30a0a3c9ffd58158203cedd2b8bef6" gracePeriod=10 Sep 30 07:49:21 crc kubenswrapper[4760]: I0930 07:49:21.011511 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-storage-0"] Sep 30 07:49:21 crc kubenswrapper[4760]: I0930 07:49:21.028324 4760 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-storage-0" Sep 30 07:49:21 crc kubenswrapper[4760]: I0930 07:49:21.042577 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"] Sep 30 07:49:21 crc kubenswrapper[4760]: I0930 07:49:21.066850 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-swift-dockercfg-x47l2" Sep 30 07:49:21 crc kubenswrapper[4760]: I0930 07:49:21.067102 4760 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-storage-config-data" Sep 30 07:49:21 crc kubenswrapper[4760]: I0930 07:49:21.067240 4760 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-files" Sep 30 07:49:21 crc kubenswrapper[4760]: I0930 07:49:21.067422 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-conf" Sep 30 07:49:21 crc kubenswrapper[4760]: I0930 07:49:21.132941 4760 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-galera-0" Sep 30 07:49:21 crc kubenswrapper[4760]: I0930 07:49:21.208156 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-glw7v\" (UniqueName: \"kubernetes.io/projected/db4f0b34-3c4a-4c78-b284-5959e91b00c0-kube-api-access-glw7v\") pod \"swift-storage-0\" (UID: \"db4f0b34-3c4a-4c78-b284-5959e91b00c0\") " pod="openstack/swift-storage-0" Sep 30 07:49:21 crc kubenswrapper[4760]: I0930 07:49:21.208205 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/db4f0b34-3c4a-4c78-b284-5959e91b00c0-etc-swift\") pod \"swift-storage-0\" (UID: \"db4f0b34-3c4a-4c78-b284-5959e91b00c0\") " pod="openstack/swift-storage-0" Sep 30 07:49:21 crc kubenswrapper[4760]: I0930 07:49:21.208352 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lock\" (UniqueName: 
\"kubernetes.io/empty-dir/db4f0b34-3c4a-4c78-b284-5959e91b00c0-lock\") pod \"swift-storage-0\" (UID: \"db4f0b34-3c4a-4c78-b284-5959e91b00c0\") " pod="openstack/swift-storage-0" Sep 30 07:49:21 crc kubenswrapper[4760]: I0930 07:49:21.208409 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"swift-storage-0\" (UID: \"db4f0b34-3c4a-4c78-b284-5959e91b00c0\") " pod="openstack/swift-storage-0" Sep 30 07:49:21 crc kubenswrapper[4760]: I0930 07:49:21.208539 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/db4f0b34-3c4a-4c78-b284-5959e91b00c0-cache\") pod \"swift-storage-0\" (UID: \"db4f0b34-3c4a-4c78-b284-5959e91b00c0\") " pod="openstack/swift-storage-0" Sep 30 07:49:21 crc kubenswrapper[4760]: I0930 07:49:21.226455 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-galera-0" Sep 30 07:49:21 crc kubenswrapper[4760]: I0930 07:49:21.309929 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"swift-storage-0\" (UID: \"db4f0b34-3c4a-4c78-b284-5959e91b00c0\") " pod="openstack/swift-storage-0" Sep 30 07:49:21 crc kubenswrapper[4760]: I0930 07:49:21.310330 4760 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"swift-storage-0\" (UID: \"db4f0b34-3c4a-4c78-b284-5959e91b00c0\") device mount path \"/mnt/openstack/pv04\"" pod="openstack/swift-storage-0" Sep 30 07:49:21 crc kubenswrapper[4760]: I0930 07:49:21.332097 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cache\" (UniqueName: 
\"kubernetes.io/empty-dir/db4f0b34-3c4a-4c78-b284-5959e91b00c0-cache\") pod \"swift-storage-0\" (UID: \"db4f0b34-3c4a-4c78-b284-5959e91b00c0\") " pod="openstack/swift-storage-0" Sep 30 07:49:21 crc kubenswrapper[4760]: I0930 07:49:21.332690 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/db4f0b34-3c4a-4c78-b284-5959e91b00c0-cache\") pod \"swift-storage-0\" (UID: \"db4f0b34-3c4a-4c78-b284-5959e91b00c0\") " pod="openstack/swift-storage-0" Sep 30 07:49:21 crc kubenswrapper[4760]: I0930 07:49:21.334464 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-glw7v\" (UniqueName: \"kubernetes.io/projected/db4f0b34-3c4a-4c78-b284-5959e91b00c0-kube-api-access-glw7v\") pod \"swift-storage-0\" (UID: \"db4f0b34-3c4a-4c78-b284-5959e91b00c0\") " pod="openstack/swift-storage-0" Sep 30 07:49:21 crc kubenswrapper[4760]: I0930 07:49:21.334540 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/db4f0b34-3c4a-4c78-b284-5959e91b00c0-etc-swift\") pod \"swift-storage-0\" (UID: \"db4f0b34-3c4a-4c78-b284-5959e91b00c0\") " pod="openstack/swift-storage-0" Sep 30 07:49:21 crc kubenswrapper[4760]: I0930 07:49:21.334715 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/db4f0b34-3c4a-4c78-b284-5959e91b00c0-lock\") pod \"swift-storage-0\" (UID: \"db4f0b34-3c4a-4c78-b284-5959e91b00c0\") " pod="openstack/swift-storage-0" Sep 30 07:49:21 crc kubenswrapper[4760]: I0930 07:49:21.335152 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/db4f0b34-3c4a-4c78-b284-5959e91b00c0-lock\") pod \"swift-storage-0\" (UID: \"db4f0b34-3c4a-4c78-b284-5959e91b00c0\") " pod="openstack/swift-storage-0" Sep 30 07:49:21 crc kubenswrapper[4760]: E0930 07:49:21.335410 4760 
projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Sep 30 07:49:21 crc kubenswrapper[4760]: E0930 07:49:21.335435 4760 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Sep 30 07:49:21 crc kubenswrapper[4760]: E0930 07:49:21.335484 4760 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/db4f0b34-3c4a-4c78-b284-5959e91b00c0-etc-swift podName:db4f0b34-3c4a-4c78-b284-5959e91b00c0 nodeName:}" failed. No retries permitted until 2025-09-30 07:49:21.835465122 +0000 UTC m=+947.478371534 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/db4f0b34-3c4a-4c78-b284-5959e91b00c0-etc-swift") pod "swift-storage-0" (UID: "db4f0b34-3c4a-4c78-b284-5959e91b00c0") : configmap "swift-ring-files" not found Sep 30 07:49:21 crc kubenswrapper[4760]: I0930 07:49:21.344494 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"swift-storage-0\" (UID: \"db4f0b34-3c4a-4c78-b284-5959e91b00c0\") " pod="openstack/swift-storage-0" Sep 30 07:49:21 crc kubenswrapper[4760]: I0930 07:49:21.365423 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-glw7v\" (UniqueName: \"kubernetes.io/projected/db4f0b34-3c4a-4c78-b284-5959e91b00c0-kube-api-access-glw7v\") pod \"swift-storage-0\" (UID: \"db4f0b34-3c4a-4c78-b284-5959e91b00c0\") " pod="openstack/swift-storage-0" Sep 30 07:49:21 crc kubenswrapper[4760]: I0930 07:49:21.495373 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-ring-rebalance-4rxf2"] Sep 30 07:49:21 crc kubenswrapper[4760]: I0930 07:49:21.496274 4760 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-4rxf2" Sep 30 07:49:21 crc kubenswrapper[4760]: I0930 07:49:21.498025 4760 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-scripts" Sep 30 07:49:21 crc kubenswrapper[4760]: I0930 07:49:21.498335 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data" Sep 30 07:49:21 crc kubenswrapper[4760]: I0930 07:49:21.509905 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-4rxf2"] Sep 30 07:49:21 crc kubenswrapper[4760]: I0930 07:49:21.511214 4760 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-config-data" Sep 30 07:49:21 crc kubenswrapper[4760]: I0930 07:49:21.639562 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/cfaf0e86-3b68-4e5c-8caf-c60518a28016-scripts\") pod \"swift-ring-rebalance-4rxf2\" (UID: \"cfaf0e86-3b68-4e5c-8caf-c60518a28016\") " pod="openstack/swift-ring-rebalance-4rxf2" Sep 30 07:49:21 crc kubenswrapper[4760]: I0930 07:49:21.639619 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cfaf0e86-3b68-4e5c-8caf-c60518a28016-combined-ca-bundle\") pod \"swift-ring-rebalance-4rxf2\" (UID: \"cfaf0e86-3b68-4e5c-8caf-c60518a28016\") " pod="openstack/swift-ring-rebalance-4rxf2" Sep 30 07:49:21 crc kubenswrapper[4760]: I0930 07:49:21.639664 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/cfaf0e86-3b68-4e5c-8caf-c60518a28016-swiftconf\") pod \"swift-ring-rebalance-4rxf2\" (UID: \"cfaf0e86-3b68-4e5c-8caf-c60518a28016\") " pod="openstack/swift-ring-rebalance-4rxf2" Sep 30 07:49:21 crc kubenswrapper[4760]: I0930 07:49:21.639879 4760 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/cfaf0e86-3b68-4e5c-8caf-c60518a28016-dispersionconf\") pod \"swift-ring-rebalance-4rxf2\" (UID: \"cfaf0e86-3b68-4e5c-8caf-c60518a28016\") " pod="openstack/swift-ring-rebalance-4rxf2" Sep 30 07:49:21 crc kubenswrapper[4760]: I0930 07:49:21.639947 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/cfaf0e86-3b68-4e5c-8caf-c60518a28016-etc-swift\") pod \"swift-ring-rebalance-4rxf2\" (UID: \"cfaf0e86-3b68-4e5c-8caf-c60518a28016\") " pod="openstack/swift-ring-rebalance-4rxf2" Sep 30 07:49:21 crc kubenswrapper[4760]: I0930 07:49:21.640005 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jf9t9\" (UniqueName: \"kubernetes.io/projected/cfaf0e86-3b68-4e5c-8caf-c60518a28016-kube-api-access-jf9t9\") pod \"swift-ring-rebalance-4rxf2\" (UID: \"cfaf0e86-3b68-4e5c-8caf-c60518a28016\") " pod="openstack/swift-ring-rebalance-4rxf2" Sep 30 07:49:21 crc kubenswrapper[4760]: I0930 07:49:21.640110 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/cfaf0e86-3b68-4e5c-8caf-c60518a28016-ring-data-devices\") pod \"swift-ring-rebalance-4rxf2\" (UID: \"cfaf0e86-3b68-4e5c-8caf-c60518a28016\") " pod="openstack/swift-ring-rebalance-4rxf2" Sep 30 07:49:21 crc kubenswrapper[4760]: I0930 07:49:21.741461 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/cfaf0e86-3b68-4e5c-8caf-c60518a28016-ring-data-devices\") pod \"swift-ring-rebalance-4rxf2\" (UID: \"cfaf0e86-3b68-4e5c-8caf-c60518a28016\") " pod="openstack/swift-ring-rebalance-4rxf2" Sep 30 07:49:21 crc kubenswrapper[4760]: I0930 
07:49:21.741865 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/cfaf0e86-3b68-4e5c-8caf-c60518a28016-scripts\") pod \"swift-ring-rebalance-4rxf2\" (UID: \"cfaf0e86-3b68-4e5c-8caf-c60518a28016\") " pod="openstack/swift-ring-rebalance-4rxf2" Sep 30 07:49:21 crc kubenswrapper[4760]: I0930 07:49:21.741899 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cfaf0e86-3b68-4e5c-8caf-c60518a28016-combined-ca-bundle\") pod \"swift-ring-rebalance-4rxf2\" (UID: \"cfaf0e86-3b68-4e5c-8caf-c60518a28016\") " pod="openstack/swift-ring-rebalance-4rxf2" Sep 30 07:49:21 crc kubenswrapper[4760]: I0930 07:49:21.741943 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/cfaf0e86-3b68-4e5c-8caf-c60518a28016-swiftconf\") pod \"swift-ring-rebalance-4rxf2\" (UID: \"cfaf0e86-3b68-4e5c-8caf-c60518a28016\") " pod="openstack/swift-ring-rebalance-4rxf2" Sep 30 07:49:21 crc kubenswrapper[4760]: I0930 07:49:21.742030 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/cfaf0e86-3b68-4e5c-8caf-c60518a28016-dispersionconf\") pod \"swift-ring-rebalance-4rxf2\" (UID: \"cfaf0e86-3b68-4e5c-8caf-c60518a28016\") " pod="openstack/swift-ring-rebalance-4rxf2" Sep 30 07:49:21 crc kubenswrapper[4760]: I0930 07:49:21.742071 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/cfaf0e86-3b68-4e5c-8caf-c60518a28016-etc-swift\") pod \"swift-ring-rebalance-4rxf2\" (UID: \"cfaf0e86-3b68-4e5c-8caf-c60518a28016\") " pod="openstack/swift-ring-rebalance-4rxf2" Sep 30 07:49:21 crc kubenswrapper[4760]: I0930 07:49:21.742099 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-jf9t9\" (UniqueName: \"kubernetes.io/projected/cfaf0e86-3b68-4e5c-8caf-c60518a28016-kube-api-access-jf9t9\") pod \"swift-ring-rebalance-4rxf2\" (UID: \"cfaf0e86-3b68-4e5c-8caf-c60518a28016\") " pod="openstack/swift-ring-rebalance-4rxf2" Sep 30 07:49:21 crc kubenswrapper[4760]: I0930 07:49:21.742443 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/cfaf0e86-3b68-4e5c-8caf-c60518a28016-etc-swift\") pod \"swift-ring-rebalance-4rxf2\" (UID: \"cfaf0e86-3b68-4e5c-8caf-c60518a28016\") " pod="openstack/swift-ring-rebalance-4rxf2" Sep 30 07:49:21 crc kubenswrapper[4760]: I0930 07:49:21.742535 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/cfaf0e86-3b68-4e5c-8caf-c60518a28016-scripts\") pod \"swift-ring-rebalance-4rxf2\" (UID: \"cfaf0e86-3b68-4e5c-8caf-c60518a28016\") " pod="openstack/swift-ring-rebalance-4rxf2" Sep 30 07:49:21 crc kubenswrapper[4760]: I0930 07:49:21.742669 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/cfaf0e86-3b68-4e5c-8caf-c60518a28016-ring-data-devices\") pod \"swift-ring-rebalance-4rxf2\" (UID: \"cfaf0e86-3b68-4e5c-8caf-c60518a28016\") " pod="openstack/swift-ring-rebalance-4rxf2" Sep 30 07:49:21 crc kubenswrapper[4760]: I0930 07:49:21.745733 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/cfaf0e86-3b68-4e5c-8caf-c60518a28016-swiftconf\") pod \"swift-ring-rebalance-4rxf2\" (UID: \"cfaf0e86-3b68-4e5c-8caf-c60518a28016\") " pod="openstack/swift-ring-rebalance-4rxf2" Sep 30 07:49:21 crc kubenswrapper[4760]: I0930 07:49:21.746523 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cfaf0e86-3b68-4e5c-8caf-c60518a28016-combined-ca-bundle\") pod 
\"swift-ring-rebalance-4rxf2\" (UID: \"cfaf0e86-3b68-4e5c-8caf-c60518a28016\") " pod="openstack/swift-ring-rebalance-4rxf2" Sep 30 07:49:21 crc kubenswrapper[4760]: I0930 07:49:21.746631 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/cfaf0e86-3b68-4e5c-8caf-c60518a28016-dispersionconf\") pod \"swift-ring-rebalance-4rxf2\" (UID: \"cfaf0e86-3b68-4e5c-8caf-c60518a28016\") " pod="openstack/swift-ring-rebalance-4rxf2" Sep 30 07:49:21 crc kubenswrapper[4760]: I0930 07:49:21.761925 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jf9t9\" (UniqueName: \"kubernetes.io/projected/cfaf0e86-3b68-4e5c-8caf-c60518a28016-kube-api-access-jf9t9\") pod \"swift-ring-rebalance-4rxf2\" (UID: \"cfaf0e86-3b68-4e5c-8caf-c60518a28016\") " pod="openstack/swift-ring-rebalance-4rxf2" Sep 30 07:49:21 crc kubenswrapper[4760]: I0930 07:49:21.815813 4760 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-4rxf2" Sep 30 07:49:21 crc kubenswrapper[4760]: I0930 07:49:21.843233 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/db4f0b34-3c4a-4c78-b284-5959e91b00c0-etc-swift\") pod \"swift-storage-0\" (UID: \"db4f0b34-3c4a-4c78-b284-5959e91b00c0\") " pod="openstack/swift-storage-0" Sep 30 07:49:21 crc kubenswrapper[4760]: E0930 07:49:21.843492 4760 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Sep 30 07:49:21 crc kubenswrapper[4760]: E0930 07:49:21.843518 4760 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Sep 30 07:49:21 crc kubenswrapper[4760]: E0930 07:49:21.843571 4760 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/db4f0b34-3c4a-4c78-b284-5959e91b00c0-etc-swift podName:db4f0b34-3c4a-4c78-b284-5959e91b00c0 nodeName:}" failed. No retries permitted until 2025-09-30 07:49:22.843551819 +0000 UTC m=+948.486458231 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/db4f0b34-3c4a-4c78-b284-5959e91b00c0-etc-swift") pod "swift-storage-0" (UID: "db4f0b34-3c4a-4c78-b284-5959e91b00c0") : configmap "swift-ring-files" not found Sep 30 07:49:21 crc kubenswrapper[4760]: I0930 07:49:21.943144 4760 generic.go:334] "Generic (PLEG): container finished" podID="6a2f3f1f-f213-44fe-b1ba-9f35efbe17ad" containerID="b2bb2029864022dfad325772a095e2328e30a0a3c9ffd58158203cedd2b8bef6" exitCode=0 Sep 30 07:49:21 crc kubenswrapper[4760]: I0930 07:49:21.944285 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8554648995-dr58s" event={"ID":"6a2f3f1f-f213-44fe-b1ba-9f35efbe17ad","Type":"ContainerDied","Data":"b2bb2029864022dfad325772a095e2328e30a0a3c9ffd58158203cedd2b8bef6"} Sep 30 07:49:22 crc kubenswrapper[4760]: I0930 07:49:22.281684 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-4rxf2"] Sep 30 07:49:22 crc kubenswrapper[4760]: W0930 07:49:22.289783 4760 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podcfaf0e86_3b68_4e5c_8caf_c60518a28016.slice/crio-1a9fe96b94ff882a70c69f65fd9112134417c2162ed7da32f1f9d0442462a467 WatchSource:0}: Error finding container 1a9fe96b94ff882a70c69f65fd9112134417c2162ed7da32f1f9d0442462a467: Status 404 returned error can't find the container with id 1a9fe96b94ff882a70c69f65fd9112134417c2162ed7da32f1f9d0442462a467 Sep 30 07:49:22 crc kubenswrapper[4760]: I0930 07:49:22.863283 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/db4f0b34-3c4a-4c78-b284-5959e91b00c0-etc-swift\") pod \"swift-storage-0\" (UID: \"db4f0b34-3c4a-4c78-b284-5959e91b00c0\") " pod="openstack/swift-storage-0" Sep 30 07:49:22 crc kubenswrapper[4760]: E0930 07:49:22.863544 4760 projected.go:288] Couldn't get configMap 
openstack/swift-ring-files: configmap "swift-ring-files" not found Sep 30 07:49:22 crc kubenswrapper[4760]: E0930 07:49:22.863574 4760 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Sep 30 07:49:22 crc kubenswrapper[4760]: E0930 07:49:22.863613 4760 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/db4f0b34-3c4a-4c78-b284-5959e91b00c0-etc-swift podName:db4f0b34-3c4a-4c78-b284-5959e91b00c0 nodeName:}" failed. No retries permitted until 2025-09-30 07:49:24.863601113 +0000 UTC m=+950.506507525 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/db4f0b34-3c4a-4c78-b284-5959e91b00c0-etc-swift") pod "swift-storage-0" (UID: "db4f0b34-3c4a-4c78-b284-5959e91b00c0") : configmap "swift-ring-files" not found Sep 30 07:49:22 crc kubenswrapper[4760]: I0930 07:49:22.954102 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-4rxf2" event={"ID":"cfaf0e86-3b68-4e5c-8caf-c60518a28016","Type":"ContainerStarted","Data":"1a9fe96b94ff882a70c69f65fd9112134417c2162ed7da32f1f9d0442462a467"} Sep 30 07:49:23 crc kubenswrapper[4760]: I0930 07:49:23.853446 4760 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-cell1-galera-0" Sep 30 07:49:23 crc kubenswrapper[4760]: I0930 07:49:23.954702 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-cell1-galera-0" Sep 30 07:49:23 crc kubenswrapper[4760]: I0930 07:49:23.965264 4760 generic.go:334] "Generic (PLEG): container finished" podID="2fbea500-bbb8-4691-817f-e06a76aba29a" containerID="d119665380541c10bfbc4d9ec60f2b77518528e25a6a8a983ef92659f7f1ec21" exitCode=0 Sep 30 07:49:23 crc kubenswrapper[4760]: I0930 07:49:23.965337 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-b8fbc5445-mgg5f" 
event={"ID":"2fbea500-bbb8-4691-817f-e06a76aba29a","Type":"ContainerDied","Data":"d119665380541c10bfbc4d9ec60f2b77518528e25a6a8a983ef92659f7f1ec21"} Sep 30 07:49:23 crc kubenswrapper[4760]: I0930 07:49:23.968859 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8554648995-dr58s" event={"ID":"6a2f3f1f-f213-44fe-b1ba-9f35efbe17ad","Type":"ContainerDied","Data":"4ca6c3db63b637663caee10eed5308e42612df1611a7c4425b739e9e9ed18188"} Sep 30 07:49:23 crc kubenswrapper[4760]: I0930 07:49:23.968914 4760 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4ca6c3db63b637663caee10eed5308e42612df1611a7c4425b739e9e9ed18188" Sep 30 07:49:24 crc kubenswrapper[4760]: I0930 07:49:24.068280 4760 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-8554648995-dr58s" Sep 30 07:49:24 crc kubenswrapper[4760]: I0930 07:49:24.189900 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4qmgh\" (UniqueName: \"kubernetes.io/projected/6a2f3f1f-f213-44fe-b1ba-9f35efbe17ad-kube-api-access-4qmgh\") pod \"6a2f3f1f-f213-44fe-b1ba-9f35efbe17ad\" (UID: \"6a2f3f1f-f213-44fe-b1ba-9f35efbe17ad\") " Sep 30 07:49:24 crc kubenswrapper[4760]: I0930 07:49:24.190133 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6a2f3f1f-f213-44fe-b1ba-9f35efbe17ad-ovsdbserver-sb\") pod \"6a2f3f1f-f213-44fe-b1ba-9f35efbe17ad\" (UID: \"6a2f3f1f-f213-44fe-b1ba-9f35efbe17ad\") " Sep 30 07:49:24 crc kubenswrapper[4760]: I0930 07:49:24.190182 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6a2f3f1f-f213-44fe-b1ba-9f35efbe17ad-config\") pod \"6a2f3f1f-f213-44fe-b1ba-9f35efbe17ad\" (UID: \"6a2f3f1f-f213-44fe-b1ba-9f35efbe17ad\") " Sep 30 07:49:24 crc kubenswrapper[4760]: I0930 07:49:24.190288 4760 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6a2f3f1f-f213-44fe-b1ba-9f35efbe17ad-dns-svc\") pod \"6a2f3f1f-f213-44fe-b1ba-9f35efbe17ad\" (UID: \"6a2f3f1f-f213-44fe-b1ba-9f35efbe17ad\") " Sep 30 07:49:24 crc kubenswrapper[4760]: I0930 07:49:24.190332 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6a2f3f1f-f213-44fe-b1ba-9f35efbe17ad-ovsdbserver-nb\") pod \"6a2f3f1f-f213-44fe-b1ba-9f35efbe17ad\" (UID: \"6a2f3f1f-f213-44fe-b1ba-9f35efbe17ad\") " Sep 30 07:49:24 crc kubenswrapper[4760]: I0930 07:49:24.208845 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6a2f3f1f-f213-44fe-b1ba-9f35efbe17ad-kube-api-access-4qmgh" (OuterVolumeSpecName: "kube-api-access-4qmgh") pod "6a2f3f1f-f213-44fe-b1ba-9f35efbe17ad" (UID: "6a2f3f1f-f213-44fe-b1ba-9f35efbe17ad"). InnerVolumeSpecName "kube-api-access-4qmgh". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 07:49:24 crc kubenswrapper[4760]: I0930 07:49:24.247622 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6a2f3f1f-f213-44fe-b1ba-9f35efbe17ad-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "6a2f3f1f-f213-44fe-b1ba-9f35efbe17ad" (UID: "6a2f3f1f-f213-44fe-b1ba-9f35efbe17ad"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 07:49:24 crc kubenswrapper[4760]: I0930 07:49:24.257194 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6a2f3f1f-f213-44fe-b1ba-9f35efbe17ad-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "6a2f3f1f-f213-44fe-b1ba-9f35efbe17ad" (UID: "6a2f3f1f-f213-44fe-b1ba-9f35efbe17ad"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 07:49:24 crc kubenswrapper[4760]: I0930 07:49:24.276623 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6a2f3f1f-f213-44fe-b1ba-9f35efbe17ad-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "6a2f3f1f-f213-44fe-b1ba-9f35efbe17ad" (UID: "6a2f3f1f-f213-44fe-b1ba-9f35efbe17ad"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 07:49:24 crc kubenswrapper[4760]: I0930 07:49:24.278743 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6a2f3f1f-f213-44fe-b1ba-9f35efbe17ad-config" (OuterVolumeSpecName: "config") pod "6a2f3f1f-f213-44fe-b1ba-9f35efbe17ad" (UID: "6a2f3f1f-f213-44fe-b1ba-9f35efbe17ad"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 07:49:24 crc kubenswrapper[4760]: I0930 07:49:24.293525 4760 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6a2f3f1f-f213-44fe-b1ba-9f35efbe17ad-dns-svc\") on node \"crc\" DevicePath \"\"" Sep 30 07:49:24 crc kubenswrapper[4760]: I0930 07:49:24.293558 4760 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6a2f3f1f-f213-44fe-b1ba-9f35efbe17ad-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Sep 30 07:49:24 crc kubenswrapper[4760]: I0930 07:49:24.293575 4760 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4qmgh\" (UniqueName: \"kubernetes.io/projected/6a2f3f1f-f213-44fe-b1ba-9f35efbe17ad-kube-api-access-4qmgh\") on node \"crc\" DevicePath \"\"" Sep 30 07:49:24 crc kubenswrapper[4760]: I0930 07:49:24.293588 4760 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6a2f3f1f-f213-44fe-b1ba-9f35efbe17ad-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Sep 30 07:49:24 crc 
kubenswrapper[4760]: I0930 07:49:24.293599 4760 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6a2f3f1f-f213-44fe-b1ba-9f35efbe17ad-config\") on node \"crc\" DevicePath \"\"" Sep 30 07:49:24 crc kubenswrapper[4760]: I0930 07:49:24.902983 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/db4f0b34-3c4a-4c78-b284-5959e91b00c0-etc-swift\") pod \"swift-storage-0\" (UID: \"db4f0b34-3c4a-4c78-b284-5959e91b00c0\") " pod="openstack/swift-storage-0" Sep 30 07:49:24 crc kubenswrapper[4760]: E0930 07:49:24.903242 4760 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Sep 30 07:49:24 crc kubenswrapper[4760]: E0930 07:49:24.903266 4760 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Sep 30 07:49:24 crc kubenswrapper[4760]: E0930 07:49:24.903371 4760 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/db4f0b34-3c4a-4c78-b284-5959e91b00c0-etc-swift podName:db4f0b34-3c4a-4c78-b284-5959e91b00c0 nodeName:}" failed. No retries permitted until 2025-09-30 07:49:28.903347219 +0000 UTC m=+954.546253631 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/db4f0b34-3c4a-4c78-b284-5959e91b00c0-etc-swift") pod "swift-storage-0" (UID: "db4f0b34-3c4a-4c78-b284-5959e91b00c0") : configmap "swift-ring-files" not found Sep 30 07:49:24 crc kubenswrapper[4760]: I0930 07:49:24.993136 4760 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-8554648995-dr58s" Sep 30 07:49:24 crc kubenswrapper[4760]: I0930 07:49:24.994402 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-b8fbc5445-mgg5f" event={"ID":"2fbea500-bbb8-4691-817f-e06a76aba29a","Type":"ContainerStarted","Data":"31b22899c626cee9f680e860115414b973b37b3a83e8fb751b0242cdfb1e1aa2"} Sep 30 07:49:24 crc kubenswrapper[4760]: I0930 07:49:24.994490 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-b8fbc5445-mgg5f" Sep 30 07:49:25 crc kubenswrapper[4760]: I0930 07:49:25.035106 4760 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-b8fbc5445-mgg5f" podStartSLOduration=6.035089289 podStartE2EDuration="6.035089289s" podCreationTimestamp="2025-09-30 07:49:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 07:49:25.015495779 +0000 UTC m=+950.658402241" watchObservedRunningTime="2025-09-30 07:49:25.035089289 +0000 UTC m=+950.677995701" Sep 30 07:49:25 crc kubenswrapper[4760]: I0930 07:49:25.036563 4760 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-8554648995-dr58s"] Sep 30 07:49:25 crc kubenswrapper[4760]: I0930 07:49:25.042595 4760 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-8554648995-dr58s"] Sep 30 07:49:25 crc kubenswrapper[4760]: I0930 07:49:25.077370 4760 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6a2f3f1f-f213-44fe-b1ba-9f35efbe17ad" path="/var/lib/kubelet/pods/6a2f3f1f-f213-44fe-b1ba-9f35efbe17ad/volumes" Sep 30 07:49:27 crc kubenswrapper[4760]: I0930 07:49:27.905967 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-create-fkqrw"] Sep 30 07:49:27 crc kubenswrapper[4760]: E0930 07:49:27.907053 4760 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="6a2f3f1f-f213-44fe-b1ba-9f35efbe17ad" containerName="init" Sep 30 07:49:27 crc kubenswrapper[4760]: I0930 07:49:27.907074 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="6a2f3f1f-f213-44fe-b1ba-9f35efbe17ad" containerName="init" Sep 30 07:49:27 crc kubenswrapper[4760]: E0930 07:49:27.907144 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6a2f3f1f-f213-44fe-b1ba-9f35efbe17ad" containerName="dnsmasq-dns" Sep 30 07:49:27 crc kubenswrapper[4760]: I0930 07:49:27.907154 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="6a2f3f1f-f213-44fe-b1ba-9f35efbe17ad" containerName="dnsmasq-dns" Sep 30 07:49:27 crc kubenswrapper[4760]: I0930 07:49:27.907491 4760 memory_manager.go:354] "RemoveStaleState removing state" podUID="6a2f3f1f-f213-44fe-b1ba-9f35efbe17ad" containerName="dnsmasq-dns" Sep 30 07:49:27 crc kubenswrapper[4760]: I0930 07:49:27.908538 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-fkqrw" Sep 30 07:49:27 crc kubenswrapper[4760]: I0930 07:49:27.911494 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-fkqrw"] Sep 30 07:49:28 crc kubenswrapper[4760]: I0930 07:49:28.068246 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-clctp\" (UniqueName: \"kubernetes.io/projected/b3bc4bc8-f5fa-4aa1-b24a-e742458d48ab-kube-api-access-clctp\") pod \"keystone-db-create-fkqrw\" (UID: \"b3bc4bc8-f5fa-4aa1-b24a-e742458d48ab\") " pod="openstack/keystone-db-create-fkqrw" Sep 30 07:49:28 crc kubenswrapper[4760]: I0930 07:49:28.123587 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-create-xsqh2"] Sep 30 07:49:28 crc kubenswrapper[4760]: I0930 07:49:28.125416 4760 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-create-xsqh2" Sep 30 07:49:28 crc kubenswrapper[4760]: I0930 07:49:28.137451 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-xsqh2"] Sep 30 07:49:28 crc kubenswrapper[4760]: I0930 07:49:28.169572 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-clctp\" (UniqueName: \"kubernetes.io/projected/b3bc4bc8-f5fa-4aa1-b24a-e742458d48ab-kube-api-access-clctp\") pod \"keystone-db-create-fkqrw\" (UID: \"b3bc4bc8-f5fa-4aa1-b24a-e742458d48ab\") " pod="openstack/keystone-db-create-fkqrw" Sep 30 07:49:28 crc kubenswrapper[4760]: I0930 07:49:28.188753 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-clctp\" (UniqueName: \"kubernetes.io/projected/b3bc4bc8-f5fa-4aa1-b24a-e742458d48ab-kube-api-access-clctp\") pod \"keystone-db-create-fkqrw\" (UID: \"b3bc4bc8-f5fa-4aa1-b24a-e742458d48ab\") " pod="openstack/keystone-db-create-fkqrw" Sep 30 07:49:28 crc kubenswrapper[4760]: I0930 07:49:28.231710 4760 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-create-fkqrw" Sep 30 07:49:28 crc kubenswrapper[4760]: I0930 07:49:28.272481 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-82r8p\" (UniqueName: \"kubernetes.io/projected/c941e149-5425-4ef2-9920-a7c6797230be-kube-api-access-82r8p\") pod \"placement-db-create-xsqh2\" (UID: \"c941e149-5425-4ef2-9920-a7c6797230be\") " pod="openstack/placement-db-create-xsqh2" Sep 30 07:49:28 crc kubenswrapper[4760]: I0930 07:49:28.375907 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-82r8p\" (UniqueName: \"kubernetes.io/projected/c941e149-5425-4ef2-9920-a7c6797230be-kube-api-access-82r8p\") pod \"placement-db-create-xsqh2\" (UID: \"c941e149-5425-4ef2-9920-a7c6797230be\") " pod="openstack/placement-db-create-xsqh2" Sep 30 07:49:28 crc kubenswrapper[4760]: I0930 07:49:28.382294 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-create-zvwbw"] Sep 30 07:49:28 crc kubenswrapper[4760]: I0930 07:49:28.384087 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-zvwbw" Sep 30 07:49:28 crc kubenswrapper[4760]: I0930 07:49:28.390361 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-zvwbw"] Sep 30 07:49:28 crc kubenswrapper[4760]: I0930 07:49:28.398152 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-82r8p\" (UniqueName: \"kubernetes.io/projected/c941e149-5425-4ef2-9920-a7c6797230be-kube-api-access-82r8p\") pod \"placement-db-create-xsqh2\" (UID: \"c941e149-5425-4ef2-9920-a7c6797230be\") " pod="openstack/placement-db-create-xsqh2" Sep 30 07:49:28 crc kubenswrapper[4760]: I0930 07:49:28.452633 4760 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-create-xsqh2" Sep 30 07:49:28 crc kubenswrapper[4760]: I0930 07:49:28.477721 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d2c8l\" (UniqueName: \"kubernetes.io/projected/60c66a34-554e-4655-8da8-e47e3e10a521-kube-api-access-d2c8l\") pod \"glance-db-create-zvwbw\" (UID: \"60c66a34-554e-4655-8da8-e47e3e10a521\") " pod="openstack/glance-db-create-zvwbw" Sep 30 07:49:28 crc kubenswrapper[4760]: I0930 07:49:28.580188 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d2c8l\" (UniqueName: \"kubernetes.io/projected/60c66a34-554e-4655-8da8-e47e3e10a521-kube-api-access-d2c8l\") pod \"glance-db-create-zvwbw\" (UID: \"60c66a34-554e-4655-8da8-e47e3e10a521\") " pod="openstack/glance-db-create-zvwbw" Sep 30 07:49:28 crc kubenswrapper[4760]: I0930 07:49:28.602314 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d2c8l\" (UniqueName: \"kubernetes.io/projected/60c66a34-554e-4655-8da8-e47e3e10a521-kube-api-access-d2c8l\") pod \"glance-db-create-zvwbw\" (UID: \"60c66a34-554e-4655-8da8-e47e3e10a521\") " pod="openstack/glance-db-create-zvwbw" Sep 30 07:49:28 crc kubenswrapper[4760]: I0930 07:49:28.773580 4760 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-create-zvwbw" Sep 30 07:49:28 crc kubenswrapper[4760]: I0930 07:49:28.801720 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-fkqrw"] Sep 30 07:49:28 crc kubenswrapper[4760]: W0930 07:49:28.810938 4760 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb3bc4bc8_f5fa_4aa1_b24a_e742458d48ab.slice/crio-812b762fd38c7f9e6a6f1792199e0863c35a5ecfd39719fda9d0afebc5314a83 WatchSource:0}: Error finding container 812b762fd38c7f9e6a6f1792199e0863c35a5ecfd39719fda9d0afebc5314a83: Status 404 returned error can't find the container with id 812b762fd38c7f9e6a6f1792199e0863c35a5ecfd39719fda9d0afebc5314a83 Sep 30 07:49:28 crc kubenswrapper[4760]: I0930 07:49:28.936636 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-xsqh2"] Sep 30 07:49:28 crc kubenswrapper[4760]: W0930 07:49:28.954131 4760 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc941e149_5425_4ef2_9920_a7c6797230be.slice/crio-0ad96a69fc89dfa21eabcf7351ac70ecfe97c04e4c95ea0daa274afa01efdc21 WatchSource:0}: Error finding container 0ad96a69fc89dfa21eabcf7351ac70ecfe97c04e4c95ea0daa274afa01efdc21: Status 404 returned error can't find the container with id 0ad96a69fc89dfa21eabcf7351ac70ecfe97c04e4c95ea0daa274afa01efdc21 Sep 30 07:49:28 crc kubenswrapper[4760]: I0930 07:49:28.987372 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/db4f0b34-3c4a-4c78-b284-5959e91b00c0-etc-swift\") pod \"swift-storage-0\" (UID: \"db4f0b34-3c4a-4c78-b284-5959e91b00c0\") " pod="openstack/swift-storage-0" Sep 30 07:49:28 crc kubenswrapper[4760]: E0930 07:49:28.987705 4760 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Sep 
30 07:49:28 crc kubenswrapper[4760]: E0930 07:49:28.987724 4760 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Sep 30 07:49:28 crc kubenswrapper[4760]: E0930 07:49:28.987766 4760 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/db4f0b34-3c4a-4c78-b284-5959e91b00c0-etc-swift podName:db4f0b34-3c4a-4c78-b284-5959e91b00c0 nodeName:}" failed. No retries permitted until 2025-09-30 07:49:36.987750738 +0000 UTC m=+962.630657150 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/db4f0b34-3c4a-4c78-b284-5959e91b00c0-etc-swift") pod "swift-storage-0" (UID: "db4f0b34-3c4a-4c78-b284-5959e91b00c0") : configmap "swift-ring-files" not found Sep 30 07:49:29 crc kubenswrapper[4760]: I0930 07:49:29.029605 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-xsqh2" event={"ID":"c941e149-5425-4ef2-9920-a7c6797230be","Type":"ContainerStarted","Data":"0ad96a69fc89dfa21eabcf7351ac70ecfe97c04e4c95ea0daa274afa01efdc21"} Sep 30 07:49:29 crc kubenswrapper[4760]: I0930 07:49:29.030907 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-fkqrw" event={"ID":"b3bc4bc8-f5fa-4aa1-b24a-e742458d48ab","Type":"ContainerStarted","Data":"8515edbb2c907995012d5ce121ed488b6c2903314ed7219188998e1e84ed4f54"} Sep 30 07:49:29 crc kubenswrapper[4760]: I0930 07:49:29.031022 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-fkqrw" event={"ID":"b3bc4bc8-f5fa-4aa1-b24a-e742458d48ab","Type":"ContainerStarted","Data":"812b762fd38c7f9e6a6f1792199e0863c35a5ecfd39719fda9d0afebc5314a83"} Sep 30 07:49:29 crc kubenswrapper[4760]: I0930 07:49:29.033095 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" 
event={"ID":"007888b6-d5c6-410a-955a-ed78adf759bd","Type":"ContainerStarted","Data":"50b776945742176dfd29a22633c991f8a4cdffc58f622d83331aa0413823ef59"} Sep 30 07:49:29 crc kubenswrapper[4760]: I0930 07:49:29.039513 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-4rxf2" event={"ID":"cfaf0e86-3b68-4e5c-8caf-c60518a28016","Type":"ContainerStarted","Data":"727446bebc692489557f3343f13a804eceef42682c5207b5e90b2de083680a8d"} Sep 30 07:49:29 crc kubenswrapper[4760]: I0930 07:49:29.050716 4760 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-db-create-fkqrw" podStartSLOduration=2.050700413 podStartE2EDuration="2.050700413s" podCreationTimestamp="2025-09-30 07:49:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 07:49:29.047331947 +0000 UTC m=+954.690238379" watchObservedRunningTime="2025-09-30 07:49:29.050700413 +0000 UTC m=+954.693606815" Sep 30 07:49:29 crc kubenswrapper[4760]: I0930 07:49:29.067992 4760 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-ring-rebalance-4rxf2" podStartSLOduration=2.025162592 podStartE2EDuration="8.067978234s" podCreationTimestamp="2025-09-30 07:49:21 +0000 UTC" firstStartedPulling="2025-09-30 07:49:22.291762859 +0000 UTC m=+947.934669271" lastFinishedPulling="2025-09-30 07:49:28.334578471 +0000 UTC m=+953.977484913" observedRunningTime="2025-09-30 07:49:29.062314079 +0000 UTC m=+954.705220491" watchObservedRunningTime="2025-09-30 07:49:29.067978234 +0000 UTC m=+954.710884636" Sep 30 07:49:29 crc kubenswrapper[4760]: I0930 07:49:29.248970 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-zvwbw"] Sep 30 07:49:29 crc kubenswrapper[4760]: W0930 07:49:29.314891 4760 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod60c66a34_554e_4655_8da8_e47e3e10a521.slice/crio-687a8fe89710a61caa65a28a011d5aeeb514712cd17891ff8d66c8f44957dd0f WatchSource:0}: Error finding container 687a8fe89710a61caa65a28a011d5aeeb514712cd17891ff8d66c8f44957dd0f: Status 404 returned error can't find the container with id 687a8fe89710a61caa65a28a011d5aeeb514712cd17891ff8d66c8f44957dd0f Sep 30 07:49:29 crc kubenswrapper[4760]: I0930 07:49:29.857374 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/watcher-db-create-l747q"] Sep 30 07:49:29 crc kubenswrapper[4760]: I0930 07:49:29.860263 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-db-create-l747q" Sep 30 07:49:29 crc kubenswrapper[4760]: I0930 07:49:29.871945 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-db-create-l747q"] Sep 30 07:49:30 crc kubenswrapper[4760]: I0930 07:49:30.005174 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-whgw2\" (UniqueName: \"kubernetes.io/projected/636599f2-9be2-4380-b923-0f3b3b77e39b-kube-api-access-whgw2\") pod \"watcher-db-create-l747q\" (UID: \"636599f2-9be2-4380-b923-0f3b3b77e39b\") " pod="openstack/watcher-db-create-l747q" Sep 30 07:49:30 crc kubenswrapper[4760]: I0930 07:49:30.056900 4760 generic.go:334] "Generic (PLEG): container finished" podID="c941e149-5425-4ef2-9920-a7c6797230be" containerID="163915f4279ad5b742fff9a6114f00bc851f93640cacd34cc00b2fb3c9589df6" exitCode=0 Sep 30 07:49:30 crc kubenswrapper[4760]: I0930 07:49:30.056993 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-xsqh2" event={"ID":"c941e149-5425-4ef2-9920-a7c6797230be","Type":"ContainerDied","Data":"163915f4279ad5b742fff9a6114f00bc851f93640cacd34cc00b2fb3c9589df6"} Sep 30 07:49:30 crc kubenswrapper[4760]: I0930 07:49:30.066144 4760 generic.go:334] "Generic (PLEG): container 
finished" podID="60c66a34-554e-4655-8da8-e47e3e10a521" containerID="981e42cdb61366b1cc2bac9302f14db150bd5d0c216a3c2c91bc6070ebf0c7b5" exitCode=0 Sep 30 07:49:30 crc kubenswrapper[4760]: I0930 07:49:30.066234 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-zvwbw" event={"ID":"60c66a34-554e-4655-8da8-e47e3e10a521","Type":"ContainerDied","Data":"981e42cdb61366b1cc2bac9302f14db150bd5d0c216a3c2c91bc6070ebf0c7b5"} Sep 30 07:49:30 crc kubenswrapper[4760]: I0930 07:49:30.066268 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-zvwbw" event={"ID":"60c66a34-554e-4655-8da8-e47e3e10a521","Type":"ContainerStarted","Data":"687a8fe89710a61caa65a28a011d5aeeb514712cd17891ff8d66c8f44957dd0f"} Sep 30 07:49:30 crc kubenswrapper[4760]: I0930 07:49:30.075129 4760 generic.go:334] "Generic (PLEG): container finished" podID="b3bc4bc8-f5fa-4aa1-b24a-e742458d48ab" containerID="8515edbb2c907995012d5ce121ed488b6c2903314ed7219188998e1e84ed4f54" exitCode=0 Sep 30 07:49:30 crc kubenswrapper[4760]: I0930 07:49:30.075624 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-fkqrw" event={"ID":"b3bc4bc8-f5fa-4aa1-b24a-e742458d48ab","Type":"ContainerDied","Data":"8515edbb2c907995012d5ce121ed488b6c2903314ed7219188998e1e84ed4f54"} Sep 30 07:49:30 crc kubenswrapper[4760]: I0930 07:49:30.107793 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-whgw2\" (UniqueName: \"kubernetes.io/projected/636599f2-9be2-4380-b923-0f3b3b77e39b-kube-api-access-whgw2\") pod \"watcher-db-create-l747q\" (UID: \"636599f2-9be2-4380-b923-0f3b3b77e39b\") " pod="openstack/watcher-db-create-l747q" Sep 30 07:49:30 crc kubenswrapper[4760]: I0930 07:49:30.145431 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-whgw2\" (UniqueName: \"kubernetes.io/projected/636599f2-9be2-4380-b923-0f3b3b77e39b-kube-api-access-whgw2\") pod 
\"watcher-db-create-l747q\" (UID: \"636599f2-9be2-4380-b923-0f3b3b77e39b\") " pod="openstack/watcher-db-create-l747q" Sep 30 07:49:30 crc kubenswrapper[4760]: I0930 07:49:30.173278 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-b8fbc5445-mgg5f" Sep 30 07:49:30 crc kubenswrapper[4760]: I0930 07:49:30.200254 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-db-create-l747q" Sep 30 07:49:30 crc kubenswrapper[4760]: I0930 07:49:30.231456 4760 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-9fxn6"] Sep 30 07:49:30 crc kubenswrapper[4760]: I0930 07:49:30.231728 4760 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-57d769cc4f-9fxn6" podUID="d6258f1d-30e6-4da1-b567-e37509041f6f" containerName="dnsmasq-dns" containerID="cri-o://ff7c0164e34617614a75b6a09d049e542f7bbaf9f3e1fc30287f9c2bfb0ea044" gracePeriod=10 Sep 30 07:49:30 crc kubenswrapper[4760]: I0930 07:49:30.849987 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-db-create-l747q"] Sep 30 07:49:30 crc kubenswrapper[4760]: W0930 07:49:30.856445 4760 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod636599f2_9be2_4380_b923_0f3b3b77e39b.slice/crio-f6cba2d995d588dc06e0a2b41306e29982607e57354b5979d4de7f587b575e99 WatchSource:0}: Error finding container f6cba2d995d588dc06e0a2b41306e29982607e57354b5979d4de7f587b575e99: Status 404 returned error can't find the container with id f6cba2d995d588dc06e0a2b41306e29982607e57354b5979d4de7f587b575e99 Sep 30 07:49:30 crc kubenswrapper[4760]: I0930 07:49:30.934154 4760 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-9fxn6" Sep 30 07:49:31 crc kubenswrapper[4760]: I0930 07:49:31.047189 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d6258f1d-30e6-4da1-b567-e37509041f6f-config\") pod \"d6258f1d-30e6-4da1-b567-e37509041f6f\" (UID: \"d6258f1d-30e6-4da1-b567-e37509041f6f\") " Sep 30 07:49:31 crc kubenswrapper[4760]: I0930 07:49:31.047254 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d6258f1d-30e6-4da1-b567-e37509041f6f-dns-svc\") pod \"d6258f1d-30e6-4da1-b567-e37509041f6f\" (UID: \"d6258f1d-30e6-4da1-b567-e37509041f6f\") " Sep 30 07:49:31 crc kubenswrapper[4760]: I0930 07:49:31.047332 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lhlxx\" (UniqueName: \"kubernetes.io/projected/d6258f1d-30e6-4da1-b567-e37509041f6f-kube-api-access-lhlxx\") pod \"d6258f1d-30e6-4da1-b567-e37509041f6f\" (UID: \"d6258f1d-30e6-4da1-b567-e37509041f6f\") " Sep 30 07:49:31 crc kubenswrapper[4760]: I0930 07:49:31.080868 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d6258f1d-30e6-4da1-b567-e37509041f6f-kube-api-access-lhlxx" (OuterVolumeSpecName: "kube-api-access-lhlxx") pod "d6258f1d-30e6-4da1-b567-e37509041f6f" (UID: "d6258f1d-30e6-4da1-b567-e37509041f6f"). InnerVolumeSpecName "kube-api-access-lhlxx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 07:49:31 crc kubenswrapper[4760]: I0930 07:49:31.107872 4760 generic.go:334] "Generic (PLEG): container finished" podID="d6258f1d-30e6-4da1-b567-e37509041f6f" containerID="ff7c0164e34617614a75b6a09d049e542f7bbaf9f3e1fc30287f9c2bfb0ea044" exitCode=0 Sep 30 07:49:31 crc kubenswrapper[4760]: I0930 07:49:31.108012 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-9fxn6" event={"ID":"d6258f1d-30e6-4da1-b567-e37509041f6f","Type":"ContainerDied","Data":"ff7c0164e34617614a75b6a09d049e542f7bbaf9f3e1fc30287f9c2bfb0ea044"} Sep 30 07:49:31 crc kubenswrapper[4760]: I0930 07:49:31.108016 4760 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-9fxn6" Sep 30 07:49:31 crc kubenswrapper[4760]: I0930 07:49:31.108066 4760 scope.go:117] "RemoveContainer" containerID="ff7c0164e34617614a75b6a09d049e542f7bbaf9f3e1fc30287f9c2bfb0ea044" Sep 30 07:49:31 crc kubenswrapper[4760]: I0930 07:49:31.108049 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-9fxn6" event={"ID":"d6258f1d-30e6-4da1-b567-e37509041f6f","Type":"ContainerDied","Data":"cd9fab081c150991be83f649bf7d0c1dde39f44100fbc7d5e5eaa469eed638c8"} Sep 30 07:49:31 crc kubenswrapper[4760]: I0930 07:49:31.109641 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-db-create-l747q" event={"ID":"636599f2-9be2-4380-b923-0f3b3b77e39b","Type":"ContainerStarted","Data":"f6cba2d995d588dc06e0a2b41306e29982607e57354b5979d4de7f587b575e99"} Sep 30 07:49:31 crc kubenswrapper[4760]: I0930 07:49:31.141822 4760 scope.go:117] "RemoveContainer" containerID="68e5cf3102c89318e7520f6ebb8202a77c277703d76133534cc202e1d139e965" Sep 30 07:49:31 crc kubenswrapper[4760]: I0930 07:49:31.142696 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/d6258f1d-30e6-4da1-b567-e37509041f6f-config" (OuterVolumeSpecName: "config") pod "d6258f1d-30e6-4da1-b567-e37509041f6f" (UID: "d6258f1d-30e6-4da1-b567-e37509041f6f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 07:49:31 crc kubenswrapper[4760]: I0930 07:49:31.144499 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d6258f1d-30e6-4da1-b567-e37509041f6f-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "d6258f1d-30e6-4da1-b567-e37509041f6f" (UID: "d6258f1d-30e6-4da1-b567-e37509041f6f"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 07:49:31 crc kubenswrapper[4760]: I0930 07:49:31.148936 4760 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d6258f1d-30e6-4da1-b567-e37509041f6f-config\") on node \"crc\" DevicePath \"\"" Sep 30 07:49:31 crc kubenswrapper[4760]: I0930 07:49:31.149132 4760 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d6258f1d-30e6-4da1-b567-e37509041f6f-dns-svc\") on node \"crc\" DevicePath \"\"" Sep 30 07:49:31 crc kubenswrapper[4760]: I0930 07:49:31.149257 4760 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lhlxx\" (UniqueName: \"kubernetes.io/projected/d6258f1d-30e6-4da1-b567-e37509041f6f-kube-api-access-lhlxx\") on node \"crc\" DevicePath \"\"" Sep 30 07:49:31 crc kubenswrapper[4760]: I0930 07:49:31.171447 4760 scope.go:117] "RemoveContainer" containerID="ff7c0164e34617614a75b6a09d049e542f7bbaf9f3e1fc30287f9c2bfb0ea044" Sep 30 07:49:31 crc kubenswrapper[4760]: E0930 07:49:31.171958 4760 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ff7c0164e34617614a75b6a09d049e542f7bbaf9f3e1fc30287f9c2bfb0ea044\": container with ID starting with 
ff7c0164e34617614a75b6a09d049e542f7bbaf9f3e1fc30287f9c2bfb0ea044 not found: ID does not exist" containerID="ff7c0164e34617614a75b6a09d049e542f7bbaf9f3e1fc30287f9c2bfb0ea044" Sep 30 07:49:31 crc kubenswrapper[4760]: I0930 07:49:31.171994 4760 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ff7c0164e34617614a75b6a09d049e542f7bbaf9f3e1fc30287f9c2bfb0ea044"} err="failed to get container status \"ff7c0164e34617614a75b6a09d049e542f7bbaf9f3e1fc30287f9c2bfb0ea044\": rpc error: code = NotFound desc = could not find container \"ff7c0164e34617614a75b6a09d049e542f7bbaf9f3e1fc30287f9c2bfb0ea044\": container with ID starting with ff7c0164e34617614a75b6a09d049e542f7bbaf9f3e1fc30287f9c2bfb0ea044 not found: ID does not exist" Sep 30 07:49:31 crc kubenswrapper[4760]: I0930 07:49:31.172021 4760 scope.go:117] "RemoveContainer" containerID="68e5cf3102c89318e7520f6ebb8202a77c277703d76133534cc202e1d139e965" Sep 30 07:49:31 crc kubenswrapper[4760]: E0930 07:49:31.172971 4760 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"68e5cf3102c89318e7520f6ebb8202a77c277703d76133534cc202e1d139e965\": container with ID starting with 68e5cf3102c89318e7520f6ebb8202a77c277703d76133534cc202e1d139e965 not found: ID does not exist" containerID="68e5cf3102c89318e7520f6ebb8202a77c277703d76133534cc202e1d139e965" Sep 30 07:49:31 crc kubenswrapper[4760]: I0930 07:49:31.173036 4760 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"68e5cf3102c89318e7520f6ebb8202a77c277703d76133534cc202e1d139e965"} err="failed to get container status \"68e5cf3102c89318e7520f6ebb8202a77c277703d76133534cc202e1d139e965\": rpc error: code = NotFound desc = could not find container \"68e5cf3102c89318e7520f6ebb8202a77c277703d76133534cc202e1d139e965\": container with ID starting with 68e5cf3102c89318e7520f6ebb8202a77c277703d76133534cc202e1d139e965 not found: ID does not 
exist" Sep 30 07:49:31 crc kubenswrapper[4760]: I0930 07:49:31.454401 4760 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-9fxn6"] Sep 30 07:49:31 crc kubenswrapper[4760]: I0930 07:49:31.477654 4760 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-9fxn6"] Sep 30 07:49:31 crc kubenswrapper[4760]: I0930 07:49:31.622316 4760 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-zvwbw" Sep 30 07:49:31 crc kubenswrapper[4760]: I0930 07:49:31.701157 4760 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-fkqrw" Sep 30 07:49:31 crc kubenswrapper[4760]: I0930 07:49:31.703222 4760 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-xsqh2" Sep 30 07:49:31 crc kubenswrapper[4760]: I0930 07:49:31.760171 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d2c8l\" (UniqueName: \"kubernetes.io/projected/60c66a34-554e-4655-8da8-e47e3e10a521-kube-api-access-d2c8l\") pod \"60c66a34-554e-4655-8da8-e47e3e10a521\" (UID: \"60c66a34-554e-4655-8da8-e47e3e10a521\") " Sep 30 07:49:31 crc kubenswrapper[4760]: I0930 07:49:31.765844 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/60c66a34-554e-4655-8da8-e47e3e10a521-kube-api-access-d2c8l" (OuterVolumeSpecName: "kube-api-access-d2c8l") pod "60c66a34-554e-4655-8da8-e47e3e10a521" (UID: "60c66a34-554e-4655-8da8-e47e3e10a521"). InnerVolumeSpecName "kube-api-access-d2c8l". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 07:49:31 crc kubenswrapper[4760]: I0930 07:49:31.862331 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-clctp\" (UniqueName: \"kubernetes.io/projected/b3bc4bc8-f5fa-4aa1-b24a-e742458d48ab-kube-api-access-clctp\") pod \"b3bc4bc8-f5fa-4aa1-b24a-e742458d48ab\" (UID: \"b3bc4bc8-f5fa-4aa1-b24a-e742458d48ab\") " Sep 30 07:49:31 crc kubenswrapper[4760]: I0930 07:49:31.862406 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-82r8p\" (UniqueName: \"kubernetes.io/projected/c941e149-5425-4ef2-9920-a7c6797230be-kube-api-access-82r8p\") pod \"c941e149-5425-4ef2-9920-a7c6797230be\" (UID: \"c941e149-5425-4ef2-9920-a7c6797230be\") " Sep 30 07:49:31 crc kubenswrapper[4760]: I0930 07:49:31.862922 4760 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d2c8l\" (UniqueName: \"kubernetes.io/projected/60c66a34-554e-4655-8da8-e47e3e10a521-kube-api-access-d2c8l\") on node \"crc\" DevicePath \"\"" Sep 30 07:49:31 crc kubenswrapper[4760]: I0930 07:49:31.865418 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b3bc4bc8-f5fa-4aa1-b24a-e742458d48ab-kube-api-access-clctp" (OuterVolumeSpecName: "kube-api-access-clctp") pod "b3bc4bc8-f5fa-4aa1-b24a-e742458d48ab" (UID: "b3bc4bc8-f5fa-4aa1-b24a-e742458d48ab"). InnerVolumeSpecName "kube-api-access-clctp". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 07:49:31 crc kubenswrapper[4760]: I0930 07:49:31.866045 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c941e149-5425-4ef2-9920-a7c6797230be-kube-api-access-82r8p" (OuterVolumeSpecName: "kube-api-access-82r8p") pod "c941e149-5425-4ef2-9920-a7c6797230be" (UID: "c941e149-5425-4ef2-9920-a7c6797230be"). InnerVolumeSpecName "kube-api-access-82r8p". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 07:49:31 crc kubenswrapper[4760]: I0930 07:49:31.930127 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-northd-0" Sep 30 07:49:31 crc kubenswrapper[4760]: I0930 07:49:31.964537 4760 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-82r8p\" (UniqueName: \"kubernetes.io/projected/c941e149-5425-4ef2-9920-a7c6797230be-kube-api-access-82r8p\") on node \"crc\" DevicePath \"\"" Sep 30 07:49:31 crc kubenswrapper[4760]: I0930 07:49:31.964568 4760 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-clctp\" (UniqueName: \"kubernetes.io/projected/b3bc4bc8-f5fa-4aa1-b24a-e742458d48ab-kube-api-access-clctp\") on node \"crc\" DevicePath \"\"" Sep 30 07:49:32 crc kubenswrapper[4760]: I0930 07:49:32.118800 4760 generic.go:334] "Generic (PLEG): container finished" podID="636599f2-9be2-4380-b923-0f3b3b77e39b" containerID="1684d34342baae3b11898ef8dd238f02ca25d71d91750bcf9b03335d830dfc10" exitCode=0 Sep 30 07:49:32 crc kubenswrapper[4760]: I0930 07:49:32.118862 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-db-create-l747q" event={"ID":"636599f2-9be2-4380-b923-0f3b3b77e39b","Type":"ContainerDied","Data":"1684d34342baae3b11898ef8dd238f02ca25d71d91750bcf9b03335d830dfc10"} Sep 30 07:49:32 crc kubenswrapper[4760]: I0930 07:49:32.124599 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"007888b6-d5c6-410a-955a-ed78adf759bd","Type":"ContainerStarted","Data":"a2cea33f1b45cef9414324dcb379d8628760022a434c051e678c5552ac3a7b76"} Sep 30 07:49:32 crc kubenswrapper[4760]: I0930 07:49:32.126384 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-xsqh2" event={"ID":"c941e149-5425-4ef2-9920-a7c6797230be","Type":"ContainerDied","Data":"0ad96a69fc89dfa21eabcf7351ac70ecfe97c04e4c95ea0daa274afa01efdc21"} Sep 30 07:49:32 crc 
kubenswrapper[4760]: I0930 07:49:32.126412 4760 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0ad96a69fc89dfa21eabcf7351ac70ecfe97c04e4c95ea0daa274afa01efdc21" Sep 30 07:49:32 crc kubenswrapper[4760]: I0930 07:49:32.126462 4760 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-xsqh2" Sep 30 07:49:32 crc kubenswrapper[4760]: I0930 07:49:32.134153 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-zvwbw" event={"ID":"60c66a34-554e-4655-8da8-e47e3e10a521","Type":"ContainerDied","Data":"687a8fe89710a61caa65a28a011d5aeeb514712cd17891ff8d66c8f44957dd0f"} Sep 30 07:49:32 crc kubenswrapper[4760]: I0930 07:49:32.134198 4760 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="687a8fe89710a61caa65a28a011d5aeeb514712cd17891ff8d66c8f44957dd0f" Sep 30 07:49:32 crc kubenswrapper[4760]: I0930 07:49:32.134213 4760 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-zvwbw" Sep 30 07:49:32 crc kubenswrapper[4760]: I0930 07:49:32.136778 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-fkqrw" event={"ID":"b3bc4bc8-f5fa-4aa1-b24a-e742458d48ab","Type":"ContainerDied","Data":"812b762fd38c7f9e6a6f1792199e0863c35a5ecfd39719fda9d0afebc5314a83"} Sep 30 07:49:32 crc kubenswrapper[4760]: I0930 07:49:32.136801 4760 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="812b762fd38c7f9e6a6f1792199e0863c35a5ecfd39719fda9d0afebc5314a83" Sep 30 07:49:32 crc kubenswrapper[4760]: I0930 07:49:32.136860 4760 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-create-fkqrw" Sep 30 07:49:33 crc kubenswrapper[4760]: I0930 07:49:33.081024 4760 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d6258f1d-30e6-4da1-b567-e37509041f6f" path="/var/lib/kubelet/pods/d6258f1d-30e6-4da1-b567-e37509041f6f/volumes" Sep 30 07:49:34 crc kubenswrapper[4760]: I0930 07:49:34.277552 4760 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-db-create-l747q" Sep 30 07:49:34 crc kubenswrapper[4760]: I0930 07:49:34.423164 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-whgw2\" (UniqueName: \"kubernetes.io/projected/636599f2-9be2-4380-b923-0f3b3b77e39b-kube-api-access-whgw2\") pod \"636599f2-9be2-4380-b923-0f3b3b77e39b\" (UID: \"636599f2-9be2-4380-b923-0f3b3b77e39b\") " Sep 30 07:49:34 crc kubenswrapper[4760]: I0930 07:49:34.429169 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/636599f2-9be2-4380-b923-0f3b3b77e39b-kube-api-access-whgw2" (OuterVolumeSpecName: "kube-api-access-whgw2") pod "636599f2-9be2-4380-b923-0f3b3b77e39b" (UID: "636599f2-9be2-4380-b923-0f3b3b77e39b"). InnerVolumeSpecName "kube-api-access-whgw2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 07:49:34 crc kubenswrapper[4760]: I0930 07:49:34.524970 4760 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-whgw2\" (UniqueName: \"kubernetes.io/projected/636599f2-9be2-4380-b923-0f3b3b77e39b-kube-api-access-whgw2\") on node \"crc\" DevicePath \"\"" Sep 30 07:49:35 crc kubenswrapper[4760]: I0930 07:49:35.164155 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-db-create-l747q" event={"ID":"636599f2-9be2-4380-b923-0f3b3b77e39b","Type":"ContainerDied","Data":"f6cba2d995d588dc06e0a2b41306e29982607e57354b5979d4de7f587b575e99"} Sep 30 07:49:35 crc kubenswrapper[4760]: I0930 07:49:35.164683 4760 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f6cba2d995d588dc06e0a2b41306e29982607e57354b5979d4de7f587b575e99" Sep 30 07:49:35 crc kubenswrapper[4760]: I0930 07:49:35.164177 4760 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-db-create-l747q" Sep 30 07:49:35 crc kubenswrapper[4760]: I0930 07:49:35.167952 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"007888b6-d5c6-410a-955a-ed78adf759bd","Type":"ContainerStarted","Data":"c903e52356aaa141261bf2b36b8b1e590d0f7e4e8cd06608287d1235be886d86"} Sep 30 07:49:35 crc kubenswrapper[4760]: I0930 07:49:35.188432 4760 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/prometheus-metric-storage-0" podStartSLOduration=12.6770291 podStartE2EDuration="46.188414144s" podCreationTimestamp="2025-09-30 07:48:49 +0000 UTC" firstStartedPulling="2025-09-30 07:49:00.778516237 +0000 UTC m=+926.421422649" lastFinishedPulling="2025-09-30 07:49:34.289901281 +0000 UTC m=+959.932807693" observedRunningTime="2025-09-30 07:49:35.188331682 +0000 UTC m=+960.831238094" watchObservedRunningTime="2025-09-30 07:49:35.188414144 +0000 UTC m=+960.831320556" Sep 30 07:49:36 
crc kubenswrapper[4760]: I0930 07:49:36.181753 4760 generic.go:334] "Generic (PLEG): container finished" podID="cfaf0e86-3b68-4e5c-8caf-c60518a28016" containerID="727446bebc692489557f3343f13a804eceef42682c5207b5e90b2de083680a8d" exitCode=0 Sep 30 07:49:36 crc kubenswrapper[4760]: I0930 07:49:36.181857 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-4rxf2" event={"ID":"cfaf0e86-3b68-4e5c-8caf-c60518a28016","Type":"ContainerDied","Data":"727446bebc692489557f3343f13a804eceef42682c5207b5e90b2de083680a8d"} Sep 30 07:49:36 crc kubenswrapper[4760]: I0930 07:49:36.204778 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/prometheus-metric-storage-0" Sep 30 07:49:36 crc kubenswrapper[4760]: I0930 07:49:36.204851 4760 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/prometheus-metric-storage-0" Sep 30 07:49:36 crc kubenswrapper[4760]: I0930 07:49:36.208343 4760 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/prometheus-metric-storage-0" Sep 30 07:49:37 crc kubenswrapper[4760]: I0930 07:49:37.078678 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/db4f0b34-3c4a-4c78-b284-5959e91b00c0-etc-swift\") pod \"swift-storage-0\" (UID: \"db4f0b34-3c4a-4c78-b284-5959e91b00c0\") " pod="openstack/swift-storage-0" Sep 30 07:49:37 crc kubenswrapper[4760]: I0930 07:49:37.088954 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/db4f0b34-3c4a-4c78-b284-5959e91b00c0-etc-swift\") pod \"swift-storage-0\" (UID: \"db4f0b34-3c4a-4c78-b284-5959e91b00c0\") " pod="openstack/swift-storage-0" Sep 30 07:49:37 crc kubenswrapper[4760]: I0930 07:49:37.192980 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/prometheus-metric-storage-0" Sep 30 07:49:37 crc 
kubenswrapper[4760]: I0930 07:49:37.341833 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-storage-0" Sep 30 07:49:37 crc kubenswrapper[4760]: I0930 07:49:37.644906 4760 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-4rxf2" Sep 30 07:49:37 crc kubenswrapper[4760]: I0930 07:49:37.791916 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/cfaf0e86-3b68-4e5c-8caf-c60518a28016-etc-swift\") pod \"cfaf0e86-3b68-4e5c-8caf-c60518a28016\" (UID: \"cfaf0e86-3b68-4e5c-8caf-c60518a28016\") " Sep 30 07:49:37 crc kubenswrapper[4760]: I0930 07:49:37.791986 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cfaf0e86-3b68-4e5c-8caf-c60518a28016-combined-ca-bundle\") pod \"cfaf0e86-3b68-4e5c-8caf-c60518a28016\" (UID: \"cfaf0e86-3b68-4e5c-8caf-c60518a28016\") " Sep 30 07:49:37 crc kubenswrapper[4760]: I0930 07:49:37.792075 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/cfaf0e86-3b68-4e5c-8caf-c60518a28016-ring-data-devices\") pod \"cfaf0e86-3b68-4e5c-8caf-c60518a28016\" (UID: \"cfaf0e86-3b68-4e5c-8caf-c60518a28016\") " Sep 30 07:49:37 crc kubenswrapper[4760]: I0930 07:49:37.792123 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/cfaf0e86-3b68-4e5c-8caf-c60518a28016-dispersionconf\") pod \"cfaf0e86-3b68-4e5c-8caf-c60518a28016\" (UID: \"cfaf0e86-3b68-4e5c-8caf-c60518a28016\") " Sep 30 07:49:37 crc kubenswrapper[4760]: I0930 07:49:37.792171 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/cfaf0e86-3b68-4e5c-8caf-c60518a28016-swiftconf\") 
pod \"cfaf0e86-3b68-4e5c-8caf-c60518a28016\" (UID: \"cfaf0e86-3b68-4e5c-8caf-c60518a28016\") " Sep 30 07:49:37 crc kubenswrapper[4760]: I0930 07:49:37.792725 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cfaf0e86-3b68-4e5c-8caf-c60518a28016-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "cfaf0e86-3b68-4e5c-8caf-c60518a28016" (UID: "cfaf0e86-3b68-4e5c-8caf-c60518a28016"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 07:49:37 crc kubenswrapper[4760]: I0930 07:49:37.793151 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cfaf0e86-3b68-4e5c-8caf-c60518a28016-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "cfaf0e86-3b68-4e5c-8caf-c60518a28016" (UID: "cfaf0e86-3b68-4e5c-8caf-c60518a28016"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 07:49:37 crc kubenswrapper[4760]: I0930 07:49:37.793210 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/cfaf0e86-3b68-4e5c-8caf-c60518a28016-scripts\") pod \"cfaf0e86-3b68-4e5c-8caf-c60518a28016\" (UID: \"cfaf0e86-3b68-4e5c-8caf-c60518a28016\") " Sep 30 07:49:37 crc kubenswrapper[4760]: I0930 07:49:37.793271 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jf9t9\" (UniqueName: \"kubernetes.io/projected/cfaf0e86-3b68-4e5c-8caf-c60518a28016-kube-api-access-jf9t9\") pod \"cfaf0e86-3b68-4e5c-8caf-c60518a28016\" (UID: \"cfaf0e86-3b68-4e5c-8caf-c60518a28016\") " Sep 30 07:49:37 crc kubenswrapper[4760]: I0930 07:49:37.793697 4760 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/cfaf0e86-3b68-4e5c-8caf-c60518a28016-etc-swift\") on node \"crc\" DevicePath \"\"" Sep 30 07:49:37 crc kubenswrapper[4760]: I0930 
07:49:37.793729 4760 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/cfaf0e86-3b68-4e5c-8caf-c60518a28016-ring-data-devices\") on node \"crc\" DevicePath \"\"" Sep 30 07:49:37 crc kubenswrapper[4760]: I0930 07:49:37.802249 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cfaf0e86-3b68-4e5c-8caf-c60518a28016-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "cfaf0e86-3b68-4e5c-8caf-c60518a28016" (UID: "cfaf0e86-3b68-4e5c-8caf-c60518a28016"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 07:49:37 crc kubenswrapper[4760]: I0930 07:49:37.803798 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cfaf0e86-3b68-4e5c-8caf-c60518a28016-kube-api-access-jf9t9" (OuterVolumeSpecName: "kube-api-access-jf9t9") pod "cfaf0e86-3b68-4e5c-8caf-c60518a28016" (UID: "cfaf0e86-3b68-4e5c-8caf-c60518a28016"). InnerVolumeSpecName "kube-api-access-jf9t9". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 07:49:37 crc kubenswrapper[4760]: I0930 07:49:37.818124 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cfaf0e86-3b68-4e5c-8caf-c60518a28016-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "cfaf0e86-3b68-4e5c-8caf-c60518a28016" (UID: "cfaf0e86-3b68-4e5c-8caf-c60518a28016"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 07:49:37 crc kubenswrapper[4760]: I0930 07:49:37.821648 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cfaf0e86-3b68-4e5c-8caf-c60518a28016-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "cfaf0e86-3b68-4e5c-8caf-c60518a28016" (UID: "cfaf0e86-3b68-4e5c-8caf-c60518a28016"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 07:49:37 crc kubenswrapper[4760]: I0930 07:49:37.826381 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cfaf0e86-3b68-4e5c-8caf-c60518a28016-scripts" (OuterVolumeSpecName: "scripts") pod "cfaf0e86-3b68-4e5c-8caf-c60518a28016" (UID: "cfaf0e86-3b68-4e5c-8caf-c60518a28016"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 07:49:37 crc kubenswrapper[4760]: I0930 07:49:37.894526 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-d462-account-create-d7bk9"] Sep 30 07:49:37 crc kubenswrapper[4760]: E0930 07:49:37.895119 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cfaf0e86-3b68-4e5c-8caf-c60518a28016" containerName="swift-ring-rebalance" Sep 30 07:49:37 crc kubenswrapper[4760]: I0930 07:49:37.895135 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="cfaf0e86-3b68-4e5c-8caf-c60518a28016" containerName="swift-ring-rebalance" Sep 30 07:49:37 crc kubenswrapper[4760]: E0930 07:49:37.895152 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="60c66a34-554e-4655-8da8-e47e3e10a521" containerName="mariadb-database-create" Sep 30 07:49:37 crc kubenswrapper[4760]: I0930 07:49:37.895159 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="60c66a34-554e-4655-8da8-e47e3e10a521" containerName="mariadb-database-create" Sep 30 07:49:37 crc kubenswrapper[4760]: E0930 07:49:37.895172 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d6258f1d-30e6-4da1-b567-e37509041f6f" containerName="init" Sep 30 07:49:37 crc kubenswrapper[4760]: I0930 07:49:37.895177 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="d6258f1d-30e6-4da1-b567-e37509041f6f" containerName="init" Sep 30 07:49:37 crc kubenswrapper[4760]: E0930 07:49:37.895192 4760 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="b3bc4bc8-f5fa-4aa1-b24a-e742458d48ab" containerName="mariadb-database-create" Sep 30 07:49:37 crc kubenswrapper[4760]: I0930 07:49:37.895198 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="b3bc4bc8-f5fa-4aa1-b24a-e742458d48ab" containerName="mariadb-database-create" Sep 30 07:49:37 crc kubenswrapper[4760]: E0930 07:49:37.895207 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="636599f2-9be2-4380-b923-0f3b3b77e39b" containerName="mariadb-database-create" Sep 30 07:49:37 crc kubenswrapper[4760]: I0930 07:49:37.895213 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="636599f2-9be2-4380-b923-0f3b3b77e39b" containerName="mariadb-database-create" Sep 30 07:49:37 crc kubenswrapper[4760]: E0930 07:49:37.895227 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d6258f1d-30e6-4da1-b567-e37509041f6f" containerName="dnsmasq-dns" Sep 30 07:49:37 crc kubenswrapper[4760]: I0930 07:49:37.895234 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="d6258f1d-30e6-4da1-b567-e37509041f6f" containerName="dnsmasq-dns" Sep 30 07:49:37 crc kubenswrapper[4760]: E0930 07:49:37.895244 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c941e149-5425-4ef2-9920-a7c6797230be" containerName="mariadb-database-create" Sep 30 07:49:37 crc kubenswrapper[4760]: I0930 07:49:37.895250 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="c941e149-5425-4ef2-9920-a7c6797230be" containerName="mariadb-database-create" Sep 30 07:49:37 crc kubenswrapper[4760]: I0930 07:49:37.895413 4760 memory_manager.go:354] "RemoveStaleState removing state" podUID="cfaf0e86-3b68-4e5c-8caf-c60518a28016" containerName="swift-ring-rebalance" Sep 30 07:49:37 crc kubenswrapper[4760]: I0930 07:49:37.895426 4760 memory_manager.go:354] "RemoveStaleState removing state" podUID="636599f2-9be2-4380-b923-0f3b3b77e39b" containerName="mariadb-database-create" Sep 30 07:49:37 crc kubenswrapper[4760]: I0930 07:49:37.895431 4760 
reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/cfaf0e86-3b68-4e5c-8caf-c60518a28016-scripts\") on node \"crc\" DevicePath \"\"" Sep 30 07:49:37 crc kubenswrapper[4760]: I0930 07:49:37.895473 4760 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jf9t9\" (UniqueName: \"kubernetes.io/projected/cfaf0e86-3b68-4e5c-8caf-c60518a28016-kube-api-access-jf9t9\") on node \"crc\" DevicePath \"\"" Sep 30 07:49:37 crc kubenswrapper[4760]: I0930 07:49:37.895485 4760 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cfaf0e86-3b68-4e5c-8caf-c60518a28016-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 07:49:37 crc kubenswrapper[4760]: I0930 07:49:37.895494 4760 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/cfaf0e86-3b68-4e5c-8caf-c60518a28016-dispersionconf\") on node \"crc\" DevicePath \"\"" Sep 30 07:49:37 crc kubenswrapper[4760]: I0930 07:49:37.895504 4760 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/cfaf0e86-3b68-4e5c-8caf-c60518a28016-swiftconf\") on node \"crc\" DevicePath \"\"" Sep 30 07:49:37 crc kubenswrapper[4760]: I0930 07:49:37.895452 4760 memory_manager.go:354] "RemoveStaleState removing state" podUID="60c66a34-554e-4655-8da8-e47e3e10a521" containerName="mariadb-database-create" Sep 30 07:49:37 crc kubenswrapper[4760]: I0930 07:49:37.895555 4760 memory_manager.go:354] "RemoveStaleState removing state" podUID="d6258f1d-30e6-4da1-b567-e37509041f6f" containerName="dnsmasq-dns" Sep 30 07:49:37 crc kubenswrapper[4760]: I0930 07:49:37.895595 4760 memory_manager.go:354] "RemoveStaleState removing state" podUID="c941e149-5425-4ef2-9920-a7c6797230be" containerName="mariadb-database-create" Sep 30 07:49:37 crc kubenswrapper[4760]: I0930 07:49:37.895620 4760 memory_manager.go:354] "RemoveStaleState removing 
state" podUID="b3bc4bc8-f5fa-4aa1-b24a-e742458d48ab" containerName="mariadb-database-create" Sep 30 07:49:37 crc kubenswrapper[4760]: I0930 07:49:37.896283 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-d462-account-create-d7bk9" Sep 30 07:49:37 crc kubenswrapper[4760]: I0930 07:49:37.902899 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-db-secret" Sep 30 07:49:37 crc kubenswrapper[4760]: I0930 07:49:37.910887 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-d462-account-create-d7bk9"] Sep 30 07:49:37 crc kubenswrapper[4760]: I0930 07:49:37.996560 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4bjd8\" (UniqueName: \"kubernetes.io/projected/ba2f3da0-95aa-40f3-b3cf-1b3d3a1ee6b6-kube-api-access-4bjd8\") pod \"keystone-d462-account-create-d7bk9\" (UID: \"ba2f3da0-95aa-40f3-b3cf-1b3d3a1ee6b6\") " pod="openstack/keystone-d462-account-create-d7bk9" Sep 30 07:49:38 crc kubenswrapper[4760]: I0930 07:49:38.000019 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"] Sep 30 07:49:38 crc kubenswrapper[4760]: W0930 07:49:38.009055 4760 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddb4f0b34_3c4a_4c78_b284_5959e91b00c0.slice/crio-114a8f821aba32c806907d6e3710c30d3df7a92c7a88990dd1d470fe167f9896 WatchSource:0}: Error finding container 114a8f821aba32c806907d6e3710c30d3df7a92c7a88990dd1d470fe167f9896: Status 404 returned error can't find the container with id 114a8f821aba32c806907d6e3710c30d3df7a92c7a88990dd1d470fe167f9896 Sep 30 07:49:38 crc kubenswrapper[4760]: I0930 07:49:38.098193 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4bjd8\" (UniqueName: 
\"kubernetes.io/projected/ba2f3da0-95aa-40f3-b3cf-1b3d3a1ee6b6-kube-api-access-4bjd8\") pod \"keystone-d462-account-create-d7bk9\" (UID: \"ba2f3da0-95aa-40f3-b3cf-1b3d3a1ee6b6\") " pod="openstack/keystone-d462-account-create-d7bk9" Sep 30 07:49:38 crc kubenswrapper[4760]: I0930 07:49:38.120683 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4bjd8\" (UniqueName: \"kubernetes.io/projected/ba2f3da0-95aa-40f3-b3cf-1b3d3a1ee6b6-kube-api-access-4bjd8\") pod \"keystone-d462-account-create-d7bk9\" (UID: \"ba2f3da0-95aa-40f3-b3cf-1b3d3a1ee6b6\") " pod="openstack/keystone-d462-account-create-d7bk9" Sep 30 07:49:38 crc kubenswrapper[4760]: I0930 07:49:38.201881 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"db4f0b34-3c4a-4c78-b284-5959e91b00c0","Type":"ContainerStarted","Data":"114a8f821aba32c806907d6e3710c30d3df7a92c7a88990dd1d470fe167f9896"} Sep 30 07:49:38 crc kubenswrapper[4760]: I0930 07:49:38.204212 4760 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-4rxf2" Sep 30 07:49:38 crc kubenswrapper[4760]: I0930 07:49:38.204208 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-4rxf2" event={"ID":"cfaf0e86-3b68-4e5c-8caf-c60518a28016","Type":"ContainerDied","Data":"1a9fe96b94ff882a70c69f65fd9112134417c2162ed7da32f1f9d0442462a467"} Sep 30 07:49:38 crc kubenswrapper[4760]: I0930 07:49:38.204265 4760 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1a9fe96b94ff882a70c69f65fd9112134417c2162ed7da32f1f9d0442462a467" Sep 30 07:49:38 crc kubenswrapper[4760]: I0930 07:49:38.214627 4760 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-d462-account-create-d7bk9" Sep 30 07:49:38 crc kubenswrapper[4760]: I0930 07:49:38.251115 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-3371-account-create-dhxdh"] Sep 30 07:49:38 crc kubenswrapper[4760]: I0930 07:49:38.254063 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-3371-account-create-dhxdh" Sep 30 07:49:38 crc kubenswrapper[4760]: I0930 07:49:38.257323 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-db-secret" Sep 30 07:49:38 crc kubenswrapper[4760]: I0930 07:49:38.293414 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-3371-account-create-dhxdh"] Sep 30 07:49:38 crc kubenswrapper[4760]: I0930 07:49:38.417332 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-trcb8\" (UniqueName: \"kubernetes.io/projected/dc5900f2-47ff-45f3-870e-5aff13eeb14f-kube-api-access-trcb8\") pod \"placement-3371-account-create-dhxdh\" (UID: \"dc5900f2-47ff-45f3-870e-5aff13eeb14f\") " pod="openstack/placement-3371-account-create-dhxdh" Sep 30 07:49:38 crc kubenswrapper[4760]: I0930 07:49:38.448227 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-8f95-account-create-zwzdl"] Sep 30 07:49:38 crc kubenswrapper[4760]: I0930 07:49:38.450088 4760 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-8f95-account-create-zwzdl" Sep 30 07:49:38 crc kubenswrapper[4760]: I0930 07:49:38.453737 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-db-secret" Sep 30 07:49:38 crc kubenswrapper[4760]: I0930 07:49:38.454214 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-8f95-account-create-zwzdl"] Sep 30 07:49:38 crc kubenswrapper[4760]: I0930 07:49:38.521693 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-trcb8\" (UniqueName: \"kubernetes.io/projected/dc5900f2-47ff-45f3-870e-5aff13eeb14f-kube-api-access-trcb8\") pod \"placement-3371-account-create-dhxdh\" (UID: \"dc5900f2-47ff-45f3-870e-5aff13eeb14f\") " pod="openstack/placement-3371-account-create-dhxdh" Sep 30 07:49:38 crc kubenswrapper[4760]: I0930 07:49:38.543062 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-trcb8\" (UniqueName: \"kubernetes.io/projected/dc5900f2-47ff-45f3-870e-5aff13eeb14f-kube-api-access-trcb8\") pod \"placement-3371-account-create-dhxdh\" (UID: \"dc5900f2-47ff-45f3-870e-5aff13eeb14f\") " pod="openstack/placement-3371-account-create-dhxdh" Sep 30 07:49:38 crc kubenswrapper[4760]: I0930 07:49:38.624102 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wg9nc\" (UniqueName: \"kubernetes.io/projected/225030d7-c116-4040-9b8f-69ad4d2e7a57-kube-api-access-wg9nc\") pod \"glance-8f95-account-create-zwzdl\" (UID: \"225030d7-c116-4040-9b8f-69ad4d2e7a57\") " pod="openstack/glance-8f95-account-create-zwzdl" Sep 30 07:49:38 crc kubenswrapper[4760]: I0930 07:49:38.641876 4760 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-3371-account-create-dhxdh" Sep 30 07:49:38 crc kubenswrapper[4760]: I0930 07:49:38.714978 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-d462-account-create-d7bk9"] Sep 30 07:49:38 crc kubenswrapper[4760]: I0930 07:49:38.725897 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wg9nc\" (UniqueName: \"kubernetes.io/projected/225030d7-c116-4040-9b8f-69ad4d2e7a57-kube-api-access-wg9nc\") pod \"glance-8f95-account-create-zwzdl\" (UID: \"225030d7-c116-4040-9b8f-69ad4d2e7a57\") " pod="openstack/glance-8f95-account-create-zwzdl" Sep 30 07:49:38 crc kubenswrapper[4760]: I0930 07:49:38.745280 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wg9nc\" (UniqueName: \"kubernetes.io/projected/225030d7-c116-4040-9b8f-69ad4d2e7a57-kube-api-access-wg9nc\") pod \"glance-8f95-account-create-zwzdl\" (UID: \"225030d7-c116-4040-9b8f-69ad4d2e7a57\") " pod="openstack/glance-8f95-account-create-zwzdl" Sep 30 07:49:38 crc kubenswrapper[4760]: I0930 07:49:38.771655 4760 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-8f95-account-create-zwzdl" Sep 30 07:49:39 crc kubenswrapper[4760]: I0930 07:49:39.125285 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-3371-account-create-dhxdh"] Sep 30 07:49:39 crc kubenswrapper[4760]: I0930 07:49:39.230814 4760 generic.go:334] "Generic (PLEG): container finished" podID="ba2f3da0-95aa-40f3-b3cf-1b3d3a1ee6b6" containerID="06a3614cc990478bd72bb96c0f784d252bbca1242a48e48f899d2babf860b744" exitCode=0 Sep 30 07:49:39 crc kubenswrapper[4760]: I0930 07:49:39.230856 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-d462-account-create-d7bk9" event={"ID":"ba2f3da0-95aa-40f3-b3cf-1b3d3a1ee6b6","Type":"ContainerDied","Data":"06a3614cc990478bd72bb96c0f784d252bbca1242a48e48f899d2babf860b744"} Sep 30 07:49:39 crc kubenswrapper[4760]: I0930 07:49:39.230881 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-d462-account-create-d7bk9" event={"ID":"ba2f3da0-95aa-40f3-b3cf-1b3d3a1ee6b6","Type":"ContainerStarted","Data":"9edebd48d167fc0ba535cecb70bd6c97bc62933150327ec579c5a864893244b5"} Sep 30 07:49:39 crc kubenswrapper[4760]: I0930 07:49:39.265701 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-8f95-account-create-zwzdl"] Sep 30 07:49:39 crc kubenswrapper[4760]: I0930 07:49:39.728001 4760 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/prometheus-metric-storage-0"] Sep 30 07:49:39 crc kubenswrapper[4760]: I0930 07:49:39.728532 4760 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/prometheus-metric-storage-0" podUID="007888b6-d5c6-410a-955a-ed78adf759bd" containerName="prometheus" containerID="cri-o://50b776945742176dfd29a22633c991f8a4cdffc58f622d83331aa0413823ef59" gracePeriod=600 Sep 30 07:49:39 crc kubenswrapper[4760]: I0930 07:49:39.728644 4760 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openstack/prometheus-metric-storage-0" podUID="007888b6-d5c6-410a-955a-ed78adf759bd" containerName="config-reloader" containerID="cri-o://a2cea33f1b45cef9414324dcb379d8628760022a434c051e678c5552ac3a7b76" gracePeriod=600 Sep 30 07:49:39 crc kubenswrapper[4760]: I0930 07:49:39.728869 4760 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/prometheus-metric-storage-0" podUID="007888b6-d5c6-410a-955a-ed78adf759bd" containerName="thanos-sidecar" containerID="cri-o://c903e52356aaa141261bf2b36b8b1e590d0f7e4e8cd06608287d1235be886d86" gracePeriod=600 Sep 30 07:49:40 crc kubenswrapper[4760]: I0930 07:49:40.244237 4760 generic.go:334] "Generic (PLEG): container finished" podID="007888b6-d5c6-410a-955a-ed78adf759bd" containerID="c903e52356aaa141261bf2b36b8b1e590d0f7e4e8cd06608287d1235be886d86" exitCode=0 Sep 30 07:49:40 crc kubenswrapper[4760]: I0930 07:49:40.244264 4760 generic.go:334] "Generic (PLEG): container finished" podID="007888b6-d5c6-410a-955a-ed78adf759bd" containerID="a2cea33f1b45cef9414324dcb379d8628760022a434c051e678c5552ac3a7b76" exitCode=0 Sep 30 07:49:40 crc kubenswrapper[4760]: I0930 07:49:40.244274 4760 generic.go:334] "Generic (PLEG): container finished" podID="007888b6-d5c6-410a-955a-ed78adf759bd" containerID="50b776945742176dfd29a22633c991f8a4cdffc58f622d83331aa0413823ef59" exitCode=0 Sep 30 07:49:40 crc kubenswrapper[4760]: I0930 07:49:40.244347 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"007888b6-d5c6-410a-955a-ed78adf759bd","Type":"ContainerDied","Data":"c903e52356aaa141261bf2b36b8b1e590d0f7e4e8cd06608287d1235be886d86"} Sep 30 07:49:40 crc kubenswrapper[4760]: I0930 07:49:40.244397 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"007888b6-d5c6-410a-955a-ed78adf759bd","Type":"ContainerDied","Data":"a2cea33f1b45cef9414324dcb379d8628760022a434c051e678c5552ac3a7b76"} Sep 30 07:49:40 crc 
kubenswrapper[4760]: I0930 07:49:40.244418 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"007888b6-d5c6-410a-955a-ed78adf759bd","Type":"ContainerDied","Data":"50b776945742176dfd29a22633c991f8a4cdffc58f622d83331aa0413823ef59"} Sep 30 07:49:40 crc kubenswrapper[4760]: I0930 07:49:40.246038 4760 generic.go:334] "Generic (PLEG): container finished" podID="dc5900f2-47ff-45f3-870e-5aff13eeb14f" containerID="78af48d95aa791006f27e6464dbb8f326f3071dc5a67d47fd213d189043f1128" exitCode=0 Sep 30 07:49:40 crc kubenswrapper[4760]: I0930 07:49:40.246072 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-3371-account-create-dhxdh" event={"ID":"dc5900f2-47ff-45f3-870e-5aff13eeb14f","Type":"ContainerDied","Data":"78af48d95aa791006f27e6464dbb8f326f3071dc5a67d47fd213d189043f1128"} Sep 30 07:49:40 crc kubenswrapper[4760]: I0930 07:49:40.246111 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-3371-account-create-dhxdh" event={"ID":"dc5900f2-47ff-45f3-870e-5aff13eeb14f","Type":"ContainerStarted","Data":"82ac8b4b1421d8e077a0b81e00bef8100b479aefbe568c0d88801b1dea01082d"} Sep 30 07:49:40 crc kubenswrapper[4760]: I0930 07:49:40.249405 4760 generic.go:334] "Generic (PLEG): container finished" podID="225030d7-c116-4040-9b8f-69ad4d2e7a57" containerID="7e853395a6f92a9d0cf4dcbc0d905e87c7273008f6710570a6fc90f693e13d50" exitCode=0 Sep 30 07:49:40 crc kubenswrapper[4760]: I0930 07:49:40.249468 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-8f95-account-create-zwzdl" event={"ID":"225030d7-c116-4040-9b8f-69ad4d2e7a57","Type":"ContainerDied","Data":"7e853395a6f92a9d0cf4dcbc0d905e87c7273008f6710570a6fc90f693e13d50"} Sep 30 07:49:40 crc kubenswrapper[4760]: I0930 07:49:40.249488 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-8f95-account-create-zwzdl" 
event={"ID":"225030d7-c116-4040-9b8f-69ad4d2e7a57","Type":"ContainerStarted","Data":"22902b55af003ae5abdf5aa80283cf86c7583f99f7b80763c8571ea735417ada"} Sep 30 07:49:40 crc kubenswrapper[4760]: I0930 07:49:40.251898 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"db4f0b34-3c4a-4c78-b284-5959e91b00c0","Type":"ContainerStarted","Data":"c0665d8ce5177a314c86c396963cba0b3e7c882119824db997e0abe7e4fa1021"} Sep 30 07:49:40 crc kubenswrapper[4760]: I0930 07:49:40.251944 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"db4f0b34-3c4a-4c78-b284-5959e91b00c0","Type":"ContainerStarted","Data":"1cd15d301cc35b2feb19f51852c37585a38fda370a8c3a89cd330e60f95065cc"} Sep 30 07:49:40 crc kubenswrapper[4760]: I0930 07:49:40.724210 4760 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-d462-account-create-d7bk9" Sep 30 07:49:40 crc kubenswrapper[4760]: I0930 07:49:40.754088 4760 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/prometheus-metric-storage-0" Sep 30 07:49:40 crc kubenswrapper[4760]: I0930 07:49:40.859859 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/007888b6-d5c6-410a-955a-ed78adf759bd-prometheus-metric-storage-rulefiles-0\") pod \"007888b6-d5c6-410a-955a-ed78adf759bd\" (UID: \"007888b6-d5c6-410a-955a-ed78adf759bd\") " Sep 30 07:49:40 crc kubenswrapper[4760]: I0930 07:49:40.860002 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"prometheus-metric-storage-db\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-09238572-8d9b-4684-8ed8-661e43a35d9a\") pod \"007888b6-d5c6-410a-955a-ed78adf759bd\" (UID: \"007888b6-d5c6-410a-955a-ed78adf759bd\") " Sep 30 07:49:40 crc kubenswrapper[4760]: I0930 07:49:40.860031 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/007888b6-d5c6-410a-955a-ed78adf759bd-tls-assets\") pod \"007888b6-d5c6-410a-955a-ed78adf759bd\" (UID: \"007888b6-d5c6-410a-955a-ed78adf759bd\") " Sep 30 07:49:40 crc kubenswrapper[4760]: I0930 07:49:40.860055 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4bjd8\" (UniqueName: \"kubernetes.io/projected/ba2f3da0-95aa-40f3-b3cf-1b3d3a1ee6b6-kube-api-access-4bjd8\") pod \"ba2f3da0-95aa-40f3-b3cf-1b3d3a1ee6b6\" (UID: \"ba2f3da0-95aa-40f3-b3cf-1b3d3a1ee6b6\") " Sep 30 07:49:40 crc kubenswrapper[4760]: I0930 07:49:40.860111 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/007888b6-d5c6-410a-955a-ed78adf759bd-config-out\") pod \"007888b6-d5c6-410a-955a-ed78adf759bd\" (UID: \"007888b6-d5c6-410a-955a-ed78adf759bd\") " Sep 30 07:49:40 crc kubenswrapper[4760]: I0930 07:49:40.860135 4760 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/007888b6-d5c6-410a-955a-ed78adf759bd-thanos-prometheus-http-client-file\") pod \"007888b6-d5c6-410a-955a-ed78adf759bd\" (UID: \"007888b6-d5c6-410a-955a-ed78adf759bd\") " Sep 30 07:49:40 crc kubenswrapper[4760]: I0930 07:49:40.860160 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/007888b6-d5c6-410a-955a-ed78adf759bd-config\") pod \"007888b6-d5c6-410a-955a-ed78adf759bd\" (UID: \"007888b6-d5c6-410a-955a-ed78adf759bd\") " Sep 30 07:49:40 crc kubenswrapper[4760]: I0930 07:49:40.860200 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mzlmc\" (UniqueName: \"kubernetes.io/projected/007888b6-d5c6-410a-955a-ed78adf759bd-kube-api-access-mzlmc\") pod \"007888b6-d5c6-410a-955a-ed78adf759bd\" (UID: \"007888b6-d5c6-410a-955a-ed78adf759bd\") " Sep 30 07:49:40 crc kubenswrapper[4760]: I0930 07:49:40.860343 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/007888b6-d5c6-410a-955a-ed78adf759bd-web-config\") pod \"007888b6-d5c6-410a-955a-ed78adf759bd\" (UID: \"007888b6-d5c6-410a-955a-ed78adf759bd\") " Sep 30 07:49:40 crc kubenswrapper[4760]: I0930 07:49:40.861069 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/007888b6-d5c6-410a-955a-ed78adf759bd-prometheus-metric-storage-rulefiles-0" (OuterVolumeSpecName: "prometheus-metric-storage-rulefiles-0") pod "007888b6-d5c6-410a-955a-ed78adf759bd" (UID: "007888b6-d5c6-410a-955a-ed78adf759bd"). InnerVolumeSpecName "prometheus-metric-storage-rulefiles-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 07:49:40 crc kubenswrapper[4760]: I0930 07:49:40.866349 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/007888b6-d5c6-410a-955a-ed78adf759bd-thanos-prometheus-http-client-file" (OuterVolumeSpecName: "thanos-prometheus-http-client-file") pod "007888b6-d5c6-410a-955a-ed78adf759bd" (UID: "007888b6-d5c6-410a-955a-ed78adf759bd"). InnerVolumeSpecName "thanos-prometheus-http-client-file". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 07:49:40 crc kubenswrapper[4760]: I0930 07:49:40.866682 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/007888b6-d5c6-410a-955a-ed78adf759bd-kube-api-access-mzlmc" (OuterVolumeSpecName: "kube-api-access-mzlmc") pod "007888b6-d5c6-410a-955a-ed78adf759bd" (UID: "007888b6-d5c6-410a-955a-ed78adf759bd"). InnerVolumeSpecName "kube-api-access-mzlmc". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 07:49:40 crc kubenswrapper[4760]: I0930 07:49:40.866718 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/007888b6-d5c6-410a-955a-ed78adf759bd-config-out" (OuterVolumeSpecName: "config-out") pod "007888b6-d5c6-410a-955a-ed78adf759bd" (UID: "007888b6-d5c6-410a-955a-ed78adf759bd"). InnerVolumeSpecName "config-out". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 07:49:40 crc kubenswrapper[4760]: I0930 07:49:40.866920 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/007888b6-d5c6-410a-955a-ed78adf759bd-config" (OuterVolumeSpecName: "config") pod "007888b6-d5c6-410a-955a-ed78adf759bd" (UID: "007888b6-d5c6-410a-955a-ed78adf759bd"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 07:49:40 crc kubenswrapper[4760]: I0930 07:49:40.867218 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ba2f3da0-95aa-40f3-b3cf-1b3d3a1ee6b6-kube-api-access-4bjd8" (OuterVolumeSpecName: "kube-api-access-4bjd8") pod "ba2f3da0-95aa-40f3-b3cf-1b3d3a1ee6b6" (UID: "ba2f3da0-95aa-40f3-b3cf-1b3d3a1ee6b6"). InnerVolumeSpecName "kube-api-access-4bjd8". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 07:49:40 crc kubenswrapper[4760]: I0930 07:49:40.868174 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/007888b6-d5c6-410a-955a-ed78adf759bd-tls-assets" (OuterVolumeSpecName: "tls-assets") pod "007888b6-d5c6-410a-955a-ed78adf759bd" (UID: "007888b6-d5c6-410a-955a-ed78adf759bd"). InnerVolumeSpecName "tls-assets". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 07:49:40 crc kubenswrapper[4760]: I0930 07:49:40.877971 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-09238572-8d9b-4684-8ed8-661e43a35d9a" (OuterVolumeSpecName: "prometheus-metric-storage-db") pod "007888b6-d5c6-410a-955a-ed78adf759bd" (UID: "007888b6-d5c6-410a-955a-ed78adf759bd"). InnerVolumeSpecName "pvc-09238572-8d9b-4684-8ed8-661e43a35d9a". PluginName "kubernetes.io/csi", VolumeGidValue "" Sep 30 07:49:40 crc kubenswrapper[4760]: I0930 07:49:40.895770 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/007888b6-d5c6-410a-955a-ed78adf759bd-web-config" (OuterVolumeSpecName: "web-config") pod "007888b6-d5c6-410a-955a-ed78adf759bd" (UID: "007888b6-d5c6-410a-955a-ed78adf759bd"). InnerVolumeSpecName "web-config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 07:49:40 crc kubenswrapper[4760]: I0930 07:49:40.961933 4760 reconciler_common.go:293] "Volume detached for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/007888b6-d5c6-410a-955a-ed78adf759bd-web-config\") on node \"crc\" DevicePath \"\"" Sep 30 07:49:40 crc kubenswrapper[4760]: I0930 07:49:40.961974 4760 reconciler_common.go:293] "Volume detached for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/007888b6-d5c6-410a-955a-ed78adf759bd-prometheus-metric-storage-rulefiles-0\") on node \"crc\" DevicePath \"\"" Sep 30 07:49:40 crc kubenswrapper[4760]: I0930 07:49:40.962016 4760 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-09238572-8d9b-4684-8ed8-661e43a35d9a\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-09238572-8d9b-4684-8ed8-661e43a35d9a\") on node \"crc\" " Sep 30 07:49:40 crc kubenswrapper[4760]: I0930 07:49:40.962029 4760 reconciler_common.go:293] "Volume detached for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/007888b6-d5c6-410a-955a-ed78adf759bd-tls-assets\") on node \"crc\" DevicePath \"\"" Sep 30 07:49:40 crc kubenswrapper[4760]: I0930 07:49:40.962041 4760 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4bjd8\" (UniqueName: \"kubernetes.io/projected/ba2f3da0-95aa-40f3-b3cf-1b3d3a1ee6b6-kube-api-access-4bjd8\") on node \"crc\" DevicePath \"\"" Sep 30 07:49:40 crc kubenswrapper[4760]: I0930 07:49:40.962049 4760 reconciler_common.go:293] "Volume detached for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/007888b6-d5c6-410a-955a-ed78adf759bd-config-out\") on node \"crc\" DevicePath \"\"" Sep 30 07:49:40 crc kubenswrapper[4760]: I0930 07:49:40.962058 4760 reconciler_common.go:293] "Volume detached for volume \"thanos-prometheus-http-client-file\" (UniqueName: 
\"kubernetes.io/secret/007888b6-d5c6-410a-955a-ed78adf759bd-thanos-prometheus-http-client-file\") on node \"crc\" DevicePath \"\"" Sep 30 07:49:40 crc kubenswrapper[4760]: I0930 07:49:40.962068 4760 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/007888b6-d5c6-410a-955a-ed78adf759bd-config\") on node \"crc\" DevicePath \"\"" Sep 30 07:49:40 crc kubenswrapper[4760]: I0930 07:49:40.962076 4760 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mzlmc\" (UniqueName: \"kubernetes.io/projected/007888b6-d5c6-410a-955a-ed78adf759bd-kube-api-access-mzlmc\") on node \"crc\" DevicePath \"\"" Sep 30 07:49:40 crc kubenswrapper[4760]: I0930 07:49:40.982976 4760 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice... Sep 30 07:49:40 crc kubenswrapper[4760]: I0930 07:49:40.983380 4760 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-09238572-8d9b-4684-8ed8-661e43a35d9a" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-09238572-8d9b-4684-8ed8-661e43a35d9a") on node "crc" Sep 30 07:49:41 crc kubenswrapper[4760]: I0930 07:49:41.064129 4760 reconciler_common.go:293] "Volume detached for volume \"pvc-09238572-8d9b-4684-8ed8-661e43a35d9a\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-09238572-8d9b-4684-8ed8-661e43a35d9a\") on node \"crc\" DevicePath \"\"" Sep 30 07:49:41 crc kubenswrapper[4760]: I0930 07:49:41.276946 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"007888b6-d5c6-410a-955a-ed78adf759bd","Type":"ContainerDied","Data":"7b3066eb4677a9d3e7086f1578421e5208f678caca29d8e696ff03175933c15b"} Sep 30 07:49:41 crc kubenswrapper[4760]: I0930 07:49:41.277082 4760 scope.go:117] "RemoveContainer" containerID="c903e52356aaa141261bf2b36b8b1e590d0f7e4e8cd06608287d1235be886d86" Sep 30 07:49:41 crc kubenswrapper[4760]: 
I0930 07:49:41.278468 4760 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/prometheus-metric-storage-0" Sep 30 07:49:41 crc kubenswrapper[4760]: I0930 07:49:41.283675 4760 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-d462-account-create-d7bk9" Sep 30 07:49:41 crc kubenswrapper[4760]: I0930 07:49:41.283748 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-d462-account-create-d7bk9" event={"ID":"ba2f3da0-95aa-40f3-b3cf-1b3d3a1ee6b6","Type":"ContainerDied","Data":"9edebd48d167fc0ba535cecb70bd6c97bc62933150327ec579c5a864893244b5"} Sep 30 07:49:41 crc kubenswrapper[4760]: I0930 07:49:41.283798 4760 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9edebd48d167fc0ba535cecb70bd6c97bc62933150327ec579c5a864893244b5" Sep 30 07:49:41 crc kubenswrapper[4760]: I0930 07:49:41.287566 4760 generic.go:334] "Generic (PLEG): container finished" podID="888bbd15-0d32-47ca-9f81-94eaf8f3c4df" containerID="039b885c04db48743019cb8fd332d719e0236e3f107ba92ef62a4f193fe33d92" exitCode=0 Sep 30 07:49:41 crc kubenswrapper[4760]: I0930 07:49:41.287651 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"888bbd15-0d32-47ca-9f81-94eaf8f3c4df","Type":"ContainerDied","Data":"039b885c04db48743019cb8fd332d719e0236e3f107ba92ef62a4f193fe33d92"} Sep 30 07:49:41 crc kubenswrapper[4760]: I0930 07:49:41.291163 4760 generic.go:334] "Generic (PLEG): container finished" podID="82b71e6c-ab34-447e-87e0-a95a9f070efe" containerID="6b118dc533d1475f2056129842cbda4e9708447c504c86b0c22d38fcd2a5b9b2" exitCode=0 Sep 30 07:49:41 crc kubenswrapper[4760]: I0930 07:49:41.291382 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"82b71e6c-ab34-447e-87e0-a95a9f070efe","Type":"ContainerDied","Data":"6b118dc533d1475f2056129842cbda4e9708447c504c86b0c22d38fcd2a5b9b2"} 
Sep 30 07:49:41 crc kubenswrapper[4760]: I0930 07:49:41.330261 4760 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/prometheus-metric-storage-0"] Sep 30 07:49:41 crc kubenswrapper[4760]: I0930 07:49:41.330822 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"db4f0b34-3c4a-4c78-b284-5959e91b00c0","Type":"ContainerStarted","Data":"3f0c3a9645d36eb11d10f89ecc75cdcfbffcbe15d86a123d9e19e8ef61553478"} Sep 30 07:49:41 crc kubenswrapper[4760]: I0930 07:49:41.330876 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"db4f0b34-3c4a-4c78-b284-5959e91b00c0","Type":"ContainerStarted","Data":"06e49bb9842125d4ee7412155d106b5ef2ceac2b5093d593593b7fc611c9b980"} Sep 30 07:49:41 crc kubenswrapper[4760]: I0930 07:49:41.335022 4760 scope.go:117] "RemoveContainer" containerID="a2cea33f1b45cef9414324dcb379d8628760022a434c051e678c5552ac3a7b76" Sep 30 07:49:41 crc kubenswrapper[4760]: I0930 07:49:41.335984 4760 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/prometheus-metric-storage-0"] Sep 30 07:49:41 crc kubenswrapper[4760]: I0930 07:49:41.371183 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/prometheus-metric-storage-0"] Sep 30 07:49:41 crc kubenswrapper[4760]: E0930 07:49:41.371884 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ba2f3da0-95aa-40f3-b3cf-1b3d3a1ee6b6" containerName="mariadb-account-create" Sep 30 07:49:41 crc kubenswrapper[4760]: I0930 07:49:41.371901 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="ba2f3da0-95aa-40f3-b3cf-1b3d3a1ee6b6" containerName="mariadb-account-create" Sep 30 07:49:41 crc kubenswrapper[4760]: E0930 07:49:41.371933 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="007888b6-d5c6-410a-955a-ed78adf759bd" containerName="config-reloader" Sep 30 07:49:41 crc kubenswrapper[4760]: I0930 07:49:41.371940 4760 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="007888b6-d5c6-410a-955a-ed78adf759bd" containerName="config-reloader" Sep 30 07:49:41 crc kubenswrapper[4760]: E0930 07:49:41.371953 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="007888b6-d5c6-410a-955a-ed78adf759bd" containerName="init-config-reloader" Sep 30 07:49:41 crc kubenswrapper[4760]: I0930 07:49:41.371960 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="007888b6-d5c6-410a-955a-ed78adf759bd" containerName="init-config-reloader" Sep 30 07:49:41 crc kubenswrapper[4760]: E0930 07:49:41.371970 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="007888b6-d5c6-410a-955a-ed78adf759bd" containerName="thanos-sidecar" Sep 30 07:49:41 crc kubenswrapper[4760]: I0930 07:49:41.371978 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="007888b6-d5c6-410a-955a-ed78adf759bd" containerName="thanos-sidecar" Sep 30 07:49:41 crc kubenswrapper[4760]: E0930 07:49:41.372007 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="007888b6-d5c6-410a-955a-ed78adf759bd" containerName="prometheus" Sep 30 07:49:41 crc kubenswrapper[4760]: I0930 07:49:41.372013 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="007888b6-d5c6-410a-955a-ed78adf759bd" containerName="prometheus" Sep 30 07:49:41 crc kubenswrapper[4760]: I0930 07:49:41.372689 4760 memory_manager.go:354] "RemoveStaleState removing state" podUID="007888b6-d5c6-410a-955a-ed78adf759bd" containerName="prometheus" Sep 30 07:49:41 crc kubenswrapper[4760]: I0930 07:49:41.372717 4760 memory_manager.go:354] "RemoveStaleState removing state" podUID="007888b6-d5c6-410a-955a-ed78adf759bd" containerName="config-reloader" Sep 30 07:49:41 crc kubenswrapper[4760]: I0930 07:49:41.372732 4760 memory_manager.go:354] "RemoveStaleState removing state" podUID="ba2f3da0-95aa-40f3-b3cf-1b3d3a1ee6b6" containerName="mariadb-account-create" Sep 30 07:49:41 crc kubenswrapper[4760]: I0930 07:49:41.372768 4760 memory_manager.go:354] "RemoveStaleState removing 
state" podUID="007888b6-d5c6-410a-955a-ed78adf759bd" containerName="thanos-sidecar" Sep 30 07:49:41 crc kubenswrapper[4760]: I0930 07:49:41.374459 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/prometheus-metric-storage-0" Sep 30 07:49:41 crc kubenswrapper[4760]: I0930 07:49:41.378192 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-web-config" Sep 30 07:49:41 crc kubenswrapper[4760]: I0930 07:49:41.378646 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage" Sep 30 07:49:41 crc kubenswrapper[4760]: I0930 07:49:41.379253 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"metric-storage-prometheus-dockercfg-pr2wc" Sep 30 07:49:41 crc kubenswrapper[4760]: I0930 07:49:41.379418 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-thanos-prometheus-http-client-file" Sep 30 07:49:41 crc kubenswrapper[4760]: I0930 07:49:41.379557 4760 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-0" Sep 30 07:49:41 crc kubenswrapper[4760]: I0930 07:49:41.380209 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-metric-storage-prometheus-svc" Sep 30 07:49:41 crc kubenswrapper[4760]: I0930 07:49:41.387744 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-tls-assets-0" Sep 30 07:49:41 crc kubenswrapper[4760]: I0930 07:49:41.409414 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/prometheus-metric-storage-0"] Sep 30 07:49:41 crc kubenswrapper[4760]: I0930 07:49:41.471376 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" (UniqueName: 
\"kubernetes.io/secret/9784a9e1-42ad-4f0b-ae43-a3227158a763-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"9784a9e1-42ad-4f0b-ae43-a3227158a763\") " pod="openstack/prometheus-metric-storage-0" Sep 30 07:49:41 crc kubenswrapper[4760]: I0930 07:49:41.471637 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/9784a9e1-42ad-4f0b-ae43-a3227158a763-config\") pod \"prometheus-metric-storage-0\" (UID: \"9784a9e1-42ad-4f0b-ae43-a3227158a763\") " pod="openstack/prometheus-metric-storage-0" Sep 30 07:49:41 crc kubenswrapper[4760]: I0930 07:49:41.471752 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-09238572-8d9b-4684-8ed8-661e43a35d9a\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-09238572-8d9b-4684-8ed8-661e43a35d9a\") pod \"prometheus-metric-storage-0\" (UID: \"9784a9e1-42ad-4f0b-ae43-a3227158a763\") " pod="openstack/prometheus-metric-storage-0" Sep 30 07:49:41 crc kubenswrapper[4760]: I0930 07:49:41.471882 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/9784a9e1-42ad-4f0b-ae43-a3227158a763-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"9784a9e1-42ad-4f0b-ae43-a3227158a763\") " pod="openstack/prometheus-metric-storage-0" Sep 30 07:49:41 crc kubenswrapper[4760]: I0930 07:49:41.472085 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/9784a9e1-42ad-4f0b-ae43-a3227158a763-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"9784a9e1-42ad-4f0b-ae43-a3227158a763\") " pod="openstack/prometheus-metric-storage-0" Sep 30 07:49:41 crc kubenswrapper[4760]: I0930 
07:49:41.472288 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/9784a9e1-42ad-4f0b-ae43-a3227158a763-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"9784a9e1-42ad-4f0b-ae43-a3227158a763\") " pod="openstack/prometheus-metric-storage-0" Sep 30 07:49:41 crc kubenswrapper[4760]: I0930 07:49:41.472415 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/9784a9e1-42ad-4f0b-ae43-a3227158a763-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"9784a9e1-42ad-4f0b-ae43-a3227158a763\") " pod="openstack/prometheus-metric-storage-0" Sep 30 07:49:41 crc kubenswrapper[4760]: I0930 07:49:41.472510 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/9784a9e1-42ad-4f0b-ae43-a3227158a763-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"9784a9e1-42ad-4f0b-ae43-a3227158a763\") " pod="openstack/prometheus-metric-storage-0" Sep 30 07:49:41 crc kubenswrapper[4760]: I0930 07:49:41.472579 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x9pv4\" (UniqueName: \"kubernetes.io/projected/9784a9e1-42ad-4f0b-ae43-a3227158a763-kube-api-access-x9pv4\") pod \"prometheus-metric-storage-0\" (UID: \"9784a9e1-42ad-4f0b-ae43-a3227158a763\") " pod="openstack/prometheus-metric-storage-0" Sep 30 07:49:41 crc kubenswrapper[4760]: I0930 07:49:41.472727 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9784a9e1-42ad-4f0b-ae43-a3227158a763-secret-combined-ca-bundle\") pod \"prometheus-metric-storage-0\" (UID: \"9784a9e1-42ad-4f0b-ae43-a3227158a763\") " 
pod="openstack/prometheus-metric-storage-0" Sep 30 07:49:41 crc kubenswrapper[4760]: I0930 07:49:41.472822 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: \"kubernetes.io/secret/9784a9e1-42ad-4f0b-ae43-a3227158a763-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"9784a9e1-42ad-4f0b-ae43-a3227158a763\") " pod="openstack/prometheus-metric-storage-0" Sep 30 07:49:41 crc kubenswrapper[4760]: I0930 07:49:41.540912 4760 scope.go:117] "RemoveContainer" containerID="50b776945742176dfd29a22633c991f8a4cdffc58f622d83331aa0413823ef59" Sep 30 07:49:41 crc kubenswrapper[4760]: I0930 07:49:41.566956 4760 scope.go:117] "RemoveContainer" containerID="872fb383f092e1b7cf31bdcc5e21d2fae1b56e52665f0a314c9ddddbad403eef" Sep 30 07:49:41 crc kubenswrapper[4760]: I0930 07:49:41.574450 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/9784a9e1-42ad-4f0b-ae43-a3227158a763-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"9784a9e1-42ad-4f0b-ae43-a3227158a763\") " pod="openstack/prometheus-metric-storage-0" Sep 30 07:49:41 crc kubenswrapper[4760]: I0930 07:49:41.574531 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/9784a9e1-42ad-4f0b-ae43-a3227158a763-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"9784a9e1-42ad-4f0b-ae43-a3227158a763\") " pod="openstack/prometheus-metric-storage-0" Sep 30 07:49:41 crc kubenswrapper[4760]: I0930 07:49:41.574575 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/9784a9e1-42ad-4f0b-ae43-a3227158a763-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: 
\"9784a9e1-42ad-4f0b-ae43-a3227158a763\") " pod="openstack/prometheus-metric-storage-0" Sep 30 07:49:41 crc kubenswrapper[4760]: I0930 07:49:41.574607 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x9pv4\" (UniqueName: \"kubernetes.io/projected/9784a9e1-42ad-4f0b-ae43-a3227158a763-kube-api-access-x9pv4\") pod \"prometheus-metric-storage-0\" (UID: \"9784a9e1-42ad-4f0b-ae43-a3227158a763\") " pod="openstack/prometheus-metric-storage-0" Sep 30 07:49:41 crc kubenswrapper[4760]: I0930 07:49:41.574642 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9784a9e1-42ad-4f0b-ae43-a3227158a763-secret-combined-ca-bundle\") pod \"prometheus-metric-storage-0\" (UID: \"9784a9e1-42ad-4f0b-ae43-a3227158a763\") " pod="openstack/prometheus-metric-storage-0" Sep 30 07:49:41 crc kubenswrapper[4760]: I0930 07:49:41.574680 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: \"kubernetes.io/secret/9784a9e1-42ad-4f0b-ae43-a3227158a763-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"9784a9e1-42ad-4f0b-ae43-a3227158a763\") " pod="openstack/prometheus-metric-storage-0" Sep 30 07:49:41 crc kubenswrapper[4760]: I0930 07:49:41.574743 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" (UniqueName: \"kubernetes.io/secret/9784a9e1-42ad-4f0b-ae43-a3227158a763-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"9784a9e1-42ad-4f0b-ae43-a3227158a763\") " pod="openstack/prometheus-metric-storage-0" Sep 30 07:49:41 crc kubenswrapper[4760]: I0930 07:49:41.574775 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"config\" (UniqueName: \"kubernetes.io/secret/9784a9e1-42ad-4f0b-ae43-a3227158a763-config\") pod \"prometheus-metric-storage-0\" (UID: \"9784a9e1-42ad-4f0b-ae43-a3227158a763\") " pod="openstack/prometheus-metric-storage-0" Sep 30 07:49:41 crc kubenswrapper[4760]: I0930 07:49:41.574812 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-09238572-8d9b-4684-8ed8-661e43a35d9a\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-09238572-8d9b-4684-8ed8-661e43a35d9a\") pod \"prometheus-metric-storage-0\" (UID: \"9784a9e1-42ad-4f0b-ae43-a3227158a763\") " pod="openstack/prometheus-metric-storage-0" Sep 30 07:49:41 crc kubenswrapper[4760]: I0930 07:49:41.574842 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/9784a9e1-42ad-4f0b-ae43-a3227158a763-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"9784a9e1-42ad-4f0b-ae43-a3227158a763\") " pod="openstack/prometheus-metric-storage-0" Sep 30 07:49:41 crc kubenswrapper[4760]: I0930 07:49:41.574894 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/9784a9e1-42ad-4f0b-ae43-a3227158a763-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"9784a9e1-42ad-4f0b-ae43-a3227158a763\") " pod="openstack/prometheus-metric-storage-0" Sep 30 07:49:41 crc kubenswrapper[4760]: I0930 07:49:41.580233 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: \"kubernetes.io/secret/9784a9e1-42ad-4f0b-ae43-a3227158a763-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"9784a9e1-42ad-4f0b-ae43-a3227158a763\") " pod="openstack/prometheus-metric-storage-0" Sep 30 07:49:41 crc kubenswrapper[4760]: I0930 
07:49:41.580775 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/9784a9e1-42ad-4f0b-ae43-a3227158a763-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"9784a9e1-42ad-4f0b-ae43-a3227158a763\") " pod="openstack/prometheus-metric-storage-0" Sep 30 07:49:41 crc kubenswrapper[4760]: I0930 07:49:41.581393 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/9784a9e1-42ad-4f0b-ae43-a3227158a763-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"9784a9e1-42ad-4f0b-ae43-a3227158a763\") " pod="openstack/prometheus-metric-storage-0" Sep 30 07:49:41 crc kubenswrapper[4760]: I0930 07:49:41.582550 4760 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Sep 30 07:49:41 crc kubenswrapper[4760]: I0930 07:49:41.582572 4760 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-09238572-8d9b-4684-8ed8-661e43a35d9a\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-09238572-8d9b-4684-8ed8-661e43a35d9a\") pod \"prometheus-metric-storage-0\" (UID: \"9784a9e1-42ad-4f0b-ae43-a3227158a763\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/15f1bcf6ef2a65343cb29c53094f20376472cc1b8d5a343d6a63d664da0c3f7a/globalmount\"" pod="openstack/prometheus-metric-storage-0" Sep 30 07:49:41 crc kubenswrapper[4760]: I0930 07:49:41.583548 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9784a9e1-42ad-4f0b-ae43-a3227158a763-secret-combined-ca-bundle\") pod \"prometheus-metric-storage-0\" (UID: \"9784a9e1-42ad-4f0b-ae43-a3227158a763\") " pod="openstack/prometheus-metric-storage-0" Sep 30 
07:49:41 crc kubenswrapper[4760]: I0930 07:49:41.585743 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" (UniqueName: \"kubernetes.io/secret/9784a9e1-42ad-4f0b-ae43-a3227158a763-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"9784a9e1-42ad-4f0b-ae43-a3227158a763\") " pod="openstack/prometheus-metric-storage-0" Sep 30 07:49:41 crc kubenswrapper[4760]: I0930 07:49:41.588117 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/9784a9e1-42ad-4f0b-ae43-a3227158a763-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"9784a9e1-42ad-4f0b-ae43-a3227158a763\") " pod="openstack/prometheus-metric-storage-0" Sep 30 07:49:41 crc kubenswrapper[4760]: I0930 07:49:41.588932 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/9784a9e1-42ad-4f0b-ae43-a3227158a763-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"9784a9e1-42ad-4f0b-ae43-a3227158a763\") " pod="openstack/prometheus-metric-storage-0" Sep 30 07:49:41 crc kubenswrapper[4760]: I0930 07:49:41.589044 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/9784a9e1-42ad-4f0b-ae43-a3227158a763-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"9784a9e1-42ad-4f0b-ae43-a3227158a763\") " pod="openstack/prometheus-metric-storage-0" Sep 30 07:49:41 crc kubenswrapper[4760]: I0930 07:49:41.589544 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/9784a9e1-42ad-4f0b-ae43-a3227158a763-config\") pod \"prometheus-metric-storage-0\" (UID: \"9784a9e1-42ad-4f0b-ae43-a3227158a763\") " pod="openstack/prometheus-metric-storage-0" Sep 30 07:49:41 crc kubenswrapper[4760]: I0930 07:49:41.601943 
4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x9pv4\" (UniqueName: \"kubernetes.io/projected/9784a9e1-42ad-4f0b-ae43-a3227158a763-kube-api-access-x9pv4\") pod \"prometheus-metric-storage-0\" (UID: \"9784a9e1-42ad-4f0b-ae43-a3227158a763\") " pod="openstack/prometheus-metric-storage-0" Sep 30 07:49:41 crc kubenswrapper[4760]: I0930 07:49:41.613095 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-09238572-8d9b-4684-8ed8-661e43a35d9a\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-09238572-8d9b-4684-8ed8-661e43a35d9a\") pod \"prometheus-metric-storage-0\" (UID: \"9784a9e1-42ad-4f0b-ae43-a3227158a763\") " pod="openstack/prometheus-metric-storage-0" Sep 30 07:49:41 crc kubenswrapper[4760]: I0930 07:49:41.851367 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/prometheus-metric-storage-0" Sep 30 07:49:42 crc kubenswrapper[4760]: I0930 07:49:42.029644 4760 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-8f95-account-create-zwzdl" Sep 30 07:49:42 crc kubenswrapper[4760]: I0930 07:49:42.033152 4760 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-3371-account-create-dhxdh" Sep 30 07:49:42 crc kubenswrapper[4760]: I0930 07:49:42.101922 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wg9nc\" (UniqueName: \"kubernetes.io/projected/225030d7-c116-4040-9b8f-69ad4d2e7a57-kube-api-access-wg9nc\") pod \"225030d7-c116-4040-9b8f-69ad4d2e7a57\" (UID: \"225030d7-c116-4040-9b8f-69ad4d2e7a57\") " Sep 30 07:49:42 crc kubenswrapper[4760]: I0930 07:49:42.102052 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-trcb8\" (UniqueName: \"kubernetes.io/projected/dc5900f2-47ff-45f3-870e-5aff13eeb14f-kube-api-access-trcb8\") pod \"dc5900f2-47ff-45f3-870e-5aff13eeb14f\" (UID: \"dc5900f2-47ff-45f3-870e-5aff13eeb14f\") " Sep 30 07:49:42 crc kubenswrapper[4760]: I0930 07:49:42.106635 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dc5900f2-47ff-45f3-870e-5aff13eeb14f-kube-api-access-trcb8" (OuterVolumeSpecName: "kube-api-access-trcb8") pod "dc5900f2-47ff-45f3-870e-5aff13eeb14f" (UID: "dc5900f2-47ff-45f3-870e-5aff13eeb14f"). InnerVolumeSpecName "kube-api-access-trcb8". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 07:49:42 crc kubenswrapper[4760]: I0930 07:49:42.106754 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/225030d7-c116-4040-9b8f-69ad4d2e7a57-kube-api-access-wg9nc" (OuterVolumeSpecName: "kube-api-access-wg9nc") pod "225030d7-c116-4040-9b8f-69ad4d2e7a57" (UID: "225030d7-c116-4040-9b8f-69ad4d2e7a57"). InnerVolumeSpecName "kube-api-access-wg9nc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 07:49:42 crc kubenswrapper[4760]: I0930 07:49:42.207351 4760 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wg9nc\" (UniqueName: \"kubernetes.io/projected/225030d7-c116-4040-9b8f-69ad4d2e7a57-kube-api-access-wg9nc\") on node \"crc\" DevicePath \"\"" Sep 30 07:49:42 crc kubenswrapper[4760]: I0930 07:49:42.207413 4760 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-trcb8\" (UniqueName: \"kubernetes.io/projected/dc5900f2-47ff-45f3-870e-5aff13eeb14f-kube-api-access-trcb8\") on node \"crc\" DevicePath \"\"" Sep 30 07:49:42 crc kubenswrapper[4760]: I0930 07:49:42.348600 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"82b71e6c-ab34-447e-87e0-a95a9f070efe","Type":"ContainerStarted","Data":"86110cd0f7f1cd09512744889040451c585b3eae2d496a2583a787962daaf5e9"} Sep 30 07:49:42 crc kubenswrapper[4760]: I0930 07:49:42.349166 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0" Sep 30 07:49:42 crc kubenswrapper[4760]: I0930 07:49:42.353354 4760 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-8f95-account-create-zwzdl" Sep 30 07:49:42 crc kubenswrapper[4760]: I0930 07:49:42.353377 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-8f95-account-create-zwzdl" event={"ID":"225030d7-c116-4040-9b8f-69ad4d2e7a57","Type":"ContainerDied","Data":"22902b55af003ae5abdf5aa80283cf86c7583f99f7b80763c8571ea735417ada"} Sep 30 07:49:42 crc kubenswrapper[4760]: I0930 07:49:42.353411 4760 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="22902b55af003ae5abdf5aa80283cf86c7583f99f7b80763c8571ea735417ada" Sep 30 07:49:42 crc kubenswrapper[4760]: I0930 07:49:42.355479 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-3371-account-create-dhxdh" event={"ID":"dc5900f2-47ff-45f3-870e-5aff13eeb14f","Type":"ContainerDied","Data":"82ac8b4b1421d8e077a0b81e00bef8100b479aefbe568c0d88801b1dea01082d"} Sep 30 07:49:42 crc kubenswrapper[4760]: I0930 07:49:42.355502 4760 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="82ac8b4b1421d8e077a0b81e00bef8100b479aefbe568c0d88801b1dea01082d" Sep 30 07:49:42 crc kubenswrapper[4760]: I0930 07:49:42.355552 4760 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-3371-account-create-dhxdh" Sep 30 07:49:42 crc kubenswrapper[4760]: I0930 07:49:42.373066 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"888bbd15-0d32-47ca-9f81-94eaf8f3c4df","Type":"ContainerStarted","Data":"9301f648fb76f95a8ef0ac897f3fc18acc757fd88a0050167f03a86da422b738"} Sep 30 07:49:42 crc kubenswrapper[4760]: I0930 07:49:42.373288 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Sep 30 07:49:42 crc kubenswrapper[4760]: I0930 07:49:42.412166 4760 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=50.915252998 podStartE2EDuration="59.412069919s" podCreationTimestamp="2025-09-30 07:48:43 +0000 UTC" firstStartedPulling="2025-09-30 07:49:00.701946815 +0000 UTC m=+926.344853227" lastFinishedPulling="2025-09-30 07:49:09.198763736 +0000 UTC m=+934.841670148" observedRunningTime="2025-09-30 07:49:42.395443565 +0000 UTC m=+968.038349987" watchObservedRunningTime="2025-09-30 07:49:42.412069919 +0000 UTC m=+968.054976331" Sep 30 07:49:42 crc kubenswrapper[4760]: I0930 07:49:42.432677 4760 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=50.467651451 podStartE2EDuration="59.432657824s" podCreationTimestamp="2025-09-30 07:48:43 +0000 UTC" firstStartedPulling="2025-09-30 07:48:59.850119141 +0000 UTC m=+925.493025553" lastFinishedPulling="2025-09-30 07:49:08.815125504 +0000 UTC m=+934.458031926" observedRunningTime="2025-09-30 07:49:42.41760754 +0000 UTC m=+968.060513952" watchObservedRunningTime="2025-09-30 07:49:42.432657824 +0000 UTC m=+968.075564236" Sep 30 07:49:42 crc kubenswrapper[4760]: I0930 07:49:42.670756 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/prometheus-metric-storage-0"] Sep 30 07:49:42 crc kubenswrapper[4760]: W0930 
07:49:42.676532 4760 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9784a9e1_42ad_4f0b_ae43_a3227158a763.slice/crio-0c2f5b551d5f84405c9171a006c64b79e11468c1368b3e3e1b0e909242737498 WatchSource:0}: Error finding container 0c2f5b551d5f84405c9171a006c64b79e11468c1368b3e3e1b0e909242737498: Status 404 returned error can't find the container with id 0c2f5b551d5f84405c9171a006c64b79e11468c1368b3e3e1b0e909242737498 Sep 30 07:49:43 crc kubenswrapper[4760]: I0930 07:49:43.076361 4760 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="007888b6-d5c6-410a-955a-ed78adf759bd" path="/var/lib/kubelet/pods/007888b6-d5c6-410a-955a-ed78adf759bd/volumes" Sep 30 07:49:43 crc kubenswrapper[4760]: I0930 07:49:43.387246 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"db4f0b34-3c4a-4c78-b284-5959e91b00c0","Type":"ContainerStarted","Data":"32fe1ab2324e5e67c9227adcb912b5d79ca832c47886919892c866005ef28100"} Sep 30 07:49:43 crc kubenswrapper[4760]: I0930 07:49:43.387627 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"db4f0b34-3c4a-4c78-b284-5959e91b00c0","Type":"ContainerStarted","Data":"7ae048fc7a7f98219adf925c0953e9d1aed63ae786aef0e83dc82e6f781dd2bd"} Sep 30 07:49:43 crc kubenswrapper[4760]: I0930 07:49:43.387642 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"db4f0b34-3c4a-4c78-b284-5959e91b00c0","Type":"ContainerStarted","Data":"bd64c7b419fb8831a43ce5fef3280e52f27d1f9ad53585104409ddba2e93e492"} Sep 30 07:49:43 crc kubenswrapper[4760]: I0930 07:49:43.387656 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"db4f0b34-3c4a-4c78-b284-5959e91b00c0","Type":"ContainerStarted","Data":"3586329aa9cc148436908529f95785648db50ac13ba965fba95e0b808d42e850"} Sep 30 07:49:43 crc kubenswrapper[4760]: I0930 
07:49:43.389074 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"9784a9e1-42ad-4f0b-ae43-a3227158a763","Type":"ContainerStarted","Data":"0c2f5b551d5f84405c9171a006c64b79e11468c1368b3e3e1b0e909242737498"} Sep 30 07:49:43 crc kubenswrapper[4760]: I0930 07:49:43.545352 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-sync-s45t8"] Sep 30 07:49:43 crc kubenswrapper[4760]: E0930 07:49:43.545759 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="225030d7-c116-4040-9b8f-69ad4d2e7a57" containerName="mariadb-account-create" Sep 30 07:49:43 crc kubenswrapper[4760]: I0930 07:49:43.545781 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="225030d7-c116-4040-9b8f-69ad4d2e7a57" containerName="mariadb-account-create" Sep 30 07:49:43 crc kubenswrapper[4760]: E0930 07:49:43.545814 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dc5900f2-47ff-45f3-870e-5aff13eeb14f" containerName="mariadb-account-create" Sep 30 07:49:43 crc kubenswrapper[4760]: I0930 07:49:43.545821 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="dc5900f2-47ff-45f3-870e-5aff13eeb14f" containerName="mariadb-account-create" Sep 30 07:49:43 crc kubenswrapper[4760]: I0930 07:49:43.546045 4760 memory_manager.go:354] "RemoveStaleState removing state" podUID="dc5900f2-47ff-45f3-870e-5aff13eeb14f" containerName="mariadb-account-create" Sep 30 07:49:43 crc kubenswrapper[4760]: I0930 07:49:43.546074 4760 memory_manager.go:354] "RemoveStaleState removing state" podUID="225030d7-c116-4040-9b8f-69ad4d2e7a57" containerName="mariadb-account-create" Sep 30 07:49:43 crc kubenswrapper[4760]: I0930 07:49:43.546743 4760 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-s45t8" Sep 30 07:49:43 crc kubenswrapper[4760]: I0930 07:49:43.550531 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-config-data" Sep 30 07:49:43 crc kubenswrapper[4760]: I0930 07:49:43.550696 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-lnkxp" Sep 30 07:49:43 crc kubenswrapper[4760]: I0930 07:49:43.556533 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-s45t8"] Sep 30 07:49:43 crc kubenswrapper[4760]: I0930 07:49:43.632683 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/09c9e208-48cc-44b8-9810-fc0cf69cea8a-config-data\") pod \"glance-db-sync-s45t8\" (UID: \"09c9e208-48cc-44b8-9810-fc0cf69cea8a\") " pod="openstack/glance-db-sync-s45t8" Sep 30 07:49:43 crc kubenswrapper[4760]: I0930 07:49:43.632966 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dwqgz\" (UniqueName: \"kubernetes.io/projected/09c9e208-48cc-44b8-9810-fc0cf69cea8a-kube-api-access-dwqgz\") pod \"glance-db-sync-s45t8\" (UID: \"09c9e208-48cc-44b8-9810-fc0cf69cea8a\") " pod="openstack/glance-db-sync-s45t8" Sep 30 07:49:43 crc kubenswrapper[4760]: I0930 07:49:43.633115 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/09c9e208-48cc-44b8-9810-fc0cf69cea8a-combined-ca-bundle\") pod \"glance-db-sync-s45t8\" (UID: \"09c9e208-48cc-44b8-9810-fc0cf69cea8a\") " pod="openstack/glance-db-sync-s45t8" Sep 30 07:49:43 crc kubenswrapper[4760]: I0930 07:49:43.633242 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: 
\"kubernetes.io/secret/09c9e208-48cc-44b8-9810-fc0cf69cea8a-db-sync-config-data\") pod \"glance-db-sync-s45t8\" (UID: \"09c9e208-48cc-44b8-9810-fc0cf69cea8a\") " pod="openstack/glance-db-sync-s45t8" Sep 30 07:49:43 crc kubenswrapper[4760]: I0930 07:49:43.734426 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/09c9e208-48cc-44b8-9810-fc0cf69cea8a-config-data\") pod \"glance-db-sync-s45t8\" (UID: \"09c9e208-48cc-44b8-9810-fc0cf69cea8a\") " pod="openstack/glance-db-sync-s45t8" Sep 30 07:49:43 crc kubenswrapper[4760]: I0930 07:49:43.734502 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dwqgz\" (UniqueName: \"kubernetes.io/projected/09c9e208-48cc-44b8-9810-fc0cf69cea8a-kube-api-access-dwqgz\") pod \"glance-db-sync-s45t8\" (UID: \"09c9e208-48cc-44b8-9810-fc0cf69cea8a\") " pod="openstack/glance-db-sync-s45t8" Sep 30 07:49:43 crc kubenswrapper[4760]: I0930 07:49:43.734583 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/09c9e208-48cc-44b8-9810-fc0cf69cea8a-combined-ca-bundle\") pod \"glance-db-sync-s45t8\" (UID: \"09c9e208-48cc-44b8-9810-fc0cf69cea8a\") " pod="openstack/glance-db-sync-s45t8" Sep 30 07:49:43 crc kubenswrapper[4760]: I0930 07:49:43.734667 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/09c9e208-48cc-44b8-9810-fc0cf69cea8a-db-sync-config-data\") pod \"glance-db-sync-s45t8\" (UID: \"09c9e208-48cc-44b8-9810-fc0cf69cea8a\") " pod="openstack/glance-db-sync-s45t8" Sep 30 07:49:43 crc kubenswrapper[4760]: I0930 07:49:43.739224 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/09c9e208-48cc-44b8-9810-fc0cf69cea8a-db-sync-config-data\") pod \"glance-db-sync-s45t8\" (UID: 
\"09c9e208-48cc-44b8-9810-fc0cf69cea8a\") " pod="openstack/glance-db-sync-s45t8" Sep 30 07:49:43 crc kubenswrapper[4760]: I0930 07:49:43.740736 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/09c9e208-48cc-44b8-9810-fc0cf69cea8a-config-data\") pod \"glance-db-sync-s45t8\" (UID: \"09c9e208-48cc-44b8-9810-fc0cf69cea8a\") " pod="openstack/glance-db-sync-s45t8" Sep 30 07:49:43 crc kubenswrapper[4760]: I0930 07:49:43.754095 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/09c9e208-48cc-44b8-9810-fc0cf69cea8a-combined-ca-bundle\") pod \"glance-db-sync-s45t8\" (UID: \"09c9e208-48cc-44b8-9810-fc0cf69cea8a\") " pod="openstack/glance-db-sync-s45t8" Sep 30 07:49:43 crc kubenswrapper[4760]: I0930 07:49:43.758235 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dwqgz\" (UniqueName: \"kubernetes.io/projected/09c9e208-48cc-44b8-9810-fc0cf69cea8a-kube-api-access-dwqgz\") pod \"glance-db-sync-s45t8\" (UID: \"09c9e208-48cc-44b8-9810-fc0cf69cea8a\") " pod="openstack/glance-db-sync-s45t8" Sep 30 07:49:43 crc kubenswrapper[4760]: I0930 07:49:43.862710 4760 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-s45t8" Sep 30 07:49:44 crc kubenswrapper[4760]: I0930 07:49:44.506783 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-s45t8"] Sep 30 07:49:45 crc kubenswrapper[4760]: I0930 07:49:45.225029 4760 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-56wgh" podUID="159ee554-1b62-4fe3-95c6-e64ab0c58b2d" containerName="ovn-controller" probeResult="failure" output=< Sep 30 07:49:45 crc kubenswrapper[4760]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status Sep 30 07:49:45 crc kubenswrapper[4760]: > Sep 30 07:49:45 crc kubenswrapper[4760]: I0930 07:49:45.235271 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-bwrv9" Sep 30 07:49:45 crc kubenswrapper[4760]: I0930 07:49:45.272345 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-bwrv9" Sep 30 07:49:45 crc kubenswrapper[4760]: I0930 07:49:45.410880 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"db4f0b34-3c4a-4c78-b284-5959e91b00c0","Type":"ContainerStarted","Data":"1ddc54eadba1f8db351b8f7753e4eec35547fc44b2d3554d5e6bcf604ac76044"} Sep 30 07:49:45 crc kubenswrapper[4760]: I0930 07:49:45.410916 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"db4f0b34-3c4a-4c78-b284-5959e91b00c0","Type":"ContainerStarted","Data":"d2fdce7d76932b9d6379d3ea40eca85089938ea559b5f156bcd9bcbfffa720fb"} Sep 30 07:49:45 crc kubenswrapper[4760]: I0930 07:49:45.412357 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-s45t8" event={"ID":"09c9e208-48cc-44b8-9810-fc0cf69cea8a","Type":"ContainerStarted","Data":"9edb36d37d1de1d060c04d48b325e55d5eb826acf2c9a3f57c632746c53371c8"} Sep 30 07:49:45 crc kubenswrapper[4760]: I0930 07:49:45.512902 4760 
kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-56wgh-config-dghqz"] Sep 30 07:49:45 crc kubenswrapper[4760]: I0930 07:49:45.514361 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-56wgh-config-dghqz" Sep 30 07:49:45 crc kubenswrapper[4760]: I0930 07:49:45.517501 4760 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-extra-scripts" Sep 30 07:49:45 crc kubenswrapper[4760]: I0930 07:49:45.524080 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-56wgh-config-dghqz"] Sep 30 07:49:45 crc kubenswrapper[4760]: I0930 07:49:45.564713 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3f6cfefc-4134-42d8-91fe-ebae0bc56801-scripts\") pod \"ovn-controller-56wgh-config-dghqz\" (UID: \"3f6cfefc-4134-42d8-91fe-ebae0bc56801\") " pod="openstack/ovn-controller-56wgh-config-dghqz" Sep 30 07:49:45 crc kubenswrapper[4760]: I0930 07:49:45.564799 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/3f6cfefc-4134-42d8-91fe-ebae0bc56801-var-run\") pod \"ovn-controller-56wgh-config-dghqz\" (UID: \"3f6cfefc-4134-42d8-91fe-ebae0bc56801\") " pod="openstack/ovn-controller-56wgh-config-dghqz" Sep 30 07:49:45 crc kubenswrapper[4760]: I0930 07:49:45.564996 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/3f6cfefc-4134-42d8-91fe-ebae0bc56801-additional-scripts\") pod \"ovn-controller-56wgh-config-dghqz\" (UID: \"3f6cfefc-4134-42d8-91fe-ebae0bc56801\") " pod="openstack/ovn-controller-56wgh-config-dghqz" Sep 30 07:49:45 crc kubenswrapper[4760]: I0930 07:49:45.565091 4760 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tb79v\" (UniqueName: \"kubernetes.io/projected/3f6cfefc-4134-42d8-91fe-ebae0bc56801-kube-api-access-tb79v\") pod \"ovn-controller-56wgh-config-dghqz\" (UID: \"3f6cfefc-4134-42d8-91fe-ebae0bc56801\") " pod="openstack/ovn-controller-56wgh-config-dghqz" Sep 30 07:49:45 crc kubenswrapper[4760]: I0930 07:49:45.565126 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/3f6cfefc-4134-42d8-91fe-ebae0bc56801-var-run-ovn\") pod \"ovn-controller-56wgh-config-dghqz\" (UID: \"3f6cfefc-4134-42d8-91fe-ebae0bc56801\") " pod="openstack/ovn-controller-56wgh-config-dghqz" Sep 30 07:49:45 crc kubenswrapper[4760]: I0930 07:49:45.565225 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/3f6cfefc-4134-42d8-91fe-ebae0bc56801-var-log-ovn\") pod \"ovn-controller-56wgh-config-dghqz\" (UID: \"3f6cfefc-4134-42d8-91fe-ebae0bc56801\") " pod="openstack/ovn-controller-56wgh-config-dghqz" Sep 30 07:49:45 crc kubenswrapper[4760]: I0930 07:49:45.666471 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/3f6cfefc-4134-42d8-91fe-ebae0bc56801-var-log-ovn\") pod \"ovn-controller-56wgh-config-dghqz\" (UID: \"3f6cfefc-4134-42d8-91fe-ebae0bc56801\") " pod="openstack/ovn-controller-56wgh-config-dghqz" Sep 30 07:49:45 crc kubenswrapper[4760]: I0930 07:49:45.666549 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3f6cfefc-4134-42d8-91fe-ebae0bc56801-scripts\") pod \"ovn-controller-56wgh-config-dghqz\" (UID: \"3f6cfefc-4134-42d8-91fe-ebae0bc56801\") " pod="openstack/ovn-controller-56wgh-config-dghqz" Sep 30 07:49:45 crc kubenswrapper[4760]: I0930 
07:49:45.666636 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/3f6cfefc-4134-42d8-91fe-ebae0bc56801-var-run\") pod \"ovn-controller-56wgh-config-dghqz\" (UID: \"3f6cfefc-4134-42d8-91fe-ebae0bc56801\") " pod="openstack/ovn-controller-56wgh-config-dghqz" Sep 30 07:49:45 crc kubenswrapper[4760]: I0930 07:49:45.666686 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/3f6cfefc-4134-42d8-91fe-ebae0bc56801-additional-scripts\") pod \"ovn-controller-56wgh-config-dghqz\" (UID: \"3f6cfefc-4134-42d8-91fe-ebae0bc56801\") " pod="openstack/ovn-controller-56wgh-config-dghqz" Sep 30 07:49:45 crc kubenswrapper[4760]: I0930 07:49:45.666731 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tb79v\" (UniqueName: \"kubernetes.io/projected/3f6cfefc-4134-42d8-91fe-ebae0bc56801-kube-api-access-tb79v\") pod \"ovn-controller-56wgh-config-dghqz\" (UID: \"3f6cfefc-4134-42d8-91fe-ebae0bc56801\") " pod="openstack/ovn-controller-56wgh-config-dghqz" Sep 30 07:49:45 crc kubenswrapper[4760]: I0930 07:49:45.666756 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/3f6cfefc-4134-42d8-91fe-ebae0bc56801-var-run-ovn\") pod \"ovn-controller-56wgh-config-dghqz\" (UID: \"3f6cfefc-4134-42d8-91fe-ebae0bc56801\") " pod="openstack/ovn-controller-56wgh-config-dghqz" Sep 30 07:49:45 crc kubenswrapper[4760]: I0930 07:49:45.666992 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/3f6cfefc-4134-42d8-91fe-ebae0bc56801-var-run-ovn\") pod \"ovn-controller-56wgh-config-dghqz\" (UID: \"3f6cfefc-4134-42d8-91fe-ebae0bc56801\") " pod="openstack/ovn-controller-56wgh-config-dghqz" Sep 30 07:49:45 crc kubenswrapper[4760]: I0930 07:49:45.667044 
4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/3f6cfefc-4134-42d8-91fe-ebae0bc56801-var-log-ovn\") pod \"ovn-controller-56wgh-config-dghqz\" (UID: \"3f6cfefc-4134-42d8-91fe-ebae0bc56801\") " pod="openstack/ovn-controller-56wgh-config-dghqz" Sep 30 07:49:45 crc kubenswrapper[4760]: I0930 07:49:45.668787 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3f6cfefc-4134-42d8-91fe-ebae0bc56801-scripts\") pod \"ovn-controller-56wgh-config-dghqz\" (UID: \"3f6cfefc-4134-42d8-91fe-ebae0bc56801\") " pod="openstack/ovn-controller-56wgh-config-dghqz" Sep 30 07:49:45 crc kubenswrapper[4760]: I0930 07:49:45.668851 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/3f6cfefc-4134-42d8-91fe-ebae0bc56801-var-run\") pod \"ovn-controller-56wgh-config-dghqz\" (UID: \"3f6cfefc-4134-42d8-91fe-ebae0bc56801\") " pod="openstack/ovn-controller-56wgh-config-dghqz" Sep 30 07:49:45 crc kubenswrapper[4760]: I0930 07:49:45.669229 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/3f6cfefc-4134-42d8-91fe-ebae0bc56801-additional-scripts\") pod \"ovn-controller-56wgh-config-dghqz\" (UID: \"3f6cfefc-4134-42d8-91fe-ebae0bc56801\") " pod="openstack/ovn-controller-56wgh-config-dghqz" Sep 30 07:49:45 crc kubenswrapper[4760]: I0930 07:49:45.692189 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tb79v\" (UniqueName: \"kubernetes.io/projected/3f6cfefc-4134-42d8-91fe-ebae0bc56801-kube-api-access-tb79v\") pod \"ovn-controller-56wgh-config-dghqz\" (UID: \"3f6cfefc-4134-42d8-91fe-ebae0bc56801\") " pod="openstack/ovn-controller-56wgh-config-dghqz" Sep 30 07:49:45 crc kubenswrapper[4760]: I0930 07:49:45.831628 4760 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-56wgh-config-dghqz" Sep 30 07:49:46 crc kubenswrapper[4760]: I0930 07:49:46.340741 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-56wgh-config-dghqz"] Sep 30 07:49:46 crc kubenswrapper[4760]: I0930 07:49:46.430233 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"9784a9e1-42ad-4f0b-ae43-a3227158a763","Type":"ContainerStarted","Data":"ac43d8db1c046801aea5d4bff64433379ff5d6e2825e451966ef62e72f5121df"} Sep 30 07:49:46 crc kubenswrapper[4760]: I0930 07:49:46.440660 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"db4f0b34-3c4a-4c78-b284-5959e91b00c0","Type":"ContainerStarted","Data":"ad5810a18a0f30439b20eeab0df186acf212b198caf0b9d24b4a735b02dfa500"} Sep 30 07:49:46 crc kubenswrapper[4760]: I0930 07:49:46.440700 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"db4f0b34-3c4a-4c78-b284-5959e91b00c0","Type":"ContainerStarted","Data":"8a6ededb1836687762cdda5fd62f27a592d09da2d6b165a61e3ccdd4c01bc78d"} Sep 30 07:49:46 crc kubenswrapper[4760]: I0930 07:49:46.440709 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"db4f0b34-3c4a-4c78-b284-5959e91b00c0","Type":"ContainerStarted","Data":"cd3e6510c573aef92c1777a430cddd3a6c2e898d59ef499242fa5624caf04aa5"} Sep 30 07:49:46 crc kubenswrapper[4760]: I0930 07:49:46.442180 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-56wgh-config-dghqz" event={"ID":"3f6cfefc-4134-42d8-91fe-ebae0bc56801","Type":"ContainerStarted","Data":"e47ad7d253cfff8644fd899f317150c1350bde8f32f27172a68284cd6594a7f7"} Sep 30 07:49:48 crc kubenswrapper[4760]: I0930 07:49:48.478284 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" 
event={"ID":"db4f0b34-3c4a-4c78-b284-5959e91b00c0","Type":"ContainerStarted","Data":"7486056c5d121baeef58bf8e955d9be24639b7a96ab631ab2c95503d72e1823b"} Sep 30 07:49:48 crc kubenswrapper[4760]: I0930 07:49:48.482851 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-56wgh-config-dghqz" event={"ID":"3f6cfefc-4134-42d8-91fe-ebae0bc56801","Type":"ContainerStarted","Data":"f80d5f35ff3c46d2c909cd75de71a5cd1d0f5af795b416a56386816baed4e83a"} Sep 30 07:49:48 crc kubenswrapper[4760]: I0930 07:49:48.503930 4760 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-56wgh-config-dghqz" podStartSLOduration=3.503907518 podStartE2EDuration="3.503907518s" podCreationTimestamp="2025-09-30 07:49:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 07:49:48.498979202 +0000 UTC m=+974.141885624" watchObservedRunningTime="2025-09-30 07:49:48.503907518 +0000 UTC m=+974.146813970" Sep 30 07:49:49 crc kubenswrapper[4760]: I0930 07:49:49.493208 4760 generic.go:334] "Generic (PLEG): container finished" podID="3f6cfefc-4134-42d8-91fe-ebae0bc56801" containerID="f80d5f35ff3c46d2c909cd75de71a5cd1d0f5af795b416a56386816baed4e83a" exitCode=0 Sep 30 07:49:49 crc kubenswrapper[4760]: I0930 07:49:49.493287 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-56wgh-config-dghqz" event={"ID":"3f6cfefc-4134-42d8-91fe-ebae0bc56801","Type":"ContainerDied","Data":"f80d5f35ff3c46d2c909cd75de71a5cd1d0f5af795b416a56386816baed4e83a"} Sep 30 07:49:49 crc kubenswrapper[4760]: I0930 07:49:49.996152 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/watcher-439d-account-create-n6z9w"] Sep 30 07:49:49 crc kubenswrapper[4760]: I0930 07:49:49.998469 4760 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/watcher-439d-account-create-n6z9w" Sep 30 07:49:50 crc kubenswrapper[4760]: I0930 07:49:50.002649 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"watcher-db-secret" Sep 30 07:49:50 crc kubenswrapper[4760]: I0930 07:49:50.009636 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-439d-account-create-n6z9w"] Sep 30 07:49:50 crc kubenswrapper[4760]: I0930 07:49:50.051007 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-56wgh" Sep 30 07:49:50 crc kubenswrapper[4760]: I0930 07:49:50.055996 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-57qsz\" (UniqueName: \"kubernetes.io/projected/b41f3bf7-e274-44ad-b745-c723e31a5167-kube-api-access-57qsz\") pod \"watcher-439d-account-create-n6z9w\" (UID: \"b41f3bf7-e274-44ad-b745-c723e31a5167\") " pod="openstack/watcher-439d-account-create-n6z9w" Sep 30 07:49:50 crc kubenswrapper[4760]: I0930 07:49:50.157249 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-57qsz\" (UniqueName: \"kubernetes.io/projected/b41f3bf7-e274-44ad-b745-c723e31a5167-kube-api-access-57qsz\") pod \"watcher-439d-account-create-n6z9w\" (UID: \"b41f3bf7-e274-44ad-b745-c723e31a5167\") " pod="openstack/watcher-439d-account-create-n6z9w" Sep 30 07:49:50 crc kubenswrapper[4760]: I0930 07:49:50.174615 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-57qsz\" (UniqueName: \"kubernetes.io/projected/b41f3bf7-e274-44ad-b745-c723e31a5167-kube-api-access-57qsz\") pod \"watcher-439d-account-create-n6z9w\" (UID: \"b41f3bf7-e274-44ad-b745-c723e31a5167\") " pod="openstack/watcher-439d-account-create-n6z9w" Sep 30 07:49:50 crc kubenswrapper[4760]: I0930 07:49:50.316002 4760 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/watcher-439d-account-create-n6z9w" Sep 30 07:49:50 crc kubenswrapper[4760]: I0930 07:49:50.952917 4760 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-56wgh-config-dghqz" Sep 30 07:49:51 crc kubenswrapper[4760]: I0930 07:49:51.073915 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/3f6cfefc-4134-42d8-91fe-ebae0bc56801-var-log-ovn\") pod \"3f6cfefc-4134-42d8-91fe-ebae0bc56801\" (UID: \"3f6cfefc-4134-42d8-91fe-ebae0bc56801\") " Sep 30 07:49:51 crc kubenswrapper[4760]: I0930 07:49:51.074030 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/3f6cfefc-4134-42d8-91fe-ebae0bc56801-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "3f6cfefc-4134-42d8-91fe-ebae0bc56801" (UID: "3f6cfefc-4134-42d8-91fe-ebae0bc56801"). InnerVolumeSpecName "var-log-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Sep 30 07:49:51 crc kubenswrapper[4760]: I0930 07:49:51.074095 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/3f6cfefc-4134-42d8-91fe-ebae0bc56801-additional-scripts\") pod \"3f6cfefc-4134-42d8-91fe-ebae0bc56801\" (UID: \"3f6cfefc-4134-42d8-91fe-ebae0bc56801\") " Sep 30 07:49:51 crc kubenswrapper[4760]: I0930 07:49:51.074143 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/3f6cfefc-4134-42d8-91fe-ebae0bc56801-var-run-ovn\") pod \"3f6cfefc-4134-42d8-91fe-ebae0bc56801\" (UID: \"3f6cfefc-4134-42d8-91fe-ebae0bc56801\") " Sep 30 07:49:51 crc kubenswrapper[4760]: I0930 07:49:51.074275 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tb79v\" (UniqueName: 
\"kubernetes.io/projected/3f6cfefc-4134-42d8-91fe-ebae0bc56801-kube-api-access-tb79v\") pod \"3f6cfefc-4134-42d8-91fe-ebae0bc56801\" (UID: \"3f6cfefc-4134-42d8-91fe-ebae0bc56801\") " Sep 30 07:49:51 crc kubenswrapper[4760]: I0930 07:49:51.074346 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/3f6cfefc-4134-42d8-91fe-ebae0bc56801-var-run\") pod \"3f6cfefc-4134-42d8-91fe-ebae0bc56801\" (UID: \"3f6cfefc-4134-42d8-91fe-ebae0bc56801\") " Sep 30 07:49:51 crc kubenswrapper[4760]: I0930 07:49:51.074413 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3f6cfefc-4134-42d8-91fe-ebae0bc56801-scripts\") pod \"3f6cfefc-4134-42d8-91fe-ebae0bc56801\" (UID: \"3f6cfefc-4134-42d8-91fe-ebae0bc56801\") " Sep 30 07:49:51 crc kubenswrapper[4760]: I0930 07:49:51.074521 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/3f6cfefc-4134-42d8-91fe-ebae0bc56801-var-run" (OuterVolumeSpecName: "var-run") pod "3f6cfefc-4134-42d8-91fe-ebae0bc56801" (UID: "3f6cfefc-4134-42d8-91fe-ebae0bc56801"). InnerVolumeSpecName "var-run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Sep 30 07:49:51 crc kubenswrapper[4760]: I0930 07:49:51.074631 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/3f6cfefc-4134-42d8-91fe-ebae0bc56801-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "3f6cfefc-4134-42d8-91fe-ebae0bc56801" (UID: "3f6cfefc-4134-42d8-91fe-ebae0bc56801"). InnerVolumeSpecName "var-run-ovn". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Sep 30 07:49:51 crc kubenswrapper[4760]: I0930 07:49:51.075030 4760 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/3f6cfefc-4134-42d8-91fe-ebae0bc56801-var-log-ovn\") on node \"crc\" DevicePath \"\"" Sep 30 07:49:51 crc kubenswrapper[4760]: I0930 07:49:51.075054 4760 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/3f6cfefc-4134-42d8-91fe-ebae0bc56801-var-run-ovn\") on node \"crc\" DevicePath \"\"" Sep 30 07:49:51 crc kubenswrapper[4760]: I0930 07:49:51.075075 4760 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/3f6cfefc-4134-42d8-91fe-ebae0bc56801-var-run\") on node \"crc\" DevicePath \"\"" Sep 30 07:49:51 crc kubenswrapper[4760]: I0930 07:49:51.075321 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3f6cfefc-4134-42d8-91fe-ebae0bc56801-additional-scripts" (OuterVolumeSpecName: "additional-scripts") pod "3f6cfefc-4134-42d8-91fe-ebae0bc56801" (UID: "3f6cfefc-4134-42d8-91fe-ebae0bc56801"). InnerVolumeSpecName "additional-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 07:49:51 crc kubenswrapper[4760]: I0930 07:49:51.075723 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3f6cfefc-4134-42d8-91fe-ebae0bc56801-scripts" (OuterVolumeSpecName: "scripts") pod "3f6cfefc-4134-42d8-91fe-ebae0bc56801" (UID: "3f6cfefc-4134-42d8-91fe-ebae0bc56801"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 07:49:51 crc kubenswrapper[4760]: I0930 07:49:51.081726 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3f6cfefc-4134-42d8-91fe-ebae0bc56801-kube-api-access-tb79v" (OuterVolumeSpecName: "kube-api-access-tb79v") pod "3f6cfefc-4134-42d8-91fe-ebae0bc56801" (UID: "3f6cfefc-4134-42d8-91fe-ebae0bc56801"). InnerVolumeSpecName "kube-api-access-tb79v". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 07:49:51 crc kubenswrapper[4760]: W0930 07:49:51.110836 4760 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb41f3bf7_e274_44ad_b745_c723e31a5167.slice/crio-0ec941207e056a6b1dd8f6f36865b49c45b5667d65b481ec1827498825cb1254 WatchSource:0}: Error finding container 0ec941207e056a6b1dd8f6f36865b49c45b5667d65b481ec1827498825cb1254: Status 404 returned error can't find the container with id 0ec941207e056a6b1dd8f6f36865b49c45b5667d65b481ec1827498825cb1254 Sep 30 07:49:51 crc kubenswrapper[4760]: I0930 07:49:51.114279 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-439d-account-create-n6z9w"] Sep 30 07:49:51 crc kubenswrapper[4760]: I0930 07:49:51.177220 4760 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3f6cfefc-4134-42d8-91fe-ebae0bc56801-scripts\") on node \"crc\" DevicePath \"\"" Sep 30 07:49:51 crc kubenswrapper[4760]: I0930 07:49:51.177792 4760 reconciler_common.go:293] "Volume detached for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/3f6cfefc-4134-42d8-91fe-ebae0bc56801-additional-scripts\") on node \"crc\" DevicePath \"\"" Sep 30 07:49:51 crc kubenswrapper[4760]: I0930 07:49:51.177839 4760 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tb79v\" (UniqueName: 
\"kubernetes.io/projected/3f6cfefc-4134-42d8-91fe-ebae0bc56801-kube-api-access-tb79v\") on node \"crc\" DevicePath \"\"" Sep 30 07:49:51 crc kubenswrapper[4760]: I0930 07:49:51.525567 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-56wgh-config-dghqz" event={"ID":"3f6cfefc-4134-42d8-91fe-ebae0bc56801","Type":"ContainerDied","Data":"e47ad7d253cfff8644fd899f317150c1350bde8f32f27172a68284cd6594a7f7"} Sep 30 07:49:51 crc kubenswrapper[4760]: I0930 07:49:51.525646 4760 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e47ad7d253cfff8644fd899f317150c1350bde8f32f27172a68284cd6594a7f7" Sep 30 07:49:51 crc kubenswrapper[4760]: I0930 07:49:51.525771 4760 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-56wgh-config-dghqz" Sep 30 07:49:51 crc kubenswrapper[4760]: I0930 07:49:51.529461 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-439d-account-create-n6z9w" event={"ID":"b41f3bf7-e274-44ad-b745-c723e31a5167","Type":"ContainerStarted","Data":"0ec941207e056a6b1dd8f6f36865b49c45b5667d65b481ec1827498825cb1254"} Sep 30 07:49:51 crc kubenswrapper[4760]: I0930 07:49:51.616535 4760 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-56wgh-config-dghqz"] Sep 30 07:49:51 crc kubenswrapper[4760]: I0930 07:49:51.624872 4760 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-56wgh-config-dghqz"] Sep 30 07:49:51 crc kubenswrapper[4760]: I0930 07:49:51.721035 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-56wgh-config-8ltt7"] Sep 30 07:49:51 crc kubenswrapper[4760]: E0930 07:49:51.721481 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3f6cfefc-4134-42d8-91fe-ebae0bc56801" containerName="ovn-config" Sep 30 07:49:51 crc kubenswrapper[4760]: I0930 07:49:51.721503 4760 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="3f6cfefc-4134-42d8-91fe-ebae0bc56801" containerName="ovn-config" Sep 30 07:49:51 crc kubenswrapper[4760]: I0930 07:49:51.721689 4760 memory_manager.go:354] "RemoveStaleState removing state" podUID="3f6cfefc-4134-42d8-91fe-ebae0bc56801" containerName="ovn-config" Sep 30 07:49:51 crc kubenswrapper[4760]: I0930 07:49:51.722416 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-56wgh-config-8ltt7" Sep 30 07:49:51 crc kubenswrapper[4760]: I0930 07:49:51.727752 4760 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-extra-scripts" Sep 30 07:49:51 crc kubenswrapper[4760]: I0930 07:49:51.728898 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-56wgh-config-8ltt7"] Sep 30 07:49:51 crc kubenswrapper[4760]: I0930 07:49:51.786918 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/385cee6a-8dab-47e6-babe-18b0490d5398-var-log-ovn\") pod \"ovn-controller-56wgh-config-8ltt7\" (UID: \"385cee6a-8dab-47e6-babe-18b0490d5398\") " pod="openstack/ovn-controller-56wgh-config-8ltt7" Sep 30 07:49:51 crc kubenswrapper[4760]: I0930 07:49:51.786996 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/385cee6a-8dab-47e6-babe-18b0490d5398-var-run-ovn\") pod \"ovn-controller-56wgh-config-8ltt7\" (UID: \"385cee6a-8dab-47e6-babe-18b0490d5398\") " pod="openstack/ovn-controller-56wgh-config-8ltt7" Sep 30 07:49:51 crc kubenswrapper[4760]: I0930 07:49:51.787024 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/385cee6a-8dab-47e6-babe-18b0490d5398-additional-scripts\") pod \"ovn-controller-56wgh-config-8ltt7\" (UID: 
\"385cee6a-8dab-47e6-babe-18b0490d5398\") " pod="openstack/ovn-controller-56wgh-config-8ltt7" Sep 30 07:49:51 crc kubenswrapper[4760]: I0930 07:49:51.787057 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/385cee6a-8dab-47e6-babe-18b0490d5398-scripts\") pod \"ovn-controller-56wgh-config-8ltt7\" (UID: \"385cee6a-8dab-47e6-babe-18b0490d5398\") " pod="openstack/ovn-controller-56wgh-config-8ltt7" Sep 30 07:49:51 crc kubenswrapper[4760]: I0930 07:49:51.787111 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xlcxw\" (UniqueName: \"kubernetes.io/projected/385cee6a-8dab-47e6-babe-18b0490d5398-kube-api-access-xlcxw\") pod \"ovn-controller-56wgh-config-8ltt7\" (UID: \"385cee6a-8dab-47e6-babe-18b0490d5398\") " pod="openstack/ovn-controller-56wgh-config-8ltt7" Sep 30 07:49:51 crc kubenswrapper[4760]: I0930 07:49:51.787137 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/385cee6a-8dab-47e6-babe-18b0490d5398-var-run\") pod \"ovn-controller-56wgh-config-8ltt7\" (UID: \"385cee6a-8dab-47e6-babe-18b0490d5398\") " pod="openstack/ovn-controller-56wgh-config-8ltt7" Sep 30 07:49:51 crc kubenswrapper[4760]: I0930 07:49:51.889221 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/385cee6a-8dab-47e6-babe-18b0490d5398-var-log-ovn\") pod \"ovn-controller-56wgh-config-8ltt7\" (UID: \"385cee6a-8dab-47e6-babe-18b0490d5398\") " pod="openstack/ovn-controller-56wgh-config-8ltt7" Sep 30 07:49:51 crc kubenswrapper[4760]: I0930 07:49:51.889357 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/385cee6a-8dab-47e6-babe-18b0490d5398-var-run-ovn\") pod 
\"ovn-controller-56wgh-config-8ltt7\" (UID: \"385cee6a-8dab-47e6-babe-18b0490d5398\") " pod="openstack/ovn-controller-56wgh-config-8ltt7" Sep 30 07:49:51 crc kubenswrapper[4760]: I0930 07:49:51.889394 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/385cee6a-8dab-47e6-babe-18b0490d5398-additional-scripts\") pod \"ovn-controller-56wgh-config-8ltt7\" (UID: \"385cee6a-8dab-47e6-babe-18b0490d5398\") " pod="openstack/ovn-controller-56wgh-config-8ltt7" Sep 30 07:49:51 crc kubenswrapper[4760]: I0930 07:49:51.889430 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/385cee6a-8dab-47e6-babe-18b0490d5398-scripts\") pod \"ovn-controller-56wgh-config-8ltt7\" (UID: \"385cee6a-8dab-47e6-babe-18b0490d5398\") " pod="openstack/ovn-controller-56wgh-config-8ltt7" Sep 30 07:49:51 crc kubenswrapper[4760]: I0930 07:49:51.889505 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xlcxw\" (UniqueName: \"kubernetes.io/projected/385cee6a-8dab-47e6-babe-18b0490d5398-kube-api-access-xlcxw\") pod \"ovn-controller-56wgh-config-8ltt7\" (UID: \"385cee6a-8dab-47e6-babe-18b0490d5398\") " pod="openstack/ovn-controller-56wgh-config-8ltt7" Sep 30 07:49:51 crc kubenswrapper[4760]: I0930 07:49:51.889552 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/385cee6a-8dab-47e6-babe-18b0490d5398-var-run\") pod \"ovn-controller-56wgh-config-8ltt7\" (UID: \"385cee6a-8dab-47e6-babe-18b0490d5398\") " pod="openstack/ovn-controller-56wgh-config-8ltt7" Sep 30 07:49:51 crc kubenswrapper[4760]: I0930 07:49:51.889918 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/385cee6a-8dab-47e6-babe-18b0490d5398-var-run\") pod 
\"ovn-controller-56wgh-config-8ltt7\" (UID: \"385cee6a-8dab-47e6-babe-18b0490d5398\") " pod="openstack/ovn-controller-56wgh-config-8ltt7" Sep 30 07:49:51 crc kubenswrapper[4760]: I0930 07:49:51.891351 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/385cee6a-8dab-47e6-babe-18b0490d5398-additional-scripts\") pod \"ovn-controller-56wgh-config-8ltt7\" (UID: \"385cee6a-8dab-47e6-babe-18b0490d5398\") " pod="openstack/ovn-controller-56wgh-config-8ltt7" Sep 30 07:49:51 crc kubenswrapper[4760]: I0930 07:49:51.891434 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/385cee6a-8dab-47e6-babe-18b0490d5398-var-log-ovn\") pod \"ovn-controller-56wgh-config-8ltt7\" (UID: \"385cee6a-8dab-47e6-babe-18b0490d5398\") " pod="openstack/ovn-controller-56wgh-config-8ltt7" Sep 30 07:49:51 crc kubenswrapper[4760]: I0930 07:49:51.891447 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/385cee6a-8dab-47e6-babe-18b0490d5398-var-run-ovn\") pod \"ovn-controller-56wgh-config-8ltt7\" (UID: \"385cee6a-8dab-47e6-babe-18b0490d5398\") " pod="openstack/ovn-controller-56wgh-config-8ltt7" Sep 30 07:49:51 crc kubenswrapper[4760]: I0930 07:49:51.895220 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/385cee6a-8dab-47e6-babe-18b0490d5398-scripts\") pod \"ovn-controller-56wgh-config-8ltt7\" (UID: \"385cee6a-8dab-47e6-babe-18b0490d5398\") " pod="openstack/ovn-controller-56wgh-config-8ltt7" Sep 30 07:49:51 crc kubenswrapper[4760]: I0930 07:49:51.918988 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xlcxw\" (UniqueName: \"kubernetes.io/projected/385cee6a-8dab-47e6-babe-18b0490d5398-kube-api-access-xlcxw\") pod \"ovn-controller-56wgh-config-8ltt7\" (UID: 
\"385cee6a-8dab-47e6-babe-18b0490d5398\") " pod="openstack/ovn-controller-56wgh-config-8ltt7" Sep 30 07:49:52 crc kubenswrapper[4760]: I0930 07:49:52.049380 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-56wgh-config-8ltt7" Sep 30 07:49:52 crc kubenswrapper[4760]: I0930 07:49:52.539347 4760 generic.go:334] "Generic (PLEG): container finished" podID="b41f3bf7-e274-44ad-b745-c723e31a5167" containerID="721af8f3bf9fcd4398eff01bb2f0400eb09be862644cfd2d14af12687ba93474" exitCode=0 Sep 30 07:49:52 crc kubenswrapper[4760]: I0930 07:49:52.539454 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-439d-account-create-n6z9w" event={"ID":"b41f3bf7-e274-44ad-b745-c723e31a5167","Type":"ContainerDied","Data":"721af8f3bf9fcd4398eff01bb2f0400eb09be862644cfd2d14af12687ba93474"} Sep 30 07:49:52 crc kubenswrapper[4760]: I0930 07:49:52.541252 4760 generic.go:334] "Generic (PLEG): container finished" podID="9784a9e1-42ad-4f0b-ae43-a3227158a763" containerID="ac43d8db1c046801aea5d4bff64433379ff5d6e2825e451966ef62e72f5121df" exitCode=0 Sep 30 07:49:52 crc kubenswrapper[4760]: I0930 07:49:52.541329 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"9784a9e1-42ad-4f0b-ae43-a3227158a763","Type":"ContainerDied","Data":"ac43d8db1c046801aea5d4bff64433379ff5d6e2825e451966ef62e72f5121df"} Sep 30 07:49:52 crc kubenswrapper[4760]: I0930 07:49:52.549994 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"db4f0b34-3c4a-4c78-b284-5959e91b00c0","Type":"ContainerStarted","Data":"034b07718ac17a65efa22c1ba40eca1b374c9528738346616e42ce12771c0380"} Sep 30 07:49:52 crc kubenswrapper[4760]: I0930 07:49:52.591799 4760 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-storage-0" podStartSLOduration=26.923760384 podStartE2EDuration="33.59177724s" podCreationTimestamp="2025-09-30 
07:49:19 +0000 UTC" firstStartedPulling="2025-09-30 07:49:38.011483757 +0000 UTC m=+963.654390169" lastFinishedPulling="2025-09-30 07:49:44.679500613 +0000 UTC m=+970.322407025" observedRunningTime="2025-09-30 07:49:52.588819964 +0000 UTC m=+978.231726366" watchObservedRunningTime="2025-09-30 07:49:52.59177724 +0000 UTC m=+978.234683652" Sep 30 07:49:52 crc kubenswrapper[4760]: I0930 07:49:52.823935 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6d5b6d6b67-qd6bn"] Sep 30 07:49:52 crc kubenswrapper[4760]: I0930 07:49:52.825162 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6d5b6d6b67-qd6bn" Sep 30 07:49:52 crc kubenswrapper[4760]: I0930 07:49:52.827247 4760 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-swift-storage-0" Sep 30 07:49:52 crc kubenswrapper[4760]: I0930 07:49:52.839687 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6d5b6d6b67-qd6bn"] Sep 30 07:49:52 crc kubenswrapper[4760]: I0930 07:49:52.906342 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h7hl5\" (UniqueName: \"kubernetes.io/projected/a39adebc-2e75-44a7-b8e2-edcd9631e8d3-kube-api-access-h7hl5\") pod \"dnsmasq-dns-6d5b6d6b67-qd6bn\" (UID: \"a39adebc-2e75-44a7-b8e2-edcd9631e8d3\") " pod="openstack/dnsmasq-dns-6d5b6d6b67-qd6bn" Sep 30 07:49:52 crc kubenswrapper[4760]: I0930 07:49:52.906398 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a39adebc-2e75-44a7-b8e2-edcd9631e8d3-config\") pod \"dnsmasq-dns-6d5b6d6b67-qd6bn\" (UID: \"a39adebc-2e75-44a7-b8e2-edcd9631e8d3\") " pod="openstack/dnsmasq-dns-6d5b6d6b67-qd6bn" Sep 30 07:49:52 crc kubenswrapper[4760]: I0930 07:49:52.906441 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a39adebc-2e75-44a7-b8e2-edcd9631e8d3-ovsdbserver-sb\") pod \"dnsmasq-dns-6d5b6d6b67-qd6bn\" (UID: \"a39adebc-2e75-44a7-b8e2-edcd9631e8d3\") " pod="openstack/dnsmasq-dns-6d5b6d6b67-qd6bn" Sep 30 07:49:52 crc kubenswrapper[4760]: I0930 07:49:52.906595 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a39adebc-2e75-44a7-b8e2-edcd9631e8d3-ovsdbserver-nb\") pod \"dnsmasq-dns-6d5b6d6b67-qd6bn\" (UID: \"a39adebc-2e75-44a7-b8e2-edcd9631e8d3\") " pod="openstack/dnsmasq-dns-6d5b6d6b67-qd6bn" Sep 30 07:49:52 crc kubenswrapper[4760]: I0930 07:49:52.906637 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a39adebc-2e75-44a7-b8e2-edcd9631e8d3-dns-svc\") pod \"dnsmasq-dns-6d5b6d6b67-qd6bn\" (UID: \"a39adebc-2e75-44a7-b8e2-edcd9631e8d3\") " pod="openstack/dnsmasq-dns-6d5b6d6b67-qd6bn" Sep 30 07:49:52 crc kubenswrapper[4760]: I0930 07:49:52.906735 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/a39adebc-2e75-44a7-b8e2-edcd9631e8d3-dns-swift-storage-0\") pod \"dnsmasq-dns-6d5b6d6b67-qd6bn\" (UID: \"a39adebc-2e75-44a7-b8e2-edcd9631e8d3\") " pod="openstack/dnsmasq-dns-6d5b6d6b67-qd6bn" Sep 30 07:49:53 crc kubenswrapper[4760]: I0930 07:49:53.008672 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h7hl5\" (UniqueName: \"kubernetes.io/projected/a39adebc-2e75-44a7-b8e2-edcd9631e8d3-kube-api-access-h7hl5\") pod \"dnsmasq-dns-6d5b6d6b67-qd6bn\" (UID: \"a39adebc-2e75-44a7-b8e2-edcd9631e8d3\") " pod="openstack/dnsmasq-dns-6d5b6d6b67-qd6bn" Sep 30 07:49:53 crc kubenswrapper[4760]: I0930 07:49:53.008751 4760 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a39adebc-2e75-44a7-b8e2-edcd9631e8d3-config\") pod \"dnsmasq-dns-6d5b6d6b67-qd6bn\" (UID: \"a39adebc-2e75-44a7-b8e2-edcd9631e8d3\") " pod="openstack/dnsmasq-dns-6d5b6d6b67-qd6bn" Sep 30 07:49:53 crc kubenswrapper[4760]: I0930 07:49:53.008804 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a39adebc-2e75-44a7-b8e2-edcd9631e8d3-ovsdbserver-sb\") pod \"dnsmasq-dns-6d5b6d6b67-qd6bn\" (UID: \"a39adebc-2e75-44a7-b8e2-edcd9631e8d3\") " pod="openstack/dnsmasq-dns-6d5b6d6b67-qd6bn" Sep 30 07:49:53 crc kubenswrapper[4760]: I0930 07:49:53.008838 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a39adebc-2e75-44a7-b8e2-edcd9631e8d3-ovsdbserver-nb\") pod \"dnsmasq-dns-6d5b6d6b67-qd6bn\" (UID: \"a39adebc-2e75-44a7-b8e2-edcd9631e8d3\") " pod="openstack/dnsmasq-dns-6d5b6d6b67-qd6bn" Sep 30 07:49:53 crc kubenswrapper[4760]: I0930 07:49:53.008858 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a39adebc-2e75-44a7-b8e2-edcd9631e8d3-dns-svc\") pod \"dnsmasq-dns-6d5b6d6b67-qd6bn\" (UID: \"a39adebc-2e75-44a7-b8e2-edcd9631e8d3\") " pod="openstack/dnsmasq-dns-6d5b6d6b67-qd6bn" Sep 30 07:49:53 crc kubenswrapper[4760]: I0930 07:49:53.008888 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/a39adebc-2e75-44a7-b8e2-edcd9631e8d3-dns-swift-storage-0\") pod \"dnsmasq-dns-6d5b6d6b67-qd6bn\" (UID: \"a39adebc-2e75-44a7-b8e2-edcd9631e8d3\") " pod="openstack/dnsmasq-dns-6d5b6d6b67-qd6bn" Sep 30 07:49:53 crc kubenswrapper[4760]: I0930 07:49:53.009778 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: 
\"kubernetes.io/configmap/a39adebc-2e75-44a7-b8e2-edcd9631e8d3-dns-swift-storage-0\") pod \"dnsmasq-dns-6d5b6d6b67-qd6bn\" (UID: \"a39adebc-2e75-44a7-b8e2-edcd9631e8d3\") " pod="openstack/dnsmasq-dns-6d5b6d6b67-qd6bn" Sep 30 07:49:53 crc kubenswrapper[4760]: I0930 07:49:53.010683 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a39adebc-2e75-44a7-b8e2-edcd9631e8d3-config\") pod \"dnsmasq-dns-6d5b6d6b67-qd6bn\" (UID: \"a39adebc-2e75-44a7-b8e2-edcd9631e8d3\") " pod="openstack/dnsmasq-dns-6d5b6d6b67-qd6bn" Sep 30 07:49:53 crc kubenswrapper[4760]: I0930 07:49:53.011352 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a39adebc-2e75-44a7-b8e2-edcd9631e8d3-ovsdbserver-sb\") pod \"dnsmasq-dns-6d5b6d6b67-qd6bn\" (UID: \"a39adebc-2e75-44a7-b8e2-edcd9631e8d3\") " pod="openstack/dnsmasq-dns-6d5b6d6b67-qd6bn" Sep 30 07:49:53 crc kubenswrapper[4760]: I0930 07:49:53.012059 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a39adebc-2e75-44a7-b8e2-edcd9631e8d3-ovsdbserver-nb\") pod \"dnsmasq-dns-6d5b6d6b67-qd6bn\" (UID: \"a39adebc-2e75-44a7-b8e2-edcd9631e8d3\") " pod="openstack/dnsmasq-dns-6d5b6d6b67-qd6bn" Sep 30 07:49:53 crc kubenswrapper[4760]: I0930 07:49:53.012414 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a39adebc-2e75-44a7-b8e2-edcd9631e8d3-dns-svc\") pod \"dnsmasq-dns-6d5b6d6b67-qd6bn\" (UID: \"a39adebc-2e75-44a7-b8e2-edcd9631e8d3\") " pod="openstack/dnsmasq-dns-6d5b6d6b67-qd6bn" Sep 30 07:49:53 crc kubenswrapper[4760]: I0930 07:49:53.032713 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h7hl5\" (UniqueName: \"kubernetes.io/projected/a39adebc-2e75-44a7-b8e2-edcd9631e8d3-kube-api-access-h7hl5\") pod 
\"dnsmasq-dns-6d5b6d6b67-qd6bn\" (UID: \"a39adebc-2e75-44a7-b8e2-edcd9631e8d3\") " pod="openstack/dnsmasq-dns-6d5b6d6b67-qd6bn" Sep 30 07:49:53 crc kubenswrapper[4760]: I0930 07:49:53.077905 4760 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3f6cfefc-4134-42d8-91fe-ebae0bc56801" path="/var/lib/kubelet/pods/3f6cfefc-4134-42d8-91fe-ebae0bc56801/volumes" Sep 30 07:49:53 crc kubenswrapper[4760]: I0930 07:49:53.147689 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6d5b6d6b67-qd6bn" Sep 30 07:49:55 crc kubenswrapper[4760]: I0930 07:49:55.202496 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Sep 30 07:49:56 crc kubenswrapper[4760]: I0930 07:49:56.087583 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0" Sep 30 07:49:56 crc kubenswrapper[4760]: E0930 07:49:56.709906 4760 upgradeaware.go:427] Error proxying data from client to backend: readfrom tcp 38.102.83.201:50590->38.102.83.201:37703: write tcp 38.102.83.201:50590->38.102.83.201:37703: write: broken pipe Sep 30 07:49:56 crc kubenswrapper[4760]: I0930 07:49:56.993520 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-create-85j42"] Sep 30 07:49:56 crc kubenswrapper[4760]: I0930 07:49:56.994790 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-85j42" Sep 30 07:49:57 crc kubenswrapper[4760]: I0930 07:49:57.010325 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-85j42"] Sep 30 07:49:57 crc kubenswrapper[4760]: I0930 07:49:57.080982 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-create-njd24"] Sep 30 07:49:57 crc kubenswrapper[4760]: I0930 07:49:57.082506 4760 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-create-njd24" Sep 30 07:49:57 crc kubenswrapper[4760]: I0930 07:49:57.085935 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w5csr\" (UniqueName: \"kubernetes.io/projected/88a5e2ea-3a8b-4b24-a152-10d0811414c8-kube-api-access-w5csr\") pod \"cinder-db-create-85j42\" (UID: \"88a5e2ea-3a8b-4b24-a152-10d0811414c8\") " pod="openstack/cinder-db-create-85j42" Sep 30 07:49:57 crc kubenswrapper[4760]: I0930 07:49:57.094703 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-njd24"] Sep 30 07:49:57 crc kubenswrapper[4760]: I0930 07:49:57.187187 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vxksn\" (UniqueName: \"kubernetes.io/projected/7c7e9e65-e505-4c0e-bc41-53e420d499ef-kube-api-access-vxksn\") pod \"barbican-db-create-njd24\" (UID: \"7c7e9e65-e505-4c0e-bc41-53e420d499ef\") " pod="openstack/barbican-db-create-njd24" Sep 30 07:49:57 crc kubenswrapper[4760]: I0930 07:49:57.187439 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w5csr\" (UniqueName: \"kubernetes.io/projected/88a5e2ea-3a8b-4b24-a152-10d0811414c8-kube-api-access-w5csr\") pod \"cinder-db-create-85j42\" (UID: \"88a5e2ea-3a8b-4b24-a152-10d0811414c8\") " pod="openstack/cinder-db-create-85j42" Sep 30 07:49:57 crc kubenswrapper[4760]: I0930 07:49:57.208471 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w5csr\" (UniqueName: \"kubernetes.io/projected/88a5e2ea-3a8b-4b24-a152-10d0811414c8-kube-api-access-w5csr\") pod \"cinder-db-create-85j42\" (UID: \"88a5e2ea-3a8b-4b24-a152-10d0811414c8\") " pod="openstack/cinder-db-create-85j42" Sep 30 07:49:57 crc kubenswrapper[4760]: I0930 07:49:57.289278 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vxksn\" 
(UniqueName: \"kubernetes.io/projected/7c7e9e65-e505-4c0e-bc41-53e420d499ef-kube-api-access-vxksn\") pod \"barbican-db-create-njd24\" (UID: \"7c7e9e65-e505-4c0e-bc41-53e420d499ef\") " pod="openstack/barbican-db-create-njd24" Sep 30 07:49:57 crc kubenswrapper[4760]: I0930 07:49:57.308779 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vxksn\" (UniqueName: \"kubernetes.io/projected/7c7e9e65-e505-4c0e-bc41-53e420d499ef-kube-api-access-vxksn\") pod \"barbican-db-create-njd24\" (UID: \"7c7e9e65-e505-4c0e-bc41-53e420d499ef\") " pod="openstack/barbican-db-create-njd24" Sep 30 07:49:57 crc kubenswrapper[4760]: I0930 07:49:57.313844 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-85j42" Sep 30 07:49:57 crc kubenswrapper[4760]: I0930 07:49:57.351569 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-create-g2fkc"] Sep 30 07:49:57 crc kubenswrapper[4760]: I0930 07:49:57.352877 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-g2fkc" Sep 30 07:49:57 crc kubenswrapper[4760]: I0930 07:49:57.377350 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-g2fkc"] Sep 30 07:49:57 crc kubenswrapper[4760]: I0930 07:49:57.405825 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-njd24" Sep 30 07:49:57 crc kubenswrapper[4760]: I0930 07:49:57.420006 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-sync-kxxsc"] Sep 30 07:49:57 crc kubenswrapper[4760]: I0930 07:49:57.421146 4760 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-kxxsc" Sep 30 07:49:57 crc kubenswrapper[4760]: I0930 07:49:57.424460 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Sep 30 07:49:57 crc kubenswrapper[4760]: I0930 07:49:57.424509 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-gx9h7" Sep 30 07:49:57 crc kubenswrapper[4760]: I0930 07:49:57.424699 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Sep 30 07:49:57 crc kubenswrapper[4760]: I0930 07:49:57.424882 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Sep 30 07:49:57 crc kubenswrapper[4760]: I0930 07:49:57.428834 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-kxxsc"] Sep 30 07:49:57 crc kubenswrapper[4760]: I0930 07:49:57.492676 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fc12fba5-2742-45fd-b63c-51b3201acc0a-combined-ca-bundle\") pod \"keystone-db-sync-kxxsc\" (UID: \"fc12fba5-2742-45fd-b63c-51b3201acc0a\") " pod="openstack/keystone-db-sync-kxxsc" Sep 30 07:49:57 crc kubenswrapper[4760]: I0930 07:49:57.492760 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fc12fba5-2742-45fd-b63c-51b3201acc0a-config-data\") pod \"keystone-db-sync-kxxsc\" (UID: \"fc12fba5-2742-45fd-b63c-51b3201acc0a\") " pod="openstack/keystone-db-sync-kxxsc" Sep 30 07:49:57 crc kubenswrapper[4760]: I0930 07:49:57.492836 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n6sdd\" (UniqueName: \"kubernetes.io/projected/fc12fba5-2742-45fd-b63c-51b3201acc0a-kube-api-access-n6sdd\") pod \"keystone-db-sync-kxxsc\" (UID: 
\"fc12fba5-2742-45fd-b63c-51b3201acc0a\") " pod="openstack/keystone-db-sync-kxxsc" Sep 30 07:49:57 crc kubenswrapper[4760]: I0930 07:49:57.493062 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xwntl\" (UniqueName: \"kubernetes.io/projected/131df663-7ab5-42b0-8d39-9633a47f5d4c-kube-api-access-xwntl\") pod \"neutron-db-create-g2fkc\" (UID: \"131df663-7ab5-42b0-8d39-9633a47f5d4c\") " pod="openstack/neutron-db-create-g2fkc" Sep 30 07:49:57 crc kubenswrapper[4760]: I0930 07:49:57.594219 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fc12fba5-2742-45fd-b63c-51b3201acc0a-combined-ca-bundle\") pod \"keystone-db-sync-kxxsc\" (UID: \"fc12fba5-2742-45fd-b63c-51b3201acc0a\") " pod="openstack/keystone-db-sync-kxxsc" Sep 30 07:49:57 crc kubenswrapper[4760]: I0930 07:49:57.594261 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fc12fba5-2742-45fd-b63c-51b3201acc0a-config-data\") pod \"keystone-db-sync-kxxsc\" (UID: \"fc12fba5-2742-45fd-b63c-51b3201acc0a\") " pod="openstack/keystone-db-sync-kxxsc" Sep 30 07:49:57 crc kubenswrapper[4760]: I0930 07:49:57.594313 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n6sdd\" (UniqueName: \"kubernetes.io/projected/fc12fba5-2742-45fd-b63c-51b3201acc0a-kube-api-access-n6sdd\") pod \"keystone-db-sync-kxxsc\" (UID: \"fc12fba5-2742-45fd-b63c-51b3201acc0a\") " pod="openstack/keystone-db-sync-kxxsc" Sep 30 07:49:57 crc kubenswrapper[4760]: I0930 07:49:57.594361 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xwntl\" (UniqueName: \"kubernetes.io/projected/131df663-7ab5-42b0-8d39-9633a47f5d4c-kube-api-access-xwntl\") pod \"neutron-db-create-g2fkc\" (UID: \"131df663-7ab5-42b0-8d39-9633a47f5d4c\") " 
pod="openstack/neutron-db-create-g2fkc" Sep 30 07:49:57 crc kubenswrapper[4760]: I0930 07:49:57.603329 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fc12fba5-2742-45fd-b63c-51b3201acc0a-combined-ca-bundle\") pod \"keystone-db-sync-kxxsc\" (UID: \"fc12fba5-2742-45fd-b63c-51b3201acc0a\") " pod="openstack/keystone-db-sync-kxxsc" Sep 30 07:49:57 crc kubenswrapper[4760]: I0930 07:49:57.604666 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fc12fba5-2742-45fd-b63c-51b3201acc0a-config-data\") pod \"keystone-db-sync-kxxsc\" (UID: \"fc12fba5-2742-45fd-b63c-51b3201acc0a\") " pod="openstack/keystone-db-sync-kxxsc" Sep 30 07:49:57 crc kubenswrapper[4760]: I0930 07:49:57.611798 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xwntl\" (UniqueName: \"kubernetes.io/projected/131df663-7ab5-42b0-8d39-9633a47f5d4c-kube-api-access-xwntl\") pod \"neutron-db-create-g2fkc\" (UID: \"131df663-7ab5-42b0-8d39-9633a47f5d4c\") " pod="openstack/neutron-db-create-g2fkc" Sep 30 07:49:57 crc kubenswrapper[4760]: I0930 07:49:57.623396 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n6sdd\" (UniqueName: \"kubernetes.io/projected/fc12fba5-2742-45fd-b63c-51b3201acc0a-kube-api-access-n6sdd\") pod \"keystone-db-sync-kxxsc\" (UID: \"fc12fba5-2742-45fd-b63c-51b3201acc0a\") " pod="openstack/keystone-db-sync-kxxsc" Sep 30 07:49:57 crc kubenswrapper[4760]: I0930 07:49:57.687962 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-g2fkc" Sep 30 07:49:57 crc kubenswrapper[4760]: I0930 07:49:57.736702 4760 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-kxxsc" Sep 30 07:49:59 crc kubenswrapper[4760]: E0930 07:49:59.553943 4760 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-glance-api:current-podified" Sep 30 07:49:59 crc kubenswrapper[4760]: E0930 07:49:59.555365 4760 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:glance-db-sync,Image:quay.io/podified-antelope-centos9/openstack-glance-api:current-podified,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:true,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/glance/glance.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/my.cnf,SubPath:my.cnf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:db-sync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-dwqgz,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,S
ELinuxOptions:nil,RunAsUser:*42415,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:*42415,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod glance-db-sync-s45t8_openstack(09c9e208-48cc-44b8-9810-fc0cf69cea8a): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Sep 30 07:49:59 crc kubenswrapper[4760]: E0930 07:49:59.556548 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"glance-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/glance-db-sync-s45t8" podUID="09c9e208-48cc-44b8-9810-fc0cf69cea8a" Sep 30 07:49:59 crc kubenswrapper[4760]: I0930 07:49:59.638663 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-439d-account-create-n6z9w" event={"ID":"b41f3bf7-e274-44ad-b745-c723e31a5167","Type":"ContainerDied","Data":"0ec941207e056a6b1dd8f6f36865b49c45b5667d65b481ec1827498825cb1254"} Sep 30 07:49:59 crc kubenswrapper[4760]: I0930 07:49:59.638760 4760 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0ec941207e056a6b1dd8f6f36865b49c45b5667d65b481ec1827498825cb1254" Sep 30 07:49:59 crc kubenswrapper[4760]: E0930 07:49:59.668257 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"glance-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-glance-api:current-podified\\\"\"" pod="openstack/glance-db-sync-s45t8" podUID="09c9e208-48cc-44b8-9810-fc0cf69cea8a" Sep 30 07:49:59 crc kubenswrapper[4760]: I0930 07:49:59.755563 4760 
util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-439d-account-create-n6z9w" Sep 30 07:49:59 crc kubenswrapper[4760]: I0930 07:49:59.830801 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-57qsz\" (UniqueName: \"kubernetes.io/projected/b41f3bf7-e274-44ad-b745-c723e31a5167-kube-api-access-57qsz\") pod \"b41f3bf7-e274-44ad-b745-c723e31a5167\" (UID: \"b41f3bf7-e274-44ad-b745-c723e31a5167\") " Sep 30 07:49:59 crc kubenswrapper[4760]: I0930 07:49:59.845653 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b41f3bf7-e274-44ad-b745-c723e31a5167-kube-api-access-57qsz" (OuterVolumeSpecName: "kube-api-access-57qsz") pod "b41f3bf7-e274-44ad-b745-c723e31a5167" (UID: "b41f3bf7-e274-44ad-b745-c723e31a5167"). InnerVolumeSpecName "kube-api-access-57qsz". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 07:49:59 crc kubenswrapper[4760]: I0930 07:49:59.933088 4760 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-57qsz\" (UniqueName: \"kubernetes.io/projected/b41f3bf7-e274-44ad-b745-c723e31a5167-kube-api-access-57qsz\") on node \"crc\" DevicePath \"\"" Sep 30 07:50:00 crc kubenswrapper[4760]: I0930 07:50:00.089666 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-njd24"] Sep 30 07:50:00 crc kubenswrapper[4760]: W0930 07:50:00.098662 4760 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7c7e9e65_e505_4c0e_bc41_53e420d499ef.slice/crio-87d75d0ea022674f3a3ae8316d90c638bef51b8bf99145d92fef909339314b2a WatchSource:0}: Error finding container 87d75d0ea022674f3a3ae8316d90c638bef51b8bf99145d92fef909339314b2a: Status 404 returned error can't find the container with id 87d75d0ea022674f3a3ae8316d90c638bef51b8bf99145d92fef909339314b2a Sep 30 07:50:00 crc kubenswrapper[4760]: I0930 
07:50:00.314125 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-kxxsc"] Sep 30 07:50:00 crc kubenswrapper[4760]: I0930 07:50:00.321446 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-85j42"] Sep 30 07:50:00 crc kubenswrapper[4760]: W0930 07:50:00.328685 4760 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfc12fba5_2742_45fd_b63c_51b3201acc0a.slice/crio-d0691f040f7c1d487120ac1f6d78d05f52b97eaa9be4df1089ca344492f5b342 WatchSource:0}: Error finding container d0691f040f7c1d487120ac1f6d78d05f52b97eaa9be4df1089ca344492f5b342: Status 404 returned error can't find the container with id d0691f040f7c1d487120ac1f6d78d05f52b97eaa9be4df1089ca344492f5b342 Sep 30 07:50:00 crc kubenswrapper[4760]: I0930 07:50:00.330531 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-g2fkc"] Sep 30 07:50:00 crc kubenswrapper[4760]: W0930 07:50:00.380829 4760 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod131df663_7ab5_42b0_8d39_9633a47f5d4c.slice/crio-1af5fc42d1190cbf17eb2fb7e4ff5352749866dc2f89581a2058b785db2daeda WatchSource:0}: Error finding container 1af5fc42d1190cbf17eb2fb7e4ff5352749866dc2f89581a2058b785db2daeda: Status 404 returned error can't find the container with id 1af5fc42d1190cbf17eb2fb7e4ff5352749866dc2f89581a2058b785db2daeda Sep 30 07:50:00 crc kubenswrapper[4760]: W0930 07:50:00.383057 4760 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod88a5e2ea_3a8b_4b24_a152_10d0811414c8.slice/crio-9e43c31035fc93c6618ff54a64b02bedafaac0dea40b2f1b01358f85b95fe6d0 WatchSource:0}: Error finding container 9e43c31035fc93c6618ff54a64b02bedafaac0dea40b2f1b01358f85b95fe6d0: Status 404 returned error can't find the container with id 
9e43c31035fc93c6618ff54a64b02bedafaac0dea40b2f1b01358f85b95fe6d0 Sep 30 07:50:00 crc kubenswrapper[4760]: I0930 07:50:00.472884 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-56wgh-config-8ltt7"] Sep 30 07:50:00 crc kubenswrapper[4760]: W0930 07:50:00.477275 4760 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod385cee6a_8dab_47e6_babe_18b0490d5398.slice/crio-be9a49f1d4b1a5c41ecfaa834b3a77e4b6e54f560b6f556e57acd97f6fc8b70d WatchSource:0}: Error finding container be9a49f1d4b1a5c41ecfaa834b3a77e4b6e54f560b6f556e57acd97f6fc8b70d: Status 404 returned error can't find the container with id be9a49f1d4b1a5c41ecfaa834b3a77e4b6e54f560b6f556e57acd97f6fc8b70d Sep 30 07:50:00 crc kubenswrapper[4760]: I0930 07:50:00.486354 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6d5b6d6b67-qd6bn"] Sep 30 07:50:00 crc kubenswrapper[4760]: W0930 07:50:00.510157 4760 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda39adebc_2e75_44a7_b8e2_edcd9631e8d3.slice/crio-59e1bf1f8abe56bf050028140d1422871705417cfcb19caa27ba2e0bd051d36e WatchSource:0}: Error finding container 59e1bf1f8abe56bf050028140d1422871705417cfcb19caa27ba2e0bd051d36e: Status 404 returned error can't find the container with id 59e1bf1f8abe56bf050028140d1422871705417cfcb19caa27ba2e0bd051d36e Sep 30 07:50:00 crc kubenswrapper[4760]: I0930 07:50:00.673226 4760 generic.go:334] "Generic (PLEG): container finished" podID="88a5e2ea-3a8b-4b24-a152-10d0811414c8" containerID="8579a4d68d22e9756e93b14f9362981dd05e3e81b78de136aeada899e94d83b4" exitCode=0 Sep 30 07:50:00 crc kubenswrapper[4760]: I0930 07:50:00.673322 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-85j42" 
event={"ID":"88a5e2ea-3a8b-4b24-a152-10d0811414c8","Type":"ContainerDied","Data":"8579a4d68d22e9756e93b14f9362981dd05e3e81b78de136aeada899e94d83b4"} Sep 30 07:50:00 crc kubenswrapper[4760]: I0930 07:50:00.673346 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-85j42" event={"ID":"88a5e2ea-3a8b-4b24-a152-10d0811414c8","Type":"ContainerStarted","Data":"9e43c31035fc93c6618ff54a64b02bedafaac0dea40b2f1b01358f85b95fe6d0"} Sep 30 07:50:00 crc kubenswrapper[4760]: I0930 07:50:00.677823 4760 generic.go:334] "Generic (PLEG): container finished" podID="7c7e9e65-e505-4c0e-bc41-53e420d499ef" containerID="be1a65183707097946e16a4218228c506f740928573e2b4f019b3143314b3217" exitCode=0 Sep 30 07:50:00 crc kubenswrapper[4760]: I0930 07:50:00.677865 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-njd24" event={"ID":"7c7e9e65-e505-4c0e-bc41-53e420d499ef","Type":"ContainerDied","Data":"be1a65183707097946e16a4218228c506f740928573e2b4f019b3143314b3217"} Sep 30 07:50:00 crc kubenswrapper[4760]: I0930 07:50:00.677885 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-njd24" event={"ID":"7c7e9e65-e505-4c0e-bc41-53e420d499ef","Type":"ContainerStarted","Data":"87d75d0ea022674f3a3ae8316d90c638bef51b8bf99145d92fef909339314b2a"} Sep 30 07:50:00 crc kubenswrapper[4760]: I0930 07:50:00.682721 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"9784a9e1-42ad-4f0b-ae43-a3227158a763","Type":"ContainerStarted","Data":"63ec4e758f6a709110fe3d253fc67a8a6f598cba4026ccd8b9f1b87abaa502fe"} Sep 30 07:50:00 crc kubenswrapper[4760]: I0930 07:50:00.684357 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6d5b6d6b67-qd6bn" event={"ID":"a39adebc-2e75-44a7-b8e2-edcd9631e8d3","Type":"ContainerStarted","Data":"59e1bf1f8abe56bf050028140d1422871705417cfcb19caa27ba2e0bd051d36e"} Sep 30 07:50:00 crc 
kubenswrapper[4760]: I0930 07:50:00.687616 4760 generic.go:334] "Generic (PLEG): container finished" podID="131df663-7ab5-42b0-8d39-9633a47f5d4c" containerID="17d3b399aae2293488f77d73df6ba2f7106ed3ea8b3dd4f3ad4cd312eff74859" exitCode=0 Sep 30 07:50:00 crc kubenswrapper[4760]: I0930 07:50:00.687672 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-g2fkc" event={"ID":"131df663-7ab5-42b0-8d39-9633a47f5d4c","Type":"ContainerDied","Data":"17d3b399aae2293488f77d73df6ba2f7106ed3ea8b3dd4f3ad4cd312eff74859"} Sep 30 07:50:00 crc kubenswrapper[4760]: I0930 07:50:00.687692 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-g2fkc" event={"ID":"131df663-7ab5-42b0-8d39-9633a47f5d4c","Type":"ContainerStarted","Data":"1af5fc42d1190cbf17eb2fb7e4ff5352749866dc2f89581a2058b785db2daeda"} Sep 30 07:50:00 crc kubenswrapper[4760]: I0930 07:50:00.688657 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-56wgh-config-8ltt7" event={"ID":"385cee6a-8dab-47e6-babe-18b0490d5398","Type":"ContainerStarted","Data":"be9a49f1d4b1a5c41ecfaa834b3a77e4b6e54f560b6f556e57acd97f6fc8b70d"} Sep 30 07:50:00 crc kubenswrapper[4760]: I0930 07:50:00.689779 4760 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/watcher-439d-account-create-n6z9w" Sep 30 07:50:00 crc kubenswrapper[4760]: I0930 07:50:00.689795 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-kxxsc" event={"ID":"fc12fba5-2742-45fd-b63c-51b3201acc0a","Type":"ContainerStarted","Data":"d0691f040f7c1d487120ac1f6d78d05f52b97eaa9be4df1089ca344492f5b342"} Sep 30 07:50:01 crc kubenswrapper[4760]: I0930 07:50:01.706747 4760 generic.go:334] "Generic (PLEG): container finished" podID="385cee6a-8dab-47e6-babe-18b0490d5398" containerID="36d6105918e0e3dba49282bea8e38b9720ce0cff8c55ddd63be3ab273613a242" exitCode=0 Sep 30 07:50:01 crc kubenswrapper[4760]: I0930 07:50:01.706850 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-56wgh-config-8ltt7" event={"ID":"385cee6a-8dab-47e6-babe-18b0490d5398","Type":"ContainerDied","Data":"36d6105918e0e3dba49282bea8e38b9720ce0cff8c55ddd63be3ab273613a242"} Sep 30 07:50:01 crc kubenswrapper[4760]: I0930 07:50:01.710039 4760 generic.go:334] "Generic (PLEG): container finished" podID="a39adebc-2e75-44a7-b8e2-edcd9631e8d3" containerID="11203f7700b1de680e69bf58197c15839c9ed92399361c6b6c1e024869ccd8c0" exitCode=0 Sep 30 07:50:01 crc kubenswrapper[4760]: I0930 07:50:01.710160 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6d5b6d6b67-qd6bn" event={"ID":"a39adebc-2e75-44a7-b8e2-edcd9631e8d3","Type":"ContainerDied","Data":"11203f7700b1de680e69bf58197c15839c9ed92399361c6b6c1e024869ccd8c0"} Sep 30 07:50:02 crc kubenswrapper[4760]: I0930 07:50:02.721553 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6d5b6d6b67-qd6bn" event={"ID":"a39adebc-2e75-44a7-b8e2-edcd9631e8d3","Type":"ContainerStarted","Data":"31453f880b5979ac04025f2ff809eb14cfb86d5d44a1f916d96f3fbcc9a7fc11"} Sep 30 07:50:02 crc kubenswrapper[4760]: I0930 07:50:02.722021 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack/dnsmasq-dns-6d5b6d6b67-qd6bn" Sep 30 07:50:02 crc kubenswrapper[4760]: I0930 07:50:02.755827 4760 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-6d5b6d6b67-qd6bn" podStartSLOduration=10.755806804 podStartE2EDuration="10.755806804s" podCreationTimestamp="2025-09-30 07:49:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 07:50:02.749485923 +0000 UTC m=+988.392392345" watchObservedRunningTime="2025-09-30 07:50:02.755806804 +0000 UTC m=+988.398713216" Sep 30 07:50:03 crc kubenswrapper[4760]: I0930 07:50:03.733005 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"9784a9e1-42ad-4f0b-ae43-a3227158a763","Type":"ContainerStarted","Data":"ccd8af5e4644fe30831ecfe237486ca035bb69adba8cba7b97ba5bd89895aa0a"} Sep 30 07:50:04 crc kubenswrapper[4760]: I0930 07:50:04.676138 4760 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-g2fkc" Sep 30 07:50:04 crc kubenswrapper[4760]: I0930 07:50:04.714999 4760 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-56wgh-config-8ltt7" Sep 30 07:50:04 crc kubenswrapper[4760]: I0930 07:50:04.753798 4760 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-njd24" Sep 30 07:50:04 crc kubenswrapper[4760]: I0930 07:50:04.822506 4760 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-56wgh-config-8ltt7" Sep 30 07:50:04 crc kubenswrapper[4760]: I0930 07:50:04.822510 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-56wgh-config-8ltt7" event={"ID":"385cee6a-8dab-47e6-babe-18b0490d5398","Type":"ContainerDied","Data":"be9a49f1d4b1a5c41ecfaa834b3a77e4b6e54f560b6f556e57acd97f6fc8b70d"} Sep 30 07:50:04 crc kubenswrapper[4760]: I0930 07:50:04.822622 4760 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="be9a49f1d4b1a5c41ecfaa834b3a77e4b6e54f560b6f556e57acd97f6fc8b70d" Sep 30 07:50:04 crc kubenswrapper[4760]: I0930 07:50:04.823624 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-85j42" event={"ID":"88a5e2ea-3a8b-4b24-a152-10d0811414c8","Type":"ContainerDied","Data":"9e43c31035fc93c6618ff54a64b02bedafaac0dea40b2f1b01358f85b95fe6d0"} Sep 30 07:50:04 crc kubenswrapper[4760]: I0930 07:50:04.823661 4760 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9e43c31035fc93c6618ff54a64b02bedafaac0dea40b2f1b01358f85b95fe6d0" Sep 30 07:50:04 crc kubenswrapper[4760]: I0930 07:50:04.824115 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/385cee6a-8dab-47e6-babe-18b0490d5398-var-run\") pod \"385cee6a-8dab-47e6-babe-18b0490d5398\" (UID: \"385cee6a-8dab-47e6-babe-18b0490d5398\") " Sep 30 07:50:04 crc kubenswrapper[4760]: I0930 07:50:04.824170 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/385cee6a-8dab-47e6-babe-18b0490d5398-var-run-ovn\") pod \"385cee6a-8dab-47e6-babe-18b0490d5398\" (UID: \"385cee6a-8dab-47e6-babe-18b0490d5398\") " Sep 30 07:50:04 crc kubenswrapper[4760]: I0930 07:50:04.824201 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vxksn\" 
(UniqueName: \"kubernetes.io/projected/7c7e9e65-e505-4c0e-bc41-53e420d499ef-kube-api-access-vxksn\") pod \"7c7e9e65-e505-4c0e-bc41-53e420d499ef\" (UID: \"7c7e9e65-e505-4c0e-bc41-53e420d499ef\") " Sep 30 07:50:04 crc kubenswrapper[4760]: I0930 07:50:04.824217 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xwntl\" (UniqueName: \"kubernetes.io/projected/131df663-7ab5-42b0-8d39-9633a47f5d4c-kube-api-access-xwntl\") pod \"131df663-7ab5-42b0-8d39-9633a47f5d4c\" (UID: \"131df663-7ab5-42b0-8d39-9633a47f5d4c\") " Sep 30 07:50:04 crc kubenswrapper[4760]: I0930 07:50:04.824586 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xlcxw\" (UniqueName: \"kubernetes.io/projected/385cee6a-8dab-47e6-babe-18b0490d5398-kube-api-access-xlcxw\") pod \"385cee6a-8dab-47e6-babe-18b0490d5398\" (UID: \"385cee6a-8dab-47e6-babe-18b0490d5398\") " Sep 30 07:50:04 crc kubenswrapper[4760]: I0930 07:50:04.824614 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/385cee6a-8dab-47e6-babe-18b0490d5398-var-log-ovn\") pod \"385cee6a-8dab-47e6-babe-18b0490d5398\" (UID: \"385cee6a-8dab-47e6-babe-18b0490d5398\") " Sep 30 07:50:04 crc kubenswrapper[4760]: I0930 07:50:04.824729 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/385cee6a-8dab-47e6-babe-18b0490d5398-scripts\") pod \"385cee6a-8dab-47e6-babe-18b0490d5398\" (UID: \"385cee6a-8dab-47e6-babe-18b0490d5398\") " Sep 30 07:50:04 crc kubenswrapper[4760]: I0930 07:50:04.824778 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/385cee6a-8dab-47e6-babe-18b0490d5398-additional-scripts\") pod \"385cee6a-8dab-47e6-babe-18b0490d5398\" (UID: \"385cee6a-8dab-47e6-babe-18b0490d5398\") " Sep 30 07:50:04 
crc kubenswrapper[4760]: I0930 07:50:04.826350 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/385cee6a-8dab-47e6-babe-18b0490d5398-additional-scripts" (OuterVolumeSpecName: "additional-scripts") pod "385cee6a-8dab-47e6-babe-18b0490d5398" (UID: "385cee6a-8dab-47e6-babe-18b0490d5398"). InnerVolumeSpecName "additional-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 07:50:04 crc kubenswrapper[4760]: I0930 07:50:04.828090 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/385cee6a-8dab-47e6-babe-18b0490d5398-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "385cee6a-8dab-47e6-babe-18b0490d5398" (UID: "385cee6a-8dab-47e6-babe-18b0490d5398"). InnerVolumeSpecName "var-log-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Sep 30 07:50:04 crc kubenswrapper[4760]: I0930 07:50:04.829249 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/385cee6a-8dab-47e6-babe-18b0490d5398-scripts" (OuterVolumeSpecName: "scripts") pod "385cee6a-8dab-47e6-babe-18b0490d5398" (UID: "385cee6a-8dab-47e6-babe-18b0490d5398"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 07:50:04 crc kubenswrapper[4760]: I0930 07:50:04.829281 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/385cee6a-8dab-47e6-babe-18b0490d5398-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "385cee6a-8dab-47e6-babe-18b0490d5398" (UID: "385cee6a-8dab-47e6-babe-18b0490d5398"). InnerVolumeSpecName "var-run-ovn". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Sep 30 07:50:04 crc kubenswrapper[4760]: I0930 07:50:04.829312 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/385cee6a-8dab-47e6-babe-18b0490d5398-var-run" (OuterVolumeSpecName: "var-run") pod "385cee6a-8dab-47e6-babe-18b0490d5398" (UID: "385cee6a-8dab-47e6-babe-18b0490d5398"). InnerVolumeSpecName "var-run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Sep 30 07:50:04 crc kubenswrapper[4760]: I0930 07:50:04.830945 4760 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-85j42" Sep 30 07:50:04 crc kubenswrapper[4760]: I0930 07:50:04.831185 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-njd24" event={"ID":"7c7e9e65-e505-4c0e-bc41-53e420d499ef","Type":"ContainerDied","Data":"87d75d0ea022674f3a3ae8316d90c638bef51b8bf99145d92fef909339314b2a"} Sep 30 07:50:04 crc kubenswrapper[4760]: I0930 07:50:04.831215 4760 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="87d75d0ea022674f3a3ae8316d90c638bef51b8bf99145d92fef909339314b2a" Sep 30 07:50:04 crc kubenswrapper[4760]: I0930 07:50:04.831282 4760 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-njd24" Sep 30 07:50:04 crc kubenswrapper[4760]: I0930 07:50:04.834408 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7c7e9e65-e505-4c0e-bc41-53e420d499ef-kube-api-access-vxksn" (OuterVolumeSpecName: "kube-api-access-vxksn") pod "7c7e9e65-e505-4c0e-bc41-53e420d499ef" (UID: "7c7e9e65-e505-4c0e-bc41-53e420d499ef"). InnerVolumeSpecName "kube-api-access-vxksn". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 07:50:04 crc kubenswrapper[4760]: I0930 07:50:04.835334 4760 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-create-g2fkc" Sep 30 07:50:04 crc kubenswrapper[4760]: I0930 07:50:04.835417 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-g2fkc" event={"ID":"131df663-7ab5-42b0-8d39-9633a47f5d4c","Type":"ContainerDied","Data":"1af5fc42d1190cbf17eb2fb7e4ff5352749866dc2f89581a2058b785db2daeda"} Sep 30 07:50:04 crc kubenswrapper[4760]: I0930 07:50:04.835440 4760 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1af5fc42d1190cbf17eb2fb7e4ff5352749866dc2f89581a2058b785db2daeda" Sep 30 07:50:04 crc kubenswrapper[4760]: I0930 07:50:04.835597 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/131df663-7ab5-42b0-8d39-9633a47f5d4c-kube-api-access-xwntl" (OuterVolumeSpecName: "kube-api-access-xwntl") pod "131df663-7ab5-42b0-8d39-9633a47f5d4c" (UID: "131df663-7ab5-42b0-8d39-9633a47f5d4c"). InnerVolumeSpecName "kube-api-access-xwntl". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 07:50:04 crc kubenswrapper[4760]: I0930 07:50:04.836011 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/385cee6a-8dab-47e6-babe-18b0490d5398-kube-api-access-xlcxw" (OuterVolumeSpecName: "kube-api-access-xlcxw") pod "385cee6a-8dab-47e6-babe-18b0490d5398" (UID: "385cee6a-8dab-47e6-babe-18b0490d5398"). InnerVolumeSpecName "kube-api-access-xlcxw". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 07:50:04 crc kubenswrapper[4760]: I0930 07:50:04.926021 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w5csr\" (UniqueName: \"kubernetes.io/projected/88a5e2ea-3a8b-4b24-a152-10d0811414c8-kube-api-access-w5csr\") pod \"88a5e2ea-3a8b-4b24-a152-10d0811414c8\" (UID: \"88a5e2ea-3a8b-4b24-a152-10d0811414c8\") " Sep 30 07:50:04 crc kubenswrapper[4760]: I0930 07:50:04.926719 4760 reconciler_common.go:293] "Volume detached for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/385cee6a-8dab-47e6-babe-18b0490d5398-additional-scripts\") on node \"crc\" DevicePath \"\"" Sep 30 07:50:04 crc kubenswrapper[4760]: I0930 07:50:04.926738 4760 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/385cee6a-8dab-47e6-babe-18b0490d5398-var-run\") on node \"crc\" DevicePath \"\"" Sep 30 07:50:04 crc kubenswrapper[4760]: I0930 07:50:04.926747 4760 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/385cee6a-8dab-47e6-babe-18b0490d5398-var-run-ovn\") on node \"crc\" DevicePath \"\"" Sep 30 07:50:04 crc kubenswrapper[4760]: I0930 07:50:04.926756 4760 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vxksn\" (UniqueName: \"kubernetes.io/projected/7c7e9e65-e505-4c0e-bc41-53e420d499ef-kube-api-access-vxksn\") on node \"crc\" DevicePath \"\"" Sep 30 07:50:04 crc kubenswrapper[4760]: I0930 07:50:04.926765 4760 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xwntl\" (UniqueName: \"kubernetes.io/projected/131df663-7ab5-42b0-8d39-9633a47f5d4c-kube-api-access-xwntl\") on node \"crc\" DevicePath \"\"" Sep 30 07:50:04 crc kubenswrapper[4760]: I0930 07:50:04.926790 4760 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xlcxw\" (UniqueName: 
\"kubernetes.io/projected/385cee6a-8dab-47e6-babe-18b0490d5398-kube-api-access-xlcxw\") on node \"crc\" DevicePath \"\"" Sep 30 07:50:04 crc kubenswrapper[4760]: I0930 07:50:04.926798 4760 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/385cee6a-8dab-47e6-babe-18b0490d5398-var-log-ovn\") on node \"crc\" DevicePath \"\"" Sep 30 07:50:04 crc kubenswrapper[4760]: I0930 07:50:04.926805 4760 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/385cee6a-8dab-47e6-babe-18b0490d5398-scripts\") on node \"crc\" DevicePath \"\"" Sep 30 07:50:04 crc kubenswrapper[4760]: I0930 07:50:04.931115 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/88a5e2ea-3a8b-4b24-a152-10d0811414c8-kube-api-access-w5csr" (OuterVolumeSpecName: "kube-api-access-w5csr") pod "88a5e2ea-3a8b-4b24-a152-10d0811414c8" (UID: "88a5e2ea-3a8b-4b24-a152-10d0811414c8"). InnerVolumeSpecName "kube-api-access-w5csr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 07:50:05 crc kubenswrapper[4760]: I0930 07:50:05.028719 4760 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w5csr\" (UniqueName: \"kubernetes.io/projected/88a5e2ea-3a8b-4b24-a152-10d0811414c8-kube-api-access-w5csr\") on node \"crc\" DevicePath \"\"" Sep 30 07:50:05 crc kubenswrapper[4760]: I0930 07:50:05.270935 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/watcher-db-sync-mpstv"] Sep 30 07:50:05 crc kubenswrapper[4760]: E0930 07:50:05.271254 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="88a5e2ea-3a8b-4b24-a152-10d0811414c8" containerName="mariadb-database-create" Sep 30 07:50:05 crc kubenswrapper[4760]: I0930 07:50:05.271268 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="88a5e2ea-3a8b-4b24-a152-10d0811414c8" containerName="mariadb-database-create" Sep 30 07:50:05 crc kubenswrapper[4760]: E0930 07:50:05.271284 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="385cee6a-8dab-47e6-babe-18b0490d5398" containerName="ovn-config" Sep 30 07:50:05 crc kubenswrapper[4760]: I0930 07:50:05.271289 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="385cee6a-8dab-47e6-babe-18b0490d5398" containerName="ovn-config" Sep 30 07:50:05 crc kubenswrapper[4760]: E0930 07:50:05.271326 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="131df663-7ab5-42b0-8d39-9633a47f5d4c" containerName="mariadb-database-create" Sep 30 07:50:05 crc kubenswrapper[4760]: I0930 07:50:05.271334 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="131df663-7ab5-42b0-8d39-9633a47f5d4c" containerName="mariadb-database-create" Sep 30 07:50:05 crc kubenswrapper[4760]: E0930 07:50:05.271345 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7c7e9e65-e505-4c0e-bc41-53e420d499ef" containerName="mariadb-database-create" Sep 30 07:50:05 crc kubenswrapper[4760]: I0930 07:50:05.271351 4760 
state_mem.go:107] "Deleted CPUSet assignment" podUID="7c7e9e65-e505-4c0e-bc41-53e420d499ef" containerName="mariadb-database-create" Sep 30 07:50:05 crc kubenswrapper[4760]: E0930 07:50:05.271363 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b41f3bf7-e274-44ad-b745-c723e31a5167" containerName="mariadb-account-create" Sep 30 07:50:05 crc kubenswrapper[4760]: I0930 07:50:05.271369 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="b41f3bf7-e274-44ad-b745-c723e31a5167" containerName="mariadb-account-create" Sep 30 07:50:05 crc kubenswrapper[4760]: I0930 07:50:05.271513 4760 memory_manager.go:354] "RemoveStaleState removing state" podUID="385cee6a-8dab-47e6-babe-18b0490d5398" containerName="ovn-config" Sep 30 07:50:05 crc kubenswrapper[4760]: I0930 07:50:05.271527 4760 memory_manager.go:354] "RemoveStaleState removing state" podUID="88a5e2ea-3a8b-4b24-a152-10d0811414c8" containerName="mariadb-database-create" Sep 30 07:50:05 crc kubenswrapper[4760]: I0930 07:50:05.271537 4760 memory_manager.go:354] "RemoveStaleState removing state" podUID="7c7e9e65-e505-4c0e-bc41-53e420d499ef" containerName="mariadb-database-create" Sep 30 07:50:05 crc kubenswrapper[4760]: I0930 07:50:05.271548 4760 memory_manager.go:354] "RemoveStaleState removing state" podUID="b41f3bf7-e274-44ad-b745-c723e31a5167" containerName="mariadb-account-create" Sep 30 07:50:05 crc kubenswrapper[4760]: I0930 07:50:05.271557 4760 memory_manager.go:354] "RemoveStaleState removing state" podUID="131df663-7ab5-42b0-8d39-9633a47f5d4c" containerName="mariadb-database-create" Sep 30 07:50:05 crc kubenswrapper[4760]: I0930 07:50:05.272066 4760 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/watcher-db-sync-mpstv" Sep 30 07:50:05 crc kubenswrapper[4760]: I0930 07:50:05.273954 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"watcher-watcher-dockercfg-blgm4" Sep 30 07:50:05 crc kubenswrapper[4760]: I0930 07:50:05.274072 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"watcher-config-data" Sep 30 07:50:05 crc kubenswrapper[4760]: I0930 07:50:05.282042 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-db-sync-mpstv"] Sep 30 07:50:05 crc kubenswrapper[4760]: I0930 07:50:05.332562 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c2v22\" (UniqueName: \"kubernetes.io/projected/cf25d474-c105-4c8b-87ad-0911e245056f-kube-api-access-c2v22\") pod \"watcher-db-sync-mpstv\" (UID: \"cf25d474-c105-4c8b-87ad-0911e245056f\") " pod="openstack/watcher-db-sync-mpstv" Sep 30 07:50:05 crc kubenswrapper[4760]: I0930 07:50:05.332711 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cf25d474-c105-4c8b-87ad-0911e245056f-config-data\") pod \"watcher-db-sync-mpstv\" (UID: \"cf25d474-c105-4c8b-87ad-0911e245056f\") " pod="openstack/watcher-db-sync-mpstv" Sep 30 07:50:05 crc kubenswrapper[4760]: I0930 07:50:05.332991 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cf25d474-c105-4c8b-87ad-0911e245056f-combined-ca-bundle\") pod \"watcher-db-sync-mpstv\" (UID: \"cf25d474-c105-4c8b-87ad-0911e245056f\") " pod="openstack/watcher-db-sync-mpstv" Sep 30 07:50:05 crc kubenswrapper[4760]: I0930 07:50:05.333064 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: 
\"kubernetes.io/secret/cf25d474-c105-4c8b-87ad-0911e245056f-db-sync-config-data\") pod \"watcher-db-sync-mpstv\" (UID: \"cf25d474-c105-4c8b-87ad-0911e245056f\") " pod="openstack/watcher-db-sync-mpstv" Sep 30 07:50:05 crc kubenswrapper[4760]: I0930 07:50:05.434337 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c2v22\" (UniqueName: \"kubernetes.io/projected/cf25d474-c105-4c8b-87ad-0911e245056f-kube-api-access-c2v22\") pod \"watcher-db-sync-mpstv\" (UID: \"cf25d474-c105-4c8b-87ad-0911e245056f\") " pod="openstack/watcher-db-sync-mpstv" Sep 30 07:50:05 crc kubenswrapper[4760]: I0930 07:50:05.434424 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cf25d474-c105-4c8b-87ad-0911e245056f-config-data\") pod \"watcher-db-sync-mpstv\" (UID: \"cf25d474-c105-4c8b-87ad-0911e245056f\") " pod="openstack/watcher-db-sync-mpstv" Sep 30 07:50:05 crc kubenswrapper[4760]: I0930 07:50:05.434511 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cf25d474-c105-4c8b-87ad-0911e245056f-combined-ca-bundle\") pod \"watcher-db-sync-mpstv\" (UID: \"cf25d474-c105-4c8b-87ad-0911e245056f\") " pod="openstack/watcher-db-sync-mpstv" Sep 30 07:50:05 crc kubenswrapper[4760]: I0930 07:50:05.434545 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/cf25d474-c105-4c8b-87ad-0911e245056f-db-sync-config-data\") pod \"watcher-db-sync-mpstv\" (UID: \"cf25d474-c105-4c8b-87ad-0911e245056f\") " pod="openstack/watcher-db-sync-mpstv" Sep 30 07:50:05 crc kubenswrapper[4760]: I0930 07:50:05.439728 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cf25d474-c105-4c8b-87ad-0911e245056f-config-data\") pod \"watcher-db-sync-mpstv\" (UID: 
\"cf25d474-c105-4c8b-87ad-0911e245056f\") " pod="openstack/watcher-db-sync-mpstv" Sep 30 07:50:05 crc kubenswrapper[4760]: I0930 07:50:05.439784 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/cf25d474-c105-4c8b-87ad-0911e245056f-db-sync-config-data\") pod \"watcher-db-sync-mpstv\" (UID: \"cf25d474-c105-4c8b-87ad-0911e245056f\") " pod="openstack/watcher-db-sync-mpstv" Sep 30 07:50:05 crc kubenswrapper[4760]: I0930 07:50:05.440371 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cf25d474-c105-4c8b-87ad-0911e245056f-combined-ca-bundle\") pod \"watcher-db-sync-mpstv\" (UID: \"cf25d474-c105-4c8b-87ad-0911e245056f\") " pod="openstack/watcher-db-sync-mpstv" Sep 30 07:50:05 crc kubenswrapper[4760]: I0930 07:50:05.453763 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c2v22\" (UniqueName: \"kubernetes.io/projected/cf25d474-c105-4c8b-87ad-0911e245056f-kube-api-access-c2v22\") pod \"watcher-db-sync-mpstv\" (UID: \"cf25d474-c105-4c8b-87ad-0911e245056f\") " pod="openstack/watcher-db-sync-mpstv" Sep 30 07:50:05 crc kubenswrapper[4760]: I0930 07:50:05.590680 4760 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/watcher-db-sync-mpstv" Sep 30 07:50:05 crc kubenswrapper[4760]: I0930 07:50:05.828623 4760 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-56wgh-config-8ltt7"] Sep 30 07:50:05 crc kubenswrapper[4760]: I0930 07:50:05.838016 4760 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-56wgh-config-8ltt7"] Sep 30 07:50:05 crc kubenswrapper[4760]: I0930 07:50:05.843522 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-kxxsc" event={"ID":"fc12fba5-2742-45fd-b63c-51b3201acc0a","Type":"ContainerStarted","Data":"8fde0ea6afbe64eb5fbab5f1ac40c61b6f6051c1561f673a2a9c5a166086ce08"} Sep 30 07:50:05 crc kubenswrapper[4760]: I0930 07:50:05.846320 4760 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-85j42" Sep 30 07:50:05 crc kubenswrapper[4760]: I0930 07:50:05.846901 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"9784a9e1-42ad-4f0b-ae43-a3227158a763","Type":"ContainerStarted","Data":"a2b8b7aedb37924ff3f231ff7625453b8ca43df8219f8c517b32529fc171db6a"} Sep 30 07:50:05 crc kubenswrapper[4760]: I0930 07:50:05.871512 4760 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-db-sync-kxxsc" podStartSLOduration=4.722749128 podStartE2EDuration="8.871492591s" podCreationTimestamp="2025-09-30 07:49:57 +0000 UTC" firstStartedPulling="2025-09-30 07:50:00.330884351 +0000 UTC m=+985.973790763" lastFinishedPulling="2025-09-30 07:50:04.479627814 +0000 UTC m=+990.122534226" observedRunningTime="2025-09-30 07:50:05.861162717 +0000 UTC m=+991.504069119" watchObservedRunningTime="2025-09-30 07:50:05.871492591 +0000 UTC m=+991.514399013" Sep 30 07:50:05 crc kubenswrapper[4760]: I0930 07:50:05.900018 4760 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/prometheus-metric-storage-0" podStartSLOduration=24.899996267 podStartE2EDuration="24.899996267s" podCreationTimestamp="2025-09-30 07:49:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 07:50:05.895111843 +0000 UTC m=+991.538018265" watchObservedRunningTime="2025-09-30 07:50:05.899996267 +0000 UTC m=+991.542902679" Sep 30 07:50:06 crc kubenswrapper[4760]: I0930 07:50:06.125436 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-db-sync-mpstv"] Sep 30 07:50:06 crc kubenswrapper[4760]: I0930 07:50:06.851431 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/prometheus-metric-storage-0" Sep 30 07:50:06 crc kubenswrapper[4760]: I0930 07:50:06.856375 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-db-sync-mpstv" event={"ID":"cf25d474-c105-4c8b-87ad-0911e245056f","Type":"ContainerStarted","Data":"16cf11062dda106ae6ba38f1614f80fc9c1495636e515b480cd187686398716f"} Sep 30 07:50:07 crc kubenswrapper[4760]: I0930 07:50:07.079565 4760 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="385cee6a-8dab-47e6-babe-18b0490d5398" path="/var/lib/kubelet/pods/385cee6a-8dab-47e6-babe-18b0490d5398/volumes" Sep 30 07:50:08 crc kubenswrapper[4760]: I0930 07:50:08.150504 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-6d5b6d6b67-qd6bn" Sep 30 07:50:08 crc kubenswrapper[4760]: I0930 07:50:08.202942 4760 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-b8fbc5445-mgg5f"] Sep 30 07:50:08 crc kubenswrapper[4760]: I0930 07:50:08.203172 4760 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-b8fbc5445-mgg5f" podUID="2fbea500-bbb8-4691-817f-e06a76aba29a" containerName="dnsmasq-dns" 
containerID="cri-o://31b22899c626cee9f680e860115414b973b37b3a83e8fb751b0242cdfb1e1aa2" gracePeriod=10 Sep 30 07:50:08 crc kubenswrapper[4760]: I0930 07:50:08.876281 4760 generic.go:334] "Generic (PLEG): container finished" podID="fc12fba5-2742-45fd-b63c-51b3201acc0a" containerID="8fde0ea6afbe64eb5fbab5f1ac40c61b6f6051c1561f673a2a9c5a166086ce08" exitCode=0 Sep 30 07:50:08 crc kubenswrapper[4760]: I0930 07:50:08.876482 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-kxxsc" event={"ID":"fc12fba5-2742-45fd-b63c-51b3201acc0a","Type":"ContainerDied","Data":"8fde0ea6afbe64eb5fbab5f1ac40c61b6f6051c1561f673a2a9c5a166086ce08"} Sep 30 07:50:08 crc kubenswrapper[4760]: I0930 07:50:08.884042 4760 generic.go:334] "Generic (PLEG): container finished" podID="2fbea500-bbb8-4691-817f-e06a76aba29a" containerID="31b22899c626cee9f680e860115414b973b37b3a83e8fb751b0242cdfb1e1aa2" exitCode=0 Sep 30 07:50:08 crc kubenswrapper[4760]: I0930 07:50:08.884093 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-b8fbc5445-mgg5f" event={"ID":"2fbea500-bbb8-4691-817f-e06a76aba29a","Type":"ContainerDied","Data":"31b22899c626cee9f680e860115414b973b37b3a83e8fb751b0242cdfb1e1aa2"} Sep 30 07:50:10 crc kubenswrapper[4760]: I0930 07:50:10.172904 4760 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-b8fbc5445-mgg5f" podUID="2fbea500-bbb8-4691-817f-e06a76aba29a" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.122:5353: connect: connection refused" Sep 30 07:50:11 crc kubenswrapper[4760]: I0930 07:50:11.851903 4760 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/prometheus-metric-storage-0" Sep 30 07:50:11 crc kubenswrapper[4760]: I0930 07:50:11.856132 4760 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/prometheus-metric-storage-0" Sep 30 07:50:11 crc kubenswrapper[4760]: I0930 07:50:11.913669 4760 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/prometheus-metric-storage-0" Sep 30 07:50:13 crc kubenswrapper[4760]: I0930 07:50:13.952969 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-kxxsc" event={"ID":"fc12fba5-2742-45fd-b63c-51b3201acc0a","Type":"ContainerDied","Data":"d0691f040f7c1d487120ac1f6d78d05f52b97eaa9be4df1089ca344492f5b342"} Sep 30 07:50:13 crc kubenswrapper[4760]: I0930 07:50:13.953412 4760 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d0691f040f7c1d487120ac1f6d78d05f52b97eaa9be4df1089ca344492f5b342" Sep 30 07:50:13 crc kubenswrapper[4760]: I0930 07:50:13.956603 4760 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-kxxsc" Sep 30 07:50:14 crc kubenswrapper[4760]: I0930 07:50:14.010952 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n6sdd\" (UniqueName: \"kubernetes.io/projected/fc12fba5-2742-45fd-b63c-51b3201acc0a-kube-api-access-n6sdd\") pod \"fc12fba5-2742-45fd-b63c-51b3201acc0a\" (UID: \"fc12fba5-2742-45fd-b63c-51b3201acc0a\") " Sep 30 07:50:14 crc kubenswrapper[4760]: I0930 07:50:14.011420 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fc12fba5-2742-45fd-b63c-51b3201acc0a-combined-ca-bundle\") pod \"fc12fba5-2742-45fd-b63c-51b3201acc0a\" (UID: \"fc12fba5-2742-45fd-b63c-51b3201acc0a\") " Sep 30 07:50:14 crc kubenswrapper[4760]: I0930 07:50:14.011545 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fc12fba5-2742-45fd-b63c-51b3201acc0a-config-data\") pod \"fc12fba5-2742-45fd-b63c-51b3201acc0a\" (UID: \"fc12fba5-2742-45fd-b63c-51b3201acc0a\") " Sep 30 07:50:14 crc kubenswrapper[4760]: I0930 07:50:14.017044 4760 operation_generator.go:803] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fc12fba5-2742-45fd-b63c-51b3201acc0a-kube-api-access-n6sdd" (OuterVolumeSpecName: "kube-api-access-n6sdd") pod "fc12fba5-2742-45fd-b63c-51b3201acc0a" (UID: "fc12fba5-2742-45fd-b63c-51b3201acc0a"). InnerVolumeSpecName "kube-api-access-n6sdd". PluginName "kubernetes.io/projected", VolumeGidValue ""
Sep 30 07:50:14 crc kubenswrapper[4760]: I0930 07:50:14.071825 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fc12fba5-2742-45fd-b63c-51b3201acc0a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "fc12fba5-2742-45fd-b63c-51b3201acc0a" (UID: "fc12fba5-2742-45fd-b63c-51b3201acc0a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Sep 30 07:50:14 crc kubenswrapper[4760]: I0930 07:50:14.102441 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fc12fba5-2742-45fd-b63c-51b3201acc0a-config-data" (OuterVolumeSpecName: "config-data") pod "fc12fba5-2742-45fd-b63c-51b3201acc0a" (UID: "fc12fba5-2742-45fd-b63c-51b3201acc0a"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Sep 30 07:50:14 crc kubenswrapper[4760]: I0930 07:50:14.112811 4760 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n6sdd\" (UniqueName: \"kubernetes.io/projected/fc12fba5-2742-45fd-b63c-51b3201acc0a-kube-api-access-n6sdd\") on node \"crc\" DevicePath \"\""
Sep 30 07:50:14 crc kubenswrapper[4760]: I0930 07:50:14.112844 4760 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fc12fba5-2742-45fd-b63c-51b3201acc0a-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Sep 30 07:50:14 crc kubenswrapper[4760]: I0930 07:50:14.112857 4760 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fc12fba5-2742-45fd-b63c-51b3201acc0a-config-data\") on node \"crc\" DevicePath \"\""
Sep 30 07:50:14 crc kubenswrapper[4760]: I0930 07:50:14.159268 4760 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-b8fbc5445-mgg5f"
Sep 30 07:50:14 crc kubenswrapper[4760]: I0930 07:50:14.214931 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2fbea500-bbb8-4691-817f-e06a76aba29a-ovsdbserver-sb\") pod \"2fbea500-bbb8-4691-817f-e06a76aba29a\" (UID: \"2fbea500-bbb8-4691-817f-e06a76aba29a\") "
Sep 30 07:50:14 crc kubenswrapper[4760]: I0930 07:50:14.215252 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2fbea500-bbb8-4691-817f-e06a76aba29a-dns-svc\") pod \"2fbea500-bbb8-4691-817f-e06a76aba29a\" (UID: \"2fbea500-bbb8-4691-817f-e06a76aba29a\") "
Sep 30 07:50:14 crc kubenswrapper[4760]: I0930 07:50:14.215298 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2fbea500-bbb8-4691-817f-e06a76aba29a-ovsdbserver-nb\") pod \"2fbea500-bbb8-4691-817f-e06a76aba29a\" (UID: \"2fbea500-bbb8-4691-817f-e06a76aba29a\") "
Sep 30 07:50:14 crc kubenswrapper[4760]: I0930 07:50:14.215400 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r6rxv\" (UniqueName: \"kubernetes.io/projected/2fbea500-bbb8-4691-817f-e06a76aba29a-kube-api-access-r6rxv\") pod \"2fbea500-bbb8-4691-817f-e06a76aba29a\" (UID: \"2fbea500-bbb8-4691-817f-e06a76aba29a\") "
Sep 30 07:50:14 crc kubenswrapper[4760]: I0930 07:50:14.215430 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2fbea500-bbb8-4691-817f-e06a76aba29a-config\") pod \"2fbea500-bbb8-4691-817f-e06a76aba29a\" (UID: \"2fbea500-bbb8-4691-817f-e06a76aba29a\") "
Sep 30 07:50:14 crc kubenswrapper[4760]: I0930 07:50:14.225661 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2fbea500-bbb8-4691-817f-e06a76aba29a-kube-api-access-r6rxv" (OuterVolumeSpecName: "kube-api-access-r6rxv") pod "2fbea500-bbb8-4691-817f-e06a76aba29a" (UID: "2fbea500-bbb8-4691-817f-e06a76aba29a"). InnerVolumeSpecName "kube-api-access-r6rxv". PluginName "kubernetes.io/projected", VolumeGidValue ""
Sep 30 07:50:14 crc kubenswrapper[4760]: I0930 07:50:14.262792 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2fbea500-bbb8-4691-817f-e06a76aba29a-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "2fbea500-bbb8-4691-817f-e06a76aba29a" (UID: "2fbea500-bbb8-4691-817f-e06a76aba29a"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Sep 30 07:50:14 crc kubenswrapper[4760]: I0930 07:50:14.264567 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2fbea500-bbb8-4691-817f-e06a76aba29a-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "2fbea500-bbb8-4691-817f-e06a76aba29a" (UID: "2fbea500-bbb8-4691-817f-e06a76aba29a"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Sep 30 07:50:14 crc kubenswrapper[4760]: I0930 07:50:14.268666 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2fbea500-bbb8-4691-817f-e06a76aba29a-config" (OuterVolumeSpecName: "config") pod "2fbea500-bbb8-4691-817f-e06a76aba29a" (UID: "2fbea500-bbb8-4691-817f-e06a76aba29a"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Sep 30 07:50:14 crc kubenswrapper[4760]: I0930 07:50:14.282830 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2fbea500-bbb8-4691-817f-e06a76aba29a-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "2fbea500-bbb8-4691-817f-e06a76aba29a" (UID: "2fbea500-bbb8-4691-817f-e06a76aba29a"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Sep 30 07:50:14 crc kubenswrapper[4760]: I0930 07:50:14.316158 4760 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2fbea500-bbb8-4691-817f-e06a76aba29a-ovsdbserver-sb\") on node \"crc\" DevicePath \"\""
Sep 30 07:50:14 crc kubenswrapper[4760]: I0930 07:50:14.316187 4760 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2fbea500-bbb8-4691-817f-e06a76aba29a-dns-svc\") on node \"crc\" DevicePath \"\""
Sep 30 07:50:14 crc kubenswrapper[4760]: I0930 07:50:14.316196 4760 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2fbea500-bbb8-4691-817f-e06a76aba29a-ovsdbserver-nb\") on node \"crc\" DevicePath \"\""
Sep 30 07:50:14 crc kubenswrapper[4760]: I0930 07:50:14.316208 4760 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r6rxv\" (UniqueName: \"kubernetes.io/projected/2fbea500-bbb8-4691-817f-e06a76aba29a-kube-api-access-r6rxv\") on node \"crc\" DevicePath \"\""
Sep 30 07:50:14 crc kubenswrapper[4760]: I0930 07:50:14.316218 4760 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2fbea500-bbb8-4691-817f-e06a76aba29a-config\") on node \"crc\" DevicePath \"\""
Sep 30 07:50:14 crc kubenswrapper[4760]: I0930 07:50:14.966919 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-b8fbc5445-mgg5f" event={"ID":"2fbea500-bbb8-4691-817f-e06a76aba29a","Type":"ContainerDied","Data":"41d35864dd4a35ded697e9ebbf23c62afd4fbb0ac262bc640c4e25a8817780cd"}
Sep 30 07:50:14 crc kubenswrapper[4760]: I0930 07:50:14.967025 4760 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-b8fbc5445-mgg5f"
Sep 30 07:50:14 crc kubenswrapper[4760]: I0930 07:50:14.967287 4760 scope.go:117] "RemoveContainer" containerID="31b22899c626cee9f680e860115414b973b37b3a83e8fb751b0242cdfb1e1aa2"
Sep 30 07:50:14 crc kubenswrapper[4760]: I0930 07:50:14.969543 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-db-sync-mpstv" event={"ID":"cf25d474-c105-4c8b-87ad-0911e245056f","Type":"ContainerStarted","Data":"8ff322d29972321c37183515cd622af66909c70fa1837cb7b3f96ab838b1554a"}
Sep 30 07:50:14 crc kubenswrapper[4760]: I0930 07:50:14.973007 4760 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-kxxsc"
Sep 30 07:50:14 crc kubenswrapper[4760]: I0930 07:50:14.976115 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-s45t8" event={"ID":"09c9e208-48cc-44b8-9810-fc0cf69cea8a","Type":"ContainerStarted","Data":"1709f319e4cbbd5d960887fc4785bf4c61a3094232f12625c2012c901d224818"}
Sep 30 07:50:15 crc kubenswrapper[4760]: I0930 07:50:15.003547 4760 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/watcher-db-sync-mpstv" podStartSLOduration=2.12542494 podStartE2EDuration="10.003527644s" podCreationTimestamp="2025-09-30 07:50:05 +0000 UTC" firstStartedPulling="2025-09-30 07:50:06.113695846 +0000 UTC m=+991.756602258" lastFinishedPulling="2025-09-30 07:50:13.99179855 +0000 UTC m=+999.634704962" observedRunningTime="2025-09-30 07:50:14.997016038 +0000 UTC m=+1000.639922470" watchObservedRunningTime="2025-09-30 07:50:15.003527644 +0000 UTC m=+1000.646434066"
Sep 30 07:50:15 crc kubenswrapper[4760]: I0930 07:50:15.004003 4760 scope.go:117] "RemoveContainer" containerID="d119665380541c10bfbc4d9ec60f2b77518528e25a6a8a983ef92659f7f1ec21"
Sep 30 07:50:15 crc kubenswrapper[4760]: I0930 07:50:15.027303 4760 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-db-sync-s45t8" podStartSLOduration=2.728422416 podStartE2EDuration="32.027260659s" podCreationTimestamp="2025-09-30 07:49:43 +0000 UTC" firstStartedPulling="2025-09-30 07:49:44.670192106 +0000 UTC m=+970.313098518" lastFinishedPulling="2025-09-30 07:50:13.969030349 +0000 UTC m=+999.611936761" observedRunningTime="2025-09-30 07:50:15.017446509 +0000 UTC m=+1000.660352961" watchObservedRunningTime="2025-09-30 07:50:15.027260659 +0000 UTC m=+1000.670167081"
Sep 30 07:50:15 crc kubenswrapper[4760]: I0930 07:50:15.052371 4760 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-b8fbc5445-mgg5f"]
Sep 30 07:50:15 crc kubenswrapper[4760]: I0930 07:50:15.058130 4760 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-b8fbc5445-mgg5f"]
Sep 30 07:50:15 crc kubenswrapper[4760]: I0930 07:50:15.077232 4760 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2fbea500-bbb8-4691-817f-e06a76aba29a" path="/var/lib/kubelet/pods/2fbea500-bbb8-4691-817f-e06a76aba29a/volumes"
Sep 30 07:50:15 crc kubenswrapper[4760]: I0930 07:50:15.216598 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6f8c45789f-9ps8z"]
Sep 30 07:50:15 crc kubenswrapper[4760]: E0930 07:50:15.217605 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fc12fba5-2742-45fd-b63c-51b3201acc0a" containerName="keystone-db-sync"
Sep 30 07:50:15 crc kubenswrapper[4760]: I0930 07:50:15.217624 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="fc12fba5-2742-45fd-b63c-51b3201acc0a" containerName="keystone-db-sync"
Sep 30 07:50:15 crc kubenswrapper[4760]: E0930 07:50:15.217649 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2fbea500-bbb8-4691-817f-e06a76aba29a" containerName="init"
Sep 30 07:50:15 crc kubenswrapper[4760]: I0930 07:50:15.217658 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="2fbea500-bbb8-4691-817f-e06a76aba29a" containerName="init"
Sep 30 07:50:15 crc kubenswrapper[4760]: E0930 07:50:15.217667 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2fbea500-bbb8-4691-817f-e06a76aba29a" containerName="dnsmasq-dns"
Sep 30 07:50:15 crc kubenswrapper[4760]: I0930 07:50:15.217674 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="2fbea500-bbb8-4691-817f-e06a76aba29a" containerName="dnsmasq-dns"
Sep 30 07:50:15 crc kubenswrapper[4760]: I0930 07:50:15.218121 4760 memory_manager.go:354] "RemoveStaleState removing state" podUID="2fbea500-bbb8-4691-817f-e06a76aba29a" containerName="dnsmasq-dns"
Sep 30 07:50:15 crc kubenswrapper[4760]: I0930 07:50:15.218141 4760 memory_manager.go:354] "RemoveStaleState removing state" podUID="fc12fba5-2742-45fd-b63c-51b3201acc0a" containerName="keystone-db-sync"
Sep 30 07:50:15 crc kubenswrapper[4760]: I0930 07:50:15.219060 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6f8c45789f-9ps8z"
Sep 30 07:50:15 crc kubenswrapper[4760]: I0930 07:50:15.248828 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6f8c45789f-9ps8z"]
Sep 30 07:50:15 crc kubenswrapper[4760]: I0930 07:50:15.250270 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/439d75d2-8c13-4f14-9c93-c0a165a439ce-dns-svc\") pod \"dnsmasq-dns-6f8c45789f-9ps8z\" (UID: \"439d75d2-8c13-4f14-9c93-c0a165a439ce\") " pod="openstack/dnsmasq-dns-6f8c45789f-9ps8z"
Sep 30 07:50:15 crc kubenswrapper[4760]: I0930 07:50:15.250325 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/439d75d2-8c13-4f14-9c93-c0a165a439ce-ovsdbserver-nb\") pod \"dnsmasq-dns-6f8c45789f-9ps8z\" (UID: \"439d75d2-8c13-4f14-9c93-c0a165a439ce\") " pod="openstack/dnsmasq-dns-6f8c45789f-9ps8z"
Sep 30 07:50:15 crc kubenswrapper[4760]: I0930 07:50:15.250356 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/439d75d2-8c13-4f14-9c93-c0a165a439ce-dns-swift-storage-0\") pod \"dnsmasq-dns-6f8c45789f-9ps8z\" (UID: \"439d75d2-8c13-4f14-9c93-c0a165a439ce\") " pod="openstack/dnsmasq-dns-6f8c45789f-9ps8z"
Sep 30 07:50:15 crc kubenswrapper[4760]: I0930 07:50:15.250422 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/439d75d2-8c13-4f14-9c93-c0a165a439ce-config\") pod \"dnsmasq-dns-6f8c45789f-9ps8z\" (UID: \"439d75d2-8c13-4f14-9c93-c0a165a439ce\") " pod="openstack/dnsmasq-dns-6f8c45789f-9ps8z"
Sep 30 07:50:15 crc kubenswrapper[4760]: I0930 07:50:15.250456 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wrs6z\" (UniqueName: \"kubernetes.io/projected/439d75d2-8c13-4f14-9c93-c0a165a439ce-kube-api-access-wrs6z\") pod \"dnsmasq-dns-6f8c45789f-9ps8z\" (UID: \"439d75d2-8c13-4f14-9c93-c0a165a439ce\") " pod="openstack/dnsmasq-dns-6f8c45789f-9ps8z"
Sep 30 07:50:15 crc kubenswrapper[4760]: I0930 07:50:15.250518 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/439d75d2-8c13-4f14-9c93-c0a165a439ce-ovsdbserver-sb\") pod \"dnsmasq-dns-6f8c45789f-9ps8z\" (UID: \"439d75d2-8c13-4f14-9c93-c0a165a439ce\") " pod="openstack/dnsmasq-dns-6f8c45789f-9ps8z"
Sep 30 07:50:15 crc kubenswrapper[4760]: I0930 07:50:15.260790 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-kbxt8"]
Sep 30 07:50:15 crc kubenswrapper[4760]: I0930 07:50:15.262153 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-kbxt8"
Sep 30 07:50:15 crc kubenswrapper[4760]: I0930 07:50:15.268157 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-gx9h7"
Sep 30 07:50:15 crc kubenswrapper[4760]: I0930 07:50:15.268355 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone"
Sep 30 07:50:15 crc kubenswrapper[4760]: I0930 07:50:15.268463 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts"
Sep 30 07:50:15 crc kubenswrapper[4760]: I0930 07:50:15.269101 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data"
Sep 30 07:50:15 crc kubenswrapper[4760]: I0930 07:50:15.291296 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-kbxt8"]
Sep 30 07:50:15 crc kubenswrapper[4760]: I0930 07:50:15.352422 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wrs6z\" (UniqueName: \"kubernetes.io/projected/439d75d2-8c13-4f14-9c93-c0a165a439ce-kube-api-access-wrs6z\") pod \"dnsmasq-dns-6f8c45789f-9ps8z\" (UID: \"439d75d2-8c13-4f14-9c93-c0a165a439ce\") " pod="openstack/dnsmasq-dns-6f8c45789f-9ps8z"
Sep 30 07:50:15 crc kubenswrapper[4760]: I0930 07:50:15.352467 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/e00788e1-5a38-4884-8884-1be4c2ceca22-credential-keys\") pod \"keystone-bootstrap-kbxt8\" (UID: \"e00788e1-5a38-4884-8884-1be4c2ceca22\") " pod="openstack/keystone-bootstrap-kbxt8"
Sep 30 07:50:15 crc kubenswrapper[4760]: I0930 07:50:15.352509 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/439d75d2-8c13-4f14-9c93-c0a165a439ce-ovsdbserver-sb\") pod \"dnsmasq-dns-6f8c45789f-9ps8z\" (UID: \"439d75d2-8c13-4f14-9c93-c0a165a439ce\") " pod="openstack/dnsmasq-dns-6f8c45789f-9ps8z"
Sep 30 07:50:15 crc kubenswrapper[4760]: I0930 07:50:15.352530 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e00788e1-5a38-4884-8884-1be4c2ceca22-config-data\") pod \"keystone-bootstrap-kbxt8\" (UID: \"e00788e1-5a38-4884-8884-1be4c2ceca22\") " pod="openstack/keystone-bootstrap-kbxt8"
Sep 30 07:50:15 crc kubenswrapper[4760]: I0930 07:50:15.352999 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/439d75d2-8c13-4f14-9c93-c0a165a439ce-dns-svc\") pod \"dnsmasq-dns-6f8c45789f-9ps8z\" (UID: \"439d75d2-8c13-4f14-9c93-c0a165a439ce\") " pod="openstack/dnsmasq-dns-6f8c45789f-9ps8z"
Sep 30 07:50:15 crc kubenswrapper[4760]: I0930 07:50:15.353035 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/439d75d2-8c13-4f14-9c93-c0a165a439ce-ovsdbserver-nb\") pod \"dnsmasq-dns-6f8c45789f-9ps8z\" (UID: \"439d75d2-8c13-4f14-9c93-c0a165a439ce\") " pod="openstack/dnsmasq-dns-6f8c45789f-9ps8z"
Sep 30 07:50:15 crc kubenswrapper[4760]: I0930 07:50:15.353059 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2t6ls\" (UniqueName: \"kubernetes.io/projected/e00788e1-5a38-4884-8884-1be4c2ceca22-kube-api-access-2t6ls\") pod \"keystone-bootstrap-kbxt8\" (UID: \"e00788e1-5a38-4884-8884-1be4c2ceca22\") " pod="openstack/keystone-bootstrap-kbxt8"
Sep 30 07:50:15 crc kubenswrapper[4760]: I0930 07:50:15.353087 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/439d75d2-8c13-4f14-9c93-c0a165a439ce-dns-swift-storage-0\") pod \"dnsmasq-dns-6f8c45789f-9ps8z\" (UID: \"439d75d2-8c13-4f14-9c93-c0a165a439ce\") " pod="openstack/dnsmasq-dns-6f8c45789f-9ps8z"
Sep 30 07:50:15 crc kubenswrapper[4760]: I0930 07:50:15.353138 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e00788e1-5a38-4884-8884-1be4c2ceca22-scripts\") pod \"keystone-bootstrap-kbxt8\" (UID: \"e00788e1-5a38-4884-8884-1be4c2ceca22\") " pod="openstack/keystone-bootstrap-kbxt8"
Sep 30 07:50:15 crc kubenswrapper[4760]: I0930 07:50:15.353158 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e00788e1-5a38-4884-8884-1be4c2ceca22-combined-ca-bundle\") pod \"keystone-bootstrap-kbxt8\" (UID: \"e00788e1-5a38-4884-8884-1be4c2ceca22\") " pod="openstack/keystone-bootstrap-kbxt8"
Sep 30 07:50:15 crc kubenswrapper[4760]: I0930 07:50:15.353196 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/e00788e1-5a38-4884-8884-1be4c2ceca22-fernet-keys\") pod \"keystone-bootstrap-kbxt8\" (UID: \"e00788e1-5a38-4884-8884-1be4c2ceca22\") " pod="openstack/keystone-bootstrap-kbxt8"
Sep 30 07:50:15 crc kubenswrapper[4760]: I0930 07:50:15.353217 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/439d75d2-8c13-4f14-9c93-c0a165a439ce-config\") pod \"dnsmasq-dns-6f8c45789f-9ps8z\" (UID: \"439d75d2-8c13-4f14-9c93-c0a165a439ce\") " pod="openstack/dnsmasq-dns-6f8c45789f-9ps8z"
Sep 30 07:50:15 crc kubenswrapper[4760]: I0930 07:50:15.353690 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/439d75d2-8c13-4f14-9c93-c0a165a439ce-ovsdbserver-sb\") pod \"dnsmasq-dns-6f8c45789f-9ps8z\" (UID: \"439d75d2-8c13-4f14-9c93-c0a165a439ce\") " pod="openstack/dnsmasq-dns-6f8c45789f-9ps8z"
Sep 30 07:50:15 crc kubenswrapper[4760]: I0930 07:50:15.353793 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/439d75d2-8c13-4f14-9c93-c0a165a439ce-dns-svc\") pod \"dnsmasq-dns-6f8c45789f-9ps8z\" (UID: \"439d75d2-8c13-4f14-9c93-c0a165a439ce\") " pod="openstack/dnsmasq-dns-6f8c45789f-9ps8z"
Sep 30 07:50:15 crc kubenswrapper[4760]: I0930 07:50:15.353888 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/439d75d2-8c13-4f14-9c93-c0a165a439ce-dns-swift-storage-0\") pod \"dnsmasq-dns-6f8c45789f-9ps8z\" (UID: \"439d75d2-8c13-4f14-9c93-c0a165a439ce\") " pod="openstack/dnsmasq-dns-6f8c45789f-9ps8z"
Sep 30 07:50:15 crc kubenswrapper[4760]: I0930 07:50:15.354033 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/439d75d2-8c13-4f14-9c93-c0a165a439ce-config\") pod \"dnsmasq-dns-6f8c45789f-9ps8z\" (UID: \"439d75d2-8c13-4f14-9c93-c0a165a439ce\") " pod="openstack/dnsmasq-dns-6f8c45789f-9ps8z"
Sep 30 07:50:15 crc kubenswrapper[4760]: I0930 07:50:15.354294 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/439d75d2-8c13-4f14-9c93-c0a165a439ce-ovsdbserver-nb\") pod \"dnsmasq-dns-6f8c45789f-9ps8z\" (UID: \"439d75d2-8c13-4f14-9c93-c0a165a439ce\") " pod="openstack/dnsmasq-dns-6f8c45789f-9ps8z"
Sep 30 07:50:15 crc kubenswrapper[4760]: I0930 07:50:15.371653 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wrs6z\" (UniqueName: \"kubernetes.io/projected/439d75d2-8c13-4f14-9c93-c0a165a439ce-kube-api-access-wrs6z\") pod \"dnsmasq-dns-6f8c45789f-9ps8z\" (UID: \"439d75d2-8c13-4f14-9c93-c0a165a439ce\") " pod="openstack/dnsmasq-dns-6f8c45789f-9ps8z"
Sep 30 07:50:15 crc kubenswrapper[4760]: I0930 07:50:15.375362 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-746f884dcc-8lkjw"]
Sep 30 07:50:15 crc kubenswrapper[4760]: I0930 07:50:15.377324 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-746f884dcc-8lkjw"
Sep 30 07:50:15 crc kubenswrapper[4760]: I0930 07:50:15.385748 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"horizon"
Sep 30 07:50:15 crc kubenswrapper[4760]: I0930 07:50:15.385896 4760 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"horizon-scripts"
Sep 30 07:50:15 crc kubenswrapper[4760]: I0930 07:50:15.385999 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"horizon-horizon-dockercfg-fm4q6"
Sep 30 07:50:15 crc kubenswrapper[4760]: I0930 07:50:15.388712 4760 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"horizon-config-data"
Sep 30 07:50:15 crc kubenswrapper[4760]: I0930 07:50:15.403801 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-746f884dcc-8lkjw"]
Sep 30 07:50:15 crc kubenswrapper[4760]: I0930 07:50:15.455019 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f8f42596-0ba9-41d9-a780-b8b3705b963c-logs\") pod \"horizon-746f884dcc-8lkjw\" (UID: \"f8f42596-0ba9-41d9-a780-b8b3705b963c\") " pod="openstack/horizon-746f884dcc-8lkjw"
Sep 30 07:50:15 crc kubenswrapper[4760]: I0930 07:50:15.455227 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/f8f42596-0ba9-41d9-a780-b8b3705b963c-horizon-secret-key\") pod \"horizon-746f884dcc-8lkjw\" (UID: \"f8f42596-0ba9-41d9-a780-b8b3705b963c\") " pod="openstack/horizon-746f884dcc-8lkjw"
Sep 30 07:50:15 crc kubenswrapper[4760]: I0930 07:50:15.455353 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2t6ls\" (UniqueName: \"kubernetes.io/projected/e00788e1-5a38-4884-8884-1be4c2ceca22-kube-api-access-2t6ls\") pod \"keystone-bootstrap-kbxt8\" (UID: \"e00788e1-5a38-4884-8884-1be4c2ceca22\") " pod="openstack/keystone-bootstrap-kbxt8"
Sep 30 07:50:15 crc kubenswrapper[4760]: I0930 07:50:15.455432 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9g8dc\" (UniqueName: \"kubernetes.io/projected/f8f42596-0ba9-41d9-a780-b8b3705b963c-kube-api-access-9g8dc\") pod \"horizon-746f884dcc-8lkjw\" (UID: \"f8f42596-0ba9-41d9-a780-b8b3705b963c\") " pod="openstack/horizon-746f884dcc-8lkjw"
Sep 30 07:50:15 crc kubenswrapper[4760]: I0930 07:50:15.455505 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e00788e1-5a38-4884-8884-1be4c2ceca22-scripts\") pod \"keystone-bootstrap-kbxt8\" (UID: \"e00788e1-5a38-4884-8884-1be4c2ceca22\") " pod="openstack/keystone-bootstrap-kbxt8"
Sep 30 07:50:15 crc kubenswrapper[4760]: I0930 07:50:15.455577 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e00788e1-5a38-4884-8884-1be4c2ceca22-combined-ca-bundle\") pod \"keystone-bootstrap-kbxt8\" (UID: \"e00788e1-5a38-4884-8884-1be4c2ceca22\") " pod="openstack/keystone-bootstrap-kbxt8"
Sep 30 07:50:15 crc kubenswrapper[4760]: I0930 07:50:15.455658 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/e00788e1-5a38-4884-8884-1be4c2ceca22-fernet-keys\") pod \"keystone-bootstrap-kbxt8\" (UID: \"e00788e1-5a38-4884-8884-1be4c2ceca22\") " pod="openstack/keystone-bootstrap-kbxt8"
Sep 30 07:50:15 crc kubenswrapper[4760]: I0930 07:50:15.455736 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/e00788e1-5a38-4884-8884-1be4c2ceca22-credential-keys\") pod \"keystone-bootstrap-kbxt8\" (UID: \"e00788e1-5a38-4884-8884-1be4c2ceca22\") " pod="openstack/keystone-bootstrap-kbxt8"
Sep 30 07:50:15 crc kubenswrapper[4760]: I0930 07:50:15.455804 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f8f42596-0ba9-41d9-a780-b8b3705b963c-scripts\") pod \"horizon-746f884dcc-8lkjw\" (UID: \"f8f42596-0ba9-41d9-a780-b8b3705b963c\") " pod="openstack/horizon-746f884dcc-8lkjw"
Sep 30 07:50:15 crc kubenswrapper[4760]: I0930 07:50:15.455887 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e00788e1-5a38-4884-8884-1be4c2ceca22-config-data\") pod \"keystone-bootstrap-kbxt8\" (UID: \"e00788e1-5a38-4884-8884-1be4c2ceca22\") " pod="openstack/keystone-bootstrap-kbxt8"
Sep 30 07:50:15 crc kubenswrapper[4760]: I0930 07:50:15.455969 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/f8f42596-0ba9-41d9-a780-b8b3705b963c-config-data\") pod \"horizon-746f884dcc-8lkjw\" (UID: \"f8f42596-0ba9-41d9-a780-b8b3705b963c\") " pod="openstack/horizon-746f884dcc-8lkjw"
Sep 30 07:50:15 crc kubenswrapper[4760]: I0930 07:50:15.464086 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e00788e1-5a38-4884-8884-1be4c2ceca22-config-data\") pod \"keystone-bootstrap-kbxt8\" (UID: \"e00788e1-5a38-4884-8884-1be4c2ceca22\") " pod="openstack/keystone-bootstrap-kbxt8"
Sep 30 07:50:15 crc kubenswrapper[4760]: I0930 07:50:15.464642 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e00788e1-5a38-4884-8884-1be4c2ceca22-combined-ca-bundle\") pod \"keystone-bootstrap-kbxt8\" (UID: \"e00788e1-5a38-4884-8884-1be4c2ceca22\") " pod="openstack/keystone-bootstrap-kbxt8"
Sep 30 07:50:15 crc kubenswrapper[4760]: I0930 07:50:15.473169 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/e00788e1-5a38-4884-8884-1be4c2ceca22-credential-keys\") pod \"keystone-bootstrap-kbxt8\" (UID: \"e00788e1-5a38-4884-8884-1be4c2ceca22\") " pod="openstack/keystone-bootstrap-kbxt8"
Sep 30 07:50:15 crc kubenswrapper[4760]: I0930 07:50:15.476231 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/e00788e1-5a38-4884-8884-1be4c2ceca22-fernet-keys\") pod \"keystone-bootstrap-kbxt8\" (UID: \"e00788e1-5a38-4884-8884-1be4c2ceca22\") " pod="openstack/keystone-bootstrap-kbxt8"
Sep 30 07:50:15 crc kubenswrapper[4760]: I0930 07:50:15.482386 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"]
Sep 30 07:50:15 crc kubenswrapper[4760]: I0930 07:50:15.483012 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e00788e1-5a38-4884-8884-1be4c2ceca22-scripts\") pod \"keystone-bootstrap-kbxt8\" (UID: \"e00788e1-5a38-4884-8884-1be4c2ceca22\") " pod="openstack/keystone-bootstrap-kbxt8"
Sep 30 07:50:15 crc kubenswrapper[4760]: I0930 07:50:15.484219 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2t6ls\" (UniqueName: \"kubernetes.io/projected/e00788e1-5a38-4884-8884-1be4c2ceca22-kube-api-access-2t6ls\") pod \"keystone-bootstrap-kbxt8\" (UID: \"e00788e1-5a38-4884-8884-1be4c2ceca22\") " pod="openstack/keystone-bootstrap-kbxt8"
Sep 30 07:50:15 crc kubenswrapper[4760]: I0930 07:50:15.491181 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Sep 30 07:50:15 crc kubenswrapper[4760]: I0930 07:50:15.503188 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Sep 30 07:50:15 crc kubenswrapper[4760]: I0930 07:50:15.503346 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data"
Sep 30 07:50:15 crc kubenswrapper[4760]: I0930 07:50:15.503705 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts"
Sep 30 07:50:15 crc kubenswrapper[4760]: I0930 07:50:15.541518 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6f8c45789f-9ps8z"
Sep 30 07:50:15 crc kubenswrapper[4760]: I0930 07:50:15.557711 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7a555f66-7027-4fac-afcc-db7b3f5ae034-run-httpd\") pod \"ceilometer-0\" (UID: \"7a555f66-7027-4fac-afcc-db7b3f5ae034\") " pod="openstack/ceilometer-0"
Sep 30 07:50:15 crc kubenswrapper[4760]: I0930 07:50:15.557760 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f8f42596-0ba9-41d9-a780-b8b3705b963c-scripts\") pod \"horizon-746f884dcc-8lkjw\" (UID: \"f8f42596-0ba9-41d9-a780-b8b3705b963c\") " pod="openstack/horizon-746f884dcc-8lkjw"
Sep 30 07:50:15 crc kubenswrapper[4760]: I0930 07:50:15.557777 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/7a555f66-7027-4fac-afcc-db7b3f5ae034-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"7a555f66-7027-4fac-afcc-db7b3f5ae034\") " pod="openstack/ceilometer-0"
Sep 30 07:50:15 crc kubenswrapper[4760]: I0930 07:50:15.557830 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/f8f42596-0ba9-41d9-a780-b8b3705b963c-config-data\") pod \"horizon-746f884dcc-8lkjw\" (UID: \"f8f42596-0ba9-41d9-a780-b8b3705b963c\") " pod="openstack/horizon-746f884dcc-8lkjw"
Sep 30 07:50:15 crc kubenswrapper[4760]: I0930 07:50:15.557850 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f8f42596-0ba9-41d9-a780-b8b3705b963c-logs\") pod \"horizon-746f884dcc-8lkjw\" (UID: \"f8f42596-0ba9-41d9-a780-b8b3705b963c\") " pod="openstack/horizon-746f884dcc-8lkjw"
Sep 30 07:50:15 crc kubenswrapper[4760]: I0930 07:50:15.557880 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/f8f42596-0ba9-41d9-a780-b8b3705b963c-horizon-secret-key\") pod \"horizon-746f884dcc-8lkjw\" (UID: \"f8f42596-0ba9-41d9-a780-b8b3705b963c\") " pod="openstack/horizon-746f884dcc-8lkjw"
Sep 30 07:50:15 crc kubenswrapper[4760]: I0930 07:50:15.557899 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7a555f66-7027-4fac-afcc-db7b3f5ae034-scripts\") pod \"ceilometer-0\" (UID: \"7a555f66-7027-4fac-afcc-db7b3f5ae034\") " pod="openstack/ceilometer-0"
Sep 30 07:50:15 crc kubenswrapper[4760]: I0930 07:50:15.557923 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7a555f66-7027-4fac-afcc-db7b3f5ae034-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"7a555f66-7027-4fac-afcc-db7b3f5ae034\") " pod="openstack/ceilometer-0"
Sep 30 07:50:15 crc kubenswrapper[4760]: I0930 07:50:15.557961 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7a555f66-7027-4fac-afcc-db7b3f5ae034-log-httpd\") pod \"ceilometer-0\" (UID: \"7a555f66-7027-4fac-afcc-db7b3f5ae034\") " pod="openstack/ceilometer-0"
Sep 30 07:50:15 crc kubenswrapper[4760]: I0930 07:50:15.557975 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7a555f66-7027-4fac-afcc-db7b3f5ae034-config-data\") pod \"ceilometer-0\" (UID: \"7a555f66-7027-4fac-afcc-db7b3f5ae034\") " pod="openstack/ceilometer-0"
Sep 30 07:50:15 crc kubenswrapper[4760]: I0930 07:50:15.558012 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9g8dc\" (UniqueName: \"kubernetes.io/projected/f8f42596-0ba9-41d9-a780-b8b3705b963c-kube-api-access-9g8dc\") pod \"horizon-746f884dcc-8lkjw\" (UID: \"f8f42596-0ba9-41d9-a780-b8b3705b963c\") " pod="openstack/horizon-746f884dcc-8lkjw"
Sep 30 07:50:15 crc kubenswrapper[4760]: I0930 07:50:15.558033 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ggb9w\" (UniqueName: \"kubernetes.io/projected/7a555f66-7027-4fac-afcc-db7b3f5ae034-kube-api-access-ggb9w\") pod \"ceilometer-0\" (UID: \"7a555f66-7027-4fac-afcc-db7b3f5ae034\") " pod="openstack/ceilometer-0"
Sep 30 07:50:15 crc kubenswrapper[4760]: I0930 07:50:15.559316 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f8f42596-0ba9-41d9-a780-b8b3705b963c-scripts\") pod \"horizon-746f884dcc-8lkjw\" (UID: \"f8f42596-0ba9-41d9-a780-b8b3705b963c\") " pod="openstack/horizon-746f884dcc-8lkjw"
Sep 30 07:50:15 crc kubenswrapper[4760]: I0930 07:50:15.559833 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f8f42596-0ba9-41d9-a780-b8b3705b963c-logs\") pod \"horizon-746f884dcc-8lkjw\" (UID: \"f8f42596-0ba9-41d9-a780-b8b3705b963c\") " pod="openstack/horizon-746f884dcc-8lkjw"
Sep 30 07:50:15 crc kubenswrapper[4760]: I0930 07:50:15.561934
4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/f8f42596-0ba9-41d9-a780-b8b3705b963c-config-data\") pod \"horizon-746f884dcc-8lkjw\" (UID: \"f8f42596-0ba9-41d9-a780-b8b3705b963c\") " pod="openstack/horizon-746f884dcc-8lkjw" Sep 30 07:50:15 crc kubenswrapper[4760]: I0930 07:50:15.574218 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/f8f42596-0ba9-41d9-a780-b8b3705b963c-horizon-secret-key\") pod \"horizon-746f884dcc-8lkjw\" (UID: \"f8f42596-0ba9-41d9-a780-b8b3705b963c\") " pod="openstack/horizon-746f884dcc-8lkjw" Sep 30 07:50:15 crc kubenswrapper[4760]: I0930 07:50:15.586496 4760 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6f8c45789f-9ps8z"] Sep 30 07:50:15 crc kubenswrapper[4760]: I0930 07:50:15.592389 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-5885b885f5-7v6mw"] Sep 30 07:50:15 crc kubenswrapper[4760]: I0930 07:50:15.594066 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-5885b885f5-7v6mw" Sep 30 07:50:15 crc kubenswrapper[4760]: I0930 07:50:15.596545 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-kbxt8" Sep 30 07:50:15 crc kubenswrapper[4760]: I0930 07:50:15.609679 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-5885b885f5-7v6mw"] Sep 30 07:50:15 crc kubenswrapper[4760]: I0930 07:50:15.619946 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-sync-2npfw"] Sep 30 07:50:15 crc kubenswrapper[4760]: I0930 07:50:15.620950 4760 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-sync-2npfw" Sep 30 07:50:15 crc kubenswrapper[4760]: I0930 07:50:15.633095 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Sep 30 07:50:15 crc kubenswrapper[4760]: I0930 07:50:15.633290 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-wrjwv" Sep 30 07:50:15 crc kubenswrapper[4760]: I0930 07:50:15.633420 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Sep 30 07:50:15 crc kubenswrapper[4760]: I0930 07:50:15.636621 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-fcfdd6f9f-j5p4t"] Sep 30 07:50:15 crc kubenswrapper[4760]: I0930 07:50:15.637525 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9g8dc\" (UniqueName: \"kubernetes.io/projected/f8f42596-0ba9-41d9-a780-b8b3705b963c-kube-api-access-9g8dc\") pod \"horizon-746f884dcc-8lkjw\" (UID: \"f8f42596-0ba9-41d9-a780-b8b3705b963c\") " pod="openstack/horizon-746f884dcc-8lkjw" Sep 30 07:50:15 crc kubenswrapper[4760]: I0930 07:50:15.645026 4760 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-fcfdd6f9f-j5p4t" Sep 30 07:50:15 crc kubenswrapper[4760]: I0930 07:50:15.645842 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-2npfw"] Sep 30 07:50:15 crc kubenswrapper[4760]: I0930 07:50:15.662044 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7a555f66-7027-4fac-afcc-db7b3f5ae034-run-httpd\") pod \"ceilometer-0\" (UID: \"7a555f66-7027-4fac-afcc-db7b3f5ae034\") " pod="openstack/ceilometer-0" Sep 30 07:50:15 crc kubenswrapper[4760]: I0930 07:50:15.662096 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/da7c2ace-093d-4279-bad0-5f2876f4ab8d-ovsdbserver-sb\") pod \"dnsmasq-dns-fcfdd6f9f-j5p4t\" (UID: \"da7c2ace-093d-4279-bad0-5f2876f4ab8d\") " pod="openstack/dnsmasq-dns-fcfdd6f9f-j5p4t" Sep 30 07:50:15 crc kubenswrapper[4760]: I0930 07:50:15.662138 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j4gh5\" (UniqueName: \"kubernetes.io/projected/4584a821-629e-4246-95d3-b84160a0f46c-kube-api-access-j4gh5\") pod \"placement-db-sync-2npfw\" (UID: \"4584a821-629e-4246-95d3-b84160a0f46c\") " pod="openstack/placement-db-sync-2npfw" Sep 30 07:50:15 crc kubenswrapper[4760]: I0930 07:50:15.662160 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/7a555f66-7027-4fac-afcc-db7b3f5ae034-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"7a555f66-7027-4fac-afcc-db7b3f5ae034\") " pod="openstack/ceilometer-0" Sep 30 07:50:15 crc kubenswrapper[4760]: I0930 07:50:15.662189 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ltmzc\" (UniqueName: 
\"kubernetes.io/projected/da7c2ace-093d-4279-bad0-5f2876f4ab8d-kube-api-access-ltmzc\") pod \"dnsmasq-dns-fcfdd6f9f-j5p4t\" (UID: \"da7c2ace-093d-4279-bad0-5f2876f4ab8d\") " pod="openstack/dnsmasq-dns-fcfdd6f9f-j5p4t" Sep 30 07:50:15 crc kubenswrapper[4760]: I0930 07:50:15.662222 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0d907232-c03b-44a4-a0b5-36ce5bf6d62b-logs\") pod \"horizon-5885b885f5-7v6mw\" (UID: \"0d907232-c03b-44a4-a0b5-36ce5bf6d62b\") " pod="openstack/horizon-5885b885f5-7v6mw" Sep 30 07:50:15 crc kubenswrapper[4760]: I0930 07:50:15.662266 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4584a821-629e-4246-95d3-b84160a0f46c-logs\") pod \"placement-db-sync-2npfw\" (UID: \"4584a821-629e-4246-95d3-b84160a0f46c\") " pod="openstack/placement-db-sync-2npfw" Sep 30 07:50:15 crc kubenswrapper[4760]: I0930 07:50:15.662291 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4584a821-629e-4246-95d3-b84160a0f46c-combined-ca-bundle\") pod \"placement-db-sync-2npfw\" (UID: \"4584a821-629e-4246-95d3-b84160a0f46c\") " pod="openstack/placement-db-sync-2npfw" Sep 30 07:50:15 crc kubenswrapper[4760]: I0930 07:50:15.662332 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m75zc\" (UniqueName: \"kubernetes.io/projected/0d907232-c03b-44a4-a0b5-36ce5bf6d62b-kube-api-access-m75zc\") pod \"horizon-5885b885f5-7v6mw\" (UID: \"0d907232-c03b-44a4-a0b5-36ce5bf6d62b\") " pod="openstack/horizon-5885b885f5-7v6mw" Sep 30 07:50:15 crc kubenswrapper[4760]: I0930 07:50:15.662350 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/4584a821-629e-4246-95d3-b84160a0f46c-scripts\") pod \"placement-db-sync-2npfw\" (UID: \"4584a821-629e-4246-95d3-b84160a0f46c\") " pod="openstack/placement-db-sync-2npfw" Sep 30 07:50:15 crc kubenswrapper[4760]: I0930 07:50:15.662369 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/da7c2ace-093d-4279-bad0-5f2876f4ab8d-dns-svc\") pod \"dnsmasq-dns-fcfdd6f9f-j5p4t\" (UID: \"da7c2ace-093d-4279-bad0-5f2876f4ab8d\") " pod="openstack/dnsmasq-dns-fcfdd6f9f-j5p4t" Sep 30 07:50:15 crc kubenswrapper[4760]: I0930 07:50:15.662396 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/0d907232-c03b-44a4-a0b5-36ce5bf6d62b-config-data\") pod \"horizon-5885b885f5-7v6mw\" (UID: \"0d907232-c03b-44a4-a0b5-36ce5bf6d62b\") " pod="openstack/horizon-5885b885f5-7v6mw" Sep 30 07:50:15 crc kubenswrapper[4760]: I0930 07:50:15.662425 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4584a821-629e-4246-95d3-b84160a0f46c-config-data\") pod \"placement-db-sync-2npfw\" (UID: \"4584a821-629e-4246-95d3-b84160a0f46c\") " pod="openstack/placement-db-sync-2npfw" Sep 30 07:50:15 crc kubenswrapper[4760]: I0930 07:50:15.662450 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/da7c2ace-093d-4279-bad0-5f2876f4ab8d-config\") pod \"dnsmasq-dns-fcfdd6f9f-j5p4t\" (UID: \"da7c2ace-093d-4279-bad0-5f2876f4ab8d\") " pod="openstack/dnsmasq-dns-fcfdd6f9f-j5p4t" Sep 30 07:50:15 crc kubenswrapper[4760]: I0930 07:50:15.662468 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: 
\"kubernetes.io/configmap/da7c2ace-093d-4279-bad0-5f2876f4ab8d-dns-swift-storage-0\") pod \"dnsmasq-dns-fcfdd6f9f-j5p4t\" (UID: \"da7c2ace-093d-4279-bad0-5f2876f4ab8d\") " pod="openstack/dnsmasq-dns-fcfdd6f9f-j5p4t" Sep 30 07:50:15 crc kubenswrapper[4760]: I0930 07:50:15.662485 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7a555f66-7027-4fac-afcc-db7b3f5ae034-scripts\") pod \"ceilometer-0\" (UID: \"7a555f66-7027-4fac-afcc-db7b3f5ae034\") " pod="openstack/ceilometer-0" Sep 30 07:50:15 crc kubenswrapper[4760]: I0930 07:50:15.662506 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7a555f66-7027-4fac-afcc-db7b3f5ae034-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"7a555f66-7027-4fac-afcc-db7b3f5ae034\") " pod="openstack/ceilometer-0" Sep 30 07:50:15 crc kubenswrapper[4760]: I0930 07:50:15.662532 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/0d907232-c03b-44a4-a0b5-36ce5bf6d62b-scripts\") pod \"horizon-5885b885f5-7v6mw\" (UID: \"0d907232-c03b-44a4-a0b5-36ce5bf6d62b\") " pod="openstack/horizon-5885b885f5-7v6mw" Sep 30 07:50:15 crc kubenswrapper[4760]: I0930 07:50:15.662548 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7a555f66-7027-4fac-afcc-db7b3f5ae034-log-httpd\") pod \"ceilometer-0\" (UID: \"7a555f66-7027-4fac-afcc-db7b3f5ae034\") " pod="openstack/ceilometer-0" Sep 30 07:50:15 crc kubenswrapper[4760]: I0930 07:50:15.662566 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7a555f66-7027-4fac-afcc-db7b3f5ae034-config-data\") pod \"ceilometer-0\" (UID: \"7a555f66-7027-4fac-afcc-db7b3f5ae034\") " pod="openstack/ceilometer-0" 
Sep 30 07:50:15 crc kubenswrapper[4760]: I0930 07:50:15.662604 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/da7c2ace-093d-4279-bad0-5f2876f4ab8d-ovsdbserver-nb\") pod \"dnsmasq-dns-fcfdd6f9f-j5p4t\" (UID: \"da7c2ace-093d-4279-bad0-5f2876f4ab8d\") " pod="openstack/dnsmasq-dns-fcfdd6f9f-j5p4t" Sep 30 07:50:15 crc kubenswrapper[4760]: I0930 07:50:15.662626 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ggb9w\" (UniqueName: \"kubernetes.io/projected/7a555f66-7027-4fac-afcc-db7b3f5ae034-kube-api-access-ggb9w\") pod \"ceilometer-0\" (UID: \"7a555f66-7027-4fac-afcc-db7b3f5ae034\") " pod="openstack/ceilometer-0" Sep 30 07:50:15 crc kubenswrapper[4760]: I0930 07:50:15.662661 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/0d907232-c03b-44a4-a0b5-36ce5bf6d62b-horizon-secret-key\") pod \"horizon-5885b885f5-7v6mw\" (UID: \"0d907232-c03b-44a4-a0b5-36ce5bf6d62b\") " pod="openstack/horizon-5885b885f5-7v6mw" Sep 30 07:50:15 crc kubenswrapper[4760]: I0930 07:50:15.663269 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7a555f66-7027-4fac-afcc-db7b3f5ae034-log-httpd\") pod \"ceilometer-0\" (UID: \"7a555f66-7027-4fac-afcc-db7b3f5ae034\") " pod="openstack/ceilometer-0" Sep 30 07:50:15 crc kubenswrapper[4760]: I0930 07:50:15.663989 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7a555f66-7027-4fac-afcc-db7b3f5ae034-run-httpd\") pod \"ceilometer-0\" (UID: \"7a555f66-7027-4fac-afcc-db7b3f5ae034\") " pod="openstack/ceilometer-0" Sep 30 07:50:15 crc kubenswrapper[4760]: I0930 07:50:15.676924 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/7a555f66-7027-4fac-afcc-db7b3f5ae034-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"7a555f66-7027-4fac-afcc-db7b3f5ae034\") " pod="openstack/ceilometer-0" Sep 30 07:50:15 crc kubenswrapper[4760]: I0930 07:50:15.690048 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7a555f66-7027-4fac-afcc-db7b3f5ae034-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"7a555f66-7027-4fac-afcc-db7b3f5ae034\") " pod="openstack/ceilometer-0" Sep 30 07:50:15 crc kubenswrapper[4760]: I0930 07:50:15.690755 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7a555f66-7027-4fac-afcc-db7b3f5ae034-config-data\") pod \"ceilometer-0\" (UID: \"7a555f66-7027-4fac-afcc-db7b3f5ae034\") " pod="openstack/ceilometer-0" Sep 30 07:50:15 crc kubenswrapper[4760]: I0930 07:50:15.691217 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7a555f66-7027-4fac-afcc-db7b3f5ae034-scripts\") pod \"ceilometer-0\" (UID: \"7a555f66-7027-4fac-afcc-db7b3f5ae034\") " pod="openstack/ceilometer-0" Sep 30 07:50:15 crc kubenswrapper[4760]: I0930 07:50:15.732188 4760 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-746f884dcc-8lkjw" Sep 30 07:50:15 crc kubenswrapper[4760]: I0930 07:50:15.732744 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-fcfdd6f9f-j5p4t"] Sep 30 07:50:15 crc kubenswrapper[4760]: I0930 07:50:15.757660 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ggb9w\" (UniqueName: \"kubernetes.io/projected/7a555f66-7027-4fac-afcc-db7b3f5ae034-kube-api-access-ggb9w\") pod \"ceilometer-0\" (UID: \"7a555f66-7027-4fac-afcc-db7b3f5ae034\") " pod="openstack/ceilometer-0" Sep 30 07:50:15 crc kubenswrapper[4760]: I0930 07:50:15.764821 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/0d907232-c03b-44a4-a0b5-36ce5bf6d62b-horizon-secret-key\") pod \"horizon-5885b885f5-7v6mw\" (UID: \"0d907232-c03b-44a4-a0b5-36ce5bf6d62b\") " pod="openstack/horizon-5885b885f5-7v6mw" Sep 30 07:50:15 crc kubenswrapper[4760]: I0930 07:50:15.764881 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/da7c2ace-093d-4279-bad0-5f2876f4ab8d-ovsdbserver-sb\") pod \"dnsmasq-dns-fcfdd6f9f-j5p4t\" (UID: \"da7c2ace-093d-4279-bad0-5f2876f4ab8d\") " pod="openstack/dnsmasq-dns-fcfdd6f9f-j5p4t" Sep 30 07:50:15 crc kubenswrapper[4760]: I0930 07:50:15.764908 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j4gh5\" (UniqueName: \"kubernetes.io/projected/4584a821-629e-4246-95d3-b84160a0f46c-kube-api-access-j4gh5\") pod \"placement-db-sync-2npfw\" (UID: \"4584a821-629e-4246-95d3-b84160a0f46c\") " pod="openstack/placement-db-sync-2npfw" Sep 30 07:50:15 crc kubenswrapper[4760]: I0930 07:50:15.764936 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ltmzc\" (UniqueName: 
\"kubernetes.io/projected/da7c2ace-093d-4279-bad0-5f2876f4ab8d-kube-api-access-ltmzc\") pod \"dnsmasq-dns-fcfdd6f9f-j5p4t\" (UID: \"da7c2ace-093d-4279-bad0-5f2876f4ab8d\") " pod="openstack/dnsmasq-dns-fcfdd6f9f-j5p4t" Sep 30 07:50:15 crc kubenswrapper[4760]: I0930 07:50:15.764959 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0d907232-c03b-44a4-a0b5-36ce5bf6d62b-logs\") pod \"horizon-5885b885f5-7v6mw\" (UID: \"0d907232-c03b-44a4-a0b5-36ce5bf6d62b\") " pod="openstack/horizon-5885b885f5-7v6mw" Sep 30 07:50:15 crc kubenswrapper[4760]: I0930 07:50:15.764989 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4584a821-629e-4246-95d3-b84160a0f46c-logs\") pod \"placement-db-sync-2npfw\" (UID: \"4584a821-629e-4246-95d3-b84160a0f46c\") " pod="openstack/placement-db-sync-2npfw" Sep 30 07:50:15 crc kubenswrapper[4760]: I0930 07:50:15.765009 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4584a821-629e-4246-95d3-b84160a0f46c-combined-ca-bundle\") pod \"placement-db-sync-2npfw\" (UID: \"4584a821-629e-4246-95d3-b84160a0f46c\") " pod="openstack/placement-db-sync-2npfw" Sep 30 07:50:15 crc kubenswrapper[4760]: I0930 07:50:15.765029 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m75zc\" (UniqueName: \"kubernetes.io/projected/0d907232-c03b-44a4-a0b5-36ce5bf6d62b-kube-api-access-m75zc\") pod \"horizon-5885b885f5-7v6mw\" (UID: \"0d907232-c03b-44a4-a0b5-36ce5bf6d62b\") " pod="openstack/horizon-5885b885f5-7v6mw" Sep 30 07:50:15 crc kubenswrapper[4760]: I0930 07:50:15.765043 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4584a821-629e-4246-95d3-b84160a0f46c-scripts\") pod \"placement-db-sync-2npfw\" (UID: 
\"4584a821-629e-4246-95d3-b84160a0f46c\") " pod="openstack/placement-db-sync-2npfw" Sep 30 07:50:15 crc kubenswrapper[4760]: I0930 07:50:15.765057 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/da7c2ace-093d-4279-bad0-5f2876f4ab8d-dns-svc\") pod \"dnsmasq-dns-fcfdd6f9f-j5p4t\" (UID: \"da7c2ace-093d-4279-bad0-5f2876f4ab8d\") " pod="openstack/dnsmasq-dns-fcfdd6f9f-j5p4t" Sep 30 07:50:15 crc kubenswrapper[4760]: I0930 07:50:15.765084 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/0d907232-c03b-44a4-a0b5-36ce5bf6d62b-config-data\") pod \"horizon-5885b885f5-7v6mw\" (UID: \"0d907232-c03b-44a4-a0b5-36ce5bf6d62b\") " pod="openstack/horizon-5885b885f5-7v6mw" Sep 30 07:50:15 crc kubenswrapper[4760]: I0930 07:50:15.765109 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4584a821-629e-4246-95d3-b84160a0f46c-config-data\") pod \"placement-db-sync-2npfw\" (UID: \"4584a821-629e-4246-95d3-b84160a0f46c\") " pod="openstack/placement-db-sync-2npfw" Sep 30 07:50:15 crc kubenswrapper[4760]: I0930 07:50:15.765132 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/da7c2ace-093d-4279-bad0-5f2876f4ab8d-config\") pod \"dnsmasq-dns-fcfdd6f9f-j5p4t\" (UID: \"da7c2ace-093d-4279-bad0-5f2876f4ab8d\") " pod="openstack/dnsmasq-dns-fcfdd6f9f-j5p4t" Sep 30 07:50:15 crc kubenswrapper[4760]: I0930 07:50:15.765147 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/da7c2ace-093d-4279-bad0-5f2876f4ab8d-dns-swift-storage-0\") pod \"dnsmasq-dns-fcfdd6f9f-j5p4t\" (UID: \"da7c2ace-093d-4279-bad0-5f2876f4ab8d\") " pod="openstack/dnsmasq-dns-fcfdd6f9f-j5p4t" Sep 30 07:50:15 crc 
kubenswrapper[4760]: I0930 07:50:15.765172 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/0d907232-c03b-44a4-a0b5-36ce5bf6d62b-scripts\") pod \"horizon-5885b885f5-7v6mw\" (UID: \"0d907232-c03b-44a4-a0b5-36ce5bf6d62b\") " pod="openstack/horizon-5885b885f5-7v6mw" Sep 30 07:50:15 crc kubenswrapper[4760]: I0930 07:50:15.765202 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/da7c2ace-093d-4279-bad0-5f2876f4ab8d-ovsdbserver-nb\") pod \"dnsmasq-dns-fcfdd6f9f-j5p4t\" (UID: \"da7c2ace-093d-4279-bad0-5f2876f4ab8d\") " pod="openstack/dnsmasq-dns-fcfdd6f9f-j5p4t" Sep 30 07:50:15 crc kubenswrapper[4760]: I0930 07:50:15.766009 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/da7c2ace-093d-4279-bad0-5f2876f4ab8d-ovsdbserver-nb\") pod \"dnsmasq-dns-fcfdd6f9f-j5p4t\" (UID: \"da7c2ace-093d-4279-bad0-5f2876f4ab8d\") " pod="openstack/dnsmasq-dns-fcfdd6f9f-j5p4t" Sep 30 07:50:15 crc kubenswrapper[4760]: I0930 07:50:15.767016 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/da7c2ace-093d-4279-bad0-5f2876f4ab8d-ovsdbserver-sb\") pod \"dnsmasq-dns-fcfdd6f9f-j5p4t\" (UID: \"da7c2ace-093d-4279-bad0-5f2876f4ab8d\") " pod="openstack/dnsmasq-dns-fcfdd6f9f-j5p4t" Sep 30 07:50:15 crc kubenswrapper[4760]: I0930 07:50:15.767266 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/da7c2ace-093d-4279-bad0-5f2876f4ab8d-dns-swift-storage-0\") pod \"dnsmasq-dns-fcfdd6f9f-j5p4t\" (UID: \"da7c2ace-093d-4279-bad0-5f2876f4ab8d\") " pod="openstack/dnsmasq-dns-fcfdd6f9f-j5p4t" Sep 30 07:50:15 crc kubenswrapper[4760]: I0930 07:50:15.767507 4760 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4584a821-629e-4246-95d3-b84160a0f46c-logs\") pod \"placement-db-sync-2npfw\" (UID: \"4584a821-629e-4246-95d3-b84160a0f46c\") " pod="openstack/placement-db-sync-2npfw" Sep 30 07:50:15 crc kubenswrapper[4760]: I0930 07:50:15.767868 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/da7c2ace-093d-4279-bad0-5f2876f4ab8d-config\") pod \"dnsmasq-dns-fcfdd6f9f-j5p4t\" (UID: \"da7c2ace-093d-4279-bad0-5f2876f4ab8d\") " pod="openstack/dnsmasq-dns-fcfdd6f9f-j5p4t" Sep 30 07:50:15 crc kubenswrapper[4760]: I0930 07:50:15.769546 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0d907232-c03b-44a4-a0b5-36ce5bf6d62b-logs\") pod \"horizon-5885b885f5-7v6mw\" (UID: \"0d907232-c03b-44a4-a0b5-36ce5bf6d62b\") " pod="openstack/horizon-5885b885f5-7v6mw" Sep 30 07:50:15 crc kubenswrapper[4760]: I0930 07:50:15.769974 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/0d907232-c03b-44a4-a0b5-36ce5bf6d62b-scripts\") pod \"horizon-5885b885f5-7v6mw\" (UID: \"0d907232-c03b-44a4-a0b5-36ce5bf6d62b\") " pod="openstack/horizon-5885b885f5-7v6mw" Sep 30 07:50:15 crc kubenswrapper[4760]: I0930 07:50:15.770048 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4584a821-629e-4246-95d3-b84160a0f46c-combined-ca-bundle\") pod \"placement-db-sync-2npfw\" (UID: \"4584a821-629e-4246-95d3-b84160a0f46c\") " pod="openstack/placement-db-sync-2npfw" Sep 30 07:50:15 crc kubenswrapper[4760]: I0930 07:50:15.770059 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/da7c2ace-093d-4279-bad0-5f2876f4ab8d-dns-svc\") pod \"dnsmasq-dns-fcfdd6f9f-j5p4t\" (UID: 
\"da7c2ace-093d-4279-bad0-5f2876f4ab8d\") " pod="openstack/dnsmasq-dns-fcfdd6f9f-j5p4t" Sep 30 07:50:15 crc kubenswrapper[4760]: I0930 07:50:15.771088 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/0d907232-c03b-44a4-a0b5-36ce5bf6d62b-config-data\") pod \"horizon-5885b885f5-7v6mw\" (UID: \"0d907232-c03b-44a4-a0b5-36ce5bf6d62b\") " pod="openstack/horizon-5885b885f5-7v6mw" Sep 30 07:50:15 crc kubenswrapper[4760]: I0930 07:50:15.771247 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/0d907232-c03b-44a4-a0b5-36ce5bf6d62b-horizon-secret-key\") pod \"horizon-5885b885f5-7v6mw\" (UID: \"0d907232-c03b-44a4-a0b5-36ce5bf6d62b\") " pod="openstack/horizon-5885b885f5-7v6mw" Sep 30 07:50:15 crc kubenswrapper[4760]: I0930 07:50:15.773635 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4584a821-629e-4246-95d3-b84160a0f46c-scripts\") pod \"placement-db-sync-2npfw\" (UID: \"4584a821-629e-4246-95d3-b84160a0f46c\") " pod="openstack/placement-db-sync-2npfw" Sep 30 07:50:15 crc kubenswrapper[4760]: I0930 07:50:15.778460 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4584a821-629e-4246-95d3-b84160a0f46c-config-data\") pod \"placement-db-sync-2npfw\" (UID: \"4584a821-629e-4246-95d3-b84160a0f46c\") " pod="openstack/placement-db-sync-2npfw" Sep 30 07:50:15 crc kubenswrapper[4760]: I0930 07:50:15.782229 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m75zc\" (UniqueName: \"kubernetes.io/projected/0d907232-c03b-44a4-a0b5-36ce5bf6d62b-kube-api-access-m75zc\") pod \"horizon-5885b885f5-7v6mw\" (UID: \"0d907232-c03b-44a4-a0b5-36ce5bf6d62b\") " pod="openstack/horizon-5885b885f5-7v6mw" Sep 30 07:50:15 crc kubenswrapper[4760]: I0930 07:50:15.784041 
4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j4gh5\" (UniqueName: \"kubernetes.io/projected/4584a821-629e-4246-95d3-b84160a0f46c-kube-api-access-j4gh5\") pod \"placement-db-sync-2npfw\" (UID: \"4584a821-629e-4246-95d3-b84160a0f46c\") " pod="openstack/placement-db-sync-2npfw" Sep 30 07:50:15 crc kubenswrapper[4760]: I0930 07:50:15.790787 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ltmzc\" (UniqueName: \"kubernetes.io/projected/da7c2ace-093d-4279-bad0-5f2876f4ab8d-kube-api-access-ltmzc\") pod \"dnsmasq-dns-fcfdd6f9f-j5p4t\" (UID: \"da7c2ace-093d-4279-bad0-5f2876f4ab8d\") " pod="openstack/dnsmasq-dns-fcfdd6f9f-j5p4t" Sep 30 07:50:15 crc kubenswrapper[4760]: I0930 07:50:15.886718 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Sep 30 07:50:15 crc kubenswrapper[4760]: I0930 07:50:15.939491 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-5885b885f5-7v6mw" Sep 30 07:50:15 crc kubenswrapper[4760]: I0930 07:50:15.981282 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-2npfw" Sep 30 07:50:16 crc kubenswrapper[4760]: I0930 07:50:16.047697 4760 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-fcfdd6f9f-j5p4t" Sep 30 07:50:16 crc kubenswrapper[4760]: I0930 07:50:16.158256 4760 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6f8c45789f-9ps8z"] Sep 30 07:50:16 crc kubenswrapper[4760]: W0930 07:50:16.172701 4760 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod439d75d2_8c13_4f14_9c93_c0a165a439ce.slice/crio-9ef7257fb058d8ff0e625f7955d1fa9ba9b2393d024d770506560abc483d8a60 WatchSource:0}: Error finding container 9ef7257fb058d8ff0e625f7955d1fa9ba9b2393d024d770506560abc483d8a60: Status 404 returned error can't find the container with id 9ef7257fb058d8ff0e625f7955d1fa9ba9b2393d024d770506560abc483d8a60 Sep 30 07:50:16 crc kubenswrapper[4760]: I0930 07:50:16.377349 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-kbxt8"] Sep 30 07:50:16 crc kubenswrapper[4760]: I0930 07:50:16.589940 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-746f884dcc-8lkjw"] Sep 30 07:50:16 crc kubenswrapper[4760]: W0930 07:50:16.749923 4760 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0d907232_c03b_44a4_a0b5_36ce5bf6d62b.slice/crio-d582e6dc3553c1c0170ca92c054ff0aeafdeb1228139dbc5ada4853684bfec0b WatchSource:0}: Error finding container d582e6dc3553c1c0170ca92c054ff0aeafdeb1228139dbc5ada4853684bfec0b: Status 404 returned error can't find the container with id d582e6dc3553c1c0170ca92c054ff0aeafdeb1228139dbc5ada4853684bfec0b Sep 30 07:50:16 crc kubenswrapper[4760]: I0930 07:50:16.757014 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-5885b885f5-7v6mw"] Sep 30 07:50:16 crc kubenswrapper[4760]: I0930 07:50:16.765783 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Sep 30 07:50:16 crc kubenswrapper[4760]: I0930 
07:50:16.772055 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-2npfw"] Sep 30 07:50:16 crc kubenswrapper[4760]: W0930 07:50:16.773424 4760 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4584a821_629e_4246_95d3_b84160a0f46c.slice/crio-4697db6b8dedf7f8401d4c04ddc763909ddbdbe1dcc30888f4966672fbec9aee WatchSource:0}: Error finding container 4697db6b8dedf7f8401d4c04ddc763909ddbdbe1dcc30888f4966672fbec9aee: Status 404 returned error can't find the container with id 4697db6b8dedf7f8401d4c04ddc763909ddbdbe1dcc30888f4966672fbec9aee Sep 30 07:50:16 crc kubenswrapper[4760]: I0930 07:50:16.952223 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-fcfdd6f9f-j5p4t"] Sep 30 07:50:16 crc kubenswrapper[4760]: W0930 07:50:16.957458 4760 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podda7c2ace_093d_4279_bad0_5f2876f4ab8d.slice/crio-23a9ea76aa5283912ad3c948cfd07d2ef8385b05a745b19385aff4ea80cbdb5f WatchSource:0}: Error finding container 23a9ea76aa5283912ad3c948cfd07d2ef8385b05a745b19385aff4ea80cbdb5f: Status 404 returned error can't find the container with id 23a9ea76aa5283912ad3c948cfd07d2ef8385b05a745b19385aff4ea80cbdb5f Sep 30 07:50:17 crc kubenswrapper[4760]: I0930 07:50:17.040064 4760 generic.go:334] "Generic (PLEG): container finished" podID="439d75d2-8c13-4f14-9c93-c0a165a439ce" containerID="f1cf9eaff822682b0a3a3cf6a258dda68f33ff296de35f5da9b3ae6b563dc996" exitCode=0 Sep 30 07:50:17 crc kubenswrapper[4760]: I0930 07:50:17.040160 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6f8c45789f-9ps8z" event={"ID":"439d75d2-8c13-4f14-9c93-c0a165a439ce","Type":"ContainerDied","Data":"f1cf9eaff822682b0a3a3cf6a258dda68f33ff296de35f5da9b3ae6b563dc996"} Sep 30 07:50:17 crc kubenswrapper[4760]: I0930 07:50:17.040195 4760 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6f8c45789f-9ps8z" event={"ID":"439d75d2-8c13-4f14-9c93-c0a165a439ce","Type":"ContainerStarted","Data":"9ef7257fb058d8ff0e625f7955d1fa9ba9b2393d024d770506560abc483d8a60"} Sep 30 07:50:17 crc kubenswrapper[4760]: I0930 07:50:17.042860 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7a555f66-7027-4fac-afcc-db7b3f5ae034","Type":"ContainerStarted","Data":"8584c4a706f72fba480b97ce1815c189ef7ce65141c8eb80c3ce669d07421c93"} Sep 30 07:50:17 crc kubenswrapper[4760]: I0930 07:50:17.147004 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-2npfw" event={"ID":"4584a821-629e-4246-95d3-b84160a0f46c","Type":"ContainerStarted","Data":"4697db6b8dedf7f8401d4c04ddc763909ddbdbe1dcc30888f4966672fbec9aee"} Sep 30 07:50:17 crc kubenswrapper[4760]: I0930 07:50:17.147040 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-5885b885f5-7v6mw" event={"ID":"0d907232-c03b-44a4-a0b5-36ce5bf6d62b","Type":"ContainerStarted","Data":"d582e6dc3553c1c0170ca92c054ff0aeafdeb1228139dbc5ada4853684bfec0b"} Sep 30 07:50:17 crc kubenswrapper[4760]: I0930 07:50:17.147049 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-746f884dcc-8lkjw" event={"ID":"f8f42596-0ba9-41d9-a780-b8b3705b963c","Type":"ContainerStarted","Data":"75950071b231471cce5a7982d702a8b964c8a295529f8611a0263a0f53e2304e"} Sep 30 07:50:17 crc kubenswrapper[4760]: I0930 07:50:17.147065 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-b248-account-create-rd7nr"] Sep 30 07:50:17 crc kubenswrapper[4760]: I0930 07:50:17.148004 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-b248-account-create-rd7nr"] Sep 30 07:50:17 crc kubenswrapper[4760]: I0930 07:50:17.148072 4760 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-b248-account-create-rd7nr" Sep 30 07:50:17 crc kubenswrapper[4760]: I0930 07:50:17.149206 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-kbxt8" event={"ID":"e00788e1-5a38-4884-8884-1be4c2ceca22","Type":"ContainerStarted","Data":"6ec9d5a22f841fdef7de549baa0a72a7a08951cc585d48d88bdbfc47d7ac96e1"} Sep 30 07:50:17 crc kubenswrapper[4760]: I0930 07:50:17.149322 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-kbxt8" event={"ID":"e00788e1-5a38-4884-8884-1be4c2ceca22","Type":"ContainerStarted","Data":"b342b1719847c9e5aa478f7082d28ca03eb5a63858d30d41a1095edef6c76899"} Sep 30 07:50:17 crc kubenswrapper[4760]: I0930 07:50:17.150154 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-db-secret" Sep 30 07:50:17 crc kubenswrapper[4760]: I0930 07:50:17.152477 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-fcfdd6f9f-j5p4t" event={"ID":"da7c2ace-093d-4279-bad0-5f2876f4ab8d","Type":"ContainerStarted","Data":"23a9ea76aa5283912ad3c948cfd07d2ef8385b05a745b19385aff4ea80cbdb5f"} Sep 30 07:50:17 crc kubenswrapper[4760]: I0930 07:50:17.195558 4760 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-kbxt8" podStartSLOduration=2.19553585 podStartE2EDuration="2.19553585s" podCreationTimestamp="2025-09-30 07:50:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 07:50:17.180930907 +0000 UTC m=+1002.823837319" watchObservedRunningTime="2025-09-30 07:50:17.19553585 +0000 UTC m=+1002.838442262" Sep 30 07:50:17 crc kubenswrapper[4760]: I0930 07:50:17.217723 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vsn9q\" (UniqueName: 
\"kubernetes.io/projected/a9ce45c7-d994-44b1-9f88-297bac4ae9c2-kube-api-access-vsn9q\") pod \"barbican-b248-account-create-rd7nr\" (UID: \"a9ce45c7-d994-44b1-9f88-297bac4ae9c2\") " pod="openstack/barbican-b248-account-create-rd7nr" Sep 30 07:50:17 crc kubenswrapper[4760]: I0930 07:50:17.319003 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vsn9q\" (UniqueName: \"kubernetes.io/projected/a9ce45c7-d994-44b1-9f88-297bac4ae9c2-kube-api-access-vsn9q\") pod \"barbican-b248-account-create-rd7nr\" (UID: \"a9ce45c7-d994-44b1-9f88-297bac4ae9c2\") " pod="openstack/barbican-b248-account-create-rd7nr" Sep 30 07:50:17 crc kubenswrapper[4760]: I0930 07:50:17.330567 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-2f94-account-create-v8rw4"] Sep 30 07:50:17 crc kubenswrapper[4760]: I0930 07:50:17.331702 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-2f94-account-create-v8rw4" Sep 30 07:50:17 crc kubenswrapper[4760]: I0930 07:50:17.334249 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-db-secret" Sep 30 07:50:17 crc kubenswrapper[4760]: I0930 07:50:17.337179 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-2f94-account-create-v8rw4"] Sep 30 07:50:17 crc kubenswrapper[4760]: I0930 07:50:17.341257 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vsn9q\" (UniqueName: \"kubernetes.io/projected/a9ce45c7-d994-44b1-9f88-297bac4ae9c2-kube-api-access-vsn9q\") pod \"barbican-b248-account-create-rd7nr\" (UID: \"a9ce45c7-d994-44b1-9f88-297bac4ae9c2\") " pod="openstack/barbican-b248-account-create-rd7nr" Sep 30 07:50:17 crc kubenswrapper[4760]: I0930 07:50:17.420659 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r4mwg\" (UniqueName: 
\"kubernetes.io/projected/218f5d61-bfb1-4609-9296-4a5b6471ea56-kube-api-access-r4mwg\") pod \"cinder-2f94-account-create-v8rw4\" (UID: \"218f5d61-bfb1-4609-9296-4a5b6471ea56\") " pod="openstack/cinder-2f94-account-create-v8rw4" Sep 30 07:50:17 crc kubenswrapper[4760]: I0930 07:50:17.461655 4760 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6f8c45789f-9ps8z" Sep 30 07:50:17 crc kubenswrapper[4760]: I0930 07:50:17.495718 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-b248-account-create-rd7nr" Sep 30 07:50:17 crc kubenswrapper[4760]: I0930 07:50:17.508132 4760 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-5885b885f5-7v6mw"] Sep 30 07:50:17 crc kubenswrapper[4760]: I0930 07:50:17.521224 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wrs6z\" (UniqueName: \"kubernetes.io/projected/439d75d2-8c13-4f14-9c93-c0a165a439ce-kube-api-access-wrs6z\") pod \"439d75d2-8c13-4f14-9c93-c0a165a439ce\" (UID: \"439d75d2-8c13-4f14-9c93-c0a165a439ce\") " Sep 30 07:50:17 crc kubenswrapper[4760]: I0930 07:50:17.521266 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/439d75d2-8c13-4f14-9c93-c0a165a439ce-ovsdbserver-sb\") pod \"439d75d2-8c13-4f14-9c93-c0a165a439ce\" (UID: \"439d75d2-8c13-4f14-9c93-c0a165a439ce\") " Sep 30 07:50:17 crc kubenswrapper[4760]: I0930 07:50:17.521289 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/439d75d2-8c13-4f14-9c93-c0a165a439ce-ovsdbserver-nb\") pod \"439d75d2-8c13-4f14-9c93-c0a165a439ce\" (UID: \"439d75d2-8c13-4f14-9c93-c0a165a439ce\") " Sep 30 07:50:17 crc kubenswrapper[4760]: I0930 07:50:17.521528 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"dns-svc\" (UniqueName: \"kubernetes.io/configmap/439d75d2-8c13-4f14-9c93-c0a165a439ce-dns-svc\") pod \"439d75d2-8c13-4f14-9c93-c0a165a439ce\" (UID: \"439d75d2-8c13-4f14-9c93-c0a165a439ce\") " Sep 30 07:50:17 crc kubenswrapper[4760]: I0930 07:50:17.521553 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/439d75d2-8c13-4f14-9c93-c0a165a439ce-config\") pod \"439d75d2-8c13-4f14-9c93-c0a165a439ce\" (UID: \"439d75d2-8c13-4f14-9c93-c0a165a439ce\") " Sep 30 07:50:17 crc kubenswrapper[4760]: I0930 07:50:17.521575 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/439d75d2-8c13-4f14-9c93-c0a165a439ce-dns-swift-storage-0\") pod \"439d75d2-8c13-4f14-9c93-c0a165a439ce\" (UID: \"439d75d2-8c13-4f14-9c93-c0a165a439ce\") " Sep 30 07:50:17 crc kubenswrapper[4760]: I0930 07:50:17.521798 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r4mwg\" (UniqueName: \"kubernetes.io/projected/218f5d61-bfb1-4609-9296-4a5b6471ea56-kube-api-access-r4mwg\") pod \"cinder-2f94-account-create-v8rw4\" (UID: \"218f5d61-bfb1-4609-9296-4a5b6471ea56\") " pod="openstack/cinder-2f94-account-create-v8rw4" Sep 30 07:50:17 crc kubenswrapper[4760]: I0930 07:50:17.546204 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r4mwg\" (UniqueName: \"kubernetes.io/projected/218f5d61-bfb1-4609-9296-4a5b6471ea56-kube-api-access-r4mwg\") pod \"cinder-2f94-account-create-v8rw4\" (UID: \"218f5d61-bfb1-4609-9296-4a5b6471ea56\") " pod="openstack/cinder-2f94-account-create-v8rw4" Sep 30 07:50:17 crc kubenswrapper[4760]: I0930 07:50:17.553347 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/439d75d2-8c13-4f14-9c93-c0a165a439ce-kube-api-access-wrs6z" (OuterVolumeSpecName: "kube-api-access-wrs6z") pod 
"439d75d2-8c13-4f14-9c93-c0a165a439ce" (UID: "439d75d2-8c13-4f14-9c93-c0a165a439ce"). InnerVolumeSpecName "kube-api-access-wrs6z". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 07:50:17 crc kubenswrapper[4760]: I0930 07:50:17.564845 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/439d75d2-8c13-4f14-9c93-c0a165a439ce-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "439d75d2-8c13-4f14-9c93-c0a165a439ce" (UID: "439d75d2-8c13-4f14-9c93-c0a165a439ce"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 07:50:17 crc kubenswrapper[4760]: I0930 07:50:17.564978 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/439d75d2-8c13-4f14-9c93-c0a165a439ce-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "439d75d2-8c13-4f14-9c93-c0a165a439ce" (UID: "439d75d2-8c13-4f14-9c93-c0a165a439ce"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 07:50:17 crc kubenswrapper[4760]: I0930 07:50:17.584879 4760 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Sep 30 07:50:17 crc kubenswrapper[4760]: I0930 07:50:17.603040 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/439d75d2-8c13-4f14-9c93-c0a165a439ce-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "439d75d2-8c13-4f14-9c93-c0a165a439ce" (UID: "439d75d2-8c13-4f14-9c93-c0a165a439ce"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 07:50:17 crc kubenswrapper[4760]: I0930 07:50:17.603747 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-4a5c-account-create-74j84"] Sep 30 07:50:17 crc kubenswrapper[4760]: E0930 07:50:17.604163 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="439d75d2-8c13-4f14-9c93-c0a165a439ce" containerName="init" Sep 30 07:50:17 crc kubenswrapper[4760]: I0930 07:50:17.604187 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="439d75d2-8c13-4f14-9c93-c0a165a439ce" containerName="init" Sep 30 07:50:17 crc kubenswrapper[4760]: I0930 07:50:17.604512 4760 memory_manager.go:354] "RemoveStaleState removing state" podUID="439d75d2-8c13-4f14-9c93-c0a165a439ce" containerName="init" Sep 30 07:50:17 crc kubenswrapper[4760]: I0930 07:50:17.605093 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-4a5c-account-create-74j84" Sep 30 07:50:17 crc kubenswrapper[4760]: I0930 07:50:17.608816 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-db-secret" Sep 30 07:50:17 crc kubenswrapper[4760]: I0930 07:50:17.611811 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-6d547648b9-kxd9p"] Sep 30 07:50:17 crc kubenswrapper[4760]: I0930 07:50:17.630003 4760 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-6d547648b9-kxd9p" Sep 30 07:50:17 crc kubenswrapper[4760]: I0930 07:50:17.640514 4760 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/439d75d2-8c13-4f14-9c93-c0a165a439ce-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Sep 30 07:50:17 crc kubenswrapper[4760]: I0930 07:50:17.640592 4760 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wrs6z\" (UniqueName: \"kubernetes.io/projected/439d75d2-8c13-4f14-9c93-c0a165a439ce-kube-api-access-wrs6z\") on node \"crc\" DevicePath \"\"" Sep 30 07:50:17 crc kubenswrapper[4760]: I0930 07:50:17.640659 4760 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/439d75d2-8c13-4f14-9c93-c0a165a439ce-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Sep 30 07:50:17 crc kubenswrapper[4760]: I0930 07:50:17.640728 4760 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/439d75d2-8c13-4f14-9c93-c0a165a439ce-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Sep 30 07:50:17 crc kubenswrapper[4760]: I0930 07:50:17.655341 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-4a5c-account-create-74j84"] Sep 30 07:50:17 crc kubenswrapper[4760]: I0930 07:50:17.658249 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/439d75d2-8c13-4f14-9c93-c0a165a439ce-config" (OuterVolumeSpecName: "config") pod "439d75d2-8c13-4f14-9c93-c0a165a439ce" (UID: "439d75d2-8c13-4f14-9c93-c0a165a439ce"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 07:50:17 crc kubenswrapper[4760]: I0930 07:50:17.660858 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/439d75d2-8c13-4f14-9c93-c0a165a439ce-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "439d75d2-8c13-4f14-9c93-c0a165a439ce" (UID: "439d75d2-8c13-4f14-9c93-c0a165a439ce"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 07:50:17 crc kubenswrapper[4760]: I0930 07:50:17.667398 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-2f94-account-create-v8rw4" Sep 30 07:50:17 crc kubenswrapper[4760]: I0930 07:50:17.686661 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-6d547648b9-kxd9p"] Sep 30 07:50:17 crc kubenswrapper[4760]: I0930 07:50:17.742820 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xsh25\" (UniqueName: \"kubernetes.io/projected/b7668a34-2cd6-4873-a55a-27839a612a2b-kube-api-access-xsh25\") pod \"horizon-6d547648b9-kxd9p\" (UID: \"b7668a34-2cd6-4873-a55a-27839a612a2b\") " pod="openstack/horizon-6d547648b9-kxd9p" Sep 30 07:50:17 crc kubenswrapper[4760]: I0930 07:50:17.742877 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b7668a34-2cd6-4873-a55a-27839a612a2b-logs\") pod \"horizon-6d547648b9-kxd9p\" (UID: \"b7668a34-2cd6-4873-a55a-27839a612a2b\") " pod="openstack/horizon-6d547648b9-kxd9p" Sep 30 07:50:17 crc kubenswrapper[4760]: I0930 07:50:17.742917 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nplf9\" (UniqueName: \"kubernetes.io/projected/7390084a-0022-4943-914f-fdb71e7ec326-kube-api-access-nplf9\") pod \"neutron-4a5c-account-create-74j84\" (UID: \"7390084a-0022-4943-914f-fdb71e7ec326\") 
" pod="openstack/neutron-4a5c-account-create-74j84" Sep 30 07:50:17 crc kubenswrapper[4760]: I0930 07:50:17.742940 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/b7668a34-2cd6-4873-a55a-27839a612a2b-horizon-secret-key\") pod \"horizon-6d547648b9-kxd9p\" (UID: \"b7668a34-2cd6-4873-a55a-27839a612a2b\") " pod="openstack/horizon-6d547648b9-kxd9p" Sep 30 07:50:17 crc kubenswrapper[4760]: I0930 07:50:17.742982 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/b7668a34-2cd6-4873-a55a-27839a612a2b-config-data\") pod \"horizon-6d547648b9-kxd9p\" (UID: \"b7668a34-2cd6-4873-a55a-27839a612a2b\") " pod="openstack/horizon-6d547648b9-kxd9p" Sep 30 07:50:17 crc kubenswrapper[4760]: I0930 07:50:17.743001 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b7668a34-2cd6-4873-a55a-27839a612a2b-scripts\") pod \"horizon-6d547648b9-kxd9p\" (UID: \"b7668a34-2cd6-4873-a55a-27839a612a2b\") " pod="openstack/horizon-6d547648b9-kxd9p" Sep 30 07:50:17 crc kubenswrapper[4760]: I0930 07:50:17.743057 4760 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/439d75d2-8c13-4f14-9c93-c0a165a439ce-dns-svc\") on node \"crc\" DevicePath \"\"" Sep 30 07:50:17 crc kubenswrapper[4760]: I0930 07:50:17.743068 4760 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/439d75d2-8c13-4f14-9c93-c0a165a439ce-config\") on node \"crc\" DevicePath \"\"" Sep 30 07:50:17 crc kubenswrapper[4760]: I0930 07:50:17.844403 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xsh25\" (UniqueName: 
\"kubernetes.io/projected/b7668a34-2cd6-4873-a55a-27839a612a2b-kube-api-access-xsh25\") pod \"horizon-6d547648b9-kxd9p\" (UID: \"b7668a34-2cd6-4873-a55a-27839a612a2b\") " pod="openstack/horizon-6d547648b9-kxd9p" Sep 30 07:50:17 crc kubenswrapper[4760]: I0930 07:50:17.844463 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b7668a34-2cd6-4873-a55a-27839a612a2b-logs\") pod \"horizon-6d547648b9-kxd9p\" (UID: \"b7668a34-2cd6-4873-a55a-27839a612a2b\") " pod="openstack/horizon-6d547648b9-kxd9p" Sep 30 07:50:17 crc kubenswrapper[4760]: I0930 07:50:17.844503 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nplf9\" (UniqueName: \"kubernetes.io/projected/7390084a-0022-4943-914f-fdb71e7ec326-kube-api-access-nplf9\") pod \"neutron-4a5c-account-create-74j84\" (UID: \"7390084a-0022-4943-914f-fdb71e7ec326\") " pod="openstack/neutron-4a5c-account-create-74j84" Sep 30 07:50:17 crc kubenswrapper[4760]: I0930 07:50:17.844530 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/b7668a34-2cd6-4873-a55a-27839a612a2b-horizon-secret-key\") pod \"horizon-6d547648b9-kxd9p\" (UID: \"b7668a34-2cd6-4873-a55a-27839a612a2b\") " pod="openstack/horizon-6d547648b9-kxd9p" Sep 30 07:50:17 crc kubenswrapper[4760]: I0930 07:50:17.844771 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/b7668a34-2cd6-4873-a55a-27839a612a2b-config-data\") pod \"horizon-6d547648b9-kxd9p\" (UID: \"b7668a34-2cd6-4873-a55a-27839a612a2b\") " pod="openstack/horizon-6d547648b9-kxd9p" Sep 30 07:50:17 crc kubenswrapper[4760]: I0930 07:50:17.844792 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b7668a34-2cd6-4873-a55a-27839a612a2b-scripts\") pod 
\"horizon-6d547648b9-kxd9p\" (UID: \"b7668a34-2cd6-4873-a55a-27839a612a2b\") " pod="openstack/horizon-6d547648b9-kxd9p" Sep 30 07:50:17 crc kubenswrapper[4760]: I0930 07:50:17.845600 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b7668a34-2cd6-4873-a55a-27839a612a2b-scripts\") pod \"horizon-6d547648b9-kxd9p\" (UID: \"b7668a34-2cd6-4873-a55a-27839a612a2b\") " pod="openstack/horizon-6d547648b9-kxd9p" Sep 30 07:50:17 crc kubenswrapper[4760]: I0930 07:50:17.845845 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b7668a34-2cd6-4873-a55a-27839a612a2b-logs\") pod \"horizon-6d547648b9-kxd9p\" (UID: \"b7668a34-2cd6-4873-a55a-27839a612a2b\") " pod="openstack/horizon-6d547648b9-kxd9p" Sep 30 07:50:17 crc kubenswrapper[4760]: I0930 07:50:17.846522 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/b7668a34-2cd6-4873-a55a-27839a612a2b-config-data\") pod \"horizon-6d547648b9-kxd9p\" (UID: \"b7668a34-2cd6-4873-a55a-27839a612a2b\") " pod="openstack/horizon-6d547648b9-kxd9p" Sep 30 07:50:17 crc kubenswrapper[4760]: I0930 07:50:17.861171 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/b7668a34-2cd6-4873-a55a-27839a612a2b-horizon-secret-key\") pod \"horizon-6d547648b9-kxd9p\" (UID: \"b7668a34-2cd6-4873-a55a-27839a612a2b\") " pod="openstack/horizon-6d547648b9-kxd9p" Sep 30 07:50:17 crc kubenswrapper[4760]: I0930 07:50:17.863895 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xsh25\" (UniqueName: \"kubernetes.io/projected/b7668a34-2cd6-4873-a55a-27839a612a2b-kube-api-access-xsh25\") pod \"horizon-6d547648b9-kxd9p\" (UID: \"b7668a34-2cd6-4873-a55a-27839a612a2b\") " pod="openstack/horizon-6d547648b9-kxd9p" Sep 30 07:50:17 crc 
kubenswrapper[4760]: I0930 07:50:17.864024 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nplf9\" (UniqueName: \"kubernetes.io/projected/7390084a-0022-4943-914f-fdb71e7ec326-kube-api-access-nplf9\") pod \"neutron-4a5c-account-create-74j84\" (UID: \"7390084a-0022-4943-914f-fdb71e7ec326\") " pod="openstack/neutron-4a5c-account-create-74j84" Sep 30 07:50:17 crc kubenswrapper[4760]: I0930 07:50:17.955193 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-4a5c-account-create-74j84" Sep 30 07:50:17 crc kubenswrapper[4760]: I0930 07:50:17.974724 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-6d547648b9-kxd9p" Sep 30 07:50:18 crc kubenswrapper[4760]: I0930 07:50:18.070675 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-b248-account-create-rd7nr"] Sep 30 07:50:18 crc kubenswrapper[4760]: I0930 07:50:18.186178 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6f8c45789f-9ps8z" event={"ID":"439d75d2-8c13-4f14-9c93-c0a165a439ce","Type":"ContainerDied","Data":"9ef7257fb058d8ff0e625f7955d1fa9ba9b2393d024d770506560abc483d8a60"} Sep 30 07:50:18 crc kubenswrapper[4760]: I0930 07:50:18.186249 4760 scope.go:117] "RemoveContainer" containerID="f1cf9eaff822682b0a3a3cf6a258dda68f33ff296de35f5da9b3ae6b563dc996" Sep 30 07:50:18 crc kubenswrapper[4760]: I0930 07:50:18.186200 4760 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6f8c45789f-9ps8z" Sep 30 07:50:18 crc kubenswrapper[4760]: I0930 07:50:18.193923 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-b248-account-create-rd7nr" event={"ID":"a9ce45c7-d994-44b1-9f88-297bac4ae9c2","Type":"ContainerStarted","Data":"53c275d4e2bc0486ef1a0bf5ed0064e6f6249ac73611a07090a2e1e5d14efaaa"} Sep 30 07:50:18 crc kubenswrapper[4760]: I0930 07:50:18.207796 4760 generic.go:334] "Generic (PLEG): container finished" podID="da7c2ace-093d-4279-bad0-5f2876f4ab8d" containerID="a5ff1c069a10d6c66f905281b77d71e51dbec9fdaea6b7f99e4c9703dc9cfce2" exitCode=0 Sep 30 07:50:18 crc kubenswrapper[4760]: I0930 07:50:18.209262 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-fcfdd6f9f-j5p4t" event={"ID":"da7c2ace-093d-4279-bad0-5f2876f4ab8d","Type":"ContainerDied","Data":"a5ff1c069a10d6c66f905281b77d71e51dbec9fdaea6b7f99e4c9703dc9cfce2"} Sep 30 07:50:18 crc kubenswrapper[4760]: I0930 07:50:18.287938 4760 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6f8c45789f-9ps8z"] Sep 30 07:50:18 crc kubenswrapper[4760]: I0930 07:50:18.296389 4760 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6f8c45789f-9ps8z"] Sep 30 07:50:18 crc kubenswrapper[4760]: I0930 07:50:18.316976 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-2f94-account-create-v8rw4"] Sep 30 07:50:18 crc kubenswrapper[4760]: W0930 07:50:18.323488 4760 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod218f5d61_bfb1_4609_9296_4a5b6471ea56.slice/crio-2b052913dc5bbbaabdeea224845841467dc9c1c40b91e66b2bbb7da3f4a724e6 WatchSource:0}: Error finding container 2b052913dc5bbbaabdeea224845841467dc9c1c40b91e66b2bbb7da3f4a724e6: Status 404 returned error can't find the container with id 2b052913dc5bbbaabdeea224845841467dc9c1c40b91e66b2bbb7da3f4a724e6 Sep 
30 07:50:18 crc kubenswrapper[4760]: I0930 07:50:18.572506 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-6d547648b9-kxd9p"] Sep 30 07:50:18 crc kubenswrapper[4760]: W0930 07:50:18.599640 4760 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb7668a34_2cd6_4873_a55a_27839a612a2b.slice/crio-57575924b02b4da28daeadfe08ce08924dc06c0caa5dd5f39dfbf68bcfa03848 WatchSource:0}: Error finding container 57575924b02b4da28daeadfe08ce08924dc06c0caa5dd5f39dfbf68bcfa03848: Status 404 returned error can't find the container with id 57575924b02b4da28daeadfe08ce08924dc06c0caa5dd5f39dfbf68bcfa03848 Sep 30 07:50:18 crc kubenswrapper[4760]: I0930 07:50:18.662786 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-4a5c-account-create-74j84"] Sep 30 07:50:18 crc kubenswrapper[4760]: W0930 07:50:18.675845 4760 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7390084a_0022_4943_914f_fdb71e7ec326.slice/crio-e56125de8cb376a58e19ff7fb20ed782e15de7523a498adbb8c0fb68672cad7d WatchSource:0}: Error finding container e56125de8cb376a58e19ff7fb20ed782e15de7523a498adbb8c0fb68672cad7d: Status 404 returned error can't find the container with id e56125de8cb376a58e19ff7fb20ed782e15de7523a498adbb8c0fb68672cad7d Sep 30 07:50:19 crc kubenswrapper[4760]: I0930 07:50:19.091585 4760 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="439d75d2-8c13-4f14-9c93-c0a165a439ce" path="/var/lib/kubelet/pods/439d75d2-8c13-4f14-9c93-c0a165a439ce/volumes" Sep 30 07:50:19 crc kubenswrapper[4760]: I0930 07:50:19.222089 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-6d547648b9-kxd9p" event={"ID":"b7668a34-2cd6-4873-a55a-27839a612a2b","Type":"ContainerStarted","Data":"57575924b02b4da28daeadfe08ce08924dc06c0caa5dd5f39dfbf68bcfa03848"} Sep 30 07:50:19 crc 
kubenswrapper[4760]: I0930 07:50:19.223522 4760 generic.go:334] "Generic (PLEG): container finished" podID="a9ce45c7-d994-44b1-9f88-297bac4ae9c2" containerID="239b861a506354fb750084154ec800cbd2d3fc062d4e82e7db73dd4f78aec132" exitCode=0 Sep 30 07:50:19 crc kubenswrapper[4760]: I0930 07:50:19.223587 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-b248-account-create-rd7nr" event={"ID":"a9ce45c7-d994-44b1-9f88-297bac4ae9c2","Type":"ContainerDied","Data":"239b861a506354fb750084154ec800cbd2d3fc062d4e82e7db73dd4f78aec132"} Sep 30 07:50:19 crc kubenswrapper[4760]: I0930 07:50:19.237499 4760 generic.go:334] "Generic (PLEG): container finished" podID="7390084a-0022-4943-914f-fdb71e7ec326" containerID="bdd4ea835d0289cf353876c7bf6351106595165c513520cdcb0656f3058af201" exitCode=0 Sep 30 07:50:19 crc kubenswrapper[4760]: I0930 07:50:19.237579 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-4a5c-account-create-74j84" event={"ID":"7390084a-0022-4943-914f-fdb71e7ec326","Type":"ContainerDied","Data":"bdd4ea835d0289cf353876c7bf6351106595165c513520cdcb0656f3058af201"} Sep 30 07:50:19 crc kubenswrapper[4760]: I0930 07:50:19.237604 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-4a5c-account-create-74j84" event={"ID":"7390084a-0022-4943-914f-fdb71e7ec326","Type":"ContainerStarted","Data":"e56125de8cb376a58e19ff7fb20ed782e15de7523a498adbb8c0fb68672cad7d"} Sep 30 07:50:19 crc kubenswrapper[4760]: I0930 07:50:19.256236 4760 generic.go:334] "Generic (PLEG): container finished" podID="218f5d61-bfb1-4609-9296-4a5b6471ea56" containerID="07e370e1283fea479cd7b02061f78e0b1d8778265c1bd0c690a3650449e5f549" exitCode=0 Sep 30 07:50:19 crc kubenswrapper[4760]: I0930 07:50:19.256336 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-2f94-account-create-v8rw4" 
event={"ID":"218f5d61-bfb1-4609-9296-4a5b6471ea56","Type":"ContainerDied","Data":"07e370e1283fea479cd7b02061f78e0b1d8778265c1bd0c690a3650449e5f549"} Sep 30 07:50:19 crc kubenswrapper[4760]: I0930 07:50:19.256358 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-2f94-account-create-v8rw4" event={"ID":"218f5d61-bfb1-4609-9296-4a5b6471ea56","Type":"ContainerStarted","Data":"2b052913dc5bbbaabdeea224845841467dc9c1c40b91e66b2bbb7da3f4a724e6"} Sep 30 07:50:19 crc kubenswrapper[4760]: I0930 07:50:19.262413 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-fcfdd6f9f-j5p4t" event={"ID":"da7c2ace-093d-4279-bad0-5f2876f4ab8d","Type":"ContainerStarted","Data":"bcf47ca8284d148fa2358a3feb62e90bc3ea5d9a074723dd52bfb40d5a484182"} Sep 30 07:50:19 crc kubenswrapper[4760]: I0930 07:50:19.263388 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-fcfdd6f9f-j5p4t" Sep 30 07:50:19 crc kubenswrapper[4760]: I0930 07:50:19.288914 4760 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-fcfdd6f9f-j5p4t" podStartSLOduration=4.288895521 podStartE2EDuration="4.288895521s" podCreationTimestamp="2025-09-30 07:50:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 07:50:19.288623304 +0000 UTC m=+1004.931529716" watchObservedRunningTime="2025-09-30 07:50:19.288895521 +0000 UTC m=+1004.931801933" Sep 30 07:50:20 crc kubenswrapper[4760]: I0930 07:50:20.303949 4760 generic.go:334] "Generic (PLEG): container finished" podID="cf25d474-c105-4c8b-87ad-0911e245056f" containerID="8ff322d29972321c37183515cd622af66909c70fa1837cb7b3f96ab838b1554a" exitCode=0 Sep 30 07:50:20 crc kubenswrapper[4760]: I0930 07:50:20.304480 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-db-sync-mpstv" 
event={"ID":"cf25d474-c105-4c8b-87ad-0911e245056f","Type":"ContainerDied","Data":"8ff322d29972321c37183515cd622af66909c70fa1837cb7b3f96ab838b1554a"} Sep 30 07:50:21 crc kubenswrapper[4760]: I0930 07:50:21.330660 4760 generic.go:334] "Generic (PLEG): container finished" podID="e00788e1-5a38-4884-8884-1be4c2ceca22" containerID="6ec9d5a22f841fdef7de549baa0a72a7a08951cc585d48d88bdbfc47d7ac96e1" exitCode=0 Sep 30 07:50:21 crc kubenswrapper[4760]: I0930 07:50:21.330732 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-kbxt8" event={"ID":"e00788e1-5a38-4884-8884-1be4c2ceca22","Type":"ContainerDied","Data":"6ec9d5a22f841fdef7de549baa0a72a7a08951cc585d48d88bdbfc47d7ac96e1"} Sep 30 07:50:22 crc kubenswrapper[4760]: I0930 07:50:22.756512 4760 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-2f94-account-create-v8rw4" Sep 30 07:50:22 crc kubenswrapper[4760]: I0930 07:50:22.762897 4760 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-b248-account-create-rd7nr" Sep 30 07:50:22 crc kubenswrapper[4760]: I0930 07:50:22.779159 4760 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-db-sync-mpstv" Sep 30 07:50:22 crc kubenswrapper[4760]: I0930 07:50:22.785854 4760 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-4a5c-account-create-74j84" Sep 30 07:50:22 crc kubenswrapper[4760]: I0930 07:50:22.874904 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r4mwg\" (UniqueName: \"kubernetes.io/projected/218f5d61-bfb1-4609-9296-4a5b6471ea56-kube-api-access-r4mwg\") pod \"218f5d61-bfb1-4609-9296-4a5b6471ea56\" (UID: \"218f5d61-bfb1-4609-9296-4a5b6471ea56\") " Sep 30 07:50:22 crc kubenswrapper[4760]: I0930 07:50:22.874974 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vsn9q\" (UniqueName: \"kubernetes.io/projected/a9ce45c7-d994-44b1-9f88-297bac4ae9c2-kube-api-access-vsn9q\") pod \"a9ce45c7-d994-44b1-9f88-297bac4ae9c2\" (UID: \"a9ce45c7-d994-44b1-9f88-297bac4ae9c2\") " Sep 30 07:50:22 crc kubenswrapper[4760]: I0930 07:50:22.883824 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/218f5d61-bfb1-4609-9296-4a5b6471ea56-kube-api-access-r4mwg" (OuterVolumeSpecName: "kube-api-access-r4mwg") pod "218f5d61-bfb1-4609-9296-4a5b6471ea56" (UID: "218f5d61-bfb1-4609-9296-4a5b6471ea56"). InnerVolumeSpecName "kube-api-access-r4mwg". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 07:50:22 crc kubenswrapper[4760]: I0930 07:50:22.884577 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a9ce45c7-d994-44b1-9f88-297bac4ae9c2-kube-api-access-vsn9q" (OuterVolumeSpecName: "kube-api-access-vsn9q") pod "a9ce45c7-d994-44b1-9f88-297bac4ae9c2" (UID: "a9ce45c7-d994-44b1-9f88-297bac4ae9c2"). InnerVolumeSpecName "kube-api-access-vsn9q". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 07:50:22 crc kubenswrapper[4760]: I0930 07:50:22.976589 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nplf9\" (UniqueName: \"kubernetes.io/projected/7390084a-0022-4943-914f-fdb71e7ec326-kube-api-access-nplf9\") pod \"7390084a-0022-4943-914f-fdb71e7ec326\" (UID: \"7390084a-0022-4943-914f-fdb71e7ec326\") " Sep 30 07:50:22 crc kubenswrapper[4760]: I0930 07:50:22.976856 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c2v22\" (UniqueName: \"kubernetes.io/projected/cf25d474-c105-4c8b-87ad-0911e245056f-kube-api-access-c2v22\") pod \"cf25d474-c105-4c8b-87ad-0911e245056f\" (UID: \"cf25d474-c105-4c8b-87ad-0911e245056f\") " Sep 30 07:50:22 crc kubenswrapper[4760]: I0930 07:50:22.978679 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/cf25d474-c105-4c8b-87ad-0911e245056f-db-sync-config-data\") pod \"cf25d474-c105-4c8b-87ad-0911e245056f\" (UID: \"cf25d474-c105-4c8b-87ad-0911e245056f\") " Sep 30 07:50:22 crc kubenswrapper[4760]: I0930 07:50:22.979428 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cf25d474-c105-4c8b-87ad-0911e245056f-combined-ca-bundle\") pod \"cf25d474-c105-4c8b-87ad-0911e245056f\" (UID: \"cf25d474-c105-4c8b-87ad-0911e245056f\") " Sep 30 07:50:22 crc kubenswrapper[4760]: I0930 07:50:22.979910 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cf25d474-c105-4c8b-87ad-0911e245056f-config-data\") pod \"cf25d474-c105-4c8b-87ad-0911e245056f\" (UID: \"cf25d474-c105-4c8b-87ad-0911e245056f\") " Sep 30 07:50:22 crc kubenswrapper[4760]: I0930 07:50:22.982093 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/cf25d474-c105-4c8b-87ad-0911e245056f-kube-api-access-c2v22" (OuterVolumeSpecName: "kube-api-access-c2v22") pod "cf25d474-c105-4c8b-87ad-0911e245056f" (UID: "cf25d474-c105-4c8b-87ad-0911e245056f"). InnerVolumeSpecName "kube-api-access-c2v22". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 07:50:22 crc kubenswrapper[4760]: I0930 07:50:22.982555 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7390084a-0022-4943-914f-fdb71e7ec326-kube-api-access-nplf9" (OuterVolumeSpecName: "kube-api-access-nplf9") pod "7390084a-0022-4943-914f-fdb71e7ec326" (UID: "7390084a-0022-4943-914f-fdb71e7ec326"). InnerVolumeSpecName "kube-api-access-nplf9". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 07:50:22 crc kubenswrapper[4760]: I0930 07:50:22.983751 4760 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nplf9\" (UniqueName: \"kubernetes.io/projected/7390084a-0022-4943-914f-fdb71e7ec326-kube-api-access-nplf9\") on node \"crc\" DevicePath \"\"" Sep 30 07:50:22 crc kubenswrapper[4760]: I0930 07:50:22.983772 4760 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r4mwg\" (UniqueName: \"kubernetes.io/projected/218f5d61-bfb1-4609-9296-4a5b6471ea56-kube-api-access-r4mwg\") on node \"crc\" DevicePath \"\"" Sep 30 07:50:22 crc kubenswrapper[4760]: I0930 07:50:22.983783 4760 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vsn9q\" (UniqueName: \"kubernetes.io/projected/a9ce45c7-d994-44b1-9f88-297bac4ae9c2-kube-api-access-vsn9q\") on node \"crc\" DevicePath \"\"" Sep 30 07:50:22 crc kubenswrapper[4760]: I0930 07:50:22.983791 4760 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c2v22\" (UniqueName: \"kubernetes.io/projected/cf25d474-c105-4c8b-87ad-0911e245056f-kube-api-access-c2v22\") on node \"crc\" DevicePath \"\"" Sep 30 07:50:22 crc kubenswrapper[4760]: I0930 07:50:22.988845 4760 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cf25d474-c105-4c8b-87ad-0911e245056f-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "cf25d474-c105-4c8b-87ad-0911e245056f" (UID: "cf25d474-c105-4c8b-87ad-0911e245056f"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 07:50:23 crc kubenswrapper[4760]: I0930 07:50:23.023361 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cf25d474-c105-4c8b-87ad-0911e245056f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "cf25d474-c105-4c8b-87ad-0911e245056f" (UID: "cf25d474-c105-4c8b-87ad-0911e245056f"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 07:50:23 crc kubenswrapper[4760]: I0930 07:50:23.039730 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cf25d474-c105-4c8b-87ad-0911e245056f-config-data" (OuterVolumeSpecName: "config-data") pod "cf25d474-c105-4c8b-87ad-0911e245056f" (UID: "cf25d474-c105-4c8b-87ad-0911e245056f"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 07:50:23 crc kubenswrapper[4760]: I0930 07:50:23.085070 4760 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/cf25d474-c105-4c8b-87ad-0911e245056f-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Sep 30 07:50:23 crc kubenswrapper[4760]: I0930 07:50:23.085101 4760 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cf25d474-c105-4c8b-87ad-0911e245056f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 07:50:23 crc kubenswrapper[4760]: I0930 07:50:23.085110 4760 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cf25d474-c105-4c8b-87ad-0911e245056f-config-data\") on node \"crc\" DevicePath \"\"" Sep 30 07:50:23 crc kubenswrapper[4760]: I0930 07:50:23.352164 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-db-sync-mpstv" event={"ID":"cf25d474-c105-4c8b-87ad-0911e245056f","Type":"ContainerDied","Data":"16cf11062dda106ae6ba38f1614f80fc9c1495636e515b480cd187686398716f"} Sep 30 07:50:23 crc kubenswrapper[4760]: I0930 07:50:23.352444 4760 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="16cf11062dda106ae6ba38f1614f80fc9c1495636e515b480cd187686398716f" Sep 30 07:50:23 crc kubenswrapper[4760]: I0930 07:50:23.352211 4760 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/watcher-db-sync-mpstv" Sep 30 07:50:23 crc kubenswrapper[4760]: I0930 07:50:23.356318 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-b248-account-create-rd7nr" event={"ID":"a9ce45c7-d994-44b1-9f88-297bac4ae9c2","Type":"ContainerDied","Data":"53c275d4e2bc0486ef1a0bf5ed0064e6f6249ac73611a07090a2e1e5d14efaaa"} Sep 30 07:50:23 crc kubenswrapper[4760]: I0930 07:50:23.356351 4760 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="53c275d4e2bc0486ef1a0bf5ed0064e6f6249ac73611a07090a2e1e5d14efaaa" Sep 30 07:50:23 crc kubenswrapper[4760]: I0930 07:50:23.356331 4760 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-b248-account-create-rd7nr" Sep 30 07:50:23 crc kubenswrapper[4760]: I0930 07:50:23.358166 4760 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-4a5c-account-create-74j84" Sep 30 07:50:23 crc kubenswrapper[4760]: I0930 07:50:23.358255 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-4a5c-account-create-74j84" event={"ID":"7390084a-0022-4943-914f-fdb71e7ec326","Type":"ContainerDied","Data":"e56125de8cb376a58e19ff7fb20ed782e15de7523a498adbb8c0fb68672cad7d"} Sep 30 07:50:23 crc kubenswrapper[4760]: I0930 07:50:23.358330 4760 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e56125de8cb376a58e19ff7fb20ed782e15de7523a498adbb8c0fb68672cad7d" Sep 30 07:50:23 crc kubenswrapper[4760]: I0930 07:50:23.364054 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-2f94-account-create-v8rw4" event={"ID":"218f5d61-bfb1-4609-9296-4a5b6471ea56","Type":"ContainerDied","Data":"2b052913dc5bbbaabdeea224845841467dc9c1c40b91e66b2bbb7da3f4a724e6"} Sep 30 07:50:23 crc kubenswrapper[4760]: I0930 07:50:23.364083 4760 pod_container_deletor.go:80] "Container not found in pod's containers" 
containerID="2b052913dc5bbbaabdeea224845841467dc9c1c40b91e66b2bbb7da3f4a724e6" Sep 30 07:50:23 crc kubenswrapper[4760]: I0930 07:50:23.364106 4760 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-2f94-account-create-v8rw4" Sep 30 07:50:24 crc kubenswrapper[4760]: I0930 07:50:24.045963 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/watcher-api-0"] Sep 30 07:50:24 crc kubenswrapper[4760]: E0930 07:50:24.051373 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7390084a-0022-4943-914f-fdb71e7ec326" containerName="mariadb-account-create" Sep 30 07:50:24 crc kubenswrapper[4760]: I0930 07:50:24.051415 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="7390084a-0022-4943-914f-fdb71e7ec326" containerName="mariadb-account-create" Sep 30 07:50:24 crc kubenswrapper[4760]: E0930 07:50:24.051443 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cf25d474-c105-4c8b-87ad-0911e245056f" containerName="watcher-db-sync" Sep 30 07:50:24 crc kubenswrapper[4760]: I0930 07:50:24.051455 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="cf25d474-c105-4c8b-87ad-0911e245056f" containerName="watcher-db-sync" Sep 30 07:50:24 crc kubenswrapper[4760]: E0930 07:50:24.051480 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a9ce45c7-d994-44b1-9f88-297bac4ae9c2" containerName="mariadb-account-create" Sep 30 07:50:24 crc kubenswrapper[4760]: I0930 07:50:24.051488 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="a9ce45c7-d994-44b1-9f88-297bac4ae9c2" containerName="mariadb-account-create" Sep 30 07:50:24 crc kubenswrapper[4760]: E0930 07:50:24.051508 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="218f5d61-bfb1-4609-9296-4a5b6471ea56" containerName="mariadb-account-create" Sep 30 07:50:24 crc kubenswrapper[4760]: I0930 07:50:24.051514 4760 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="218f5d61-bfb1-4609-9296-4a5b6471ea56" containerName="mariadb-account-create" Sep 30 07:50:24 crc kubenswrapper[4760]: I0930 07:50:24.051855 4760 memory_manager.go:354] "RemoveStaleState removing state" podUID="a9ce45c7-d994-44b1-9f88-297bac4ae9c2" containerName="mariadb-account-create" Sep 30 07:50:24 crc kubenswrapper[4760]: I0930 07:50:24.051878 4760 memory_manager.go:354] "RemoveStaleState removing state" podUID="218f5d61-bfb1-4609-9296-4a5b6471ea56" containerName="mariadb-account-create" Sep 30 07:50:24 crc kubenswrapper[4760]: I0930 07:50:24.051906 4760 memory_manager.go:354] "RemoveStaleState removing state" podUID="cf25d474-c105-4c8b-87ad-0911e245056f" containerName="watcher-db-sync" Sep 30 07:50:24 crc kubenswrapper[4760]: I0930 07:50:24.051919 4760 memory_manager.go:354] "RemoveStaleState removing state" podUID="7390084a-0022-4943-914f-fdb71e7ec326" containerName="mariadb-account-create" Sep 30 07:50:24 crc kubenswrapper[4760]: I0930 07:50:24.052842 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-api-0" Sep 30 07:50:24 crc kubenswrapper[4760]: I0930 07:50:24.058034 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"watcher-watcher-dockercfg-blgm4" Sep 30 07:50:24 crc kubenswrapper[4760]: I0930 07:50:24.058051 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"watcher-api-config-data" Sep 30 07:50:24 crc kubenswrapper[4760]: I0930 07:50:24.069811 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-api-0"] Sep 30 07:50:24 crc kubenswrapper[4760]: I0930 07:50:24.080730 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/watcher-decision-engine-0"] Sep 30 07:50:24 crc kubenswrapper[4760]: I0930 07:50:24.082223 4760 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/watcher-decision-engine-0" Sep 30 07:50:24 crc kubenswrapper[4760]: I0930 07:50:24.091414 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"watcher-decision-engine-config-data" Sep 30 07:50:24 crc kubenswrapper[4760]: I0930 07:50:24.107120 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-decision-engine-0"] Sep 30 07:50:24 crc kubenswrapper[4760]: I0930 07:50:24.111731 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f1e2802c-ec39-48ee-a6c4-b609a57aa0c5-config-data\") pod \"watcher-api-0\" (UID: \"f1e2802c-ec39-48ee-a6c4-b609a57aa0c5\") " pod="openstack/watcher-api-0" Sep 30 07:50:24 crc kubenswrapper[4760]: I0930 07:50:24.111828 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/f1e2802c-ec39-48ee-a6c4-b609a57aa0c5-custom-prometheus-ca\") pod \"watcher-api-0\" (UID: \"f1e2802c-ec39-48ee-a6c4-b609a57aa0c5\") " pod="openstack/watcher-api-0" Sep 30 07:50:24 crc kubenswrapper[4760]: I0930 07:50:24.112202 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f1e2802c-ec39-48ee-a6c4-b609a57aa0c5-logs\") pod \"watcher-api-0\" (UID: \"f1e2802c-ec39-48ee-a6c4-b609a57aa0c5\") " pod="openstack/watcher-api-0" Sep 30 07:50:24 crc kubenswrapper[4760]: I0930 07:50:24.112296 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6prkr\" (UniqueName: \"kubernetes.io/projected/f1e2802c-ec39-48ee-a6c4-b609a57aa0c5-kube-api-access-6prkr\") pod \"watcher-api-0\" (UID: \"f1e2802c-ec39-48ee-a6c4-b609a57aa0c5\") " pod="openstack/watcher-api-0" Sep 30 07:50:24 crc kubenswrapper[4760]: I0930 07:50:24.112380 4760 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f1e2802c-ec39-48ee-a6c4-b609a57aa0c5-combined-ca-bundle\") pod \"watcher-api-0\" (UID: \"f1e2802c-ec39-48ee-a6c4-b609a57aa0c5\") " pod="openstack/watcher-api-0" Sep 30 07:50:24 crc kubenswrapper[4760]: I0930 07:50:24.153640 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/watcher-applier-0"] Sep 30 07:50:24 crc kubenswrapper[4760]: I0930 07:50:24.155034 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-applier-0" Sep 30 07:50:24 crc kubenswrapper[4760]: I0930 07:50:24.158057 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"watcher-applier-config-data" Sep 30 07:50:24 crc kubenswrapper[4760]: I0930 07:50:24.166725 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-applier-0"] Sep 30 07:50:24 crc kubenswrapper[4760]: I0930 07:50:24.213830 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f1e2802c-ec39-48ee-a6c4-b609a57aa0c5-combined-ca-bundle\") pod \"watcher-api-0\" (UID: \"f1e2802c-ec39-48ee-a6c4-b609a57aa0c5\") " pod="openstack/watcher-api-0" Sep 30 07:50:24 crc kubenswrapper[4760]: I0930 07:50:24.213892 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f1e2802c-ec39-48ee-a6c4-b609a57aa0c5-config-data\") pod \"watcher-api-0\" (UID: \"f1e2802c-ec39-48ee-a6c4-b609a57aa0c5\") " pod="openstack/watcher-api-0" Sep 30 07:50:24 crc kubenswrapper[4760]: I0930 07:50:24.213926 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nh2p5\" (UniqueName: \"kubernetes.io/projected/bd8762e1-d4b3-4999-996e-db79b881afec-kube-api-access-nh2p5\") pod 
\"watcher-decision-engine-0\" (UID: \"bd8762e1-d4b3-4999-996e-db79b881afec\") " pod="openstack/watcher-decision-engine-0" Sep 30 07:50:24 crc kubenswrapper[4760]: I0930 07:50:24.213965 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bd8762e1-d4b3-4999-996e-db79b881afec-logs\") pod \"watcher-decision-engine-0\" (UID: \"bd8762e1-d4b3-4999-996e-db79b881afec\") " pod="openstack/watcher-decision-engine-0" Sep 30 07:50:24 crc kubenswrapper[4760]: I0930 07:50:24.213996 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/f1e2802c-ec39-48ee-a6c4-b609a57aa0c5-custom-prometheus-ca\") pod \"watcher-api-0\" (UID: \"f1e2802c-ec39-48ee-a6c4-b609a57aa0c5\") " pod="openstack/watcher-api-0" Sep 30 07:50:24 crc kubenswrapper[4760]: I0930 07:50:24.214048 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bd8762e1-d4b3-4999-996e-db79b881afec-config-data\") pod \"watcher-decision-engine-0\" (UID: \"bd8762e1-d4b3-4999-996e-db79b881afec\") " pod="openstack/watcher-decision-engine-0" Sep 30 07:50:24 crc kubenswrapper[4760]: I0930 07:50:24.214071 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bd8762e1-d4b3-4999-996e-db79b881afec-combined-ca-bundle\") pod \"watcher-decision-engine-0\" (UID: \"bd8762e1-d4b3-4999-996e-db79b881afec\") " pod="openstack/watcher-decision-engine-0" Sep 30 07:50:24 crc kubenswrapper[4760]: I0930 07:50:24.214088 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f1e2802c-ec39-48ee-a6c4-b609a57aa0c5-logs\") pod \"watcher-api-0\" (UID: \"f1e2802c-ec39-48ee-a6c4-b609a57aa0c5\") " 
pod="openstack/watcher-api-0" Sep 30 07:50:24 crc kubenswrapper[4760]: I0930 07:50:24.214117 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/bd8762e1-d4b3-4999-996e-db79b881afec-custom-prometheus-ca\") pod \"watcher-decision-engine-0\" (UID: \"bd8762e1-d4b3-4999-996e-db79b881afec\") " pod="openstack/watcher-decision-engine-0" Sep 30 07:50:24 crc kubenswrapper[4760]: I0930 07:50:24.214136 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6prkr\" (UniqueName: \"kubernetes.io/projected/f1e2802c-ec39-48ee-a6c4-b609a57aa0c5-kube-api-access-6prkr\") pod \"watcher-api-0\" (UID: \"f1e2802c-ec39-48ee-a6c4-b609a57aa0c5\") " pod="openstack/watcher-api-0" Sep 30 07:50:24 crc kubenswrapper[4760]: I0930 07:50:24.219005 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f1e2802c-ec39-48ee-a6c4-b609a57aa0c5-logs\") pod \"watcher-api-0\" (UID: \"f1e2802c-ec39-48ee-a6c4-b609a57aa0c5\") " pod="openstack/watcher-api-0" Sep 30 07:50:24 crc kubenswrapper[4760]: I0930 07:50:24.219894 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/f1e2802c-ec39-48ee-a6c4-b609a57aa0c5-custom-prometheus-ca\") pod \"watcher-api-0\" (UID: \"f1e2802c-ec39-48ee-a6c4-b609a57aa0c5\") " pod="openstack/watcher-api-0" Sep 30 07:50:24 crc kubenswrapper[4760]: I0930 07:50:24.220373 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f1e2802c-ec39-48ee-a6c4-b609a57aa0c5-combined-ca-bundle\") pod \"watcher-api-0\" (UID: \"f1e2802c-ec39-48ee-a6c4-b609a57aa0c5\") " pod="openstack/watcher-api-0" Sep 30 07:50:24 crc kubenswrapper[4760]: I0930 07:50:24.221123 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"config-data\" (UniqueName: \"kubernetes.io/secret/f1e2802c-ec39-48ee-a6c4-b609a57aa0c5-config-data\") pod \"watcher-api-0\" (UID: \"f1e2802c-ec39-48ee-a6c4-b609a57aa0c5\") " pod="openstack/watcher-api-0" Sep 30 07:50:24 crc kubenswrapper[4760]: I0930 07:50:24.249837 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6prkr\" (UniqueName: \"kubernetes.io/projected/f1e2802c-ec39-48ee-a6c4-b609a57aa0c5-kube-api-access-6prkr\") pod \"watcher-api-0\" (UID: \"f1e2802c-ec39-48ee-a6c4-b609a57aa0c5\") " pod="openstack/watcher-api-0" Sep 30 07:50:24 crc kubenswrapper[4760]: I0930 07:50:24.316102 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fl48b\" (UniqueName: \"kubernetes.io/projected/7c12cbe5-a6fb-4ead-bb65-cd13dab410ce-kube-api-access-fl48b\") pod \"watcher-applier-0\" (UID: \"7c12cbe5-a6fb-4ead-bb65-cd13dab410ce\") " pod="openstack/watcher-applier-0" Sep 30 07:50:24 crc kubenswrapper[4760]: I0930 07:50:24.316150 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nh2p5\" (UniqueName: \"kubernetes.io/projected/bd8762e1-d4b3-4999-996e-db79b881afec-kube-api-access-nh2p5\") pod \"watcher-decision-engine-0\" (UID: \"bd8762e1-d4b3-4999-996e-db79b881afec\") " pod="openstack/watcher-decision-engine-0" Sep 30 07:50:24 crc kubenswrapper[4760]: I0930 07:50:24.316178 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bd8762e1-d4b3-4999-996e-db79b881afec-logs\") pod \"watcher-decision-engine-0\" (UID: \"bd8762e1-d4b3-4999-996e-db79b881afec\") " pod="openstack/watcher-decision-engine-0" Sep 30 07:50:24 crc kubenswrapper[4760]: I0930 07:50:24.316202 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7c12cbe5-a6fb-4ead-bb65-cd13dab410ce-config-data\") 
pod \"watcher-applier-0\" (UID: \"7c12cbe5-a6fb-4ead-bb65-cd13dab410ce\") " pod="openstack/watcher-applier-0" Sep 30 07:50:24 crc kubenswrapper[4760]: I0930 07:50:24.316268 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bd8762e1-d4b3-4999-996e-db79b881afec-config-data\") pod \"watcher-decision-engine-0\" (UID: \"bd8762e1-d4b3-4999-996e-db79b881afec\") " pod="openstack/watcher-decision-engine-0" Sep 30 07:50:24 crc kubenswrapper[4760]: I0930 07:50:24.316293 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bd8762e1-d4b3-4999-996e-db79b881afec-combined-ca-bundle\") pod \"watcher-decision-engine-0\" (UID: \"bd8762e1-d4b3-4999-996e-db79b881afec\") " pod="openstack/watcher-decision-engine-0" Sep 30 07:50:24 crc kubenswrapper[4760]: I0930 07:50:24.316351 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/bd8762e1-d4b3-4999-996e-db79b881afec-custom-prometheus-ca\") pod \"watcher-decision-engine-0\" (UID: \"bd8762e1-d4b3-4999-996e-db79b881afec\") " pod="openstack/watcher-decision-engine-0" Sep 30 07:50:24 crc kubenswrapper[4760]: I0930 07:50:24.316385 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7c12cbe5-a6fb-4ead-bb65-cd13dab410ce-combined-ca-bundle\") pod \"watcher-applier-0\" (UID: \"7c12cbe5-a6fb-4ead-bb65-cd13dab410ce\") " pod="openstack/watcher-applier-0" Sep 30 07:50:24 crc kubenswrapper[4760]: I0930 07:50:24.316407 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7c12cbe5-a6fb-4ead-bb65-cd13dab410ce-logs\") pod \"watcher-applier-0\" (UID: \"7c12cbe5-a6fb-4ead-bb65-cd13dab410ce\") " 
pod="openstack/watcher-applier-0" Sep 30 07:50:24 crc kubenswrapper[4760]: I0930 07:50:24.316691 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bd8762e1-d4b3-4999-996e-db79b881afec-logs\") pod \"watcher-decision-engine-0\" (UID: \"bd8762e1-d4b3-4999-996e-db79b881afec\") " pod="openstack/watcher-decision-engine-0" Sep 30 07:50:24 crc kubenswrapper[4760]: I0930 07:50:24.326569 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bd8762e1-d4b3-4999-996e-db79b881afec-config-data\") pod \"watcher-decision-engine-0\" (UID: \"bd8762e1-d4b3-4999-996e-db79b881afec\") " pod="openstack/watcher-decision-engine-0" Sep 30 07:50:24 crc kubenswrapper[4760]: I0930 07:50:24.327668 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bd8762e1-d4b3-4999-996e-db79b881afec-combined-ca-bundle\") pod \"watcher-decision-engine-0\" (UID: \"bd8762e1-d4b3-4999-996e-db79b881afec\") " pod="openstack/watcher-decision-engine-0" Sep 30 07:50:24 crc kubenswrapper[4760]: I0930 07:50:24.328264 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/bd8762e1-d4b3-4999-996e-db79b881afec-custom-prometheus-ca\") pod \"watcher-decision-engine-0\" (UID: \"bd8762e1-d4b3-4999-996e-db79b881afec\") " pod="openstack/watcher-decision-engine-0" Sep 30 07:50:24 crc kubenswrapper[4760]: I0930 07:50:24.334140 4760 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-746f884dcc-8lkjw"] Sep 30 07:50:24 crc kubenswrapper[4760]: I0930 07:50:24.343196 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nh2p5\" (UniqueName: \"kubernetes.io/projected/bd8762e1-d4b3-4999-996e-db79b881afec-kube-api-access-nh2p5\") pod \"watcher-decision-engine-0\" (UID: 
\"bd8762e1-d4b3-4999-996e-db79b881afec\") " pod="openstack/watcher-decision-engine-0" Sep 30 07:50:24 crc kubenswrapper[4760]: I0930 07:50:24.365440 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-694db87c64-qrwhp"] Sep 30 07:50:24 crc kubenswrapper[4760]: I0930 07:50:24.368158 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-694db87c64-qrwhp" Sep 30 07:50:24 crc kubenswrapper[4760]: I0930 07:50:24.368910 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-api-0" Sep 30 07:50:24 crc kubenswrapper[4760]: I0930 07:50:24.373584 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-694db87c64-qrwhp"] Sep 30 07:50:24 crc kubenswrapper[4760]: I0930 07:50:24.378101 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-horizon-svc" Sep 30 07:50:24 crc kubenswrapper[4760]: I0930 07:50:24.400812 4760 generic.go:334] "Generic (PLEG): container finished" podID="09c9e208-48cc-44b8-9810-fc0cf69cea8a" containerID="1709f319e4cbbd5d960887fc4785bf4c61a3094232f12625c2012c901d224818" exitCode=0 Sep 30 07:50:24 crc kubenswrapper[4760]: I0930 07:50:24.400863 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-s45t8" event={"ID":"09c9e208-48cc-44b8-9810-fc0cf69cea8a","Type":"ContainerDied","Data":"1709f319e4cbbd5d960887fc4785bf4c61a3094232f12625c2012c901d224818"} Sep 30 07:50:24 crc kubenswrapper[4760]: I0930 07:50:24.411428 4760 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/watcher-decision-engine-0" Sep 30 07:50:24 crc kubenswrapper[4760]: I0930 07:50:24.413170 4760 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-6d547648b9-kxd9p"] Sep 30 07:50:24 crc kubenswrapper[4760]: I0930 07:50:24.418439 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fl48b\" (UniqueName: \"kubernetes.io/projected/7c12cbe5-a6fb-4ead-bb65-cd13dab410ce-kube-api-access-fl48b\") pod \"watcher-applier-0\" (UID: \"7c12cbe5-a6fb-4ead-bb65-cd13dab410ce\") " pod="openstack/watcher-applier-0" Sep 30 07:50:24 crc kubenswrapper[4760]: I0930 07:50:24.418507 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7c12cbe5-a6fb-4ead-bb65-cd13dab410ce-config-data\") pod \"watcher-applier-0\" (UID: \"7c12cbe5-a6fb-4ead-bb65-cd13dab410ce\") " pod="openstack/watcher-applier-0" Sep 30 07:50:24 crc kubenswrapper[4760]: I0930 07:50:24.420688 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7c12cbe5-a6fb-4ead-bb65-cd13dab410ce-combined-ca-bundle\") pod \"watcher-applier-0\" (UID: \"7c12cbe5-a6fb-4ead-bb65-cd13dab410ce\") " pod="openstack/watcher-applier-0" Sep 30 07:50:24 crc kubenswrapper[4760]: I0930 07:50:24.420814 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7c12cbe5-a6fb-4ead-bb65-cd13dab410ce-logs\") pod \"watcher-applier-0\" (UID: \"7c12cbe5-a6fb-4ead-bb65-cd13dab410ce\") " pod="openstack/watcher-applier-0" Sep 30 07:50:24 crc kubenswrapper[4760]: I0930 07:50:24.421526 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7c12cbe5-a6fb-4ead-bb65-cd13dab410ce-logs\") pod \"watcher-applier-0\" (UID: \"7c12cbe5-a6fb-4ead-bb65-cd13dab410ce\") " 
pod="openstack/watcher-applier-0" Sep 30 07:50:24 crc kubenswrapper[4760]: I0930 07:50:24.430064 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7c12cbe5-a6fb-4ead-bb65-cd13dab410ce-config-data\") pod \"watcher-applier-0\" (UID: \"7c12cbe5-a6fb-4ead-bb65-cd13dab410ce\") " pod="openstack/watcher-applier-0" Sep 30 07:50:24 crc kubenswrapper[4760]: I0930 07:50:24.439792 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7c12cbe5-a6fb-4ead-bb65-cd13dab410ce-combined-ca-bundle\") pod \"watcher-applier-0\" (UID: \"7c12cbe5-a6fb-4ead-bb65-cd13dab410ce\") " pod="openstack/watcher-applier-0" Sep 30 07:50:24 crc kubenswrapper[4760]: I0930 07:50:24.440845 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fl48b\" (UniqueName: \"kubernetes.io/projected/7c12cbe5-a6fb-4ead-bb65-cd13dab410ce-kube-api-access-fl48b\") pod \"watcher-applier-0\" (UID: \"7c12cbe5-a6fb-4ead-bb65-cd13dab410ce\") " pod="openstack/watcher-applier-0" Sep 30 07:50:24 crc kubenswrapper[4760]: I0930 07:50:24.455539 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-75644c8bb4-wrsmv"] Sep 30 07:50:24 crc kubenswrapper[4760]: I0930 07:50:24.457489 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-75644c8bb4-wrsmv" Sep 30 07:50:24 crc kubenswrapper[4760]: I0930 07:50:24.471342 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-75644c8bb4-wrsmv"] Sep 30 07:50:24 crc kubenswrapper[4760]: I0930 07:50:24.483718 4760 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/watcher-applier-0" Sep 30 07:50:24 crc kubenswrapper[4760]: I0930 07:50:24.522943 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pzls6\" (UniqueName: \"kubernetes.io/projected/e28d9015-ea18-4da0-bfd8-c2cc5dec4f37-kube-api-access-pzls6\") pod \"horizon-694db87c64-qrwhp\" (UID: \"e28d9015-ea18-4da0-bfd8-c2cc5dec4f37\") " pod="openstack/horizon-694db87c64-qrwhp" Sep 30 07:50:24 crc kubenswrapper[4760]: I0930 07:50:24.523065 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e28d9015-ea18-4da0-bfd8-c2cc5dec4f37-scripts\") pod \"horizon-694db87c64-qrwhp\" (UID: \"e28d9015-ea18-4da0-bfd8-c2cc5dec4f37\") " pod="openstack/horizon-694db87c64-qrwhp" Sep 30 07:50:24 crc kubenswrapper[4760]: I0930 07:50:24.523097 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e28d9015-ea18-4da0-bfd8-c2cc5dec4f37-logs\") pod \"horizon-694db87c64-qrwhp\" (UID: \"e28d9015-ea18-4da0-bfd8-c2cc5dec4f37\") " pod="openstack/horizon-694db87c64-qrwhp" Sep 30 07:50:24 crc kubenswrapper[4760]: I0930 07:50:24.523138 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/e28d9015-ea18-4da0-bfd8-c2cc5dec4f37-config-data\") pod \"horizon-694db87c64-qrwhp\" (UID: \"e28d9015-ea18-4da0-bfd8-c2cc5dec4f37\") " pod="openstack/horizon-694db87c64-qrwhp" Sep 30 07:50:24 crc kubenswrapper[4760]: I0930 07:50:24.523162 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/e28d9015-ea18-4da0-bfd8-c2cc5dec4f37-horizon-secret-key\") pod \"horizon-694db87c64-qrwhp\" (UID: \"e28d9015-ea18-4da0-bfd8-c2cc5dec4f37\") " 
pod="openstack/horizon-694db87c64-qrwhp" Sep 30 07:50:24 crc kubenswrapper[4760]: I0930 07:50:24.523358 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e28d9015-ea18-4da0-bfd8-c2cc5dec4f37-combined-ca-bundle\") pod \"horizon-694db87c64-qrwhp\" (UID: \"e28d9015-ea18-4da0-bfd8-c2cc5dec4f37\") " pod="openstack/horizon-694db87c64-qrwhp" Sep 30 07:50:24 crc kubenswrapper[4760]: I0930 07:50:24.523472 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/e28d9015-ea18-4da0-bfd8-c2cc5dec4f37-horizon-tls-certs\") pod \"horizon-694db87c64-qrwhp\" (UID: \"e28d9015-ea18-4da0-bfd8-c2cc5dec4f37\") " pod="openstack/horizon-694db87c64-qrwhp" Sep 30 07:50:24 crc kubenswrapper[4760]: I0930 07:50:24.625187 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e28d9015-ea18-4da0-bfd8-c2cc5dec4f37-scripts\") pod \"horizon-694db87c64-qrwhp\" (UID: \"e28d9015-ea18-4da0-bfd8-c2cc5dec4f37\") " pod="openstack/horizon-694db87c64-qrwhp" Sep 30 07:50:24 crc kubenswrapper[4760]: I0930 07:50:24.625485 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e28d9015-ea18-4da0-bfd8-c2cc5dec4f37-logs\") pod \"horizon-694db87c64-qrwhp\" (UID: \"e28d9015-ea18-4da0-bfd8-c2cc5dec4f37\") " pod="openstack/horizon-694db87c64-qrwhp" Sep 30 07:50:24 crc kubenswrapper[4760]: I0930 07:50:24.625518 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/e28d9015-ea18-4da0-bfd8-c2cc5dec4f37-config-data\") pod \"horizon-694db87c64-qrwhp\" (UID: \"e28d9015-ea18-4da0-bfd8-c2cc5dec4f37\") " pod="openstack/horizon-694db87c64-qrwhp" Sep 30 07:50:24 crc kubenswrapper[4760]: I0930 
07:50:24.625547 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/8b39ba3e-25df-4a22-a1fe-f15e6ca1fada-config-data\") pod \"horizon-75644c8bb4-wrsmv\" (UID: \"8b39ba3e-25df-4a22-a1fe-f15e6ca1fada\") " pod="openstack/horizon-75644c8bb4-wrsmv" Sep 30 07:50:24 crc kubenswrapper[4760]: I0930 07:50:24.625564 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/e28d9015-ea18-4da0-bfd8-c2cc5dec4f37-horizon-secret-key\") pod \"horizon-694db87c64-qrwhp\" (UID: \"e28d9015-ea18-4da0-bfd8-c2cc5dec4f37\") " pod="openstack/horizon-694db87c64-qrwhp" Sep 30 07:50:24 crc kubenswrapper[4760]: I0930 07:50:24.625578 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e28d9015-ea18-4da0-bfd8-c2cc5dec4f37-combined-ca-bundle\") pod \"horizon-694db87c64-qrwhp\" (UID: \"e28d9015-ea18-4da0-bfd8-c2cc5dec4f37\") " pod="openstack/horizon-694db87c64-qrwhp" Sep 30 07:50:24 crc kubenswrapper[4760]: I0930 07:50:24.625607 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8b39ba3e-25df-4a22-a1fe-f15e6ca1fada-scripts\") pod \"horizon-75644c8bb4-wrsmv\" (UID: \"8b39ba3e-25df-4a22-a1fe-f15e6ca1fada\") " pod="openstack/horizon-75644c8bb4-wrsmv" Sep 30 07:50:24 crc kubenswrapper[4760]: I0930 07:50:24.625626 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/8b39ba3e-25df-4a22-a1fe-f15e6ca1fada-horizon-tls-certs\") pod \"horizon-75644c8bb4-wrsmv\" (UID: \"8b39ba3e-25df-4a22-a1fe-f15e6ca1fada\") " pod="openstack/horizon-75644c8bb4-wrsmv" Sep 30 07:50:24 crc kubenswrapper[4760]: I0930 07:50:24.625658 4760 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/e28d9015-ea18-4da0-bfd8-c2cc5dec4f37-horizon-tls-certs\") pod \"horizon-694db87c64-qrwhp\" (UID: \"e28d9015-ea18-4da0-bfd8-c2cc5dec4f37\") " pod="openstack/horizon-694db87c64-qrwhp" Sep 30 07:50:24 crc kubenswrapper[4760]: I0930 07:50:24.625697 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8b39ba3e-25df-4a22-a1fe-f15e6ca1fada-combined-ca-bundle\") pod \"horizon-75644c8bb4-wrsmv\" (UID: \"8b39ba3e-25df-4a22-a1fe-f15e6ca1fada\") " pod="openstack/horizon-75644c8bb4-wrsmv" Sep 30 07:50:24 crc kubenswrapper[4760]: I0930 07:50:24.625716 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/8b39ba3e-25df-4a22-a1fe-f15e6ca1fada-horizon-secret-key\") pod \"horizon-75644c8bb4-wrsmv\" (UID: \"8b39ba3e-25df-4a22-a1fe-f15e6ca1fada\") " pod="openstack/horizon-75644c8bb4-wrsmv" Sep 30 07:50:24 crc kubenswrapper[4760]: I0930 07:50:24.625749 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8b39ba3e-25df-4a22-a1fe-f15e6ca1fada-logs\") pod \"horizon-75644c8bb4-wrsmv\" (UID: \"8b39ba3e-25df-4a22-a1fe-f15e6ca1fada\") " pod="openstack/horizon-75644c8bb4-wrsmv" Sep 30 07:50:24 crc kubenswrapper[4760]: I0930 07:50:24.625777 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pzls6\" (UniqueName: \"kubernetes.io/projected/e28d9015-ea18-4da0-bfd8-c2cc5dec4f37-kube-api-access-pzls6\") pod \"horizon-694db87c64-qrwhp\" (UID: \"e28d9015-ea18-4da0-bfd8-c2cc5dec4f37\") " pod="openstack/horizon-694db87c64-qrwhp" Sep 30 07:50:24 crc kubenswrapper[4760]: I0930 07:50:24.625795 4760 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vhppg\" (UniqueName: \"kubernetes.io/projected/8b39ba3e-25df-4a22-a1fe-f15e6ca1fada-kube-api-access-vhppg\") pod \"horizon-75644c8bb4-wrsmv\" (UID: \"8b39ba3e-25df-4a22-a1fe-f15e6ca1fada\") " pod="openstack/horizon-75644c8bb4-wrsmv" Sep 30 07:50:24 crc kubenswrapper[4760]: I0930 07:50:24.626041 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e28d9015-ea18-4da0-bfd8-c2cc5dec4f37-scripts\") pod \"horizon-694db87c64-qrwhp\" (UID: \"e28d9015-ea18-4da0-bfd8-c2cc5dec4f37\") " pod="openstack/horizon-694db87c64-qrwhp" Sep 30 07:50:24 crc kubenswrapper[4760]: I0930 07:50:24.627586 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e28d9015-ea18-4da0-bfd8-c2cc5dec4f37-logs\") pod \"horizon-694db87c64-qrwhp\" (UID: \"e28d9015-ea18-4da0-bfd8-c2cc5dec4f37\") " pod="openstack/horizon-694db87c64-qrwhp" Sep 30 07:50:24 crc kubenswrapper[4760]: I0930 07:50:24.628758 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/e28d9015-ea18-4da0-bfd8-c2cc5dec4f37-config-data\") pod \"horizon-694db87c64-qrwhp\" (UID: \"e28d9015-ea18-4da0-bfd8-c2cc5dec4f37\") " pod="openstack/horizon-694db87c64-qrwhp" Sep 30 07:50:24 crc kubenswrapper[4760]: I0930 07:50:24.630040 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/e28d9015-ea18-4da0-bfd8-c2cc5dec4f37-horizon-secret-key\") pod \"horizon-694db87c64-qrwhp\" (UID: \"e28d9015-ea18-4da0-bfd8-c2cc5dec4f37\") " pod="openstack/horizon-694db87c64-qrwhp" Sep 30 07:50:24 crc kubenswrapper[4760]: I0930 07:50:24.631446 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/e28d9015-ea18-4da0-bfd8-c2cc5dec4f37-horizon-tls-certs\") pod \"horizon-694db87c64-qrwhp\" (UID: \"e28d9015-ea18-4da0-bfd8-c2cc5dec4f37\") " pod="openstack/horizon-694db87c64-qrwhp" Sep 30 07:50:24 crc kubenswrapper[4760]: I0930 07:50:24.631955 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e28d9015-ea18-4da0-bfd8-c2cc5dec4f37-combined-ca-bundle\") pod \"horizon-694db87c64-qrwhp\" (UID: \"e28d9015-ea18-4da0-bfd8-c2cc5dec4f37\") " pod="openstack/horizon-694db87c64-qrwhp" Sep 30 07:50:24 crc kubenswrapper[4760]: I0930 07:50:24.645559 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pzls6\" (UniqueName: \"kubernetes.io/projected/e28d9015-ea18-4da0-bfd8-c2cc5dec4f37-kube-api-access-pzls6\") pod \"horizon-694db87c64-qrwhp\" (UID: \"e28d9015-ea18-4da0-bfd8-c2cc5dec4f37\") " pod="openstack/horizon-694db87c64-qrwhp" Sep 30 07:50:24 crc kubenswrapper[4760]: I0930 07:50:24.715035 4760 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-694db87c64-qrwhp" Sep 30 07:50:24 crc kubenswrapper[4760]: I0930 07:50:24.727008 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8b39ba3e-25df-4a22-a1fe-f15e6ca1fada-logs\") pod \"horizon-75644c8bb4-wrsmv\" (UID: \"8b39ba3e-25df-4a22-a1fe-f15e6ca1fada\") " pod="openstack/horizon-75644c8bb4-wrsmv" Sep 30 07:50:24 crc kubenswrapper[4760]: I0930 07:50:24.727082 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vhppg\" (UniqueName: \"kubernetes.io/projected/8b39ba3e-25df-4a22-a1fe-f15e6ca1fada-kube-api-access-vhppg\") pod \"horizon-75644c8bb4-wrsmv\" (UID: \"8b39ba3e-25df-4a22-a1fe-f15e6ca1fada\") " pod="openstack/horizon-75644c8bb4-wrsmv" Sep 30 07:50:24 crc kubenswrapper[4760]: I0930 07:50:24.727884 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/8b39ba3e-25df-4a22-a1fe-f15e6ca1fada-config-data\") pod \"horizon-75644c8bb4-wrsmv\" (UID: \"8b39ba3e-25df-4a22-a1fe-f15e6ca1fada\") " pod="openstack/horizon-75644c8bb4-wrsmv" Sep 30 07:50:24 crc kubenswrapper[4760]: I0930 07:50:24.727923 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8b39ba3e-25df-4a22-a1fe-f15e6ca1fada-scripts\") pod \"horizon-75644c8bb4-wrsmv\" (UID: \"8b39ba3e-25df-4a22-a1fe-f15e6ca1fada\") " pod="openstack/horizon-75644c8bb4-wrsmv" Sep 30 07:50:24 crc kubenswrapper[4760]: I0930 07:50:24.727944 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/8b39ba3e-25df-4a22-a1fe-f15e6ca1fada-horizon-tls-certs\") pod \"horizon-75644c8bb4-wrsmv\" (UID: \"8b39ba3e-25df-4a22-a1fe-f15e6ca1fada\") " pod="openstack/horizon-75644c8bb4-wrsmv" Sep 30 07:50:24 crc kubenswrapper[4760]: I0930 
07:50:24.727992 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8b39ba3e-25df-4a22-a1fe-f15e6ca1fada-combined-ca-bundle\") pod \"horizon-75644c8bb4-wrsmv\" (UID: \"8b39ba3e-25df-4a22-a1fe-f15e6ca1fada\") " pod="openstack/horizon-75644c8bb4-wrsmv" Sep 30 07:50:24 crc kubenswrapper[4760]: I0930 07:50:24.728012 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/8b39ba3e-25df-4a22-a1fe-f15e6ca1fada-horizon-secret-key\") pod \"horizon-75644c8bb4-wrsmv\" (UID: \"8b39ba3e-25df-4a22-a1fe-f15e6ca1fada\") " pod="openstack/horizon-75644c8bb4-wrsmv" Sep 30 07:50:24 crc kubenswrapper[4760]: I0930 07:50:24.727609 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8b39ba3e-25df-4a22-a1fe-f15e6ca1fada-logs\") pod \"horizon-75644c8bb4-wrsmv\" (UID: \"8b39ba3e-25df-4a22-a1fe-f15e6ca1fada\") " pod="openstack/horizon-75644c8bb4-wrsmv" Sep 30 07:50:24 crc kubenswrapper[4760]: I0930 07:50:24.728977 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8b39ba3e-25df-4a22-a1fe-f15e6ca1fada-scripts\") pod \"horizon-75644c8bb4-wrsmv\" (UID: \"8b39ba3e-25df-4a22-a1fe-f15e6ca1fada\") " pod="openstack/horizon-75644c8bb4-wrsmv" Sep 30 07:50:24 crc kubenswrapper[4760]: I0930 07:50:24.729594 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/8b39ba3e-25df-4a22-a1fe-f15e6ca1fada-config-data\") pod \"horizon-75644c8bb4-wrsmv\" (UID: \"8b39ba3e-25df-4a22-a1fe-f15e6ca1fada\") " pod="openstack/horizon-75644c8bb4-wrsmv" Sep 30 07:50:24 crc kubenswrapper[4760]: I0930 07:50:24.731324 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/8b39ba3e-25df-4a22-a1fe-f15e6ca1fada-horizon-tls-certs\") pod \"horizon-75644c8bb4-wrsmv\" (UID: \"8b39ba3e-25df-4a22-a1fe-f15e6ca1fada\") " pod="openstack/horizon-75644c8bb4-wrsmv" Sep 30 07:50:24 crc kubenswrapper[4760]: I0930 07:50:24.732172 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/8b39ba3e-25df-4a22-a1fe-f15e6ca1fada-horizon-secret-key\") pod \"horizon-75644c8bb4-wrsmv\" (UID: \"8b39ba3e-25df-4a22-a1fe-f15e6ca1fada\") " pod="openstack/horizon-75644c8bb4-wrsmv" Sep 30 07:50:24 crc kubenswrapper[4760]: I0930 07:50:24.733979 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8b39ba3e-25df-4a22-a1fe-f15e6ca1fada-combined-ca-bundle\") pod \"horizon-75644c8bb4-wrsmv\" (UID: \"8b39ba3e-25df-4a22-a1fe-f15e6ca1fada\") " pod="openstack/horizon-75644c8bb4-wrsmv" Sep 30 07:50:24 crc kubenswrapper[4760]: I0930 07:50:24.751784 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vhppg\" (UniqueName: \"kubernetes.io/projected/8b39ba3e-25df-4a22-a1fe-f15e6ca1fada-kube-api-access-vhppg\") pod \"horizon-75644c8bb4-wrsmv\" (UID: \"8b39ba3e-25df-4a22-a1fe-f15e6ca1fada\") " pod="openstack/horizon-75644c8bb4-wrsmv" Sep 30 07:50:24 crc kubenswrapper[4760]: I0930 07:50:24.775836 4760 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-75644c8bb4-wrsmv" Sep 30 07:50:26 crc kubenswrapper[4760]: I0930 07:50:26.049452 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-fcfdd6f9f-j5p4t" Sep 30 07:50:26 crc kubenswrapper[4760]: I0930 07:50:26.107431 4760 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6d5b6d6b67-qd6bn"] Sep 30 07:50:26 crc kubenswrapper[4760]: I0930 07:50:26.107703 4760 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-6d5b6d6b67-qd6bn" podUID="a39adebc-2e75-44a7-b8e2-edcd9631e8d3" containerName="dnsmasq-dns" containerID="cri-o://31453f880b5979ac04025f2ff809eb14cfb86d5d44a1f916d96f3fbcc9a7fc11" gracePeriod=10 Sep 30 07:50:26 crc kubenswrapper[4760]: I0930 07:50:26.424042 4760 generic.go:334] "Generic (PLEG): container finished" podID="a39adebc-2e75-44a7-b8e2-edcd9631e8d3" containerID="31453f880b5979ac04025f2ff809eb14cfb86d5d44a1f916d96f3fbcc9a7fc11" exitCode=0 Sep 30 07:50:26 crc kubenswrapper[4760]: I0930 07:50:26.424082 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6d5b6d6b67-qd6bn" event={"ID":"a39adebc-2e75-44a7-b8e2-edcd9631e8d3","Type":"ContainerDied","Data":"31453f880b5979ac04025f2ff809eb14cfb86d5d44a1f916d96f3fbcc9a7fc11"} Sep 30 07:50:27 crc kubenswrapper[4760]: I0930 07:50:27.303760 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-sync-f6j6q"] Sep 30 07:50:27 crc kubenswrapper[4760]: I0930 07:50:27.305118 4760 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-sync-f6j6q" Sep 30 07:50:27 crc kubenswrapper[4760]: I0930 07:50:27.307914 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-59dqn" Sep 30 07:50:27 crc kubenswrapper[4760]: I0930 07:50:27.308101 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Sep 30 07:50:27 crc kubenswrapper[4760]: I0930 07:50:27.317057 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-f6j6q"] Sep 30 07:50:27 crc kubenswrapper[4760]: I0930 07:50:27.491308 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f75np\" (UniqueName: \"kubernetes.io/projected/1df67641-4598-4ba5-a59a-a195084e5446-kube-api-access-f75np\") pod \"barbican-db-sync-f6j6q\" (UID: \"1df67641-4598-4ba5-a59a-a195084e5446\") " pod="openstack/barbican-db-sync-f6j6q" Sep 30 07:50:27 crc kubenswrapper[4760]: I0930 07:50:27.491365 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/1df67641-4598-4ba5-a59a-a195084e5446-db-sync-config-data\") pod \"barbican-db-sync-f6j6q\" (UID: \"1df67641-4598-4ba5-a59a-a195084e5446\") " pod="openstack/barbican-db-sync-f6j6q" Sep 30 07:50:27 crc kubenswrapper[4760]: I0930 07:50:27.491712 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1df67641-4598-4ba5-a59a-a195084e5446-combined-ca-bundle\") pod \"barbican-db-sync-f6j6q\" (UID: \"1df67641-4598-4ba5-a59a-a195084e5446\") " pod="openstack/barbican-db-sync-f6j6q" Sep 30 07:50:27 crc kubenswrapper[4760]: I0930 07:50:27.578376 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-sync-9ml2s"] Sep 30 07:50:27 crc kubenswrapper[4760]: I0930 07:50:27.579916 4760 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-9ml2s" Sep 30 07:50:27 crc kubenswrapper[4760]: I0930 07:50:27.588522 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-9ml2s"] Sep 30 07:50:27 crc kubenswrapper[4760]: I0930 07:50:27.589679 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Sep 30 07:50:27 crc kubenswrapper[4760]: I0930 07:50:27.590075 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-9sl2d" Sep 30 07:50:27 crc kubenswrapper[4760]: I0930 07:50:27.590811 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Sep 30 07:50:27 crc kubenswrapper[4760]: I0930 07:50:27.593166 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1df67641-4598-4ba5-a59a-a195084e5446-combined-ca-bundle\") pod \"barbican-db-sync-f6j6q\" (UID: \"1df67641-4598-4ba5-a59a-a195084e5446\") " pod="openstack/barbican-db-sync-f6j6q" Sep 30 07:50:27 crc kubenswrapper[4760]: I0930 07:50:27.593279 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f75np\" (UniqueName: \"kubernetes.io/projected/1df67641-4598-4ba5-a59a-a195084e5446-kube-api-access-f75np\") pod \"barbican-db-sync-f6j6q\" (UID: \"1df67641-4598-4ba5-a59a-a195084e5446\") " pod="openstack/barbican-db-sync-f6j6q" Sep 30 07:50:27 crc kubenswrapper[4760]: I0930 07:50:27.593354 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/1df67641-4598-4ba5-a59a-a195084e5446-db-sync-config-data\") pod \"barbican-db-sync-f6j6q\" (UID: \"1df67641-4598-4ba5-a59a-a195084e5446\") " pod="openstack/barbican-db-sync-f6j6q" Sep 30 07:50:27 crc kubenswrapper[4760]: I0930 07:50:27.611434 4760 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f75np\" (UniqueName: \"kubernetes.io/projected/1df67641-4598-4ba5-a59a-a195084e5446-kube-api-access-f75np\") pod \"barbican-db-sync-f6j6q\" (UID: \"1df67641-4598-4ba5-a59a-a195084e5446\") " pod="openstack/barbican-db-sync-f6j6q" Sep 30 07:50:27 crc kubenswrapper[4760]: I0930 07:50:27.611761 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1df67641-4598-4ba5-a59a-a195084e5446-combined-ca-bundle\") pod \"barbican-db-sync-f6j6q\" (UID: \"1df67641-4598-4ba5-a59a-a195084e5446\") " pod="openstack/barbican-db-sync-f6j6q" Sep 30 07:50:27 crc kubenswrapper[4760]: I0930 07:50:27.613708 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/1df67641-4598-4ba5-a59a-a195084e5446-db-sync-config-data\") pod \"barbican-db-sync-f6j6q\" (UID: \"1df67641-4598-4ba5-a59a-a195084e5446\") " pod="openstack/barbican-db-sync-f6j6q" Sep 30 07:50:27 crc kubenswrapper[4760]: I0930 07:50:27.632332 4760 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-sync-f6j6q" Sep 30 07:50:27 crc kubenswrapper[4760]: I0930 07:50:27.695677 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4a2c7f25-ec97-4cb0-8475-b7b8d4be47d0-config-data\") pod \"cinder-db-sync-9ml2s\" (UID: \"4a2c7f25-ec97-4cb0-8475-b7b8d4be47d0\") " pod="openstack/cinder-db-sync-9ml2s" Sep 30 07:50:27 crc kubenswrapper[4760]: I0930 07:50:27.695791 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4a2c7f25-ec97-4cb0-8475-b7b8d4be47d0-scripts\") pod \"cinder-db-sync-9ml2s\" (UID: \"4a2c7f25-ec97-4cb0-8475-b7b8d4be47d0\") " pod="openstack/cinder-db-sync-9ml2s" Sep 30 07:50:27 crc kubenswrapper[4760]: I0930 07:50:27.695881 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/4a2c7f25-ec97-4cb0-8475-b7b8d4be47d0-etc-machine-id\") pod \"cinder-db-sync-9ml2s\" (UID: \"4a2c7f25-ec97-4cb0-8475-b7b8d4be47d0\") " pod="openstack/cinder-db-sync-9ml2s" Sep 30 07:50:27 crc kubenswrapper[4760]: I0930 07:50:27.696005 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/4a2c7f25-ec97-4cb0-8475-b7b8d4be47d0-db-sync-config-data\") pod \"cinder-db-sync-9ml2s\" (UID: \"4a2c7f25-ec97-4cb0-8475-b7b8d4be47d0\") " pod="openstack/cinder-db-sync-9ml2s" Sep 30 07:50:27 crc kubenswrapper[4760]: I0930 07:50:27.696031 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4a2c7f25-ec97-4cb0-8475-b7b8d4be47d0-combined-ca-bundle\") pod \"cinder-db-sync-9ml2s\" (UID: \"4a2c7f25-ec97-4cb0-8475-b7b8d4be47d0\") " 
pod="openstack/cinder-db-sync-9ml2s" Sep 30 07:50:27 crc kubenswrapper[4760]: I0930 07:50:27.696073 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5tz96\" (UniqueName: \"kubernetes.io/projected/4a2c7f25-ec97-4cb0-8475-b7b8d4be47d0-kube-api-access-5tz96\") pod \"cinder-db-sync-9ml2s\" (UID: \"4a2c7f25-ec97-4cb0-8475-b7b8d4be47d0\") " pod="openstack/cinder-db-sync-9ml2s" Sep 30 07:50:27 crc kubenswrapper[4760]: I0930 07:50:27.801661 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/4a2c7f25-ec97-4cb0-8475-b7b8d4be47d0-db-sync-config-data\") pod \"cinder-db-sync-9ml2s\" (UID: \"4a2c7f25-ec97-4cb0-8475-b7b8d4be47d0\") " pod="openstack/cinder-db-sync-9ml2s" Sep 30 07:50:27 crc kubenswrapper[4760]: I0930 07:50:27.801709 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4a2c7f25-ec97-4cb0-8475-b7b8d4be47d0-combined-ca-bundle\") pod \"cinder-db-sync-9ml2s\" (UID: \"4a2c7f25-ec97-4cb0-8475-b7b8d4be47d0\") " pod="openstack/cinder-db-sync-9ml2s" Sep 30 07:50:27 crc kubenswrapper[4760]: I0930 07:50:27.801783 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5tz96\" (UniqueName: \"kubernetes.io/projected/4a2c7f25-ec97-4cb0-8475-b7b8d4be47d0-kube-api-access-5tz96\") pod \"cinder-db-sync-9ml2s\" (UID: \"4a2c7f25-ec97-4cb0-8475-b7b8d4be47d0\") " pod="openstack/cinder-db-sync-9ml2s" Sep 30 07:50:27 crc kubenswrapper[4760]: I0930 07:50:27.801867 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4a2c7f25-ec97-4cb0-8475-b7b8d4be47d0-config-data\") pod \"cinder-db-sync-9ml2s\" (UID: \"4a2c7f25-ec97-4cb0-8475-b7b8d4be47d0\") " pod="openstack/cinder-db-sync-9ml2s" Sep 30 07:50:27 crc kubenswrapper[4760]: I0930 
07:50:27.801931 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4a2c7f25-ec97-4cb0-8475-b7b8d4be47d0-scripts\") pod \"cinder-db-sync-9ml2s\" (UID: \"4a2c7f25-ec97-4cb0-8475-b7b8d4be47d0\") " pod="openstack/cinder-db-sync-9ml2s" Sep 30 07:50:27 crc kubenswrapper[4760]: I0930 07:50:27.802015 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/4a2c7f25-ec97-4cb0-8475-b7b8d4be47d0-etc-machine-id\") pod \"cinder-db-sync-9ml2s\" (UID: \"4a2c7f25-ec97-4cb0-8475-b7b8d4be47d0\") " pod="openstack/cinder-db-sync-9ml2s" Sep 30 07:50:27 crc kubenswrapper[4760]: I0930 07:50:27.802106 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/4a2c7f25-ec97-4cb0-8475-b7b8d4be47d0-etc-machine-id\") pod \"cinder-db-sync-9ml2s\" (UID: \"4a2c7f25-ec97-4cb0-8475-b7b8d4be47d0\") " pod="openstack/cinder-db-sync-9ml2s" Sep 30 07:50:27 crc kubenswrapper[4760]: I0930 07:50:27.808254 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/4a2c7f25-ec97-4cb0-8475-b7b8d4be47d0-db-sync-config-data\") pod \"cinder-db-sync-9ml2s\" (UID: \"4a2c7f25-ec97-4cb0-8475-b7b8d4be47d0\") " pod="openstack/cinder-db-sync-9ml2s" Sep 30 07:50:27 crc kubenswrapper[4760]: I0930 07:50:27.808772 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4a2c7f25-ec97-4cb0-8475-b7b8d4be47d0-combined-ca-bundle\") pod \"cinder-db-sync-9ml2s\" (UID: \"4a2c7f25-ec97-4cb0-8475-b7b8d4be47d0\") " pod="openstack/cinder-db-sync-9ml2s" Sep 30 07:50:27 crc kubenswrapper[4760]: I0930 07:50:27.810322 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/4a2c7f25-ec97-4cb0-8475-b7b8d4be47d0-config-data\") pod \"cinder-db-sync-9ml2s\" (UID: \"4a2c7f25-ec97-4cb0-8475-b7b8d4be47d0\") " pod="openstack/cinder-db-sync-9ml2s" Sep 30 07:50:27 crc kubenswrapper[4760]: I0930 07:50:27.824789 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4a2c7f25-ec97-4cb0-8475-b7b8d4be47d0-scripts\") pod \"cinder-db-sync-9ml2s\" (UID: \"4a2c7f25-ec97-4cb0-8475-b7b8d4be47d0\") " pod="openstack/cinder-db-sync-9ml2s" Sep 30 07:50:27 crc kubenswrapper[4760]: I0930 07:50:27.828847 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5tz96\" (UniqueName: \"kubernetes.io/projected/4a2c7f25-ec97-4cb0-8475-b7b8d4be47d0-kube-api-access-5tz96\") pod \"cinder-db-sync-9ml2s\" (UID: \"4a2c7f25-ec97-4cb0-8475-b7b8d4be47d0\") " pod="openstack/cinder-db-sync-9ml2s" Sep 30 07:50:27 crc kubenswrapper[4760]: I0930 07:50:27.833858 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-sync-8vvnb"] Sep 30 07:50:27 crc kubenswrapper[4760]: I0930 07:50:27.835120 4760 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-sync-8vvnb" Sep 30 07:50:27 crc kubenswrapper[4760]: I0930 07:50:27.842989 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Sep 30 07:50:27 crc kubenswrapper[4760]: I0930 07:50:27.843194 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Sep 30 07:50:27 crc kubenswrapper[4760]: I0930 07:50:27.843321 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-mscbd" Sep 30 07:50:27 crc kubenswrapper[4760]: I0930 07:50:27.863504 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-8vvnb"] Sep 30 07:50:28 crc kubenswrapper[4760]: I0930 07:50:28.006015 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-9ml2s" Sep 30 07:50:28 crc kubenswrapper[4760]: I0930 07:50:28.006061 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pq6fn\" (UniqueName: \"kubernetes.io/projected/108a4c03-5bd3-45d0-a13d-b67e01bd7654-kube-api-access-pq6fn\") pod \"neutron-db-sync-8vvnb\" (UID: \"108a4c03-5bd3-45d0-a13d-b67e01bd7654\") " pod="openstack/neutron-db-sync-8vvnb" Sep 30 07:50:28 crc kubenswrapper[4760]: I0930 07:50:28.006484 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/108a4c03-5bd3-45d0-a13d-b67e01bd7654-combined-ca-bundle\") pod \"neutron-db-sync-8vvnb\" (UID: \"108a4c03-5bd3-45d0-a13d-b67e01bd7654\") " pod="openstack/neutron-db-sync-8vvnb" Sep 30 07:50:28 crc kubenswrapper[4760]: I0930 07:50:28.006523 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/108a4c03-5bd3-45d0-a13d-b67e01bd7654-config\") pod \"neutron-db-sync-8vvnb\" (UID: 
\"108a4c03-5bd3-45d0-a13d-b67e01bd7654\") " pod="openstack/neutron-db-sync-8vvnb" Sep 30 07:50:28 crc kubenswrapper[4760]: I0930 07:50:28.108798 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pq6fn\" (UniqueName: \"kubernetes.io/projected/108a4c03-5bd3-45d0-a13d-b67e01bd7654-kube-api-access-pq6fn\") pod \"neutron-db-sync-8vvnb\" (UID: \"108a4c03-5bd3-45d0-a13d-b67e01bd7654\") " pod="openstack/neutron-db-sync-8vvnb" Sep 30 07:50:28 crc kubenswrapper[4760]: I0930 07:50:28.108958 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/108a4c03-5bd3-45d0-a13d-b67e01bd7654-combined-ca-bundle\") pod \"neutron-db-sync-8vvnb\" (UID: \"108a4c03-5bd3-45d0-a13d-b67e01bd7654\") " pod="openstack/neutron-db-sync-8vvnb" Sep 30 07:50:28 crc kubenswrapper[4760]: I0930 07:50:28.109042 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/108a4c03-5bd3-45d0-a13d-b67e01bd7654-config\") pod \"neutron-db-sync-8vvnb\" (UID: \"108a4c03-5bd3-45d0-a13d-b67e01bd7654\") " pod="openstack/neutron-db-sync-8vvnb" Sep 30 07:50:28 crc kubenswrapper[4760]: I0930 07:50:28.117396 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/108a4c03-5bd3-45d0-a13d-b67e01bd7654-config\") pod \"neutron-db-sync-8vvnb\" (UID: \"108a4c03-5bd3-45d0-a13d-b67e01bd7654\") " pod="openstack/neutron-db-sync-8vvnb" Sep 30 07:50:28 crc kubenswrapper[4760]: I0930 07:50:28.119350 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/108a4c03-5bd3-45d0-a13d-b67e01bd7654-combined-ca-bundle\") pod \"neutron-db-sync-8vvnb\" (UID: \"108a4c03-5bd3-45d0-a13d-b67e01bd7654\") " pod="openstack/neutron-db-sync-8vvnb" Sep 30 07:50:28 crc kubenswrapper[4760]: I0930 07:50:28.127046 4760 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pq6fn\" (UniqueName: \"kubernetes.io/projected/108a4c03-5bd3-45d0-a13d-b67e01bd7654-kube-api-access-pq6fn\") pod \"neutron-db-sync-8vvnb\" (UID: \"108a4c03-5bd3-45d0-a13d-b67e01bd7654\") " pod="openstack/neutron-db-sync-8vvnb" Sep 30 07:50:28 crc kubenswrapper[4760]: I0930 07:50:28.149112 4760 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-6d5b6d6b67-qd6bn" podUID="a39adebc-2e75-44a7-b8e2-edcd9631e8d3" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.137:5353: connect: connection refused" Sep 30 07:50:28 crc kubenswrapper[4760]: I0930 07:50:28.210958 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-8vvnb" Sep 30 07:50:28 crc kubenswrapper[4760]: I0930 07:50:28.607154 4760 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-kbxt8" Sep 30 07:50:28 crc kubenswrapper[4760]: I0930 07:50:28.719751 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e00788e1-5a38-4884-8884-1be4c2ceca22-config-data\") pod \"e00788e1-5a38-4884-8884-1be4c2ceca22\" (UID: \"e00788e1-5a38-4884-8884-1be4c2ceca22\") " Sep 30 07:50:28 crc kubenswrapper[4760]: I0930 07:50:28.719854 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/e00788e1-5a38-4884-8884-1be4c2ceca22-credential-keys\") pod \"e00788e1-5a38-4884-8884-1be4c2ceca22\" (UID: \"e00788e1-5a38-4884-8884-1be4c2ceca22\") " Sep 30 07:50:28 crc kubenswrapper[4760]: I0930 07:50:28.720142 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2t6ls\" (UniqueName: \"kubernetes.io/projected/e00788e1-5a38-4884-8884-1be4c2ceca22-kube-api-access-2t6ls\") pod 
\"e00788e1-5a38-4884-8884-1be4c2ceca22\" (UID: \"e00788e1-5a38-4884-8884-1be4c2ceca22\") " Sep 30 07:50:28 crc kubenswrapper[4760]: I0930 07:50:28.720260 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e00788e1-5a38-4884-8884-1be4c2ceca22-combined-ca-bundle\") pod \"e00788e1-5a38-4884-8884-1be4c2ceca22\" (UID: \"e00788e1-5a38-4884-8884-1be4c2ceca22\") " Sep 30 07:50:28 crc kubenswrapper[4760]: I0930 07:50:28.720335 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e00788e1-5a38-4884-8884-1be4c2ceca22-scripts\") pod \"e00788e1-5a38-4884-8884-1be4c2ceca22\" (UID: \"e00788e1-5a38-4884-8884-1be4c2ceca22\") " Sep 30 07:50:28 crc kubenswrapper[4760]: I0930 07:50:28.720372 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/e00788e1-5a38-4884-8884-1be4c2ceca22-fernet-keys\") pod \"e00788e1-5a38-4884-8884-1be4c2ceca22\" (UID: \"e00788e1-5a38-4884-8884-1be4c2ceca22\") " Sep 30 07:50:28 crc kubenswrapper[4760]: I0930 07:50:28.724671 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e00788e1-5a38-4884-8884-1be4c2ceca22-scripts" (OuterVolumeSpecName: "scripts") pod "e00788e1-5a38-4884-8884-1be4c2ceca22" (UID: "e00788e1-5a38-4884-8884-1be4c2ceca22"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 07:50:28 crc kubenswrapper[4760]: I0930 07:50:28.724967 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e00788e1-5a38-4884-8884-1be4c2ceca22-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "e00788e1-5a38-4884-8884-1be4c2ceca22" (UID: "e00788e1-5a38-4884-8884-1be4c2ceca22"). InnerVolumeSpecName "credential-keys". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 07:50:28 crc kubenswrapper[4760]: I0930 07:50:28.725479 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e00788e1-5a38-4884-8884-1be4c2ceca22-kube-api-access-2t6ls" (OuterVolumeSpecName: "kube-api-access-2t6ls") pod "e00788e1-5a38-4884-8884-1be4c2ceca22" (UID: "e00788e1-5a38-4884-8884-1be4c2ceca22"). InnerVolumeSpecName "kube-api-access-2t6ls". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 07:50:28 crc kubenswrapper[4760]: I0930 07:50:28.727062 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e00788e1-5a38-4884-8884-1be4c2ceca22-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "e00788e1-5a38-4884-8884-1be4c2ceca22" (UID: "e00788e1-5a38-4884-8884-1be4c2ceca22"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 07:50:28 crc kubenswrapper[4760]: I0930 07:50:28.754914 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e00788e1-5a38-4884-8884-1be4c2ceca22-config-data" (OuterVolumeSpecName: "config-data") pod "e00788e1-5a38-4884-8884-1be4c2ceca22" (UID: "e00788e1-5a38-4884-8884-1be4c2ceca22"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 07:50:28 crc kubenswrapper[4760]: I0930 07:50:28.772605 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e00788e1-5a38-4884-8884-1be4c2ceca22-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e00788e1-5a38-4884-8884-1be4c2ceca22" (UID: "e00788e1-5a38-4884-8884-1be4c2ceca22"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 07:50:28 crc kubenswrapper[4760]: I0930 07:50:28.823587 4760 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2t6ls\" (UniqueName: \"kubernetes.io/projected/e00788e1-5a38-4884-8884-1be4c2ceca22-kube-api-access-2t6ls\") on node \"crc\" DevicePath \"\"" Sep 30 07:50:28 crc kubenswrapper[4760]: I0930 07:50:28.823636 4760 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e00788e1-5a38-4884-8884-1be4c2ceca22-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 07:50:28 crc kubenswrapper[4760]: I0930 07:50:28.823657 4760 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e00788e1-5a38-4884-8884-1be4c2ceca22-scripts\") on node \"crc\" DevicePath \"\"" Sep 30 07:50:28 crc kubenswrapper[4760]: I0930 07:50:28.823674 4760 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/e00788e1-5a38-4884-8884-1be4c2ceca22-fernet-keys\") on node \"crc\" DevicePath \"\"" Sep 30 07:50:28 crc kubenswrapper[4760]: I0930 07:50:28.823691 4760 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e00788e1-5a38-4884-8884-1be4c2ceca22-config-data\") on node \"crc\" DevicePath \"\"" Sep 30 07:50:28 crc kubenswrapper[4760]: I0930 07:50:28.823708 4760 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/e00788e1-5a38-4884-8884-1be4c2ceca22-credential-keys\") on node \"crc\" DevicePath \"\"" Sep 30 07:50:29 crc kubenswrapper[4760]: I0930 07:50:29.462557 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-kbxt8" event={"ID":"e00788e1-5a38-4884-8884-1be4c2ceca22","Type":"ContainerDied","Data":"b342b1719847c9e5aa478f7082d28ca03eb5a63858d30d41a1095edef6c76899"} Sep 30 07:50:29 crc kubenswrapper[4760]: I0930 
07:50:29.462874 4760 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b342b1719847c9e5aa478f7082d28ca03eb5a63858d30d41a1095edef6c76899" Sep 30 07:50:29 crc kubenswrapper[4760]: I0930 07:50:29.462575 4760 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-kbxt8" Sep 30 07:50:29 crc kubenswrapper[4760]: I0930 07:50:29.722459 4760 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-kbxt8"] Sep 30 07:50:29 crc kubenswrapper[4760]: I0930 07:50:29.730655 4760 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-kbxt8"] Sep 30 07:50:29 crc kubenswrapper[4760]: I0930 07:50:29.819918 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-6g2pm"] Sep 30 07:50:29 crc kubenswrapper[4760]: E0930 07:50:29.820288 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e00788e1-5a38-4884-8884-1be4c2ceca22" containerName="keystone-bootstrap" Sep 30 07:50:29 crc kubenswrapper[4760]: I0930 07:50:29.820320 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="e00788e1-5a38-4884-8884-1be4c2ceca22" containerName="keystone-bootstrap" Sep 30 07:50:29 crc kubenswrapper[4760]: I0930 07:50:29.820513 4760 memory_manager.go:354] "RemoveStaleState removing state" podUID="e00788e1-5a38-4884-8884-1be4c2ceca22" containerName="keystone-bootstrap" Sep 30 07:50:29 crc kubenswrapper[4760]: I0930 07:50:29.821093 4760 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-6g2pm" Sep 30 07:50:29 crc kubenswrapper[4760]: I0930 07:50:29.825547 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Sep 30 07:50:29 crc kubenswrapper[4760]: I0930 07:50:29.826624 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Sep 30 07:50:29 crc kubenswrapper[4760]: I0930 07:50:29.827234 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Sep 30 07:50:29 crc kubenswrapper[4760]: I0930 07:50:29.828097 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-gx9h7" Sep 30 07:50:29 crc kubenswrapper[4760]: I0930 07:50:29.836507 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-6g2pm"] Sep 30 07:50:29 crc kubenswrapper[4760]: I0930 07:50:29.946222 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e8bc0e9c-dde5-4aa9-b199-0ad1e4e3d40c-combined-ca-bundle\") pod \"keystone-bootstrap-6g2pm\" (UID: \"e8bc0e9c-dde5-4aa9-b199-0ad1e4e3d40c\") " pod="openstack/keystone-bootstrap-6g2pm" Sep 30 07:50:29 crc kubenswrapper[4760]: I0930 07:50:29.946383 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e8bc0e9c-dde5-4aa9-b199-0ad1e4e3d40c-config-data\") pod \"keystone-bootstrap-6g2pm\" (UID: \"e8bc0e9c-dde5-4aa9-b199-0ad1e4e3d40c\") " pod="openstack/keystone-bootstrap-6g2pm" Sep 30 07:50:29 crc kubenswrapper[4760]: I0930 07:50:29.946416 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/e8bc0e9c-dde5-4aa9-b199-0ad1e4e3d40c-credential-keys\") pod \"keystone-bootstrap-6g2pm\" (UID: 
\"e8bc0e9c-dde5-4aa9-b199-0ad1e4e3d40c\") " pod="openstack/keystone-bootstrap-6g2pm" Sep 30 07:50:29 crc kubenswrapper[4760]: I0930 07:50:29.946471 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e8bc0e9c-dde5-4aa9-b199-0ad1e4e3d40c-scripts\") pod \"keystone-bootstrap-6g2pm\" (UID: \"e8bc0e9c-dde5-4aa9-b199-0ad1e4e3d40c\") " pod="openstack/keystone-bootstrap-6g2pm" Sep 30 07:50:29 crc kubenswrapper[4760]: I0930 07:50:29.946502 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7vf4z\" (UniqueName: \"kubernetes.io/projected/e8bc0e9c-dde5-4aa9-b199-0ad1e4e3d40c-kube-api-access-7vf4z\") pod \"keystone-bootstrap-6g2pm\" (UID: \"e8bc0e9c-dde5-4aa9-b199-0ad1e4e3d40c\") " pod="openstack/keystone-bootstrap-6g2pm" Sep 30 07:50:29 crc kubenswrapper[4760]: I0930 07:50:29.946650 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/e8bc0e9c-dde5-4aa9-b199-0ad1e4e3d40c-fernet-keys\") pod \"keystone-bootstrap-6g2pm\" (UID: \"e8bc0e9c-dde5-4aa9-b199-0ad1e4e3d40c\") " pod="openstack/keystone-bootstrap-6g2pm" Sep 30 07:50:30 crc kubenswrapper[4760]: I0930 07:50:30.049118 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e8bc0e9c-dde5-4aa9-b199-0ad1e4e3d40c-config-data\") pod \"keystone-bootstrap-6g2pm\" (UID: \"e8bc0e9c-dde5-4aa9-b199-0ad1e4e3d40c\") " pod="openstack/keystone-bootstrap-6g2pm" Sep 30 07:50:30 crc kubenswrapper[4760]: I0930 07:50:30.049172 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/e8bc0e9c-dde5-4aa9-b199-0ad1e4e3d40c-credential-keys\") pod \"keystone-bootstrap-6g2pm\" (UID: \"e8bc0e9c-dde5-4aa9-b199-0ad1e4e3d40c\") " 
pod="openstack/keystone-bootstrap-6g2pm" Sep 30 07:50:30 crc kubenswrapper[4760]: I0930 07:50:30.049222 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e8bc0e9c-dde5-4aa9-b199-0ad1e4e3d40c-scripts\") pod \"keystone-bootstrap-6g2pm\" (UID: \"e8bc0e9c-dde5-4aa9-b199-0ad1e4e3d40c\") " pod="openstack/keystone-bootstrap-6g2pm" Sep 30 07:50:30 crc kubenswrapper[4760]: I0930 07:50:30.049262 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7vf4z\" (UniqueName: \"kubernetes.io/projected/e8bc0e9c-dde5-4aa9-b199-0ad1e4e3d40c-kube-api-access-7vf4z\") pod \"keystone-bootstrap-6g2pm\" (UID: \"e8bc0e9c-dde5-4aa9-b199-0ad1e4e3d40c\") " pod="openstack/keystone-bootstrap-6g2pm" Sep 30 07:50:30 crc kubenswrapper[4760]: I0930 07:50:30.049342 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/e8bc0e9c-dde5-4aa9-b199-0ad1e4e3d40c-fernet-keys\") pod \"keystone-bootstrap-6g2pm\" (UID: \"e8bc0e9c-dde5-4aa9-b199-0ad1e4e3d40c\") " pod="openstack/keystone-bootstrap-6g2pm" Sep 30 07:50:30 crc kubenswrapper[4760]: I0930 07:50:30.049428 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e8bc0e9c-dde5-4aa9-b199-0ad1e4e3d40c-combined-ca-bundle\") pod \"keystone-bootstrap-6g2pm\" (UID: \"e8bc0e9c-dde5-4aa9-b199-0ad1e4e3d40c\") " pod="openstack/keystone-bootstrap-6g2pm" Sep 30 07:50:30 crc kubenswrapper[4760]: I0930 07:50:30.055794 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e8bc0e9c-dde5-4aa9-b199-0ad1e4e3d40c-config-data\") pod \"keystone-bootstrap-6g2pm\" (UID: \"e8bc0e9c-dde5-4aa9-b199-0ad1e4e3d40c\") " pod="openstack/keystone-bootstrap-6g2pm" Sep 30 07:50:30 crc kubenswrapper[4760]: I0930 07:50:30.056468 4760 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/e8bc0e9c-dde5-4aa9-b199-0ad1e4e3d40c-fernet-keys\") pod \"keystone-bootstrap-6g2pm\" (UID: \"e8bc0e9c-dde5-4aa9-b199-0ad1e4e3d40c\") " pod="openstack/keystone-bootstrap-6g2pm" Sep 30 07:50:30 crc kubenswrapper[4760]: I0930 07:50:30.057174 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e8bc0e9c-dde5-4aa9-b199-0ad1e4e3d40c-combined-ca-bundle\") pod \"keystone-bootstrap-6g2pm\" (UID: \"e8bc0e9c-dde5-4aa9-b199-0ad1e4e3d40c\") " pod="openstack/keystone-bootstrap-6g2pm" Sep 30 07:50:30 crc kubenswrapper[4760]: I0930 07:50:30.058816 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e8bc0e9c-dde5-4aa9-b199-0ad1e4e3d40c-scripts\") pod \"keystone-bootstrap-6g2pm\" (UID: \"e8bc0e9c-dde5-4aa9-b199-0ad1e4e3d40c\") " pod="openstack/keystone-bootstrap-6g2pm" Sep 30 07:50:30 crc kubenswrapper[4760]: I0930 07:50:30.069024 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/e8bc0e9c-dde5-4aa9-b199-0ad1e4e3d40c-credential-keys\") pod \"keystone-bootstrap-6g2pm\" (UID: \"e8bc0e9c-dde5-4aa9-b199-0ad1e4e3d40c\") " pod="openstack/keystone-bootstrap-6g2pm" Sep 30 07:50:30 crc kubenswrapper[4760]: I0930 07:50:30.071848 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7vf4z\" (UniqueName: \"kubernetes.io/projected/e8bc0e9c-dde5-4aa9-b199-0ad1e4e3d40c-kube-api-access-7vf4z\") pod \"keystone-bootstrap-6g2pm\" (UID: \"e8bc0e9c-dde5-4aa9-b199-0ad1e4e3d40c\") " pod="openstack/keystone-bootstrap-6g2pm" Sep 30 07:50:30 crc kubenswrapper[4760]: I0930 07:50:30.146668 4760 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-6g2pm" Sep 30 07:50:31 crc kubenswrapper[4760]: I0930 07:50:31.079073 4760 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e00788e1-5a38-4884-8884-1be4c2ceca22" path="/var/lib/kubelet/pods/e00788e1-5a38-4884-8884-1be4c2ceca22/volumes" Sep 30 07:50:31 crc kubenswrapper[4760]: E0930 07:50:31.556184 4760 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-horizon:current-podified" Sep 30 07:50:31 crc kubenswrapper[4760]: E0930 07:50:31.556421 4760 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:horizon-log,Image:quay.io/podified-antelope-centos9/openstack-horizon:current-podified,Command:[/bin/bash],Args:[-c tail -n+1 -F /var/log/horizon/horizon.log],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n568h574h68h6h647h67fhfdh664h6h4h698hcbh675h76h6fh5d4hcch6fh5bhb4h545hb4h68bh568h54fh5f7hf6h67bhddhfh646h5b5q,ValueFrom:nil,},EnvVar{Name:ENABLE_DESIGNATE,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_HEAT,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_IRONIC,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_MANILA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_OCTAVIA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_WATCHER,Value:yes,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},EnvVar{Name:UNPACK_THEME,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:logs,ReadOnly:false,MountPath:/var/log/horizon,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-9g8dc,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessPro
be:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*48,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*42400,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod horizon-746f884dcc-8lkjw_openstack(f8f42596-0ba9-41d9-a780-b8b3705b963c): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Sep 30 07:50:31 crc kubenswrapper[4760]: I0930 07:50:31.681039 4760 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-s45t8" Sep 30 07:50:31 crc kubenswrapper[4760]: I0930 07:50:31.786635 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/09c9e208-48cc-44b8-9810-fc0cf69cea8a-combined-ca-bundle\") pod \"09c9e208-48cc-44b8-9810-fc0cf69cea8a\" (UID: \"09c9e208-48cc-44b8-9810-fc0cf69cea8a\") " Sep 30 07:50:31 crc kubenswrapper[4760]: I0930 07:50:31.786700 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/09c9e208-48cc-44b8-9810-fc0cf69cea8a-db-sync-config-data\") pod \"09c9e208-48cc-44b8-9810-fc0cf69cea8a\" (UID: \"09c9e208-48cc-44b8-9810-fc0cf69cea8a\") " Sep 30 07:50:31 crc kubenswrapper[4760]: I0930 07:50:31.786758 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dwqgz\" (UniqueName: \"kubernetes.io/projected/09c9e208-48cc-44b8-9810-fc0cf69cea8a-kube-api-access-dwqgz\") pod 
\"09c9e208-48cc-44b8-9810-fc0cf69cea8a\" (UID: \"09c9e208-48cc-44b8-9810-fc0cf69cea8a\") " Sep 30 07:50:31 crc kubenswrapper[4760]: I0930 07:50:31.786848 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/09c9e208-48cc-44b8-9810-fc0cf69cea8a-config-data\") pod \"09c9e208-48cc-44b8-9810-fc0cf69cea8a\" (UID: \"09c9e208-48cc-44b8-9810-fc0cf69cea8a\") " Sep 30 07:50:31 crc kubenswrapper[4760]: I0930 07:50:31.793740 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09c9e208-48cc-44b8-9810-fc0cf69cea8a-kube-api-access-dwqgz" (OuterVolumeSpecName: "kube-api-access-dwqgz") pod "09c9e208-48cc-44b8-9810-fc0cf69cea8a" (UID: "09c9e208-48cc-44b8-9810-fc0cf69cea8a"). InnerVolumeSpecName "kube-api-access-dwqgz". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 07:50:31 crc kubenswrapper[4760]: I0930 07:50:31.793754 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09c9e208-48cc-44b8-9810-fc0cf69cea8a-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "09c9e208-48cc-44b8-9810-fc0cf69cea8a" (UID: "09c9e208-48cc-44b8-9810-fc0cf69cea8a"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 07:50:31 crc kubenswrapper[4760]: I0930 07:50:31.829747 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09c9e208-48cc-44b8-9810-fc0cf69cea8a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "09c9e208-48cc-44b8-9810-fc0cf69cea8a" (UID: "09c9e208-48cc-44b8-9810-fc0cf69cea8a"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 07:50:31 crc kubenswrapper[4760]: I0930 07:50:31.852944 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09c9e208-48cc-44b8-9810-fc0cf69cea8a-config-data" (OuterVolumeSpecName: "config-data") pod "09c9e208-48cc-44b8-9810-fc0cf69cea8a" (UID: "09c9e208-48cc-44b8-9810-fc0cf69cea8a"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 07:50:31 crc kubenswrapper[4760]: I0930 07:50:31.889003 4760 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/09c9e208-48cc-44b8-9810-fc0cf69cea8a-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Sep 30 07:50:31 crc kubenswrapper[4760]: I0930 07:50:31.889031 4760 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dwqgz\" (UniqueName: \"kubernetes.io/projected/09c9e208-48cc-44b8-9810-fc0cf69cea8a-kube-api-access-dwqgz\") on node \"crc\" DevicePath \"\"" Sep 30 07:50:31 crc kubenswrapper[4760]: I0930 07:50:31.889041 4760 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/09c9e208-48cc-44b8-9810-fc0cf69cea8a-config-data\") on node \"crc\" DevicePath \"\"" Sep 30 07:50:31 crc kubenswrapper[4760]: I0930 07:50:31.889050 4760 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/09c9e208-48cc-44b8-9810-fc0cf69cea8a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 07:50:32 crc kubenswrapper[4760]: E0930 07:50:32.051819 4760 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-ceilometer-central:current-podified" Sep 30 07:50:32 crc kubenswrapper[4760]: E0930 07:50:32.052016 4760 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:ceilometer-central-agent,Image:quay.io/podified-antelope-centos9/openstack-ceilometer-central:current-podified,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_set_configs && /usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n6bh8bh677h569h5c6h665h696h75h677h57bh57fh589h546h56dhcch5c8h56dh54bhbdh5f5h667hbdh5c5h547h5dbh96h5c5h57ch5d5h566h68fh6bq,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:scripts,ReadOnly:true,MountPath:/var/lib/openstack/bin,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/openstack/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:ceilometer-central-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-ggb9w,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/python3 
/var/lib/openstack/bin/centralhealth.py],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:300,TimeoutSeconds:5,PeriodSeconds:5,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ceilometer-0_openstack(7a555f66-7027-4fac-afcc-db7b3f5ae034): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Sep 30 07:50:32 crc kubenswrapper[4760]: I0930 07:50:32.140757 4760 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6d5b6d6b67-qd6bn" Sep 30 07:50:32 crc kubenswrapper[4760]: I0930 07:50:32.295164 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/a39adebc-2e75-44a7-b8e2-edcd9631e8d3-dns-swift-storage-0\") pod \"a39adebc-2e75-44a7-b8e2-edcd9631e8d3\" (UID: \"a39adebc-2e75-44a7-b8e2-edcd9631e8d3\") " Sep 30 07:50:32 crc kubenswrapper[4760]: I0930 07:50:32.295268 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a39adebc-2e75-44a7-b8e2-edcd9631e8d3-dns-svc\") pod \"a39adebc-2e75-44a7-b8e2-edcd9631e8d3\" (UID: \"a39adebc-2e75-44a7-b8e2-edcd9631e8d3\") " Sep 30 07:50:32 crc kubenswrapper[4760]: I0930 07:50:32.295289 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h7hl5\" (UniqueName: \"kubernetes.io/projected/a39adebc-2e75-44a7-b8e2-edcd9631e8d3-kube-api-access-h7hl5\") pod \"a39adebc-2e75-44a7-b8e2-edcd9631e8d3\" (UID: \"a39adebc-2e75-44a7-b8e2-edcd9631e8d3\") " Sep 30 07:50:32 crc kubenswrapper[4760]: I0930 07:50:32.296075 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a39adebc-2e75-44a7-b8e2-edcd9631e8d3-ovsdbserver-sb\") pod \"a39adebc-2e75-44a7-b8e2-edcd9631e8d3\" (UID: \"a39adebc-2e75-44a7-b8e2-edcd9631e8d3\") " Sep 30 07:50:32 crc kubenswrapper[4760]: I0930 07:50:32.296186 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a39adebc-2e75-44a7-b8e2-edcd9631e8d3-ovsdbserver-nb\") pod \"a39adebc-2e75-44a7-b8e2-edcd9631e8d3\" (UID: \"a39adebc-2e75-44a7-b8e2-edcd9631e8d3\") " Sep 30 07:50:32 crc kubenswrapper[4760]: I0930 07:50:32.296261 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"config\" (UniqueName: \"kubernetes.io/configmap/a39adebc-2e75-44a7-b8e2-edcd9631e8d3-config\") pod \"a39adebc-2e75-44a7-b8e2-edcd9631e8d3\" (UID: \"a39adebc-2e75-44a7-b8e2-edcd9631e8d3\") " Sep 30 07:50:32 crc kubenswrapper[4760]: I0930 07:50:32.317756 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a39adebc-2e75-44a7-b8e2-edcd9631e8d3-kube-api-access-h7hl5" (OuterVolumeSpecName: "kube-api-access-h7hl5") pod "a39adebc-2e75-44a7-b8e2-edcd9631e8d3" (UID: "a39adebc-2e75-44a7-b8e2-edcd9631e8d3"). InnerVolumeSpecName "kube-api-access-h7hl5". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 07:50:32 crc kubenswrapper[4760]: I0930 07:50:32.350764 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a39adebc-2e75-44a7-b8e2-edcd9631e8d3-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "a39adebc-2e75-44a7-b8e2-edcd9631e8d3" (UID: "a39adebc-2e75-44a7-b8e2-edcd9631e8d3"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 07:50:32 crc kubenswrapper[4760]: I0930 07:50:32.355847 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a39adebc-2e75-44a7-b8e2-edcd9631e8d3-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "a39adebc-2e75-44a7-b8e2-edcd9631e8d3" (UID: "a39adebc-2e75-44a7-b8e2-edcd9631e8d3"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 07:50:32 crc kubenswrapper[4760]: I0930 07:50:32.365379 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a39adebc-2e75-44a7-b8e2-edcd9631e8d3-config" (OuterVolumeSpecName: "config") pod "a39adebc-2e75-44a7-b8e2-edcd9631e8d3" (UID: "a39adebc-2e75-44a7-b8e2-edcd9631e8d3"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 07:50:32 crc kubenswrapper[4760]: I0930 07:50:32.370236 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a39adebc-2e75-44a7-b8e2-edcd9631e8d3-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "a39adebc-2e75-44a7-b8e2-edcd9631e8d3" (UID: "a39adebc-2e75-44a7-b8e2-edcd9631e8d3"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 07:50:32 crc kubenswrapper[4760]: I0930 07:50:32.401148 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a39adebc-2e75-44a7-b8e2-edcd9631e8d3-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "a39adebc-2e75-44a7-b8e2-edcd9631e8d3" (UID: "a39adebc-2e75-44a7-b8e2-edcd9631e8d3"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 07:50:32 crc kubenswrapper[4760]: I0930 07:50:32.402348 4760 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a39adebc-2e75-44a7-b8e2-edcd9631e8d3-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Sep 30 07:50:32 crc kubenswrapper[4760]: I0930 07:50:32.402365 4760 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a39adebc-2e75-44a7-b8e2-edcd9631e8d3-config\") on node \"crc\" DevicePath \"\"" Sep 30 07:50:32 crc kubenswrapper[4760]: I0930 07:50:32.402374 4760 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/a39adebc-2e75-44a7-b8e2-edcd9631e8d3-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Sep 30 07:50:32 crc kubenswrapper[4760]: I0930 07:50:32.402383 4760 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a39adebc-2e75-44a7-b8e2-edcd9631e8d3-dns-svc\") on node \"crc\" DevicePath \"\"" Sep 30 
07:50:32 crc kubenswrapper[4760]: I0930 07:50:32.402391 4760 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h7hl5\" (UniqueName: \"kubernetes.io/projected/a39adebc-2e75-44a7-b8e2-edcd9631e8d3-kube-api-access-h7hl5\") on node \"crc\" DevicePath \"\"" Sep 30 07:50:32 crc kubenswrapper[4760]: I0930 07:50:32.402594 4760 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a39adebc-2e75-44a7-b8e2-edcd9631e8d3-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Sep 30 07:50:32 crc kubenswrapper[4760]: I0930 07:50:32.501853 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6d5b6d6b67-qd6bn" event={"ID":"a39adebc-2e75-44a7-b8e2-edcd9631e8d3","Type":"ContainerDied","Data":"59e1bf1f8abe56bf050028140d1422871705417cfcb19caa27ba2e0bd051d36e"} Sep 30 07:50:32 crc kubenswrapper[4760]: I0930 07:50:32.501904 4760 scope.go:117] "RemoveContainer" containerID="31453f880b5979ac04025f2ff809eb14cfb86d5d44a1f916d96f3fbcc9a7fc11" Sep 30 07:50:32 crc kubenswrapper[4760]: I0930 07:50:32.502069 4760 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6d5b6d6b67-qd6bn" Sep 30 07:50:32 crc kubenswrapper[4760]: I0930 07:50:32.510370 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-s45t8" event={"ID":"09c9e208-48cc-44b8-9810-fc0cf69cea8a","Type":"ContainerDied","Data":"9edb36d37d1de1d060c04d48b325e55d5eb826acf2c9a3f57c632746c53371c8"} Sep 30 07:50:32 crc kubenswrapper[4760]: I0930 07:50:32.510555 4760 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9edb36d37d1de1d060c04d48b325e55d5eb826acf2c9a3f57c632746c53371c8" Sep 30 07:50:32 crc kubenswrapper[4760]: I0930 07:50:32.510410 4760 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-s45t8" Sep 30 07:50:32 crc kubenswrapper[4760]: I0930 07:50:32.545240 4760 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6d5b6d6b67-qd6bn"] Sep 30 07:50:32 crc kubenswrapper[4760]: I0930 07:50:32.546454 4760 scope.go:117] "RemoveContainer" containerID="11203f7700b1de680e69bf58197c15839c9ed92399361c6b6c1e024869ccd8c0" Sep 30 07:50:32 crc kubenswrapper[4760]: I0930 07:50:32.551610 4760 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6d5b6d6b67-qd6bn"] Sep 30 07:50:32 crc kubenswrapper[4760]: E0930 07:50:32.574026 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"horizon-log\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/horizon-746f884dcc-8lkjw" podUID="f8f42596-0ba9-41d9-a780-b8b3705b963c" Sep 30 07:50:32 crc kubenswrapper[4760]: I0930 07:50:32.714637 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-api-0"] Sep 30 07:50:33 crc kubenswrapper[4760]: I0930 07:50:33.110716 4760 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a39adebc-2e75-44a7-b8e2-edcd9631e8d3" path="/var/lib/kubelet/pods/a39adebc-2e75-44a7-b8e2-edcd9631e8d3/volumes" Sep 30 07:50:33 crc kubenswrapper[4760]: I0930 07:50:33.111466 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-f6j6q"] Sep 30 07:50:33 crc kubenswrapper[4760]: I0930 07:50:33.111490 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-57c957c4ff-fl2rf"] Sep 30 07:50:33 crc kubenswrapper[4760]: E0930 07:50:33.111815 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a39adebc-2e75-44a7-b8e2-edcd9631e8d3" containerName="dnsmasq-dns" Sep 30 07:50:33 crc kubenswrapper[4760]: I0930 07:50:33.111827 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="a39adebc-2e75-44a7-b8e2-edcd9631e8d3" 
containerName="dnsmasq-dns" Sep 30 07:50:33 crc kubenswrapper[4760]: E0930 07:50:33.111838 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a39adebc-2e75-44a7-b8e2-edcd9631e8d3" containerName="init" Sep 30 07:50:33 crc kubenswrapper[4760]: I0930 07:50:33.111844 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="a39adebc-2e75-44a7-b8e2-edcd9631e8d3" containerName="init" Sep 30 07:50:33 crc kubenswrapper[4760]: E0930 07:50:33.111883 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="09c9e208-48cc-44b8-9810-fc0cf69cea8a" containerName="glance-db-sync" Sep 30 07:50:33 crc kubenswrapper[4760]: I0930 07:50:33.111890 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="09c9e208-48cc-44b8-9810-fc0cf69cea8a" containerName="glance-db-sync" Sep 30 07:50:33 crc kubenswrapper[4760]: I0930 07:50:33.112056 4760 memory_manager.go:354] "RemoveStaleState removing state" podUID="a39adebc-2e75-44a7-b8e2-edcd9631e8d3" containerName="dnsmasq-dns" Sep 30 07:50:33 crc kubenswrapper[4760]: I0930 07:50:33.112065 4760 memory_manager.go:354] "RemoveStaleState removing state" podUID="09c9e208-48cc-44b8-9810-fc0cf69cea8a" containerName="glance-db-sync" Sep 30 07:50:33 crc kubenswrapper[4760]: I0930 07:50:33.113615 4760 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-57c957c4ff-fl2rf" Sep 30 07:50:33 crc kubenswrapper[4760]: I0930 07:50:33.144739 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-57c957c4ff-fl2rf"] Sep 30 07:50:33 crc kubenswrapper[4760]: I0930 07:50:33.179136 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-694db87c64-qrwhp"] Sep 30 07:50:33 crc kubenswrapper[4760]: I0930 07:50:33.222324 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c8a2a11e-bb27-4ee4-8ff9-91a043e33f1c-config\") pod \"dnsmasq-dns-57c957c4ff-fl2rf\" (UID: \"c8a2a11e-bb27-4ee4-8ff9-91a043e33f1c\") " pod="openstack/dnsmasq-dns-57c957c4ff-fl2rf" Sep 30 07:50:33 crc kubenswrapper[4760]: I0930 07:50:33.222374 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c8a2a11e-bb27-4ee4-8ff9-91a043e33f1c-dns-svc\") pod \"dnsmasq-dns-57c957c4ff-fl2rf\" (UID: \"c8a2a11e-bb27-4ee4-8ff9-91a043e33f1c\") " pod="openstack/dnsmasq-dns-57c957c4ff-fl2rf" Sep 30 07:50:33 crc kubenswrapper[4760]: I0930 07:50:33.222425 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c8a2a11e-bb27-4ee4-8ff9-91a043e33f1c-ovsdbserver-sb\") pod \"dnsmasq-dns-57c957c4ff-fl2rf\" (UID: \"c8a2a11e-bb27-4ee4-8ff9-91a043e33f1c\") " pod="openstack/dnsmasq-dns-57c957c4ff-fl2rf" Sep 30 07:50:33 crc kubenswrapper[4760]: I0930 07:50:33.222489 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c8a2a11e-bb27-4ee4-8ff9-91a043e33f1c-ovsdbserver-nb\") pod \"dnsmasq-dns-57c957c4ff-fl2rf\" (UID: \"c8a2a11e-bb27-4ee4-8ff9-91a043e33f1c\") " pod="openstack/dnsmasq-dns-57c957c4ff-fl2rf" Sep 
30 07:50:33 crc kubenswrapper[4760]: I0930 07:50:33.222510 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hn6w7\" (UniqueName: \"kubernetes.io/projected/c8a2a11e-bb27-4ee4-8ff9-91a043e33f1c-kube-api-access-hn6w7\") pod \"dnsmasq-dns-57c957c4ff-fl2rf\" (UID: \"c8a2a11e-bb27-4ee4-8ff9-91a043e33f1c\") " pod="openstack/dnsmasq-dns-57c957c4ff-fl2rf" Sep 30 07:50:33 crc kubenswrapper[4760]: I0930 07:50:33.222603 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/c8a2a11e-bb27-4ee4-8ff9-91a043e33f1c-dns-swift-storage-0\") pod \"dnsmasq-dns-57c957c4ff-fl2rf\" (UID: \"c8a2a11e-bb27-4ee4-8ff9-91a043e33f1c\") " pod="openstack/dnsmasq-dns-57c957c4ff-fl2rf" Sep 30 07:50:33 crc kubenswrapper[4760]: I0930 07:50:33.248015 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-75644c8bb4-wrsmv"] Sep 30 07:50:33 crc kubenswrapper[4760]: I0930 07:50:33.287655 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-applier-0"] Sep 30 07:50:33 crc kubenswrapper[4760]: I0930 07:50:33.325921 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/c8a2a11e-bb27-4ee4-8ff9-91a043e33f1c-dns-swift-storage-0\") pod \"dnsmasq-dns-57c957c4ff-fl2rf\" (UID: \"c8a2a11e-bb27-4ee4-8ff9-91a043e33f1c\") " pod="openstack/dnsmasq-dns-57c957c4ff-fl2rf" Sep 30 07:50:33 crc kubenswrapper[4760]: I0930 07:50:33.326000 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c8a2a11e-bb27-4ee4-8ff9-91a043e33f1c-config\") pod \"dnsmasq-dns-57c957c4ff-fl2rf\" (UID: \"c8a2a11e-bb27-4ee4-8ff9-91a043e33f1c\") " pod="openstack/dnsmasq-dns-57c957c4ff-fl2rf" Sep 30 07:50:33 crc kubenswrapper[4760]: I0930 07:50:33.326048 4760 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c8a2a11e-bb27-4ee4-8ff9-91a043e33f1c-dns-svc\") pod \"dnsmasq-dns-57c957c4ff-fl2rf\" (UID: \"c8a2a11e-bb27-4ee4-8ff9-91a043e33f1c\") " pod="openstack/dnsmasq-dns-57c957c4ff-fl2rf" Sep 30 07:50:33 crc kubenswrapper[4760]: I0930 07:50:33.326086 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c8a2a11e-bb27-4ee4-8ff9-91a043e33f1c-ovsdbserver-sb\") pod \"dnsmasq-dns-57c957c4ff-fl2rf\" (UID: \"c8a2a11e-bb27-4ee4-8ff9-91a043e33f1c\") " pod="openstack/dnsmasq-dns-57c957c4ff-fl2rf" Sep 30 07:50:33 crc kubenswrapper[4760]: I0930 07:50:33.326127 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c8a2a11e-bb27-4ee4-8ff9-91a043e33f1c-ovsdbserver-nb\") pod \"dnsmasq-dns-57c957c4ff-fl2rf\" (UID: \"c8a2a11e-bb27-4ee4-8ff9-91a043e33f1c\") " pod="openstack/dnsmasq-dns-57c957c4ff-fl2rf" Sep 30 07:50:33 crc kubenswrapper[4760]: I0930 07:50:33.326148 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hn6w7\" (UniqueName: \"kubernetes.io/projected/c8a2a11e-bb27-4ee4-8ff9-91a043e33f1c-kube-api-access-hn6w7\") pod \"dnsmasq-dns-57c957c4ff-fl2rf\" (UID: \"c8a2a11e-bb27-4ee4-8ff9-91a043e33f1c\") " pod="openstack/dnsmasq-dns-57c957c4ff-fl2rf" Sep 30 07:50:33 crc kubenswrapper[4760]: I0930 07:50:33.327018 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/c8a2a11e-bb27-4ee4-8ff9-91a043e33f1c-dns-swift-storage-0\") pod \"dnsmasq-dns-57c957c4ff-fl2rf\" (UID: \"c8a2a11e-bb27-4ee4-8ff9-91a043e33f1c\") " pod="openstack/dnsmasq-dns-57c957c4ff-fl2rf" Sep 30 07:50:33 crc kubenswrapper[4760]: I0930 07:50:33.327622 4760 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c8a2a11e-bb27-4ee4-8ff9-91a043e33f1c-config\") pod \"dnsmasq-dns-57c957c4ff-fl2rf\" (UID: \"c8a2a11e-bb27-4ee4-8ff9-91a043e33f1c\") " pod="openstack/dnsmasq-dns-57c957c4ff-fl2rf" Sep 30 07:50:33 crc kubenswrapper[4760]: I0930 07:50:33.328181 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c8a2a11e-bb27-4ee4-8ff9-91a043e33f1c-dns-svc\") pod \"dnsmasq-dns-57c957c4ff-fl2rf\" (UID: \"c8a2a11e-bb27-4ee4-8ff9-91a043e33f1c\") " pod="openstack/dnsmasq-dns-57c957c4ff-fl2rf" Sep 30 07:50:33 crc kubenswrapper[4760]: I0930 07:50:33.358190 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c8a2a11e-bb27-4ee4-8ff9-91a043e33f1c-ovsdbserver-sb\") pod \"dnsmasq-dns-57c957c4ff-fl2rf\" (UID: \"c8a2a11e-bb27-4ee4-8ff9-91a043e33f1c\") " pod="openstack/dnsmasq-dns-57c957c4ff-fl2rf" Sep 30 07:50:33 crc kubenswrapper[4760]: I0930 07:50:33.364935 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c8a2a11e-bb27-4ee4-8ff9-91a043e33f1c-ovsdbserver-nb\") pod \"dnsmasq-dns-57c957c4ff-fl2rf\" (UID: \"c8a2a11e-bb27-4ee4-8ff9-91a043e33f1c\") " pod="openstack/dnsmasq-dns-57c957c4ff-fl2rf" Sep 30 07:50:33 crc kubenswrapper[4760]: I0930 07:50:33.382492 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hn6w7\" (UniqueName: \"kubernetes.io/projected/c8a2a11e-bb27-4ee4-8ff9-91a043e33f1c-kube-api-access-hn6w7\") pod \"dnsmasq-dns-57c957c4ff-fl2rf\" (UID: \"c8a2a11e-bb27-4ee4-8ff9-91a043e33f1c\") " pod="openstack/dnsmasq-dns-57c957c4ff-fl2rf" Sep 30 07:50:33 crc kubenswrapper[4760]: I0930 07:50:33.438422 4760 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-57c957c4ff-fl2rf" Sep 30 07:50:33 crc kubenswrapper[4760]: I0930 07:50:33.478422 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-9ml2s"] Sep 30 07:50:33 crc kubenswrapper[4760]: I0930 07:50:33.494249 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-decision-engine-0"] Sep 30 07:50:33 crc kubenswrapper[4760]: W0930 07:50:33.506381 4760 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4a2c7f25_ec97_4cb0_8475_b7b8d4be47d0.slice/crio-3a0fec3394842434671bbab30568c21e245a9fc2c3df39d9eb043f8fd3a8591b WatchSource:0}: Error finding container 3a0fec3394842434671bbab30568c21e245a9fc2c3df39d9eb043f8fd3a8591b: Status 404 returned error can't find the container with id 3a0fec3394842434671bbab30568c21e245a9fc2c3df39d9eb043f8fd3a8591b Sep 30 07:50:33 crc kubenswrapper[4760]: W0930 07:50:33.507428 4760 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbd8762e1_d4b3_4999_996e_db79b881afec.slice/crio-f89cab0426974564e60181e9bbc821f482eb63d851410f3e9f8e576e886823d2 WatchSource:0}: Error finding container f89cab0426974564e60181e9bbc821f482eb63d851410f3e9f8e576e886823d2: Status 404 returned error can't find the container with id f89cab0426974564e60181e9bbc821f482eb63d851410f3e9f8e576e886823d2 Sep 30 07:50:33 crc kubenswrapper[4760]: I0930 07:50:33.525735 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-8vvnb"] Sep 30 07:50:33 crc kubenswrapper[4760]: I0930 07:50:33.542204 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-6g2pm"] Sep 30 07:50:33 crc kubenswrapper[4760]: I0930 07:50:33.551440 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-9ml2s" 
event={"ID":"4a2c7f25-ec97-4cb0-8475-b7b8d4be47d0","Type":"ContainerStarted","Data":"3a0fec3394842434671bbab30568c21e245a9fc2c3df39d9eb043f8fd3a8591b"} Sep 30 07:50:33 crc kubenswrapper[4760]: I0930 07:50:33.553869 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-2npfw" event={"ID":"4584a821-629e-4246-95d3-b84160a0f46c","Type":"ContainerStarted","Data":"7ff8179eb22158ab5265a3f91f89d11c80bc5c5c69813ac725d688b5bbaf9c3c"} Sep 30 07:50:33 crc kubenswrapper[4760]: I0930 07:50:33.556334 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-f6j6q" event={"ID":"1df67641-4598-4ba5-a59a-a195084e5446","Type":"ContainerStarted","Data":"7b95bf520c48586f4386da3ad833eb819f74f25e4d61c4e176a5cc33f365f080"} Sep 30 07:50:33 crc kubenswrapper[4760]: I0930 07:50:33.558323 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-746f884dcc-8lkjw" event={"ID":"f8f42596-0ba9-41d9-a780-b8b3705b963c","Type":"ContainerStarted","Data":"0bdb08b78789e281874d3f19555926dccd223d02905679bb3a399172af61108a"} Sep 30 07:50:33 crc kubenswrapper[4760]: I0930 07:50:33.558479 4760 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-746f884dcc-8lkjw" podUID="f8f42596-0ba9-41d9-a780-b8b3705b963c" containerName="horizon" containerID="cri-o://0bdb08b78789e281874d3f19555926dccd223d02905679bb3a399172af61108a" gracePeriod=30 Sep 30 07:50:33 crc kubenswrapper[4760]: I0930 07:50:33.561985 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-694db87c64-qrwhp" event={"ID":"e28d9015-ea18-4da0-bfd8-c2cc5dec4f37","Type":"ContainerStarted","Data":"849b2badb825e078b2b6f0b3b70a51c8064145288c1cf2da692d15a9af8c3e09"} Sep 30 07:50:33 crc kubenswrapper[4760]: I0930 07:50:33.582328 4760 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-db-sync-2npfw" podStartSLOduration=3.243664974 podStartE2EDuration="18.582312027s" 
podCreationTimestamp="2025-09-30 07:50:15 +0000 UTC" firstStartedPulling="2025-09-30 07:50:16.774941317 +0000 UTC m=+1002.417847729" lastFinishedPulling="2025-09-30 07:50:32.11358837 +0000 UTC m=+1017.756494782" observedRunningTime="2025-09-30 07:50:33.575564915 +0000 UTC m=+1019.218471327" watchObservedRunningTime="2025-09-30 07:50:33.582312027 +0000 UTC m=+1019.225218439" Sep 30 07:50:33 crc kubenswrapper[4760]: I0930 07:50:33.583897 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-6d547648b9-kxd9p" event={"ID":"b7668a34-2cd6-4873-a55a-27839a612a2b","Type":"ContainerStarted","Data":"c9f51e05fcba75797df060a5a1ac980cc089f7cde31e72b0efb1fc5117351629"} Sep 30 07:50:33 crc kubenswrapper[4760]: I0930 07:50:33.583936 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-6d547648b9-kxd9p" event={"ID":"b7668a34-2cd6-4873-a55a-27839a612a2b","Type":"ContainerStarted","Data":"7da8cf7bc3a971e417b13433c8641faa22c49b2b1b7fd3507892848e0fb76724"} Sep 30 07:50:33 crc kubenswrapper[4760]: I0930 07:50:33.584061 4760 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-6d547648b9-kxd9p" podUID="b7668a34-2cd6-4873-a55a-27839a612a2b" containerName="horizon-log" containerID="cri-o://7da8cf7bc3a971e417b13433c8641faa22c49b2b1b7fd3507892848e0fb76724" gracePeriod=30 Sep 30 07:50:33 crc kubenswrapper[4760]: I0930 07:50:33.584166 4760 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-6d547648b9-kxd9p" podUID="b7668a34-2cd6-4873-a55a-27839a612a2b" containerName="horizon" containerID="cri-o://c9f51e05fcba75797df060a5a1ac980cc089f7cde31e72b0efb1fc5117351629" gracePeriod=30 Sep 30 07:50:33 crc kubenswrapper[4760]: I0930 07:50:33.609244 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-applier-0" 
event={"ID":"7c12cbe5-a6fb-4ead-bb65-cd13dab410ce","Type":"ContainerStarted","Data":"de70f33db9068296c53a57cfae393ca60f0587ed65b3b5244c29472988122a74"} Sep 30 07:50:33 crc kubenswrapper[4760]: I0930 07:50:33.625842 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-api-0" event={"ID":"f1e2802c-ec39-48ee-a6c4-b609a57aa0c5","Type":"ContainerStarted","Data":"8bb4ae2b4cdc4f4f073c4d86092669e02429c7f687807aac45ca0ee0b9dbe8d5"} Sep 30 07:50:33 crc kubenswrapper[4760]: I0930 07:50:33.625899 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-api-0" event={"ID":"f1e2802c-ec39-48ee-a6c4-b609a57aa0c5","Type":"ContainerStarted","Data":"56a489224c6b0fd5341421022afd3b3837a95b15aae7004d350de464f74d61a2"} Sep 30 07:50:33 crc kubenswrapper[4760]: I0930 07:50:33.629530 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/watcher-api-0" Sep 30 07:50:33 crc kubenswrapper[4760]: I0930 07:50:33.643976 4760 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/watcher-api-0" podUID="f1e2802c-ec39-48ee-a6c4-b609a57aa0c5" containerName="watcher-api" probeResult="failure" output="Get \"http://10.217.0.154:9322/\": dial tcp 10.217.0.154:9322: connect: connection refused" Sep 30 07:50:33 crc kubenswrapper[4760]: I0930 07:50:33.651410 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-5885b885f5-7v6mw" event={"ID":"0d907232-c03b-44a4-a0b5-36ce5bf6d62b","Type":"ContainerStarted","Data":"31ed2dc075a4352931d059eb24b16e55e055f8e5fe67745ba3017d7bc673416e"} Sep 30 07:50:33 crc kubenswrapper[4760]: I0930 07:50:33.651468 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-5885b885f5-7v6mw" event={"ID":"0d907232-c03b-44a4-a0b5-36ce5bf6d62b","Type":"ContainerStarted","Data":"2e8e9f85e632b06032bb32e36e15e521539ad1bfeea1fcaa3c7fb720729d78cf"} Sep 30 07:50:33 crc kubenswrapper[4760]: I0930 07:50:33.651557 4760 kuberuntime_container.go:808] "Killing container 
with a grace period" pod="openstack/horizon-5885b885f5-7v6mw" podUID="0d907232-c03b-44a4-a0b5-36ce5bf6d62b" containerName="horizon" containerID="cri-o://31ed2dc075a4352931d059eb24b16e55e055f8e5fe67745ba3017d7bc673416e" gracePeriod=30 Sep 30 07:50:33 crc kubenswrapper[4760]: I0930 07:50:33.651627 4760 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-5885b885f5-7v6mw" podUID="0d907232-c03b-44a4-a0b5-36ce5bf6d62b" containerName="horizon-log" containerID="cri-o://2e8e9f85e632b06032bb32e36e15e521539ad1bfeea1fcaa3c7fb720729d78cf" gracePeriod=30 Sep 30 07:50:33 crc kubenswrapper[4760]: I0930 07:50:33.672036 4760 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-6d547648b9-kxd9p" podStartSLOduration=3.021093049 podStartE2EDuration="16.672019994s" podCreationTimestamp="2025-09-30 07:50:17 +0000 UTC" firstStartedPulling="2025-09-30 07:50:18.605295582 +0000 UTC m=+1004.248201994" lastFinishedPulling="2025-09-30 07:50:32.256222517 +0000 UTC m=+1017.899128939" observedRunningTime="2025-09-30 07:50:33.64166287 +0000 UTC m=+1019.284569282" watchObservedRunningTime="2025-09-30 07:50:33.672019994 +0000 UTC m=+1019.314926406" Sep 30 07:50:33 crc kubenswrapper[4760]: I0930 07:50:33.696320 4760 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/watcher-api-0" podStartSLOduration=9.696286692 podStartE2EDuration="9.696286692s" podCreationTimestamp="2025-09-30 07:50:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 07:50:33.680785847 +0000 UTC m=+1019.323692259" watchObservedRunningTime="2025-09-30 07:50:33.696286692 +0000 UTC m=+1019.339193174" Sep 30 07:50:33 crc kubenswrapper[4760]: I0930 07:50:33.714968 4760 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-5885b885f5-7v6mw" podStartSLOduration=3.367310076 
podStartE2EDuration="18.714951408s" podCreationTimestamp="2025-09-30 07:50:15 +0000 UTC" firstStartedPulling="2025-09-30 07:50:16.752472625 +0000 UTC m=+1002.395379037" lastFinishedPulling="2025-09-30 07:50:32.100113937 +0000 UTC m=+1017.743020369" observedRunningTime="2025-09-30 07:50:33.703834815 +0000 UTC m=+1019.346741227" watchObservedRunningTime="2025-09-30 07:50:33.714951408 +0000 UTC m=+1019.357857820" Sep 30 07:50:33 crc kubenswrapper[4760]: I0930 07:50:33.732383 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-75644c8bb4-wrsmv" event={"ID":"8b39ba3e-25df-4a22-a1fe-f15e6ca1fada","Type":"ContainerStarted","Data":"65dbcd9281f3c6b09eb431a1d627e147617c793e0b5765f9b416e4e627514765"} Sep 30 07:50:33 crc kubenswrapper[4760]: I0930 07:50:33.993080 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Sep 30 07:50:33 crc kubenswrapper[4760]: I0930 07:50:33.996270 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Sep 30 07:50:34 crc kubenswrapper[4760]: I0930 07:50:34.003778 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-scripts" Sep 30 07:50:34 crc kubenswrapper[4760]: I0930 07:50:34.003955 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Sep 30 07:50:34 crc kubenswrapper[4760]: I0930 07:50:34.004029 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-lnkxp" Sep 30 07:50:34 crc kubenswrapper[4760]: I0930 07:50:34.023968 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Sep 30 07:50:34 crc kubenswrapper[4760]: I0930 07:50:34.063257 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-57c957c4ff-fl2rf"] Sep 30 07:50:34 crc kubenswrapper[4760]: I0930 07:50:34.148646 4760 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-external-api-0\" (UID: \"8a3b79f1-3bbe-4418-9a81-4190a602dd5d\") " pod="openstack/glance-default-external-api-0" Sep 30 07:50:34 crc kubenswrapper[4760]: I0930 07:50:34.148713 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8a3b79f1-3bbe-4418-9a81-4190a602dd5d-config-data\") pod \"glance-default-external-api-0\" (UID: \"8a3b79f1-3bbe-4418-9a81-4190a602dd5d\") " pod="openstack/glance-default-external-api-0" Sep 30 07:50:34 crc kubenswrapper[4760]: I0930 07:50:34.148740 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8a3b79f1-3bbe-4418-9a81-4190a602dd5d-scripts\") pod \"glance-default-external-api-0\" (UID: \"8a3b79f1-3bbe-4418-9a81-4190a602dd5d\") " pod="openstack/glance-default-external-api-0" Sep 30 07:50:34 crc kubenswrapper[4760]: I0930 07:50:34.148910 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8a3b79f1-3bbe-4418-9a81-4190a602dd5d-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"8a3b79f1-3bbe-4418-9a81-4190a602dd5d\") " pod="openstack/glance-default-external-api-0" Sep 30 07:50:34 crc kubenswrapper[4760]: I0930 07:50:34.148946 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8a3b79f1-3bbe-4418-9a81-4190a602dd5d-logs\") pod \"glance-default-external-api-0\" (UID: \"8a3b79f1-3bbe-4418-9a81-4190a602dd5d\") " pod="openstack/glance-default-external-api-0" Sep 30 07:50:34 crc kubenswrapper[4760]: I0930 07:50:34.149002 4760 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/8a3b79f1-3bbe-4418-9a81-4190a602dd5d-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"8a3b79f1-3bbe-4418-9a81-4190a602dd5d\") " pod="openstack/glance-default-external-api-0" Sep 30 07:50:34 crc kubenswrapper[4760]: I0930 07:50:34.149027 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r8rkd\" (UniqueName: \"kubernetes.io/projected/8a3b79f1-3bbe-4418-9a81-4190a602dd5d-kube-api-access-r8rkd\") pod \"glance-default-external-api-0\" (UID: \"8a3b79f1-3bbe-4418-9a81-4190a602dd5d\") " pod="openstack/glance-default-external-api-0" Sep 30 07:50:34 crc kubenswrapper[4760]: I0930 07:50:34.250623 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8a3b79f1-3bbe-4418-9a81-4190a602dd5d-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"8a3b79f1-3bbe-4418-9a81-4190a602dd5d\") " pod="openstack/glance-default-external-api-0" Sep 30 07:50:34 crc kubenswrapper[4760]: I0930 07:50:34.250693 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8a3b79f1-3bbe-4418-9a81-4190a602dd5d-logs\") pod \"glance-default-external-api-0\" (UID: \"8a3b79f1-3bbe-4418-9a81-4190a602dd5d\") " pod="openstack/glance-default-external-api-0" Sep 30 07:50:34 crc kubenswrapper[4760]: I0930 07:50:34.250749 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/8a3b79f1-3bbe-4418-9a81-4190a602dd5d-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"8a3b79f1-3bbe-4418-9a81-4190a602dd5d\") " pod="openstack/glance-default-external-api-0" Sep 30 07:50:34 crc kubenswrapper[4760]: I0930 07:50:34.250782 4760 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-r8rkd\" (UniqueName: \"kubernetes.io/projected/8a3b79f1-3bbe-4418-9a81-4190a602dd5d-kube-api-access-r8rkd\") pod \"glance-default-external-api-0\" (UID: \"8a3b79f1-3bbe-4418-9a81-4190a602dd5d\") " pod="openstack/glance-default-external-api-0" Sep 30 07:50:34 crc kubenswrapper[4760]: I0930 07:50:34.250812 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-external-api-0\" (UID: \"8a3b79f1-3bbe-4418-9a81-4190a602dd5d\") " pod="openstack/glance-default-external-api-0" Sep 30 07:50:34 crc kubenswrapper[4760]: I0930 07:50:34.250840 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8a3b79f1-3bbe-4418-9a81-4190a602dd5d-config-data\") pod \"glance-default-external-api-0\" (UID: \"8a3b79f1-3bbe-4418-9a81-4190a602dd5d\") " pod="openstack/glance-default-external-api-0" Sep 30 07:50:34 crc kubenswrapper[4760]: I0930 07:50:34.250861 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8a3b79f1-3bbe-4418-9a81-4190a602dd5d-scripts\") pod \"glance-default-external-api-0\" (UID: \"8a3b79f1-3bbe-4418-9a81-4190a602dd5d\") " pod="openstack/glance-default-external-api-0" Sep 30 07:50:34 crc kubenswrapper[4760]: I0930 07:50:34.251937 4760 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-external-api-0\" (UID: \"8a3b79f1-3bbe-4418-9a81-4190a602dd5d\") device mount path \"/mnt/openstack/pv07\"" pod="openstack/glance-default-external-api-0" Sep 30 07:50:34 crc kubenswrapper[4760]: I0930 07:50:34.252358 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" 
(UniqueName: \"kubernetes.io/empty-dir/8a3b79f1-3bbe-4418-9a81-4190a602dd5d-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"8a3b79f1-3bbe-4418-9a81-4190a602dd5d\") " pod="openstack/glance-default-external-api-0" Sep 30 07:50:34 crc kubenswrapper[4760]: I0930 07:50:34.252425 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8a3b79f1-3bbe-4418-9a81-4190a602dd5d-logs\") pod \"glance-default-external-api-0\" (UID: \"8a3b79f1-3bbe-4418-9a81-4190a602dd5d\") " pod="openstack/glance-default-external-api-0" Sep 30 07:50:34 crc kubenswrapper[4760]: I0930 07:50:34.257502 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8a3b79f1-3bbe-4418-9a81-4190a602dd5d-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"8a3b79f1-3bbe-4418-9a81-4190a602dd5d\") " pod="openstack/glance-default-external-api-0" Sep 30 07:50:34 crc kubenswrapper[4760]: I0930 07:50:34.258363 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8a3b79f1-3bbe-4418-9a81-4190a602dd5d-config-data\") pod \"glance-default-external-api-0\" (UID: \"8a3b79f1-3bbe-4418-9a81-4190a602dd5d\") " pod="openstack/glance-default-external-api-0" Sep 30 07:50:34 crc kubenswrapper[4760]: I0930 07:50:34.258882 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8a3b79f1-3bbe-4418-9a81-4190a602dd5d-scripts\") pod \"glance-default-external-api-0\" (UID: \"8a3b79f1-3bbe-4418-9a81-4190a602dd5d\") " pod="openstack/glance-default-external-api-0" Sep 30 07:50:34 crc kubenswrapper[4760]: I0930 07:50:34.279956 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r8rkd\" (UniqueName: \"kubernetes.io/projected/8a3b79f1-3bbe-4418-9a81-4190a602dd5d-kube-api-access-r8rkd\") pod 
\"glance-default-external-api-0\" (UID: \"8a3b79f1-3bbe-4418-9a81-4190a602dd5d\") " pod="openstack/glance-default-external-api-0" Sep 30 07:50:34 crc kubenswrapper[4760]: I0930 07:50:34.305748 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-external-api-0\" (UID: \"8a3b79f1-3bbe-4418-9a81-4190a602dd5d\") " pod="openstack/glance-default-external-api-0" Sep 30 07:50:34 crc kubenswrapper[4760]: I0930 07:50:34.345840 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Sep 30 07:50:34 crc kubenswrapper[4760]: I0930 07:50:34.371857 4760 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/watcher-api-0" Sep 30 07:50:34 crc kubenswrapper[4760]: I0930 07:50:34.371889 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/watcher-api-0" Sep 30 07:50:34 crc kubenswrapper[4760]: I0930 07:50:34.435452 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Sep 30 07:50:34 crc kubenswrapper[4760]: I0930 07:50:34.436978 4760 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Sep 30 07:50:34 crc kubenswrapper[4760]: I0930 07:50:34.442648 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Sep 30 07:50:34 crc kubenswrapper[4760]: I0930 07:50:34.448684 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Sep 30 07:50:34 crc kubenswrapper[4760]: I0930 07:50:34.561384 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c588f954-a6b5-413f-bcc8-dbfe4c660d1b-scripts\") pod \"glance-default-internal-api-0\" (UID: \"c588f954-a6b5-413f-bcc8-dbfe4c660d1b\") " pod="openstack/glance-default-internal-api-0" Sep 30 07:50:34 crc kubenswrapper[4760]: I0930 07:50:34.561480 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c588f954-a6b5-413f-bcc8-dbfe4c660d1b-config-data\") pod \"glance-default-internal-api-0\" (UID: \"c588f954-a6b5-413f-bcc8-dbfe4c660d1b\") " pod="openstack/glance-default-internal-api-0" Sep 30 07:50:34 crc kubenswrapper[4760]: I0930 07:50:34.561521 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/c588f954-a6b5-413f-bcc8-dbfe4c660d1b-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"c588f954-a6b5-413f-bcc8-dbfe4c660d1b\") " pod="openstack/glance-default-internal-api-0" Sep 30 07:50:34 crc kubenswrapper[4760]: I0930 07:50:34.561582 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c588f954-a6b5-413f-bcc8-dbfe4c660d1b-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"c588f954-a6b5-413f-bcc8-dbfe4c660d1b\") " 
pod="openstack/glance-default-internal-api-0" Sep 30 07:50:34 crc kubenswrapper[4760]: I0930 07:50:34.561642 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c588f954-a6b5-413f-bcc8-dbfe4c660d1b-logs\") pod \"glance-default-internal-api-0\" (UID: \"c588f954-a6b5-413f-bcc8-dbfe4c660d1b\") " pod="openstack/glance-default-internal-api-0" Sep 30 07:50:34 crc kubenswrapper[4760]: I0930 07:50:34.561665 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-internal-api-0\" (UID: \"c588f954-a6b5-413f-bcc8-dbfe4c660d1b\") " pod="openstack/glance-default-internal-api-0" Sep 30 07:50:34 crc kubenswrapper[4760]: I0930 07:50:34.561740 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m2ksk\" (UniqueName: \"kubernetes.io/projected/c588f954-a6b5-413f-bcc8-dbfe4c660d1b-kube-api-access-m2ksk\") pod \"glance-default-internal-api-0\" (UID: \"c588f954-a6b5-413f-bcc8-dbfe4c660d1b\") " pod="openstack/glance-default-internal-api-0" Sep 30 07:50:34 crc kubenswrapper[4760]: I0930 07:50:34.662961 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c588f954-a6b5-413f-bcc8-dbfe4c660d1b-logs\") pod \"glance-default-internal-api-0\" (UID: \"c588f954-a6b5-413f-bcc8-dbfe4c660d1b\") " pod="openstack/glance-default-internal-api-0" Sep 30 07:50:34 crc kubenswrapper[4760]: I0930 07:50:34.663001 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-internal-api-0\" (UID: \"c588f954-a6b5-413f-bcc8-dbfe4c660d1b\") " pod="openstack/glance-default-internal-api-0" Sep 30 07:50:34 
crc kubenswrapper[4760]: I0930 07:50:34.663470 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m2ksk\" (UniqueName: \"kubernetes.io/projected/c588f954-a6b5-413f-bcc8-dbfe4c660d1b-kube-api-access-m2ksk\") pod \"glance-default-internal-api-0\" (UID: \"c588f954-a6b5-413f-bcc8-dbfe4c660d1b\") " pod="openstack/glance-default-internal-api-0" Sep 30 07:50:34 crc kubenswrapper[4760]: I0930 07:50:34.663536 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c588f954-a6b5-413f-bcc8-dbfe4c660d1b-scripts\") pod \"glance-default-internal-api-0\" (UID: \"c588f954-a6b5-413f-bcc8-dbfe4c660d1b\") " pod="openstack/glance-default-internal-api-0" Sep 30 07:50:34 crc kubenswrapper[4760]: I0930 07:50:34.663570 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c588f954-a6b5-413f-bcc8-dbfe4c660d1b-config-data\") pod \"glance-default-internal-api-0\" (UID: \"c588f954-a6b5-413f-bcc8-dbfe4c660d1b\") " pod="openstack/glance-default-internal-api-0" Sep 30 07:50:34 crc kubenswrapper[4760]: I0930 07:50:34.663612 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/c588f954-a6b5-413f-bcc8-dbfe4c660d1b-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"c588f954-a6b5-413f-bcc8-dbfe4c660d1b\") " pod="openstack/glance-default-internal-api-0" Sep 30 07:50:34 crc kubenswrapper[4760]: I0930 07:50:34.663666 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c588f954-a6b5-413f-bcc8-dbfe4c660d1b-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"c588f954-a6b5-413f-bcc8-dbfe4c660d1b\") " pod="openstack/glance-default-internal-api-0" Sep 30 07:50:34 crc kubenswrapper[4760]: I0930 07:50:34.667828 4760 
operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-internal-api-0\" (UID: \"c588f954-a6b5-413f-bcc8-dbfe4c660d1b\") device mount path \"/mnt/openstack/pv06\"" pod="openstack/glance-default-internal-api-0" Sep 30 07:50:34 crc kubenswrapper[4760]: I0930 07:50:34.674576 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c588f954-a6b5-413f-bcc8-dbfe4c660d1b-logs\") pod \"glance-default-internal-api-0\" (UID: \"c588f954-a6b5-413f-bcc8-dbfe4c660d1b\") " pod="openstack/glance-default-internal-api-0" Sep 30 07:50:34 crc kubenswrapper[4760]: I0930 07:50:34.675176 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/c588f954-a6b5-413f-bcc8-dbfe4c660d1b-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"c588f954-a6b5-413f-bcc8-dbfe4c660d1b\") " pod="openstack/glance-default-internal-api-0" Sep 30 07:50:34 crc kubenswrapper[4760]: I0930 07:50:34.687035 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c588f954-a6b5-413f-bcc8-dbfe4c660d1b-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"c588f954-a6b5-413f-bcc8-dbfe4c660d1b\") " pod="openstack/glance-default-internal-api-0" Sep 30 07:50:34 crc kubenswrapper[4760]: I0930 07:50:34.691097 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m2ksk\" (UniqueName: \"kubernetes.io/projected/c588f954-a6b5-413f-bcc8-dbfe4c660d1b-kube-api-access-m2ksk\") pod \"glance-default-internal-api-0\" (UID: \"c588f954-a6b5-413f-bcc8-dbfe4c660d1b\") " pod="openstack/glance-default-internal-api-0" Sep 30 07:50:34 crc kubenswrapper[4760]: I0930 07:50:34.691414 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"config-data\" (UniqueName: \"kubernetes.io/secret/c588f954-a6b5-413f-bcc8-dbfe4c660d1b-config-data\") pod \"glance-default-internal-api-0\" (UID: \"c588f954-a6b5-413f-bcc8-dbfe4c660d1b\") " pod="openstack/glance-default-internal-api-0" Sep 30 07:50:34 crc kubenswrapper[4760]: I0930 07:50:34.703921 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c588f954-a6b5-413f-bcc8-dbfe4c660d1b-scripts\") pod \"glance-default-internal-api-0\" (UID: \"c588f954-a6b5-413f-bcc8-dbfe4c660d1b\") " pod="openstack/glance-default-internal-api-0" Sep 30 07:50:34 crc kubenswrapper[4760]: I0930 07:50:34.723401 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-internal-api-0\" (UID: \"c588f954-a6b5-413f-bcc8-dbfe4c660d1b\") " pod="openstack/glance-default-internal-api-0" Sep 30 07:50:34 crc kubenswrapper[4760]: I0930 07:50:34.748527 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-694db87c64-qrwhp" event={"ID":"e28d9015-ea18-4da0-bfd8-c2cc5dec4f37","Type":"ContainerStarted","Data":"ffd16f7b799eef3e712c57eb371aed74561eea257b32f299da023db303acb61b"} Sep 30 07:50:34 crc kubenswrapper[4760]: I0930 07:50:34.748582 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-694db87c64-qrwhp" event={"ID":"e28d9015-ea18-4da0-bfd8-c2cc5dec4f37","Type":"ContainerStarted","Data":"5d579c31c3d1a80d0194fb2ea0a7ddf6de3190ad92735e3f6c6c498a6b3f551b"} Sep 30 07:50:34 crc kubenswrapper[4760]: I0930 07:50:34.765516 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-6g2pm" event={"ID":"e8bc0e9c-dde5-4aa9-b199-0ad1e4e3d40c","Type":"ContainerStarted","Data":"cd34ed4a3c768321b03d3c69ae8fc0b614c7141d5b4eb8f21020250d5c60803a"} Sep 30 07:50:34 crc kubenswrapper[4760]: I0930 07:50:34.765566 4760 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openstack/keystone-bootstrap-6g2pm" event={"ID":"e8bc0e9c-dde5-4aa9-b199-0ad1e4e3d40c","Type":"ContainerStarted","Data":"5bb90939a53e928c2fb3552210f3a0ec7c29aa81e33180e3b9457a88a25d3725"} Sep 30 07:50:34 crc kubenswrapper[4760]: I0930 07:50:34.775882 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Sep 30 07:50:34 crc kubenswrapper[4760]: I0930 07:50:34.776885 4760 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-694db87c64-qrwhp" podStartSLOduration=10.776870022 podStartE2EDuration="10.776870022s" podCreationTimestamp="2025-09-30 07:50:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 07:50:34.7732556 +0000 UTC m=+1020.416162022" watchObservedRunningTime="2025-09-30 07:50:34.776870022 +0000 UTC m=+1020.419776434" Sep 30 07:50:34 crc kubenswrapper[4760]: I0930 07:50:34.799632 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-75644c8bb4-wrsmv" event={"ID":"8b39ba3e-25df-4a22-a1fe-f15e6ca1fada","Type":"ContainerStarted","Data":"874e8d3794705d963dd1f922c0a3c13e380eaf7a9fe0420397c7aaad3bf17f80"} Sep 30 07:50:34 crc kubenswrapper[4760]: I0930 07:50:34.799673 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-75644c8bb4-wrsmv" event={"ID":"8b39ba3e-25df-4a22-a1fe-f15e6ca1fada","Type":"ContainerStarted","Data":"27b31b4e67f9e40059f68e56e5f69c571f8d3000c8062c05281cdf97430df4c5"} Sep 30 07:50:34 crc kubenswrapper[4760]: I0930 07:50:34.802991 4760 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-6g2pm" podStartSLOduration=5.802978008 podStartE2EDuration="5.802978008s" podCreationTimestamp="2025-09-30 07:50:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2025-09-30 07:50:34.799736766 +0000 UTC m=+1020.442643168" watchObservedRunningTime="2025-09-30 07:50:34.802978008 +0000 UTC m=+1020.445884420" Sep 30 07:50:34 crc kubenswrapper[4760]: I0930 07:50:34.812104 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-8vvnb" event={"ID":"108a4c03-5bd3-45d0-a13d-b67e01bd7654","Type":"ContainerStarted","Data":"34df7a6fed2b13c4f77c2412b04d3044f9f81e01a69fa17b070335180992cc7c"} Sep 30 07:50:34 crc kubenswrapper[4760]: I0930 07:50:34.812139 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-8vvnb" event={"ID":"108a4c03-5bd3-45d0-a13d-b67e01bd7654","Type":"ContainerStarted","Data":"cf259968486ff404265fc3045cf68173ccb6c56c32ee9888c1ff8a40ae8a8525"} Sep 30 07:50:34 crc kubenswrapper[4760]: I0930 07:50:34.814550 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-decision-engine-0" event={"ID":"bd8762e1-d4b3-4999-996e-db79b881afec","Type":"ContainerStarted","Data":"f89cab0426974564e60181e9bbc821f482eb63d851410f3e9f8e576e886823d2"} Sep 30 07:50:34 crc kubenswrapper[4760]: I0930 07:50:34.817206 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-api-0" event={"ID":"f1e2802c-ec39-48ee-a6c4-b609a57aa0c5","Type":"ContainerStarted","Data":"f0aeb72fd3b5bc55a798002692855ffad1e3d949c0cd84e5255f5d2652bc2d77"} Sep 30 07:50:34 crc kubenswrapper[4760]: I0930 07:50:34.848962 4760 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-75644c8bb4-wrsmv" podStartSLOduration=10.84893845 podStartE2EDuration="10.84893845s" podCreationTimestamp="2025-09-30 07:50:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 07:50:34.821063789 +0000 UTC m=+1020.463970201" watchObservedRunningTime="2025-09-30 07:50:34.84893845 +0000 UTC m=+1020.491844862" Sep 30 07:50:34 crc kubenswrapper[4760]: 
I0930 07:50:34.896567 4760 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-db-sync-8vvnb" podStartSLOduration=7.896533793 podStartE2EDuration="7.896533793s" podCreationTimestamp="2025-09-30 07:50:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 07:50:34.85011953 +0000 UTC m=+1020.493025962" watchObservedRunningTime="2025-09-30 07:50:34.896533793 +0000 UTC m=+1020.539440215" Sep 30 07:50:35 crc kubenswrapper[4760]: I0930 07:50:35.415481 4760 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/watcher-api-0" podUID="f1e2802c-ec39-48ee-a6c4-b609a57aa0c5" containerName="watcher-api-log" probeResult="failure" output="Get \"http://10.217.0.154:9322/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Sep 30 07:50:35 crc kubenswrapper[4760]: I0930 07:50:35.733088 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-746f884dcc-8lkjw" Sep 30 07:50:35 crc kubenswrapper[4760]: I0930 07:50:35.827397 4760 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Sep 30 07:50:35 crc kubenswrapper[4760]: I0930 07:50:35.942455 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-5885b885f5-7v6mw" Sep 30 07:50:36 crc kubenswrapper[4760]: I0930 07:50:36.273165 4760 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Sep 30 07:50:36 crc kubenswrapper[4760]: I0930 07:50:36.356170 4760 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Sep 30 07:50:36 crc kubenswrapper[4760]: I0930 07:50:36.839333 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57c957c4ff-fl2rf" 
event={"ID":"c8a2a11e-bb27-4ee4-8ff9-91a043e33f1c","Type":"ContainerStarted","Data":"8cdc5ee661f4929bf9f0042161435f57608e5857963979360396e8870bd9a264"} Sep 30 07:50:36 crc kubenswrapper[4760]: I0930 07:50:36.840755 4760 generic.go:334] "Generic (PLEG): container finished" podID="4584a821-629e-4246-95d3-b84160a0f46c" containerID="7ff8179eb22158ab5265a3f91f89d11c80bc5c5c69813ac725d688b5bbaf9c3c" exitCode=0 Sep 30 07:50:36 crc kubenswrapper[4760]: I0930 07:50:36.840789 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-2npfw" event={"ID":"4584a821-629e-4246-95d3-b84160a0f46c","Type":"ContainerDied","Data":"7ff8179eb22158ab5265a3f91f89d11c80bc5c5c69813ac725d688b5bbaf9c3c"} Sep 30 07:50:37 crc kubenswrapper[4760]: I0930 07:50:37.420236 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/watcher-api-0" Sep 30 07:50:37 crc kubenswrapper[4760]: I0930 07:50:37.976092 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-6d547648b9-kxd9p" Sep 30 07:50:38 crc kubenswrapper[4760]: I0930 07:50:38.864401 4760 generic.go:334] "Generic (PLEG): container finished" podID="e8bc0e9c-dde5-4aa9-b199-0ad1e4e3d40c" containerID="cd34ed4a3c768321b03d3c69ae8fc0b614c7141d5b4eb8f21020250d5c60803a" exitCode=0 Sep 30 07:50:38 crc kubenswrapper[4760]: I0930 07:50:38.864591 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-6g2pm" event={"ID":"e8bc0e9c-dde5-4aa9-b199-0ad1e4e3d40c","Type":"ContainerDied","Data":"cd34ed4a3c768321b03d3c69ae8fc0b614c7141d5b4eb8f21020250d5c60803a"} Sep 30 07:50:40 crc kubenswrapper[4760]: I0930 07:50:40.890022 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-2npfw" event={"ID":"4584a821-629e-4246-95d3-b84160a0f46c","Type":"ContainerDied","Data":"4697db6b8dedf7f8401d4c04ddc763909ddbdbe1dcc30888f4966672fbec9aee"} Sep 30 07:50:40 crc kubenswrapper[4760]: I0930 07:50:40.890294 4760 
pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4697db6b8dedf7f8401d4c04ddc763909ddbdbe1dcc30888f4966672fbec9aee" Sep 30 07:50:40 crc kubenswrapper[4760]: I0930 07:50:40.897482 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-6g2pm" event={"ID":"e8bc0e9c-dde5-4aa9-b199-0ad1e4e3d40c","Type":"ContainerDied","Data":"5bb90939a53e928c2fb3552210f3a0ec7c29aa81e33180e3b9457a88a25d3725"} Sep 30 07:50:40 crc kubenswrapper[4760]: I0930 07:50:40.897519 4760 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5bb90939a53e928c2fb3552210f3a0ec7c29aa81e33180e3b9457a88a25d3725" Sep 30 07:50:40 crc kubenswrapper[4760]: I0930 07:50:40.915333 4760 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-2npfw" Sep 30 07:50:40 crc kubenswrapper[4760]: I0930 07:50:40.922252 4760 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-6g2pm" Sep 30 07:50:41 crc kubenswrapper[4760]: I0930 07:50:41.001813 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/e8bc0e9c-dde5-4aa9-b199-0ad1e4e3d40c-fernet-keys\") pod \"e8bc0e9c-dde5-4aa9-b199-0ad1e4e3d40c\" (UID: \"e8bc0e9c-dde5-4aa9-b199-0ad1e4e3d40c\") " Sep 30 07:50:41 crc kubenswrapper[4760]: I0930 07:50:41.001900 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4584a821-629e-4246-95d3-b84160a0f46c-logs\") pod \"4584a821-629e-4246-95d3-b84160a0f46c\" (UID: \"4584a821-629e-4246-95d3-b84160a0f46c\") " Sep 30 07:50:41 crc kubenswrapper[4760]: I0930 07:50:41.001960 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j4gh5\" (UniqueName: \"kubernetes.io/projected/4584a821-629e-4246-95d3-b84160a0f46c-kube-api-access-j4gh5\") 
pod \"4584a821-629e-4246-95d3-b84160a0f46c\" (UID: \"4584a821-629e-4246-95d3-b84160a0f46c\") " Sep 30 07:50:41 crc kubenswrapper[4760]: I0930 07:50:41.002365 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4584a821-629e-4246-95d3-b84160a0f46c-logs" (OuterVolumeSpecName: "logs") pod "4584a821-629e-4246-95d3-b84160a0f46c" (UID: "4584a821-629e-4246-95d3-b84160a0f46c"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 07:50:41 crc kubenswrapper[4760]: I0930 07:50:41.002926 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e8bc0e9c-dde5-4aa9-b199-0ad1e4e3d40c-scripts\") pod \"e8bc0e9c-dde5-4aa9-b199-0ad1e4e3d40c\" (UID: \"e8bc0e9c-dde5-4aa9-b199-0ad1e4e3d40c\") " Sep 30 07:50:41 crc kubenswrapper[4760]: I0930 07:50:41.002985 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4584a821-629e-4246-95d3-b84160a0f46c-combined-ca-bundle\") pod \"4584a821-629e-4246-95d3-b84160a0f46c\" (UID: \"4584a821-629e-4246-95d3-b84160a0f46c\") " Sep 30 07:50:41 crc kubenswrapper[4760]: I0930 07:50:41.003047 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7vf4z\" (UniqueName: \"kubernetes.io/projected/e8bc0e9c-dde5-4aa9-b199-0ad1e4e3d40c-kube-api-access-7vf4z\") pod \"e8bc0e9c-dde5-4aa9-b199-0ad1e4e3d40c\" (UID: \"e8bc0e9c-dde5-4aa9-b199-0ad1e4e3d40c\") " Sep 30 07:50:41 crc kubenswrapper[4760]: I0930 07:50:41.003088 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4584a821-629e-4246-95d3-b84160a0f46c-config-data\") pod \"4584a821-629e-4246-95d3-b84160a0f46c\" (UID: \"4584a821-629e-4246-95d3-b84160a0f46c\") " Sep 30 07:50:41 crc kubenswrapper[4760]: I0930 07:50:41.003132 4760 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/e8bc0e9c-dde5-4aa9-b199-0ad1e4e3d40c-credential-keys\") pod \"e8bc0e9c-dde5-4aa9-b199-0ad1e4e3d40c\" (UID: \"e8bc0e9c-dde5-4aa9-b199-0ad1e4e3d40c\") " Sep 30 07:50:41 crc kubenswrapper[4760]: I0930 07:50:41.003278 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4584a821-629e-4246-95d3-b84160a0f46c-scripts\") pod \"4584a821-629e-4246-95d3-b84160a0f46c\" (UID: \"4584a821-629e-4246-95d3-b84160a0f46c\") " Sep 30 07:50:41 crc kubenswrapper[4760]: I0930 07:50:41.003326 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e8bc0e9c-dde5-4aa9-b199-0ad1e4e3d40c-combined-ca-bundle\") pod \"e8bc0e9c-dde5-4aa9-b199-0ad1e4e3d40c\" (UID: \"e8bc0e9c-dde5-4aa9-b199-0ad1e4e3d40c\") " Sep 30 07:50:41 crc kubenswrapper[4760]: I0930 07:50:41.003369 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e8bc0e9c-dde5-4aa9-b199-0ad1e4e3d40c-config-data\") pod \"e8bc0e9c-dde5-4aa9-b199-0ad1e4e3d40c\" (UID: \"e8bc0e9c-dde5-4aa9-b199-0ad1e4e3d40c\") " Sep 30 07:50:41 crc kubenswrapper[4760]: I0930 07:50:41.003967 4760 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4584a821-629e-4246-95d3-b84160a0f46c-logs\") on node \"crc\" DevicePath \"\"" Sep 30 07:50:41 crc kubenswrapper[4760]: I0930 07:50:41.012527 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4584a821-629e-4246-95d3-b84160a0f46c-kube-api-access-j4gh5" (OuterVolumeSpecName: "kube-api-access-j4gh5") pod "4584a821-629e-4246-95d3-b84160a0f46c" (UID: "4584a821-629e-4246-95d3-b84160a0f46c"). InnerVolumeSpecName "kube-api-access-j4gh5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 07:50:41 crc kubenswrapper[4760]: I0930 07:50:41.012566 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e8bc0e9c-dde5-4aa9-b199-0ad1e4e3d40c-scripts" (OuterVolumeSpecName: "scripts") pod "e8bc0e9c-dde5-4aa9-b199-0ad1e4e3d40c" (UID: "e8bc0e9c-dde5-4aa9-b199-0ad1e4e3d40c"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 07:50:41 crc kubenswrapper[4760]: I0930 07:50:41.012600 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e8bc0e9c-dde5-4aa9-b199-0ad1e4e3d40c-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "e8bc0e9c-dde5-4aa9-b199-0ad1e4e3d40c" (UID: "e8bc0e9c-dde5-4aa9-b199-0ad1e4e3d40c"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 07:50:41 crc kubenswrapper[4760]: I0930 07:50:41.022445 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e8bc0e9c-dde5-4aa9-b199-0ad1e4e3d40c-kube-api-access-7vf4z" (OuterVolumeSpecName: "kube-api-access-7vf4z") pod "e8bc0e9c-dde5-4aa9-b199-0ad1e4e3d40c" (UID: "e8bc0e9c-dde5-4aa9-b199-0ad1e4e3d40c"). InnerVolumeSpecName "kube-api-access-7vf4z". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 07:50:41 crc kubenswrapper[4760]: I0930 07:50:41.022502 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4584a821-629e-4246-95d3-b84160a0f46c-scripts" (OuterVolumeSpecName: "scripts") pod "4584a821-629e-4246-95d3-b84160a0f46c" (UID: "4584a821-629e-4246-95d3-b84160a0f46c"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 07:50:41 crc kubenswrapper[4760]: I0930 07:50:41.027292 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e8bc0e9c-dde5-4aa9-b199-0ad1e4e3d40c-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "e8bc0e9c-dde5-4aa9-b199-0ad1e4e3d40c" (UID: "e8bc0e9c-dde5-4aa9-b199-0ad1e4e3d40c"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 07:50:41 crc kubenswrapper[4760]: I0930 07:50:41.074750 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4584a821-629e-4246-95d3-b84160a0f46c-config-data" (OuterVolumeSpecName: "config-data") pod "4584a821-629e-4246-95d3-b84160a0f46c" (UID: "4584a821-629e-4246-95d3-b84160a0f46c"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 07:50:41 crc kubenswrapper[4760]: I0930 07:50:41.074770 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e8bc0e9c-dde5-4aa9-b199-0ad1e4e3d40c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e8bc0e9c-dde5-4aa9-b199-0ad1e4e3d40c" (UID: "e8bc0e9c-dde5-4aa9-b199-0ad1e4e3d40c"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 07:50:41 crc kubenswrapper[4760]: I0930 07:50:41.085216 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e8bc0e9c-dde5-4aa9-b199-0ad1e4e3d40c-config-data" (OuterVolumeSpecName: "config-data") pod "e8bc0e9c-dde5-4aa9-b199-0ad1e4e3d40c" (UID: "e8bc0e9c-dde5-4aa9-b199-0ad1e4e3d40c"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 07:50:41 crc kubenswrapper[4760]: I0930 07:50:41.106014 4760 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/e8bc0e9c-dde5-4aa9-b199-0ad1e4e3d40c-fernet-keys\") on node \"crc\" DevicePath \"\"" Sep 30 07:50:41 crc kubenswrapper[4760]: I0930 07:50:41.106041 4760 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j4gh5\" (UniqueName: \"kubernetes.io/projected/4584a821-629e-4246-95d3-b84160a0f46c-kube-api-access-j4gh5\") on node \"crc\" DevicePath \"\"" Sep 30 07:50:41 crc kubenswrapper[4760]: I0930 07:50:41.106053 4760 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e8bc0e9c-dde5-4aa9-b199-0ad1e4e3d40c-scripts\") on node \"crc\" DevicePath \"\"" Sep 30 07:50:41 crc kubenswrapper[4760]: I0930 07:50:41.106062 4760 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7vf4z\" (UniqueName: \"kubernetes.io/projected/e8bc0e9c-dde5-4aa9-b199-0ad1e4e3d40c-kube-api-access-7vf4z\") on node \"crc\" DevicePath \"\"" Sep 30 07:50:41 crc kubenswrapper[4760]: I0930 07:50:41.106070 4760 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4584a821-629e-4246-95d3-b84160a0f46c-config-data\") on node \"crc\" DevicePath \"\"" Sep 30 07:50:41 crc kubenswrapper[4760]: I0930 07:50:41.106078 4760 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/e8bc0e9c-dde5-4aa9-b199-0ad1e4e3d40c-credential-keys\") on node \"crc\" DevicePath \"\"" Sep 30 07:50:41 crc kubenswrapper[4760]: I0930 07:50:41.106086 4760 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4584a821-629e-4246-95d3-b84160a0f46c-scripts\") on node \"crc\" DevicePath \"\"" Sep 30 07:50:41 crc kubenswrapper[4760]: I0930 07:50:41.106093 4760 reconciler_common.go:293] 
"Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e8bc0e9c-dde5-4aa9-b199-0ad1e4e3d40c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 07:50:41 crc kubenswrapper[4760]: I0930 07:50:41.106100 4760 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e8bc0e9c-dde5-4aa9-b199-0ad1e4e3d40c-config-data\") on node \"crc\" DevicePath \"\"" Sep 30 07:50:41 crc kubenswrapper[4760]: I0930 07:50:41.118500 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4584a821-629e-4246-95d3-b84160a0f46c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4584a821-629e-4246-95d3-b84160a0f46c" (UID: "4584a821-629e-4246-95d3-b84160a0f46c"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 07:50:41 crc kubenswrapper[4760]: I0930 07:50:41.215970 4760 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4584a821-629e-4246-95d3-b84160a0f46c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 07:50:41 crc kubenswrapper[4760]: I0930 07:50:41.551710 4760 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Sep 30 07:50:41 crc kubenswrapper[4760]: I0930 07:50:41.914736 4760 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-6g2pm" Sep 30 07:50:41 crc kubenswrapper[4760]: I0930 07:50:41.915520 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"8a3b79f1-3bbe-4418-9a81-4190a602dd5d","Type":"ContainerStarted","Data":"ed00058a5fde843bca2ea5750acc9a0c793259cd6244a13d4bf73c540bddf156"} Sep 30 07:50:41 crc kubenswrapper[4760]: I0930 07:50:41.915713 4760 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-sync-2npfw" Sep 30 07:50:42 crc kubenswrapper[4760]: I0930 07:50:42.186384 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-844b758db4-hzncj"] Sep 30 07:50:42 crc kubenswrapper[4760]: E0930 07:50:42.187277 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4584a821-629e-4246-95d3-b84160a0f46c" containerName="placement-db-sync" Sep 30 07:50:42 crc kubenswrapper[4760]: I0930 07:50:42.187322 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="4584a821-629e-4246-95d3-b84160a0f46c" containerName="placement-db-sync" Sep 30 07:50:42 crc kubenswrapper[4760]: E0930 07:50:42.187353 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e8bc0e9c-dde5-4aa9-b199-0ad1e4e3d40c" containerName="keystone-bootstrap" Sep 30 07:50:42 crc kubenswrapper[4760]: I0930 07:50:42.187365 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="e8bc0e9c-dde5-4aa9-b199-0ad1e4e3d40c" containerName="keystone-bootstrap" Sep 30 07:50:42 crc kubenswrapper[4760]: I0930 07:50:42.187599 4760 memory_manager.go:354] "RemoveStaleState removing state" podUID="e8bc0e9c-dde5-4aa9-b199-0ad1e4e3d40c" containerName="keystone-bootstrap" Sep 30 07:50:42 crc kubenswrapper[4760]: I0930 07:50:42.187629 4760 memory_manager.go:354] "RemoveStaleState removing state" podUID="4584a821-629e-4246-95d3-b84160a0f46c" containerName="placement-db-sync" Sep 30 07:50:42 crc kubenswrapper[4760]: I0930 07:50:42.188945 4760 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-844b758db4-hzncj" Sep 30 07:50:42 crc kubenswrapper[4760]: I0930 07:50:42.190770 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-844b758db4-hzncj"] Sep 30 07:50:42 crc kubenswrapper[4760]: I0930 07:50:42.198794 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Sep 30 07:50:42 crc kubenswrapper[4760]: I0930 07:50:42.199125 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Sep 30 07:50:42 crc kubenswrapper[4760]: I0930 07:50:42.199500 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-public-svc" Sep 30 07:50:42 crc kubenswrapper[4760]: I0930 07:50:42.199563 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-wrjwv" Sep 30 07:50:42 crc kubenswrapper[4760]: I0930 07:50:42.199723 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-internal-svc" Sep 30 07:50:42 crc kubenswrapper[4760]: I0930 07:50:42.202379 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-77987c8bb7-t2mw2"] Sep 30 07:50:42 crc kubenswrapper[4760]: I0930 07:50:42.204335 4760 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-77987c8bb7-t2mw2" Sep 30 07:50:42 crc kubenswrapper[4760]: I0930 07:50:42.212639 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-public-svc" Sep 30 07:50:42 crc kubenswrapper[4760]: I0930 07:50:42.213002 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-internal-svc" Sep 30 07:50:42 crc kubenswrapper[4760]: I0930 07:50:42.213202 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-gx9h7" Sep 30 07:50:42 crc kubenswrapper[4760]: I0930 07:50:42.213346 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Sep 30 07:50:42 crc kubenswrapper[4760]: I0930 07:50:42.213808 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Sep 30 07:50:42 crc kubenswrapper[4760]: I0930 07:50:42.216726 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Sep 30 07:50:42 crc kubenswrapper[4760]: I0930 07:50:42.227105 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-77987c8bb7-t2mw2"] Sep 30 07:50:42 crc kubenswrapper[4760]: I0930 07:50:42.290166 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/9d49cf7a-b821-4677-88fe-8fac1dbced63-public-tls-certs\") pod \"keystone-77987c8bb7-t2mw2\" (UID: \"9d49cf7a-b821-4677-88fe-8fac1dbced63\") " pod="openstack/keystone-77987c8bb7-t2mw2" Sep 30 07:50:42 crc kubenswrapper[4760]: I0930 07:50:42.290220 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9d49cf7a-b821-4677-88fe-8fac1dbced63-scripts\") pod \"keystone-77987c8bb7-t2mw2\" (UID: \"9d49cf7a-b821-4677-88fe-8fac1dbced63\") " 
pod="openstack/keystone-77987c8bb7-t2mw2" Sep 30 07:50:42 crc kubenswrapper[4760]: I0930 07:50:42.290246 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/9d49cf7a-b821-4677-88fe-8fac1dbced63-internal-tls-certs\") pod \"keystone-77987c8bb7-t2mw2\" (UID: \"9d49cf7a-b821-4677-88fe-8fac1dbced63\") " pod="openstack/keystone-77987c8bb7-t2mw2" Sep 30 07:50:42 crc kubenswrapper[4760]: I0930 07:50:42.290331 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s98fh\" (UniqueName: \"kubernetes.io/projected/9d49cf7a-b821-4677-88fe-8fac1dbced63-kube-api-access-s98fh\") pod \"keystone-77987c8bb7-t2mw2\" (UID: \"9d49cf7a-b821-4677-88fe-8fac1dbced63\") " pod="openstack/keystone-77987c8bb7-t2mw2" Sep 30 07:50:42 crc kubenswrapper[4760]: I0930 07:50:42.290352 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a6cfc37b-8ee0-4efe-a43f-b53bafbf4255-logs\") pod \"placement-844b758db4-hzncj\" (UID: \"a6cfc37b-8ee0-4efe-a43f-b53bafbf4255\") " pod="openstack/placement-844b758db4-hzncj" Sep 30 07:50:42 crc kubenswrapper[4760]: I0930 07:50:42.290382 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/9d49cf7a-b821-4677-88fe-8fac1dbced63-credential-keys\") pod \"keystone-77987c8bb7-t2mw2\" (UID: \"9d49cf7a-b821-4677-88fe-8fac1dbced63\") " pod="openstack/keystone-77987c8bb7-t2mw2" Sep 30 07:50:42 crc kubenswrapper[4760]: I0930 07:50:42.290427 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/9d49cf7a-b821-4677-88fe-8fac1dbced63-fernet-keys\") pod \"keystone-77987c8bb7-t2mw2\" (UID: \"9d49cf7a-b821-4677-88fe-8fac1dbced63\") 
" pod="openstack/keystone-77987c8bb7-t2mw2" Sep 30 07:50:42 crc kubenswrapper[4760]: I0930 07:50:42.290461 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a6cfc37b-8ee0-4efe-a43f-b53bafbf4255-config-data\") pod \"placement-844b758db4-hzncj\" (UID: \"a6cfc37b-8ee0-4efe-a43f-b53bafbf4255\") " pod="openstack/placement-844b758db4-hzncj" Sep 30 07:50:42 crc kubenswrapper[4760]: I0930 07:50:42.290529 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a6cfc37b-8ee0-4efe-a43f-b53bafbf4255-internal-tls-certs\") pod \"placement-844b758db4-hzncj\" (UID: \"a6cfc37b-8ee0-4efe-a43f-b53bafbf4255\") " pod="openstack/placement-844b758db4-hzncj" Sep 30 07:50:42 crc kubenswrapper[4760]: I0930 07:50:42.290553 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a6cfc37b-8ee0-4efe-a43f-b53bafbf4255-public-tls-certs\") pod \"placement-844b758db4-hzncj\" (UID: \"a6cfc37b-8ee0-4efe-a43f-b53bafbf4255\") " pod="openstack/placement-844b758db4-hzncj" Sep 30 07:50:42 crc kubenswrapper[4760]: I0930 07:50:42.290575 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pldk8\" (UniqueName: \"kubernetes.io/projected/a6cfc37b-8ee0-4efe-a43f-b53bafbf4255-kube-api-access-pldk8\") pod \"placement-844b758db4-hzncj\" (UID: \"a6cfc37b-8ee0-4efe-a43f-b53bafbf4255\") " pod="openstack/placement-844b758db4-hzncj" Sep 30 07:50:42 crc kubenswrapper[4760]: I0930 07:50:42.290598 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9d49cf7a-b821-4677-88fe-8fac1dbced63-config-data\") pod \"keystone-77987c8bb7-t2mw2\" (UID: 
\"9d49cf7a-b821-4677-88fe-8fac1dbced63\") " pod="openstack/keystone-77987c8bb7-t2mw2" Sep 30 07:50:42 crc kubenswrapper[4760]: I0930 07:50:42.290646 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9d49cf7a-b821-4677-88fe-8fac1dbced63-combined-ca-bundle\") pod \"keystone-77987c8bb7-t2mw2\" (UID: \"9d49cf7a-b821-4677-88fe-8fac1dbced63\") " pod="openstack/keystone-77987c8bb7-t2mw2" Sep 30 07:50:42 crc kubenswrapper[4760]: I0930 07:50:42.290691 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a6cfc37b-8ee0-4efe-a43f-b53bafbf4255-combined-ca-bundle\") pod \"placement-844b758db4-hzncj\" (UID: \"a6cfc37b-8ee0-4efe-a43f-b53bafbf4255\") " pod="openstack/placement-844b758db4-hzncj" Sep 30 07:50:42 crc kubenswrapper[4760]: I0930 07:50:42.290737 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a6cfc37b-8ee0-4efe-a43f-b53bafbf4255-scripts\") pod \"placement-844b758db4-hzncj\" (UID: \"a6cfc37b-8ee0-4efe-a43f-b53bafbf4255\") " pod="openstack/placement-844b758db4-hzncj" Sep 30 07:50:42 crc kubenswrapper[4760]: I0930 07:50:42.395276 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9d49cf7a-b821-4677-88fe-8fac1dbced63-config-data\") pod \"keystone-77987c8bb7-t2mw2\" (UID: \"9d49cf7a-b821-4677-88fe-8fac1dbced63\") " pod="openstack/keystone-77987c8bb7-t2mw2" Sep 30 07:50:42 crc kubenswrapper[4760]: I0930 07:50:42.395355 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9d49cf7a-b821-4677-88fe-8fac1dbced63-combined-ca-bundle\") pod \"keystone-77987c8bb7-t2mw2\" (UID: 
\"9d49cf7a-b821-4677-88fe-8fac1dbced63\") " pod="openstack/keystone-77987c8bb7-t2mw2" Sep 30 07:50:42 crc kubenswrapper[4760]: I0930 07:50:42.395422 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a6cfc37b-8ee0-4efe-a43f-b53bafbf4255-combined-ca-bundle\") pod \"placement-844b758db4-hzncj\" (UID: \"a6cfc37b-8ee0-4efe-a43f-b53bafbf4255\") " pod="openstack/placement-844b758db4-hzncj" Sep 30 07:50:42 crc kubenswrapper[4760]: I0930 07:50:42.395437 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a6cfc37b-8ee0-4efe-a43f-b53bafbf4255-scripts\") pod \"placement-844b758db4-hzncj\" (UID: \"a6cfc37b-8ee0-4efe-a43f-b53bafbf4255\") " pod="openstack/placement-844b758db4-hzncj" Sep 30 07:50:42 crc kubenswrapper[4760]: I0930 07:50:42.395455 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/9d49cf7a-b821-4677-88fe-8fac1dbced63-public-tls-certs\") pod \"keystone-77987c8bb7-t2mw2\" (UID: \"9d49cf7a-b821-4677-88fe-8fac1dbced63\") " pod="openstack/keystone-77987c8bb7-t2mw2" Sep 30 07:50:42 crc kubenswrapper[4760]: I0930 07:50:42.395479 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9d49cf7a-b821-4677-88fe-8fac1dbced63-scripts\") pod \"keystone-77987c8bb7-t2mw2\" (UID: \"9d49cf7a-b821-4677-88fe-8fac1dbced63\") " pod="openstack/keystone-77987c8bb7-t2mw2" Sep 30 07:50:42 crc kubenswrapper[4760]: I0930 07:50:42.395496 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/9d49cf7a-b821-4677-88fe-8fac1dbced63-internal-tls-certs\") pod \"keystone-77987c8bb7-t2mw2\" (UID: \"9d49cf7a-b821-4677-88fe-8fac1dbced63\") " pod="openstack/keystone-77987c8bb7-t2mw2" Sep 30 07:50:42 
crc kubenswrapper[4760]: I0930 07:50:42.395517 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s98fh\" (UniqueName: \"kubernetes.io/projected/9d49cf7a-b821-4677-88fe-8fac1dbced63-kube-api-access-s98fh\") pod \"keystone-77987c8bb7-t2mw2\" (UID: \"9d49cf7a-b821-4677-88fe-8fac1dbced63\") " pod="openstack/keystone-77987c8bb7-t2mw2" Sep 30 07:50:42 crc kubenswrapper[4760]: I0930 07:50:42.395533 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a6cfc37b-8ee0-4efe-a43f-b53bafbf4255-logs\") pod \"placement-844b758db4-hzncj\" (UID: \"a6cfc37b-8ee0-4efe-a43f-b53bafbf4255\") " pod="openstack/placement-844b758db4-hzncj" Sep 30 07:50:42 crc kubenswrapper[4760]: I0930 07:50:42.395556 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/9d49cf7a-b821-4677-88fe-8fac1dbced63-credential-keys\") pod \"keystone-77987c8bb7-t2mw2\" (UID: \"9d49cf7a-b821-4677-88fe-8fac1dbced63\") " pod="openstack/keystone-77987c8bb7-t2mw2" Sep 30 07:50:42 crc kubenswrapper[4760]: I0930 07:50:42.395592 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/9d49cf7a-b821-4677-88fe-8fac1dbced63-fernet-keys\") pod \"keystone-77987c8bb7-t2mw2\" (UID: \"9d49cf7a-b821-4677-88fe-8fac1dbced63\") " pod="openstack/keystone-77987c8bb7-t2mw2" Sep 30 07:50:42 crc kubenswrapper[4760]: I0930 07:50:42.395617 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a6cfc37b-8ee0-4efe-a43f-b53bafbf4255-config-data\") pod \"placement-844b758db4-hzncj\" (UID: \"a6cfc37b-8ee0-4efe-a43f-b53bafbf4255\") " pod="openstack/placement-844b758db4-hzncj" Sep 30 07:50:42 crc kubenswrapper[4760]: I0930 07:50:42.395665 4760 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a6cfc37b-8ee0-4efe-a43f-b53bafbf4255-internal-tls-certs\") pod \"placement-844b758db4-hzncj\" (UID: \"a6cfc37b-8ee0-4efe-a43f-b53bafbf4255\") " pod="openstack/placement-844b758db4-hzncj" Sep 30 07:50:42 crc kubenswrapper[4760]: I0930 07:50:42.395688 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a6cfc37b-8ee0-4efe-a43f-b53bafbf4255-public-tls-certs\") pod \"placement-844b758db4-hzncj\" (UID: \"a6cfc37b-8ee0-4efe-a43f-b53bafbf4255\") " pod="openstack/placement-844b758db4-hzncj" Sep 30 07:50:42 crc kubenswrapper[4760]: I0930 07:50:42.395704 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pldk8\" (UniqueName: \"kubernetes.io/projected/a6cfc37b-8ee0-4efe-a43f-b53bafbf4255-kube-api-access-pldk8\") pod \"placement-844b758db4-hzncj\" (UID: \"a6cfc37b-8ee0-4efe-a43f-b53bafbf4255\") " pod="openstack/placement-844b758db4-hzncj" Sep 30 07:50:42 crc kubenswrapper[4760]: I0930 07:50:42.416492 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a6cfc37b-8ee0-4efe-a43f-b53bafbf4255-combined-ca-bundle\") pod \"placement-844b758db4-hzncj\" (UID: \"a6cfc37b-8ee0-4efe-a43f-b53bafbf4255\") " pod="openstack/placement-844b758db4-hzncj" Sep 30 07:50:42 crc kubenswrapper[4760]: I0930 07:50:42.416753 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a6cfc37b-8ee0-4efe-a43f-b53bafbf4255-logs\") pod \"placement-844b758db4-hzncj\" (UID: \"a6cfc37b-8ee0-4efe-a43f-b53bafbf4255\") " pod="openstack/placement-844b758db4-hzncj" Sep 30 07:50:42 crc kubenswrapper[4760]: I0930 07:50:42.417975 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: 
\"kubernetes.io/secret/9d49cf7a-b821-4677-88fe-8fac1dbced63-fernet-keys\") pod \"keystone-77987c8bb7-t2mw2\" (UID: \"9d49cf7a-b821-4677-88fe-8fac1dbced63\") " pod="openstack/keystone-77987c8bb7-t2mw2" Sep 30 07:50:42 crc kubenswrapper[4760]: I0930 07:50:42.422464 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9d49cf7a-b821-4677-88fe-8fac1dbced63-scripts\") pod \"keystone-77987c8bb7-t2mw2\" (UID: \"9d49cf7a-b821-4677-88fe-8fac1dbced63\") " pod="openstack/keystone-77987c8bb7-t2mw2" Sep 30 07:50:42 crc kubenswrapper[4760]: I0930 07:50:42.422923 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a6cfc37b-8ee0-4efe-a43f-b53bafbf4255-config-data\") pod \"placement-844b758db4-hzncj\" (UID: \"a6cfc37b-8ee0-4efe-a43f-b53bafbf4255\") " pod="openstack/placement-844b758db4-hzncj" Sep 30 07:50:42 crc kubenswrapper[4760]: I0930 07:50:42.432017 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a6cfc37b-8ee0-4efe-a43f-b53bafbf4255-scripts\") pod \"placement-844b758db4-hzncj\" (UID: \"a6cfc37b-8ee0-4efe-a43f-b53bafbf4255\") " pod="openstack/placement-844b758db4-hzncj" Sep 30 07:50:42 crc kubenswrapper[4760]: I0930 07:50:42.432605 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/9d49cf7a-b821-4677-88fe-8fac1dbced63-public-tls-certs\") pod \"keystone-77987c8bb7-t2mw2\" (UID: \"9d49cf7a-b821-4677-88fe-8fac1dbced63\") " pod="openstack/keystone-77987c8bb7-t2mw2" Sep 30 07:50:42 crc kubenswrapper[4760]: I0930 07:50:42.432942 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a6cfc37b-8ee0-4efe-a43f-b53bafbf4255-internal-tls-certs\") pod \"placement-844b758db4-hzncj\" (UID: \"a6cfc37b-8ee0-4efe-a43f-b53bafbf4255\") 
" pod="openstack/placement-844b758db4-hzncj" Sep 30 07:50:42 crc kubenswrapper[4760]: I0930 07:50:42.433252 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a6cfc37b-8ee0-4efe-a43f-b53bafbf4255-public-tls-certs\") pod \"placement-844b758db4-hzncj\" (UID: \"a6cfc37b-8ee0-4efe-a43f-b53bafbf4255\") " pod="openstack/placement-844b758db4-hzncj" Sep 30 07:50:42 crc kubenswrapper[4760]: I0930 07:50:42.433514 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/9d49cf7a-b821-4677-88fe-8fac1dbced63-internal-tls-certs\") pod \"keystone-77987c8bb7-t2mw2\" (UID: \"9d49cf7a-b821-4677-88fe-8fac1dbced63\") " pod="openstack/keystone-77987c8bb7-t2mw2" Sep 30 07:50:42 crc kubenswrapper[4760]: I0930 07:50:42.433691 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9d49cf7a-b821-4677-88fe-8fac1dbced63-combined-ca-bundle\") pod \"keystone-77987c8bb7-t2mw2\" (UID: \"9d49cf7a-b821-4677-88fe-8fac1dbced63\") " pod="openstack/keystone-77987c8bb7-t2mw2" Sep 30 07:50:42 crc kubenswrapper[4760]: I0930 07:50:42.434033 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9d49cf7a-b821-4677-88fe-8fac1dbced63-config-data\") pod \"keystone-77987c8bb7-t2mw2\" (UID: \"9d49cf7a-b821-4677-88fe-8fac1dbced63\") " pod="openstack/keystone-77987c8bb7-t2mw2" Sep 30 07:50:42 crc kubenswrapper[4760]: I0930 07:50:42.434184 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pldk8\" (UniqueName: \"kubernetes.io/projected/a6cfc37b-8ee0-4efe-a43f-b53bafbf4255-kube-api-access-pldk8\") pod \"placement-844b758db4-hzncj\" (UID: \"a6cfc37b-8ee0-4efe-a43f-b53bafbf4255\") " pod="openstack/placement-844b758db4-hzncj" Sep 30 07:50:42 crc kubenswrapper[4760]: I0930 
07:50:42.434560 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/9d49cf7a-b821-4677-88fe-8fac1dbced63-credential-keys\") pod \"keystone-77987c8bb7-t2mw2\" (UID: \"9d49cf7a-b821-4677-88fe-8fac1dbced63\") " pod="openstack/keystone-77987c8bb7-t2mw2" Sep 30 07:50:42 crc kubenswrapper[4760]: I0930 07:50:42.446369 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s98fh\" (UniqueName: \"kubernetes.io/projected/9d49cf7a-b821-4677-88fe-8fac1dbced63-kube-api-access-s98fh\") pod \"keystone-77987c8bb7-t2mw2\" (UID: \"9d49cf7a-b821-4677-88fe-8fac1dbced63\") " pod="openstack/keystone-77987c8bb7-t2mw2" Sep 30 07:50:42 crc kubenswrapper[4760]: I0930 07:50:42.593899 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-844b758db4-hzncj" Sep 30 07:50:42 crc kubenswrapper[4760]: I0930 07:50:42.603372 4760 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Sep 30 07:50:42 crc kubenswrapper[4760]: I0930 07:50:42.610778 4760 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-77987c8bb7-t2mw2" Sep 30 07:50:42 crc kubenswrapper[4760]: I0930 07:50:42.954647 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-f6j6q" event={"ID":"1df67641-4598-4ba5-a59a-a195084e5446","Type":"ContainerStarted","Data":"b441e987ef7decf049664c413470ea11bd7c6b9be8c56175022120bb2f98ef6f"} Sep 30 07:50:42 crc kubenswrapper[4760]: I0930 07:50:42.990003 4760 generic.go:334] "Generic (PLEG): container finished" podID="c8a2a11e-bb27-4ee4-8ff9-91a043e33f1c" containerID="887bc79c81f53af4078394c397831514beae81abf6e2314309f726c5d5db7a4a" exitCode=0 Sep 30 07:50:42 crc kubenswrapper[4760]: I0930 07:50:42.990096 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57c957c4ff-fl2rf" event={"ID":"c8a2a11e-bb27-4ee4-8ff9-91a043e33f1c","Type":"ContainerDied","Data":"887bc79c81f53af4078394c397831514beae81abf6e2314309f726c5d5db7a4a"} Sep 30 07:50:43 crc kubenswrapper[4760]: I0930 07:50:43.012154 4760 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-db-sync-f6j6q" podStartSLOduration=7.235521449 podStartE2EDuration="16.012132252s" podCreationTimestamp="2025-09-30 07:50:27 +0000 UTC" firstStartedPulling="2025-09-30 07:50:33.106342831 +0000 UTC m=+1018.749249243" lastFinishedPulling="2025-09-30 07:50:41.882953634 +0000 UTC m=+1027.525860046" observedRunningTime="2025-09-30 07:50:42.979615503 +0000 UTC m=+1028.622521915" watchObservedRunningTime="2025-09-30 07:50:43.012132252 +0000 UTC m=+1028.655038664" Sep 30 07:50:43 crc kubenswrapper[4760]: I0930 07:50:43.029513 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-applier-0" event={"ID":"7c12cbe5-a6fb-4ead-bb65-cd13dab410ce","Type":"ContainerStarted","Data":"87f8a4b3a3c10f19e9b6079e808b7e0c0e075691d0534cd9207a4a4b36c9603c"} Sep 30 07:50:43 crc kubenswrapper[4760]: I0930 07:50:43.033003 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/glance-default-internal-api-0" event={"ID":"c588f954-a6b5-413f-bcc8-dbfe4c660d1b","Type":"ContainerStarted","Data":"ef9d2be6e5aa8cb925213a3b56f37367a3e3ce662a468e00b2da2bda3a7274f7"} Sep 30 07:50:43 crc kubenswrapper[4760]: I0930 07:50:43.036614 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-decision-engine-0" event={"ID":"bd8762e1-d4b3-4999-996e-db79b881afec","Type":"ContainerStarted","Data":"29f139ab93d44d0a73984470f558ba725ce037a4502421536c0f4d3bf6bde826"} Sep 30 07:50:43 crc kubenswrapper[4760]: I0930 07:50:43.045941 4760 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/watcher-applier-0" podStartSLOduration=10.575652171 podStartE2EDuration="19.045924633s" podCreationTimestamp="2025-09-30 07:50:24 +0000 UTC" firstStartedPulling="2025-09-30 07:50:33.341584949 +0000 UTC m=+1018.984491361" lastFinishedPulling="2025-09-30 07:50:41.811857381 +0000 UTC m=+1027.454763823" observedRunningTime="2025-09-30 07:50:43.04342601 +0000 UTC m=+1028.686332422" watchObservedRunningTime="2025-09-30 07:50:43.045924633 +0000 UTC m=+1028.688831045" Sep 30 07:50:43 crc kubenswrapper[4760]: I0930 07:50:43.099786 4760 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/watcher-decision-engine-0" podStartSLOduration=10.837152779 podStartE2EDuration="19.099766316s" podCreationTimestamp="2025-09-30 07:50:24 +0000 UTC" firstStartedPulling="2025-09-30 07:50:33.546975756 +0000 UTC m=+1019.189882168" lastFinishedPulling="2025-09-30 07:50:41.809589263 +0000 UTC m=+1027.452495705" observedRunningTime="2025-09-30 07:50:43.068335825 +0000 UTC m=+1028.711242257" watchObservedRunningTime="2025-09-30 07:50:43.099766316 +0000 UTC m=+1028.742672728" Sep 30 07:50:43 crc kubenswrapper[4760]: I0930 07:50:43.137488 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"7a555f66-7027-4fac-afcc-db7b3f5ae034","Type":"ContainerStarted","Data":"d4afbc82bd27bf72ecee192b66554d47b2560bf5bd028bf7135ba8e44fb51690"} Sep 30 07:50:43 crc kubenswrapper[4760]: I0930 07:50:43.326772 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-844b758db4-hzncj"] Sep 30 07:50:43 crc kubenswrapper[4760]: I0930 07:50:43.589952 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-77987c8bb7-t2mw2"] Sep 30 07:50:44 crc kubenswrapper[4760]: I0930 07:50:44.111505 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"8a3b79f1-3bbe-4418-9a81-4190a602dd5d","Type":"ContainerStarted","Data":"0c02b923b1bc6811bca16a8db1ba05951c985fbaa73046709d6d23369e97b202"} Sep 30 07:50:44 crc kubenswrapper[4760]: I0930 07:50:44.123651 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57c957c4ff-fl2rf" event={"ID":"c8a2a11e-bb27-4ee4-8ff9-91a043e33f1c","Type":"ContainerStarted","Data":"b186dab5b4052d583a6463fe37374ed0f3bd809879e604685545e4398cd76d19"} Sep 30 07:50:44 crc kubenswrapper[4760]: I0930 07:50:44.124897 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-57c957c4ff-fl2rf" Sep 30 07:50:44 crc kubenswrapper[4760]: I0930 07:50:44.127119 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-844b758db4-hzncj" event={"ID":"a6cfc37b-8ee0-4efe-a43f-b53bafbf4255","Type":"ContainerStarted","Data":"648fadf5288fefc39b1fb9d9e4b23863b35062bb347949f6822bc2286dfac6b3"} Sep 30 07:50:44 crc kubenswrapper[4760]: I0930 07:50:44.127155 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-844b758db4-hzncj" event={"ID":"a6cfc37b-8ee0-4efe-a43f-b53bafbf4255","Type":"ContainerStarted","Data":"6a652d8cb2cc77a2b003c37d26160411eb98643065393a6c14a5525a6f31bc51"} Sep 30 07:50:44 crc kubenswrapper[4760]: I0930 07:50:44.133272 4760 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/keystone-77987c8bb7-t2mw2" event={"ID":"9d49cf7a-b821-4677-88fe-8fac1dbced63","Type":"ContainerStarted","Data":"c3007b05b3742a6f10192598f63f90933ec59ff553cb2220da9c4bab634b3231"} Sep 30 07:50:44 crc kubenswrapper[4760]: I0930 07:50:44.145198 4760 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-57c957c4ff-fl2rf" podStartSLOduration=11.1451816 podStartE2EDuration="11.1451816s" podCreationTimestamp="2025-09-30 07:50:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 07:50:44.140556552 +0000 UTC m=+1029.783462974" watchObservedRunningTime="2025-09-30 07:50:44.1451816 +0000 UTC m=+1029.788088012" Sep 30 07:50:44 crc kubenswrapper[4760]: I0930 07:50:44.378264 4760 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/watcher-api-0" Sep 30 07:50:44 crc kubenswrapper[4760]: I0930 07:50:44.385113 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/watcher-api-0" Sep 30 07:50:44 crc kubenswrapper[4760]: I0930 07:50:44.412825 4760 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/watcher-decision-engine-0" Sep 30 07:50:44 crc kubenswrapper[4760]: I0930 07:50:44.486691 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/watcher-applier-0" Sep 30 07:50:44 crc kubenswrapper[4760]: I0930 07:50:44.486744 4760 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/watcher-applier-0" Sep 30 07:50:44 crc kubenswrapper[4760]: I0930 07:50:44.559325 4760 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/watcher-decision-engine-0" Sep 30 07:50:44 crc kubenswrapper[4760]: I0930 07:50:44.594015 4760 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/watcher-applier-0" 
Sep 30 07:50:44 crc kubenswrapper[4760]: I0930 07:50:44.715599 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-694db87c64-qrwhp" Sep 30 07:50:44 crc kubenswrapper[4760]: I0930 07:50:44.715964 4760 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-694db87c64-qrwhp" Sep 30 07:50:44 crc kubenswrapper[4760]: I0930 07:50:44.725884 4760 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-694db87c64-qrwhp" podUID="e28d9015-ea18-4da0-bfd8-c2cc5dec4f37" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.157:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.157:8443: connect: connection refused" Sep 30 07:50:44 crc kubenswrapper[4760]: I0930 07:50:44.778329 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-75644c8bb4-wrsmv" Sep 30 07:50:44 crc kubenswrapper[4760]: I0930 07:50:44.778382 4760 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-75644c8bb4-wrsmv" Sep 30 07:50:44 crc kubenswrapper[4760]: I0930 07:50:44.782220 4760 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-75644c8bb4-wrsmv" podUID="8b39ba3e-25df-4a22-a1fe-f15e6ca1fada" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.158:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.158:8443: connect: connection refused" Sep 30 07:50:45 crc kubenswrapper[4760]: I0930 07:50:45.175625 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"8a3b79f1-3bbe-4418-9a81-4190a602dd5d","Type":"ContainerStarted","Data":"1e8e3e79719309f34e6554e6c8f0f765444bd702081e37346359ef0cce60c306"} Sep 30 07:50:45 crc kubenswrapper[4760]: I0930 07:50:45.176636 4760 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" 
podUID="8a3b79f1-3bbe-4418-9a81-4190a602dd5d" containerName="glance-log" containerID="cri-o://0c02b923b1bc6811bca16a8db1ba05951c985fbaa73046709d6d23369e97b202" gracePeriod=30 Sep 30 07:50:45 crc kubenswrapper[4760]: I0930 07:50:45.177276 4760 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="8a3b79f1-3bbe-4418-9a81-4190a602dd5d" containerName="glance-httpd" containerID="cri-o://1e8e3e79719309f34e6554e6c8f0f765444bd702081e37346359ef0cce60c306" gracePeriod=30 Sep 30 07:50:45 crc kubenswrapper[4760]: I0930 07:50:45.193447 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"c588f954-a6b5-413f-bcc8-dbfe4c660d1b","Type":"ContainerStarted","Data":"1b7a1cd447b1a1865709f8a7edfad76422b4f9a2d7fee9eb718cfc11a99f68ce"} Sep 30 07:50:45 crc kubenswrapper[4760]: I0930 07:50:45.193653 4760 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="c588f954-a6b5-413f-bcc8-dbfe4c660d1b" containerName="glance-httpd" containerID="cri-o://093cefb45788d10e4b3981293660f4ed11758f770edd6a8e4b4ad49f359f4aa9" gracePeriod=30 Sep 30 07:50:45 crc kubenswrapper[4760]: I0930 07:50:45.193643 4760 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="c588f954-a6b5-413f-bcc8-dbfe4c660d1b" containerName="glance-log" containerID="cri-o://1b7a1cd447b1a1865709f8a7edfad76422b4f9a2d7fee9eb718cfc11a99f68ce" gracePeriod=30 Sep 30 07:50:45 crc kubenswrapper[4760]: I0930 07:50:45.211234 4760 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=13.211212099 podStartE2EDuration="13.211212099s" podCreationTimestamp="2025-09-30 07:50:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2025-09-30 07:50:45.19518526 +0000 UTC m=+1030.838091672" watchObservedRunningTime="2025-09-30 07:50:45.211212099 +0000 UTC m=+1030.854118511" Sep 30 07:50:45 crc kubenswrapper[4760]: I0930 07:50:45.223325 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-844b758db4-hzncj" event={"ID":"a6cfc37b-8ee0-4efe-a43f-b53bafbf4255","Type":"ContainerStarted","Data":"446b064845665ae8614e7f8249c2f003d502f0592044558ccfec25a673481987"} Sep 30 07:50:45 crc kubenswrapper[4760]: I0930 07:50:45.224606 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-844b758db4-hzncj" Sep 30 07:50:45 crc kubenswrapper[4760]: I0930 07:50:45.224634 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-844b758db4-hzncj" Sep 30 07:50:45 crc kubenswrapper[4760]: I0930 07:50:45.227285 4760 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=12.227269368 podStartE2EDuration="12.227269368s" podCreationTimestamp="2025-09-30 07:50:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 07:50:45.222031264 +0000 UTC m=+1030.864937676" watchObservedRunningTime="2025-09-30 07:50:45.227269368 +0000 UTC m=+1030.870175780" Sep 30 07:50:45 crc kubenswrapper[4760]: I0930 07:50:45.238855 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-77987c8bb7-t2mw2" event={"ID":"9d49cf7a-b821-4677-88fe-8fac1dbced63","Type":"ContainerStarted","Data":"ededf4c42334934cbcd921e674da6e8e5455a3e6f2be4634300c72b6c7d40305"} Sep 30 07:50:45 crc kubenswrapper[4760]: I0930 07:50:45.238897 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/watcher-decision-engine-0" Sep 30 07:50:45 crc kubenswrapper[4760]: I0930 07:50:45.238908 4760 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="" pod="openstack/keystone-77987c8bb7-t2mw2" Sep 30 07:50:45 crc kubenswrapper[4760]: I0930 07:50:45.254324 4760 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-844b758db4-hzncj" podStartSLOduration=3.254290117 podStartE2EDuration="3.254290117s" podCreationTimestamp="2025-09-30 07:50:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 07:50:45.242736282 +0000 UTC m=+1030.885642694" watchObservedRunningTime="2025-09-30 07:50:45.254290117 +0000 UTC m=+1030.897196529" Sep 30 07:50:45 crc kubenswrapper[4760]: I0930 07:50:45.287692 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/watcher-applier-0" Sep 30 07:50:45 crc kubenswrapper[4760]: I0930 07:50:45.326196 4760 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-77987c8bb7-t2mw2" podStartSLOduration=3.32617901 podStartE2EDuration="3.32617901s" podCreationTimestamp="2025-09-30 07:50:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 07:50:45.2854032 +0000 UTC m=+1030.928309622" watchObservedRunningTime="2025-09-30 07:50:45.32617901 +0000 UTC m=+1030.969085422" Sep 30 07:50:45 crc kubenswrapper[4760]: I0930 07:50:45.399776 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/watcher-decision-engine-0" Sep 30 07:50:45 crc kubenswrapper[4760]: I0930 07:50:45.963018 4760 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Sep 30 07:50:46 crc kubenswrapper[4760]: I0930 07:50:46.066966 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c588f954-a6b5-413f-bcc8-dbfe4c660d1b-combined-ca-bundle\") pod \"c588f954-a6b5-413f-bcc8-dbfe4c660d1b\" (UID: \"c588f954-a6b5-413f-bcc8-dbfe4c660d1b\") " Sep 30 07:50:46 crc kubenswrapper[4760]: I0930 07:50:46.067016 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c588f954-a6b5-413f-bcc8-dbfe4c660d1b-config-data\") pod \"c588f954-a6b5-413f-bcc8-dbfe4c660d1b\" (UID: \"c588f954-a6b5-413f-bcc8-dbfe4c660d1b\") " Sep 30 07:50:46 crc kubenswrapper[4760]: I0930 07:50:46.067083 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"c588f954-a6b5-413f-bcc8-dbfe4c660d1b\" (UID: \"c588f954-a6b5-413f-bcc8-dbfe4c660d1b\") " Sep 30 07:50:46 crc kubenswrapper[4760]: I0930 07:50:46.067118 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c588f954-a6b5-413f-bcc8-dbfe4c660d1b-logs\") pod \"c588f954-a6b5-413f-bcc8-dbfe4c660d1b\" (UID: \"c588f954-a6b5-413f-bcc8-dbfe4c660d1b\") " Sep 30 07:50:46 crc kubenswrapper[4760]: I0930 07:50:46.067250 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c588f954-a6b5-413f-bcc8-dbfe4c660d1b-scripts\") pod \"c588f954-a6b5-413f-bcc8-dbfe4c660d1b\" (UID: \"c588f954-a6b5-413f-bcc8-dbfe4c660d1b\") " Sep 30 07:50:46 crc kubenswrapper[4760]: I0930 07:50:46.067296 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m2ksk\" (UniqueName: 
\"kubernetes.io/projected/c588f954-a6b5-413f-bcc8-dbfe4c660d1b-kube-api-access-m2ksk\") pod \"c588f954-a6b5-413f-bcc8-dbfe4c660d1b\" (UID: \"c588f954-a6b5-413f-bcc8-dbfe4c660d1b\") " Sep 30 07:50:46 crc kubenswrapper[4760]: I0930 07:50:46.067340 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/c588f954-a6b5-413f-bcc8-dbfe4c660d1b-httpd-run\") pod \"c588f954-a6b5-413f-bcc8-dbfe4c660d1b\" (UID: \"c588f954-a6b5-413f-bcc8-dbfe4c660d1b\") " Sep 30 07:50:46 crc kubenswrapper[4760]: I0930 07:50:46.068474 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c588f954-a6b5-413f-bcc8-dbfe4c660d1b-logs" (OuterVolumeSpecName: "logs") pod "c588f954-a6b5-413f-bcc8-dbfe4c660d1b" (UID: "c588f954-a6b5-413f-bcc8-dbfe4c660d1b"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 07:50:46 crc kubenswrapper[4760]: I0930 07:50:46.068696 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c588f954-a6b5-413f-bcc8-dbfe4c660d1b-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "c588f954-a6b5-413f-bcc8-dbfe4c660d1b" (UID: "c588f954-a6b5-413f-bcc8-dbfe4c660d1b"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 07:50:46 crc kubenswrapper[4760]: I0930 07:50:46.086036 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c588f954-a6b5-413f-bcc8-dbfe4c660d1b-scripts" (OuterVolumeSpecName: "scripts") pod "c588f954-a6b5-413f-bcc8-dbfe4c660d1b" (UID: "c588f954-a6b5-413f-bcc8-dbfe4c660d1b"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 07:50:46 crc kubenswrapper[4760]: I0930 07:50:46.111427 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage06-crc" (OuterVolumeSpecName: "glance") pod "c588f954-a6b5-413f-bcc8-dbfe4c660d1b" (UID: "c588f954-a6b5-413f-bcc8-dbfe4c660d1b"). InnerVolumeSpecName "local-storage06-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Sep 30 07:50:46 crc kubenswrapper[4760]: I0930 07:50:46.111563 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c588f954-a6b5-413f-bcc8-dbfe4c660d1b-kube-api-access-m2ksk" (OuterVolumeSpecName: "kube-api-access-m2ksk") pod "c588f954-a6b5-413f-bcc8-dbfe4c660d1b" (UID: "c588f954-a6b5-413f-bcc8-dbfe4c660d1b"). InnerVolumeSpecName "kube-api-access-m2ksk". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 07:50:46 crc kubenswrapper[4760]: I0930 07:50:46.123879 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c588f954-a6b5-413f-bcc8-dbfe4c660d1b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c588f954-a6b5-413f-bcc8-dbfe4c660d1b" (UID: "c588f954-a6b5-413f-bcc8-dbfe4c660d1b"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 07:50:46 crc kubenswrapper[4760]: I0930 07:50:46.139836 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c588f954-a6b5-413f-bcc8-dbfe4c660d1b-config-data" (OuterVolumeSpecName: "config-data") pod "c588f954-a6b5-413f-bcc8-dbfe4c660d1b" (UID: "c588f954-a6b5-413f-bcc8-dbfe4c660d1b"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 07:50:46 crc kubenswrapper[4760]: I0930 07:50:46.169810 4760 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c588f954-a6b5-413f-bcc8-dbfe4c660d1b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 07:50:46 crc kubenswrapper[4760]: I0930 07:50:46.169843 4760 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c588f954-a6b5-413f-bcc8-dbfe4c660d1b-config-data\") on node \"crc\" DevicePath \"\"" Sep 30 07:50:46 crc kubenswrapper[4760]: I0930 07:50:46.169863 4760 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") on node \"crc\" " Sep 30 07:50:46 crc kubenswrapper[4760]: I0930 07:50:46.169872 4760 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c588f954-a6b5-413f-bcc8-dbfe4c660d1b-logs\") on node \"crc\" DevicePath \"\"" Sep 30 07:50:46 crc kubenswrapper[4760]: I0930 07:50:46.169882 4760 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c588f954-a6b5-413f-bcc8-dbfe4c660d1b-scripts\") on node \"crc\" DevicePath \"\"" Sep 30 07:50:46 crc kubenswrapper[4760]: I0930 07:50:46.169891 4760 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m2ksk\" (UniqueName: \"kubernetes.io/projected/c588f954-a6b5-413f-bcc8-dbfe4c660d1b-kube-api-access-m2ksk\") on node \"crc\" DevicePath \"\"" Sep 30 07:50:46 crc kubenswrapper[4760]: I0930 07:50:46.169901 4760 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/c588f954-a6b5-413f-bcc8-dbfe4c660d1b-httpd-run\") on node \"crc\" DevicePath \"\"" Sep 30 07:50:46 crc kubenswrapper[4760]: I0930 07:50:46.214820 4760 operation_generator.go:917] UnmountDevice succeeded 
for volume "local-storage06-crc" (UniqueName: "kubernetes.io/local-volume/local-storage06-crc") on node "crc" Sep 30 07:50:46 crc kubenswrapper[4760]: I0930 07:50:46.248236 4760 generic.go:334] "Generic (PLEG): container finished" podID="8a3b79f1-3bbe-4418-9a81-4190a602dd5d" containerID="0c02b923b1bc6811bca16a8db1ba05951c985fbaa73046709d6d23369e97b202" exitCode=143 Sep 30 07:50:46 crc kubenswrapper[4760]: I0930 07:50:46.248335 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"8a3b79f1-3bbe-4418-9a81-4190a602dd5d","Type":"ContainerDied","Data":"0c02b923b1bc6811bca16a8db1ba05951c985fbaa73046709d6d23369e97b202"} Sep 30 07:50:46 crc kubenswrapper[4760]: I0930 07:50:46.250630 4760 generic.go:334] "Generic (PLEG): container finished" podID="c588f954-a6b5-413f-bcc8-dbfe4c660d1b" containerID="093cefb45788d10e4b3981293660f4ed11758f770edd6a8e4b4ad49f359f4aa9" exitCode=143 Sep 30 07:50:46 crc kubenswrapper[4760]: I0930 07:50:46.250651 4760 generic.go:334] "Generic (PLEG): container finished" podID="c588f954-a6b5-413f-bcc8-dbfe4c660d1b" containerID="1b7a1cd447b1a1865709f8a7edfad76422b4f9a2d7fee9eb718cfc11a99f68ce" exitCode=143 Sep 30 07:50:46 crc kubenswrapper[4760]: I0930 07:50:46.250692 4760 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Sep 30 07:50:46 crc kubenswrapper[4760]: I0930 07:50:46.250752 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"c588f954-a6b5-413f-bcc8-dbfe4c660d1b","Type":"ContainerDied","Data":"093cefb45788d10e4b3981293660f4ed11758f770edd6a8e4b4ad49f359f4aa9"} Sep 30 07:50:46 crc kubenswrapper[4760]: I0930 07:50:46.250777 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"c588f954-a6b5-413f-bcc8-dbfe4c660d1b","Type":"ContainerDied","Data":"1b7a1cd447b1a1865709f8a7edfad76422b4f9a2d7fee9eb718cfc11a99f68ce"} Sep 30 07:50:46 crc kubenswrapper[4760]: I0930 07:50:46.250789 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"c588f954-a6b5-413f-bcc8-dbfe4c660d1b","Type":"ContainerDied","Data":"ef9d2be6e5aa8cb925213a3b56f37367a3e3ce662a468e00b2da2bda3a7274f7"} Sep 30 07:50:46 crc kubenswrapper[4760]: I0930 07:50:46.250803 4760 scope.go:117] "RemoveContainer" containerID="093cefb45788d10e4b3981293660f4ed11758f770edd6a8e4b4ad49f359f4aa9" Sep 30 07:50:46 crc kubenswrapper[4760]: I0930 07:50:46.271145 4760 reconciler_common.go:293] "Volume detached for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") on node \"crc\" DevicePath \"\"" Sep 30 07:50:46 crc kubenswrapper[4760]: I0930 07:50:46.292137 4760 scope.go:117] "RemoveContainer" containerID="1b7a1cd447b1a1865709f8a7edfad76422b4f9a2d7fee9eb718cfc11a99f68ce" Sep 30 07:50:46 crc kubenswrapper[4760]: I0930 07:50:46.294242 4760 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Sep 30 07:50:46 crc kubenswrapper[4760]: I0930 07:50:46.321296 4760 scope.go:117] "RemoveContainer" containerID="093cefb45788d10e4b3981293660f4ed11758f770edd6a8e4b4ad49f359f4aa9" Sep 30 07:50:46 crc kubenswrapper[4760]: 
I0930 07:50:46.328719 4760 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Sep 30 07:50:46 crc kubenswrapper[4760]: E0930 07:50:46.329024 4760 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"093cefb45788d10e4b3981293660f4ed11758f770edd6a8e4b4ad49f359f4aa9\": container with ID starting with 093cefb45788d10e4b3981293660f4ed11758f770edd6a8e4b4ad49f359f4aa9 not found: ID does not exist" containerID="093cefb45788d10e4b3981293660f4ed11758f770edd6a8e4b4ad49f359f4aa9" Sep 30 07:50:46 crc kubenswrapper[4760]: I0930 07:50:46.329065 4760 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"093cefb45788d10e4b3981293660f4ed11758f770edd6a8e4b4ad49f359f4aa9"} err="failed to get container status \"093cefb45788d10e4b3981293660f4ed11758f770edd6a8e4b4ad49f359f4aa9\": rpc error: code = NotFound desc = could not find container \"093cefb45788d10e4b3981293660f4ed11758f770edd6a8e4b4ad49f359f4aa9\": container with ID starting with 093cefb45788d10e4b3981293660f4ed11758f770edd6a8e4b4ad49f359f4aa9 not found: ID does not exist" Sep 30 07:50:46 crc kubenswrapper[4760]: I0930 07:50:46.329089 4760 scope.go:117] "RemoveContainer" containerID="1b7a1cd447b1a1865709f8a7edfad76422b4f9a2d7fee9eb718cfc11a99f68ce" Sep 30 07:50:46 crc kubenswrapper[4760]: E0930 07:50:46.329348 4760 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1b7a1cd447b1a1865709f8a7edfad76422b4f9a2d7fee9eb718cfc11a99f68ce\": container with ID starting with 1b7a1cd447b1a1865709f8a7edfad76422b4f9a2d7fee9eb718cfc11a99f68ce not found: ID does not exist" containerID="1b7a1cd447b1a1865709f8a7edfad76422b4f9a2d7fee9eb718cfc11a99f68ce" Sep 30 07:50:46 crc kubenswrapper[4760]: I0930 07:50:46.329370 4760 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"1b7a1cd447b1a1865709f8a7edfad76422b4f9a2d7fee9eb718cfc11a99f68ce"} err="failed to get container status \"1b7a1cd447b1a1865709f8a7edfad76422b4f9a2d7fee9eb718cfc11a99f68ce\": rpc error: code = NotFound desc = could not find container \"1b7a1cd447b1a1865709f8a7edfad76422b4f9a2d7fee9eb718cfc11a99f68ce\": container with ID starting with 1b7a1cd447b1a1865709f8a7edfad76422b4f9a2d7fee9eb718cfc11a99f68ce not found: ID does not exist" Sep 30 07:50:46 crc kubenswrapper[4760]: I0930 07:50:46.329383 4760 scope.go:117] "RemoveContainer" containerID="093cefb45788d10e4b3981293660f4ed11758f770edd6a8e4b4ad49f359f4aa9" Sep 30 07:50:46 crc kubenswrapper[4760]: I0930 07:50:46.329541 4760 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"093cefb45788d10e4b3981293660f4ed11758f770edd6a8e4b4ad49f359f4aa9"} err="failed to get container status \"093cefb45788d10e4b3981293660f4ed11758f770edd6a8e4b4ad49f359f4aa9\": rpc error: code = NotFound desc = could not find container \"093cefb45788d10e4b3981293660f4ed11758f770edd6a8e4b4ad49f359f4aa9\": container with ID starting with 093cefb45788d10e4b3981293660f4ed11758f770edd6a8e4b4ad49f359f4aa9 not found: ID does not exist" Sep 30 07:50:46 crc kubenswrapper[4760]: I0930 07:50:46.329559 4760 scope.go:117] "RemoveContainer" containerID="1b7a1cd447b1a1865709f8a7edfad76422b4f9a2d7fee9eb718cfc11a99f68ce" Sep 30 07:50:46 crc kubenswrapper[4760]: I0930 07:50:46.329704 4760 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1b7a1cd447b1a1865709f8a7edfad76422b4f9a2d7fee9eb718cfc11a99f68ce"} err="failed to get container status \"1b7a1cd447b1a1865709f8a7edfad76422b4f9a2d7fee9eb718cfc11a99f68ce\": rpc error: code = NotFound desc = could not find container \"1b7a1cd447b1a1865709f8a7edfad76422b4f9a2d7fee9eb718cfc11a99f68ce\": container with ID starting with 1b7a1cd447b1a1865709f8a7edfad76422b4f9a2d7fee9eb718cfc11a99f68ce not found: ID does not 
exist" Sep 30 07:50:46 crc kubenswrapper[4760]: I0930 07:50:46.367833 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Sep 30 07:50:46 crc kubenswrapper[4760]: E0930 07:50:46.369765 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c588f954-a6b5-413f-bcc8-dbfe4c660d1b" containerName="glance-log" Sep 30 07:50:46 crc kubenswrapper[4760]: I0930 07:50:46.369792 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="c588f954-a6b5-413f-bcc8-dbfe4c660d1b" containerName="glance-log" Sep 30 07:50:46 crc kubenswrapper[4760]: E0930 07:50:46.369802 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c588f954-a6b5-413f-bcc8-dbfe4c660d1b" containerName="glance-httpd" Sep 30 07:50:46 crc kubenswrapper[4760]: I0930 07:50:46.369809 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="c588f954-a6b5-413f-bcc8-dbfe4c660d1b" containerName="glance-httpd" Sep 30 07:50:46 crc kubenswrapper[4760]: I0930 07:50:46.370061 4760 memory_manager.go:354] "RemoveStaleState removing state" podUID="c588f954-a6b5-413f-bcc8-dbfe4c660d1b" containerName="glance-httpd" Sep 30 07:50:46 crc kubenswrapper[4760]: I0930 07:50:46.370088 4760 memory_manager.go:354] "RemoveStaleState removing state" podUID="c588f954-a6b5-413f-bcc8-dbfe4c660d1b" containerName="glance-log" Sep 30 07:50:46 crc kubenswrapper[4760]: I0930 07:50:46.374871 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Sep 30 07:50:46 crc kubenswrapper[4760]: I0930 07:50:46.375034 4760 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Sep 30 07:50:46 crc kubenswrapper[4760]: I0930 07:50:46.385887 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Sep 30 07:50:46 crc kubenswrapper[4760]: I0930 07:50:46.386213 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Sep 30 07:50:46 crc kubenswrapper[4760]: I0930 07:50:46.479433 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bbf081e8-29d3-46ed-8474-e027d6c28c1d-logs\") pod \"glance-default-internal-api-0\" (UID: \"bbf081e8-29d3-46ed-8474-e027d6c28c1d\") " pod="openstack/glance-default-internal-api-0" Sep 30 07:50:46 crc kubenswrapper[4760]: I0930 07:50:46.480931 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/bbf081e8-29d3-46ed-8474-e027d6c28c1d-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"bbf081e8-29d3-46ed-8474-e027d6c28c1d\") " pod="openstack/glance-default-internal-api-0" Sep 30 07:50:46 crc kubenswrapper[4760]: I0930 07:50:46.481098 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g7pf5\" (UniqueName: \"kubernetes.io/projected/bbf081e8-29d3-46ed-8474-e027d6c28c1d-kube-api-access-g7pf5\") pod \"glance-default-internal-api-0\" (UID: \"bbf081e8-29d3-46ed-8474-e027d6c28c1d\") " pod="openstack/glance-default-internal-api-0" Sep 30 07:50:46 crc kubenswrapper[4760]: I0930 07:50:46.481127 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bbf081e8-29d3-46ed-8474-e027d6c28c1d-scripts\") pod \"glance-default-internal-api-0\" (UID: \"bbf081e8-29d3-46ed-8474-e027d6c28c1d\") " 
pod="openstack/glance-default-internal-api-0" Sep 30 07:50:46 crc kubenswrapper[4760]: I0930 07:50:46.481169 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-internal-api-0\" (UID: \"bbf081e8-29d3-46ed-8474-e027d6c28c1d\") " pod="openstack/glance-default-internal-api-0" Sep 30 07:50:46 crc kubenswrapper[4760]: I0930 07:50:46.481200 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/bbf081e8-29d3-46ed-8474-e027d6c28c1d-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"bbf081e8-29d3-46ed-8474-e027d6c28c1d\") " pod="openstack/glance-default-internal-api-0" Sep 30 07:50:46 crc kubenswrapper[4760]: I0930 07:50:46.481224 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bbf081e8-29d3-46ed-8474-e027d6c28c1d-config-data\") pod \"glance-default-internal-api-0\" (UID: \"bbf081e8-29d3-46ed-8474-e027d6c28c1d\") " pod="openstack/glance-default-internal-api-0" Sep 30 07:50:46 crc kubenswrapper[4760]: I0930 07:50:46.481278 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bbf081e8-29d3-46ed-8474-e027d6c28c1d-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"bbf081e8-29d3-46ed-8474-e027d6c28c1d\") " pod="openstack/glance-default-internal-api-0" Sep 30 07:50:46 crc kubenswrapper[4760]: I0930 07:50:46.583395 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-internal-api-0\" (UID: \"bbf081e8-29d3-46ed-8474-e027d6c28c1d\") " 
pod="openstack/glance-default-internal-api-0" Sep 30 07:50:46 crc kubenswrapper[4760]: I0930 07:50:46.583624 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/bbf081e8-29d3-46ed-8474-e027d6c28c1d-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"bbf081e8-29d3-46ed-8474-e027d6c28c1d\") " pod="openstack/glance-default-internal-api-0" Sep 30 07:50:46 crc kubenswrapper[4760]: I0930 07:50:46.583652 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bbf081e8-29d3-46ed-8474-e027d6c28c1d-config-data\") pod \"glance-default-internal-api-0\" (UID: \"bbf081e8-29d3-46ed-8474-e027d6c28c1d\") " pod="openstack/glance-default-internal-api-0" Sep 30 07:50:46 crc kubenswrapper[4760]: I0930 07:50:46.583689 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bbf081e8-29d3-46ed-8474-e027d6c28c1d-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"bbf081e8-29d3-46ed-8474-e027d6c28c1d\") " pod="openstack/glance-default-internal-api-0" Sep 30 07:50:46 crc kubenswrapper[4760]: I0930 07:50:46.583721 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bbf081e8-29d3-46ed-8474-e027d6c28c1d-logs\") pod \"glance-default-internal-api-0\" (UID: \"bbf081e8-29d3-46ed-8474-e027d6c28c1d\") " pod="openstack/glance-default-internal-api-0" Sep 30 07:50:46 crc kubenswrapper[4760]: I0930 07:50:46.583743 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/bbf081e8-29d3-46ed-8474-e027d6c28c1d-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"bbf081e8-29d3-46ed-8474-e027d6c28c1d\") " pod="openstack/glance-default-internal-api-0" Sep 30 07:50:46 crc 
kubenswrapper[4760]: I0930 07:50:46.583817 4760 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-internal-api-0\" (UID: \"bbf081e8-29d3-46ed-8474-e027d6c28c1d\") device mount path \"/mnt/openstack/pv06\"" pod="openstack/glance-default-internal-api-0" Sep 30 07:50:46 crc kubenswrapper[4760]: I0930 07:50:46.583820 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g7pf5\" (UniqueName: \"kubernetes.io/projected/bbf081e8-29d3-46ed-8474-e027d6c28c1d-kube-api-access-g7pf5\") pod \"glance-default-internal-api-0\" (UID: \"bbf081e8-29d3-46ed-8474-e027d6c28c1d\") " pod="openstack/glance-default-internal-api-0" Sep 30 07:50:46 crc kubenswrapper[4760]: I0930 07:50:46.584996 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bbf081e8-29d3-46ed-8474-e027d6c28c1d-scripts\") pod \"glance-default-internal-api-0\" (UID: \"bbf081e8-29d3-46ed-8474-e027d6c28c1d\") " pod="openstack/glance-default-internal-api-0" Sep 30 07:50:46 crc kubenswrapper[4760]: I0930 07:50:46.585288 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/bbf081e8-29d3-46ed-8474-e027d6c28c1d-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"bbf081e8-29d3-46ed-8474-e027d6c28c1d\") " pod="openstack/glance-default-internal-api-0" Sep 30 07:50:46 crc kubenswrapper[4760]: I0930 07:50:46.586554 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bbf081e8-29d3-46ed-8474-e027d6c28c1d-logs\") pod \"glance-default-internal-api-0\" (UID: \"bbf081e8-29d3-46ed-8474-e027d6c28c1d\") " pod="openstack/glance-default-internal-api-0" Sep 30 07:50:46 crc kubenswrapper[4760]: I0930 07:50:46.593867 4760 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bbf081e8-29d3-46ed-8474-e027d6c28c1d-scripts\") pod \"glance-default-internal-api-0\" (UID: \"bbf081e8-29d3-46ed-8474-e027d6c28c1d\") " pod="openstack/glance-default-internal-api-0" Sep 30 07:50:46 crc kubenswrapper[4760]: I0930 07:50:46.590900 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bbf081e8-29d3-46ed-8474-e027d6c28c1d-config-data\") pod \"glance-default-internal-api-0\" (UID: \"bbf081e8-29d3-46ed-8474-e027d6c28c1d\") " pod="openstack/glance-default-internal-api-0" Sep 30 07:50:46 crc kubenswrapper[4760]: I0930 07:50:46.606064 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/bbf081e8-29d3-46ed-8474-e027d6c28c1d-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"bbf081e8-29d3-46ed-8474-e027d6c28c1d\") " pod="openstack/glance-default-internal-api-0" Sep 30 07:50:46 crc kubenswrapper[4760]: I0930 07:50:46.609340 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bbf081e8-29d3-46ed-8474-e027d6c28c1d-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"bbf081e8-29d3-46ed-8474-e027d6c28c1d\") " pod="openstack/glance-default-internal-api-0" Sep 30 07:50:46 crc kubenswrapper[4760]: I0930 07:50:46.612904 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g7pf5\" (UniqueName: \"kubernetes.io/projected/bbf081e8-29d3-46ed-8474-e027d6c28c1d-kube-api-access-g7pf5\") pod \"glance-default-internal-api-0\" (UID: \"bbf081e8-29d3-46ed-8474-e027d6c28c1d\") " pod="openstack/glance-default-internal-api-0" Sep 30 07:50:46 crc kubenswrapper[4760]: I0930 07:50:46.654525 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage06-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-internal-api-0\" (UID: \"bbf081e8-29d3-46ed-8474-e027d6c28c1d\") " pod="openstack/glance-default-internal-api-0" Sep 30 07:50:46 crc kubenswrapper[4760]: I0930 07:50:46.708273 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Sep 30 07:50:47 crc kubenswrapper[4760]: I0930 07:50:47.086021 4760 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c588f954-a6b5-413f-bcc8-dbfe4c660d1b" path="/var/lib/kubelet/pods/c588f954-a6b5-413f-bcc8-dbfe4c660d1b/volumes" Sep 30 07:50:47 crc kubenswrapper[4760]: I0930 07:50:47.271785 4760 generic.go:334] "Generic (PLEG): container finished" podID="8a3b79f1-3bbe-4418-9a81-4190a602dd5d" containerID="1e8e3e79719309f34e6554e6c8f0f765444bd702081e37346359ef0cce60c306" exitCode=0 Sep 30 07:50:47 crc kubenswrapper[4760]: I0930 07:50:47.271854 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"8a3b79f1-3bbe-4418-9a81-4190a602dd5d","Type":"ContainerDied","Data":"1e8e3e79719309f34e6554e6c8f0f765444bd702081e37346359ef0cce60c306"} Sep 30 07:50:47 crc kubenswrapper[4760]: I0930 07:50:47.298186 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Sep 30 07:50:47 crc kubenswrapper[4760]: I0930 07:50:47.673864 4760 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Sep 30 07:50:47 crc kubenswrapper[4760]: I0930 07:50:47.724431 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/8a3b79f1-3bbe-4418-9a81-4190a602dd5d-httpd-run\") pod \"8a3b79f1-3bbe-4418-9a81-4190a602dd5d\" (UID: \"8a3b79f1-3bbe-4418-9a81-4190a602dd5d\") " Sep 30 07:50:47 crc kubenswrapper[4760]: I0930 07:50:47.724520 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8a3b79f1-3bbe-4418-9a81-4190a602dd5d-logs\") pod \"8a3b79f1-3bbe-4418-9a81-4190a602dd5d\" (UID: \"8a3b79f1-3bbe-4418-9a81-4190a602dd5d\") " Sep 30 07:50:47 crc kubenswrapper[4760]: I0930 07:50:47.724634 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8a3b79f1-3bbe-4418-9a81-4190a602dd5d-config-data\") pod \"8a3b79f1-3bbe-4418-9a81-4190a602dd5d\" (UID: \"8a3b79f1-3bbe-4418-9a81-4190a602dd5d\") " Sep 30 07:50:47 crc kubenswrapper[4760]: I0930 07:50:47.724711 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8a3b79f1-3bbe-4418-9a81-4190a602dd5d-combined-ca-bundle\") pod \"8a3b79f1-3bbe-4418-9a81-4190a602dd5d\" (UID: \"8a3b79f1-3bbe-4418-9a81-4190a602dd5d\") " Sep 30 07:50:47 crc kubenswrapper[4760]: I0930 07:50:47.724734 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r8rkd\" (UniqueName: \"kubernetes.io/projected/8a3b79f1-3bbe-4418-9a81-4190a602dd5d-kube-api-access-r8rkd\") pod \"8a3b79f1-3bbe-4418-9a81-4190a602dd5d\" (UID: \"8a3b79f1-3bbe-4418-9a81-4190a602dd5d\") " Sep 30 07:50:47 crc kubenswrapper[4760]: I0930 07:50:47.724760 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage07-crc\") pod \"8a3b79f1-3bbe-4418-9a81-4190a602dd5d\" (UID: \"8a3b79f1-3bbe-4418-9a81-4190a602dd5d\") " Sep 30 07:50:47 crc kubenswrapper[4760]: I0930 07:50:47.724858 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8a3b79f1-3bbe-4418-9a81-4190a602dd5d-scripts\") pod \"8a3b79f1-3bbe-4418-9a81-4190a602dd5d\" (UID: \"8a3b79f1-3bbe-4418-9a81-4190a602dd5d\") " Sep 30 07:50:47 crc kubenswrapper[4760]: I0930 07:50:47.726071 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8a3b79f1-3bbe-4418-9a81-4190a602dd5d-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "8a3b79f1-3bbe-4418-9a81-4190a602dd5d" (UID: "8a3b79f1-3bbe-4418-9a81-4190a602dd5d"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 07:50:47 crc kubenswrapper[4760]: I0930 07:50:47.726255 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8a3b79f1-3bbe-4418-9a81-4190a602dd5d-logs" (OuterVolumeSpecName: "logs") pod "8a3b79f1-3bbe-4418-9a81-4190a602dd5d" (UID: "8a3b79f1-3bbe-4418-9a81-4190a602dd5d"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 07:50:47 crc kubenswrapper[4760]: I0930 07:50:47.731478 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8a3b79f1-3bbe-4418-9a81-4190a602dd5d-scripts" (OuterVolumeSpecName: "scripts") pod "8a3b79f1-3bbe-4418-9a81-4190a602dd5d" (UID: "8a3b79f1-3bbe-4418-9a81-4190a602dd5d"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 07:50:47 crc kubenswrapper[4760]: I0930 07:50:47.735033 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8a3b79f1-3bbe-4418-9a81-4190a602dd5d-kube-api-access-r8rkd" (OuterVolumeSpecName: "kube-api-access-r8rkd") pod "8a3b79f1-3bbe-4418-9a81-4190a602dd5d" (UID: "8a3b79f1-3bbe-4418-9a81-4190a602dd5d"). InnerVolumeSpecName "kube-api-access-r8rkd". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 07:50:47 crc kubenswrapper[4760]: I0930 07:50:47.735493 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage07-crc" (OuterVolumeSpecName: "glance") pod "8a3b79f1-3bbe-4418-9a81-4190a602dd5d" (UID: "8a3b79f1-3bbe-4418-9a81-4190a602dd5d"). InnerVolumeSpecName "local-storage07-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Sep 30 07:50:47 crc kubenswrapper[4760]: I0930 07:50:47.764415 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8a3b79f1-3bbe-4418-9a81-4190a602dd5d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8a3b79f1-3bbe-4418-9a81-4190a602dd5d" (UID: "8a3b79f1-3bbe-4418-9a81-4190a602dd5d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 07:50:47 crc kubenswrapper[4760]: I0930 07:50:47.796505 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8a3b79f1-3bbe-4418-9a81-4190a602dd5d-config-data" (OuterVolumeSpecName: "config-data") pod "8a3b79f1-3bbe-4418-9a81-4190a602dd5d" (UID: "8a3b79f1-3bbe-4418-9a81-4190a602dd5d"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 07:50:47 crc kubenswrapper[4760]: I0930 07:50:47.827412 4760 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8a3b79f1-3bbe-4418-9a81-4190a602dd5d-scripts\") on node \"crc\" DevicePath \"\"" Sep 30 07:50:47 crc kubenswrapper[4760]: I0930 07:50:47.827438 4760 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/8a3b79f1-3bbe-4418-9a81-4190a602dd5d-httpd-run\") on node \"crc\" DevicePath \"\"" Sep 30 07:50:47 crc kubenswrapper[4760]: I0930 07:50:47.827449 4760 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8a3b79f1-3bbe-4418-9a81-4190a602dd5d-logs\") on node \"crc\" DevicePath \"\"" Sep 30 07:50:47 crc kubenswrapper[4760]: I0930 07:50:47.827457 4760 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8a3b79f1-3bbe-4418-9a81-4190a602dd5d-config-data\") on node \"crc\" DevicePath \"\"" Sep 30 07:50:47 crc kubenswrapper[4760]: I0930 07:50:47.827465 4760 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8a3b79f1-3bbe-4418-9a81-4190a602dd5d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 07:50:47 crc kubenswrapper[4760]: I0930 07:50:47.827475 4760 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r8rkd\" (UniqueName: \"kubernetes.io/projected/8a3b79f1-3bbe-4418-9a81-4190a602dd5d-kube-api-access-r8rkd\") on node \"crc\" DevicePath \"\"" Sep 30 07:50:47 crc kubenswrapper[4760]: I0930 07:50:47.827500 4760 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") on node \"crc\" " Sep 30 07:50:47 crc kubenswrapper[4760]: I0930 07:50:47.856065 4760 operation_generator.go:917] UnmountDevice succeeded 
for volume "local-storage07-crc" (UniqueName: "kubernetes.io/local-volume/local-storage07-crc") on node "crc" Sep 30 07:50:47 crc kubenswrapper[4760]: I0930 07:50:47.929783 4760 reconciler_common.go:293] "Volume detached for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") on node \"crc\" DevicePath \"\"" Sep 30 07:50:48 crc kubenswrapper[4760]: I0930 07:50:48.070569 4760 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/watcher-api-0"] Sep 30 07:50:48 crc kubenswrapper[4760]: I0930 07:50:48.071082 4760 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/watcher-api-0" podUID="f1e2802c-ec39-48ee-a6c4-b609a57aa0c5" containerName="watcher-api-log" containerID="cri-o://8bb4ae2b4cdc4f4f073c4d86092669e02429c7f687807aac45ca0ee0b9dbe8d5" gracePeriod=30 Sep 30 07:50:48 crc kubenswrapper[4760]: I0930 07:50:48.071510 4760 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/watcher-api-0" podUID="f1e2802c-ec39-48ee-a6c4-b609a57aa0c5" containerName="watcher-api" containerID="cri-o://f0aeb72fd3b5bc55a798002692855ffad1e3d949c0cd84e5255f5d2652bc2d77" gracePeriod=30 Sep 30 07:50:48 crc kubenswrapper[4760]: I0930 07:50:48.287669 4760 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Sep 30 07:50:48 crc kubenswrapper[4760]: I0930 07:50:48.289242 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"8a3b79f1-3bbe-4418-9a81-4190a602dd5d","Type":"ContainerDied","Data":"ed00058a5fde843bca2ea5750acc9a0c793259cd6244a13d4bf73c540bddf156"} Sep 30 07:50:48 crc kubenswrapper[4760]: I0930 07:50:48.289293 4760 scope.go:117] "RemoveContainer" containerID="1e8e3e79719309f34e6554e6c8f0f765444bd702081e37346359ef0cce60c306" Sep 30 07:50:48 crc kubenswrapper[4760]: I0930 07:50:48.292516 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"bbf081e8-29d3-46ed-8474-e027d6c28c1d","Type":"ContainerStarted","Data":"495694d991a90664ce4b5d0bd317dffe03e9480518f5da9230a6675bd99314b1"} Sep 30 07:50:48 crc kubenswrapper[4760]: I0930 07:50:48.292560 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"bbf081e8-29d3-46ed-8474-e027d6c28c1d","Type":"ContainerStarted","Data":"0e948272245e33338d045716cc962b620641c70744289219859349abb8715afa"} Sep 30 07:50:48 crc kubenswrapper[4760]: I0930 07:50:48.302174 4760 generic.go:334] "Generic (PLEG): container finished" podID="f1e2802c-ec39-48ee-a6c4-b609a57aa0c5" containerID="8bb4ae2b4cdc4f4f073c4d86092669e02429c7f687807aac45ca0ee0b9dbe8d5" exitCode=143 Sep 30 07:50:48 crc kubenswrapper[4760]: I0930 07:50:48.302214 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-api-0" event={"ID":"f1e2802c-ec39-48ee-a6c4-b609a57aa0c5","Type":"ContainerDied","Data":"8bb4ae2b4cdc4f4f073c4d86092669e02429c7f687807aac45ca0ee0b9dbe8d5"} Sep 30 07:50:48 crc kubenswrapper[4760]: I0930 07:50:48.363168 4760 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Sep 30 07:50:48 crc kubenswrapper[4760]: I0930 07:50:48.374230 4760 kubelet.go:2431] 
"SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Sep 30 07:50:48 crc kubenswrapper[4760]: I0930 07:50:48.391011 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Sep 30 07:50:48 crc kubenswrapper[4760]: E0930 07:50:48.391386 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8a3b79f1-3bbe-4418-9a81-4190a602dd5d" containerName="glance-httpd" Sep 30 07:50:48 crc kubenswrapper[4760]: I0930 07:50:48.391404 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="8a3b79f1-3bbe-4418-9a81-4190a602dd5d" containerName="glance-httpd" Sep 30 07:50:48 crc kubenswrapper[4760]: E0930 07:50:48.391419 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8a3b79f1-3bbe-4418-9a81-4190a602dd5d" containerName="glance-log" Sep 30 07:50:48 crc kubenswrapper[4760]: I0930 07:50:48.391425 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="8a3b79f1-3bbe-4418-9a81-4190a602dd5d" containerName="glance-log" Sep 30 07:50:48 crc kubenswrapper[4760]: I0930 07:50:48.391636 4760 memory_manager.go:354] "RemoveStaleState removing state" podUID="8a3b79f1-3bbe-4418-9a81-4190a602dd5d" containerName="glance-log" Sep 30 07:50:48 crc kubenswrapper[4760]: I0930 07:50:48.391656 4760 memory_manager.go:354] "RemoveStaleState removing state" podUID="8a3b79f1-3bbe-4418-9a81-4190a602dd5d" containerName="glance-httpd" Sep 30 07:50:48 crc kubenswrapper[4760]: I0930 07:50:48.392732 4760 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Sep 30 07:50:48 crc kubenswrapper[4760]: I0930 07:50:48.396141 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Sep 30 07:50:48 crc kubenswrapper[4760]: I0930 07:50:48.399005 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Sep 30 07:50:48 crc kubenswrapper[4760]: I0930 07:50:48.409089 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Sep 30 07:50:48 crc kubenswrapper[4760]: I0930 07:50:48.440468 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dpq5h\" (UniqueName: \"kubernetes.io/projected/dec298fa-4de5-4a26-bc21-409707df4ddb-kube-api-access-dpq5h\") pod \"glance-default-external-api-0\" (UID: \"dec298fa-4de5-4a26-bc21-409707df4ddb\") " pod="openstack/glance-default-external-api-0" Sep 30 07:50:48 crc kubenswrapper[4760]: I0930 07:50:48.440547 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dec298fa-4de5-4a26-bc21-409707df4ddb-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"dec298fa-4de5-4a26-bc21-409707df4ddb\") " pod="openstack/glance-default-external-api-0" Sep 30 07:50:48 crc kubenswrapper[4760]: I0930 07:50:48.440565 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/dec298fa-4de5-4a26-bc21-409707df4ddb-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"dec298fa-4de5-4a26-bc21-409707df4ddb\") " pod="openstack/glance-default-external-api-0" Sep 30 07:50:48 crc kubenswrapper[4760]: I0930 07:50:48.440599 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dec298fa-4de5-4a26-bc21-409707df4ddb-scripts\") pod \"glance-default-external-api-0\" (UID: \"dec298fa-4de5-4a26-bc21-409707df4ddb\") " pod="openstack/glance-default-external-api-0" Sep 30 07:50:48 crc kubenswrapper[4760]: I0930 07:50:48.440650 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/dec298fa-4de5-4a26-bc21-409707df4ddb-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"dec298fa-4de5-4a26-bc21-409707df4ddb\") " pod="openstack/glance-default-external-api-0" Sep 30 07:50:48 crc kubenswrapper[4760]: I0930 07:50:48.440667 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dec298fa-4de5-4a26-bc21-409707df4ddb-logs\") pod \"glance-default-external-api-0\" (UID: \"dec298fa-4de5-4a26-bc21-409707df4ddb\") " pod="openstack/glance-default-external-api-0" Sep 30 07:50:48 crc kubenswrapper[4760]: I0930 07:50:48.440683 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dec298fa-4de5-4a26-bc21-409707df4ddb-config-data\") pod \"glance-default-external-api-0\" (UID: \"dec298fa-4de5-4a26-bc21-409707df4ddb\") " pod="openstack/glance-default-external-api-0" Sep 30 07:50:48 crc kubenswrapper[4760]: I0930 07:50:48.440706 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-external-api-0\" (UID: \"dec298fa-4de5-4a26-bc21-409707df4ddb\") " pod="openstack/glance-default-external-api-0" Sep 30 07:50:48 crc kubenswrapper[4760]: I0930 07:50:48.458469 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-57c957c4ff-fl2rf" Sep 
30 07:50:48 crc kubenswrapper[4760]: I0930 07:50:48.522851 4760 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-fcfdd6f9f-j5p4t"] Sep 30 07:50:48 crc kubenswrapper[4760]: I0930 07:50:48.523135 4760 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-fcfdd6f9f-j5p4t" podUID="da7c2ace-093d-4279-bad0-5f2876f4ab8d" containerName="dnsmasq-dns" containerID="cri-o://bcf47ca8284d148fa2358a3feb62e90bc3ea5d9a074723dd52bfb40d5a484182" gracePeriod=10 Sep 30 07:50:48 crc kubenswrapper[4760]: I0930 07:50:48.542621 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dec298fa-4de5-4a26-bc21-409707df4ddb-scripts\") pod \"glance-default-external-api-0\" (UID: \"dec298fa-4de5-4a26-bc21-409707df4ddb\") " pod="openstack/glance-default-external-api-0" Sep 30 07:50:48 crc kubenswrapper[4760]: I0930 07:50:48.542738 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/dec298fa-4de5-4a26-bc21-409707df4ddb-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"dec298fa-4de5-4a26-bc21-409707df4ddb\") " pod="openstack/glance-default-external-api-0" Sep 30 07:50:48 crc kubenswrapper[4760]: I0930 07:50:48.542757 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dec298fa-4de5-4a26-bc21-409707df4ddb-logs\") pod \"glance-default-external-api-0\" (UID: \"dec298fa-4de5-4a26-bc21-409707df4ddb\") " pod="openstack/glance-default-external-api-0" Sep 30 07:50:48 crc kubenswrapper[4760]: I0930 07:50:48.542788 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dec298fa-4de5-4a26-bc21-409707df4ddb-config-data\") pod \"glance-default-external-api-0\" (UID: \"dec298fa-4de5-4a26-bc21-409707df4ddb\") " 
pod="openstack/glance-default-external-api-0" Sep 30 07:50:48 crc kubenswrapper[4760]: I0930 07:50:48.542810 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-external-api-0\" (UID: \"dec298fa-4de5-4a26-bc21-409707df4ddb\") " pod="openstack/glance-default-external-api-0" Sep 30 07:50:48 crc kubenswrapper[4760]: I0930 07:50:48.542928 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dpq5h\" (UniqueName: \"kubernetes.io/projected/dec298fa-4de5-4a26-bc21-409707df4ddb-kube-api-access-dpq5h\") pod \"glance-default-external-api-0\" (UID: \"dec298fa-4de5-4a26-bc21-409707df4ddb\") " pod="openstack/glance-default-external-api-0" Sep 30 07:50:48 crc kubenswrapper[4760]: I0930 07:50:48.542974 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dec298fa-4de5-4a26-bc21-409707df4ddb-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"dec298fa-4de5-4a26-bc21-409707df4ddb\") " pod="openstack/glance-default-external-api-0" Sep 30 07:50:48 crc kubenswrapper[4760]: I0930 07:50:48.542989 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/dec298fa-4de5-4a26-bc21-409707df4ddb-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"dec298fa-4de5-4a26-bc21-409707df4ddb\") " pod="openstack/glance-default-external-api-0" Sep 30 07:50:48 crc kubenswrapper[4760]: I0930 07:50:48.544890 4760 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-external-api-0\" (UID: \"dec298fa-4de5-4a26-bc21-409707df4ddb\") device mount path \"/mnt/openstack/pv07\"" 
pod="openstack/glance-default-external-api-0" Sep 30 07:50:48 crc kubenswrapper[4760]: I0930 07:50:48.547538 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/dec298fa-4de5-4a26-bc21-409707df4ddb-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"dec298fa-4de5-4a26-bc21-409707df4ddb\") " pod="openstack/glance-default-external-api-0" Sep 30 07:50:48 crc kubenswrapper[4760]: I0930 07:50:48.549188 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dec298fa-4de5-4a26-bc21-409707df4ddb-logs\") pod \"glance-default-external-api-0\" (UID: \"dec298fa-4de5-4a26-bc21-409707df4ddb\") " pod="openstack/glance-default-external-api-0" Sep 30 07:50:48 crc kubenswrapper[4760]: I0930 07:50:48.550371 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dec298fa-4de5-4a26-bc21-409707df4ddb-config-data\") pod \"glance-default-external-api-0\" (UID: \"dec298fa-4de5-4a26-bc21-409707df4ddb\") " pod="openstack/glance-default-external-api-0" Sep 30 07:50:48 crc kubenswrapper[4760]: I0930 07:50:48.561145 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dec298fa-4de5-4a26-bc21-409707df4ddb-scripts\") pod \"glance-default-external-api-0\" (UID: \"dec298fa-4de5-4a26-bc21-409707df4ddb\") " pod="openstack/glance-default-external-api-0" Sep 30 07:50:48 crc kubenswrapper[4760]: I0930 07:50:48.569267 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/dec298fa-4de5-4a26-bc21-409707df4ddb-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"dec298fa-4de5-4a26-bc21-409707df4ddb\") " pod="openstack/glance-default-external-api-0" Sep 30 07:50:48 crc kubenswrapper[4760]: I0930 07:50:48.591339 4760 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dec298fa-4de5-4a26-bc21-409707df4ddb-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"dec298fa-4de5-4a26-bc21-409707df4ddb\") " pod="openstack/glance-default-external-api-0" Sep 30 07:50:48 crc kubenswrapper[4760]: I0930 07:50:48.591876 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dpq5h\" (UniqueName: \"kubernetes.io/projected/dec298fa-4de5-4a26-bc21-409707df4ddb-kube-api-access-dpq5h\") pod \"glance-default-external-api-0\" (UID: \"dec298fa-4de5-4a26-bc21-409707df4ddb\") " pod="openstack/glance-default-external-api-0" Sep 30 07:50:48 crc kubenswrapper[4760]: I0930 07:50:48.610611 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-external-api-0\" (UID: \"dec298fa-4de5-4a26-bc21-409707df4ddb\") " pod="openstack/glance-default-external-api-0" Sep 30 07:50:48 crc kubenswrapper[4760]: I0930 07:50:48.735501 4760 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Sep 30 07:50:49 crc kubenswrapper[4760]: I0930 07:50:49.083143 4760 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8a3b79f1-3bbe-4418-9a81-4190a602dd5d" path="/var/lib/kubelet/pods/8a3b79f1-3bbe-4418-9a81-4190a602dd5d/volumes" Sep 30 07:50:49 crc kubenswrapper[4760]: I0930 07:50:49.322819 4760 generic.go:334] "Generic (PLEG): container finished" podID="da7c2ace-093d-4279-bad0-5f2876f4ab8d" containerID="bcf47ca8284d148fa2358a3feb62e90bc3ea5d9a074723dd52bfb40d5a484182" exitCode=0 Sep 30 07:50:49 crc kubenswrapper[4760]: I0930 07:50:49.323426 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-fcfdd6f9f-j5p4t" event={"ID":"da7c2ace-093d-4279-bad0-5f2876f4ab8d","Type":"ContainerDied","Data":"bcf47ca8284d148fa2358a3feb62e90bc3ea5d9a074723dd52bfb40d5a484182"} Sep 30 07:50:49 crc kubenswrapper[4760]: I0930 07:50:49.325546 4760 generic.go:334] "Generic (PLEG): container finished" podID="1df67641-4598-4ba5-a59a-a195084e5446" containerID="b441e987ef7decf049664c413470ea11bd7c6b9be8c56175022120bb2f98ef6f" exitCode=0 Sep 30 07:50:49 crc kubenswrapper[4760]: I0930 07:50:49.325644 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-f6j6q" event={"ID":"1df67641-4598-4ba5-a59a-a195084e5446","Type":"ContainerDied","Data":"b441e987ef7decf049664c413470ea11bd7c6b9be8c56175022120bb2f98ef6f"} Sep 30 07:50:51 crc kubenswrapper[4760]: I0930 07:50:51.049501 4760 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-fcfdd6f9f-j5p4t" podUID="da7c2ace-093d-4279-bad0-5f2876f4ab8d" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.149:5353: connect: connection refused" Sep 30 07:50:51 crc kubenswrapper[4760]: I0930 07:50:51.216569 4760 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/watcher-api-0" podUID="f1e2802c-ec39-48ee-a6c4-b609a57aa0c5" 
containerName="watcher-api-log" probeResult="failure" output="Get \"http://10.217.0.154:9322/\": read tcp 10.217.0.2:51646->10.217.0.154:9322: read: connection reset by peer" Sep 30 07:50:51 crc kubenswrapper[4760]: I0930 07:50:51.217261 4760 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/watcher-api-0" podUID="f1e2802c-ec39-48ee-a6c4-b609a57aa0c5" containerName="watcher-api" probeResult="failure" output="Get \"http://10.217.0.154:9322/\": read tcp 10.217.0.2:51662->10.217.0.154:9322: read: connection reset by peer" Sep 30 07:50:51 crc kubenswrapper[4760]: I0930 07:50:51.354911 4760 generic.go:334] "Generic (PLEG): container finished" podID="f1e2802c-ec39-48ee-a6c4-b609a57aa0c5" containerID="f0aeb72fd3b5bc55a798002692855ffad1e3d949c0cd84e5255f5d2652bc2d77" exitCode=0 Sep 30 07:50:51 crc kubenswrapper[4760]: I0930 07:50:51.355238 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-api-0" event={"ID":"f1e2802c-ec39-48ee-a6c4-b609a57aa0c5","Type":"ContainerDied","Data":"f0aeb72fd3b5bc55a798002692855ffad1e3d949c0cd84e5255f5d2652bc2d77"} Sep 30 07:50:53 crc kubenswrapper[4760]: I0930 07:50:53.086473 4760 scope.go:117] "RemoveContainer" containerID="0c02b923b1bc6811bca16a8db1ba05951c985fbaa73046709d6d23369e97b202" Sep 30 07:50:53 crc kubenswrapper[4760]: I0930 07:50:53.229202 4760 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-sync-f6j6q" Sep 30 07:50:53 crc kubenswrapper[4760]: I0930 07:50:53.338546 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f75np\" (UniqueName: \"kubernetes.io/projected/1df67641-4598-4ba5-a59a-a195084e5446-kube-api-access-f75np\") pod \"1df67641-4598-4ba5-a59a-a195084e5446\" (UID: \"1df67641-4598-4ba5-a59a-a195084e5446\") " Sep 30 07:50:53 crc kubenswrapper[4760]: I0930 07:50:53.338901 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/1df67641-4598-4ba5-a59a-a195084e5446-db-sync-config-data\") pod \"1df67641-4598-4ba5-a59a-a195084e5446\" (UID: \"1df67641-4598-4ba5-a59a-a195084e5446\") " Sep 30 07:50:53 crc kubenswrapper[4760]: I0930 07:50:53.338978 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1df67641-4598-4ba5-a59a-a195084e5446-combined-ca-bundle\") pod \"1df67641-4598-4ba5-a59a-a195084e5446\" (UID: \"1df67641-4598-4ba5-a59a-a195084e5446\") " Sep 30 07:50:53 crc kubenswrapper[4760]: I0930 07:50:53.344852 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1df67641-4598-4ba5-a59a-a195084e5446-kube-api-access-f75np" (OuterVolumeSpecName: "kube-api-access-f75np") pod "1df67641-4598-4ba5-a59a-a195084e5446" (UID: "1df67641-4598-4ba5-a59a-a195084e5446"). InnerVolumeSpecName "kube-api-access-f75np". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 07:50:53 crc kubenswrapper[4760]: I0930 07:50:53.359588 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1df67641-4598-4ba5-a59a-a195084e5446-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "1df67641-4598-4ba5-a59a-a195084e5446" (UID: "1df67641-4598-4ba5-a59a-a195084e5446"). 
InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 07:50:53 crc kubenswrapper[4760]: I0930 07:50:53.367875 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1df67641-4598-4ba5-a59a-a195084e5446-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1df67641-4598-4ba5-a59a-a195084e5446" (UID: "1df67641-4598-4ba5-a59a-a195084e5446"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 07:50:53 crc kubenswrapper[4760]: I0930 07:50:53.385014 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-f6j6q" event={"ID":"1df67641-4598-4ba5-a59a-a195084e5446","Type":"ContainerDied","Data":"7b95bf520c48586f4386da3ad833eb819f74f25e4d61c4e176a5cc33f365f080"} Sep 30 07:50:53 crc kubenswrapper[4760]: I0930 07:50:53.385046 4760 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7b95bf520c48586f4386da3ad833eb819f74f25e4d61c4e176a5cc33f365f080" Sep 30 07:50:53 crc kubenswrapper[4760]: I0930 07:50:53.385046 4760 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-sync-f6j6q" Sep 30 07:50:53 crc kubenswrapper[4760]: I0930 07:50:53.441484 4760 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1df67641-4598-4ba5-a59a-a195084e5446-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 07:50:53 crc kubenswrapper[4760]: I0930 07:50:53.441516 4760 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f75np\" (UniqueName: \"kubernetes.io/projected/1df67641-4598-4ba5-a59a-a195084e5446-kube-api-access-f75np\") on node \"crc\" DevicePath \"\"" Sep 30 07:50:53 crc kubenswrapper[4760]: I0930 07:50:53.441529 4760 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/1df67641-4598-4ba5-a59a-a195084e5446-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Sep 30 07:50:54 crc kubenswrapper[4760]: I0930 07:50:54.494456 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-keystone-listener-554bb7d464-zcqc8"] Sep 30 07:50:54 crc kubenswrapper[4760]: E0930 07:50:54.495069 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1df67641-4598-4ba5-a59a-a195084e5446" containerName="barbican-db-sync" Sep 30 07:50:54 crc kubenswrapper[4760]: I0930 07:50:54.495080 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="1df67641-4598-4ba5-a59a-a195084e5446" containerName="barbican-db-sync" Sep 30 07:50:54 crc kubenswrapper[4760]: I0930 07:50:54.495273 4760 memory_manager.go:354] "RemoveStaleState removing state" podUID="1df67641-4598-4ba5-a59a-a195084e5446" containerName="barbican-db-sync" Sep 30 07:50:54 crc kubenswrapper[4760]: I0930 07:50:54.496508 4760 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-keystone-listener-554bb7d464-zcqc8" Sep 30 07:50:54 crc kubenswrapper[4760]: I0930 07:50:54.501316 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-59dqn" Sep 30 07:50:54 crc kubenswrapper[4760]: I0930 07:50:54.501586 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-keystone-listener-config-data" Sep 30 07:50:54 crc kubenswrapper[4760]: I0930 07:50:54.501895 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Sep 30 07:50:54 crc kubenswrapper[4760]: I0930 07:50:54.525423 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-worker-8479dd9dbc-25wxx"] Sep 30 07:50:54 crc kubenswrapper[4760]: I0930 07:50:54.526871 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-8479dd9dbc-25wxx" Sep 30 07:50:54 crc kubenswrapper[4760]: I0930 07:50:54.530941 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-worker-config-data" Sep 30 07:50:54 crc kubenswrapper[4760]: I0930 07:50:54.557752 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/968a427d-0cee-4775-ab7f-4ec27e535b33-logs\") pod \"barbican-keystone-listener-554bb7d464-zcqc8\" (UID: \"968a427d-0cee-4775-ab7f-4ec27e535b33\") " pod="openstack/barbican-keystone-listener-554bb7d464-zcqc8" Sep 30 07:50:54 crc kubenswrapper[4760]: I0930 07:50:54.557800 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/968a427d-0cee-4775-ab7f-4ec27e535b33-config-data-custom\") pod \"barbican-keystone-listener-554bb7d464-zcqc8\" (UID: \"968a427d-0cee-4775-ab7f-4ec27e535b33\") " pod="openstack/barbican-keystone-listener-554bb7d464-zcqc8" 
Sep 30 07:50:54 crc kubenswrapper[4760]: I0930 07:50:54.557851 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/968a427d-0cee-4775-ab7f-4ec27e535b33-combined-ca-bundle\") pod \"barbican-keystone-listener-554bb7d464-zcqc8\" (UID: \"968a427d-0cee-4775-ab7f-4ec27e535b33\") " pod="openstack/barbican-keystone-listener-554bb7d464-zcqc8" Sep 30 07:50:54 crc kubenswrapper[4760]: I0930 07:50:54.557951 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/968a427d-0cee-4775-ab7f-4ec27e535b33-config-data\") pod \"barbican-keystone-listener-554bb7d464-zcqc8\" (UID: \"968a427d-0cee-4775-ab7f-4ec27e535b33\") " pod="openstack/barbican-keystone-listener-554bb7d464-zcqc8" Sep 30 07:50:54 crc kubenswrapper[4760]: I0930 07:50:54.558044 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zdsrr\" (UniqueName: \"kubernetes.io/projected/968a427d-0cee-4775-ab7f-4ec27e535b33-kube-api-access-zdsrr\") pod \"barbican-keystone-listener-554bb7d464-zcqc8\" (UID: \"968a427d-0cee-4775-ab7f-4ec27e535b33\") " pod="openstack/barbican-keystone-listener-554bb7d464-zcqc8" Sep 30 07:50:54 crc kubenswrapper[4760]: I0930 07:50:54.560972 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-8479dd9dbc-25wxx"] Sep 30 07:50:54 crc kubenswrapper[4760]: I0930 07:50:54.582359 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-554bb7d464-zcqc8"] Sep 30 07:50:54 crc kubenswrapper[4760]: I0930 07:50:54.666368 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6d66f584d7-fvhgj"] Sep 30 07:50:54 crc kubenswrapper[4760]: I0930 07:50:54.675676 4760 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6d66f584d7-fvhgj" Sep 30 07:50:54 crc kubenswrapper[4760]: I0930 07:50:54.693418 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/968a427d-0cee-4775-ab7f-4ec27e535b33-combined-ca-bundle\") pod \"barbican-keystone-listener-554bb7d464-zcqc8\" (UID: \"968a427d-0cee-4775-ab7f-4ec27e535b33\") " pod="openstack/barbican-keystone-listener-554bb7d464-zcqc8" Sep 30 07:50:54 crc kubenswrapper[4760]: I0930 07:50:54.693520 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/caf10164-5c77-42df-9fdc-b6a1764a0e3d-config-data-custom\") pod \"barbican-worker-8479dd9dbc-25wxx\" (UID: \"caf10164-5c77-42df-9fdc-b6a1764a0e3d\") " pod="openstack/barbican-worker-8479dd9dbc-25wxx" Sep 30 07:50:54 crc kubenswrapper[4760]: I0930 07:50:54.693648 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/caf10164-5c77-42df-9fdc-b6a1764a0e3d-config-data\") pod \"barbican-worker-8479dd9dbc-25wxx\" (UID: \"caf10164-5c77-42df-9fdc-b6a1764a0e3d\") " pod="openstack/barbican-worker-8479dd9dbc-25wxx" Sep 30 07:50:54 crc kubenswrapper[4760]: I0930 07:50:54.693669 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9xz5d\" (UniqueName: \"kubernetes.io/projected/caf10164-5c77-42df-9fdc-b6a1764a0e3d-kube-api-access-9xz5d\") pod \"barbican-worker-8479dd9dbc-25wxx\" (UID: \"caf10164-5c77-42df-9fdc-b6a1764a0e3d\") " pod="openstack/barbican-worker-8479dd9dbc-25wxx" Sep 30 07:50:54 crc kubenswrapper[4760]: I0930 07:50:54.693701 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/968a427d-0cee-4775-ab7f-4ec27e535b33-config-data\") 
pod \"barbican-keystone-listener-554bb7d464-zcqc8\" (UID: \"968a427d-0cee-4775-ab7f-4ec27e535b33\") " pod="openstack/barbican-keystone-listener-554bb7d464-zcqc8" Sep 30 07:50:54 crc kubenswrapper[4760]: I0930 07:50:54.693723 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/caf10164-5c77-42df-9fdc-b6a1764a0e3d-logs\") pod \"barbican-worker-8479dd9dbc-25wxx\" (UID: \"caf10164-5c77-42df-9fdc-b6a1764a0e3d\") " pod="openstack/barbican-worker-8479dd9dbc-25wxx" Sep 30 07:50:54 crc kubenswrapper[4760]: I0930 07:50:54.693860 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zdsrr\" (UniqueName: \"kubernetes.io/projected/968a427d-0cee-4775-ab7f-4ec27e535b33-kube-api-access-zdsrr\") pod \"barbican-keystone-listener-554bb7d464-zcqc8\" (UID: \"968a427d-0cee-4775-ab7f-4ec27e535b33\") " pod="openstack/barbican-keystone-listener-554bb7d464-zcqc8" Sep 30 07:50:54 crc kubenswrapper[4760]: I0930 07:50:54.693917 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/caf10164-5c77-42df-9fdc-b6a1764a0e3d-combined-ca-bundle\") pod \"barbican-worker-8479dd9dbc-25wxx\" (UID: \"caf10164-5c77-42df-9fdc-b6a1764a0e3d\") " pod="openstack/barbican-worker-8479dd9dbc-25wxx" Sep 30 07:50:54 crc kubenswrapper[4760]: I0930 07:50:54.693980 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/968a427d-0cee-4775-ab7f-4ec27e535b33-logs\") pod \"barbican-keystone-listener-554bb7d464-zcqc8\" (UID: \"968a427d-0cee-4775-ab7f-4ec27e535b33\") " pod="openstack/barbican-keystone-listener-554bb7d464-zcqc8" Sep 30 07:50:54 crc kubenswrapper[4760]: I0930 07:50:54.694006 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: 
\"kubernetes.io/secret/968a427d-0cee-4775-ab7f-4ec27e535b33-config-data-custom\") pod \"barbican-keystone-listener-554bb7d464-zcqc8\" (UID: \"968a427d-0cee-4775-ab7f-4ec27e535b33\") " pod="openstack/barbican-keystone-listener-554bb7d464-zcqc8" Sep 30 07:50:54 crc kubenswrapper[4760]: I0930 07:50:54.699402 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/968a427d-0cee-4775-ab7f-4ec27e535b33-logs\") pod \"barbican-keystone-listener-554bb7d464-zcqc8\" (UID: \"968a427d-0cee-4775-ab7f-4ec27e535b33\") " pod="openstack/barbican-keystone-listener-554bb7d464-zcqc8" Sep 30 07:50:54 crc kubenswrapper[4760]: I0930 07:50:54.703835 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/968a427d-0cee-4775-ab7f-4ec27e535b33-config-data\") pod \"barbican-keystone-listener-554bb7d464-zcqc8\" (UID: \"968a427d-0cee-4775-ab7f-4ec27e535b33\") " pod="openstack/barbican-keystone-listener-554bb7d464-zcqc8" Sep 30 07:50:54 crc kubenswrapper[4760]: I0930 07:50:54.714938 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/968a427d-0cee-4775-ab7f-4ec27e535b33-config-data-custom\") pod \"barbican-keystone-listener-554bb7d464-zcqc8\" (UID: \"968a427d-0cee-4775-ab7f-4ec27e535b33\") " pod="openstack/barbican-keystone-listener-554bb7d464-zcqc8" Sep 30 07:50:54 crc kubenswrapper[4760]: I0930 07:50:54.716580 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/968a427d-0cee-4775-ab7f-4ec27e535b33-combined-ca-bundle\") pod \"barbican-keystone-listener-554bb7d464-zcqc8\" (UID: \"968a427d-0cee-4775-ab7f-4ec27e535b33\") " pod="openstack/barbican-keystone-listener-554bb7d464-zcqc8" Sep 30 07:50:54 crc kubenswrapper[4760]: I0930 07:50:54.716751 4760 prober.go:107] "Probe failed" probeType="Startup" 
pod="openstack/horizon-694db87c64-qrwhp" podUID="e28d9015-ea18-4da0-bfd8-c2cc5dec4f37" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.157:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.157:8443: connect: connection refused" Sep 30 07:50:54 crc kubenswrapper[4760]: I0930 07:50:54.717256 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6d66f584d7-fvhgj"] Sep 30 07:50:54 crc kubenswrapper[4760]: I0930 07:50:54.725878 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zdsrr\" (UniqueName: \"kubernetes.io/projected/968a427d-0cee-4775-ab7f-4ec27e535b33-kube-api-access-zdsrr\") pod \"barbican-keystone-listener-554bb7d464-zcqc8\" (UID: \"968a427d-0cee-4775-ab7f-4ec27e535b33\") " pod="openstack/barbican-keystone-listener-554bb7d464-zcqc8" Sep 30 07:50:54 crc kubenswrapper[4760]: I0930 07:50:54.784511 4760 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-75644c8bb4-wrsmv" podUID="8b39ba3e-25df-4a22-a1fe-f15e6ca1fada" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.158:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.158:8443: connect: connection refused" Sep 30 07:50:54 crc kubenswrapper[4760]: I0930 07:50:54.788509 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-7c45496686-rqmqg"] Sep 30 07:50:54 crc kubenswrapper[4760]: I0930 07:50:54.790178 4760 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-7c45496686-rqmqg" Sep 30 07:50:54 crc kubenswrapper[4760]: I0930 07:50:54.791683 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-api-config-data" Sep 30 07:50:54 crc kubenswrapper[4760]: I0930 07:50:54.798942 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-7c45496686-rqmqg"] Sep 30 07:50:54 crc kubenswrapper[4760]: I0930 07:50:54.799396 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ac8c51b1-6f4a-4674-bf9c-dd6fb12dbf60-dns-svc\") pod \"dnsmasq-dns-6d66f584d7-fvhgj\" (UID: \"ac8c51b1-6f4a-4674-bf9c-dd6fb12dbf60\") " pod="openstack/dnsmasq-dns-6d66f584d7-fvhgj" Sep 30 07:50:54 crc kubenswrapper[4760]: I0930 07:50:54.799597 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/caf10164-5c77-42df-9fdc-b6a1764a0e3d-combined-ca-bundle\") pod \"barbican-worker-8479dd9dbc-25wxx\" (UID: \"caf10164-5c77-42df-9fdc-b6a1764a0e3d\") " pod="openstack/barbican-worker-8479dd9dbc-25wxx" Sep 30 07:50:54 crc kubenswrapper[4760]: I0930 07:50:54.799639 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ac8c51b1-6f4a-4674-bf9c-dd6fb12dbf60-config\") pod \"dnsmasq-dns-6d66f584d7-fvhgj\" (UID: \"ac8c51b1-6f4a-4674-bf9c-dd6fb12dbf60\") " pod="openstack/dnsmasq-dns-6d66f584d7-fvhgj" Sep 30 07:50:54 crc kubenswrapper[4760]: I0930 07:50:54.799679 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mqggl\" (UniqueName: \"kubernetes.io/projected/ac8c51b1-6f4a-4674-bf9c-dd6fb12dbf60-kube-api-access-mqggl\") pod \"dnsmasq-dns-6d66f584d7-fvhgj\" (UID: \"ac8c51b1-6f4a-4674-bf9c-dd6fb12dbf60\") " 
pod="openstack/dnsmasq-dns-6d66f584d7-fvhgj" Sep 30 07:50:54 crc kubenswrapper[4760]: I0930 07:50:54.799724 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ac8c51b1-6f4a-4674-bf9c-dd6fb12dbf60-ovsdbserver-nb\") pod \"dnsmasq-dns-6d66f584d7-fvhgj\" (UID: \"ac8c51b1-6f4a-4674-bf9c-dd6fb12dbf60\") " pod="openstack/dnsmasq-dns-6d66f584d7-fvhgj" Sep 30 07:50:54 crc kubenswrapper[4760]: I0930 07:50:54.799806 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/caf10164-5c77-42df-9fdc-b6a1764a0e3d-config-data-custom\") pod \"barbican-worker-8479dd9dbc-25wxx\" (UID: \"caf10164-5c77-42df-9fdc-b6a1764a0e3d\") " pod="openstack/barbican-worker-8479dd9dbc-25wxx" Sep 30 07:50:54 crc kubenswrapper[4760]: I0930 07:50:54.799880 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/caf10164-5c77-42df-9fdc-b6a1764a0e3d-config-data\") pod \"barbican-worker-8479dd9dbc-25wxx\" (UID: \"caf10164-5c77-42df-9fdc-b6a1764a0e3d\") " pod="openstack/barbican-worker-8479dd9dbc-25wxx" Sep 30 07:50:54 crc kubenswrapper[4760]: I0930 07:50:54.799900 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9xz5d\" (UniqueName: \"kubernetes.io/projected/caf10164-5c77-42df-9fdc-b6a1764a0e3d-kube-api-access-9xz5d\") pod \"barbican-worker-8479dd9dbc-25wxx\" (UID: \"caf10164-5c77-42df-9fdc-b6a1764a0e3d\") " pod="openstack/barbican-worker-8479dd9dbc-25wxx" Sep 30 07:50:54 crc kubenswrapper[4760]: I0930 07:50:54.799924 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/caf10164-5c77-42df-9fdc-b6a1764a0e3d-logs\") pod \"barbican-worker-8479dd9dbc-25wxx\" (UID: \"caf10164-5c77-42df-9fdc-b6a1764a0e3d\") " 
pod="openstack/barbican-worker-8479dd9dbc-25wxx" Sep 30 07:50:54 crc kubenswrapper[4760]: I0930 07:50:54.799976 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ac8c51b1-6f4a-4674-bf9c-dd6fb12dbf60-dns-swift-storage-0\") pod \"dnsmasq-dns-6d66f584d7-fvhgj\" (UID: \"ac8c51b1-6f4a-4674-bf9c-dd6fb12dbf60\") " pod="openstack/dnsmasq-dns-6d66f584d7-fvhgj" Sep 30 07:50:54 crc kubenswrapper[4760]: I0930 07:50:54.800017 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ac8c51b1-6f4a-4674-bf9c-dd6fb12dbf60-ovsdbserver-sb\") pod \"dnsmasq-dns-6d66f584d7-fvhgj\" (UID: \"ac8c51b1-6f4a-4674-bf9c-dd6fb12dbf60\") " pod="openstack/dnsmasq-dns-6d66f584d7-fvhgj" Sep 30 07:50:54 crc kubenswrapper[4760]: I0930 07:50:54.802410 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/caf10164-5c77-42df-9fdc-b6a1764a0e3d-logs\") pod \"barbican-worker-8479dd9dbc-25wxx\" (UID: \"caf10164-5c77-42df-9fdc-b6a1764a0e3d\") " pod="openstack/barbican-worker-8479dd9dbc-25wxx" Sep 30 07:50:54 crc kubenswrapper[4760]: I0930 07:50:54.805202 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/caf10164-5c77-42df-9fdc-b6a1764a0e3d-combined-ca-bundle\") pod \"barbican-worker-8479dd9dbc-25wxx\" (UID: \"caf10164-5c77-42df-9fdc-b6a1764a0e3d\") " pod="openstack/barbican-worker-8479dd9dbc-25wxx" Sep 30 07:50:54 crc kubenswrapper[4760]: I0930 07:50:54.819250 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/caf10164-5c77-42df-9fdc-b6a1764a0e3d-config-data-custom\") pod \"barbican-worker-8479dd9dbc-25wxx\" (UID: \"caf10164-5c77-42df-9fdc-b6a1764a0e3d\") " 
pod="openstack/barbican-worker-8479dd9dbc-25wxx" Sep 30 07:50:54 crc kubenswrapper[4760]: I0930 07:50:54.825892 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9xz5d\" (UniqueName: \"kubernetes.io/projected/caf10164-5c77-42df-9fdc-b6a1764a0e3d-kube-api-access-9xz5d\") pod \"barbican-worker-8479dd9dbc-25wxx\" (UID: \"caf10164-5c77-42df-9fdc-b6a1764a0e3d\") " pod="openstack/barbican-worker-8479dd9dbc-25wxx" Sep 30 07:50:54 crc kubenswrapper[4760]: I0930 07:50:54.827932 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/caf10164-5c77-42df-9fdc-b6a1764a0e3d-config-data\") pod \"barbican-worker-8479dd9dbc-25wxx\" (UID: \"caf10164-5c77-42df-9fdc-b6a1764a0e3d\") " pod="openstack/barbican-worker-8479dd9dbc-25wxx" Sep 30 07:50:54 crc kubenswrapper[4760]: I0930 07:50:54.828645 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-554bb7d464-zcqc8" Sep 30 07:50:54 crc kubenswrapper[4760]: I0930 07:50:54.865724 4760 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-worker-8479dd9dbc-25wxx" Sep 30 07:50:54 crc kubenswrapper[4760]: I0930 07:50:54.901278 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ac8c51b1-6f4a-4674-bf9c-dd6fb12dbf60-ovsdbserver-sb\") pod \"dnsmasq-dns-6d66f584d7-fvhgj\" (UID: \"ac8c51b1-6f4a-4674-bf9c-dd6fb12dbf60\") " pod="openstack/dnsmasq-dns-6d66f584d7-fvhgj" Sep 30 07:50:54 crc kubenswrapper[4760]: I0930 07:50:54.901335 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ac8c51b1-6f4a-4674-bf9c-dd6fb12dbf60-dns-svc\") pod \"dnsmasq-dns-6d66f584d7-fvhgj\" (UID: \"ac8c51b1-6f4a-4674-bf9c-dd6fb12dbf60\") " pod="openstack/dnsmasq-dns-6d66f584d7-fvhgj" Sep 30 07:50:54 crc kubenswrapper[4760]: I0930 07:50:54.901360 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8adf3e3f-bdc8-4a02-90da-71e21926d9f9-config-data-custom\") pod \"barbican-api-7c45496686-rqmqg\" (UID: \"8adf3e3f-bdc8-4a02-90da-71e21926d9f9\") " pod="openstack/barbican-api-7c45496686-rqmqg" Sep 30 07:50:54 crc kubenswrapper[4760]: I0930 07:50:54.901390 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8adf3e3f-bdc8-4a02-90da-71e21926d9f9-config-data\") pod \"barbican-api-7c45496686-rqmqg\" (UID: \"8adf3e3f-bdc8-4a02-90da-71e21926d9f9\") " pod="openstack/barbican-api-7c45496686-rqmqg" Sep 30 07:50:54 crc kubenswrapper[4760]: I0930 07:50:54.901415 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ac8c51b1-6f4a-4674-bf9c-dd6fb12dbf60-config\") pod \"dnsmasq-dns-6d66f584d7-fvhgj\" (UID: \"ac8c51b1-6f4a-4674-bf9c-dd6fb12dbf60\") " 
pod="openstack/dnsmasq-dns-6d66f584d7-fvhgj" Sep 30 07:50:54 crc kubenswrapper[4760]: I0930 07:50:54.901448 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mqggl\" (UniqueName: \"kubernetes.io/projected/ac8c51b1-6f4a-4674-bf9c-dd6fb12dbf60-kube-api-access-mqggl\") pod \"dnsmasq-dns-6d66f584d7-fvhgj\" (UID: \"ac8c51b1-6f4a-4674-bf9c-dd6fb12dbf60\") " pod="openstack/dnsmasq-dns-6d66f584d7-fvhgj" Sep 30 07:50:54 crc kubenswrapper[4760]: I0930 07:50:54.901481 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ac8c51b1-6f4a-4674-bf9c-dd6fb12dbf60-ovsdbserver-nb\") pod \"dnsmasq-dns-6d66f584d7-fvhgj\" (UID: \"ac8c51b1-6f4a-4674-bf9c-dd6fb12dbf60\") " pod="openstack/dnsmasq-dns-6d66f584d7-fvhgj" Sep 30 07:50:54 crc kubenswrapper[4760]: I0930 07:50:54.901505 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t9rcs\" (UniqueName: \"kubernetes.io/projected/8adf3e3f-bdc8-4a02-90da-71e21926d9f9-kube-api-access-t9rcs\") pod \"barbican-api-7c45496686-rqmqg\" (UID: \"8adf3e3f-bdc8-4a02-90da-71e21926d9f9\") " pod="openstack/barbican-api-7c45496686-rqmqg" Sep 30 07:50:54 crc kubenswrapper[4760]: I0930 07:50:54.901524 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8adf3e3f-bdc8-4a02-90da-71e21926d9f9-combined-ca-bundle\") pod \"barbican-api-7c45496686-rqmqg\" (UID: \"8adf3e3f-bdc8-4a02-90da-71e21926d9f9\") " pod="openstack/barbican-api-7c45496686-rqmqg" Sep 30 07:50:54 crc kubenswrapper[4760]: I0930 07:50:54.901546 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8adf3e3f-bdc8-4a02-90da-71e21926d9f9-logs\") pod \"barbican-api-7c45496686-rqmqg\" (UID: 
\"8adf3e3f-bdc8-4a02-90da-71e21926d9f9\") " pod="openstack/barbican-api-7c45496686-rqmqg" Sep 30 07:50:54 crc kubenswrapper[4760]: I0930 07:50:54.901614 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ac8c51b1-6f4a-4674-bf9c-dd6fb12dbf60-dns-swift-storage-0\") pod \"dnsmasq-dns-6d66f584d7-fvhgj\" (UID: \"ac8c51b1-6f4a-4674-bf9c-dd6fb12dbf60\") " pod="openstack/dnsmasq-dns-6d66f584d7-fvhgj" Sep 30 07:50:54 crc kubenswrapper[4760]: I0930 07:50:54.902896 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ac8c51b1-6f4a-4674-bf9c-dd6fb12dbf60-ovsdbserver-sb\") pod \"dnsmasq-dns-6d66f584d7-fvhgj\" (UID: \"ac8c51b1-6f4a-4674-bf9c-dd6fb12dbf60\") " pod="openstack/dnsmasq-dns-6d66f584d7-fvhgj" Sep 30 07:50:54 crc kubenswrapper[4760]: I0930 07:50:54.902915 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ac8c51b1-6f4a-4674-bf9c-dd6fb12dbf60-config\") pod \"dnsmasq-dns-6d66f584d7-fvhgj\" (UID: \"ac8c51b1-6f4a-4674-bf9c-dd6fb12dbf60\") " pod="openstack/dnsmasq-dns-6d66f584d7-fvhgj" Sep 30 07:50:54 crc kubenswrapper[4760]: I0930 07:50:54.902958 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ac8c51b1-6f4a-4674-bf9c-dd6fb12dbf60-ovsdbserver-nb\") pod \"dnsmasq-dns-6d66f584d7-fvhgj\" (UID: \"ac8c51b1-6f4a-4674-bf9c-dd6fb12dbf60\") " pod="openstack/dnsmasq-dns-6d66f584d7-fvhgj" Sep 30 07:50:54 crc kubenswrapper[4760]: I0930 07:50:54.903172 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ac8c51b1-6f4a-4674-bf9c-dd6fb12dbf60-dns-svc\") pod \"dnsmasq-dns-6d66f584d7-fvhgj\" (UID: \"ac8c51b1-6f4a-4674-bf9c-dd6fb12dbf60\") " pod="openstack/dnsmasq-dns-6d66f584d7-fvhgj" Sep 30 07:50:54 
crc kubenswrapper[4760]: I0930 07:50:54.903180 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ac8c51b1-6f4a-4674-bf9c-dd6fb12dbf60-dns-swift-storage-0\") pod \"dnsmasq-dns-6d66f584d7-fvhgj\" (UID: \"ac8c51b1-6f4a-4674-bf9c-dd6fb12dbf60\") " pod="openstack/dnsmasq-dns-6d66f584d7-fvhgj" Sep 30 07:50:54 crc kubenswrapper[4760]: I0930 07:50:54.924384 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mqggl\" (UniqueName: \"kubernetes.io/projected/ac8c51b1-6f4a-4674-bf9c-dd6fb12dbf60-kube-api-access-mqggl\") pod \"dnsmasq-dns-6d66f584d7-fvhgj\" (UID: \"ac8c51b1-6f4a-4674-bf9c-dd6fb12dbf60\") " pod="openstack/dnsmasq-dns-6d66f584d7-fvhgj" Sep 30 07:50:55 crc kubenswrapper[4760]: I0930 07:50:55.004317 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8adf3e3f-bdc8-4a02-90da-71e21926d9f9-config-data-custom\") pod \"barbican-api-7c45496686-rqmqg\" (UID: \"8adf3e3f-bdc8-4a02-90da-71e21926d9f9\") " pod="openstack/barbican-api-7c45496686-rqmqg" Sep 30 07:50:55 crc kubenswrapper[4760]: I0930 07:50:55.004368 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8adf3e3f-bdc8-4a02-90da-71e21926d9f9-config-data\") pod \"barbican-api-7c45496686-rqmqg\" (UID: \"8adf3e3f-bdc8-4a02-90da-71e21926d9f9\") " pod="openstack/barbican-api-7c45496686-rqmqg" Sep 30 07:50:55 crc kubenswrapper[4760]: I0930 07:50:55.004454 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t9rcs\" (UniqueName: \"kubernetes.io/projected/8adf3e3f-bdc8-4a02-90da-71e21926d9f9-kube-api-access-t9rcs\") pod \"barbican-api-7c45496686-rqmqg\" (UID: \"8adf3e3f-bdc8-4a02-90da-71e21926d9f9\") " pod="openstack/barbican-api-7c45496686-rqmqg" Sep 30 07:50:55 crc kubenswrapper[4760]: I0930 
07:50:55.004476 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8adf3e3f-bdc8-4a02-90da-71e21926d9f9-combined-ca-bundle\") pod \"barbican-api-7c45496686-rqmqg\" (UID: \"8adf3e3f-bdc8-4a02-90da-71e21926d9f9\") " pod="openstack/barbican-api-7c45496686-rqmqg" Sep 30 07:50:55 crc kubenswrapper[4760]: I0930 07:50:55.004501 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8adf3e3f-bdc8-4a02-90da-71e21926d9f9-logs\") pod \"barbican-api-7c45496686-rqmqg\" (UID: \"8adf3e3f-bdc8-4a02-90da-71e21926d9f9\") " pod="openstack/barbican-api-7c45496686-rqmqg" Sep 30 07:50:55 crc kubenswrapper[4760]: I0930 07:50:55.005239 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8adf3e3f-bdc8-4a02-90da-71e21926d9f9-logs\") pod \"barbican-api-7c45496686-rqmqg\" (UID: \"8adf3e3f-bdc8-4a02-90da-71e21926d9f9\") " pod="openstack/barbican-api-7c45496686-rqmqg" Sep 30 07:50:55 crc kubenswrapper[4760]: I0930 07:50:55.008346 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8adf3e3f-bdc8-4a02-90da-71e21926d9f9-config-data-custom\") pod \"barbican-api-7c45496686-rqmqg\" (UID: \"8adf3e3f-bdc8-4a02-90da-71e21926d9f9\") " pod="openstack/barbican-api-7c45496686-rqmqg" Sep 30 07:50:55 crc kubenswrapper[4760]: I0930 07:50:55.009800 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8adf3e3f-bdc8-4a02-90da-71e21926d9f9-combined-ca-bundle\") pod \"barbican-api-7c45496686-rqmqg\" (UID: \"8adf3e3f-bdc8-4a02-90da-71e21926d9f9\") " pod="openstack/barbican-api-7c45496686-rqmqg" Sep 30 07:50:55 crc kubenswrapper[4760]: I0930 07:50:55.010270 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"config-data\" (UniqueName: \"kubernetes.io/secret/8adf3e3f-bdc8-4a02-90da-71e21926d9f9-config-data\") pod \"barbican-api-7c45496686-rqmqg\" (UID: \"8adf3e3f-bdc8-4a02-90da-71e21926d9f9\") " pod="openstack/barbican-api-7c45496686-rqmqg" Sep 30 07:50:55 crc kubenswrapper[4760]: I0930 07:50:55.030680 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t9rcs\" (UniqueName: \"kubernetes.io/projected/8adf3e3f-bdc8-4a02-90da-71e21926d9f9-kube-api-access-t9rcs\") pod \"barbican-api-7c45496686-rqmqg\" (UID: \"8adf3e3f-bdc8-4a02-90da-71e21926d9f9\") " pod="openstack/barbican-api-7c45496686-rqmqg" Sep 30 07:50:55 crc kubenswrapper[4760]: I0930 07:50:55.198719 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6d66f584d7-fvhgj" Sep 30 07:50:55 crc kubenswrapper[4760]: I0930 07:50:55.210930 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-7c45496686-rqmqg" Sep 30 07:50:56 crc kubenswrapper[4760]: I0930 07:50:56.905772 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-65c7cb7cc8-cjqdk"] Sep 30 07:50:56 crc kubenswrapper[4760]: I0930 07:50:56.907844 4760 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-65c7cb7cc8-cjqdk" Sep 30 07:50:56 crc kubenswrapper[4760]: I0930 07:50:56.917806 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-public-svc" Sep 30 07:50:56 crc kubenswrapper[4760]: I0930 07:50:56.951592 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-internal-svc" Sep 30 07:50:56 crc kubenswrapper[4760]: I0930 07:50:56.972010 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-65c7cb7cc8-cjqdk"] Sep 30 07:50:57 crc kubenswrapper[4760]: I0930 07:50:57.054604 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5lxlw\" (UniqueName: \"kubernetes.io/projected/59d36e9c-c1d2-4da4-be6b-ddfbc9fd76b3-kube-api-access-5lxlw\") pod \"barbican-api-65c7cb7cc8-cjqdk\" (UID: \"59d36e9c-c1d2-4da4-be6b-ddfbc9fd76b3\") " pod="openstack/barbican-api-65c7cb7cc8-cjqdk" Sep 30 07:50:57 crc kubenswrapper[4760]: I0930 07:50:57.054672 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/59d36e9c-c1d2-4da4-be6b-ddfbc9fd76b3-public-tls-certs\") pod \"barbican-api-65c7cb7cc8-cjqdk\" (UID: \"59d36e9c-c1d2-4da4-be6b-ddfbc9fd76b3\") " pod="openstack/barbican-api-65c7cb7cc8-cjqdk" Sep 30 07:50:57 crc kubenswrapper[4760]: I0930 07:50:57.054707 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/59d36e9c-c1d2-4da4-be6b-ddfbc9fd76b3-logs\") pod \"barbican-api-65c7cb7cc8-cjqdk\" (UID: \"59d36e9c-c1d2-4da4-be6b-ddfbc9fd76b3\") " pod="openstack/barbican-api-65c7cb7cc8-cjqdk" Sep 30 07:50:57 crc kubenswrapper[4760]: I0930 07:50:57.054776 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/59d36e9c-c1d2-4da4-be6b-ddfbc9fd76b3-config-data\") pod \"barbican-api-65c7cb7cc8-cjqdk\" (UID: \"59d36e9c-c1d2-4da4-be6b-ddfbc9fd76b3\") " pod="openstack/barbican-api-65c7cb7cc8-cjqdk" Sep 30 07:50:57 crc kubenswrapper[4760]: I0930 07:50:57.054816 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/59d36e9c-c1d2-4da4-be6b-ddfbc9fd76b3-internal-tls-certs\") pod \"barbican-api-65c7cb7cc8-cjqdk\" (UID: \"59d36e9c-c1d2-4da4-be6b-ddfbc9fd76b3\") " pod="openstack/barbican-api-65c7cb7cc8-cjqdk" Sep 30 07:50:57 crc kubenswrapper[4760]: I0930 07:50:57.054853 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/59d36e9c-c1d2-4da4-be6b-ddfbc9fd76b3-combined-ca-bundle\") pod \"barbican-api-65c7cb7cc8-cjqdk\" (UID: \"59d36e9c-c1d2-4da4-be6b-ddfbc9fd76b3\") " pod="openstack/barbican-api-65c7cb7cc8-cjqdk" Sep 30 07:50:57 crc kubenswrapper[4760]: I0930 07:50:57.054879 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/59d36e9c-c1d2-4da4-be6b-ddfbc9fd76b3-config-data-custom\") pod \"barbican-api-65c7cb7cc8-cjqdk\" (UID: \"59d36e9c-c1d2-4da4-be6b-ddfbc9fd76b3\") " pod="openstack/barbican-api-65c7cb7cc8-cjqdk" Sep 30 07:50:57 crc kubenswrapper[4760]: I0930 07:50:57.156441 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/59d36e9c-c1d2-4da4-be6b-ddfbc9fd76b3-combined-ca-bundle\") pod \"barbican-api-65c7cb7cc8-cjqdk\" (UID: \"59d36e9c-c1d2-4da4-be6b-ddfbc9fd76b3\") " pod="openstack/barbican-api-65c7cb7cc8-cjqdk" Sep 30 07:50:57 crc kubenswrapper[4760]: I0930 07:50:57.156496 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"config-data-custom\" (UniqueName: \"kubernetes.io/secret/59d36e9c-c1d2-4da4-be6b-ddfbc9fd76b3-config-data-custom\") pod \"barbican-api-65c7cb7cc8-cjqdk\" (UID: \"59d36e9c-c1d2-4da4-be6b-ddfbc9fd76b3\") " pod="openstack/barbican-api-65c7cb7cc8-cjqdk" Sep 30 07:50:57 crc kubenswrapper[4760]: I0930 07:50:57.156609 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5lxlw\" (UniqueName: \"kubernetes.io/projected/59d36e9c-c1d2-4da4-be6b-ddfbc9fd76b3-kube-api-access-5lxlw\") pod \"barbican-api-65c7cb7cc8-cjqdk\" (UID: \"59d36e9c-c1d2-4da4-be6b-ddfbc9fd76b3\") " pod="openstack/barbican-api-65c7cb7cc8-cjqdk" Sep 30 07:50:57 crc kubenswrapper[4760]: I0930 07:50:57.156638 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/59d36e9c-c1d2-4da4-be6b-ddfbc9fd76b3-public-tls-certs\") pod \"barbican-api-65c7cb7cc8-cjqdk\" (UID: \"59d36e9c-c1d2-4da4-be6b-ddfbc9fd76b3\") " pod="openstack/barbican-api-65c7cb7cc8-cjqdk" Sep 30 07:50:57 crc kubenswrapper[4760]: I0930 07:50:57.156664 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/59d36e9c-c1d2-4da4-be6b-ddfbc9fd76b3-logs\") pod \"barbican-api-65c7cb7cc8-cjqdk\" (UID: \"59d36e9c-c1d2-4da4-be6b-ddfbc9fd76b3\") " pod="openstack/barbican-api-65c7cb7cc8-cjqdk" Sep 30 07:50:57 crc kubenswrapper[4760]: I0930 07:50:57.156729 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/59d36e9c-c1d2-4da4-be6b-ddfbc9fd76b3-config-data\") pod \"barbican-api-65c7cb7cc8-cjqdk\" (UID: \"59d36e9c-c1d2-4da4-be6b-ddfbc9fd76b3\") " pod="openstack/barbican-api-65c7cb7cc8-cjqdk" Sep 30 07:50:57 crc kubenswrapper[4760]: I0930 07:50:57.156774 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/59d36e9c-c1d2-4da4-be6b-ddfbc9fd76b3-internal-tls-certs\") pod \"barbican-api-65c7cb7cc8-cjqdk\" (UID: \"59d36e9c-c1d2-4da4-be6b-ddfbc9fd76b3\") " pod="openstack/barbican-api-65c7cb7cc8-cjqdk" Sep 30 07:50:57 crc kubenswrapper[4760]: I0930 07:50:57.159468 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/59d36e9c-c1d2-4da4-be6b-ddfbc9fd76b3-logs\") pod \"barbican-api-65c7cb7cc8-cjqdk\" (UID: \"59d36e9c-c1d2-4da4-be6b-ddfbc9fd76b3\") " pod="openstack/barbican-api-65c7cb7cc8-cjqdk" Sep 30 07:50:57 crc kubenswrapper[4760]: I0930 07:50:57.175921 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/59d36e9c-c1d2-4da4-be6b-ddfbc9fd76b3-internal-tls-certs\") pod \"barbican-api-65c7cb7cc8-cjqdk\" (UID: \"59d36e9c-c1d2-4da4-be6b-ddfbc9fd76b3\") " pod="openstack/barbican-api-65c7cb7cc8-cjqdk" Sep 30 07:50:57 crc kubenswrapper[4760]: I0930 07:50:57.176075 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/59d36e9c-c1d2-4da4-be6b-ddfbc9fd76b3-config-data\") pod \"barbican-api-65c7cb7cc8-cjqdk\" (UID: \"59d36e9c-c1d2-4da4-be6b-ddfbc9fd76b3\") " pod="openstack/barbican-api-65c7cb7cc8-cjqdk" Sep 30 07:50:57 crc kubenswrapper[4760]: I0930 07:50:57.180861 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/59d36e9c-c1d2-4da4-be6b-ddfbc9fd76b3-config-data-custom\") pod \"barbican-api-65c7cb7cc8-cjqdk\" (UID: \"59d36e9c-c1d2-4da4-be6b-ddfbc9fd76b3\") " pod="openstack/barbican-api-65c7cb7cc8-cjqdk" Sep 30 07:50:57 crc kubenswrapper[4760]: I0930 07:50:57.199089 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/59d36e9c-c1d2-4da4-be6b-ddfbc9fd76b3-public-tls-certs\") pod 
\"barbican-api-65c7cb7cc8-cjqdk\" (UID: \"59d36e9c-c1d2-4da4-be6b-ddfbc9fd76b3\") " pod="openstack/barbican-api-65c7cb7cc8-cjqdk" Sep 30 07:50:57 crc kubenswrapper[4760]: I0930 07:50:57.199892 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/59d36e9c-c1d2-4da4-be6b-ddfbc9fd76b3-combined-ca-bundle\") pod \"barbican-api-65c7cb7cc8-cjqdk\" (UID: \"59d36e9c-c1d2-4da4-be6b-ddfbc9fd76b3\") " pod="openstack/barbican-api-65c7cb7cc8-cjqdk" Sep 30 07:50:57 crc kubenswrapper[4760]: I0930 07:50:57.214006 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5lxlw\" (UniqueName: \"kubernetes.io/projected/59d36e9c-c1d2-4da4-be6b-ddfbc9fd76b3-kube-api-access-5lxlw\") pod \"barbican-api-65c7cb7cc8-cjqdk\" (UID: \"59d36e9c-c1d2-4da4-be6b-ddfbc9fd76b3\") " pod="openstack/barbican-api-65c7cb7cc8-cjqdk" Sep 30 07:50:57 crc kubenswrapper[4760]: I0930 07:50:57.287774 4760 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-65c7cb7cc8-cjqdk" Sep 30 07:50:59 crc kubenswrapper[4760]: I0930 07:50:59.370910 4760 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/watcher-api-0" podUID="f1e2802c-ec39-48ee-a6c4-b609a57aa0c5" containerName="watcher-api-log" probeResult="failure" output="Get \"http://10.217.0.154:9322/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Sep 30 07:50:59 crc kubenswrapper[4760]: I0930 07:50:59.371727 4760 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/watcher-api-0" podUID="f1e2802c-ec39-48ee-a6c4-b609a57aa0c5" containerName="watcher-api" probeResult="failure" output="Get \"http://10.217.0.154:9322/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Sep 30 07:50:59 crc kubenswrapper[4760]: I0930 07:50:59.466816 4760 generic.go:334] "Generic (PLEG): container finished" podID="108a4c03-5bd3-45d0-a13d-b67e01bd7654" containerID="34df7a6fed2b13c4f77c2412b04d3044f9f81e01a69fa17b070335180992cc7c" exitCode=0 Sep 30 07:50:59 crc kubenswrapper[4760]: I0930 07:50:59.466859 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-8vvnb" event={"ID":"108a4c03-5bd3-45d0-a13d-b67e01bd7654","Type":"ContainerDied","Data":"34df7a6fed2b13c4f77c2412b04d3044f9f81e01a69fa17b070335180992cc7c"} Sep 30 07:51:01 crc kubenswrapper[4760]: I0930 07:51:01.048554 4760 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-fcfdd6f9f-j5p4t" podUID="da7c2ace-093d-4279-bad0-5f2876f4ab8d" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.149:5353: i/o timeout" Sep 30 07:51:02 crc kubenswrapper[4760]: I0930 07:51:02.889995 4760 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-fcfdd6f9f-j5p4t" Sep 30 07:51:02 crc kubenswrapper[4760]: I0930 07:51:02.900192 4760 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-sync-8vvnb" Sep 30 07:51:02 crc kubenswrapper[4760]: I0930 07:51:02.909616 4760 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-api-0" Sep 30 07:51:02 crc kubenswrapper[4760]: I0930 07:51:02.982788 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pq6fn\" (UniqueName: \"kubernetes.io/projected/108a4c03-5bd3-45d0-a13d-b67e01bd7654-kube-api-access-pq6fn\") pod \"108a4c03-5bd3-45d0-a13d-b67e01bd7654\" (UID: \"108a4c03-5bd3-45d0-a13d-b67e01bd7654\") " Sep 30 07:51:02 crc kubenswrapper[4760]: I0930 07:51:02.982911 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/da7c2ace-093d-4279-bad0-5f2876f4ab8d-ovsdbserver-sb\") pod \"da7c2ace-093d-4279-bad0-5f2876f4ab8d\" (UID: \"da7c2ace-093d-4279-bad0-5f2876f4ab8d\") " Sep 30 07:51:02 crc kubenswrapper[4760]: I0930 07:51:02.982940 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f1e2802c-ec39-48ee-a6c4-b609a57aa0c5-logs\") pod \"f1e2802c-ec39-48ee-a6c4-b609a57aa0c5\" (UID: \"f1e2802c-ec39-48ee-a6c4-b609a57aa0c5\") " Sep 30 07:51:02 crc kubenswrapper[4760]: I0930 07:51:02.982980 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f1e2802c-ec39-48ee-a6c4-b609a57aa0c5-combined-ca-bundle\") pod \"f1e2802c-ec39-48ee-a6c4-b609a57aa0c5\" (UID: \"f1e2802c-ec39-48ee-a6c4-b609a57aa0c5\") " Sep 30 07:51:02 crc kubenswrapper[4760]: I0930 07:51:02.983009 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ltmzc\" (UniqueName: \"kubernetes.io/projected/da7c2ace-093d-4279-bad0-5f2876f4ab8d-kube-api-access-ltmzc\") pod \"da7c2ace-093d-4279-bad0-5f2876f4ab8d\" (UID: 
\"da7c2ace-093d-4279-bad0-5f2876f4ab8d\") " Sep 30 07:51:02 crc kubenswrapper[4760]: I0930 07:51:02.983058 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/da7c2ace-093d-4279-bad0-5f2876f4ab8d-config\") pod \"da7c2ace-093d-4279-bad0-5f2876f4ab8d\" (UID: \"da7c2ace-093d-4279-bad0-5f2876f4ab8d\") " Sep 30 07:51:02 crc kubenswrapper[4760]: I0930 07:51:02.983110 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/108a4c03-5bd3-45d0-a13d-b67e01bd7654-combined-ca-bundle\") pod \"108a4c03-5bd3-45d0-a13d-b67e01bd7654\" (UID: \"108a4c03-5bd3-45d0-a13d-b67e01bd7654\") " Sep 30 07:51:02 crc kubenswrapper[4760]: I0930 07:51:02.983157 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/f1e2802c-ec39-48ee-a6c4-b609a57aa0c5-custom-prometheus-ca\") pod \"f1e2802c-ec39-48ee-a6c4-b609a57aa0c5\" (UID: \"f1e2802c-ec39-48ee-a6c4-b609a57aa0c5\") " Sep 30 07:51:02 crc kubenswrapper[4760]: I0930 07:51:02.983194 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/da7c2ace-093d-4279-bad0-5f2876f4ab8d-dns-swift-storage-0\") pod \"da7c2ace-093d-4279-bad0-5f2876f4ab8d\" (UID: \"da7c2ace-093d-4279-bad0-5f2876f4ab8d\") " Sep 30 07:51:02 crc kubenswrapper[4760]: I0930 07:51:02.983215 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/da7c2ace-093d-4279-bad0-5f2876f4ab8d-ovsdbserver-nb\") pod \"da7c2ace-093d-4279-bad0-5f2876f4ab8d\" (UID: \"da7c2ace-093d-4279-bad0-5f2876f4ab8d\") " Sep 30 07:51:02 crc kubenswrapper[4760]: I0930 07:51:02.983270 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/f1e2802c-ec39-48ee-a6c4-b609a57aa0c5-config-data\") pod \"f1e2802c-ec39-48ee-a6c4-b609a57aa0c5\" (UID: \"f1e2802c-ec39-48ee-a6c4-b609a57aa0c5\") " Sep 30 07:51:02 crc kubenswrapper[4760]: I0930 07:51:02.983292 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6prkr\" (UniqueName: \"kubernetes.io/projected/f1e2802c-ec39-48ee-a6c4-b609a57aa0c5-kube-api-access-6prkr\") pod \"f1e2802c-ec39-48ee-a6c4-b609a57aa0c5\" (UID: \"f1e2802c-ec39-48ee-a6c4-b609a57aa0c5\") " Sep 30 07:51:02 crc kubenswrapper[4760]: I0930 07:51:02.983366 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/da7c2ace-093d-4279-bad0-5f2876f4ab8d-dns-svc\") pod \"da7c2ace-093d-4279-bad0-5f2876f4ab8d\" (UID: \"da7c2ace-093d-4279-bad0-5f2876f4ab8d\") " Sep 30 07:51:02 crc kubenswrapper[4760]: I0930 07:51:02.983413 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/108a4c03-5bd3-45d0-a13d-b67e01bd7654-config\") pod \"108a4c03-5bd3-45d0-a13d-b67e01bd7654\" (UID: \"108a4c03-5bd3-45d0-a13d-b67e01bd7654\") " Sep 30 07:51:02 crc kubenswrapper[4760]: I0930 07:51:02.988045 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f1e2802c-ec39-48ee-a6c4-b609a57aa0c5-logs" (OuterVolumeSpecName: "logs") pod "f1e2802c-ec39-48ee-a6c4-b609a57aa0c5" (UID: "f1e2802c-ec39-48ee-a6c4-b609a57aa0c5"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 07:51:02 crc kubenswrapper[4760]: I0930 07:51:02.995527 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f1e2802c-ec39-48ee-a6c4-b609a57aa0c5-kube-api-access-6prkr" (OuterVolumeSpecName: "kube-api-access-6prkr") pod "f1e2802c-ec39-48ee-a6c4-b609a57aa0c5" (UID: "f1e2802c-ec39-48ee-a6c4-b609a57aa0c5"). 
InnerVolumeSpecName "kube-api-access-6prkr". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 07:51:02 crc kubenswrapper[4760]: I0930 07:51:02.999646 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/da7c2ace-093d-4279-bad0-5f2876f4ab8d-kube-api-access-ltmzc" (OuterVolumeSpecName: "kube-api-access-ltmzc") pod "da7c2ace-093d-4279-bad0-5f2876f4ab8d" (UID: "da7c2ace-093d-4279-bad0-5f2876f4ab8d"). InnerVolumeSpecName "kube-api-access-ltmzc". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 07:51:03 crc kubenswrapper[4760]: I0930 07:51:03.026543 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/108a4c03-5bd3-45d0-a13d-b67e01bd7654-kube-api-access-pq6fn" (OuterVolumeSpecName: "kube-api-access-pq6fn") pod "108a4c03-5bd3-45d0-a13d-b67e01bd7654" (UID: "108a4c03-5bd3-45d0-a13d-b67e01bd7654"). InnerVolumeSpecName "kube-api-access-pq6fn". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 07:51:03 crc kubenswrapper[4760]: I0930 07:51:03.044990 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/108a4c03-5bd3-45d0-a13d-b67e01bd7654-config" (OuterVolumeSpecName: "config") pod "108a4c03-5bd3-45d0-a13d-b67e01bd7654" (UID: "108a4c03-5bd3-45d0-a13d-b67e01bd7654"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 07:51:03 crc kubenswrapper[4760]: I0930 07:51:03.048171 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f1e2802c-ec39-48ee-a6c4-b609a57aa0c5-custom-prometheus-ca" (OuterVolumeSpecName: "custom-prometheus-ca") pod "f1e2802c-ec39-48ee-a6c4-b609a57aa0c5" (UID: "f1e2802c-ec39-48ee-a6c4-b609a57aa0c5"). InnerVolumeSpecName "custom-prometheus-ca". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 07:51:03 crc kubenswrapper[4760]: I0930 07:51:03.063434 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/da7c2ace-093d-4279-bad0-5f2876f4ab8d-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "da7c2ace-093d-4279-bad0-5f2876f4ab8d" (UID: "da7c2ace-093d-4279-bad0-5f2876f4ab8d"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 07:51:03 crc kubenswrapper[4760]: I0930 07:51:03.072920 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/da7c2ace-093d-4279-bad0-5f2876f4ab8d-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "da7c2ace-093d-4279-bad0-5f2876f4ab8d" (UID: "da7c2ace-093d-4279-bad0-5f2876f4ab8d"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 07:51:03 crc kubenswrapper[4760]: I0930 07:51:03.080798 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/da7c2ace-093d-4279-bad0-5f2876f4ab8d-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "da7c2ace-093d-4279-bad0-5f2876f4ab8d" (UID: "da7c2ace-093d-4279-bad0-5f2876f4ab8d"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 07:51:03 crc kubenswrapper[4760]: I0930 07:51:03.088362 4760 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f1e2802c-ec39-48ee-a6c4-b609a57aa0c5-logs\") on node \"crc\" DevicePath \"\"" Sep 30 07:51:03 crc kubenswrapper[4760]: I0930 07:51:03.088418 4760 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ltmzc\" (UniqueName: \"kubernetes.io/projected/da7c2ace-093d-4279-bad0-5f2876f4ab8d-kube-api-access-ltmzc\") on node \"crc\" DevicePath \"\"" Sep 30 07:51:03 crc kubenswrapper[4760]: I0930 07:51:03.088429 4760 reconciler_common.go:293] "Volume detached for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/f1e2802c-ec39-48ee-a6c4-b609a57aa0c5-custom-prometheus-ca\") on node \"crc\" DevicePath \"\"" Sep 30 07:51:03 crc kubenswrapper[4760]: I0930 07:51:03.088440 4760 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/da7c2ace-093d-4279-bad0-5f2876f4ab8d-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Sep 30 07:51:03 crc kubenswrapper[4760]: I0930 07:51:03.088454 4760 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/da7c2ace-093d-4279-bad0-5f2876f4ab8d-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Sep 30 07:51:03 crc kubenswrapper[4760]: I0930 07:51:03.088463 4760 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6prkr\" (UniqueName: \"kubernetes.io/projected/f1e2802c-ec39-48ee-a6c4-b609a57aa0c5-kube-api-access-6prkr\") on node \"crc\" DevicePath \"\"" Sep 30 07:51:03 crc kubenswrapper[4760]: I0930 07:51:03.088471 4760 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/da7c2ace-093d-4279-bad0-5f2876f4ab8d-dns-svc\") on node \"crc\" DevicePath \"\"" Sep 30 07:51:03 crc kubenswrapper[4760]: I0930 
07:51:03.088481 4760 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/108a4c03-5bd3-45d0-a13d-b67e01bd7654-config\") on node \"crc\" DevicePath \"\"" Sep 30 07:51:03 crc kubenswrapper[4760]: I0930 07:51:03.088489 4760 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pq6fn\" (UniqueName: \"kubernetes.io/projected/108a4c03-5bd3-45d0-a13d-b67e01bd7654-kube-api-access-pq6fn\") on node \"crc\" DevicePath \"\"" Sep 30 07:51:03 crc kubenswrapper[4760]: I0930 07:51:03.089874 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/108a4c03-5bd3-45d0-a13d-b67e01bd7654-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "108a4c03-5bd3-45d0-a13d-b67e01bd7654" (UID: "108a4c03-5bd3-45d0-a13d-b67e01bd7654"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 07:51:03 crc kubenswrapper[4760]: I0930 07:51:03.127826 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/da7c2ace-093d-4279-bad0-5f2876f4ab8d-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "da7c2ace-093d-4279-bad0-5f2876f4ab8d" (UID: "da7c2ace-093d-4279-bad0-5f2876f4ab8d"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 07:51:03 crc kubenswrapper[4760]: I0930 07:51:03.131454 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f1e2802c-ec39-48ee-a6c4-b609a57aa0c5-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f1e2802c-ec39-48ee-a6c4-b609a57aa0c5" (UID: "f1e2802c-ec39-48ee-a6c4-b609a57aa0c5"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 07:51:03 crc kubenswrapper[4760]: I0930 07:51:03.143774 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/da7c2ace-093d-4279-bad0-5f2876f4ab8d-config" (OuterVolumeSpecName: "config") pod "da7c2ace-093d-4279-bad0-5f2876f4ab8d" (UID: "da7c2ace-093d-4279-bad0-5f2876f4ab8d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 07:51:03 crc kubenswrapper[4760]: I0930 07:51:03.148210 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f1e2802c-ec39-48ee-a6c4-b609a57aa0c5-config-data" (OuterVolumeSpecName: "config-data") pod "f1e2802c-ec39-48ee-a6c4-b609a57aa0c5" (UID: "f1e2802c-ec39-48ee-a6c4-b609a57aa0c5"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 07:51:03 crc kubenswrapper[4760]: I0930 07:51:03.191320 4760 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/da7c2ace-093d-4279-bad0-5f2876f4ab8d-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Sep 30 07:51:03 crc kubenswrapper[4760]: I0930 07:51:03.191357 4760 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f1e2802c-ec39-48ee-a6c4-b609a57aa0c5-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 07:51:03 crc kubenswrapper[4760]: I0930 07:51:03.191371 4760 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/da7c2ace-093d-4279-bad0-5f2876f4ab8d-config\") on node \"crc\" DevicePath \"\"" Sep 30 07:51:03 crc kubenswrapper[4760]: I0930 07:51:03.191383 4760 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/108a4c03-5bd3-45d0-a13d-b67e01bd7654-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 07:51:03 crc 
kubenswrapper[4760]: I0930 07:51:03.191396 4760 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f1e2802c-ec39-48ee-a6c4-b609a57aa0c5-config-data\") on node \"crc\" DevicePath \"\"" Sep 30 07:51:03 crc kubenswrapper[4760]: I0930 07:51:03.506223 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-fcfdd6f9f-j5p4t" event={"ID":"da7c2ace-093d-4279-bad0-5f2876f4ab8d","Type":"ContainerDied","Data":"23a9ea76aa5283912ad3c948cfd07d2ef8385b05a745b19385aff4ea80cbdb5f"} Sep 30 07:51:03 crc kubenswrapper[4760]: I0930 07:51:03.506262 4760 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-fcfdd6f9f-j5p4t" Sep 30 07:51:03 crc kubenswrapper[4760]: I0930 07:51:03.509731 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-8vvnb" event={"ID":"108a4c03-5bd3-45d0-a13d-b67e01bd7654","Type":"ContainerDied","Data":"cf259968486ff404265fc3045cf68173ccb6c56c32ee9888c1ff8a40ae8a8525"} Sep 30 07:51:03 crc kubenswrapper[4760]: I0930 07:51:03.509815 4760 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cf259968486ff404265fc3045cf68173ccb6c56c32ee9888c1ff8a40ae8a8525" Sep 30 07:51:03 crc kubenswrapper[4760]: I0930 07:51:03.509743 4760 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-8vvnb" Sep 30 07:51:03 crc kubenswrapper[4760]: I0930 07:51:03.514348 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-api-0" event={"ID":"f1e2802c-ec39-48ee-a6c4-b609a57aa0c5","Type":"ContainerDied","Data":"56a489224c6b0fd5341421022afd3b3837a95b15aae7004d350de464f74d61a2"} Sep 30 07:51:03 crc kubenswrapper[4760]: I0930 07:51:03.514404 4760 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/watcher-api-0" Sep 30 07:51:03 crc kubenswrapper[4760]: I0930 07:51:03.553238 4760 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-fcfdd6f9f-j5p4t"] Sep 30 07:51:03 crc kubenswrapper[4760]: I0930 07:51:03.564337 4760 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-fcfdd6f9f-j5p4t"] Sep 30 07:51:03 crc kubenswrapper[4760]: I0930 07:51:03.574412 4760 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/watcher-api-0"] Sep 30 07:51:03 crc kubenswrapper[4760]: I0930 07:51:03.583770 4760 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/watcher-api-0"] Sep 30 07:51:03 crc kubenswrapper[4760]: I0930 07:51:03.591459 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/watcher-api-0"] Sep 30 07:51:03 crc kubenswrapper[4760]: E0930 07:51:03.591906 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f1e2802c-ec39-48ee-a6c4-b609a57aa0c5" containerName="watcher-api-log" Sep 30 07:51:03 crc kubenswrapper[4760]: I0930 07:51:03.591926 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="f1e2802c-ec39-48ee-a6c4-b609a57aa0c5" containerName="watcher-api-log" Sep 30 07:51:03 crc kubenswrapper[4760]: E0930 07:51:03.591949 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="da7c2ace-093d-4279-bad0-5f2876f4ab8d" containerName="dnsmasq-dns" Sep 30 07:51:03 crc kubenswrapper[4760]: I0930 07:51:03.591956 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="da7c2ace-093d-4279-bad0-5f2876f4ab8d" containerName="dnsmasq-dns" Sep 30 07:51:03 crc kubenswrapper[4760]: E0930 07:51:03.591979 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f1e2802c-ec39-48ee-a6c4-b609a57aa0c5" containerName="watcher-api" Sep 30 07:51:03 crc kubenswrapper[4760]: I0930 07:51:03.591985 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="f1e2802c-ec39-48ee-a6c4-b609a57aa0c5" containerName="watcher-api" 
Sep 30 07:51:03 crc kubenswrapper[4760]: E0930 07:51:03.591997 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="da7c2ace-093d-4279-bad0-5f2876f4ab8d" containerName="init" Sep 30 07:51:03 crc kubenswrapper[4760]: I0930 07:51:03.592003 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="da7c2ace-093d-4279-bad0-5f2876f4ab8d" containerName="init" Sep 30 07:51:03 crc kubenswrapper[4760]: E0930 07:51:03.592016 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="108a4c03-5bd3-45d0-a13d-b67e01bd7654" containerName="neutron-db-sync" Sep 30 07:51:03 crc kubenswrapper[4760]: I0930 07:51:03.592022 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="108a4c03-5bd3-45d0-a13d-b67e01bd7654" containerName="neutron-db-sync" Sep 30 07:51:03 crc kubenswrapper[4760]: I0930 07:51:03.592190 4760 memory_manager.go:354] "RemoveStaleState removing state" podUID="f1e2802c-ec39-48ee-a6c4-b609a57aa0c5" containerName="watcher-api-log" Sep 30 07:51:03 crc kubenswrapper[4760]: I0930 07:51:03.592202 4760 memory_manager.go:354] "RemoveStaleState removing state" podUID="da7c2ace-093d-4279-bad0-5f2876f4ab8d" containerName="dnsmasq-dns" Sep 30 07:51:03 crc kubenswrapper[4760]: I0930 07:51:03.592215 4760 memory_manager.go:354] "RemoveStaleState removing state" podUID="108a4c03-5bd3-45d0-a13d-b67e01bd7654" containerName="neutron-db-sync" Sep 30 07:51:03 crc kubenswrapper[4760]: I0930 07:51:03.592227 4760 memory_manager.go:354] "RemoveStaleState removing state" podUID="f1e2802c-ec39-48ee-a6c4-b609a57aa0c5" containerName="watcher-api" Sep 30 07:51:03 crc kubenswrapper[4760]: I0930 07:51:03.593253 4760 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/watcher-api-0" Sep 30 07:51:03 crc kubenswrapper[4760]: I0930 07:51:03.595913 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-watcher-internal-svc" Sep 30 07:51:03 crc kubenswrapper[4760]: I0930 07:51:03.596104 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"watcher-api-config-data" Sep 30 07:51:03 crc kubenswrapper[4760]: I0930 07:51:03.596291 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-watcher-public-svc" Sep 30 07:51:03 crc kubenswrapper[4760]: I0930 07:51:03.601017 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-api-0"] Sep 30 07:51:03 crc kubenswrapper[4760]: W0930 07:51:03.668780 4760 watcher.go:93] Error while processing event ("/sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1df67641_4598_4ba5_a59a_a195084e5446.slice/crio-b441e987ef7decf049664c413470ea11bd7c6b9be8c56175022120bb2f98ef6f.scope": 0x40000100 == IN_CREATE|IN_ISDIR): inotify_add_watch /sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1df67641_4598_4ba5_a59a_a195084e5446.slice/crio-b441e987ef7decf049664c413470ea11bd7c6b9be8c56175022120bb2f98ef6f.scope: no such file or directory Sep 30 07:51:03 crc kubenswrapper[4760]: I0930 07:51:03.715712 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/79c24341-f615-4dcf-818f-e1c398e2504d-internal-tls-certs\") pod \"watcher-api-0\" (UID: \"79c24341-f615-4dcf-818f-e1c398e2504d\") " pod="openstack/watcher-api-0" Sep 30 07:51:03 crc kubenswrapper[4760]: I0930 07:51:03.715752 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/79c24341-f615-4dcf-818f-e1c398e2504d-config-data\") pod \"watcher-api-0\" (UID: 
\"79c24341-f615-4dcf-818f-e1c398e2504d\") " pod="openstack/watcher-api-0" Sep 30 07:51:03 crc kubenswrapper[4760]: I0930 07:51:03.715783 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/79c24341-f615-4dcf-818f-e1c398e2504d-public-tls-certs\") pod \"watcher-api-0\" (UID: \"79c24341-f615-4dcf-818f-e1c398e2504d\") " pod="openstack/watcher-api-0" Sep 30 07:51:03 crc kubenswrapper[4760]: I0930 07:51:03.715854 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/79c24341-f615-4dcf-818f-e1c398e2504d-logs\") pod \"watcher-api-0\" (UID: \"79c24341-f615-4dcf-818f-e1c398e2504d\") " pod="openstack/watcher-api-0" Sep 30 07:51:03 crc kubenswrapper[4760]: I0930 07:51:03.715887 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/79c24341-f615-4dcf-818f-e1c398e2504d-custom-prometheus-ca\") pod \"watcher-api-0\" (UID: \"79c24341-f615-4dcf-818f-e1c398e2504d\") " pod="openstack/watcher-api-0" Sep 30 07:51:03 crc kubenswrapper[4760]: I0930 07:51:03.715955 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/79c24341-f615-4dcf-818f-e1c398e2504d-combined-ca-bundle\") pod \"watcher-api-0\" (UID: \"79c24341-f615-4dcf-818f-e1c398e2504d\") " pod="openstack/watcher-api-0" Sep 30 07:51:03 crc kubenswrapper[4760]: I0930 07:51:03.716018 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6n7xf\" (UniqueName: \"kubernetes.io/projected/79c24341-f615-4dcf-818f-e1c398e2504d-kube-api-access-6n7xf\") pod \"watcher-api-0\" (UID: \"79c24341-f615-4dcf-818f-e1c398e2504d\") " pod="openstack/watcher-api-0" Sep 30 07:51:03 crc 
kubenswrapper[4760]: W0930 07:51:03.772098 4760 watcher.go:93] Error while processing event ("/sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc588f954_a6b5_413f_bcc8_dbfe4c660d1b.slice/crio-ef9d2be6e5aa8cb925213a3b56f37367a3e3ce662a468e00b2da2bda3a7274f7": 0x40000100 == IN_CREATE|IN_ISDIR): inotify_add_watch /sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc588f954_a6b5_413f_bcc8_dbfe4c660d1b.slice/crio-ef9d2be6e5aa8cb925213a3b56f37367a3e3ce662a468e00b2da2bda3a7274f7: no such file or directory
Sep 30 07:51:03 crc kubenswrapper[4760]: W0930 07:51:03.772140 4760 watcher.go:93] Error while processing event ("/sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8a3b79f1_3bbe_4418_9a81_4190a602dd5d.slice/crio-conmon-0c02b923b1bc6811bca16a8db1ba05951c985fbaa73046709d6d23369e97b202.scope": 0x40000100 == IN_CREATE|IN_ISDIR): inotify_add_watch /sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8a3b79f1_3bbe_4418_9a81_4190a602dd5d.slice/crio-conmon-0c02b923b1bc6811bca16a8db1ba05951c985fbaa73046709d6d23369e97b202.scope: no such file or directory
Sep 30 07:51:03 crc kubenswrapper[4760]: W0930 07:51:03.772154 4760 watcher.go:93] Error while processing event ("/sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8a3b79f1_3bbe_4418_9a81_4190a602dd5d.slice/crio-0c02b923b1bc6811bca16a8db1ba05951c985fbaa73046709d6d23369e97b202.scope": 0x40000100 == IN_CREATE|IN_ISDIR): inotify_add_watch /sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8a3b79f1_3bbe_4418_9a81_4190a602dd5d.slice/crio-0c02b923b1bc6811bca16a8db1ba05951c985fbaa73046709d6d23369e97b202.scope: no such file or directory
Sep 30 07:51:03 crc kubenswrapper[4760]: W0930 07:51:03.776180 4760 watcher.go:93] Error while processing event ("/sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc588f954_a6b5_413f_bcc8_dbfe4c660d1b.slice/crio-conmon-1b7a1cd447b1a1865709f8a7edfad76422b4f9a2d7fee9eb718cfc11a99f68ce.scope": 0x40000100 == IN_CREATE|IN_ISDIR): inotify_add_watch /sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc588f954_a6b5_413f_bcc8_dbfe4c660d1b.slice/crio-conmon-1b7a1cd447b1a1865709f8a7edfad76422b4f9a2d7fee9eb718cfc11a99f68ce.scope: no such file or directory
Sep 30 07:51:03 crc kubenswrapper[4760]: W0930 07:51:03.776223 4760 watcher.go:93] Error while processing event ("/sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc588f954_a6b5_413f_bcc8_dbfe4c660d1b.slice/crio-1b7a1cd447b1a1865709f8a7edfad76422b4f9a2d7fee9eb718cfc11a99f68ce.scope": 0x40000100 == IN_CREATE|IN_ISDIR): inotify_add_watch /sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc588f954_a6b5_413f_bcc8_dbfe4c660d1b.slice/crio-1b7a1cd447b1a1865709f8a7edfad76422b4f9a2d7fee9eb718cfc11a99f68ce.scope: no such file or directory
Sep 30 07:51:03 crc kubenswrapper[4760]: W0930 07:51:03.776238 4760 watcher.go:93] Error while processing event ("/sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8a3b79f1_3bbe_4418_9a81_4190a602dd5d.slice/crio-conmon-1e8e3e79719309f34e6554e6c8f0f765444bd702081e37346359ef0cce60c306.scope": 0x40000100 == IN_CREATE|IN_ISDIR): inotify_add_watch /sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8a3b79f1_3bbe_4418_9a81_4190a602dd5d.slice/crio-conmon-1e8e3e79719309f34e6554e6c8f0f765444bd702081e37346359ef0cce60c306.scope: no such file or directory
Sep 30 07:51:03 crc kubenswrapper[4760]: W0930 07:51:03.776253 4760 watcher.go:93] Error while processing event ("/sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8a3b79f1_3bbe_4418_9a81_4190a602dd5d.slice/crio-1e8e3e79719309f34e6554e6c8f0f765444bd702081e37346359ef0cce60c306.scope": 0x40000100 == IN_CREATE|IN_ISDIR): inotify_add_watch /sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8a3b79f1_3bbe_4418_9a81_4190a602dd5d.slice/crio-1e8e3e79719309f34e6554e6c8f0f765444bd702081e37346359ef0cce60c306.scope: no such file or directory
Sep 30 07:51:03 crc kubenswrapper[4760]: W0930 07:51:03.776268 4760 watcher.go:93] Error while processing event ("/sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc588f954_a6b5_413f_bcc8_dbfe4c660d1b.slice/crio-conmon-093cefb45788d10e4b3981293660f4ed11758f770edd6a8e4b4ad49f359f4aa9.scope": 0x40000100 == IN_CREATE|IN_ISDIR): inotify_add_watch /sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc588f954_a6b5_413f_bcc8_dbfe4c660d1b.slice/crio-conmon-093cefb45788d10e4b3981293660f4ed11758f770edd6a8e4b4ad49f359f4aa9.scope: no such file or directory
Sep 30 07:51:03 crc kubenswrapper[4760]: W0930 07:51:03.776283 4760 watcher.go:93] Error while processing event ("/sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc588f954_a6b5_413f_bcc8_dbfe4c660d1b.slice/crio-093cefb45788d10e4b3981293660f4ed11758f770edd6a8e4b4ad49f359f4aa9.scope": 0x40000100 == IN_CREATE|IN_ISDIR): inotify_add_watch /sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc588f954_a6b5_413f_bcc8_dbfe4c660d1b.slice/crio-093cefb45788d10e4b3981293660f4ed11758f770edd6a8e4b4ad49f359f4aa9.scope: no such file or directory
Sep 30 07:51:03 crc kubenswrapper[4760]: I0930 07:51:03.818822 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/79c24341-f615-4dcf-818f-e1c398e2504d-combined-ca-bundle\") pod \"watcher-api-0\" (UID: \"79c24341-f615-4dcf-818f-e1c398e2504d\") " pod="openstack/watcher-api-0"
Sep 30 07:51:03 crc kubenswrapper[4760]: I0930 07:51:03.818900 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6n7xf\" (UniqueName: \"kubernetes.io/projected/79c24341-f615-4dcf-818f-e1c398e2504d-kube-api-access-6n7xf\") pod \"watcher-api-0\" (UID: \"79c24341-f615-4dcf-818f-e1c398e2504d\") " pod="openstack/watcher-api-0"
Sep 30 07:51:03 crc kubenswrapper[4760]: I0930 07:51:03.818979 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/79c24341-f615-4dcf-818f-e1c398e2504d-internal-tls-certs\") pod \"watcher-api-0\" (UID: \"79c24341-f615-4dcf-818f-e1c398e2504d\") " pod="openstack/watcher-api-0"
Sep 30 07:51:03 crc kubenswrapper[4760]: I0930 07:51:03.818998 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/79c24341-f615-4dcf-818f-e1c398e2504d-config-data\") pod \"watcher-api-0\" (UID: \"79c24341-f615-4dcf-818f-e1c398e2504d\") " pod="openstack/watcher-api-0"
Sep 30 07:51:03 crc kubenswrapper[4760]: I0930 07:51:03.819024 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/79c24341-f615-4dcf-818f-e1c398e2504d-public-tls-certs\") pod \"watcher-api-0\" (UID: \"79c24341-f615-4dcf-818f-e1c398e2504d\") " pod="openstack/watcher-api-0"
Sep 30 07:51:03 crc kubenswrapper[4760]: I0930 07:51:03.819047 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/79c24341-f615-4dcf-818f-e1c398e2504d-logs\") pod \"watcher-api-0\" (UID: \"79c24341-f615-4dcf-818f-e1c398e2504d\") " pod="openstack/watcher-api-0"
Sep 30 07:51:03 crc kubenswrapper[4760]: I0930 07:51:03.819073 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/79c24341-f615-4dcf-818f-e1c398e2504d-custom-prometheus-ca\") pod \"watcher-api-0\" (UID: \"79c24341-f615-4dcf-818f-e1c398e2504d\") " pod="openstack/watcher-api-0"
Sep 30 07:51:03 crc kubenswrapper[4760]: I0930 07:51:03.820831 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/79c24341-f615-4dcf-818f-e1c398e2504d-logs\") pod \"watcher-api-0\" (UID: \"79c24341-f615-4dcf-818f-e1c398e2504d\") " pod="openstack/watcher-api-0"
Sep 30 07:51:03 crc kubenswrapper[4760]: I0930 07:51:03.824065 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/79c24341-f615-4dcf-818f-e1c398e2504d-custom-prometheus-ca\") pod \"watcher-api-0\" (UID: \"79c24341-f615-4dcf-818f-e1c398e2504d\") " pod="openstack/watcher-api-0"
Sep 30 07:51:03 crc kubenswrapper[4760]: I0930 07:51:03.824847 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/79c24341-f615-4dcf-818f-e1c398e2504d-internal-tls-certs\") pod \"watcher-api-0\" (UID: \"79c24341-f615-4dcf-818f-e1c398e2504d\") " pod="openstack/watcher-api-0"
Sep 30 07:51:03 crc kubenswrapper[4760]: I0930 07:51:03.825628 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/79c24341-f615-4dcf-818f-e1c398e2504d-config-data\") pod \"watcher-api-0\" (UID: \"79c24341-f615-4dcf-818f-e1c398e2504d\") " pod="openstack/watcher-api-0"
Sep 30 07:51:03 crc kubenswrapper[4760]: I0930 07:51:03.826795 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/79c24341-f615-4dcf-818f-e1c398e2504d-combined-ca-bundle\") pod \"watcher-api-0\" (UID: \"79c24341-f615-4dcf-818f-e1c398e2504d\") " pod="openstack/watcher-api-0"
Sep 30 07:51:03 crc kubenswrapper[4760]: I0930 07:51:03.828354 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/79c24341-f615-4dcf-818f-e1c398e2504d-public-tls-certs\") pod \"watcher-api-0\" (UID: \"79c24341-f615-4dcf-818f-e1c398e2504d\") " pod="openstack/watcher-api-0"
Sep 30 07:51:03 crc kubenswrapper[4760]: I0930 07:51:03.836809 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6n7xf\" (UniqueName: \"kubernetes.io/projected/79c24341-f615-4dcf-818f-e1c398e2504d-kube-api-access-6n7xf\") pod \"watcher-api-0\" (UID: \"79c24341-f615-4dcf-818f-e1c398e2504d\") " pod="openstack/watcher-api-0"
Sep 30 07:51:03 crc kubenswrapper[4760]: I0930 07:51:03.928262 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-api-0"
Sep 30 07:51:04 crc kubenswrapper[4760]: E0930 07:51:04.038966 4760 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod108a4c03_5bd3_45d0_a13d_b67e01bd7654.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod108a4c03_5bd3_45d0_a13d_b67e01bd7654.slice/crio-conmon-34df7a6fed2b13c4f77c2412b04d3044f9f81e01a69fa17b070335180992cc7c.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0d907232_c03b_44a4_a0b5_36ce5bf6d62b.slice/crio-31ed2dc075a4352931d059eb24b16e55e055f8e5fe67745ba3017d7bc673416e.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf1e2802c_ec39_48ee_a6c4_b609a57aa0c5.slice/crio-56a489224c6b0fd5341421022afd3b3837a95b15aae7004d350de464f74d61a2\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb7668a34_2cd6_4873_a55a_27839a612a2b.slice/crio-7da8cf7bc3a971e417b13433c8641faa22c49b2b1b7fd3507892848e0fb76724.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0d907232_c03b_44a4_a0b5_36ce5bf6d62b.slice/crio-conmon-2e8e9f85e632b06032bb32e36e15e521539ad1bfeea1fcaa3c7fb720729d78cf.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf1e2802c_ec39_48ee_a6c4_b609a57aa0c5.slice/crio-8bb4ae2b4cdc4f4f073c4d86092669e02429c7f687807aac45ca0ee0b9dbe8d5.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod108a4c03_5bd3_45d0_a13d_b67e01bd7654.slice/crio-34df7a6fed2b13c4f77c2412b04d3044f9f81e01a69fa17b070335180992cc7c.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf8f42596_0ba9_41d9_a780_b8b3705b963c.slice/crio-conmon-0bdb08b78789e281874d3f19555926dccd223d02905679bb3a399172af61108a.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1df67641_4598_4ba5_a59a_a195084e5446.slice/crio-conmon-b441e987ef7decf049664c413470ea11bd7c6b9be8c56175022120bb2f98ef6f.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podda7c2ace_093d_4279_bad0_5f2876f4ab8d.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf1e2802c_ec39_48ee_a6c4_b609a57aa0c5.slice/crio-conmon-8bb4ae2b4cdc4f4f073c4d86092669e02429c7f687807aac45ca0ee0b9dbe8d5.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf1e2802c_ec39_48ee_a6c4_b609a57aa0c5.slice/crio-conmon-f0aeb72fd3b5bc55a798002692855ffad1e3d949c0cd84e5255f5d2652bc2d77.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podda7c2ace_093d_4279_bad0_5f2876f4ab8d.slice/crio-bcf47ca8284d148fa2358a3feb62e90bc3ea5d9a074723dd52bfb40d5a484182.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb7668a34_2cd6_4873_a55a_27839a612a2b.slice/crio-conmon-7da8cf7bc3a971e417b13433c8641faa22c49b2b1b7fd3507892848e0fb76724.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podda7c2ace_093d_4279_bad0_5f2876f4ab8d.slice/crio-conmon-bcf47ca8284d148fa2358a3feb62e90bc3ea5d9a074723dd52bfb40d5a484182.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf1e2802c_ec39_48ee_a6c4_b609a57aa0c5.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod108a4c03_5bd3_45d0_a13d_b67e01bd7654.slice/crio-cf259968486ff404265fc3045cf68173ccb6c56c32ee9888c1ff8a40ae8a8525\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb7668a34_2cd6_4873_a55a_27839a612a2b.slice/crio-conmon-c9f51e05fcba75797df060a5a1ac980cc089f7cde31e72b0efb1fc5117351629.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf8f42596_0ba9_41d9_a780_b8b3705b963c.slice/crio-0bdb08b78789e281874d3f19555926dccd223d02905679bb3a399172af61108a.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb7668a34_2cd6_4873_a55a_27839a612a2b.slice/crio-c9f51e05fcba75797df060a5a1ac980cc089f7cde31e72b0efb1fc5117351629.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0d907232_c03b_44a4_a0b5_36ce5bf6d62b.slice/crio-conmon-31ed2dc075a4352931d059eb24b16e55e055f8e5fe67745ba3017d7bc673416e.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0d907232_c03b_44a4_a0b5_36ce5bf6d62b.slice/crio-2e8e9f85e632b06032bb32e36e15e521539ad1bfeea1fcaa3c7fb720729d78cf.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1df67641_4598_4ba5_a59a_a195084e5446.slice/crio-7b95bf520c48586f4386da3ad833eb819f74f25e4d61c4e176a5cc33f365f080\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1df67641_4598_4ba5_a59a_a195084e5446.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8a3b79f1_3bbe_4418_9a81_4190a602dd5d.slice/crio-ed00058a5fde843bca2ea5750acc9a0c793259cd6244a13d4bf73c540bddf156\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8a3b79f1_3bbe_4418_9a81_4190a602dd5d.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf1e2802c_ec39_48ee_a6c4_b609a57aa0c5.slice/crio-f0aeb72fd3b5bc55a798002692855ffad1e3d949c0cd84e5255f5d2652bc2d77.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podda7c2ace_093d_4279_bad0_5f2876f4ab8d.slice/crio-23a9ea76aa5283912ad3c948cfd07d2ef8385b05a745b19385aff4ea80cbdb5f\": RecentStats: unable to find data in memory cache]"
Sep 30 07:51:04 crc kubenswrapper[4760]: I0930 07:51:04.191530 4760 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6d66f584d7-fvhgj"]
Sep 30 07:51:04 crc kubenswrapper[4760]: I0930 07:51:04.237447 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-688c87cc99-2v8mj"]
Sep 30 07:51:04 crc kubenswrapper[4760]: I0930 07:51:04.250410 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-688c87cc99-2v8mj"
Sep 30 07:51:04 crc kubenswrapper[4760]: I0930 07:51:04.269892 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-688c87cc99-2v8mj"]
Sep 30 07:51:04 crc kubenswrapper[4760]: I0930 07:51:04.298502 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-79bccb96b8-8wjx5"]
Sep 30 07:51:04 crc kubenswrapper[4760]: I0930 07:51:04.300610 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-79bccb96b8-8wjx5"
Sep 30 07:51:04 crc kubenswrapper[4760]: I0930 07:51:04.305023 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-ovndbs"
Sep 30 07:51:04 crc kubenswrapper[4760]: I0930 07:51:04.305479 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config"
Sep 30 07:51:04 crc kubenswrapper[4760]: I0930 07:51:04.305613 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-mscbd"
Sep 30 07:51:04 crc kubenswrapper[4760]: I0930 07:51:04.306213 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config"
Sep 30 07:51:04 crc kubenswrapper[4760]: I0930 07:51:04.334623 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-79bccb96b8-8wjx5"]
Sep 30 07:51:04 crc kubenswrapper[4760]: I0930 07:51:04.351621 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c3f46a87-7a28-45d7-ad4d-98b0ce508557-config\") pod \"dnsmasq-dns-688c87cc99-2v8mj\" (UID: \"c3f46a87-7a28-45d7-ad4d-98b0ce508557\") " pod="openstack/dnsmasq-dns-688c87cc99-2v8mj"
Sep 30 07:51:04 crc kubenswrapper[4760]: I0930 07:51:04.351806 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c3f46a87-7a28-45d7-ad4d-98b0ce508557-ovsdbserver-nb\") pod \"dnsmasq-dns-688c87cc99-2v8mj\" (UID: \"c3f46a87-7a28-45d7-ad4d-98b0ce508557\") " pod="openstack/dnsmasq-dns-688c87cc99-2v8mj"
Sep 30 07:51:04 crc kubenswrapper[4760]: I0930 07:51:04.352139 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lz8sl\" (UniqueName: \"kubernetes.io/projected/c3f46a87-7a28-45d7-ad4d-98b0ce508557-kube-api-access-lz8sl\") pod \"dnsmasq-dns-688c87cc99-2v8mj\" (UID: \"c3f46a87-7a28-45d7-ad4d-98b0ce508557\") " pod="openstack/dnsmasq-dns-688c87cc99-2v8mj"
Sep 30 07:51:04 crc kubenswrapper[4760]: I0930 07:51:04.352276 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c3f46a87-7a28-45d7-ad4d-98b0ce508557-ovsdbserver-sb\") pod \"dnsmasq-dns-688c87cc99-2v8mj\" (UID: \"c3f46a87-7a28-45d7-ad4d-98b0ce508557\") " pod="openstack/dnsmasq-dns-688c87cc99-2v8mj"
Sep 30 07:51:04 crc kubenswrapper[4760]: I0930 07:51:04.352338 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/c3f46a87-7a28-45d7-ad4d-98b0ce508557-dns-swift-storage-0\") pod \"dnsmasq-dns-688c87cc99-2v8mj\" (UID: \"c3f46a87-7a28-45d7-ad4d-98b0ce508557\") " pod="openstack/dnsmasq-dns-688c87cc99-2v8mj"
Sep 30 07:51:04 crc kubenswrapper[4760]: I0930 07:51:04.352357 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c3f46a87-7a28-45d7-ad4d-98b0ce508557-dns-svc\") pod \"dnsmasq-dns-688c87cc99-2v8mj\" (UID: \"c3f46a87-7a28-45d7-ad4d-98b0ce508557\") " pod="openstack/dnsmasq-dns-688c87cc99-2v8mj"
Sep 30 07:51:04 crc kubenswrapper[4760]: I0930 07:51:04.371895 4760 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/watcher-api-0" podUID="f1e2802c-ec39-48ee-a6c4-b609a57aa0c5" containerName="watcher-api" probeResult="failure" output="Get \"http://10.217.0.154:9322/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Sep 30 07:51:04 crc kubenswrapper[4760]: I0930 07:51:04.371892 4760 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/watcher-api-0" podUID="f1e2802c-ec39-48ee-a6c4-b609a57aa0c5" containerName="watcher-api-log" probeResult="failure" output="Get \"http://10.217.0.154:9322/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Sep 30 07:51:04 crc kubenswrapper[4760]: I0930 07:51:04.454196 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f3ec89e4-64de-42eb-959a-c064d716f0f3-combined-ca-bundle\") pod \"neutron-79bccb96b8-8wjx5\" (UID: \"f3ec89e4-64de-42eb-959a-c064d716f0f3\") " pod="openstack/neutron-79bccb96b8-8wjx5"
Sep 30 07:51:04 crc kubenswrapper[4760]: I0930 07:51:04.454280 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c3f46a87-7a28-45d7-ad4d-98b0ce508557-ovsdbserver-sb\") pod \"dnsmasq-dns-688c87cc99-2v8mj\" (UID: \"c3f46a87-7a28-45d7-ad4d-98b0ce508557\") " pod="openstack/dnsmasq-dns-688c87cc99-2v8mj"
Sep 30 07:51:04 crc kubenswrapper[4760]: I0930 07:51:04.454324 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/f3ec89e4-64de-42eb-959a-c064d716f0f3-httpd-config\") pod \"neutron-79bccb96b8-8wjx5\" (UID: \"f3ec89e4-64de-42eb-959a-c064d716f0f3\") " pod="openstack/neutron-79bccb96b8-8wjx5"
Sep 30 07:51:04 crc kubenswrapper[4760]: I0930 07:51:04.454389 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/c3f46a87-7a28-45d7-ad4d-98b0ce508557-dns-swift-storage-0\") pod \"dnsmasq-dns-688c87cc99-2v8mj\" (UID: \"c3f46a87-7a28-45d7-ad4d-98b0ce508557\") " pod="openstack/dnsmasq-dns-688c87cc99-2v8mj"
Sep 30 07:51:04 crc kubenswrapper[4760]: I0930 07:51:04.454418 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c3f46a87-7a28-45d7-ad4d-98b0ce508557-dns-svc\") pod \"dnsmasq-dns-688c87cc99-2v8mj\" (UID: \"c3f46a87-7a28-45d7-ad4d-98b0ce508557\") " pod="openstack/dnsmasq-dns-688c87cc99-2v8mj"
Sep 30 07:51:04 crc kubenswrapper[4760]: I0930 07:51:04.454467 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c3f46a87-7a28-45d7-ad4d-98b0ce508557-config\") pod \"dnsmasq-dns-688c87cc99-2v8mj\" (UID: \"c3f46a87-7a28-45d7-ad4d-98b0ce508557\") " pod="openstack/dnsmasq-dns-688c87cc99-2v8mj"
Sep 30 07:51:04 crc kubenswrapper[4760]: I0930 07:51:04.454488 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/f3ec89e4-64de-42eb-959a-c064d716f0f3-ovndb-tls-certs\") pod \"neutron-79bccb96b8-8wjx5\" (UID: \"f3ec89e4-64de-42eb-959a-c064d716f0f3\") " pod="openstack/neutron-79bccb96b8-8wjx5"
Sep 30 07:51:04 crc kubenswrapper[4760]: I0930 07:51:04.454561 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c3f46a87-7a28-45d7-ad4d-98b0ce508557-ovsdbserver-nb\") pod \"dnsmasq-dns-688c87cc99-2v8mj\" (UID: \"c3f46a87-7a28-45d7-ad4d-98b0ce508557\") " pod="openstack/dnsmasq-dns-688c87cc99-2v8mj"
Sep 30 07:51:04 crc kubenswrapper[4760]: I0930 07:51:04.454602 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/f3ec89e4-64de-42eb-959a-c064d716f0f3-config\") pod \"neutron-79bccb96b8-8wjx5\" (UID: \"f3ec89e4-64de-42eb-959a-c064d716f0f3\") " pod="openstack/neutron-79bccb96b8-8wjx5"
Sep 30 07:51:04 crc kubenswrapper[4760]: I0930 07:51:04.454655 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-crjt5\" (UniqueName: \"kubernetes.io/projected/f3ec89e4-64de-42eb-959a-c064d716f0f3-kube-api-access-crjt5\") pod \"neutron-79bccb96b8-8wjx5\" (UID: \"f3ec89e4-64de-42eb-959a-c064d716f0f3\") " pod="openstack/neutron-79bccb96b8-8wjx5"
Sep 30 07:51:04 crc kubenswrapper[4760]: I0930 07:51:04.454756 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lz8sl\" (UniqueName: \"kubernetes.io/projected/c3f46a87-7a28-45d7-ad4d-98b0ce508557-kube-api-access-lz8sl\") pod \"dnsmasq-dns-688c87cc99-2v8mj\" (UID: \"c3f46a87-7a28-45d7-ad4d-98b0ce508557\") " pod="openstack/dnsmasq-dns-688c87cc99-2v8mj"
Sep 30 07:51:04 crc kubenswrapper[4760]: I0930 07:51:04.456150 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c3f46a87-7a28-45d7-ad4d-98b0ce508557-ovsdbserver-sb\") pod \"dnsmasq-dns-688c87cc99-2v8mj\" (UID: \"c3f46a87-7a28-45d7-ad4d-98b0ce508557\") " pod="openstack/dnsmasq-dns-688c87cc99-2v8mj"
Sep 30 07:51:04 crc kubenswrapper[4760]: I0930 07:51:04.456838 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/c3f46a87-7a28-45d7-ad4d-98b0ce508557-dns-swift-storage-0\") pod \"dnsmasq-dns-688c87cc99-2v8mj\" (UID: \"c3f46a87-7a28-45d7-ad4d-98b0ce508557\") " pod="openstack/dnsmasq-dns-688c87cc99-2v8mj"
Sep 30 07:51:04 crc kubenswrapper[4760]: I0930 07:51:04.457027 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c3f46a87-7a28-45d7-ad4d-98b0ce508557-config\") pod \"dnsmasq-dns-688c87cc99-2v8mj\" (UID: \"c3f46a87-7a28-45d7-ad4d-98b0ce508557\") " pod="openstack/dnsmasq-dns-688c87cc99-2v8mj"
Sep 30 07:51:04 crc kubenswrapper[4760]: I0930 07:51:04.457039 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c3f46a87-7a28-45d7-ad4d-98b0ce508557-dns-svc\") pod \"dnsmasq-dns-688c87cc99-2v8mj\" (UID: \"c3f46a87-7a28-45d7-ad4d-98b0ce508557\") " pod="openstack/dnsmasq-dns-688c87cc99-2v8mj"
Sep 30 07:51:04 crc kubenswrapper[4760]: I0930 07:51:04.457190 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c3f46a87-7a28-45d7-ad4d-98b0ce508557-ovsdbserver-nb\") pod \"dnsmasq-dns-688c87cc99-2v8mj\" (UID: \"c3f46a87-7a28-45d7-ad4d-98b0ce508557\") " pod="openstack/dnsmasq-dns-688c87cc99-2v8mj"
Sep 30 07:51:04 crc kubenswrapper[4760]: I0930 07:51:04.488986 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lz8sl\" (UniqueName: \"kubernetes.io/projected/c3f46a87-7a28-45d7-ad4d-98b0ce508557-kube-api-access-lz8sl\") pod \"dnsmasq-dns-688c87cc99-2v8mj\" (UID: \"c3f46a87-7a28-45d7-ad4d-98b0ce508557\") " pod="openstack/dnsmasq-dns-688c87cc99-2v8mj"
Sep 30 07:51:04 crc kubenswrapper[4760]: I0930 07:51:04.503913 4760 scope.go:117] "RemoveContainer" containerID="bcf47ca8284d148fa2358a3feb62e90bc3ea5d9a074723dd52bfb40d5a484182"
Sep 30 07:51:04 crc kubenswrapper[4760]: E0930 07:51:04.531372 4760 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified"
Sep 30 07:51:04 crc kubenswrapper[4760]: E0930 07:51:04.531581 4760 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:cinder-db-sync,Image:quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_set_configs && /usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:etc-machine-id,ReadOnly:true,MountPath:/etc/machine-id,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:scripts,ReadOnly:true,MountPath:/usr/local/bin/container-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/config-data/merged,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/my.cnf,SubPath:my.cnf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/cinder/cinder.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:db-sync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-5tz96,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:nil,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod cinder-db-sync-9ml2s_openstack(4a2c7f25-ec97-4cb0-8475-b7b8d4be47d0): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError"
Sep 30 07:51:04 crc kubenswrapper[4760]: E0930 07:51:04.532718 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/cinder-db-sync-9ml2s" podUID="4a2c7f25-ec97-4cb0-8475-b7b8d4be47d0"
Sep 30 07:51:04 crc kubenswrapper[4760]: I0930 07:51:04.538726 4760 generic.go:334] "Generic (PLEG): container finished" podID="b7668a34-2cd6-4873-a55a-27839a612a2b" containerID="c9f51e05fcba75797df060a5a1ac980cc089f7cde31e72b0efb1fc5117351629" exitCode=137
Sep 30 07:51:04 crc kubenswrapper[4760]: I0930 07:51:04.538766 4760 generic.go:334] "Generic (PLEG): container finished" podID="b7668a34-2cd6-4873-a55a-27839a612a2b" containerID="7da8cf7bc3a971e417b13433c8641faa22c49b2b1b7fd3507892848e0fb76724" exitCode=137
Sep 30 07:51:04 crc kubenswrapper[4760]: I0930 07:51:04.538845 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-6d547648b9-kxd9p" event={"ID":"b7668a34-2cd6-4873-a55a-27839a612a2b","Type":"ContainerDied","Data":"c9f51e05fcba75797df060a5a1ac980cc089f7cde31e72b0efb1fc5117351629"}
Sep 30 07:51:04 crc kubenswrapper[4760]: I0930 07:51:04.538874 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-6d547648b9-kxd9p" event={"ID":"b7668a34-2cd6-4873-a55a-27839a612a2b","Type":"ContainerDied","Data":"7da8cf7bc3a971e417b13433c8641faa22c49b2b1b7fd3507892848e0fb76724"}
Sep 30 07:51:04 crc kubenswrapper[4760]: I0930 07:51:04.541652 4760 generic.go:334] "Generic (PLEG): container finished" podID="0d907232-c03b-44a4-a0b5-36ce5bf6d62b" containerID="31ed2dc075a4352931d059eb24b16e55e055f8e5fe67745ba3017d7bc673416e" exitCode=137
Sep 30 07:51:04 crc kubenswrapper[4760]: I0930 07:51:04.541764 4760 generic.go:334] "Generic (PLEG): container finished" podID="0d907232-c03b-44a4-a0b5-36ce5bf6d62b" containerID="2e8e9f85e632b06032bb32e36e15e521539ad1bfeea1fcaa3c7fb720729d78cf" exitCode=137
Sep 30 07:51:04 crc kubenswrapper[4760]: I0930 07:51:04.541876 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-5885b885f5-7v6mw" event={"ID":"0d907232-c03b-44a4-a0b5-36ce5bf6d62b","Type":"ContainerDied","Data":"31ed2dc075a4352931d059eb24b16e55e055f8e5fe67745ba3017d7bc673416e"}
Sep 30 07:51:04 crc kubenswrapper[4760]: I0930 07:51:04.541954 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-5885b885f5-7v6mw" event={"ID":"0d907232-c03b-44a4-a0b5-36ce5bf6d62b","Type":"ContainerDied","Data":"2e8e9f85e632b06032bb32e36e15e521539ad1bfeea1fcaa3c7fb720729d78cf"}
Sep 30 07:51:04 crc kubenswrapper[4760]: I0930 07:51:04.543559 4760 generic.go:334] "Generic (PLEG): container finished" podID="f8f42596-0ba9-41d9-a780-b8b3705b963c" containerID="0bdb08b78789e281874d3f19555926dccd223d02905679bb3a399172af61108a" exitCode=137
Sep 30 07:51:04 crc kubenswrapper[4760]: I0930 07:51:04.543662 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-746f884dcc-8lkjw" event={"ID":"f8f42596-0ba9-41d9-a780-b8b3705b963c","Type":"ContainerDied","Data":"0bdb08b78789e281874d3f19555926dccd223d02905679bb3a399172af61108a"}
Sep 30 07:51:04 crc kubenswrapper[4760]: I0930 07:51:04.556914 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/f3ec89e4-64de-42eb-959a-c064d716f0f3-config\") pod \"neutron-79bccb96b8-8wjx5\" (UID: \"f3ec89e4-64de-42eb-959a-c064d716f0f3\") " pod="openstack/neutron-79bccb96b8-8wjx5"
Sep 30 07:51:04 crc kubenswrapper[4760]: I0930 07:51:04.557679 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-crjt5\" (UniqueName: \"kubernetes.io/projected/f3ec89e4-64de-42eb-959a-c064d716f0f3-kube-api-access-crjt5\") pod \"neutron-79bccb96b8-8wjx5\" (UID: \"f3ec89e4-64de-42eb-959a-c064d716f0f3\") " pod="openstack/neutron-79bccb96b8-8wjx5"
Sep 30 07:51:04 crc kubenswrapper[4760]: I0930 07:51:04.557874 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f3ec89e4-64de-42eb-959a-c064d716f0f3-combined-ca-bundle\") pod \"neutron-79bccb96b8-8wjx5\" (UID: \"f3ec89e4-64de-42eb-959a-c064d716f0f3\") " pod="openstack/neutron-79bccb96b8-8wjx5"
Sep 30 07:51:04 crc kubenswrapper[4760]: I0930 07:51:04.558011 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/f3ec89e4-64de-42eb-959a-c064d716f0f3-httpd-config\") pod \"neutron-79bccb96b8-8wjx5\" (UID: \"f3ec89e4-64de-42eb-959a-c064d716f0f3\") " pod="openstack/neutron-79bccb96b8-8wjx5"
Sep 30 07:51:04 crc kubenswrapper[4760]: I0930 07:51:04.558141 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/f3ec89e4-64de-42eb-959a-c064d716f0f3-ovndb-tls-certs\") pod \"neutron-79bccb96b8-8wjx5\" (UID: \"f3ec89e4-64de-42eb-959a-c064d716f0f3\") " pod="openstack/neutron-79bccb96b8-8wjx5"
Sep 30 07:51:04 crc kubenswrapper[4760]: I0930 07:51:04.561644 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/f3ec89e4-64de-42eb-959a-c064d716f0f3-config\") pod \"neutron-79bccb96b8-8wjx5\" (UID: \"f3ec89e4-64de-42eb-959a-c064d716f0f3\") " pod="openstack/neutron-79bccb96b8-8wjx5"
Sep 30 07:51:04 crc kubenswrapper[4760]: I0930 07:51:04.564824 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/f3ec89e4-64de-42eb-959a-c064d716f0f3-ovndb-tls-certs\") pod \"neutron-79bccb96b8-8wjx5\" (UID: \"f3ec89e4-64de-42eb-959a-c064d716f0f3\") " pod="openstack/neutron-79bccb96b8-8wjx5"
Sep 30 07:51:04 crc kubenswrapper[4760]: I0930 07:51:04.565982 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/f3ec89e4-64de-42eb-959a-c064d716f0f3-httpd-config\") pod \"neutron-79bccb96b8-8wjx5\" (UID: \"f3ec89e4-64de-42eb-959a-c064d716f0f3\") " pod="openstack/neutron-79bccb96b8-8wjx5"
Sep 30 07:51:04 crc kubenswrapper[4760]: I0930 07:51:04.566771 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f3ec89e4-64de-42eb-959a-c064d716f0f3-combined-ca-bundle\") pod \"neutron-79bccb96b8-8wjx5\" (UID: \"f3ec89e4-64de-42eb-959a-c064d716f0f3\") " pod="openstack/neutron-79bccb96b8-8wjx5"
Sep 30 07:51:04 crc kubenswrapper[4760]: I0930 07:51:04.576433 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-crjt5\" (UniqueName: \"kubernetes.io/projected/f3ec89e4-64de-42eb-959a-c064d716f0f3-kube-api-access-crjt5\") pod \"neutron-79bccb96b8-8wjx5\" (UID: \"f3ec89e4-64de-42eb-959a-c064d716f0f3\") " pod="openstack/neutron-79bccb96b8-8wjx5"
Sep 30 07:51:04 crc kubenswrapper[4760]: I0930 07:51:04.610811 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-688c87cc99-2v8mj"
Sep 30 07:51:04 crc kubenswrapper[4760]: I0930 07:51:04.625382 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-79bccb96b8-8wjx5"
Sep 30 07:51:04 crc kubenswrapper[4760]: I0930 07:51:04.796521 4760 scope.go:117] "RemoveContainer" containerID="a5ff1c069a10d6c66f905281b77d71e51dbec9fdaea6b7f99e4c9703dc9cfce2"
Sep 30 07:51:05 crc kubenswrapper[4760]: I0930 07:51:05.049838 4760 scope.go:117] "RemoveContainer" containerID="f0aeb72fd3b5bc55a798002692855ffad1e3d949c0cd84e5255f5d2652bc2d77"
Sep 30 07:51:05 crc kubenswrapper[4760]: I0930 07:51:05.091591 4760 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="da7c2ace-093d-4279-bad0-5f2876f4ab8d" path="/var/lib/kubelet/pods/da7c2ace-093d-4279-bad0-5f2876f4ab8d/volumes"
Sep 30 07:51:05 crc kubenswrapper[4760]: I0930 07:51:05.094079 4760 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f1e2802c-ec39-48ee-a6c4-b609a57aa0c5" path="/var/lib/kubelet/pods/f1e2802c-ec39-48ee-a6c4-b609a57aa0c5/volumes"
Sep 30 07:51:05 crc kubenswrapper[4760]: I0930 07:51:05.115111 4760 scope.go:117] "RemoveContainer" containerID="8bb4ae2b4cdc4f4f073c4d86092669e02429c7f687807aac45ca0ee0b9dbe8d5"
Sep 30 07:51:05 crc kubenswrapper[4760]: I0930 07:51:05.235085 4760 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-746f884dcc-8lkjw" Sep 30 07:51:05 crc kubenswrapper[4760]: I0930 07:51:05.385984 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/f8f42596-0ba9-41d9-a780-b8b3705b963c-horizon-secret-key\") pod \"f8f42596-0ba9-41d9-a780-b8b3705b963c\" (UID: \"f8f42596-0ba9-41d9-a780-b8b3705b963c\") " Sep 30 07:51:05 crc kubenswrapper[4760]: I0930 07:51:05.386088 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f8f42596-0ba9-41d9-a780-b8b3705b963c-logs\") pod \"f8f42596-0ba9-41d9-a780-b8b3705b963c\" (UID: \"f8f42596-0ba9-41d9-a780-b8b3705b963c\") " Sep 30 07:51:05 crc kubenswrapper[4760]: I0930 07:51:05.386117 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f8f42596-0ba9-41d9-a780-b8b3705b963c-scripts\") pod \"f8f42596-0ba9-41d9-a780-b8b3705b963c\" (UID: \"f8f42596-0ba9-41d9-a780-b8b3705b963c\") " Sep 30 07:51:05 crc kubenswrapper[4760]: I0930 07:51:05.386185 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/f8f42596-0ba9-41d9-a780-b8b3705b963c-config-data\") pod \"f8f42596-0ba9-41d9-a780-b8b3705b963c\" (UID: \"f8f42596-0ba9-41d9-a780-b8b3705b963c\") " Sep 30 07:51:05 crc kubenswrapper[4760]: I0930 07:51:05.386253 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9g8dc\" (UniqueName: \"kubernetes.io/projected/f8f42596-0ba9-41d9-a780-b8b3705b963c-kube-api-access-9g8dc\") pod \"f8f42596-0ba9-41d9-a780-b8b3705b963c\" (UID: \"f8f42596-0ba9-41d9-a780-b8b3705b963c\") " Sep 30 07:51:05 crc kubenswrapper[4760]: I0930 07:51:05.386696 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/f8f42596-0ba9-41d9-a780-b8b3705b963c-logs" (OuterVolumeSpecName: "logs") pod "f8f42596-0ba9-41d9-a780-b8b3705b963c" (UID: "f8f42596-0ba9-41d9-a780-b8b3705b963c"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 07:51:05 crc kubenswrapper[4760]: I0930 07:51:05.408011 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f8f42596-0ba9-41d9-a780-b8b3705b963c-kube-api-access-9g8dc" (OuterVolumeSpecName: "kube-api-access-9g8dc") pod "f8f42596-0ba9-41d9-a780-b8b3705b963c" (UID: "f8f42596-0ba9-41d9-a780-b8b3705b963c"). InnerVolumeSpecName "kube-api-access-9g8dc". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 07:51:05 crc kubenswrapper[4760]: I0930 07:51:05.428701 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f8f42596-0ba9-41d9-a780-b8b3705b963c-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "f8f42596-0ba9-41d9-a780-b8b3705b963c" (UID: "f8f42596-0ba9-41d9-a780-b8b3705b963c"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 07:51:05 crc kubenswrapper[4760]: I0930 07:51:05.467173 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f8f42596-0ba9-41d9-a780-b8b3705b963c-config-data" (OuterVolumeSpecName: "config-data") pod "f8f42596-0ba9-41d9-a780-b8b3705b963c" (UID: "f8f42596-0ba9-41d9-a780-b8b3705b963c"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 07:51:05 crc kubenswrapper[4760]: I0930 07:51:05.467929 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f8f42596-0ba9-41d9-a780-b8b3705b963c-scripts" (OuterVolumeSpecName: "scripts") pod "f8f42596-0ba9-41d9-a780-b8b3705b963c" (UID: "f8f42596-0ba9-41d9-a780-b8b3705b963c"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 07:51:05 crc kubenswrapper[4760]: I0930 07:51:05.488656 4760 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/f8f42596-0ba9-41d9-a780-b8b3705b963c-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Sep 30 07:51:05 crc kubenswrapper[4760]: I0930 07:51:05.488696 4760 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f8f42596-0ba9-41d9-a780-b8b3705b963c-logs\") on node \"crc\" DevicePath \"\"" Sep 30 07:51:05 crc kubenswrapper[4760]: I0930 07:51:05.488709 4760 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f8f42596-0ba9-41d9-a780-b8b3705b963c-scripts\") on node \"crc\" DevicePath \"\"" Sep 30 07:51:05 crc kubenswrapper[4760]: I0930 07:51:05.488719 4760 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/f8f42596-0ba9-41d9-a780-b8b3705b963c-config-data\") on node \"crc\" DevicePath \"\"" Sep 30 07:51:05 crc kubenswrapper[4760]: I0930 07:51:05.488731 4760 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9g8dc\" (UniqueName: \"kubernetes.io/projected/f8f42596-0ba9-41d9-a780-b8b3705b963c-kube-api-access-9g8dc\") on node \"crc\" DevicePath \"\"" Sep 30 07:51:05 crc kubenswrapper[4760]: I0930 07:51:05.538430 4760 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-5885b885f5-7v6mw" Sep 30 07:51:05 crc kubenswrapper[4760]: I0930 07:51:05.584153 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-5885b885f5-7v6mw" event={"ID":"0d907232-c03b-44a4-a0b5-36ce5bf6d62b","Type":"ContainerDied","Data":"d582e6dc3553c1c0170ca92c054ff0aeafdeb1228139dbc5ada4853684bfec0b"} Sep 30 07:51:05 crc kubenswrapper[4760]: I0930 07:51:05.584200 4760 scope.go:117] "RemoveContainer" containerID="31ed2dc075a4352931d059eb24b16e55e055f8e5fe67745ba3017d7bc673416e" Sep 30 07:51:05 crc kubenswrapper[4760]: I0930 07:51:05.584366 4760 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-5885b885f5-7v6mw" Sep 30 07:51:05 crc kubenswrapper[4760]: I0930 07:51:05.608784 4760 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-6d547648b9-kxd9p" Sep 30 07:51:05 crc kubenswrapper[4760]: I0930 07:51:05.608901 4760 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-746f884dcc-8lkjw" Sep 30 07:51:05 crc kubenswrapper[4760]: I0930 07:51:05.608777 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-746f884dcc-8lkjw" event={"ID":"f8f42596-0ba9-41d9-a780-b8b3705b963c","Type":"ContainerDied","Data":"75950071b231471cce5a7982d702a8b964c8a295529f8611a0263a0f53e2304e"} Sep 30 07:51:05 crc kubenswrapper[4760]: I0930 07:51:05.648225 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7a555f66-7027-4fac-afcc-db7b3f5ae034","Type":"ContainerStarted","Data":"9033f067ba6698c37fd74b64f06a530dad8c5baae6004962025e2030248511d8"} Sep 30 07:51:05 crc kubenswrapper[4760]: I0930 07:51:05.692976 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xsh25\" (UniqueName: \"kubernetes.io/projected/b7668a34-2cd6-4873-a55a-27839a612a2b-kube-api-access-xsh25\") pod \"b7668a34-2cd6-4873-a55a-27839a612a2b\" (UID: \"b7668a34-2cd6-4873-a55a-27839a612a2b\") " Sep 30 07:51:05 crc kubenswrapper[4760]: I0930 07:51:05.693255 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/0d907232-c03b-44a4-a0b5-36ce5bf6d62b-scripts\") pod \"0d907232-c03b-44a4-a0b5-36ce5bf6d62b\" (UID: \"0d907232-c03b-44a4-a0b5-36ce5bf6d62b\") " Sep 30 07:51:05 crc kubenswrapper[4760]: I0930 07:51:05.693335 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/b7668a34-2cd6-4873-a55a-27839a612a2b-horizon-secret-key\") pod \"b7668a34-2cd6-4873-a55a-27839a612a2b\" (UID: \"b7668a34-2cd6-4873-a55a-27839a612a2b\") " Sep 30 07:51:05 crc kubenswrapper[4760]: I0930 07:51:05.693441 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: 
\"kubernetes.io/secret/0d907232-c03b-44a4-a0b5-36ce5bf6d62b-horizon-secret-key\") pod \"0d907232-c03b-44a4-a0b5-36ce5bf6d62b\" (UID: \"0d907232-c03b-44a4-a0b5-36ce5bf6d62b\") " Sep 30 07:51:05 crc kubenswrapper[4760]: I0930 07:51:05.693516 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b7668a34-2cd6-4873-a55a-27839a612a2b-scripts\") pod \"b7668a34-2cd6-4873-a55a-27839a612a2b\" (UID: \"b7668a34-2cd6-4873-a55a-27839a612a2b\") " Sep 30 07:51:05 crc kubenswrapper[4760]: I0930 07:51:05.693553 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m75zc\" (UniqueName: \"kubernetes.io/projected/0d907232-c03b-44a4-a0b5-36ce5bf6d62b-kube-api-access-m75zc\") pod \"0d907232-c03b-44a4-a0b5-36ce5bf6d62b\" (UID: \"0d907232-c03b-44a4-a0b5-36ce5bf6d62b\") " Sep 30 07:51:05 crc kubenswrapper[4760]: I0930 07:51:05.693592 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/0d907232-c03b-44a4-a0b5-36ce5bf6d62b-config-data\") pod \"0d907232-c03b-44a4-a0b5-36ce5bf6d62b\" (UID: \"0d907232-c03b-44a4-a0b5-36ce5bf6d62b\") " Sep 30 07:51:05 crc kubenswrapper[4760]: I0930 07:51:05.693627 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b7668a34-2cd6-4873-a55a-27839a612a2b-logs\") pod \"b7668a34-2cd6-4873-a55a-27839a612a2b\" (UID: \"b7668a34-2cd6-4873-a55a-27839a612a2b\") " Sep 30 07:51:05 crc kubenswrapper[4760]: I0930 07:51:05.693702 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0d907232-c03b-44a4-a0b5-36ce5bf6d62b-logs\") pod \"0d907232-c03b-44a4-a0b5-36ce5bf6d62b\" (UID: \"0d907232-c03b-44a4-a0b5-36ce5bf6d62b\") " Sep 30 07:51:05 crc kubenswrapper[4760]: I0930 07:51:05.693758 4760 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/b7668a34-2cd6-4873-a55a-27839a612a2b-config-data\") pod \"b7668a34-2cd6-4873-a55a-27839a612a2b\" (UID: \"b7668a34-2cd6-4873-a55a-27839a612a2b\") " Sep 30 07:51:05 crc kubenswrapper[4760]: I0930 07:51:05.704369 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0d907232-c03b-44a4-a0b5-36ce5bf6d62b-logs" (OuterVolumeSpecName: "logs") pod "0d907232-c03b-44a4-a0b5-36ce5bf6d62b" (UID: "0d907232-c03b-44a4-a0b5-36ce5bf6d62b"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 07:51:05 crc kubenswrapper[4760]: I0930 07:51:05.708578 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0d907232-c03b-44a4-a0b5-36ce5bf6d62b-kube-api-access-m75zc" (OuterVolumeSpecName: "kube-api-access-m75zc") pod "0d907232-c03b-44a4-a0b5-36ce5bf6d62b" (UID: "0d907232-c03b-44a4-a0b5-36ce5bf6d62b"). InnerVolumeSpecName "kube-api-access-m75zc". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 07:51:05 crc kubenswrapper[4760]: I0930 07:51:05.710390 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b7668a34-2cd6-4873-a55a-27839a612a2b-logs" (OuterVolumeSpecName: "logs") pod "b7668a34-2cd6-4873-a55a-27839a612a2b" (UID: "b7668a34-2cd6-4873-a55a-27839a612a2b"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 07:51:05 crc kubenswrapper[4760]: I0930 07:51:05.713699 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0d907232-c03b-44a4-a0b5-36ce5bf6d62b-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "0d907232-c03b-44a4-a0b5-36ce5bf6d62b" (UID: "0d907232-c03b-44a4-a0b5-36ce5bf6d62b"). InnerVolumeSpecName "horizon-secret-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 07:51:05 crc kubenswrapper[4760]: I0930 07:51:05.747451 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b7668a34-2cd6-4873-a55a-27839a612a2b-kube-api-access-xsh25" (OuterVolumeSpecName: "kube-api-access-xsh25") pod "b7668a34-2cd6-4873-a55a-27839a612a2b" (UID: "b7668a34-2cd6-4873-a55a-27839a612a2b"). InnerVolumeSpecName "kube-api-access-xsh25". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 07:51:05 crc kubenswrapper[4760]: I0930 07:51:05.756801 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b7668a34-2cd6-4873-a55a-27839a612a2b-config-data" (OuterVolumeSpecName: "config-data") pod "b7668a34-2cd6-4873-a55a-27839a612a2b" (UID: "b7668a34-2cd6-4873-a55a-27839a612a2b"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 07:51:05 crc kubenswrapper[4760]: I0930 07:51:05.756884 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b7668a34-2cd6-4873-a55a-27839a612a2b-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "b7668a34-2cd6-4873-a55a-27839a612a2b" (UID: "b7668a34-2cd6-4873-a55a-27839a612a2b"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 07:51:05 crc kubenswrapper[4760]: I0930 07:51:05.761106 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0d907232-c03b-44a4-a0b5-36ce5bf6d62b-scripts" (OuterVolumeSpecName: "scripts") pod "0d907232-c03b-44a4-a0b5-36ce5bf6d62b" (UID: "0d907232-c03b-44a4-a0b5-36ce5bf6d62b"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 07:51:05 crc kubenswrapper[4760]: I0930 07:51:05.785408 4760 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-746f884dcc-8lkjw"] Sep 30 07:51:05 crc kubenswrapper[4760]: I0930 07:51:05.795547 4760 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/0d907232-c03b-44a4-a0b5-36ce5bf6d62b-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Sep 30 07:51:05 crc kubenswrapper[4760]: I0930 07:51:05.795578 4760 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m75zc\" (UniqueName: \"kubernetes.io/projected/0d907232-c03b-44a4-a0b5-36ce5bf6d62b-kube-api-access-m75zc\") on node \"crc\" DevicePath \"\"" Sep 30 07:51:05 crc kubenswrapper[4760]: I0930 07:51:05.795587 4760 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b7668a34-2cd6-4873-a55a-27839a612a2b-logs\") on node \"crc\" DevicePath \"\"" Sep 30 07:51:05 crc kubenswrapper[4760]: I0930 07:51:05.795595 4760 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0d907232-c03b-44a4-a0b5-36ce5bf6d62b-logs\") on node \"crc\" DevicePath \"\"" Sep 30 07:51:05 crc kubenswrapper[4760]: I0930 07:51:05.795603 4760 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/b7668a34-2cd6-4873-a55a-27839a612a2b-config-data\") on node \"crc\" DevicePath \"\"" Sep 30 07:51:05 crc kubenswrapper[4760]: I0930 07:51:05.795612 4760 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xsh25\" (UniqueName: \"kubernetes.io/projected/b7668a34-2cd6-4873-a55a-27839a612a2b-kube-api-access-xsh25\") on node \"crc\" DevicePath \"\"" Sep 30 07:51:05 crc kubenswrapper[4760]: I0930 07:51:05.795620 4760 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: 
\"kubernetes.io/configmap/0d907232-c03b-44a4-a0b5-36ce5bf6d62b-scripts\") on node \"crc\" DevicePath \"\"" Sep 30 07:51:05 crc kubenswrapper[4760]: I0930 07:51:05.795628 4760 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/b7668a34-2cd6-4873-a55a-27839a612a2b-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Sep 30 07:51:05 crc kubenswrapper[4760]: I0930 07:51:05.804509 4760 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-746f884dcc-8lkjw"] Sep 30 07:51:05 crc kubenswrapper[4760]: I0930 07:51:05.812422 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-65c7cb7cc8-cjqdk"] Sep 30 07:51:05 crc kubenswrapper[4760]: I0930 07:51:05.814711 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0d907232-c03b-44a4-a0b5-36ce5bf6d62b-config-data" (OuterVolumeSpecName: "config-data") pod "0d907232-c03b-44a4-a0b5-36ce5bf6d62b" (UID: "0d907232-c03b-44a4-a0b5-36ce5bf6d62b"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 07:51:05 crc kubenswrapper[4760]: I0930 07:51:05.833024 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b7668a34-2cd6-4873-a55a-27839a612a2b-scripts" (OuterVolumeSpecName: "scripts") pod "b7668a34-2cd6-4873-a55a-27839a612a2b" (UID: "b7668a34-2cd6-4873-a55a-27839a612a2b"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 07:51:05 crc kubenswrapper[4760]: I0930 07:51:05.844910 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Sep 30 07:51:05 crc kubenswrapper[4760]: I0930 07:51:05.855496 4760 scope.go:117] "RemoveContainer" containerID="2e8e9f85e632b06032bb32e36e15e521539ad1bfeea1fcaa3c7fb720729d78cf" Sep 30 07:51:05 crc kubenswrapper[4760]: E0930 07:51:05.855517 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified\\\"\"" pod="openstack/cinder-db-sync-9ml2s" podUID="4a2c7f25-ec97-4cb0-8475-b7b8d4be47d0" Sep 30 07:51:05 crc kubenswrapper[4760]: W0930 07:51:05.861044 4760 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod59d36e9c_c1d2_4da4_be6b_ddfbc9fd76b3.slice/crio-9034dc75aad02d9f71fa31695bb31efc9914f5f10c86102196a3cde31aac3e43 WatchSource:0}: Error finding container 9034dc75aad02d9f71fa31695bb31efc9914f5f10c86102196a3cde31aac3e43: Status 404 returned error can't find the container with id 9034dc75aad02d9f71fa31695bb31efc9914f5f10c86102196a3cde31aac3e43 Sep 30 07:51:05 crc kubenswrapper[4760]: W0930 07:51:05.864523 4760 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddec298fa_4de5_4a26_bc21_409707df4ddb.slice/crio-5be6b0e6208863a2eb05dc66dc0b70f59e0f635465838ba525a8e824b7a2cabe WatchSource:0}: Error finding container 5be6b0e6208863a2eb05dc66dc0b70f59e0f635465838ba525a8e824b7a2cabe: Status 404 returned error can't find the container with id 5be6b0e6208863a2eb05dc66dc0b70f59e0f635465838ba525a8e824b7a2cabe Sep 30 07:51:05 crc kubenswrapper[4760]: I0930 07:51:05.897201 4760 reconciler_common.go:293] "Volume detached for volume 
\"scripts\" (UniqueName: \"kubernetes.io/configmap/b7668a34-2cd6-4873-a55a-27839a612a2b-scripts\") on node \"crc\" DevicePath \"\"" Sep 30 07:51:05 crc kubenswrapper[4760]: I0930 07:51:05.897845 4760 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/0d907232-c03b-44a4-a0b5-36ce5bf6d62b-config-data\") on node \"crc\" DevicePath \"\"" Sep 30 07:51:05 crc kubenswrapper[4760]: I0930 07:51:05.959860 4760 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-5885b885f5-7v6mw"] Sep 30 07:51:05 crc kubenswrapper[4760]: I0930 07:51:05.966467 4760 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-5885b885f5-7v6mw"] Sep 30 07:51:05 crc kubenswrapper[4760]: I0930 07:51:05.969928 4760 scope.go:117] "RemoveContainer" containerID="0bdb08b78789e281874d3f19555926dccd223d02905679bb3a399172af61108a" Sep 30 07:51:06 crc kubenswrapper[4760]: I0930 07:51:06.049823 4760 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-fcfdd6f9f-j5p4t" podUID="da7c2ace-093d-4279-bad0-5f2876f4ab8d" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.149:5353: i/o timeout" Sep 30 07:51:06 crc kubenswrapper[4760]: I0930 07:51:06.409386 4760 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6d66f584d7-fvhgj"] Sep 30 07:51:06 crc kubenswrapper[4760]: I0930 07:51:06.471469 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-688c87cc99-2v8mj"] Sep 30 07:51:06 crc kubenswrapper[4760]: I0930 07:51:06.481233 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-7c45496686-rqmqg"] Sep 30 07:51:06 crc kubenswrapper[4760]: I0930 07:51:06.499824 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-api-0"] Sep 30 07:51:06 crc kubenswrapper[4760]: I0930 07:51:06.522367 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack/barbican-keystone-listener-554bb7d464-zcqc8"] Sep 30 07:51:06 crc kubenswrapper[4760]: W0930 07:51:06.540208 4760 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8adf3e3f_bdc8_4a02_90da_71e21926d9f9.slice/crio-62054de4f756785b3fa6792d180b878046f9c8309d18c2b080580160cedfe606 WatchSource:0}: Error finding container 62054de4f756785b3fa6792d180b878046f9c8309d18c2b080580160cedfe606: Status 404 returned error can't find the container with id 62054de4f756785b3fa6792d180b878046f9c8309d18c2b080580160cedfe606 Sep 30 07:51:06 crc kubenswrapper[4760]: I0930 07:51:06.563133 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-8479dd9dbc-25wxx"] Sep 30 07:51:06 crc kubenswrapper[4760]: I0930 07:51:06.587815 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-55ffd7b5b9-x7zhf"] Sep 30 07:51:06 crc kubenswrapper[4760]: E0930 07:51:06.588587 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b7668a34-2cd6-4873-a55a-27839a612a2b" containerName="horizon" Sep 30 07:51:06 crc kubenswrapper[4760]: I0930 07:51:06.588607 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="b7668a34-2cd6-4873-a55a-27839a612a2b" containerName="horizon" Sep 30 07:51:06 crc kubenswrapper[4760]: E0930 07:51:06.588618 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0d907232-c03b-44a4-a0b5-36ce5bf6d62b" containerName="horizon-log" Sep 30 07:51:06 crc kubenswrapper[4760]: I0930 07:51:06.588624 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="0d907232-c03b-44a4-a0b5-36ce5bf6d62b" containerName="horizon-log" Sep 30 07:51:06 crc kubenswrapper[4760]: E0930 07:51:06.588647 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b7668a34-2cd6-4873-a55a-27839a612a2b" containerName="horizon-log" Sep 30 07:51:06 crc kubenswrapper[4760]: I0930 07:51:06.588652 4760 state_mem.go:107] "Deleted 
CPUSet assignment" podUID="b7668a34-2cd6-4873-a55a-27839a612a2b" containerName="horizon-log" Sep 30 07:51:06 crc kubenswrapper[4760]: E0930 07:51:06.588671 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f8f42596-0ba9-41d9-a780-b8b3705b963c" containerName="horizon" Sep 30 07:51:06 crc kubenswrapper[4760]: I0930 07:51:06.588677 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="f8f42596-0ba9-41d9-a780-b8b3705b963c" containerName="horizon" Sep 30 07:51:06 crc kubenswrapper[4760]: E0930 07:51:06.588696 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0d907232-c03b-44a4-a0b5-36ce5bf6d62b" containerName="horizon" Sep 30 07:51:06 crc kubenswrapper[4760]: I0930 07:51:06.588701 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="0d907232-c03b-44a4-a0b5-36ce5bf6d62b" containerName="horizon" Sep 30 07:51:06 crc kubenswrapper[4760]: I0930 07:51:06.589594 4760 memory_manager.go:354] "RemoveStaleState removing state" podUID="0d907232-c03b-44a4-a0b5-36ce5bf6d62b" containerName="horizon-log" Sep 30 07:51:06 crc kubenswrapper[4760]: I0930 07:51:06.589697 4760 memory_manager.go:354] "RemoveStaleState removing state" podUID="b7668a34-2cd6-4873-a55a-27839a612a2b" containerName="horizon" Sep 30 07:51:06 crc kubenswrapper[4760]: I0930 07:51:06.589775 4760 memory_manager.go:354] "RemoveStaleState removing state" podUID="0d907232-c03b-44a4-a0b5-36ce5bf6d62b" containerName="horizon" Sep 30 07:51:06 crc kubenswrapper[4760]: I0930 07:51:06.589866 4760 memory_manager.go:354] "RemoveStaleState removing state" podUID="f8f42596-0ba9-41d9-a780-b8b3705b963c" containerName="horizon" Sep 30 07:51:06 crc kubenswrapper[4760]: I0930 07:51:06.589949 4760 memory_manager.go:354] "RemoveStaleState removing state" podUID="b7668a34-2cd6-4873-a55a-27839a612a2b" containerName="horizon-log" Sep 30 07:51:06 crc kubenswrapper[4760]: I0930 07:51:06.591463 4760 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-55ffd7b5b9-x7zhf" Sep 30 07:51:06 crc kubenswrapper[4760]: I0930 07:51:06.598956 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-public-svc" Sep 30 07:51:06 crc kubenswrapper[4760]: I0930 07:51:06.599404 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-internal-svc" Sep 30 07:51:06 crc kubenswrapper[4760]: I0930 07:51:06.614927 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-55ffd7b5b9-x7zhf"] Sep 30 07:51:06 crc kubenswrapper[4760]: I0930 07:51:06.628852 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-79bccb96b8-8wjx5"] Sep 30 07:51:06 crc kubenswrapper[4760]: I0930 07:51:06.692193 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6d66f584d7-fvhgj" event={"ID":"ac8c51b1-6f4a-4674-bf9c-dd6fb12dbf60","Type":"ContainerStarted","Data":"222e521a60066eaabc93833a645c859556993cb854c2dda49649890d10bb1aab"} Sep 30 07:51:06 crc kubenswrapper[4760]: I0930 07:51:06.696409 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-8479dd9dbc-25wxx" event={"ID":"caf10164-5c77-42df-9fdc-b6a1764a0e3d","Type":"ContainerStarted","Data":"09701e8832729cea7c5471dd4d5e22009da502c06ea7344f394483b827ad5957"} Sep 30 07:51:06 crc kubenswrapper[4760]: I0930 07:51:06.698191 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"dec298fa-4de5-4a26-bc21-409707df4ddb","Type":"ContainerStarted","Data":"5be6b0e6208863a2eb05dc66dc0b70f59e0f635465838ba525a8e824b7a2cabe"} Sep 30 07:51:06 crc kubenswrapper[4760]: I0930 07:51:06.702913 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-7c45496686-rqmqg" event={"ID":"8adf3e3f-bdc8-4a02-90da-71e21926d9f9","Type":"ContainerStarted","Data":"62054de4f756785b3fa6792d180b878046f9c8309d18c2b080580160cedfe606"} Sep 30 
07:51:06 crc kubenswrapper[4760]: I0930 07:51:06.705741 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-6d547648b9-kxd9p" event={"ID":"b7668a34-2cd6-4873-a55a-27839a612a2b","Type":"ContainerDied","Data":"57575924b02b4da28daeadfe08ce08924dc06c0caa5dd5f39dfbf68bcfa03848"} Sep 30 07:51:06 crc kubenswrapper[4760]: I0930 07:51:06.705789 4760 scope.go:117] "RemoveContainer" containerID="c9f51e05fcba75797df060a5a1ac980cc089f7cde31e72b0efb1fc5117351629" Sep 30 07:51:06 crc kubenswrapper[4760]: I0930 07:51:06.705822 4760 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-6d547648b9-kxd9p" Sep 30 07:51:06 crc kubenswrapper[4760]: I0930 07:51:06.707625 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-554bb7d464-zcqc8" event={"ID":"968a427d-0cee-4775-ab7f-4ec27e535b33","Type":"ContainerStarted","Data":"cb2cd0608dcffbdb82cdca131e54e03245eeb702481f2dca59de02414978b244"} Sep 30 07:51:06 crc kubenswrapper[4760]: I0930 07:51:06.711328 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-api-0" event={"ID":"79c24341-f615-4dcf-818f-e1c398e2504d","Type":"ContainerStarted","Data":"4d4a269cf126420b5923aaf62fce720197b1228bfca16675b960b999f0814682"} Sep 30 07:51:06 crc kubenswrapper[4760]: I0930 07:51:06.719272 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/6ba8e9e7-1b15-4283-9a3a-0e9515d10bd2-config\") pod \"neutron-55ffd7b5b9-x7zhf\" (UID: \"6ba8e9e7-1b15-4283-9a3a-0e9515d10bd2\") " pod="openstack/neutron-55ffd7b5b9-x7zhf" Sep 30 07:51:06 crc kubenswrapper[4760]: I0930 07:51:06.719421 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qc46p\" (UniqueName: \"kubernetes.io/projected/6ba8e9e7-1b15-4283-9a3a-0e9515d10bd2-kube-api-access-qc46p\") pod \"neutron-55ffd7b5b9-x7zhf\" 
(UID: \"6ba8e9e7-1b15-4283-9a3a-0e9515d10bd2\") " pod="openstack/neutron-55ffd7b5b9-x7zhf" Sep 30 07:51:06 crc kubenswrapper[4760]: I0930 07:51:06.719462 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6ba8e9e7-1b15-4283-9a3a-0e9515d10bd2-combined-ca-bundle\") pod \"neutron-55ffd7b5b9-x7zhf\" (UID: \"6ba8e9e7-1b15-4283-9a3a-0e9515d10bd2\") " pod="openstack/neutron-55ffd7b5b9-x7zhf" Sep 30 07:51:06 crc kubenswrapper[4760]: I0930 07:51:06.719486 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/6ba8e9e7-1b15-4283-9a3a-0e9515d10bd2-httpd-config\") pod \"neutron-55ffd7b5b9-x7zhf\" (UID: \"6ba8e9e7-1b15-4283-9a3a-0e9515d10bd2\") " pod="openstack/neutron-55ffd7b5b9-x7zhf" Sep 30 07:51:06 crc kubenswrapper[4760]: I0930 07:51:06.719508 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6ba8e9e7-1b15-4283-9a3a-0e9515d10bd2-internal-tls-certs\") pod \"neutron-55ffd7b5b9-x7zhf\" (UID: \"6ba8e9e7-1b15-4283-9a3a-0e9515d10bd2\") " pod="openstack/neutron-55ffd7b5b9-x7zhf" Sep 30 07:51:06 crc kubenswrapper[4760]: I0930 07:51:06.719553 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/6ba8e9e7-1b15-4283-9a3a-0e9515d10bd2-public-tls-certs\") pod \"neutron-55ffd7b5b9-x7zhf\" (UID: \"6ba8e9e7-1b15-4283-9a3a-0e9515d10bd2\") " pod="openstack/neutron-55ffd7b5b9-x7zhf" Sep 30 07:51:06 crc kubenswrapper[4760]: I0930 07:51:06.719572 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/6ba8e9e7-1b15-4283-9a3a-0e9515d10bd2-ovndb-tls-certs\") pod 
\"neutron-55ffd7b5b9-x7zhf\" (UID: \"6ba8e9e7-1b15-4283-9a3a-0e9515d10bd2\") " pod="openstack/neutron-55ffd7b5b9-x7zhf" Sep 30 07:51:06 crc kubenswrapper[4760]: I0930 07:51:06.720614 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-65c7cb7cc8-cjqdk" event={"ID":"59d36e9c-c1d2-4da4-be6b-ddfbc9fd76b3","Type":"ContainerStarted","Data":"7f51b21999305cca7e88a53ef7071a92ed0ecd4b1de897c3e8d7a23ac081a1b8"} Sep 30 07:51:06 crc kubenswrapper[4760]: I0930 07:51:06.720655 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-65c7cb7cc8-cjqdk" event={"ID":"59d36e9c-c1d2-4da4-be6b-ddfbc9fd76b3","Type":"ContainerStarted","Data":"9034dc75aad02d9f71fa31695bb31efc9914f5f10c86102196a3cde31aac3e43"} Sep 30 07:51:06 crc kubenswrapper[4760]: I0930 07:51:06.721605 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-65c7cb7cc8-cjqdk" Sep 30 07:51:06 crc kubenswrapper[4760]: I0930 07:51:06.725508 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-688c87cc99-2v8mj" event={"ID":"c3f46a87-7a28-45d7-ad4d-98b0ce508557","Type":"ContainerStarted","Data":"630c32481d53679d7f64e2c8b914c9022b001eb01a3a11b8609cb34fe07ec6a0"} Sep 30 07:51:06 crc kubenswrapper[4760]: I0930 07:51:06.728404 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"bbf081e8-29d3-46ed-8474-e027d6c28c1d","Type":"ContainerStarted","Data":"15cfa7febc327d6ae0fa0a81ca83ab7a264a545a3590f385be1ee68176fe88f2"} Sep 30 07:51:06 crc kubenswrapper[4760]: I0930 07:51:06.737107 4760 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-6d547648b9-kxd9p"] Sep 30 07:51:06 crc kubenswrapper[4760]: I0930 07:51:06.751021 4760 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-6d547648b9-kxd9p"] Sep 30 07:51:06 crc kubenswrapper[4760]: I0930 07:51:06.763525 4760 pod_startup_latency_tracker.go:104] "Observed pod 
startup duration" pod="openstack/barbican-api-65c7cb7cc8-cjqdk" podStartSLOduration=10.763508319 podStartE2EDuration="10.763508319s" podCreationTimestamp="2025-09-30 07:50:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 07:51:06.755476644 +0000 UTC m=+1052.398383066" watchObservedRunningTime="2025-09-30 07:51:06.763508319 +0000 UTC m=+1052.406414731" Sep 30 07:51:06 crc kubenswrapper[4760]: I0930 07:51:06.802612 4760 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=20.802590075 podStartE2EDuration="20.802590075s" podCreationTimestamp="2025-09-30 07:50:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 07:51:06.78789909 +0000 UTC m=+1052.430805512" watchObservedRunningTime="2025-09-30 07:51:06.802590075 +0000 UTC m=+1052.445496487" Sep 30 07:51:06 crc kubenswrapper[4760]: I0930 07:51:06.821365 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qc46p\" (UniqueName: \"kubernetes.io/projected/6ba8e9e7-1b15-4283-9a3a-0e9515d10bd2-kube-api-access-qc46p\") pod \"neutron-55ffd7b5b9-x7zhf\" (UID: \"6ba8e9e7-1b15-4283-9a3a-0e9515d10bd2\") " pod="openstack/neutron-55ffd7b5b9-x7zhf" Sep 30 07:51:06 crc kubenswrapper[4760]: I0930 07:51:06.821777 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6ba8e9e7-1b15-4283-9a3a-0e9515d10bd2-combined-ca-bundle\") pod \"neutron-55ffd7b5b9-x7zhf\" (UID: \"6ba8e9e7-1b15-4283-9a3a-0e9515d10bd2\") " pod="openstack/neutron-55ffd7b5b9-x7zhf" Sep 30 07:51:06 crc kubenswrapper[4760]: I0930 07:51:06.821802 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: 
\"kubernetes.io/secret/6ba8e9e7-1b15-4283-9a3a-0e9515d10bd2-httpd-config\") pod \"neutron-55ffd7b5b9-x7zhf\" (UID: \"6ba8e9e7-1b15-4283-9a3a-0e9515d10bd2\") " pod="openstack/neutron-55ffd7b5b9-x7zhf" Sep 30 07:51:06 crc kubenswrapper[4760]: I0930 07:51:06.821826 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6ba8e9e7-1b15-4283-9a3a-0e9515d10bd2-internal-tls-certs\") pod \"neutron-55ffd7b5b9-x7zhf\" (UID: \"6ba8e9e7-1b15-4283-9a3a-0e9515d10bd2\") " pod="openstack/neutron-55ffd7b5b9-x7zhf" Sep 30 07:51:06 crc kubenswrapper[4760]: I0930 07:51:06.821888 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/6ba8e9e7-1b15-4283-9a3a-0e9515d10bd2-public-tls-certs\") pod \"neutron-55ffd7b5b9-x7zhf\" (UID: \"6ba8e9e7-1b15-4283-9a3a-0e9515d10bd2\") " pod="openstack/neutron-55ffd7b5b9-x7zhf" Sep 30 07:51:06 crc kubenswrapper[4760]: I0930 07:51:06.821904 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/6ba8e9e7-1b15-4283-9a3a-0e9515d10bd2-ovndb-tls-certs\") pod \"neutron-55ffd7b5b9-x7zhf\" (UID: \"6ba8e9e7-1b15-4283-9a3a-0e9515d10bd2\") " pod="openstack/neutron-55ffd7b5b9-x7zhf" Sep 30 07:51:06 crc kubenswrapper[4760]: I0930 07:51:06.821992 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/6ba8e9e7-1b15-4283-9a3a-0e9515d10bd2-config\") pod \"neutron-55ffd7b5b9-x7zhf\" (UID: \"6ba8e9e7-1b15-4283-9a3a-0e9515d10bd2\") " pod="openstack/neutron-55ffd7b5b9-x7zhf" Sep 30 07:51:06 crc kubenswrapper[4760]: I0930 07:51:06.828840 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/6ba8e9e7-1b15-4283-9a3a-0e9515d10bd2-config\") pod \"neutron-55ffd7b5b9-x7zhf\" (UID: 
\"6ba8e9e7-1b15-4283-9a3a-0e9515d10bd2\") " pod="openstack/neutron-55ffd7b5b9-x7zhf" Sep 30 07:51:06 crc kubenswrapper[4760]: I0930 07:51:06.828858 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/6ba8e9e7-1b15-4283-9a3a-0e9515d10bd2-httpd-config\") pod \"neutron-55ffd7b5b9-x7zhf\" (UID: \"6ba8e9e7-1b15-4283-9a3a-0e9515d10bd2\") " pod="openstack/neutron-55ffd7b5b9-x7zhf" Sep 30 07:51:06 crc kubenswrapper[4760]: I0930 07:51:06.828832 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6ba8e9e7-1b15-4283-9a3a-0e9515d10bd2-internal-tls-certs\") pod \"neutron-55ffd7b5b9-x7zhf\" (UID: \"6ba8e9e7-1b15-4283-9a3a-0e9515d10bd2\") " pod="openstack/neutron-55ffd7b5b9-x7zhf" Sep 30 07:51:06 crc kubenswrapper[4760]: I0930 07:51:06.829584 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6ba8e9e7-1b15-4283-9a3a-0e9515d10bd2-combined-ca-bundle\") pod \"neutron-55ffd7b5b9-x7zhf\" (UID: \"6ba8e9e7-1b15-4283-9a3a-0e9515d10bd2\") " pod="openstack/neutron-55ffd7b5b9-x7zhf" Sep 30 07:51:06 crc kubenswrapper[4760]: I0930 07:51:06.831029 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/6ba8e9e7-1b15-4283-9a3a-0e9515d10bd2-ovndb-tls-certs\") pod \"neutron-55ffd7b5b9-x7zhf\" (UID: \"6ba8e9e7-1b15-4283-9a3a-0e9515d10bd2\") " pod="openstack/neutron-55ffd7b5b9-x7zhf" Sep 30 07:51:06 crc kubenswrapper[4760]: I0930 07:51:06.842702 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/6ba8e9e7-1b15-4283-9a3a-0e9515d10bd2-public-tls-certs\") pod \"neutron-55ffd7b5b9-x7zhf\" (UID: \"6ba8e9e7-1b15-4283-9a3a-0e9515d10bd2\") " pod="openstack/neutron-55ffd7b5b9-x7zhf" Sep 30 07:51:06 crc kubenswrapper[4760]: I0930 
07:51:06.844985 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qc46p\" (UniqueName: \"kubernetes.io/projected/6ba8e9e7-1b15-4283-9a3a-0e9515d10bd2-kube-api-access-qc46p\") pod \"neutron-55ffd7b5b9-x7zhf\" (UID: \"6ba8e9e7-1b15-4283-9a3a-0e9515d10bd2\") " pod="openstack/neutron-55ffd7b5b9-x7zhf" Sep 30 07:51:06 crc kubenswrapper[4760]: I0930 07:51:06.941384 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-55ffd7b5b9-x7zhf" Sep 30 07:51:07 crc kubenswrapper[4760]: I0930 07:51:07.022577 4760 scope.go:117] "RemoveContainer" containerID="7da8cf7bc3a971e417b13433c8641faa22c49b2b1b7fd3507892848e0fb76724" Sep 30 07:51:07 crc kubenswrapper[4760]: I0930 07:51:07.087146 4760 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0d907232-c03b-44a4-a0b5-36ce5bf6d62b" path="/var/lib/kubelet/pods/0d907232-c03b-44a4-a0b5-36ce5bf6d62b/volumes" Sep 30 07:51:07 crc kubenswrapper[4760]: I0930 07:51:07.088259 4760 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b7668a34-2cd6-4873-a55a-27839a612a2b" path="/var/lib/kubelet/pods/b7668a34-2cd6-4873-a55a-27839a612a2b/volumes" Sep 30 07:51:07 crc kubenswrapper[4760]: I0930 07:51:07.089099 4760 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f8f42596-0ba9-41d9-a780-b8b3705b963c" path="/var/lib/kubelet/pods/f8f42596-0ba9-41d9-a780-b8b3705b963c/volumes" Sep 30 07:51:07 crc kubenswrapper[4760]: I0930 07:51:07.288253 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-65c7cb7cc8-cjqdk" Sep 30 07:51:07 crc kubenswrapper[4760]: I0930 07:51:07.636448 4760 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/horizon-75644c8bb4-wrsmv" Sep 30 07:51:07 crc kubenswrapper[4760]: I0930 07:51:07.774144 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-65c7cb7cc8-cjqdk" 
event={"ID":"59d36e9c-c1d2-4da4-be6b-ddfbc9fd76b3","Type":"ContainerStarted","Data":"24a9f9479dde30a2437461aafd3679c280b06feaba6c8e5017135acac15ec46d"} Sep 30 07:51:07 crc kubenswrapper[4760]: I0930 07:51:07.776507 4760 generic.go:334] "Generic (PLEG): container finished" podID="c3f46a87-7a28-45d7-ad4d-98b0ce508557" containerID="8b87c0f157d670eb6185eca30b743e8ba33ecf4a6f22073bd94584171aae30bd" exitCode=0 Sep 30 07:51:07 crc kubenswrapper[4760]: I0930 07:51:07.776561 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-688c87cc99-2v8mj" event={"ID":"c3f46a87-7a28-45d7-ad4d-98b0ce508557","Type":"ContainerDied","Data":"8b87c0f157d670eb6185eca30b743e8ba33ecf4a6f22073bd94584171aae30bd"} Sep 30 07:51:07 crc kubenswrapper[4760]: I0930 07:51:07.784144 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-79bccb96b8-8wjx5" event={"ID":"f3ec89e4-64de-42eb-959a-c064d716f0f3","Type":"ContainerStarted","Data":"6b934f7a242cca97ceaa8b52af243c4cd67feceb3bc8fcfd262e5081f8a11717"} Sep 30 07:51:07 crc kubenswrapper[4760]: I0930 07:51:07.784183 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-79bccb96b8-8wjx5" event={"ID":"f3ec89e4-64de-42eb-959a-c064d716f0f3","Type":"ContainerStarted","Data":"c340b3235bd6b1753de898cb671d06b4037fecac4448f26d0d563dee3140e525"} Sep 30 07:51:07 crc kubenswrapper[4760]: I0930 07:51:07.785771 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"dec298fa-4de5-4a26-bc21-409707df4ddb","Type":"ContainerStarted","Data":"593e37b4d6f18506afadd01f354e5c33e3943695339a78fd18717f7241e269a9"} Sep 30 07:51:07 crc kubenswrapper[4760]: I0930 07:51:07.787144 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-7c45496686-rqmqg" event={"ID":"8adf3e3f-bdc8-4a02-90da-71e21926d9f9","Type":"ContainerStarted","Data":"3876410d63cccf30eeaa1015e4141ae3a67152401d345ab60b426a37063339e5"} Sep 30 07:51:07 crc 
kubenswrapper[4760]: I0930 07:51:07.787164 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-7c45496686-rqmqg" event={"ID":"8adf3e3f-bdc8-4a02-90da-71e21926d9f9","Type":"ContainerStarted","Data":"96bb7a19fcd2050c43a88fbf461967d21c7eb055f9facdafc53015b05b558e94"} Sep 30 07:51:07 crc kubenswrapper[4760]: I0930 07:51:07.787405 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-7c45496686-rqmqg" Sep 30 07:51:07 crc kubenswrapper[4760]: I0930 07:51:07.787524 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-7c45496686-rqmqg" Sep 30 07:51:07 crc kubenswrapper[4760]: I0930 07:51:07.804769 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-api-0" event={"ID":"79c24341-f615-4dcf-818f-e1c398e2504d","Type":"ContainerStarted","Data":"6894b1fedbc0809e01e3772afab90f9bc97c4178268918e5e8dd5b3ca1a22e67"} Sep 30 07:51:07 crc kubenswrapper[4760]: I0930 07:51:07.804809 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-api-0" event={"ID":"79c24341-f615-4dcf-818f-e1c398e2504d","Type":"ContainerStarted","Data":"fa4b880d68114f66040a29cc05b254b39fd992e892a16a219dcc0339b46a25c5"} Sep 30 07:51:07 crc kubenswrapper[4760]: I0930 07:51:07.805793 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/watcher-api-0" Sep 30 07:51:07 crc kubenswrapper[4760]: I0930 07:51:07.815819 4760 generic.go:334] "Generic (PLEG): container finished" podID="ac8c51b1-6f4a-4674-bf9c-dd6fb12dbf60" containerID="52b656d8ac370fe81a114f2ae64afdf6a5152acb2989d50cae3603a9eaef716e" exitCode=0 Sep 30 07:51:07 crc kubenswrapper[4760]: I0930 07:51:07.816725 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6d66f584d7-fvhgj" event={"ID":"ac8c51b1-6f4a-4674-bf9c-dd6fb12dbf60","Type":"ContainerDied","Data":"52b656d8ac370fe81a114f2ae64afdf6a5152acb2989d50cae3603a9eaef716e"} Sep 30 07:51:07 
crc kubenswrapper[4760]: I0930 07:51:07.820670 4760 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/watcher-api-0" podUID="79c24341-f615-4dcf-818f-e1c398e2504d" containerName="watcher-api" probeResult="failure" output="Get \"https://10.217.0.175:9322/\": dial tcp 10.217.0.175:9322: connect: connection refused" Sep 30 07:51:07 crc kubenswrapper[4760]: I0930 07:51:07.938128 4760 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/watcher-api-0" podStartSLOduration=4.938105494 podStartE2EDuration="4.938105494s" podCreationTimestamp="2025-09-30 07:51:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 07:51:07.835020008 +0000 UTC m=+1053.477926440" watchObservedRunningTime="2025-09-30 07:51:07.938105494 +0000 UTC m=+1053.581011906" Sep 30 07:51:07 crc kubenswrapper[4760]: I0930 07:51:07.958658 4760 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-7c45496686-rqmqg" podStartSLOduration=13.958645118 podStartE2EDuration="13.958645118s" podCreationTimestamp="2025-09-30 07:50:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 07:51:07.870640725 +0000 UTC m=+1053.513547137" watchObservedRunningTime="2025-09-30 07:51:07.958645118 +0000 UTC m=+1053.601551530" Sep 30 07:51:07 crc kubenswrapper[4760]: I0930 07:51:07.990997 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-55ffd7b5b9-x7zhf"] Sep 30 07:51:08 crc kubenswrapper[4760]: I0930 07:51:08.149067 4760 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/horizon-694db87c64-qrwhp" Sep 30 07:51:08 crc kubenswrapper[4760]: I0930 07:51:08.479889 4760 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6d66f584d7-fvhgj" Sep 30 07:51:08 crc kubenswrapper[4760]: I0930 07:51:08.590458 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mqggl\" (UniqueName: \"kubernetes.io/projected/ac8c51b1-6f4a-4674-bf9c-dd6fb12dbf60-kube-api-access-mqggl\") pod \"ac8c51b1-6f4a-4674-bf9c-dd6fb12dbf60\" (UID: \"ac8c51b1-6f4a-4674-bf9c-dd6fb12dbf60\") " Sep 30 07:51:08 crc kubenswrapper[4760]: I0930 07:51:08.590570 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ac8c51b1-6f4a-4674-bf9c-dd6fb12dbf60-dns-swift-storage-0\") pod \"ac8c51b1-6f4a-4674-bf9c-dd6fb12dbf60\" (UID: \"ac8c51b1-6f4a-4674-bf9c-dd6fb12dbf60\") " Sep 30 07:51:08 crc kubenswrapper[4760]: I0930 07:51:08.590604 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ac8c51b1-6f4a-4674-bf9c-dd6fb12dbf60-ovsdbserver-nb\") pod \"ac8c51b1-6f4a-4674-bf9c-dd6fb12dbf60\" (UID: \"ac8c51b1-6f4a-4674-bf9c-dd6fb12dbf60\") " Sep 30 07:51:08 crc kubenswrapper[4760]: I0930 07:51:08.590679 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ac8c51b1-6f4a-4674-bf9c-dd6fb12dbf60-dns-svc\") pod \"ac8c51b1-6f4a-4674-bf9c-dd6fb12dbf60\" (UID: \"ac8c51b1-6f4a-4674-bf9c-dd6fb12dbf60\") " Sep 30 07:51:08 crc kubenswrapper[4760]: I0930 07:51:08.590826 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ac8c51b1-6f4a-4674-bf9c-dd6fb12dbf60-ovsdbserver-sb\") pod \"ac8c51b1-6f4a-4674-bf9c-dd6fb12dbf60\" (UID: \"ac8c51b1-6f4a-4674-bf9c-dd6fb12dbf60\") " Sep 30 07:51:08 crc kubenswrapper[4760]: I0930 07:51:08.590869 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"config\" (UniqueName: \"kubernetes.io/configmap/ac8c51b1-6f4a-4674-bf9c-dd6fb12dbf60-config\") pod \"ac8c51b1-6f4a-4674-bf9c-dd6fb12dbf60\" (UID: \"ac8c51b1-6f4a-4674-bf9c-dd6fb12dbf60\") " Sep 30 07:51:08 crc kubenswrapper[4760]: I0930 07:51:08.596709 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ac8c51b1-6f4a-4674-bf9c-dd6fb12dbf60-kube-api-access-mqggl" (OuterVolumeSpecName: "kube-api-access-mqggl") pod "ac8c51b1-6f4a-4674-bf9c-dd6fb12dbf60" (UID: "ac8c51b1-6f4a-4674-bf9c-dd6fb12dbf60"). InnerVolumeSpecName "kube-api-access-mqggl". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 07:51:08 crc kubenswrapper[4760]: I0930 07:51:08.624612 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ac8c51b1-6f4a-4674-bf9c-dd6fb12dbf60-config" (OuterVolumeSpecName: "config") pod "ac8c51b1-6f4a-4674-bf9c-dd6fb12dbf60" (UID: "ac8c51b1-6f4a-4674-bf9c-dd6fb12dbf60"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 07:51:08 crc kubenswrapper[4760]: I0930 07:51:08.625804 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ac8c51b1-6f4a-4674-bf9c-dd6fb12dbf60-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "ac8c51b1-6f4a-4674-bf9c-dd6fb12dbf60" (UID: "ac8c51b1-6f4a-4674-bf9c-dd6fb12dbf60"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 07:51:08 crc kubenswrapper[4760]: I0930 07:51:08.628430 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ac8c51b1-6f4a-4674-bf9c-dd6fb12dbf60-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "ac8c51b1-6f4a-4674-bf9c-dd6fb12dbf60" (UID: "ac8c51b1-6f4a-4674-bf9c-dd6fb12dbf60"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 07:51:08 crc kubenswrapper[4760]: I0930 07:51:08.648824 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ac8c51b1-6f4a-4674-bf9c-dd6fb12dbf60-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "ac8c51b1-6f4a-4674-bf9c-dd6fb12dbf60" (UID: "ac8c51b1-6f4a-4674-bf9c-dd6fb12dbf60"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 07:51:08 crc kubenswrapper[4760]: I0930 07:51:08.651411 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ac8c51b1-6f4a-4674-bf9c-dd6fb12dbf60-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "ac8c51b1-6f4a-4674-bf9c-dd6fb12dbf60" (UID: "ac8c51b1-6f4a-4674-bf9c-dd6fb12dbf60"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 07:51:08 crc kubenswrapper[4760]: I0930 07:51:08.692770 4760 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ac8c51b1-6f4a-4674-bf9c-dd6fb12dbf60-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Sep 30 07:51:08 crc kubenswrapper[4760]: I0930 07:51:08.692800 4760 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ac8c51b1-6f4a-4674-bf9c-dd6fb12dbf60-config\") on node \"crc\" DevicePath \"\"" Sep 30 07:51:08 crc kubenswrapper[4760]: I0930 07:51:08.692811 4760 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mqggl\" (UniqueName: \"kubernetes.io/projected/ac8c51b1-6f4a-4674-bf9c-dd6fb12dbf60-kube-api-access-mqggl\") on node \"crc\" DevicePath \"\"" Sep 30 07:51:08 crc kubenswrapper[4760]: I0930 07:51:08.692823 4760 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ac8c51b1-6f4a-4674-bf9c-dd6fb12dbf60-dns-swift-storage-0\") on node \"crc\" 
DevicePath \"\"" Sep 30 07:51:08 crc kubenswrapper[4760]: I0930 07:51:08.692833 4760 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ac8c51b1-6f4a-4674-bf9c-dd6fb12dbf60-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Sep 30 07:51:08 crc kubenswrapper[4760]: I0930 07:51:08.692841 4760 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ac8c51b1-6f4a-4674-bf9c-dd6fb12dbf60-dns-svc\") on node \"crc\" DevicePath \"\"" Sep 30 07:51:08 crc kubenswrapper[4760]: I0930 07:51:08.854760 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-688c87cc99-2v8mj" event={"ID":"c3f46a87-7a28-45d7-ad4d-98b0ce508557","Type":"ContainerStarted","Data":"4a40042be6d960c45e1eefa8089a41c216ac486c34b1cb31e9864e2d389a219b"} Sep 30 07:51:08 crc kubenswrapper[4760]: I0930 07:51:08.856322 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-688c87cc99-2v8mj" Sep 30 07:51:08 crc kubenswrapper[4760]: I0930 07:51:08.859888 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-79bccb96b8-8wjx5" event={"ID":"f3ec89e4-64de-42eb-959a-c064d716f0f3","Type":"ContainerStarted","Data":"6ebf877d72f8fe806749d8c4fb727fcb9bbcc7b73819d031594b52f988949d46"} Sep 30 07:51:08 crc kubenswrapper[4760]: I0930 07:51:08.860746 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-79bccb96b8-8wjx5" Sep 30 07:51:08 crc kubenswrapper[4760]: I0930 07:51:08.869391 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-55ffd7b5b9-x7zhf" event={"ID":"6ba8e9e7-1b15-4283-9a3a-0e9515d10bd2","Type":"ContainerStarted","Data":"a41309bc1b2e6f1150292ca0c558cb2c2eeb01b476dba4ba349850c814d82646"} Sep 30 07:51:08 crc kubenswrapper[4760]: I0930 07:51:08.869431 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-55ffd7b5b9-x7zhf" 
event={"ID":"6ba8e9e7-1b15-4283-9a3a-0e9515d10bd2","Type":"ContainerStarted","Data":"60dfe2b58dcce0f815fb934e1d7ed7aef6eaffd6bea512365a8a0304b5115eb3"} Sep 30 07:51:08 crc kubenswrapper[4760]: I0930 07:51:08.869440 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-55ffd7b5b9-x7zhf" event={"ID":"6ba8e9e7-1b15-4283-9a3a-0e9515d10bd2","Type":"ContainerStarted","Data":"a88681f33e3b1cb6722b7e4d579847c421a6c8543bb44d34343643643cb61a65"} Sep 30 07:51:08 crc kubenswrapper[4760]: I0930 07:51:08.870071 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-55ffd7b5b9-x7zhf" Sep 30 07:51:08 crc kubenswrapper[4760]: I0930 07:51:08.881485 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"dec298fa-4de5-4a26-bc21-409707df4ddb","Type":"ContainerStarted","Data":"0e3510ee726f0a74838e5a325ae1a1d3dbee09c20c8722d032ba23ebd4c965ae"} Sep 30 07:51:08 crc kubenswrapper[4760]: I0930 07:51:08.883785 4760 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-688c87cc99-2v8mj" podStartSLOduration=4.883761438 podStartE2EDuration="4.883761438s" podCreationTimestamp="2025-09-30 07:51:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 07:51:08.879849428 +0000 UTC m=+1054.522755840" watchObservedRunningTime="2025-09-30 07:51:08.883761438 +0000 UTC m=+1054.526667850" Sep 30 07:51:08 crc kubenswrapper[4760]: I0930 07:51:08.885557 4760 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6d66f584d7-fvhgj" Sep 30 07:51:08 crc kubenswrapper[4760]: I0930 07:51:08.888186 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6d66f584d7-fvhgj" event={"ID":"ac8c51b1-6f4a-4674-bf9c-dd6fb12dbf60","Type":"ContainerDied","Data":"222e521a60066eaabc93833a645c859556993cb854c2dda49649890d10bb1aab"} Sep 30 07:51:08 crc kubenswrapper[4760]: I0930 07:51:08.888221 4760 scope.go:117] "RemoveContainer" containerID="52b656d8ac370fe81a114f2ae64afdf6a5152acb2989d50cae3603a9eaef716e" Sep 30 07:51:08 crc kubenswrapper[4760]: I0930 07:51:08.915255 4760 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-55ffd7b5b9-x7zhf" podStartSLOduration=2.91523485 podStartE2EDuration="2.91523485s" podCreationTimestamp="2025-09-30 07:51:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 07:51:08.903969493 +0000 UTC m=+1054.546875905" watchObservedRunningTime="2025-09-30 07:51:08.91523485 +0000 UTC m=+1054.558141262" Sep 30 07:51:08 crc kubenswrapper[4760]: I0930 07:51:08.929171 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/watcher-api-0" Sep 30 07:51:08 crc kubenswrapper[4760]: I0930 07:51:08.936148 4760 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-79bccb96b8-8wjx5" podStartSLOduration=4.936124602 podStartE2EDuration="4.936124602s" podCreationTimestamp="2025-09-30 07:51:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 07:51:08.92427777 +0000 UTC m=+1054.567184182" watchObservedRunningTime="2025-09-30 07:51:08.936124602 +0000 UTC m=+1054.579031014" Sep 30 07:51:08 crc kubenswrapper[4760]: I0930 07:51:08.979068 4760 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/glance-default-external-api-0" podStartSLOduration=20.979043165 podStartE2EDuration="20.979043165s" podCreationTimestamp="2025-09-30 07:50:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 07:51:08.953132335 +0000 UTC m=+1054.596038747" watchObservedRunningTime="2025-09-30 07:51:08.979043165 +0000 UTC m=+1054.621949577" Sep 30 07:51:09 crc kubenswrapper[4760]: I0930 07:51:09.023940 4760 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6d66f584d7-fvhgj"] Sep 30 07:51:09 crc kubenswrapper[4760]: I0930 07:51:09.037621 4760 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6d66f584d7-fvhgj"] Sep 30 07:51:09 crc kubenswrapper[4760]: I0930 07:51:09.083480 4760 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ac8c51b1-6f4a-4674-bf9c-dd6fb12dbf60" path="/var/lib/kubelet/pods/ac8c51b1-6f4a-4674-bf9c-dd6fb12dbf60/volumes" Sep 30 07:51:09 crc kubenswrapper[4760]: I0930 07:51:09.872657 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/horizon-75644c8bb4-wrsmv" Sep 30 07:51:09 crc kubenswrapper[4760]: I0930 07:51:09.987256 4760 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-694db87c64-qrwhp"] Sep 30 07:51:09 crc kubenswrapper[4760]: I0930 07:51:09.987410 4760 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-694db87c64-qrwhp" podUID="e28d9015-ea18-4da0-bfd8-c2cc5dec4f37" containerName="horizon-log" containerID="cri-o://5d579c31c3d1a80d0194fb2ea0a7ddf6de3190ad92735e3f6c6c498a6b3f551b" gracePeriod=30 Sep 30 07:51:09 crc kubenswrapper[4760]: I0930 07:51:09.987753 4760 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-694db87c64-qrwhp" podUID="e28d9015-ea18-4da0-bfd8-c2cc5dec4f37" containerName="horizon" 
containerID="cri-o://ffd16f7b799eef3e712c57eb371aed74561eea257b32f299da023db303acb61b" gracePeriod=30
Sep 30 07:51:10 crc kubenswrapper[4760]: I0930 07:51:10.001786 4760 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-694db87c64-qrwhp" podUID="e28d9015-ea18-4da0-bfd8-c2cc5dec4f37" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.157:8443/dashboard/auth/login/?next=/dashboard/\": EOF"
Sep 30 07:51:11 crc kubenswrapper[4760]: I0930 07:51:11.006175 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-554bb7d464-zcqc8" event={"ID":"968a427d-0cee-4775-ab7f-4ec27e535b33","Type":"ContainerStarted","Data":"d287b5c9828c3903ea0d5034db53e29548a2f7277b3cbcaf1a4c61cccd6e3051"}
Sep 30 07:51:11 crc kubenswrapper[4760]: I0930 07:51:11.006856 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-554bb7d464-zcqc8" event={"ID":"968a427d-0cee-4775-ab7f-4ec27e535b33","Type":"ContainerStarted","Data":"ff9a63797a05dd2aa60ea147ab32f0aa620be4d71718cd91e03c0625218d6a04"}
Sep 30 07:51:11 crc kubenswrapper[4760]: I0930 07:51:11.014786 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-8479dd9dbc-25wxx" event={"ID":"caf10164-5c77-42df-9fdc-b6a1764a0e3d","Type":"ContainerStarted","Data":"5c7c1a45aceed7881770f30cfde415ed5b3be9fa9c22eb848d695538a4162c9a"}
Sep 30 07:51:11 crc kubenswrapper[4760]: I0930 07:51:11.014839 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-8479dd9dbc-25wxx" event={"ID":"caf10164-5c77-42df-9fdc-b6a1764a0e3d","Type":"ContainerStarted","Data":"8a48a4154c8de3b8666b04046a2737ce4597f83cbaddc31f6889ec4c27b9d7f7"}
Sep 30 07:51:11 crc kubenswrapper[4760]: I0930 07:51:11.014899 4760 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Sep 30 07:51:11 crc kubenswrapper[4760]: I0930 07:51:11.023851 4760 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-keystone-listener-554bb7d464-zcqc8" podStartSLOduration=13.523828889 podStartE2EDuration="17.023836021s" podCreationTimestamp="2025-09-30 07:50:54 +0000 UTC" firstStartedPulling="2025-09-30 07:51:06.570403519 +0000 UTC m=+1052.213309931" lastFinishedPulling="2025-09-30 07:51:10.070410661 +0000 UTC m=+1055.713317063" observedRunningTime="2025-09-30 07:51:11.020600239 +0000 UTC m=+1056.663506651" watchObservedRunningTime="2025-09-30 07:51:11.023836021 +0000 UTC m=+1056.666742433"
Sep 30 07:51:11 crc kubenswrapper[4760]: I0930 07:51:11.047151 4760 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-worker-8479dd9dbc-25wxx" podStartSLOduration=13.520294529 podStartE2EDuration="17.047134645s" podCreationTimestamp="2025-09-30 07:50:54 +0000 UTC" firstStartedPulling="2025-09-30 07:51:06.551023345 +0000 UTC m=+1052.193929757" lastFinishedPulling="2025-09-30 07:51:10.077863471 +0000 UTC m=+1055.720769873" observedRunningTime="2025-09-30 07:51:11.036976086 +0000 UTC m=+1056.679882498" watchObservedRunningTime="2025-09-30 07:51:11.047134645 +0000 UTC m=+1056.690041057"
Sep 30 07:51:11 crc kubenswrapper[4760]: I0930 07:51:11.369281 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/watcher-api-0"
Sep 30 07:51:13 crc kubenswrapper[4760]: I0930 07:51:13.433968 4760 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-694db87c64-qrwhp" podUID="e28d9015-ea18-4da0-bfd8-c2cc5dec4f37" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.157:8443/dashboard/auth/login/?next=/dashboard/\": read tcp 10.217.0.2:58350->10.217.0.157:8443: read: connection reset by peer"
Sep 30 07:51:13 crc kubenswrapper[4760]: I0930 07:51:13.854058 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-65c7cb7cc8-cjqdk"
Sep 30 07:51:13 crc kubenswrapper[4760]: I0930 07:51:13.929678 4760 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/watcher-api-0"
Sep 30 07:51:13 crc kubenswrapper[4760]: I0930 07:51:13.939277 4760 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/watcher-api-0"
Sep 30 07:51:14 crc kubenswrapper[4760]: I0930 07:51:14.048909 4760 generic.go:334] "Generic (PLEG): container finished" podID="e28d9015-ea18-4da0-bfd8-c2cc5dec4f37" containerID="ffd16f7b799eef3e712c57eb371aed74561eea257b32f299da023db303acb61b" exitCode=0
Sep 30 07:51:14 crc kubenswrapper[4760]: I0930 07:51:14.049523 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-694db87c64-qrwhp" event={"ID":"e28d9015-ea18-4da0-bfd8-c2cc5dec4f37","Type":"ContainerDied","Data":"ffd16f7b799eef3e712c57eb371aed74561eea257b32f299da023db303acb61b"}
Sep 30 07:51:14 crc kubenswrapper[4760]: I0930 07:51:14.063394 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/watcher-api-0"
Sep 30 07:51:14 crc kubenswrapper[4760]: I0930 07:51:14.246513 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-844b758db4-hzncj"
Sep 30 07:51:14 crc kubenswrapper[4760]: I0930 07:51:14.266282 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-844b758db4-hzncj"
Sep 30 07:51:14 crc kubenswrapper[4760]: I0930 07:51:14.327657 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-65c7cb7cc8-cjqdk"
Sep 30 07:51:14 crc kubenswrapper[4760]: I0930 07:51:14.416359 4760 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-7c45496686-rqmqg"]
Sep 30 07:51:14 crc kubenswrapper[4760]: I0930 07:51:14.416613 4760 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-7c45496686-rqmqg" podUID="8adf3e3f-bdc8-4a02-90da-71e21926d9f9" containerName="barbican-api-log" containerID="cri-o://96bb7a19fcd2050c43a88fbf461967d21c7eb055f9facdafc53015b05b558e94" gracePeriod=30
Sep 30 07:51:14 crc kubenswrapper[4760]: I0930 07:51:14.416975 4760 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-7c45496686-rqmqg" podUID="8adf3e3f-bdc8-4a02-90da-71e21926d9f9" containerName="barbican-api" containerID="cri-o://3876410d63cccf30eeaa1015e4141ae3a67152401d345ab60b426a37063339e5" gracePeriod=30
Sep 30 07:51:14 crc kubenswrapper[4760]: I0930 07:51:14.438533 4760 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-7c45496686-rqmqg" podUID="8adf3e3f-bdc8-4a02-90da-71e21926d9f9" containerName="barbican-api-log" probeResult="failure" output="Get \"http://10.217.0.173:9311/healthcheck\": EOF"
Sep 30 07:51:14 crc kubenswrapper[4760]: I0930 07:51:14.438770 4760 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/barbican-api-7c45496686-rqmqg" podUID="8adf3e3f-bdc8-4a02-90da-71e21926d9f9" containerName="barbican-api" probeResult="failure" output="Get \"http://10.217.0.173:9311/healthcheck\": EOF"
Sep 30 07:51:14 crc kubenswrapper[4760]: I0930 07:51:14.438848 4760 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/barbican-api-7c45496686-rqmqg" podUID="8adf3e3f-bdc8-4a02-90da-71e21926d9f9" containerName="barbican-api-log" probeResult="failure" output="Get \"http://10.217.0.173:9311/healthcheck\": EOF"
Sep 30 07:51:14 crc kubenswrapper[4760]: I0930 07:51:14.438922 4760 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-7c45496686-rqmqg" podUID="8adf3e3f-bdc8-4a02-90da-71e21926d9f9" containerName="barbican-api" probeResult="failure" output="Get \"http://10.217.0.173:9311/healthcheck\": EOF"
Sep 30 07:51:14 crc kubenswrapper[4760]: I0930 07:51:14.613534 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-688c87cc99-2v8mj"
Sep 30 07:51:14 crc kubenswrapper[4760]: I0930 07:51:14.669096 4760 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-57c957c4ff-fl2rf"]
Sep 30 07:51:14 crc kubenswrapper[4760]: I0930 07:51:14.669349 4760 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-57c957c4ff-fl2rf" podUID="c8a2a11e-bb27-4ee4-8ff9-91a043e33f1c" containerName="dnsmasq-dns" containerID="cri-o://b186dab5b4052d583a6463fe37374ed0f3bd809879e604685545e4398cd76d19" gracePeriod=10
Sep 30 07:51:14 crc kubenswrapper[4760]: I0930 07:51:14.717720 4760 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-694db87c64-qrwhp" podUID="e28d9015-ea18-4da0-bfd8-c2cc5dec4f37" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.157:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.157:8443: connect: connection refused"
Sep 30 07:51:14 crc kubenswrapper[4760]: I0930 07:51:14.749755 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/keystone-77987c8bb7-t2mw2"
Sep 30 07:51:15 crc kubenswrapper[4760]: I0930 07:51:15.068869 4760 generic.go:334] "Generic (PLEG): container finished" podID="c8a2a11e-bb27-4ee4-8ff9-91a043e33f1c" containerID="b186dab5b4052d583a6463fe37374ed0f3bd809879e604685545e4398cd76d19" exitCode=0
Sep 30 07:51:15 crc kubenswrapper[4760]: I0930 07:51:15.073356 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57c957c4ff-fl2rf" event={"ID":"c8a2a11e-bb27-4ee4-8ff9-91a043e33f1c","Type":"ContainerDied","Data":"b186dab5b4052d583a6463fe37374ed0f3bd809879e604685545e4398cd76d19"}
Sep 30 07:51:15 crc kubenswrapper[4760]: I0930 07:51:15.090534 4760 generic.go:334] "Generic (PLEG): container finished" podID="8adf3e3f-bdc8-4a02-90da-71e21926d9f9" containerID="96bb7a19fcd2050c43a88fbf461967d21c7eb055f9facdafc53015b05b558e94" exitCode=143
Sep 30 07:51:15 crc kubenswrapper[4760]: I0930 07:51:15.109555 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-7c45496686-rqmqg" event={"ID":"8adf3e3f-bdc8-4a02-90da-71e21926d9f9","Type":"ContainerDied","Data":"96bb7a19fcd2050c43a88fbf461967d21c7eb055f9facdafc53015b05b558e94"}
Sep 30 07:51:16 crc kubenswrapper[4760]: I0930 07:51:16.709189 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0"
Sep 30 07:51:16 crc kubenswrapper[4760]: I0930 07:51:16.709594 4760 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0"
Sep 30 07:51:16 crc kubenswrapper[4760]: I0930 07:51:16.709611 4760 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0"
Sep 30 07:51:16 crc kubenswrapper[4760]: I0930 07:51:16.709624 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0"
Sep 30 07:51:16 crc kubenswrapper[4760]: I0930 07:51:16.751164 4760 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0"
Sep 30 07:51:16 crc kubenswrapper[4760]: I0930 07:51:16.778469 4760 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0"
Sep 30 07:51:18 crc kubenswrapper[4760]: I0930 07:51:18.718251 4760 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57c957c4ff-fl2rf"
Sep 30 07:51:18 crc kubenswrapper[4760]: I0930 07:51:18.736711 4760 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0"
Sep 30 07:51:18 crc kubenswrapper[4760]: I0930 07:51:18.736766 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0"
Sep 30 07:51:18 crc kubenswrapper[4760]: I0930 07:51:18.736778 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0"
Sep 30 07:51:18 crc kubenswrapper[4760]: I0930 07:51:18.736793 4760 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0"
Sep 30 07:51:18 crc kubenswrapper[4760]: I0930 07:51:18.787977 4760 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0"
Sep 30 07:51:18 crc kubenswrapper[4760]: I0930 07:51:18.796802 4760 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0"
Sep 30 07:51:18 crc kubenswrapper[4760]: I0930 07:51:18.831410 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c8a2a11e-bb27-4ee4-8ff9-91a043e33f1c-ovsdbserver-nb\") pod \"c8a2a11e-bb27-4ee4-8ff9-91a043e33f1c\" (UID: \"c8a2a11e-bb27-4ee4-8ff9-91a043e33f1c\") "
Sep 30 07:51:18 crc kubenswrapper[4760]: I0930 07:51:18.831498 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hn6w7\" (UniqueName: \"kubernetes.io/projected/c8a2a11e-bb27-4ee4-8ff9-91a043e33f1c-kube-api-access-hn6w7\") pod \"c8a2a11e-bb27-4ee4-8ff9-91a043e33f1c\" (UID: \"c8a2a11e-bb27-4ee4-8ff9-91a043e33f1c\") "
Sep 30 07:51:18 crc kubenswrapper[4760]: I0930 07:51:18.831612 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/c8a2a11e-bb27-4ee4-8ff9-91a043e33f1c-dns-swift-storage-0\") pod \"c8a2a11e-bb27-4ee4-8ff9-91a043e33f1c\" (UID: \"c8a2a11e-bb27-4ee4-8ff9-91a043e33f1c\") "
Sep 30 07:51:18 crc kubenswrapper[4760]: I0930 07:51:18.831635 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c8a2a11e-bb27-4ee4-8ff9-91a043e33f1c-ovsdbserver-sb\") pod \"c8a2a11e-bb27-4ee4-8ff9-91a043e33f1c\" (UID: \"c8a2a11e-bb27-4ee4-8ff9-91a043e33f1c\") "
Sep 30 07:51:18 crc kubenswrapper[4760]: I0930 07:51:18.831711 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c8a2a11e-bb27-4ee4-8ff9-91a043e33f1c-config\") pod \"c8a2a11e-bb27-4ee4-8ff9-91a043e33f1c\" (UID: \"c8a2a11e-bb27-4ee4-8ff9-91a043e33f1c\") "
Sep 30 07:51:18 crc kubenswrapper[4760]: I0930 07:51:18.831759 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c8a2a11e-bb27-4ee4-8ff9-91a043e33f1c-dns-svc\") pod \"c8a2a11e-bb27-4ee4-8ff9-91a043e33f1c\" (UID: \"c8a2a11e-bb27-4ee4-8ff9-91a043e33f1c\") "
Sep 30 07:51:18 crc kubenswrapper[4760]: I0930 07:51:18.839533 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c8a2a11e-bb27-4ee4-8ff9-91a043e33f1c-kube-api-access-hn6w7" (OuterVolumeSpecName: "kube-api-access-hn6w7") pod "c8a2a11e-bb27-4ee4-8ff9-91a043e33f1c" (UID: "c8a2a11e-bb27-4ee4-8ff9-91a043e33f1c"). InnerVolumeSpecName "kube-api-access-hn6w7". PluginName "kubernetes.io/projected", VolumeGidValue ""
Sep 30 07:51:18 crc kubenswrapper[4760]: I0930 07:51:18.892653 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c8a2a11e-bb27-4ee4-8ff9-91a043e33f1c-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "c8a2a11e-bb27-4ee4-8ff9-91a043e33f1c" (UID: "c8a2a11e-bb27-4ee4-8ff9-91a043e33f1c"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Sep 30 07:51:18 crc kubenswrapper[4760]: I0930 07:51:18.897899 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c8a2a11e-bb27-4ee4-8ff9-91a043e33f1c-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "c8a2a11e-bb27-4ee4-8ff9-91a043e33f1c" (UID: "c8a2a11e-bb27-4ee4-8ff9-91a043e33f1c"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Sep 30 07:51:18 crc kubenswrapper[4760]: I0930 07:51:18.907970 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c8a2a11e-bb27-4ee4-8ff9-91a043e33f1c-config" (OuterVolumeSpecName: "config") pod "c8a2a11e-bb27-4ee4-8ff9-91a043e33f1c" (UID: "c8a2a11e-bb27-4ee4-8ff9-91a043e33f1c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Sep 30 07:51:18 crc kubenswrapper[4760]: I0930 07:51:18.920867 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c8a2a11e-bb27-4ee4-8ff9-91a043e33f1c-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "c8a2a11e-bb27-4ee4-8ff9-91a043e33f1c" (UID: "c8a2a11e-bb27-4ee4-8ff9-91a043e33f1c"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Sep 30 07:51:18 crc kubenswrapper[4760]: I0930 07:51:18.930793 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c8a2a11e-bb27-4ee4-8ff9-91a043e33f1c-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "c8a2a11e-bb27-4ee4-8ff9-91a043e33f1c" (UID: "c8a2a11e-bb27-4ee4-8ff9-91a043e33f1c"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Sep 30 07:51:18 crc kubenswrapper[4760]: I0930 07:51:18.933779 4760 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c8a2a11e-bb27-4ee4-8ff9-91a043e33f1c-config\") on node \"crc\" DevicePath \"\""
Sep 30 07:51:18 crc kubenswrapper[4760]: I0930 07:51:18.933809 4760 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c8a2a11e-bb27-4ee4-8ff9-91a043e33f1c-dns-svc\") on node \"crc\" DevicePath \"\""
Sep 30 07:51:18 crc kubenswrapper[4760]: I0930 07:51:18.933824 4760 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c8a2a11e-bb27-4ee4-8ff9-91a043e33f1c-ovsdbserver-nb\") on node \"crc\" DevicePath \"\""
Sep 30 07:51:18 crc kubenswrapper[4760]: I0930 07:51:18.933840 4760 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hn6w7\" (UniqueName: \"kubernetes.io/projected/c8a2a11e-bb27-4ee4-8ff9-91a043e33f1c-kube-api-access-hn6w7\") on node \"crc\" DevicePath \"\""
Sep 30 07:51:18 crc kubenswrapper[4760]: I0930 07:51:18.933853 4760 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/c8a2a11e-bb27-4ee4-8ff9-91a043e33f1c-dns-swift-storage-0\") on node \"crc\" DevicePath \"\""
Sep 30 07:51:18 crc kubenswrapper[4760]: I0930 07:51:18.933864 4760 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c8a2a11e-bb27-4ee4-8ff9-91a043e33f1c-ovsdbserver-sb\") on node \"crc\" DevicePath \"\""
Sep 30 07:51:19 crc kubenswrapper[4760]: I0930 07:51:19.112927 4760 patch_prober.go:28] interesting pod/machine-config-daemon-f2lrk container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Sep 30 07:51:19 crc kubenswrapper[4760]: I0930 07:51:19.112980 4760 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-f2lrk" podUID="7a9c8270-6964-4886-87d0-227b05b76da4" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Sep 30 07:51:19 crc kubenswrapper[4760]: I0930 07:51:19.132424 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57c957c4ff-fl2rf" event={"ID":"c8a2a11e-bb27-4ee4-8ff9-91a043e33f1c","Type":"ContainerDied","Data":"8cdc5ee661f4929bf9f0042161435f57608e5857963979360396e8870bd9a264"}
Sep 30 07:51:19 crc kubenswrapper[4760]: I0930 07:51:19.132751 4760 scope.go:117] "RemoveContainer" containerID="b186dab5b4052d583a6463fe37374ed0f3bd809879e604685545e4398cd76d19"
Sep 30 07:51:19 crc kubenswrapper[4760]: I0930 07:51:19.132494 4760 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57c957c4ff-fl2rf"
Sep 30 07:51:19 crc kubenswrapper[4760]: I0930 07:51:19.159942 4760 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-57c957c4ff-fl2rf"]
Sep 30 07:51:19 crc kubenswrapper[4760]: I0930 07:51:19.168022 4760 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-57c957c4ff-fl2rf"]
Sep 30 07:51:19 crc kubenswrapper[4760]: I0930 07:51:19.294568 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"]
Sep 30 07:51:19 crc kubenswrapper[4760]: E0930 07:51:19.295438 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c8a2a11e-bb27-4ee4-8ff9-91a043e33f1c" containerName="init"
Sep 30 07:51:19 crc kubenswrapper[4760]: I0930 07:51:19.295467 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="c8a2a11e-bb27-4ee4-8ff9-91a043e33f1c" containerName="init"
Sep 30 07:51:19 crc kubenswrapper[4760]: E0930 07:51:19.295486 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c8a2a11e-bb27-4ee4-8ff9-91a043e33f1c" containerName="dnsmasq-dns"
Sep 30 07:51:19 crc kubenswrapper[4760]: I0930 07:51:19.295494 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="c8a2a11e-bb27-4ee4-8ff9-91a043e33f1c" containerName="dnsmasq-dns"
Sep 30 07:51:19 crc kubenswrapper[4760]: E0930 07:51:19.295531 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ac8c51b1-6f4a-4674-bf9c-dd6fb12dbf60" containerName="init"
Sep 30 07:51:19 crc kubenswrapper[4760]: I0930 07:51:19.295540 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="ac8c51b1-6f4a-4674-bf9c-dd6fb12dbf60" containerName="init"
Sep 30 07:51:19 crc kubenswrapper[4760]: I0930 07:51:19.296020 4760 memory_manager.go:354] "RemoveStaleState removing state" podUID="ac8c51b1-6f4a-4674-bf9c-dd6fb12dbf60" containerName="init"
Sep 30 07:51:19 crc kubenswrapper[4760]: I0930 07:51:19.296075 4760 memory_manager.go:354] "RemoveStaleState removing state" podUID="c8a2a11e-bb27-4ee4-8ff9-91a043e33f1c" containerName="dnsmasq-dns"
Sep 30 07:51:19 crc kubenswrapper[4760]: I0930 07:51:19.312913 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"]
Sep 30 07:51:19 crc kubenswrapper[4760]: I0930 07:51:19.313019 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient"
Sep 30 07:51:19 crc kubenswrapper[4760]: I0930 07:51:19.316204 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-config-secret"
Sep 30 07:51:19 crc kubenswrapper[4760]: I0930 07:51:19.317675 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstackclient-openstackclient-dockercfg-spfwx"
Sep 30 07:51:19 crc kubenswrapper[4760]: I0930 07:51:19.318407 4760 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config"
Sep 30 07:51:19 crc kubenswrapper[4760]: I0930 07:51:19.402898 4760 scope.go:117] "RemoveContainer" containerID="887bc79c81f53af4078394c397831514beae81abf6e2314309f726c5d5db7a4a"
Sep 30 07:51:19 crc kubenswrapper[4760]: I0930 07:51:19.444056 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9mz4g\" (UniqueName: \"kubernetes.io/projected/1c5867a3-c734-489e-a6b3-edb023949556-kube-api-access-9mz4g\") pod \"openstackclient\" (UID: \"1c5867a3-c734-489e-a6b3-edb023949556\") " pod="openstack/openstackclient"
Sep 30 07:51:19 crc kubenswrapper[4760]: I0930 07:51:19.444132 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1c5867a3-c734-489e-a6b3-edb023949556-combined-ca-bundle\") pod \"openstackclient\" (UID: \"1c5867a3-c734-489e-a6b3-edb023949556\") " pod="openstack/openstackclient"
Sep 30 07:51:19 crc kubenswrapper[4760]: I0930 07:51:19.444164 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/1c5867a3-c734-489e-a6b3-edb023949556-openstack-config-secret\") pod \"openstackclient\" (UID: \"1c5867a3-c734-489e-a6b3-edb023949556\") " pod="openstack/openstackclient"
Sep 30 07:51:19 crc kubenswrapper[4760]: I0930 07:51:19.444222 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/1c5867a3-c734-489e-a6b3-edb023949556-openstack-config\") pod \"openstackclient\" (UID: \"1c5867a3-c734-489e-a6b3-edb023949556\") " pod="openstack/openstackclient"
Sep 30 07:51:19 crc kubenswrapper[4760]: I0930 07:51:19.545360 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9mz4g\" (UniqueName: \"kubernetes.io/projected/1c5867a3-c734-489e-a6b3-edb023949556-kube-api-access-9mz4g\") pod \"openstackclient\" (UID: \"1c5867a3-c734-489e-a6b3-edb023949556\") " pod="openstack/openstackclient"
Sep 30 07:51:19 crc kubenswrapper[4760]: I0930 07:51:19.545656 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1c5867a3-c734-489e-a6b3-edb023949556-combined-ca-bundle\") pod \"openstackclient\" (UID: \"1c5867a3-c734-489e-a6b3-edb023949556\") " pod="openstack/openstackclient"
Sep 30 07:51:19 crc kubenswrapper[4760]: I0930 07:51:19.545690 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/1c5867a3-c734-489e-a6b3-edb023949556-openstack-config-secret\") pod \"openstackclient\" (UID: \"1c5867a3-c734-489e-a6b3-edb023949556\") " pod="openstack/openstackclient"
Sep 30 07:51:19 crc kubenswrapper[4760]: I0930 07:51:19.545731 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/1c5867a3-c734-489e-a6b3-edb023949556-openstack-config\") pod \"openstackclient\" (UID: \"1c5867a3-c734-489e-a6b3-edb023949556\") " pod="openstack/openstackclient"
Sep 30 07:51:19 crc kubenswrapper[4760]: I0930 07:51:19.548523 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/1c5867a3-c734-489e-a6b3-edb023949556-openstack-config\") pod \"openstackclient\" (UID: \"1c5867a3-c734-489e-a6b3-edb023949556\") " pod="openstack/openstackclient"
Sep 30 07:51:19 crc kubenswrapper[4760]: I0930 07:51:19.552722 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/1c5867a3-c734-489e-a6b3-edb023949556-openstack-config-secret\") pod \"openstackclient\" (UID: \"1c5867a3-c734-489e-a6b3-edb023949556\") " pod="openstack/openstackclient"
Sep 30 07:51:19 crc kubenswrapper[4760]: I0930 07:51:19.553712 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1c5867a3-c734-489e-a6b3-edb023949556-combined-ca-bundle\") pod \"openstackclient\" (UID: \"1c5867a3-c734-489e-a6b3-edb023949556\") " pod="openstack/openstackclient"
Sep 30 07:51:19 crc kubenswrapper[4760]: I0930 07:51:19.565756 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9mz4g\" (UniqueName: \"kubernetes.io/projected/1c5867a3-c734-489e-a6b3-edb023949556-kube-api-access-9mz4g\") pod \"openstackclient\" (UID: \"1c5867a3-c734-489e-a6b3-edb023949556\") " pod="openstack/openstackclient"
Sep 30 07:51:19 crc kubenswrapper[4760]: I0930 07:51:19.635938 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient"
Sep 30 07:51:19 crc kubenswrapper[4760]: E0930 07:51:19.701098 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ceilometer-central-agent\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/ceilometer-0" podUID="7a555f66-7027-4fac-afcc-db7b3f5ae034"
Sep 30 07:51:19 crc kubenswrapper[4760]: I0930 07:51:19.928326 4760 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-7c45496686-rqmqg" podUID="8adf3e3f-bdc8-4a02-90da-71e21926d9f9" containerName="barbican-api-log" probeResult="failure" output="Get \"http://10.217.0.173:9311/healthcheck\": read tcp 10.217.0.2:48518->10.217.0.173:9311: read: connection reset by peer"
Sep 30 07:51:19 crc kubenswrapper[4760]: I0930 07:51:19.928359 4760 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-7c45496686-rqmqg" podUID="8adf3e3f-bdc8-4a02-90da-71e21926d9f9" containerName="barbican-api" probeResult="failure" output="Get \"http://10.217.0.173:9311/healthcheck\": read tcp 10.217.0.2:48516->10.217.0.173:9311: read: connection reset by peer"
Sep 30 07:51:20 crc kubenswrapper[4760]: I0930 07:51:20.136295 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"]
Sep 30 07:51:20 crc kubenswrapper[4760]: I0930 07:51:20.153247 4760 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="7a555f66-7027-4fac-afcc-db7b3f5ae034" containerName="ceilometer-notification-agent" containerID="cri-o://d4afbc82bd27bf72ecee192b66554d47b2560bf5bd028bf7135ba8e44fb51690" gracePeriod=30
Sep 30 07:51:20 crc kubenswrapper[4760]: I0930 07:51:20.153513 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7a555f66-7027-4fac-afcc-db7b3f5ae034","Type":"ContainerStarted","Data":"20ed58b97ce05b8e03f3a565b7ba2a7bc2561cbe31c4d49a609cf552b977b568"}
Sep 30 07:51:20 crc kubenswrapper[4760]: I0930 07:51:20.153558 4760 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="7a555f66-7027-4fac-afcc-db7b3f5ae034" containerName="proxy-httpd" containerID="cri-o://20ed58b97ce05b8e03f3a565b7ba2a7bc2561cbe31c4d49a609cf552b977b568" gracePeriod=30
Sep 30 07:51:20 crc kubenswrapper[4760]: I0930 07:51:20.153613 4760 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="7a555f66-7027-4fac-afcc-db7b3f5ae034" containerName="sg-core" containerID="cri-o://9033f067ba6698c37fd74b64f06a530dad8c5baae6004962025e2030248511d8" gracePeriod=30
Sep 30 07:51:20 crc kubenswrapper[4760]: I0930 07:51:20.153618 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0"
Sep 30 07:51:20 crc kubenswrapper[4760]: I0930 07:51:20.191956 4760 generic.go:334] "Generic (PLEG): container finished" podID="8adf3e3f-bdc8-4a02-90da-71e21926d9f9" containerID="3876410d63cccf30eeaa1015e4141ae3a67152401d345ab60b426a37063339e5" exitCode=0
Sep 30 07:51:20 crc kubenswrapper[4760]: I0930 07:51:20.192845 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-7c45496686-rqmqg" event={"ID":"8adf3e3f-bdc8-4a02-90da-71e21926d9f9","Type":"ContainerDied","Data":"3876410d63cccf30eeaa1015e4141ae3a67152401d345ab60b426a37063339e5"}
Sep 30 07:51:20 crc kubenswrapper[4760]: I0930 07:51:20.389154 4760 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-7c45496686-rqmqg"
Sep 30 07:51:20 crc kubenswrapper[4760]: I0930 07:51:20.464928 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8adf3e3f-bdc8-4a02-90da-71e21926d9f9-combined-ca-bundle\") pod \"8adf3e3f-bdc8-4a02-90da-71e21926d9f9\" (UID: \"8adf3e3f-bdc8-4a02-90da-71e21926d9f9\") "
Sep 30 07:51:20 crc kubenswrapper[4760]: I0930 07:51:20.465041 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8adf3e3f-bdc8-4a02-90da-71e21926d9f9-logs\") pod \"8adf3e3f-bdc8-4a02-90da-71e21926d9f9\" (UID: \"8adf3e3f-bdc8-4a02-90da-71e21926d9f9\") "
Sep 30 07:51:20 crc kubenswrapper[4760]: I0930 07:51:20.465142 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8adf3e3f-bdc8-4a02-90da-71e21926d9f9-config-data\") pod \"8adf3e3f-bdc8-4a02-90da-71e21926d9f9\" (UID: \"8adf3e3f-bdc8-4a02-90da-71e21926d9f9\") "
Sep 30 07:51:20 crc kubenswrapper[4760]: I0930 07:51:20.465169 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t9rcs\" (UniqueName: \"kubernetes.io/projected/8adf3e3f-bdc8-4a02-90da-71e21926d9f9-kube-api-access-t9rcs\") pod \"8adf3e3f-bdc8-4a02-90da-71e21926d9f9\" (UID: \"8adf3e3f-bdc8-4a02-90da-71e21926d9f9\") "
Sep 30 07:51:20 crc kubenswrapper[4760]: I0930 07:51:20.465203 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8adf3e3f-bdc8-4a02-90da-71e21926d9f9-config-data-custom\") pod \"8adf3e3f-bdc8-4a02-90da-71e21926d9f9\" (UID: \"8adf3e3f-bdc8-4a02-90da-71e21926d9f9\") "
Sep 30 07:51:20 crc kubenswrapper[4760]: I0930 07:51:20.466133 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8adf3e3f-bdc8-4a02-90da-71e21926d9f9-logs" (OuterVolumeSpecName: "logs") pod "8adf3e3f-bdc8-4a02-90da-71e21926d9f9" (UID: "8adf3e3f-bdc8-4a02-90da-71e21926d9f9"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Sep 30 07:51:20 crc kubenswrapper[4760]: I0930 07:51:20.470825 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8adf3e3f-bdc8-4a02-90da-71e21926d9f9-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "8adf3e3f-bdc8-4a02-90da-71e21926d9f9" (UID: "8adf3e3f-bdc8-4a02-90da-71e21926d9f9"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue ""
Sep 30 07:51:20 crc kubenswrapper[4760]: I0930 07:51:20.481071 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8adf3e3f-bdc8-4a02-90da-71e21926d9f9-kube-api-access-t9rcs" (OuterVolumeSpecName: "kube-api-access-t9rcs") pod "8adf3e3f-bdc8-4a02-90da-71e21926d9f9" (UID: "8adf3e3f-bdc8-4a02-90da-71e21926d9f9"). InnerVolumeSpecName "kube-api-access-t9rcs". PluginName "kubernetes.io/projected", VolumeGidValue ""
Sep 30 07:51:20 crc kubenswrapper[4760]: I0930 07:51:20.497244 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8adf3e3f-bdc8-4a02-90da-71e21926d9f9-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8adf3e3f-bdc8-4a02-90da-71e21926d9f9" (UID: "8adf3e3f-bdc8-4a02-90da-71e21926d9f9"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Sep 30 07:51:20 crc kubenswrapper[4760]: I0930 07:51:20.536288 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0"
Sep 30 07:51:20 crc kubenswrapper[4760]: I0930 07:51:20.536552 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8adf3e3f-bdc8-4a02-90da-71e21926d9f9-config-data" (OuterVolumeSpecName: "config-data") pod "8adf3e3f-bdc8-4a02-90da-71e21926d9f9" (UID: "8adf3e3f-bdc8-4a02-90da-71e21926d9f9"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Sep 30 07:51:20 crc kubenswrapper[4760]: I0930 07:51:20.536714 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0"
Sep 30 07:51:20 crc kubenswrapper[4760]: I0930 07:51:20.567853 4760 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t9rcs\" (UniqueName: \"kubernetes.io/projected/8adf3e3f-bdc8-4a02-90da-71e21926d9f9-kube-api-access-t9rcs\") on node \"crc\" DevicePath \"\""
Sep 30 07:51:20 crc kubenswrapper[4760]: I0930 07:51:20.567880 4760 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8adf3e3f-bdc8-4a02-90da-71e21926d9f9-config-data-custom\") on node \"crc\" DevicePath \"\""
Sep 30 07:51:20 crc kubenswrapper[4760]: I0930 07:51:20.567890 4760 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8adf3e3f-bdc8-4a02-90da-71e21926d9f9-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Sep 30 07:51:20 crc kubenswrapper[4760]: I0930 07:51:20.567903 4760 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8adf3e3f-bdc8-4a02-90da-71e21926d9f9-logs\") on node \"crc\" DevicePath \"\""
Sep 30 07:51:20 crc kubenswrapper[4760]: I0930 07:51:20.567914 4760 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8adf3e3f-bdc8-4a02-90da-71e21926d9f9-config-data\") on node \"crc\" DevicePath \"\""
Sep 30 07:51:21 crc kubenswrapper[4760]: I0930 07:51:21.079098 4760 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c8a2a11e-bb27-4ee4-8ff9-91a043e33f1c" path="/var/lib/kubelet/pods/c8a2a11e-bb27-4ee4-8ff9-91a043e33f1c/volumes"
Sep 30 07:51:21 crc kubenswrapper[4760]: I0930 07:51:21.213556 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-7c45496686-rqmqg" event={"ID":"8adf3e3f-bdc8-4a02-90da-71e21926d9f9","Type":"ContainerDied","Data":"62054de4f756785b3fa6792d180b878046f9c8309d18c2b080580160cedfe606"}
Sep 30 07:51:21 crc kubenswrapper[4760]: I0930 07:51:21.213583 4760 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-7c45496686-rqmqg"
Sep 30 07:51:21 crc kubenswrapper[4760]: I0930 07:51:21.213788 4760 scope.go:117] "RemoveContainer" containerID="3876410d63cccf30eeaa1015e4141ae3a67152401d345ab60b426a37063339e5"
Sep 30 07:51:21 crc kubenswrapper[4760]: I0930 07:51:21.215790 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-9ml2s" event={"ID":"4a2c7f25-ec97-4cb0-8475-b7b8d4be47d0","Type":"ContainerStarted","Data":"da79e51b47f98778727a550fc3e1167b95d280a9cb76165601bb82b5c73cf3f9"}
Sep 30 07:51:21 crc kubenswrapper[4760]: I0930 07:51:21.222428 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"1c5867a3-c734-489e-a6b3-edb023949556","Type":"ContainerStarted","Data":"5e5b211962af2c705c3af3546e472c7d259a1f9d8d04006d9bd02ca4d64cdd79"}
Sep 30 07:51:21 crc kubenswrapper[4760]: I0930 07:51:21.240527 4760 generic.go:334] "Generic (PLEG): container finished" podID="7a555f66-7027-4fac-afcc-db7b3f5ae034" containerID="20ed58b97ce05b8e03f3a565b7ba2a7bc2561cbe31c4d49a609cf552b977b568" exitCode=0
Sep 30
07:51:21 crc kubenswrapper[4760]: I0930 07:51:21.240562 4760 generic.go:334] "Generic (PLEG): container finished" podID="7a555f66-7027-4fac-afcc-db7b3f5ae034" containerID="9033f067ba6698c37fd74b64f06a530dad8c5baae6004962025e2030248511d8" exitCode=2 Sep 30 07:51:21 crc kubenswrapper[4760]: I0930 07:51:21.240581 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7a555f66-7027-4fac-afcc-db7b3f5ae034","Type":"ContainerDied","Data":"20ed58b97ce05b8e03f3a565b7ba2a7bc2561cbe31c4d49a609cf552b977b568"} Sep 30 07:51:21 crc kubenswrapper[4760]: I0930 07:51:21.240606 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7a555f66-7027-4fac-afcc-db7b3f5ae034","Type":"ContainerDied","Data":"9033f067ba6698c37fd74b64f06a530dad8c5baae6004962025e2030248511d8"} Sep 30 07:51:21 crc kubenswrapper[4760]: I0930 07:51:21.245293 4760 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-db-sync-9ml2s" podStartSLOduration=8.318665634 podStartE2EDuration="54.245277159s" podCreationTimestamp="2025-09-30 07:50:27 +0000 UTC" firstStartedPulling="2025-09-30 07:50:33.541154717 +0000 UTC m=+1019.184061119" lastFinishedPulling="2025-09-30 07:51:19.467766242 +0000 UTC m=+1065.110672644" observedRunningTime="2025-09-30 07:51:21.235091919 +0000 UTC m=+1066.877998321" watchObservedRunningTime="2025-09-30 07:51:21.245277159 +0000 UTC m=+1066.888183571" Sep 30 07:51:21 crc kubenswrapper[4760]: I0930 07:51:21.266127 4760 scope.go:117] "RemoveContainer" containerID="96bb7a19fcd2050c43a88fbf461967d21c7eb055f9facdafc53015b05b558e94" Sep 30 07:51:21 crc kubenswrapper[4760]: I0930 07:51:21.280022 4760 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-7c45496686-rqmqg"] Sep 30 07:51:21 crc kubenswrapper[4760]: I0930 07:51:21.290687 4760 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-api-7c45496686-rqmqg"] Sep 30 07:51:21 crc 
kubenswrapper[4760]: I0930 07:51:21.402955 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Sep 30 07:51:21 crc kubenswrapper[4760]: I0930 07:51:21.403042 4760 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Sep 30 07:51:21 crc kubenswrapper[4760]: I0930 07:51:21.462400 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Sep 30 07:51:23 crc kubenswrapper[4760]: I0930 07:51:23.024223 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-proxy-6cc97c56c5-7pkjn"] Sep 30 07:51:23 crc kubenswrapper[4760]: E0930 07:51:23.025126 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8adf3e3f-bdc8-4a02-90da-71e21926d9f9" containerName="barbican-api-log" Sep 30 07:51:23 crc kubenswrapper[4760]: I0930 07:51:23.025140 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="8adf3e3f-bdc8-4a02-90da-71e21926d9f9" containerName="barbican-api-log" Sep 30 07:51:23 crc kubenswrapper[4760]: E0930 07:51:23.025177 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8adf3e3f-bdc8-4a02-90da-71e21926d9f9" containerName="barbican-api" Sep 30 07:51:23 crc kubenswrapper[4760]: I0930 07:51:23.025182 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="8adf3e3f-bdc8-4a02-90da-71e21926d9f9" containerName="barbican-api" Sep 30 07:51:23 crc kubenswrapper[4760]: I0930 07:51:23.025360 4760 memory_manager.go:354] "RemoveStaleState removing state" podUID="8adf3e3f-bdc8-4a02-90da-71e21926d9f9" containerName="barbican-api" Sep 30 07:51:23 crc kubenswrapper[4760]: I0930 07:51:23.025393 4760 memory_manager.go:354] "RemoveStaleState removing state" podUID="8adf3e3f-bdc8-4a02-90da-71e21926d9f9" containerName="barbican-api-log" Sep 30 07:51:23 crc kubenswrapper[4760]: I0930 07:51:23.026395 4760 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-proxy-6cc97c56c5-7pkjn" Sep 30 07:51:23 crc kubenswrapper[4760]: I0930 07:51:23.028007 4760 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Sep 30 07:51:23 crc kubenswrapper[4760]: I0930 07:51:23.028812 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-public-svc" Sep 30 07:51:23 crc kubenswrapper[4760]: I0930 07:51:23.028986 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data" Sep 30 07:51:23 crc kubenswrapper[4760]: I0930 07:51:23.029137 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-internal-svc" Sep 30 07:51:23 crc kubenswrapper[4760]: I0930 07:51:23.049631 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-6cc97c56c5-7pkjn"] Sep 30 07:51:23 crc kubenswrapper[4760]: I0930 07:51:23.094836 4760 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8adf3e3f-bdc8-4a02-90da-71e21926d9f9" path="/var/lib/kubelet/pods/8adf3e3f-bdc8-4a02-90da-71e21926d9f9/volumes" Sep 30 07:51:23 crc kubenswrapper[4760]: I0930 07:51:23.116989 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7a555f66-7027-4fac-afcc-db7b3f5ae034-combined-ca-bundle\") pod \"7a555f66-7027-4fac-afcc-db7b3f5ae034\" (UID: \"7a555f66-7027-4fac-afcc-db7b3f5ae034\") " Sep 30 07:51:23 crc kubenswrapper[4760]: I0930 07:51:23.117192 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7a555f66-7027-4fac-afcc-db7b3f5ae034-scripts\") pod \"7a555f66-7027-4fac-afcc-db7b3f5ae034\" (UID: \"7a555f66-7027-4fac-afcc-db7b3f5ae034\") " Sep 30 07:51:23 crc kubenswrapper[4760]: I0930 07:51:23.117313 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"kube-api-access-ggb9w\" (UniqueName: \"kubernetes.io/projected/7a555f66-7027-4fac-afcc-db7b3f5ae034-kube-api-access-ggb9w\") pod \"7a555f66-7027-4fac-afcc-db7b3f5ae034\" (UID: \"7a555f66-7027-4fac-afcc-db7b3f5ae034\") " Sep 30 07:51:23 crc kubenswrapper[4760]: I0930 07:51:23.117397 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7a555f66-7027-4fac-afcc-db7b3f5ae034-config-data\") pod \"7a555f66-7027-4fac-afcc-db7b3f5ae034\" (UID: \"7a555f66-7027-4fac-afcc-db7b3f5ae034\") " Sep 30 07:51:23 crc kubenswrapper[4760]: I0930 07:51:23.117422 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/7a555f66-7027-4fac-afcc-db7b3f5ae034-sg-core-conf-yaml\") pod \"7a555f66-7027-4fac-afcc-db7b3f5ae034\" (UID: \"7a555f66-7027-4fac-afcc-db7b3f5ae034\") " Sep 30 07:51:23 crc kubenswrapper[4760]: I0930 07:51:23.117470 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7a555f66-7027-4fac-afcc-db7b3f5ae034-log-httpd\") pod \"7a555f66-7027-4fac-afcc-db7b3f5ae034\" (UID: \"7a555f66-7027-4fac-afcc-db7b3f5ae034\") " Sep 30 07:51:23 crc kubenswrapper[4760]: I0930 07:51:23.117524 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7a555f66-7027-4fac-afcc-db7b3f5ae034-run-httpd\") pod \"7a555f66-7027-4fac-afcc-db7b3f5ae034\" (UID: \"7a555f66-7027-4fac-afcc-db7b3f5ae034\") " Sep 30 07:51:23 crc kubenswrapper[4760]: I0930 07:51:23.117945 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/fc788440-e748-4b41-bdb6-23a6764062fd-etc-swift\") pod \"swift-proxy-6cc97c56c5-7pkjn\" (UID: \"fc788440-e748-4b41-bdb6-23a6764062fd\") " 
pod="openstack/swift-proxy-6cc97c56c5-7pkjn" Sep 30 07:51:23 crc kubenswrapper[4760]: I0930 07:51:23.118009 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/fc788440-e748-4b41-bdb6-23a6764062fd-public-tls-certs\") pod \"swift-proxy-6cc97c56c5-7pkjn\" (UID: \"fc788440-e748-4b41-bdb6-23a6764062fd\") " pod="openstack/swift-proxy-6cc97c56c5-7pkjn" Sep 30 07:51:23 crc kubenswrapper[4760]: I0930 07:51:23.118105 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/fc788440-e748-4b41-bdb6-23a6764062fd-internal-tls-certs\") pod \"swift-proxy-6cc97c56c5-7pkjn\" (UID: \"fc788440-e748-4b41-bdb6-23a6764062fd\") " pod="openstack/swift-proxy-6cc97c56c5-7pkjn" Sep 30 07:51:23 crc kubenswrapper[4760]: I0930 07:51:23.118139 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fc788440-e748-4b41-bdb6-23a6764062fd-log-httpd\") pod \"swift-proxy-6cc97c56c5-7pkjn\" (UID: \"fc788440-e748-4b41-bdb6-23a6764062fd\") " pod="openstack/swift-proxy-6cc97c56c5-7pkjn" Sep 30 07:51:23 crc kubenswrapper[4760]: I0930 07:51:23.118164 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fc788440-e748-4b41-bdb6-23a6764062fd-config-data\") pod \"swift-proxy-6cc97c56c5-7pkjn\" (UID: \"fc788440-e748-4b41-bdb6-23a6764062fd\") " pod="openstack/swift-proxy-6cc97c56c5-7pkjn" Sep 30 07:51:23 crc kubenswrapper[4760]: I0930 07:51:23.118362 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7a555f66-7027-4fac-afcc-db7b3f5ae034-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "7a555f66-7027-4fac-afcc-db7b3f5ae034" (UID: 
"7a555f66-7027-4fac-afcc-db7b3f5ae034"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 07:51:23 crc kubenswrapper[4760]: I0930 07:51:23.118475 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fc788440-e748-4b41-bdb6-23a6764062fd-run-httpd\") pod \"swift-proxy-6cc97c56c5-7pkjn\" (UID: \"fc788440-e748-4b41-bdb6-23a6764062fd\") " pod="openstack/swift-proxy-6cc97c56c5-7pkjn" Sep 30 07:51:23 crc kubenswrapper[4760]: I0930 07:51:23.118512 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6vd2b\" (UniqueName: \"kubernetes.io/projected/fc788440-e748-4b41-bdb6-23a6764062fd-kube-api-access-6vd2b\") pod \"swift-proxy-6cc97c56c5-7pkjn\" (UID: \"fc788440-e748-4b41-bdb6-23a6764062fd\") " pod="openstack/swift-proxy-6cc97c56c5-7pkjn" Sep 30 07:51:23 crc kubenswrapper[4760]: I0930 07:51:23.118684 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7a555f66-7027-4fac-afcc-db7b3f5ae034-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "7a555f66-7027-4fac-afcc-db7b3f5ae034" (UID: "7a555f66-7027-4fac-afcc-db7b3f5ae034"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 07:51:23 crc kubenswrapper[4760]: I0930 07:51:23.118878 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fc788440-e748-4b41-bdb6-23a6764062fd-combined-ca-bundle\") pod \"swift-proxy-6cc97c56c5-7pkjn\" (UID: \"fc788440-e748-4b41-bdb6-23a6764062fd\") " pod="openstack/swift-proxy-6cc97c56c5-7pkjn" Sep 30 07:51:23 crc kubenswrapper[4760]: I0930 07:51:23.119099 4760 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7a555f66-7027-4fac-afcc-db7b3f5ae034-log-httpd\") on node \"crc\" DevicePath \"\"" Sep 30 07:51:23 crc kubenswrapper[4760]: I0930 07:51:23.119176 4760 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7a555f66-7027-4fac-afcc-db7b3f5ae034-run-httpd\") on node \"crc\" DevicePath \"\"" Sep 30 07:51:23 crc kubenswrapper[4760]: I0930 07:51:23.123004 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7a555f66-7027-4fac-afcc-db7b3f5ae034-kube-api-access-ggb9w" (OuterVolumeSpecName: "kube-api-access-ggb9w") pod "7a555f66-7027-4fac-afcc-db7b3f5ae034" (UID: "7a555f66-7027-4fac-afcc-db7b3f5ae034"). InnerVolumeSpecName "kube-api-access-ggb9w". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 07:51:23 crc kubenswrapper[4760]: I0930 07:51:23.123942 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7a555f66-7027-4fac-afcc-db7b3f5ae034-scripts" (OuterVolumeSpecName: "scripts") pod "7a555f66-7027-4fac-afcc-db7b3f5ae034" (UID: "7a555f66-7027-4fac-afcc-db7b3f5ae034"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 07:51:23 crc kubenswrapper[4760]: I0930 07:51:23.193700 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7a555f66-7027-4fac-afcc-db7b3f5ae034-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "7a555f66-7027-4fac-afcc-db7b3f5ae034" (UID: "7a555f66-7027-4fac-afcc-db7b3f5ae034"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 07:51:23 crc kubenswrapper[4760]: I0930 07:51:23.209474 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7a555f66-7027-4fac-afcc-db7b3f5ae034-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7a555f66-7027-4fac-afcc-db7b3f5ae034" (UID: "7a555f66-7027-4fac-afcc-db7b3f5ae034"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 07:51:23 crc kubenswrapper[4760]: I0930 07:51:23.221057 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fc788440-e748-4b41-bdb6-23a6764062fd-combined-ca-bundle\") pod \"swift-proxy-6cc97c56c5-7pkjn\" (UID: \"fc788440-e748-4b41-bdb6-23a6764062fd\") " pod="openstack/swift-proxy-6cc97c56c5-7pkjn" Sep 30 07:51:23 crc kubenswrapper[4760]: I0930 07:51:23.221115 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/fc788440-e748-4b41-bdb6-23a6764062fd-etc-swift\") pod \"swift-proxy-6cc97c56c5-7pkjn\" (UID: \"fc788440-e748-4b41-bdb6-23a6764062fd\") " pod="openstack/swift-proxy-6cc97c56c5-7pkjn" Sep 30 07:51:23 crc kubenswrapper[4760]: I0930 07:51:23.221140 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/fc788440-e748-4b41-bdb6-23a6764062fd-public-tls-certs\") pod 
\"swift-proxy-6cc97c56c5-7pkjn\" (UID: \"fc788440-e748-4b41-bdb6-23a6764062fd\") " pod="openstack/swift-proxy-6cc97c56c5-7pkjn" Sep 30 07:51:23 crc kubenswrapper[4760]: I0930 07:51:23.221178 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/fc788440-e748-4b41-bdb6-23a6764062fd-internal-tls-certs\") pod \"swift-proxy-6cc97c56c5-7pkjn\" (UID: \"fc788440-e748-4b41-bdb6-23a6764062fd\") " pod="openstack/swift-proxy-6cc97c56c5-7pkjn" Sep 30 07:51:23 crc kubenswrapper[4760]: I0930 07:51:23.221194 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fc788440-e748-4b41-bdb6-23a6764062fd-log-httpd\") pod \"swift-proxy-6cc97c56c5-7pkjn\" (UID: \"fc788440-e748-4b41-bdb6-23a6764062fd\") " pod="openstack/swift-proxy-6cc97c56c5-7pkjn" Sep 30 07:51:23 crc kubenswrapper[4760]: I0930 07:51:23.221210 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fc788440-e748-4b41-bdb6-23a6764062fd-config-data\") pod \"swift-proxy-6cc97c56c5-7pkjn\" (UID: \"fc788440-e748-4b41-bdb6-23a6764062fd\") " pod="openstack/swift-proxy-6cc97c56c5-7pkjn" Sep 30 07:51:23 crc kubenswrapper[4760]: I0930 07:51:23.221292 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6vd2b\" (UniqueName: \"kubernetes.io/projected/fc788440-e748-4b41-bdb6-23a6764062fd-kube-api-access-6vd2b\") pod \"swift-proxy-6cc97c56c5-7pkjn\" (UID: \"fc788440-e748-4b41-bdb6-23a6764062fd\") " pod="openstack/swift-proxy-6cc97c56c5-7pkjn" Sep 30 07:51:23 crc kubenswrapper[4760]: I0930 07:51:23.221364 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fc788440-e748-4b41-bdb6-23a6764062fd-run-httpd\") pod \"swift-proxy-6cc97c56c5-7pkjn\" (UID: 
\"fc788440-e748-4b41-bdb6-23a6764062fd\") " pod="openstack/swift-proxy-6cc97c56c5-7pkjn" Sep 30 07:51:23 crc kubenswrapper[4760]: I0930 07:51:23.221443 4760 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/7a555f66-7027-4fac-afcc-db7b3f5ae034-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Sep 30 07:51:23 crc kubenswrapper[4760]: I0930 07:51:23.221455 4760 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7a555f66-7027-4fac-afcc-db7b3f5ae034-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 07:51:23 crc kubenswrapper[4760]: I0930 07:51:23.221465 4760 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7a555f66-7027-4fac-afcc-db7b3f5ae034-scripts\") on node \"crc\" DevicePath \"\"" Sep 30 07:51:23 crc kubenswrapper[4760]: I0930 07:51:23.221473 4760 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ggb9w\" (UniqueName: \"kubernetes.io/projected/7a555f66-7027-4fac-afcc-db7b3f5ae034-kube-api-access-ggb9w\") on node \"crc\" DevicePath \"\"" Sep 30 07:51:23 crc kubenswrapper[4760]: I0930 07:51:23.221858 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fc788440-e748-4b41-bdb6-23a6764062fd-run-httpd\") pod \"swift-proxy-6cc97c56c5-7pkjn\" (UID: \"fc788440-e748-4b41-bdb6-23a6764062fd\") " pod="openstack/swift-proxy-6cc97c56c5-7pkjn" Sep 30 07:51:23 crc kubenswrapper[4760]: I0930 07:51:23.223562 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fc788440-e748-4b41-bdb6-23a6764062fd-log-httpd\") pod \"swift-proxy-6cc97c56c5-7pkjn\" (UID: \"fc788440-e748-4b41-bdb6-23a6764062fd\") " pod="openstack/swift-proxy-6cc97c56c5-7pkjn" Sep 30 07:51:23 crc kubenswrapper[4760]: I0930 07:51:23.228194 4760 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fc788440-e748-4b41-bdb6-23a6764062fd-combined-ca-bundle\") pod \"swift-proxy-6cc97c56c5-7pkjn\" (UID: \"fc788440-e748-4b41-bdb6-23a6764062fd\") " pod="openstack/swift-proxy-6cc97c56c5-7pkjn" Sep 30 07:51:23 crc kubenswrapper[4760]: I0930 07:51:23.228215 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/fc788440-e748-4b41-bdb6-23a6764062fd-internal-tls-certs\") pod \"swift-proxy-6cc97c56c5-7pkjn\" (UID: \"fc788440-e748-4b41-bdb6-23a6764062fd\") " pod="openstack/swift-proxy-6cc97c56c5-7pkjn" Sep 30 07:51:23 crc kubenswrapper[4760]: I0930 07:51:23.229797 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fc788440-e748-4b41-bdb6-23a6764062fd-config-data\") pod \"swift-proxy-6cc97c56c5-7pkjn\" (UID: \"fc788440-e748-4b41-bdb6-23a6764062fd\") " pod="openstack/swift-proxy-6cc97c56c5-7pkjn" Sep 30 07:51:23 crc kubenswrapper[4760]: I0930 07:51:23.229885 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/fc788440-e748-4b41-bdb6-23a6764062fd-etc-swift\") pod \"swift-proxy-6cc97c56c5-7pkjn\" (UID: \"fc788440-e748-4b41-bdb6-23a6764062fd\") " pod="openstack/swift-proxy-6cc97c56c5-7pkjn" Sep 30 07:51:23 crc kubenswrapper[4760]: I0930 07:51:23.229953 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/fc788440-e748-4b41-bdb6-23a6764062fd-public-tls-certs\") pod \"swift-proxy-6cc97c56c5-7pkjn\" (UID: \"fc788440-e748-4b41-bdb6-23a6764062fd\") " pod="openstack/swift-proxy-6cc97c56c5-7pkjn" Sep 30 07:51:23 crc kubenswrapper[4760]: I0930 07:51:23.235201 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/secret/7a555f66-7027-4fac-afcc-db7b3f5ae034-config-data" (OuterVolumeSpecName: "config-data") pod "7a555f66-7027-4fac-afcc-db7b3f5ae034" (UID: "7a555f66-7027-4fac-afcc-db7b3f5ae034"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 07:51:23 crc kubenswrapper[4760]: I0930 07:51:23.242063 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6vd2b\" (UniqueName: \"kubernetes.io/projected/fc788440-e748-4b41-bdb6-23a6764062fd-kube-api-access-6vd2b\") pod \"swift-proxy-6cc97c56c5-7pkjn\" (UID: \"fc788440-e748-4b41-bdb6-23a6764062fd\") " pod="openstack/swift-proxy-6cc97c56c5-7pkjn" Sep 30 07:51:23 crc kubenswrapper[4760]: I0930 07:51:23.279913 4760 generic.go:334] "Generic (PLEG): container finished" podID="7a555f66-7027-4fac-afcc-db7b3f5ae034" containerID="d4afbc82bd27bf72ecee192b66554d47b2560bf5bd028bf7135ba8e44fb51690" exitCode=0 Sep 30 07:51:23 crc kubenswrapper[4760]: I0930 07:51:23.279982 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7a555f66-7027-4fac-afcc-db7b3f5ae034","Type":"ContainerDied","Data":"d4afbc82bd27bf72ecee192b66554d47b2560bf5bd028bf7135ba8e44fb51690"} Sep 30 07:51:23 crc kubenswrapper[4760]: I0930 07:51:23.279999 4760 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Sep 30 07:51:23 crc kubenswrapper[4760]: I0930 07:51:23.280014 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7a555f66-7027-4fac-afcc-db7b3f5ae034","Type":"ContainerDied","Data":"8584c4a706f72fba480b97ce1815c189ef7ce65141c8eb80c3ce669d07421c93"} Sep 30 07:51:23 crc kubenswrapper[4760]: I0930 07:51:23.280033 4760 scope.go:117] "RemoveContainer" containerID="20ed58b97ce05b8e03f3a565b7ba2a7bc2561cbe31c4d49a609cf552b977b568" Sep 30 07:51:23 crc kubenswrapper[4760]: I0930 07:51:23.317229 4760 scope.go:117] "RemoveContainer" containerID="9033f067ba6698c37fd74b64f06a530dad8c5baae6004962025e2030248511d8" Sep 30 07:51:23 crc kubenswrapper[4760]: I0930 07:51:23.326384 4760 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7a555f66-7027-4fac-afcc-db7b3f5ae034-config-data\") on node \"crc\" DevicePath \"\"" Sep 30 07:51:23 crc kubenswrapper[4760]: I0930 07:51:23.344419 4760 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-proxy-6cc97c56c5-7pkjn" Sep 30 07:51:23 crc kubenswrapper[4760]: I0930 07:51:23.348032 4760 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Sep 30 07:51:23 crc kubenswrapper[4760]: I0930 07:51:23.356030 4760 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Sep 30 07:51:23 crc kubenswrapper[4760]: I0930 07:51:23.362355 4760 scope.go:117] "RemoveContainer" containerID="d4afbc82bd27bf72ecee192b66554d47b2560bf5bd028bf7135ba8e44fb51690" Sep 30 07:51:23 crc kubenswrapper[4760]: I0930 07:51:23.364282 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Sep 30 07:51:23 crc kubenswrapper[4760]: E0930 07:51:23.365211 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7a555f66-7027-4fac-afcc-db7b3f5ae034" containerName="ceilometer-notification-agent" Sep 30 07:51:23 crc kubenswrapper[4760]: I0930 07:51:23.365238 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="7a555f66-7027-4fac-afcc-db7b3f5ae034" containerName="ceilometer-notification-agent" Sep 30 07:51:23 crc kubenswrapper[4760]: E0930 07:51:23.365276 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7a555f66-7027-4fac-afcc-db7b3f5ae034" containerName="sg-core" Sep 30 07:51:23 crc kubenswrapper[4760]: I0930 07:51:23.365284 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="7a555f66-7027-4fac-afcc-db7b3f5ae034" containerName="sg-core" Sep 30 07:51:23 crc kubenswrapper[4760]: E0930 07:51:23.365292 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7a555f66-7027-4fac-afcc-db7b3f5ae034" containerName="proxy-httpd" Sep 30 07:51:23 crc kubenswrapper[4760]: I0930 07:51:23.365311 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="7a555f66-7027-4fac-afcc-db7b3f5ae034" containerName="proxy-httpd" Sep 30 07:51:23 crc kubenswrapper[4760]: I0930 07:51:23.365467 4760 memory_manager.go:354] "RemoveStaleState 
removing state" podUID="7a555f66-7027-4fac-afcc-db7b3f5ae034" containerName="sg-core" Sep 30 07:51:23 crc kubenswrapper[4760]: I0930 07:51:23.365487 4760 memory_manager.go:354] "RemoveStaleState removing state" podUID="7a555f66-7027-4fac-afcc-db7b3f5ae034" containerName="ceilometer-notification-agent" Sep 30 07:51:23 crc kubenswrapper[4760]: I0930 07:51:23.365500 4760 memory_manager.go:354] "RemoveStaleState removing state" podUID="7a555f66-7027-4fac-afcc-db7b3f5ae034" containerName="proxy-httpd" Sep 30 07:51:23 crc kubenswrapper[4760]: I0930 07:51:23.367104 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Sep 30 07:51:23 crc kubenswrapper[4760]: I0930 07:51:23.370646 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Sep 30 07:51:23 crc kubenswrapper[4760]: I0930 07:51:23.370818 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Sep 30 07:51:23 crc kubenswrapper[4760]: I0930 07:51:23.374748 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Sep 30 07:51:23 crc kubenswrapper[4760]: I0930 07:51:23.419081 4760 scope.go:117] "RemoveContainer" containerID="20ed58b97ce05b8e03f3a565b7ba2a7bc2561cbe31c4d49a609cf552b977b568" Sep 30 07:51:23 crc kubenswrapper[4760]: E0930 07:51:23.420954 4760 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"20ed58b97ce05b8e03f3a565b7ba2a7bc2561cbe31c4d49a609cf552b977b568\": container with ID starting with 20ed58b97ce05b8e03f3a565b7ba2a7bc2561cbe31c4d49a609cf552b977b568 not found: ID does not exist" containerID="20ed58b97ce05b8e03f3a565b7ba2a7bc2561cbe31c4d49a609cf552b977b568" Sep 30 07:51:23 crc kubenswrapper[4760]: I0930 07:51:23.420999 4760 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"20ed58b97ce05b8e03f3a565b7ba2a7bc2561cbe31c4d49a609cf552b977b568"} err="failed to get container status \"20ed58b97ce05b8e03f3a565b7ba2a7bc2561cbe31c4d49a609cf552b977b568\": rpc error: code = NotFound desc = could not find container \"20ed58b97ce05b8e03f3a565b7ba2a7bc2561cbe31c4d49a609cf552b977b568\": container with ID starting with 20ed58b97ce05b8e03f3a565b7ba2a7bc2561cbe31c4d49a609cf552b977b568 not found: ID does not exist" Sep 30 07:51:23 crc kubenswrapper[4760]: I0930 07:51:23.421025 4760 scope.go:117] "RemoveContainer" containerID="9033f067ba6698c37fd74b64f06a530dad8c5baae6004962025e2030248511d8" Sep 30 07:51:23 crc kubenswrapper[4760]: E0930 07:51:23.421443 4760 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9033f067ba6698c37fd74b64f06a530dad8c5baae6004962025e2030248511d8\": container with ID starting with 9033f067ba6698c37fd74b64f06a530dad8c5baae6004962025e2030248511d8 not found: ID does not exist" containerID="9033f067ba6698c37fd74b64f06a530dad8c5baae6004962025e2030248511d8" Sep 30 07:51:23 crc kubenswrapper[4760]: I0930 07:51:23.421480 4760 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9033f067ba6698c37fd74b64f06a530dad8c5baae6004962025e2030248511d8"} err="failed to get container status \"9033f067ba6698c37fd74b64f06a530dad8c5baae6004962025e2030248511d8\": rpc error: code = NotFound desc = could not find container \"9033f067ba6698c37fd74b64f06a530dad8c5baae6004962025e2030248511d8\": container with ID starting with 9033f067ba6698c37fd74b64f06a530dad8c5baae6004962025e2030248511d8 not found: ID does not exist" Sep 30 07:51:23 crc kubenswrapper[4760]: I0930 07:51:23.421502 4760 scope.go:117] "RemoveContainer" containerID="d4afbc82bd27bf72ecee192b66554d47b2560bf5bd028bf7135ba8e44fb51690" Sep 30 07:51:23 crc kubenswrapper[4760]: E0930 07:51:23.423219 4760 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"d4afbc82bd27bf72ecee192b66554d47b2560bf5bd028bf7135ba8e44fb51690\": container with ID starting with d4afbc82bd27bf72ecee192b66554d47b2560bf5bd028bf7135ba8e44fb51690 not found: ID does not exist" containerID="d4afbc82bd27bf72ecee192b66554d47b2560bf5bd028bf7135ba8e44fb51690" Sep 30 07:51:23 crc kubenswrapper[4760]: I0930 07:51:23.423241 4760 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d4afbc82bd27bf72ecee192b66554d47b2560bf5bd028bf7135ba8e44fb51690"} err="failed to get container status \"d4afbc82bd27bf72ecee192b66554d47b2560bf5bd028bf7135ba8e44fb51690\": rpc error: code = NotFound desc = could not find container \"d4afbc82bd27bf72ecee192b66554d47b2560bf5bd028bf7135ba8e44fb51690\": container with ID starting with d4afbc82bd27bf72ecee192b66554d47b2560bf5bd028bf7135ba8e44fb51690 not found: ID does not exist" Sep 30 07:51:23 crc kubenswrapper[4760]: I0930 07:51:23.428533 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5bf1a6da-91bd-4965-b511-66774fa5d7d2-scripts\") pod \"ceilometer-0\" (UID: \"5bf1a6da-91bd-4965-b511-66774fa5d7d2\") " pod="openstack/ceilometer-0" Sep 30 07:51:23 crc kubenswrapper[4760]: I0930 07:51:23.428598 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/5bf1a6da-91bd-4965-b511-66774fa5d7d2-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"5bf1a6da-91bd-4965-b511-66774fa5d7d2\") " pod="openstack/ceilometer-0" Sep 30 07:51:23 crc kubenswrapper[4760]: I0930 07:51:23.428657 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5bf1a6da-91bd-4965-b511-66774fa5d7d2-config-data\") pod \"ceilometer-0\" (UID: 
\"5bf1a6da-91bd-4965-b511-66774fa5d7d2\") " pod="openstack/ceilometer-0" Sep 30 07:51:23 crc kubenswrapper[4760]: I0930 07:51:23.428680 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5bf1a6da-91bd-4965-b511-66774fa5d7d2-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"5bf1a6da-91bd-4965-b511-66774fa5d7d2\") " pod="openstack/ceilometer-0" Sep 30 07:51:23 crc kubenswrapper[4760]: I0930 07:51:23.428909 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5bf1a6da-91bd-4965-b511-66774fa5d7d2-log-httpd\") pod \"ceilometer-0\" (UID: \"5bf1a6da-91bd-4965-b511-66774fa5d7d2\") " pod="openstack/ceilometer-0" Sep 30 07:51:23 crc kubenswrapper[4760]: I0930 07:51:23.428971 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2mmhm\" (UniqueName: \"kubernetes.io/projected/5bf1a6da-91bd-4965-b511-66774fa5d7d2-kube-api-access-2mmhm\") pod \"ceilometer-0\" (UID: \"5bf1a6da-91bd-4965-b511-66774fa5d7d2\") " pod="openstack/ceilometer-0" Sep 30 07:51:23 crc kubenswrapper[4760]: I0930 07:51:23.429001 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5bf1a6da-91bd-4965-b511-66774fa5d7d2-run-httpd\") pod \"ceilometer-0\" (UID: \"5bf1a6da-91bd-4965-b511-66774fa5d7d2\") " pod="openstack/ceilometer-0" Sep 30 07:51:23 crc kubenswrapper[4760]: I0930 07:51:23.440635 4760 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-57c957c4ff-fl2rf" podUID="c8a2a11e-bb27-4ee4-8ff9-91a043e33f1c" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.163:5353: i/o timeout" Sep 30 07:51:23 crc kubenswrapper[4760]: I0930 07:51:23.535906 4760 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/5bf1a6da-91bd-4965-b511-66774fa5d7d2-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"5bf1a6da-91bd-4965-b511-66774fa5d7d2\") " pod="openstack/ceilometer-0" Sep 30 07:51:23 crc kubenswrapper[4760]: I0930 07:51:23.536275 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5bf1a6da-91bd-4965-b511-66774fa5d7d2-config-data\") pod \"ceilometer-0\" (UID: \"5bf1a6da-91bd-4965-b511-66774fa5d7d2\") " pod="openstack/ceilometer-0" Sep 30 07:51:23 crc kubenswrapper[4760]: I0930 07:51:23.536300 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5bf1a6da-91bd-4965-b511-66774fa5d7d2-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"5bf1a6da-91bd-4965-b511-66774fa5d7d2\") " pod="openstack/ceilometer-0" Sep 30 07:51:23 crc kubenswrapper[4760]: I0930 07:51:23.536367 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5bf1a6da-91bd-4965-b511-66774fa5d7d2-log-httpd\") pod \"ceilometer-0\" (UID: \"5bf1a6da-91bd-4965-b511-66774fa5d7d2\") " pod="openstack/ceilometer-0" Sep 30 07:51:23 crc kubenswrapper[4760]: I0930 07:51:23.536394 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2mmhm\" (UniqueName: \"kubernetes.io/projected/5bf1a6da-91bd-4965-b511-66774fa5d7d2-kube-api-access-2mmhm\") pod \"ceilometer-0\" (UID: \"5bf1a6da-91bd-4965-b511-66774fa5d7d2\") " pod="openstack/ceilometer-0" Sep 30 07:51:23 crc kubenswrapper[4760]: I0930 07:51:23.536413 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5bf1a6da-91bd-4965-b511-66774fa5d7d2-run-httpd\") pod \"ceilometer-0\" (UID: 
\"5bf1a6da-91bd-4965-b511-66774fa5d7d2\") " pod="openstack/ceilometer-0" Sep 30 07:51:23 crc kubenswrapper[4760]: I0930 07:51:23.536516 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5bf1a6da-91bd-4965-b511-66774fa5d7d2-scripts\") pod \"ceilometer-0\" (UID: \"5bf1a6da-91bd-4965-b511-66774fa5d7d2\") " pod="openstack/ceilometer-0" Sep 30 07:51:23 crc kubenswrapper[4760]: I0930 07:51:23.537224 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5bf1a6da-91bd-4965-b511-66774fa5d7d2-log-httpd\") pod \"ceilometer-0\" (UID: \"5bf1a6da-91bd-4965-b511-66774fa5d7d2\") " pod="openstack/ceilometer-0" Sep 30 07:51:23 crc kubenswrapper[4760]: I0930 07:51:23.537328 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5bf1a6da-91bd-4965-b511-66774fa5d7d2-run-httpd\") pod \"ceilometer-0\" (UID: \"5bf1a6da-91bd-4965-b511-66774fa5d7d2\") " pod="openstack/ceilometer-0" Sep 30 07:51:23 crc kubenswrapper[4760]: I0930 07:51:23.539845 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/5bf1a6da-91bd-4965-b511-66774fa5d7d2-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"5bf1a6da-91bd-4965-b511-66774fa5d7d2\") " pod="openstack/ceilometer-0" Sep 30 07:51:23 crc kubenswrapper[4760]: I0930 07:51:23.542262 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5bf1a6da-91bd-4965-b511-66774fa5d7d2-config-data\") pod \"ceilometer-0\" (UID: \"5bf1a6da-91bd-4965-b511-66774fa5d7d2\") " pod="openstack/ceilometer-0" Sep 30 07:51:23 crc kubenswrapper[4760]: I0930 07:51:23.551373 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/5bf1a6da-91bd-4965-b511-66774fa5d7d2-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"5bf1a6da-91bd-4965-b511-66774fa5d7d2\") " pod="openstack/ceilometer-0" Sep 30 07:51:23 crc kubenswrapper[4760]: I0930 07:51:23.552008 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5bf1a6da-91bd-4965-b511-66774fa5d7d2-scripts\") pod \"ceilometer-0\" (UID: \"5bf1a6da-91bd-4965-b511-66774fa5d7d2\") " pod="openstack/ceilometer-0" Sep 30 07:51:23 crc kubenswrapper[4760]: I0930 07:51:23.558858 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2mmhm\" (UniqueName: \"kubernetes.io/projected/5bf1a6da-91bd-4965-b511-66774fa5d7d2-kube-api-access-2mmhm\") pod \"ceilometer-0\" (UID: \"5bf1a6da-91bd-4965-b511-66774fa5d7d2\") " pod="openstack/ceilometer-0" Sep 30 07:51:23 crc kubenswrapper[4760]: I0930 07:51:23.699913 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Sep 30 07:51:24 crc kubenswrapper[4760]: I0930 07:51:24.031732 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-6cc97c56c5-7pkjn"] Sep 30 07:51:24 crc kubenswrapper[4760]: I0930 07:51:24.205445 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Sep 30 07:51:24 crc kubenswrapper[4760]: I0930 07:51:24.297464 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5bf1a6da-91bd-4965-b511-66774fa5d7d2","Type":"ContainerStarted","Data":"1aa70055ea804092a856344d9217b563e14f025fe4cab21f562e4864e6c2b6d9"} Sep 30 07:51:24 crc kubenswrapper[4760]: I0930 07:51:24.299312 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-6cc97c56c5-7pkjn" event={"ID":"fc788440-e748-4b41-bdb6-23a6764062fd","Type":"ContainerStarted","Data":"8283596985b774e772d5a4afd502f97ed8d3b49a502821c0afcb83d84b28fb88"} Sep 30 07:51:24 crc 
kubenswrapper[4760]: I0930 07:51:24.715807 4760 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-694db87c64-qrwhp" podUID="e28d9015-ea18-4da0-bfd8-c2cc5dec4f37" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.157:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.157:8443: connect: connection refused" Sep 30 07:51:25 crc kubenswrapper[4760]: I0930 07:51:25.076824 4760 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7a555f66-7027-4fac-afcc-db7b3f5ae034" path="/var/lib/kubelet/pods/7a555f66-7027-4fac-afcc-db7b3f5ae034/volumes" Sep 30 07:51:25 crc kubenswrapper[4760]: I0930 07:51:25.211988 4760 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-7c45496686-rqmqg" podUID="8adf3e3f-bdc8-4a02-90da-71e21926d9f9" containerName="barbican-api-log" probeResult="failure" output="Get \"http://10.217.0.173:9311/healthcheck\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Sep 30 07:51:25 crc kubenswrapper[4760]: I0930 07:51:25.212065 4760 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-7c45496686-rqmqg" podUID="8adf3e3f-bdc8-4a02-90da-71e21926d9f9" containerName="barbican-api" probeResult="failure" output="Get \"http://10.217.0.173:9311/healthcheck\": dial tcp 10.217.0.173:9311: i/o timeout (Client.Timeout exceeded while awaiting headers)" Sep 30 07:51:25 crc kubenswrapper[4760]: I0930 07:51:25.311212 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-6cc97c56c5-7pkjn" event={"ID":"fc788440-e748-4b41-bdb6-23a6764062fd","Type":"ContainerStarted","Data":"25c602677caf67ca6d5e6b79943c8bf89d9c37e3aebbb0e2d417eaaefa614bac"} Sep 30 07:51:25 crc kubenswrapper[4760]: I0930 07:51:25.311253 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-6cc97c56c5-7pkjn" 
event={"ID":"fc788440-e748-4b41-bdb6-23a6764062fd","Type":"ContainerStarted","Data":"a7a70160ebdccb4780f46118c4ef395e1f547837fc9972742a9269482253b89e"} Sep 30 07:51:25 crc kubenswrapper[4760]: I0930 07:51:25.312182 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-6cc97c56c5-7pkjn" Sep 30 07:51:25 crc kubenswrapper[4760]: I0930 07:51:25.312332 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-6cc97c56c5-7pkjn" Sep 30 07:51:25 crc kubenswrapper[4760]: I0930 07:51:25.318136 4760 generic.go:334] "Generic (PLEG): container finished" podID="4a2c7f25-ec97-4cb0-8475-b7b8d4be47d0" containerID="da79e51b47f98778727a550fc3e1167b95d280a9cb76165601bb82b5c73cf3f9" exitCode=0 Sep 30 07:51:25 crc kubenswrapper[4760]: I0930 07:51:25.318204 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-9ml2s" event={"ID":"4a2c7f25-ec97-4cb0-8475-b7b8d4be47d0","Type":"ContainerDied","Data":"da79e51b47f98778727a550fc3e1167b95d280a9cb76165601bb82b5c73cf3f9"} Sep 30 07:51:25 crc kubenswrapper[4760]: I0930 07:51:25.325709 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5bf1a6da-91bd-4965-b511-66774fa5d7d2","Type":"ContainerStarted","Data":"6e6daf0b99936ad2a997f298b22fd0469ba8a863b548504377c4fd353a10ea7e"} Sep 30 07:51:25 crc kubenswrapper[4760]: I0930 07:51:25.338105 4760 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-proxy-6cc97c56c5-7pkjn" podStartSLOduration=2.338086322 podStartE2EDuration="2.338086322s" podCreationTimestamp="2025-09-30 07:51:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 07:51:25.332551991 +0000 UTC m=+1070.975458403" watchObservedRunningTime="2025-09-30 07:51:25.338086322 +0000 UTC m=+1070.980992734" Sep 30 07:51:26 crc kubenswrapper[4760]: I0930 07:51:26.336291 
4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5bf1a6da-91bd-4965-b511-66774fa5d7d2","Type":"ContainerStarted","Data":"1f9df15fbf107b05d5c67eee6f78fc4edab2db34feca48802226d8a4ed835b25"} Sep 30 07:51:26 crc kubenswrapper[4760]: I0930 07:51:26.546770 4760 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Sep 30 07:51:30 crc kubenswrapper[4760]: I0930 07:51:30.485142 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-db-create-fsvjc"] Sep 30 07:51:30 crc kubenswrapper[4760]: I0930 07:51:30.487620 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-fsvjc" Sep 30 07:51:30 crc kubenswrapper[4760]: I0930 07:51:30.501097 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-fsvjc"] Sep 30 07:51:30 crc kubenswrapper[4760]: I0930 07:51:30.606565 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dlvsw\" (UniqueName: \"kubernetes.io/projected/51b8af4a-eb0e-48a7-885d-917c60d526d3-kube-api-access-dlvsw\") pod \"nova-api-db-create-fsvjc\" (UID: \"51b8af4a-eb0e-48a7-885d-917c60d526d3\") " pod="openstack/nova-api-db-create-fsvjc" Sep 30 07:51:30 crc kubenswrapper[4760]: I0930 07:51:30.685820 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-db-create-6cjrj"] Sep 30 07:51:30 crc kubenswrapper[4760]: I0930 07:51:30.687096 4760 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-db-create-6cjrj" Sep 30 07:51:30 crc kubenswrapper[4760]: I0930 07:51:30.698403 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-6cjrj"] Sep 30 07:51:30 crc kubenswrapper[4760]: I0930 07:51:30.707915 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dlvsw\" (UniqueName: \"kubernetes.io/projected/51b8af4a-eb0e-48a7-885d-917c60d526d3-kube-api-access-dlvsw\") pod \"nova-api-db-create-fsvjc\" (UID: \"51b8af4a-eb0e-48a7-885d-917c60d526d3\") " pod="openstack/nova-api-db-create-fsvjc" Sep 30 07:51:30 crc kubenswrapper[4760]: I0930 07:51:30.726860 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dlvsw\" (UniqueName: \"kubernetes.io/projected/51b8af4a-eb0e-48a7-885d-917c60d526d3-kube-api-access-dlvsw\") pod \"nova-api-db-create-fsvjc\" (UID: \"51b8af4a-eb0e-48a7-885d-917c60d526d3\") " pod="openstack/nova-api-db-create-fsvjc" Sep 30 07:51:30 crc kubenswrapper[4760]: I0930 07:51:30.799208 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-db-create-cs5kx"] Sep 30 07:51:30 crc kubenswrapper[4760]: I0930 07:51:30.800763 4760 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-db-create-cs5kx" Sep 30 07:51:30 crc kubenswrapper[4760]: I0930 07:51:30.809823 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-95x2r\" (UniqueName: \"kubernetes.io/projected/527f2de8-9fbb-4ecb-893d-ffe0db4d524b-kube-api-access-95x2r\") pod \"nova-cell0-db-create-6cjrj\" (UID: \"527f2de8-9fbb-4ecb-893d-ffe0db4d524b\") " pod="openstack/nova-cell0-db-create-6cjrj" Sep 30 07:51:30 crc kubenswrapper[4760]: I0930 07:51:30.813869 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-cs5kx"] Sep 30 07:51:30 crc kubenswrapper[4760]: I0930 07:51:30.818872 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-fsvjc" Sep 30 07:51:30 crc kubenswrapper[4760]: I0930 07:51:30.911452 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-787qx\" (UniqueName: \"kubernetes.io/projected/c3eab693-7a8a-4ad2-a247-b8a79f178a87-kube-api-access-787qx\") pod \"nova-cell1-db-create-cs5kx\" (UID: \"c3eab693-7a8a-4ad2-a247-b8a79f178a87\") " pod="openstack/nova-cell1-db-create-cs5kx" Sep 30 07:51:30 crc kubenswrapper[4760]: I0930 07:51:30.911531 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-95x2r\" (UniqueName: \"kubernetes.io/projected/527f2de8-9fbb-4ecb-893d-ffe0db4d524b-kube-api-access-95x2r\") pod \"nova-cell0-db-create-6cjrj\" (UID: \"527f2de8-9fbb-4ecb-893d-ffe0db4d524b\") " pod="openstack/nova-cell0-db-create-6cjrj" Sep 30 07:51:30 crc kubenswrapper[4760]: I0930 07:51:30.938764 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-95x2r\" (UniqueName: \"kubernetes.io/projected/527f2de8-9fbb-4ecb-893d-ffe0db4d524b-kube-api-access-95x2r\") pod \"nova-cell0-db-create-6cjrj\" (UID: \"527f2de8-9fbb-4ecb-893d-ffe0db4d524b\") " 
pod="openstack/nova-cell0-db-create-6cjrj" Sep 30 07:51:31 crc kubenswrapper[4760]: I0930 07:51:31.012971 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-787qx\" (UniqueName: \"kubernetes.io/projected/c3eab693-7a8a-4ad2-a247-b8a79f178a87-kube-api-access-787qx\") pod \"nova-cell1-db-create-cs5kx\" (UID: \"c3eab693-7a8a-4ad2-a247-b8a79f178a87\") " pod="openstack/nova-cell1-db-create-cs5kx" Sep 30 07:51:31 crc kubenswrapper[4760]: I0930 07:51:31.030652 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-787qx\" (UniqueName: \"kubernetes.io/projected/c3eab693-7a8a-4ad2-a247-b8a79f178a87-kube-api-access-787qx\") pod \"nova-cell1-db-create-cs5kx\" (UID: \"c3eab693-7a8a-4ad2-a247-b8a79f178a87\") " pod="openstack/nova-cell1-db-create-cs5kx" Sep 30 07:51:31 crc kubenswrapper[4760]: I0930 07:51:31.094705 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-6cjrj" Sep 30 07:51:31 crc kubenswrapper[4760]: I0930 07:51:31.126865 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-cs5kx" Sep 30 07:51:32 crc kubenswrapper[4760]: I0930 07:51:32.322409 4760 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-sync-9ml2s" Sep 30 07:51:32 crc kubenswrapper[4760]: I0930 07:51:32.400256 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-9ml2s" event={"ID":"4a2c7f25-ec97-4cb0-8475-b7b8d4be47d0","Type":"ContainerDied","Data":"3a0fec3394842434671bbab30568c21e245a9fc2c3df39d9eb043f8fd3a8591b"} Sep 30 07:51:32 crc kubenswrapper[4760]: I0930 07:51:32.400290 4760 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3a0fec3394842434671bbab30568c21e245a9fc2c3df39d9eb043f8fd3a8591b" Sep 30 07:51:32 crc kubenswrapper[4760]: I0930 07:51:32.400381 4760 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-9ml2s" Sep 30 07:51:32 crc kubenswrapper[4760]: I0930 07:51:32.443513 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4a2c7f25-ec97-4cb0-8475-b7b8d4be47d0-config-data\") pod \"4a2c7f25-ec97-4cb0-8475-b7b8d4be47d0\" (UID: \"4a2c7f25-ec97-4cb0-8475-b7b8d4be47d0\") " Sep 30 07:51:32 crc kubenswrapper[4760]: I0930 07:51:32.443586 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5tz96\" (UniqueName: \"kubernetes.io/projected/4a2c7f25-ec97-4cb0-8475-b7b8d4be47d0-kube-api-access-5tz96\") pod \"4a2c7f25-ec97-4cb0-8475-b7b8d4be47d0\" (UID: \"4a2c7f25-ec97-4cb0-8475-b7b8d4be47d0\") " Sep 30 07:51:32 crc kubenswrapper[4760]: I0930 07:51:32.443637 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/4a2c7f25-ec97-4cb0-8475-b7b8d4be47d0-etc-machine-id\") pod \"4a2c7f25-ec97-4cb0-8475-b7b8d4be47d0\" (UID: \"4a2c7f25-ec97-4cb0-8475-b7b8d4be47d0\") " Sep 30 07:51:32 crc kubenswrapper[4760]: I0930 07:51:32.443694 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/4a2c7f25-ec97-4cb0-8475-b7b8d4be47d0-db-sync-config-data\") pod \"4a2c7f25-ec97-4cb0-8475-b7b8d4be47d0\" (UID: \"4a2c7f25-ec97-4cb0-8475-b7b8d4be47d0\") " Sep 30 07:51:32 crc kubenswrapper[4760]: I0930 07:51:32.443767 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4a2c7f25-ec97-4cb0-8475-b7b8d4be47d0-combined-ca-bundle\") pod \"4a2c7f25-ec97-4cb0-8475-b7b8d4be47d0\" (UID: \"4a2c7f25-ec97-4cb0-8475-b7b8d4be47d0\") " Sep 30 07:51:32 crc kubenswrapper[4760]: I0930 07:51:32.443795 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4a2c7f25-ec97-4cb0-8475-b7b8d4be47d0-scripts\") pod \"4a2c7f25-ec97-4cb0-8475-b7b8d4be47d0\" (UID: \"4a2c7f25-ec97-4cb0-8475-b7b8d4be47d0\") " Sep 30 07:51:32 crc kubenswrapper[4760]: I0930 07:51:32.443851 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/4a2c7f25-ec97-4cb0-8475-b7b8d4be47d0-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "4a2c7f25-ec97-4cb0-8475-b7b8d4be47d0" (UID: "4a2c7f25-ec97-4cb0-8475-b7b8d4be47d0"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Sep 30 07:51:32 crc kubenswrapper[4760]: I0930 07:51:32.444205 4760 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/4a2c7f25-ec97-4cb0-8475-b7b8d4be47d0-etc-machine-id\") on node \"crc\" DevicePath \"\"" Sep 30 07:51:32 crc kubenswrapper[4760]: I0930 07:51:32.450455 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4a2c7f25-ec97-4cb0-8475-b7b8d4be47d0-kube-api-access-5tz96" (OuterVolumeSpecName: "kube-api-access-5tz96") pod "4a2c7f25-ec97-4cb0-8475-b7b8d4be47d0" (UID: "4a2c7f25-ec97-4cb0-8475-b7b8d4be47d0"). 
InnerVolumeSpecName "kube-api-access-5tz96". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 07:51:32 crc kubenswrapper[4760]: I0930 07:51:32.453067 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4a2c7f25-ec97-4cb0-8475-b7b8d4be47d0-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "4a2c7f25-ec97-4cb0-8475-b7b8d4be47d0" (UID: "4a2c7f25-ec97-4cb0-8475-b7b8d4be47d0"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 07:51:32 crc kubenswrapper[4760]: I0930 07:51:32.458011 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4a2c7f25-ec97-4cb0-8475-b7b8d4be47d0-scripts" (OuterVolumeSpecName: "scripts") pod "4a2c7f25-ec97-4cb0-8475-b7b8d4be47d0" (UID: "4a2c7f25-ec97-4cb0-8475-b7b8d4be47d0"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 07:51:32 crc kubenswrapper[4760]: I0930 07:51:32.506209 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4a2c7f25-ec97-4cb0-8475-b7b8d4be47d0-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4a2c7f25-ec97-4cb0-8475-b7b8d4be47d0" (UID: "4a2c7f25-ec97-4cb0-8475-b7b8d4be47d0"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 07:51:32 crc kubenswrapper[4760]: I0930 07:51:32.524167 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4a2c7f25-ec97-4cb0-8475-b7b8d4be47d0-config-data" (OuterVolumeSpecName: "config-data") pod "4a2c7f25-ec97-4cb0-8475-b7b8d4be47d0" (UID: "4a2c7f25-ec97-4cb0-8475-b7b8d4be47d0"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 07:51:32 crc kubenswrapper[4760]: I0930 07:51:32.546206 4760 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4a2c7f25-ec97-4cb0-8475-b7b8d4be47d0-config-data\") on node \"crc\" DevicePath \"\"" Sep 30 07:51:32 crc kubenswrapper[4760]: I0930 07:51:32.546251 4760 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5tz96\" (UniqueName: \"kubernetes.io/projected/4a2c7f25-ec97-4cb0-8475-b7b8d4be47d0-kube-api-access-5tz96\") on node \"crc\" DevicePath \"\"" Sep 30 07:51:32 crc kubenswrapper[4760]: I0930 07:51:32.546266 4760 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/4a2c7f25-ec97-4cb0-8475-b7b8d4be47d0-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Sep 30 07:51:32 crc kubenswrapper[4760]: I0930 07:51:32.546278 4760 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4a2c7f25-ec97-4cb0-8475-b7b8d4be47d0-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 07:51:32 crc kubenswrapper[4760]: I0930 07:51:32.546292 4760 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4a2c7f25-ec97-4cb0-8475-b7b8d4be47d0-scripts\") on node \"crc\" DevicePath \"\"" Sep 30 07:51:32 crc kubenswrapper[4760]: I0930 07:51:32.607638 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-6cjrj"] Sep 30 07:51:32 crc kubenswrapper[4760]: W0930 07:51:32.612783 4760 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod527f2de8_9fbb_4ecb_893d_ffe0db4d524b.slice/crio-289dc1d8263b512b8b39d000a5d4cc82e237bbab1b1d41a5f0354b33f45efc42 WatchSource:0}: Error finding container 289dc1d8263b512b8b39d000a5d4cc82e237bbab1b1d41a5f0354b33f45efc42: Status 404 returned error 
can't find the container with id 289dc1d8263b512b8b39d000a5d4cc82e237bbab1b1d41a5f0354b33f45efc42 Sep 30 07:51:32 crc kubenswrapper[4760]: I0930 07:51:32.623013 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-cs5kx"] Sep 30 07:51:32 crc kubenswrapper[4760]: W0930 07:51:32.625682 4760 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc3eab693_7a8a_4ad2_a247_b8a79f178a87.slice/crio-b581a0d941e092d5bd8cd078a2857c5174cde931d1530c0fc5c968537e3b71d0 WatchSource:0}: Error finding container b581a0d941e092d5bd8cd078a2857c5174cde931d1530c0fc5c968537e3b71d0: Status 404 returned error can't find the container with id b581a0d941e092d5bd8cd078a2857c5174cde931d1530c0fc5c968537e3b71d0 Sep 30 07:51:32 crc kubenswrapper[4760]: I0930 07:51:32.758054 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-fsvjc"] Sep 30 07:51:33 crc kubenswrapper[4760]: I0930 07:51:33.354690 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-6cc97c56c5-7pkjn" Sep 30 07:51:33 crc kubenswrapper[4760]: I0930 07:51:33.355022 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-6cc97c56c5-7pkjn" Sep 30 07:51:33 crc kubenswrapper[4760]: I0930 07:51:33.414022 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"1c5867a3-c734-489e-a6b3-edb023949556","Type":"ContainerStarted","Data":"31e6a01802134d7cb6f28764adf888850592d8650d21fc32b1726e36e868f706"} Sep 30 07:51:33 crc kubenswrapper[4760]: I0930 07:51:33.419049 4760 generic.go:334] "Generic (PLEG): container finished" podID="51b8af4a-eb0e-48a7-885d-917c60d526d3" containerID="53b92e2949ce33a9d4ae27e2e663dbf94e93e32a5e849263f5e67fb352e44bbc" exitCode=0 Sep 30 07:51:33 crc kubenswrapper[4760]: I0930 07:51:33.419124 4760 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openstack/nova-api-db-create-fsvjc" event={"ID":"51b8af4a-eb0e-48a7-885d-917c60d526d3","Type":"ContainerDied","Data":"53b92e2949ce33a9d4ae27e2e663dbf94e93e32a5e849263f5e67fb352e44bbc"} Sep 30 07:51:33 crc kubenswrapper[4760]: I0930 07:51:33.419146 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-fsvjc" event={"ID":"51b8af4a-eb0e-48a7-885d-917c60d526d3","Type":"ContainerStarted","Data":"ce5a203a5fe3313d00386aaafe8e3f7a706fdef3fe5f751297ca470911c99f7d"} Sep 30 07:51:33 crc kubenswrapper[4760]: I0930 07:51:33.433099 4760 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstackclient" podStartSLOduration=2.508984677 podStartE2EDuration="14.433083253s" podCreationTimestamp="2025-09-30 07:51:19 +0000 UTC" firstStartedPulling="2025-09-30 07:51:20.192921447 +0000 UTC m=+1065.835827859" lastFinishedPulling="2025-09-30 07:51:32.117020023 +0000 UTC m=+1077.759926435" observedRunningTime="2025-09-30 07:51:33.432798436 +0000 UTC m=+1079.075704848" watchObservedRunningTime="2025-09-30 07:51:33.433083253 +0000 UTC m=+1079.075989665" Sep 30 07:51:33 crc kubenswrapper[4760]: I0930 07:51:33.439840 4760 generic.go:334] "Generic (PLEG): container finished" podID="c3eab693-7a8a-4ad2-a247-b8a79f178a87" containerID="c17a111d0562f8972f7218ff8da22c761885ce0d02658fc80f242203a4de9887" exitCode=0 Sep 30 07:51:33 crc kubenswrapper[4760]: I0930 07:51:33.439927 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-cs5kx" event={"ID":"c3eab693-7a8a-4ad2-a247-b8a79f178a87","Type":"ContainerDied","Data":"c17a111d0562f8972f7218ff8da22c761885ce0d02658fc80f242203a4de9887"} Sep 30 07:51:33 crc kubenswrapper[4760]: I0930 07:51:33.439951 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-cs5kx" event={"ID":"c3eab693-7a8a-4ad2-a247-b8a79f178a87","Type":"ContainerStarted","Data":"b581a0d941e092d5bd8cd078a2857c5174cde931d1530c0fc5c968537e3b71d0"} 
Sep 30 07:51:33 crc kubenswrapper[4760]: I0930 07:51:33.447290 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5bf1a6da-91bd-4965-b511-66774fa5d7d2","Type":"ContainerStarted","Data":"3cc78b8683e481aa50d7fca9ab14f0b7c4528df03e45a92633a2b6bf375cc553"} Sep 30 07:51:33 crc kubenswrapper[4760]: I0930 07:51:33.455990 4760 generic.go:334] "Generic (PLEG): container finished" podID="527f2de8-9fbb-4ecb-893d-ffe0db4d524b" containerID="e7d14e2491ad5b89419c6c96eded977f0df2ff4374b049abb04dfed938177ab6" exitCode=0 Sep 30 07:51:33 crc kubenswrapper[4760]: I0930 07:51:33.456039 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-6cjrj" event={"ID":"527f2de8-9fbb-4ecb-893d-ffe0db4d524b","Type":"ContainerDied","Data":"e7d14e2491ad5b89419c6c96eded977f0df2ff4374b049abb04dfed938177ab6"} Sep 30 07:51:33 crc kubenswrapper[4760]: I0930 07:51:33.456063 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-6cjrj" event={"ID":"527f2de8-9fbb-4ecb-893d-ffe0db4d524b","Type":"ContainerStarted","Data":"289dc1d8263b512b8b39d000a5d4cc82e237bbab1b1d41a5f0354b33f45efc42"} Sep 30 07:51:33 crc kubenswrapper[4760]: I0930 07:51:33.594672 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Sep 30 07:51:33 crc kubenswrapper[4760]: E0930 07:51:33.595286 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4a2c7f25-ec97-4cb0-8475-b7b8d4be47d0" containerName="cinder-db-sync" Sep 30 07:51:33 crc kubenswrapper[4760]: I0930 07:51:33.595329 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="4a2c7f25-ec97-4cb0-8475-b7b8d4be47d0" containerName="cinder-db-sync" Sep 30 07:51:33 crc kubenswrapper[4760]: I0930 07:51:33.595613 4760 memory_manager.go:354] "RemoveStaleState removing state" podUID="4a2c7f25-ec97-4cb0-8475-b7b8d4be47d0" containerName="cinder-db-sync" Sep 30 07:51:33 crc kubenswrapper[4760]: I0930 07:51:33.598018 4760 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Sep 30 07:51:33 crc kubenswrapper[4760]: I0930 07:51:33.603748 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-9sl2d" Sep 30 07:51:33 crc kubenswrapper[4760]: I0930 07:51:33.603789 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Sep 30 07:51:33 crc kubenswrapper[4760]: I0930 07:51:33.603945 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Sep 30 07:51:33 crc kubenswrapper[4760]: I0930 07:51:33.604047 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Sep 30 07:51:33 crc kubenswrapper[4760]: I0930 07:51:33.619750 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Sep 30 07:51:33 crc kubenswrapper[4760]: I0930 07:51:33.677930 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6bb4fc677f-pqzsz"] Sep 30 07:51:33 crc kubenswrapper[4760]: I0930 07:51:33.678898 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5c13e0bb-80b7-4846-b26c-5c60b234d6fa-config-data\") pod \"cinder-scheduler-0\" (UID: \"5c13e0bb-80b7-4846-b26c-5c60b234d6fa\") " pod="openstack/cinder-scheduler-0" Sep 30 07:51:33 crc kubenswrapper[4760]: I0930 07:51:33.679085 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/5c13e0bb-80b7-4846-b26c-5c60b234d6fa-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"5c13e0bb-80b7-4846-b26c-5c60b234d6fa\") " pod="openstack/cinder-scheduler-0" Sep 30 07:51:33 crc kubenswrapper[4760]: I0930 07:51:33.679109 4760 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/5c13e0bb-80b7-4846-b26c-5c60b234d6fa-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"5c13e0bb-80b7-4846-b26c-5c60b234d6fa\") " pod="openstack/cinder-scheduler-0" Sep 30 07:51:33 crc kubenswrapper[4760]: I0930 07:51:33.679142 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b9gxf\" (UniqueName: \"kubernetes.io/projected/5c13e0bb-80b7-4846-b26c-5c60b234d6fa-kube-api-access-b9gxf\") pod \"cinder-scheduler-0\" (UID: \"5c13e0bb-80b7-4846-b26c-5c60b234d6fa\") " pod="openstack/cinder-scheduler-0" Sep 30 07:51:33 crc kubenswrapper[4760]: I0930 07:51:33.679253 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5c13e0bb-80b7-4846-b26c-5c60b234d6fa-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"5c13e0bb-80b7-4846-b26c-5c60b234d6fa\") " pod="openstack/cinder-scheduler-0" Sep 30 07:51:33 crc kubenswrapper[4760]: I0930 07:51:33.679288 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5c13e0bb-80b7-4846-b26c-5c60b234d6fa-scripts\") pod \"cinder-scheduler-0\" (UID: \"5c13e0bb-80b7-4846-b26c-5c60b234d6fa\") " pod="openstack/cinder-scheduler-0" Sep 30 07:51:33 crc kubenswrapper[4760]: I0930 07:51:33.680410 4760 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6bb4fc677f-pqzsz" Sep 30 07:51:33 crc kubenswrapper[4760]: I0930 07:51:33.705966 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6bb4fc677f-pqzsz"] Sep 30 07:51:33 crc kubenswrapper[4760]: I0930 07:51:33.781098 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/5c13e0bb-80b7-4846-b26c-5c60b234d6fa-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"5c13e0bb-80b7-4846-b26c-5c60b234d6fa\") " pod="openstack/cinder-scheduler-0" Sep 30 07:51:33 crc kubenswrapper[4760]: I0930 07:51:33.781511 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/5c13e0bb-80b7-4846-b26c-5c60b234d6fa-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"5c13e0bb-80b7-4846-b26c-5c60b234d6fa\") " pod="openstack/cinder-scheduler-0" Sep 30 07:51:33 crc kubenswrapper[4760]: I0930 07:51:33.781738 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b9gxf\" (UniqueName: \"kubernetes.io/projected/5c13e0bb-80b7-4846-b26c-5c60b234d6fa-kube-api-access-b9gxf\") pod \"cinder-scheduler-0\" (UID: \"5c13e0bb-80b7-4846-b26c-5c60b234d6fa\") " pod="openstack/cinder-scheduler-0" Sep 30 07:51:33 crc kubenswrapper[4760]: I0930 07:51:33.781785 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/a40ada2e-c162-4cdc-a530-e6708fccae5c-dns-swift-storage-0\") pod \"dnsmasq-dns-6bb4fc677f-pqzsz\" (UID: \"a40ada2e-c162-4cdc-a530-e6708fccae5c\") " pod="openstack/dnsmasq-dns-6bb4fc677f-pqzsz" Sep 30 07:51:33 crc kubenswrapper[4760]: I0930 07:51:33.781815 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/a40ada2e-c162-4cdc-a530-e6708fccae5c-dns-svc\") pod \"dnsmasq-dns-6bb4fc677f-pqzsz\" (UID: \"a40ada2e-c162-4cdc-a530-e6708fccae5c\") " pod="openstack/dnsmasq-dns-6bb4fc677f-pqzsz" Sep 30 07:51:33 crc kubenswrapper[4760]: I0930 07:51:33.781872 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5c13e0bb-80b7-4846-b26c-5c60b234d6fa-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"5c13e0bb-80b7-4846-b26c-5c60b234d6fa\") " pod="openstack/cinder-scheduler-0" Sep 30 07:51:33 crc kubenswrapper[4760]: I0930 07:51:33.781909 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5c13e0bb-80b7-4846-b26c-5c60b234d6fa-scripts\") pod \"cinder-scheduler-0\" (UID: \"5c13e0bb-80b7-4846-b26c-5c60b234d6fa\") " pod="openstack/cinder-scheduler-0" Sep 30 07:51:33 crc kubenswrapper[4760]: I0930 07:51:33.781979 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a40ada2e-c162-4cdc-a530-e6708fccae5c-ovsdbserver-nb\") pod \"dnsmasq-dns-6bb4fc677f-pqzsz\" (UID: \"a40ada2e-c162-4cdc-a530-e6708fccae5c\") " pod="openstack/dnsmasq-dns-6bb4fc677f-pqzsz" Sep 30 07:51:33 crc kubenswrapper[4760]: I0930 07:51:33.782013 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d82gs\" (UniqueName: \"kubernetes.io/projected/a40ada2e-c162-4cdc-a530-e6708fccae5c-kube-api-access-d82gs\") pod \"dnsmasq-dns-6bb4fc677f-pqzsz\" (UID: \"a40ada2e-c162-4cdc-a530-e6708fccae5c\") " pod="openstack/dnsmasq-dns-6bb4fc677f-pqzsz" Sep 30 07:51:33 crc kubenswrapper[4760]: I0930 07:51:33.782061 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: 
\"kubernetes.io/configmap/a40ada2e-c162-4cdc-a530-e6708fccae5c-ovsdbserver-sb\") pod \"dnsmasq-dns-6bb4fc677f-pqzsz\" (UID: \"a40ada2e-c162-4cdc-a530-e6708fccae5c\") " pod="openstack/dnsmasq-dns-6bb4fc677f-pqzsz" Sep 30 07:51:33 crc kubenswrapper[4760]: I0930 07:51:33.782093 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a40ada2e-c162-4cdc-a530-e6708fccae5c-config\") pod \"dnsmasq-dns-6bb4fc677f-pqzsz\" (UID: \"a40ada2e-c162-4cdc-a530-e6708fccae5c\") " pod="openstack/dnsmasq-dns-6bb4fc677f-pqzsz" Sep 30 07:51:33 crc kubenswrapper[4760]: I0930 07:51:33.782123 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5c13e0bb-80b7-4846-b26c-5c60b234d6fa-config-data\") pod \"cinder-scheduler-0\" (UID: \"5c13e0bb-80b7-4846-b26c-5c60b234d6fa\") " pod="openstack/cinder-scheduler-0" Sep 30 07:51:33 crc kubenswrapper[4760]: I0930 07:51:33.784737 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Sep 30 07:51:33 crc kubenswrapper[4760]: I0930 07:51:33.786313 4760 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Sep 30 07:51:33 crc kubenswrapper[4760]: I0930 07:51:33.786390 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/5c13e0bb-80b7-4846-b26c-5c60b234d6fa-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"5c13e0bb-80b7-4846-b26c-5c60b234d6fa\") " pod="openstack/cinder-scheduler-0" Sep 30 07:51:33 crc kubenswrapper[4760]: I0930 07:51:33.786769 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/5c13e0bb-80b7-4846-b26c-5c60b234d6fa-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"5c13e0bb-80b7-4846-b26c-5c60b234d6fa\") " pod="openstack/cinder-scheduler-0" Sep 30 07:51:33 crc kubenswrapper[4760]: I0930 07:51:33.787606 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5c13e0bb-80b7-4846-b26c-5c60b234d6fa-config-data\") pod \"cinder-scheduler-0\" (UID: \"5c13e0bb-80b7-4846-b26c-5c60b234d6fa\") " pod="openstack/cinder-scheduler-0" Sep 30 07:51:33 crc kubenswrapper[4760]: I0930 07:51:33.790717 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5c13e0bb-80b7-4846-b26c-5c60b234d6fa-scripts\") pod \"cinder-scheduler-0\" (UID: \"5c13e0bb-80b7-4846-b26c-5c60b234d6fa\") " pod="openstack/cinder-scheduler-0" Sep 30 07:51:33 crc kubenswrapper[4760]: I0930 07:51:33.794136 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Sep 30 07:51:33 crc kubenswrapper[4760]: I0930 07:51:33.813450 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Sep 30 07:51:33 crc kubenswrapper[4760]: I0930 07:51:33.831534 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b9gxf\" (UniqueName: 
\"kubernetes.io/projected/5c13e0bb-80b7-4846-b26c-5c60b234d6fa-kube-api-access-b9gxf\") pod \"cinder-scheduler-0\" (UID: \"5c13e0bb-80b7-4846-b26c-5c60b234d6fa\") " pod="openstack/cinder-scheduler-0" Sep 30 07:51:33 crc kubenswrapper[4760]: I0930 07:51:33.857126 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5c13e0bb-80b7-4846-b26c-5c60b234d6fa-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"5c13e0bb-80b7-4846-b26c-5c60b234d6fa\") " pod="openstack/cinder-scheduler-0" Sep 30 07:51:33 crc kubenswrapper[4760]: I0930 07:51:33.886326 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/40978ac5-870c-4273-8814-6c735435ca09-logs\") pod \"cinder-api-0\" (UID: \"40978ac5-870c-4273-8814-6c735435ca09\") " pod="openstack/cinder-api-0" Sep 30 07:51:33 crc kubenswrapper[4760]: I0930 07:51:33.886410 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/a40ada2e-c162-4cdc-a530-e6708fccae5c-dns-swift-storage-0\") pod \"dnsmasq-dns-6bb4fc677f-pqzsz\" (UID: \"a40ada2e-c162-4cdc-a530-e6708fccae5c\") " pod="openstack/dnsmasq-dns-6bb4fc677f-pqzsz" Sep 30 07:51:33 crc kubenswrapper[4760]: I0930 07:51:33.886431 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a40ada2e-c162-4cdc-a530-e6708fccae5c-dns-svc\") pod \"dnsmasq-dns-6bb4fc677f-pqzsz\" (UID: \"a40ada2e-c162-4cdc-a530-e6708fccae5c\") " pod="openstack/dnsmasq-dns-6bb4fc677f-pqzsz" Sep 30 07:51:33 crc kubenswrapper[4760]: I0930 07:51:33.886451 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9wwpc\" (UniqueName: \"kubernetes.io/projected/40978ac5-870c-4273-8814-6c735435ca09-kube-api-access-9wwpc\") pod 
\"cinder-api-0\" (UID: \"40978ac5-870c-4273-8814-6c735435ca09\") " pod="openstack/cinder-api-0" Sep 30 07:51:33 crc kubenswrapper[4760]: I0930 07:51:33.886472 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/40978ac5-870c-4273-8814-6c735435ca09-etc-machine-id\") pod \"cinder-api-0\" (UID: \"40978ac5-870c-4273-8814-6c735435ca09\") " pod="openstack/cinder-api-0" Sep 30 07:51:33 crc kubenswrapper[4760]: I0930 07:51:33.886522 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a40ada2e-c162-4cdc-a530-e6708fccae5c-ovsdbserver-nb\") pod \"dnsmasq-dns-6bb4fc677f-pqzsz\" (UID: \"a40ada2e-c162-4cdc-a530-e6708fccae5c\") " pod="openstack/dnsmasq-dns-6bb4fc677f-pqzsz" Sep 30 07:51:33 crc kubenswrapper[4760]: I0930 07:51:33.886543 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d82gs\" (UniqueName: \"kubernetes.io/projected/a40ada2e-c162-4cdc-a530-e6708fccae5c-kube-api-access-d82gs\") pod \"dnsmasq-dns-6bb4fc677f-pqzsz\" (UID: \"a40ada2e-c162-4cdc-a530-e6708fccae5c\") " pod="openstack/dnsmasq-dns-6bb4fc677f-pqzsz" Sep 30 07:51:33 crc kubenswrapper[4760]: I0930 07:51:33.886561 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/40978ac5-870c-4273-8814-6c735435ca09-config-data\") pod \"cinder-api-0\" (UID: \"40978ac5-870c-4273-8814-6c735435ca09\") " pod="openstack/cinder-api-0" Sep 30 07:51:33 crc kubenswrapper[4760]: I0930 07:51:33.886580 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/40978ac5-870c-4273-8814-6c735435ca09-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"40978ac5-870c-4273-8814-6c735435ca09\") " 
pod="openstack/cinder-api-0" Sep 30 07:51:33 crc kubenswrapper[4760]: I0930 07:51:33.886600 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a40ada2e-c162-4cdc-a530-e6708fccae5c-ovsdbserver-sb\") pod \"dnsmasq-dns-6bb4fc677f-pqzsz\" (UID: \"a40ada2e-c162-4cdc-a530-e6708fccae5c\") " pod="openstack/dnsmasq-dns-6bb4fc677f-pqzsz" Sep 30 07:51:33 crc kubenswrapper[4760]: I0930 07:51:33.886619 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a40ada2e-c162-4cdc-a530-e6708fccae5c-config\") pod \"dnsmasq-dns-6bb4fc677f-pqzsz\" (UID: \"a40ada2e-c162-4cdc-a530-e6708fccae5c\") " pod="openstack/dnsmasq-dns-6bb4fc677f-pqzsz" Sep 30 07:51:33 crc kubenswrapper[4760]: I0930 07:51:33.886653 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/40978ac5-870c-4273-8814-6c735435ca09-config-data-custom\") pod \"cinder-api-0\" (UID: \"40978ac5-870c-4273-8814-6c735435ca09\") " pod="openstack/cinder-api-0" Sep 30 07:51:33 crc kubenswrapper[4760]: I0930 07:51:33.886669 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/40978ac5-870c-4273-8814-6c735435ca09-scripts\") pod \"cinder-api-0\" (UID: \"40978ac5-870c-4273-8814-6c735435ca09\") " pod="openstack/cinder-api-0" Sep 30 07:51:33 crc kubenswrapper[4760]: I0930 07:51:33.887388 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a40ada2e-c162-4cdc-a530-e6708fccae5c-ovsdbserver-nb\") pod \"dnsmasq-dns-6bb4fc677f-pqzsz\" (UID: \"a40ada2e-c162-4cdc-a530-e6708fccae5c\") " pod="openstack/dnsmasq-dns-6bb4fc677f-pqzsz" Sep 30 07:51:33 crc kubenswrapper[4760]: I0930 07:51:33.887504 4760 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a40ada2e-c162-4cdc-a530-e6708fccae5c-ovsdbserver-sb\") pod \"dnsmasq-dns-6bb4fc677f-pqzsz\" (UID: \"a40ada2e-c162-4cdc-a530-e6708fccae5c\") " pod="openstack/dnsmasq-dns-6bb4fc677f-pqzsz" Sep 30 07:51:33 crc kubenswrapper[4760]: I0930 07:51:33.887937 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a40ada2e-c162-4cdc-a530-e6708fccae5c-config\") pod \"dnsmasq-dns-6bb4fc677f-pqzsz\" (UID: \"a40ada2e-c162-4cdc-a530-e6708fccae5c\") " pod="openstack/dnsmasq-dns-6bb4fc677f-pqzsz" Sep 30 07:51:33 crc kubenswrapper[4760]: I0930 07:51:33.890220 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/a40ada2e-c162-4cdc-a530-e6708fccae5c-dns-swift-storage-0\") pod \"dnsmasq-dns-6bb4fc677f-pqzsz\" (UID: \"a40ada2e-c162-4cdc-a530-e6708fccae5c\") " pod="openstack/dnsmasq-dns-6bb4fc677f-pqzsz" Sep 30 07:51:33 crc kubenswrapper[4760]: I0930 07:51:33.890967 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a40ada2e-c162-4cdc-a530-e6708fccae5c-dns-svc\") pod \"dnsmasq-dns-6bb4fc677f-pqzsz\" (UID: \"a40ada2e-c162-4cdc-a530-e6708fccae5c\") " pod="openstack/dnsmasq-dns-6bb4fc677f-pqzsz" Sep 30 07:51:33 crc kubenswrapper[4760]: I0930 07:51:33.910932 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d82gs\" (UniqueName: \"kubernetes.io/projected/a40ada2e-c162-4cdc-a530-e6708fccae5c-kube-api-access-d82gs\") pod \"dnsmasq-dns-6bb4fc677f-pqzsz\" (UID: \"a40ada2e-c162-4cdc-a530-e6708fccae5c\") " pod="openstack/dnsmasq-dns-6bb4fc677f-pqzsz" Sep 30 07:51:33 crc kubenswrapper[4760]: I0930 07:51:33.922547 4760 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Sep 30 07:51:33 crc kubenswrapper[4760]: I0930 07:51:33.987871 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/40978ac5-870c-4273-8814-6c735435ca09-logs\") pod \"cinder-api-0\" (UID: \"40978ac5-870c-4273-8814-6c735435ca09\") " pod="openstack/cinder-api-0" Sep 30 07:51:33 crc kubenswrapper[4760]: I0930 07:51:33.988024 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9wwpc\" (UniqueName: \"kubernetes.io/projected/40978ac5-870c-4273-8814-6c735435ca09-kube-api-access-9wwpc\") pod \"cinder-api-0\" (UID: \"40978ac5-870c-4273-8814-6c735435ca09\") " pod="openstack/cinder-api-0" Sep 30 07:51:33 crc kubenswrapper[4760]: I0930 07:51:33.988052 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/40978ac5-870c-4273-8814-6c735435ca09-etc-machine-id\") pod \"cinder-api-0\" (UID: \"40978ac5-870c-4273-8814-6c735435ca09\") " pod="openstack/cinder-api-0" Sep 30 07:51:33 crc kubenswrapper[4760]: I0930 07:51:33.988147 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/40978ac5-870c-4273-8814-6c735435ca09-config-data\") pod \"cinder-api-0\" (UID: \"40978ac5-870c-4273-8814-6c735435ca09\") " pod="openstack/cinder-api-0" Sep 30 07:51:33 crc kubenswrapper[4760]: I0930 07:51:33.988193 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/40978ac5-870c-4273-8814-6c735435ca09-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"40978ac5-870c-4273-8814-6c735435ca09\") " pod="openstack/cinder-api-0" Sep 30 07:51:33 crc kubenswrapper[4760]: I0930 07:51:33.988279 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"config-data-custom\" (UniqueName: \"kubernetes.io/secret/40978ac5-870c-4273-8814-6c735435ca09-config-data-custom\") pod \"cinder-api-0\" (UID: \"40978ac5-870c-4273-8814-6c735435ca09\") " pod="openstack/cinder-api-0" Sep 30 07:51:33 crc kubenswrapper[4760]: I0930 07:51:33.988320 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/40978ac5-870c-4273-8814-6c735435ca09-scripts\") pod \"cinder-api-0\" (UID: \"40978ac5-870c-4273-8814-6c735435ca09\") " pod="openstack/cinder-api-0" Sep 30 07:51:33 crc kubenswrapper[4760]: I0930 07:51:33.988965 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/40978ac5-870c-4273-8814-6c735435ca09-logs\") pod \"cinder-api-0\" (UID: \"40978ac5-870c-4273-8814-6c735435ca09\") " pod="openstack/cinder-api-0" Sep 30 07:51:33 crc kubenswrapper[4760]: I0930 07:51:33.989612 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/40978ac5-870c-4273-8814-6c735435ca09-etc-machine-id\") pod \"cinder-api-0\" (UID: \"40978ac5-870c-4273-8814-6c735435ca09\") " pod="openstack/cinder-api-0" Sep 30 07:51:33 crc kubenswrapper[4760]: I0930 07:51:33.993660 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/40978ac5-870c-4273-8814-6c735435ca09-scripts\") pod \"cinder-api-0\" (UID: \"40978ac5-870c-4273-8814-6c735435ca09\") " pod="openstack/cinder-api-0" Sep 30 07:51:33 crc kubenswrapper[4760]: I0930 07:51:33.993879 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/40978ac5-870c-4273-8814-6c735435ca09-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"40978ac5-870c-4273-8814-6c735435ca09\") " pod="openstack/cinder-api-0" Sep 30 07:51:33 crc kubenswrapper[4760]: I0930 07:51:33.998193 4760 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/40978ac5-870c-4273-8814-6c735435ca09-config-data\") pod \"cinder-api-0\" (UID: \"40978ac5-870c-4273-8814-6c735435ca09\") " pod="openstack/cinder-api-0" Sep 30 07:51:33 crc kubenswrapper[4760]: I0930 07:51:33.998683 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/40978ac5-870c-4273-8814-6c735435ca09-config-data-custom\") pod \"cinder-api-0\" (UID: \"40978ac5-870c-4273-8814-6c735435ca09\") " pod="openstack/cinder-api-0" Sep 30 07:51:34 crc kubenswrapper[4760]: I0930 07:51:34.008971 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9wwpc\" (UniqueName: \"kubernetes.io/projected/40978ac5-870c-4273-8814-6c735435ca09-kube-api-access-9wwpc\") pod \"cinder-api-0\" (UID: \"40978ac5-870c-4273-8814-6c735435ca09\") " pod="openstack/cinder-api-0" Sep 30 07:51:34 crc kubenswrapper[4760]: I0930 07:51:34.181647 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6bb4fc677f-pqzsz" Sep 30 07:51:34 crc kubenswrapper[4760]: I0930 07:51:34.216868 4760 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Sep 30 07:51:34 crc kubenswrapper[4760]: I0930 07:51:34.474613 4760 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="5bf1a6da-91bd-4965-b511-66774fa5d7d2" containerName="ceilometer-central-agent" containerID="cri-o://6e6daf0b99936ad2a997f298b22fd0469ba8a863b548504377c4fd353a10ea7e" gracePeriod=30 Sep 30 07:51:34 crc kubenswrapper[4760]: I0930 07:51:34.474699 4760 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="5bf1a6da-91bd-4965-b511-66774fa5d7d2" containerName="proxy-httpd" containerID="cri-o://6f24997f6d9d281f8730a205bd92e7b233a5835bc2fe925b93498d5b85784638" gracePeriod=30 Sep 30 07:51:34 crc kubenswrapper[4760]: I0930 07:51:34.474732 4760 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="5bf1a6da-91bd-4965-b511-66774fa5d7d2" containerName="sg-core" containerID="cri-o://3cc78b8683e481aa50d7fca9ab14f0b7c4528df03e45a92633a2b6bf375cc553" gracePeriod=30 Sep 30 07:51:34 crc kubenswrapper[4760]: I0930 07:51:34.474761 4760 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="5bf1a6da-91bd-4965-b511-66774fa5d7d2" containerName="ceilometer-notification-agent" containerID="cri-o://1f9df15fbf107b05d5c67eee6f78fc4edab2db34feca48802226d8a4ed835b25" gracePeriod=30 Sep 30 07:51:34 crc kubenswrapper[4760]: I0930 07:51:34.475014 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5bf1a6da-91bd-4965-b511-66774fa5d7d2","Type":"ContainerStarted","Data":"6f24997f6d9d281f8730a205bd92e7b233a5835bc2fe925b93498d5b85784638"} Sep 30 07:51:34 crc kubenswrapper[4760]: I0930 07:51:34.481964 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Sep 30 07:51:34 crc kubenswrapper[4760]: W0930 07:51:34.498404 4760 manager.go:1169] Failed 
to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5c13e0bb_80b7_4846_b26c_5c60b234d6fa.slice/crio-fe2eccf082a658deeb7f89306c40e01fb0ae73a5504dff438b4b149a18384cd7 WatchSource:0}: Error finding container fe2eccf082a658deeb7f89306c40e01fb0ae73a5504dff438b4b149a18384cd7: Status 404 returned error can't find the container with id fe2eccf082a658deeb7f89306c40e01fb0ae73a5504dff438b4b149a18384cd7 Sep 30 07:51:34 crc kubenswrapper[4760]: I0930 07:51:34.534096 4760 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.133764958 podStartE2EDuration="11.534064434s" podCreationTimestamp="2025-09-30 07:51:23 +0000 UTC" firstStartedPulling="2025-09-30 07:51:24.21147359 +0000 UTC m=+1069.854380002" lastFinishedPulling="2025-09-30 07:51:33.611773066 +0000 UTC m=+1079.254679478" observedRunningTime="2025-09-30 07:51:34.503653119 +0000 UTC m=+1080.146559531" watchObservedRunningTime="2025-09-30 07:51:34.534064434 +0000 UTC m=+1080.176970846" Sep 30 07:51:34 crc kubenswrapper[4760]: I0930 07:51:34.647361 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-79bccb96b8-8wjx5" Sep 30 07:51:34 crc kubenswrapper[4760]: I0930 07:51:34.718845 4760 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-694db87c64-qrwhp" podUID="e28d9015-ea18-4da0-bfd8-c2cc5dec4f37" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.157:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.157:8443: connect: connection refused" Sep 30 07:51:34 crc kubenswrapper[4760]: W0930 07:51:34.862528 4760 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda40ada2e_c162_4cdc_a530_e6708fccae5c.slice/crio-39125463dc37a32d1c1e163c1517ee202542ce0b2382d73fa449f64e0a292794 WatchSource:0}: Error finding container 
39125463dc37a32d1c1e163c1517ee202542ce0b2382d73fa449f64e0a292794: Status 404 returned error can't find the container with id 39125463dc37a32d1c1e163c1517ee202542ce0b2382d73fa449f64e0a292794 Sep 30 07:51:34 crc kubenswrapper[4760]: I0930 07:51:34.862765 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6bb4fc677f-pqzsz"] Sep 30 07:51:35 crc kubenswrapper[4760]: E0930 07:51:35.038355 4760 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5bf1a6da_91bd_4965_b511_66774fa5d7d2.slice/crio-1f9df15fbf107b05d5c67eee6f78fc4edab2db34feca48802226d8a4ed835b25.scope\": RecentStats: unable to find data in memory cache]" Sep 30 07:51:35 crc kubenswrapper[4760]: I0930 07:51:35.430102 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Sep 30 07:51:35 crc kubenswrapper[4760]: W0930 07:51:35.432826 4760 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod40978ac5_870c_4273_8814_6c735435ca09.slice/crio-4807dcc5098f4e4f8b16f109246c0e564c9f5bc46bb6a764dc45c34b7a0beec8 WatchSource:0}: Error finding container 4807dcc5098f4e4f8b16f109246c0e564c9f5bc46bb6a764dc45c34b7a0beec8: Status 404 returned error can't find the container with id 4807dcc5098f4e4f8b16f109246c0e564c9f5bc46bb6a764dc45c34b7a0beec8 Sep 30 07:51:35 crc kubenswrapper[4760]: I0930 07:51:35.507429 4760 generic.go:334] "Generic (PLEG): container finished" podID="a40ada2e-c162-4cdc-a530-e6708fccae5c" containerID="f5a769dfc86cc49adfc7c5088d3d6f188ba956f8fbb6f1ce51f35650c51b3d4a" exitCode=0 Sep 30 07:51:35 crc kubenswrapper[4760]: I0930 07:51:35.507866 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6bb4fc677f-pqzsz" 
event={"ID":"a40ada2e-c162-4cdc-a530-e6708fccae5c","Type":"ContainerDied","Data":"f5a769dfc86cc49adfc7c5088d3d6f188ba956f8fbb6f1ce51f35650c51b3d4a"} Sep 30 07:51:35 crc kubenswrapper[4760]: I0930 07:51:35.507916 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6bb4fc677f-pqzsz" event={"ID":"a40ada2e-c162-4cdc-a530-e6708fccae5c","Type":"ContainerStarted","Data":"39125463dc37a32d1c1e163c1517ee202542ce0b2382d73fa449f64e0a292794"} Sep 30 07:51:35 crc kubenswrapper[4760]: I0930 07:51:35.514968 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-fsvjc" event={"ID":"51b8af4a-eb0e-48a7-885d-917c60d526d3","Type":"ContainerDied","Data":"ce5a203a5fe3313d00386aaafe8e3f7a706fdef3fe5f751297ca470911c99f7d"} Sep 30 07:51:35 crc kubenswrapper[4760]: I0930 07:51:35.515007 4760 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ce5a203a5fe3313d00386aaafe8e3f7a706fdef3fe5f751297ca470911c99f7d" Sep 30 07:51:35 crc kubenswrapper[4760]: I0930 07:51:35.516038 4760 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-6cjrj" Sep 30 07:51:35 crc kubenswrapper[4760]: I0930 07:51:35.546937 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-cs5kx" event={"ID":"c3eab693-7a8a-4ad2-a247-b8a79f178a87","Type":"ContainerDied","Data":"b581a0d941e092d5bd8cd078a2857c5174cde931d1530c0fc5c968537e3b71d0"} Sep 30 07:51:35 crc kubenswrapper[4760]: I0930 07:51:35.546977 4760 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b581a0d941e092d5bd8cd078a2857c5174cde931d1530c0fc5c968537e3b71d0" Sep 30 07:51:35 crc kubenswrapper[4760]: I0930 07:51:35.547555 4760 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-db-create-cs5kx" Sep 30 07:51:35 crc kubenswrapper[4760]: I0930 07:51:35.552390 4760 generic.go:334] "Generic (PLEG): container finished" podID="5bf1a6da-91bd-4965-b511-66774fa5d7d2" containerID="6f24997f6d9d281f8730a205bd92e7b233a5835bc2fe925b93498d5b85784638" exitCode=0 Sep 30 07:51:35 crc kubenswrapper[4760]: I0930 07:51:35.552415 4760 generic.go:334] "Generic (PLEG): container finished" podID="5bf1a6da-91bd-4965-b511-66774fa5d7d2" containerID="3cc78b8683e481aa50d7fca9ab14f0b7c4528df03e45a92633a2b6bf375cc553" exitCode=2 Sep 30 07:51:35 crc kubenswrapper[4760]: I0930 07:51:35.552424 4760 generic.go:334] "Generic (PLEG): container finished" podID="5bf1a6da-91bd-4965-b511-66774fa5d7d2" containerID="1f9df15fbf107b05d5c67eee6f78fc4edab2db34feca48802226d8a4ed835b25" exitCode=0 Sep 30 07:51:35 crc kubenswrapper[4760]: I0930 07:51:35.552432 4760 generic.go:334] "Generic (PLEG): container finished" podID="5bf1a6da-91bd-4965-b511-66774fa5d7d2" containerID="6e6daf0b99936ad2a997f298b22fd0469ba8a863b548504377c4fd353a10ea7e" exitCode=0 Sep 30 07:51:35 crc kubenswrapper[4760]: I0930 07:51:35.552471 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5bf1a6da-91bd-4965-b511-66774fa5d7d2","Type":"ContainerDied","Data":"6f24997f6d9d281f8730a205bd92e7b233a5835bc2fe925b93498d5b85784638"} Sep 30 07:51:35 crc kubenswrapper[4760]: I0930 07:51:35.552496 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5bf1a6da-91bd-4965-b511-66774fa5d7d2","Type":"ContainerDied","Data":"3cc78b8683e481aa50d7fca9ab14f0b7c4528df03e45a92633a2b6bf375cc553"} Sep 30 07:51:35 crc kubenswrapper[4760]: I0930 07:51:35.552509 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5bf1a6da-91bd-4965-b511-66774fa5d7d2","Type":"ContainerDied","Data":"1f9df15fbf107b05d5c67eee6f78fc4edab2db34feca48802226d8a4ed835b25"} Sep 30 07:51:35 crc 
kubenswrapper[4760]: I0930 07:51:35.552517 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5bf1a6da-91bd-4965-b511-66774fa5d7d2","Type":"ContainerDied","Data":"6e6daf0b99936ad2a997f298b22fd0469ba8a863b548504377c4fd353a10ea7e"} Sep 30 07:51:35 crc kubenswrapper[4760]: I0930 07:51:35.560839 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-6cjrj" event={"ID":"527f2de8-9fbb-4ecb-893d-ffe0db4d524b","Type":"ContainerDied","Data":"289dc1d8263b512b8b39d000a5d4cc82e237bbab1b1d41a5f0354b33f45efc42"} Sep 30 07:51:35 crc kubenswrapper[4760]: I0930 07:51:35.560874 4760 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="289dc1d8263b512b8b39d000a5d4cc82e237bbab1b1d41a5f0354b33f45efc42" Sep 30 07:51:35 crc kubenswrapper[4760]: I0930 07:51:35.560929 4760 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-6cjrj" Sep 30 07:51:35 crc kubenswrapper[4760]: I0930 07:51:35.605507 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"40978ac5-870c-4273-8814-6c735435ca09","Type":"ContainerStarted","Data":"4807dcc5098f4e4f8b16f109246c0e564c9f5bc46bb6a764dc45c34b7a0beec8"} Sep 30 07:51:35 crc kubenswrapper[4760]: I0930 07:51:35.609147 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"5c13e0bb-80b7-4846-b26c-5c60b234d6fa","Type":"ContainerStarted","Data":"fe2eccf082a658deeb7f89306c40e01fb0ae73a5504dff438b4b149a18384cd7"} Sep 30 07:51:35 crc kubenswrapper[4760]: I0930 07:51:35.618275 4760 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-db-create-fsvjc" Sep 30 07:51:35 crc kubenswrapper[4760]: I0930 07:51:35.643020 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-95x2r\" (UniqueName: \"kubernetes.io/projected/527f2de8-9fbb-4ecb-893d-ffe0db4d524b-kube-api-access-95x2r\") pod \"527f2de8-9fbb-4ecb-893d-ffe0db4d524b\" (UID: \"527f2de8-9fbb-4ecb-893d-ffe0db4d524b\") " Sep 30 07:51:35 crc kubenswrapper[4760]: I0930 07:51:35.643224 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-787qx\" (UniqueName: \"kubernetes.io/projected/c3eab693-7a8a-4ad2-a247-b8a79f178a87-kube-api-access-787qx\") pod \"c3eab693-7a8a-4ad2-a247-b8a79f178a87\" (UID: \"c3eab693-7a8a-4ad2-a247-b8a79f178a87\") " Sep 30 07:51:35 crc kubenswrapper[4760]: I0930 07:51:35.650008 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c3eab693-7a8a-4ad2-a247-b8a79f178a87-kube-api-access-787qx" (OuterVolumeSpecName: "kube-api-access-787qx") pod "c3eab693-7a8a-4ad2-a247-b8a79f178a87" (UID: "c3eab693-7a8a-4ad2-a247-b8a79f178a87"). InnerVolumeSpecName "kube-api-access-787qx". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 07:51:35 crc kubenswrapper[4760]: I0930 07:51:35.651602 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/527f2de8-9fbb-4ecb-893d-ffe0db4d524b-kube-api-access-95x2r" (OuterVolumeSpecName: "kube-api-access-95x2r") pod "527f2de8-9fbb-4ecb-893d-ffe0db4d524b" (UID: "527f2de8-9fbb-4ecb-893d-ffe0db4d524b"). InnerVolumeSpecName "kube-api-access-95x2r". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 07:51:35 crc kubenswrapper[4760]: I0930 07:51:35.697482 4760 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Sep 30 07:51:35 crc kubenswrapper[4760]: I0930 07:51:35.746118 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dlvsw\" (UniqueName: \"kubernetes.io/projected/51b8af4a-eb0e-48a7-885d-917c60d526d3-kube-api-access-dlvsw\") pod \"51b8af4a-eb0e-48a7-885d-917c60d526d3\" (UID: \"51b8af4a-eb0e-48a7-885d-917c60d526d3\") " Sep 30 07:51:35 crc kubenswrapper[4760]: I0930 07:51:35.749006 4760 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-95x2r\" (UniqueName: \"kubernetes.io/projected/527f2de8-9fbb-4ecb-893d-ffe0db4d524b-kube-api-access-95x2r\") on node \"crc\" DevicePath \"\"" Sep 30 07:51:35 crc kubenswrapper[4760]: I0930 07:51:35.749032 4760 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-787qx\" (UniqueName: \"kubernetes.io/projected/c3eab693-7a8a-4ad2-a247-b8a79f178a87-kube-api-access-787qx\") on node \"crc\" DevicePath \"\"" Sep 30 07:51:35 crc kubenswrapper[4760]: I0930 07:51:35.752509 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/51b8af4a-eb0e-48a7-885d-917c60d526d3-kube-api-access-dlvsw" (OuterVolumeSpecName: "kube-api-access-dlvsw") pod "51b8af4a-eb0e-48a7-885d-917c60d526d3" (UID: "51b8af4a-eb0e-48a7-885d-917c60d526d3"). InnerVolumeSpecName "kube-api-access-dlvsw". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 07:51:35 crc kubenswrapper[4760]: I0930 07:51:35.836505 4760 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Sep 30 07:51:35 crc kubenswrapper[4760]: I0930 07:51:35.850062 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5bf1a6da-91bd-4965-b511-66774fa5d7d2-run-httpd\") pod \"5bf1a6da-91bd-4965-b511-66774fa5d7d2\" (UID: \"5bf1a6da-91bd-4965-b511-66774fa5d7d2\") " Sep 30 07:51:35 crc kubenswrapper[4760]: I0930 07:51:35.850189 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2mmhm\" (UniqueName: \"kubernetes.io/projected/5bf1a6da-91bd-4965-b511-66774fa5d7d2-kube-api-access-2mmhm\") pod \"5bf1a6da-91bd-4965-b511-66774fa5d7d2\" (UID: \"5bf1a6da-91bd-4965-b511-66774fa5d7d2\") " Sep 30 07:51:35 crc kubenswrapper[4760]: I0930 07:51:35.850277 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5bf1a6da-91bd-4965-b511-66774fa5d7d2-log-httpd\") pod \"5bf1a6da-91bd-4965-b511-66774fa5d7d2\" (UID: \"5bf1a6da-91bd-4965-b511-66774fa5d7d2\") " Sep 30 07:51:35 crc kubenswrapper[4760]: I0930 07:51:35.850325 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5bf1a6da-91bd-4965-b511-66774fa5d7d2-scripts\") pod \"5bf1a6da-91bd-4965-b511-66774fa5d7d2\" (UID: \"5bf1a6da-91bd-4965-b511-66774fa5d7d2\") " Sep 30 07:51:35 crc kubenswrapper[4760]: I0930 07:51:35.850349 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5bf1a6da-91bd-4965-b511-66774fa5d7d2-config-data\") pod \"5bf1a6da-91bd-4965-b511-66774fa5d7d2\" (UID: \"5bf1a6da-91bd-4965-b511-66774fa5d7d2\") " Sep 30 07:51:35 crc kubenswrapper[4760]: I0930 
07:51:35.850406 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/5bf1a6da-91bd-4965-b511-66774fa5d7d2-sg-core-conf-yaml\") pod \"5bf1a6da-91bd-4965-b511-66774fa5d7d2\" (UID: \"5bf1a6da-91bd-4965-b511-66774fa5d7d2\") " Sep 30 07:51:35 crc kubenswrapper[4760]: I0930 07:51:35.850442 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5bf1a6da-91bd-4965-b511-66774fa5d7d2-combined-ca-bundle\") pod \"5bf1a6da-91bd-4965-b511-66774fa5d7d2\" (UID: \"5bf1a6da-91bd-4965-b511-66774fa5d7d2\") " Sep 30 07:51:35 crc kubenswrapper[4760]: I0930 07:51:35.850816 4760 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dlvsw\" (UniqueName: \"kubernetes.io/projected/51b8af4a-eb0e-48a7-885d-917c60d526d3-kube-api-access-dlvsw\") on node \"crc\" DevicePath \"\"" Sep 30 07:51:35 crc kubenswrapper[4760]: I0930 07:51:35.851500 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5bf1a6da-91bd-4965-b511-66774fa5d7d2-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "5bf1a6da-91bd-4965-b511-66774fa5d7d2" (UID: "5bf1a6da-91bd-4965-b511-66774fa5d7d2"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 07:51:35 crc kubenswrapper[4760]: I0930 07:51:35.853861 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5bf1a6da-91bd-4965-b511-66774fa5d7d2-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "5bf1a6da-91bd-4965-b511-66774fa5d7d2" (UID: "5bf1a6da-91bd-4965-b511-66774fa5d7d2"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 07:51:35 crc kubenswrapper[4760]: I0930 07:51:35.856180 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5bf1a6da-91bd-4965-b511-66774fa5d7d2-scripts" (OuterVolumeSpecName: "scripts") pod "5bf1a6da-91bd-4965-b511-66774fa5d7d2" (UID: "5bf1a6da-91bd-4965-b511-66774fa5d7d2"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 07:51:35 crc kubenswrapper[4760]: I0930 07:51:35.863374 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5bf1a6da-91bd-4965-b511-66774fa5d7d2-kube-api-access-2mmhm" (OuterVolumeSpecName: "kube-api-access-2mmhm") pod "5bf1a6da-91bd-4965-b511-66774fa5d7d2" (UID: "5bf1a6da-91bd-4965-b511-66774fa5d7d2"). InnerVolumeSpecName "kube-api-access-2mmhm". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 07:51:35 crc kubenswrapper[4760]: I0930 07:51:35.920825 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5bf1a6da-91bd-4965-b511-66774fa5d7d2-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "5bf1a6da-91bd-4965-b511-66774fa5d7d2" (UID: "5bf1a6da-91bd-4965-b511-66774fa5d7d2"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 07:51:35 crc kubenswrapper[4760]: I0930 07:51:35.955962 4760 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5bf1a6da-91bd-4965-b511-66774fa5d7d2-run-httpd\") on node \"crc\" DevicePath \"\"" Sep 30 07:51:35 crc kubenswrapper[4760]: I0930 07:51:35.955991 4760 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2mmhm\" (UniqueName: \"kubernetes.io/projected/5bf1a6da-91bd-4965-b511-66774fa5d7d2-kube-api-access-2mmhm\") on node \"crc\" DevicePath \"\"" Sep 30 07:51:35 crc kubenswrapper[4760]: I0930 07:51:35.956003 4760 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5bf1a6da-91bd-4965-b511-66774fa5d7d2-log-httpd\") on node \"crc\" DevicePath \"\"" Sep 30 07:51:35 crc kubenswrapper[4760]: I0930 07:51:35.956010 4760 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5bf1a6da-91bd-4965-b511-66774fa5d7d2-scripts\") on node \"crc\" DevicePath \"\"" Sep 30 07:51:35 crc kubenswrapper[4760]: I0930 07:51:35.956019 4760 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/5bf1a6da-91bd-4965-b511-66774fa5d7d2-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Sep 30 07:51:36 crc kubenswrapper[4760]: I0930 07:51:36.031213 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5bf1a6da-91bd-4965-b511-66774fa5d7d2-config-data" (OuterVolumeSpecName: "config-data") pod "5bf1a6da-91bd-4965-b511-66774fa5d7d2" (UID: "5bf1a6da-91bd-4965-b511-66774fa5d7d2"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 07:51:36 crc kubenswrapper[4760]: I0930 07:51:36.045812 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5bf1a6da-91bd-4965-b511-66774fa5d7d2-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5bf1a6da-91bd-4965-b511-66774fa5d7d2" (UID: "5bf1a6da-91bd-4965-b511-66774fa5d7d2"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 07:51:36 crc kubenswrapper[4760]: I0930 07:51:36.076510 4760 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5bf1a6da-91bd-4965-b511-66774fa5d7d2-config-data\") on node \"crc\" DevicePath \"\"" Sep 30 07:51:36 crc kubenswrapper[4760]: I0930 07:51:36.076536 4760 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5bf1a6da-91bd-4965-b511-66774fa5d7d2-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 07:51:36 crc kubenswrapper[4760]: I0930 07:51:36.656189 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"5c13e0bb-80b7-4846-b26c-5c60b234d6fa","Type":"ContainerStarted","Data":"b60c617b36bf4b007f796291f01c1bdc27578b9cf8b6758ecc2d7f62076b4a43"} Sep 30 07:51:36 crc kubenswrapper[4760]: I0930 07:51:36.661384 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6bb4fc677f-pqzsz" event={"ID":"a40ada2e-c162-4cdc-a530-e6708fccae5c","Type":"ContainerStarted","Data":"b531b3ff6b4f7f017c89b9fc274d05538564829cf62fdd550e88eca675b3088f"} Sep 30 07:51:36 crc kubenswrapper[4760]: I0930 07:51:36.661487 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-6bb4fc677f-pqzsz" Sep 30 07:51:36 crc kubenswrapper[4760]: I0930 07:51:36.671591 4760 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-db-create-cs5kx" Sep 30 07:51:36 crc kubenswrapper[4760]: I0930 07:51:36.671697 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5bf1a6da-91bd-4965-b511-66774fa5d7d2","Type":"ContainerDied","Data":"1aa70055ea804092a856344d9217b563e14f025fe4cab21f562e4864e6c2b6d9"} Sep 30 07:51:36 crc kubenswrapper[4760]: I0930 07:51:36.671769 4760 scope.go:117] "RemoveContainer" containerID="6f24997f6d9d281f8730a205bd92e7b233a5835bc2fe925b93498d5b85784638" Sep 30 07:51:36 crc kubenswrapper[4760]: I0930 07:51:36.671808 4760 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-fsvjc" Sep 30 07:51:36 crc kubenswrapper[4760]: I0930 07:51:36.671916 4760 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Sep 30 07:51:36 crc kubenswrapper[4760]: I0930 07:51:36.702524 4760 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-6bb4fc677f-pqzsz" podStartSLOduration=3.70250281 podStartE2EDuration="3.70250281s" podCreationTimestamp="2025-09-30 07:51:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 07:51:36.68444405 +0000 UTC m=+1082.327350462" watchObservedRunningTime="2025-09-30 07:51:36.70250281 +0000 UTC m=+1082.345409222" Sep 30 07:51:36 crc kubenswrapper[4760]: I0930 07:51:36.756380 4760 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Sep 30 07:51:36 crc kubenswrapper[4760]: I0930 07:51:36.770405 4760 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Sep 30 07:51:36 crc kubenswrapper[4760]: I0930 07:51:36.787259 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Sep 30 07:51:36 crc kubenswrapper[4760]: E0930 07:51:36.788023 4760 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="51b8af4a-eb0e-48a7-885d-917c60d526d3" containerName="mariadb-database-create" Sep 30 07:51:36 crc kubenswrapper[4760]: I0930 07:51:36.788041 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="51b8af4a-eb0e-48a7-885d-917c60d526d3" containerName="mariadb-database-create" Sep 30 07:51:36 crc kubenswrapper[4760]: E0930 07:51:36.788183 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5bf1a6da-91bd-4965-b511-66774fa5d7d2" containerName="sg-core" Sep 30 07:51:36 crc kubenswrapper[4760]: I0930 07:51:36.788194 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="5bf1a6da-91bd-4965-b511-66774fa5d7d2" containerName="sg-core" Sep 30 07:51:36 crc kubenswrapper[4760]: E0930 07:51:36.788209 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5bf1a6da-91bd-4965-b511-66774fa5d7d2" containerName="ceilometer-notification-agent" Sep 30 07:51:36 crc kubenswrapper[4760]: I0930 07:51:36.788451 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="5bf1a6da-91bd-4965-b511-66774fa5d7d2" containerName="ceilometer-notification-agent" Sep 30 07:51:36 crc kubenswrapper[4760]: E0930 07:51:36.788470 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c3eab693-7a8a-4ad2-a247-b8a79f178a87" containerName="mariadb-database-create" Sep 30 07:51:36 crc kubenswrapper[4760]: I0930 07:51:36.788548 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="c3eab693-7a8a-4ad2-a247-b8a79f178a87" containerName="mariadb-database-create" Sep 30 07:51:36 crc kubenswrapper[4760]: E0930 07:51:36.788668 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5bf1a6da-91bd-4965-b511-66774fa5d7d2" containerName="ceilometer-central-agent" Sep 30 07:51:36 crc kubenswrapper[4760]: I0930 07:51:36.788682 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="5bf1a6da-91bd-4965-b511-66774fa5d7d2" containerName="ceilometer-central-agent" Sep 30 07:51:36 crc kubenswrapper[4760]: 
E0930 07:51:36.788697 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5bf1a6da-91bd-4965-b511-66774fa5d7d2" containerName="proxy-httpd" Sep 30 07:51:36 crc kubenswrapper[4760]: I0930 07:51:36.788703 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="5bf1a6da-91bd-4965-b511-66774fa5d7d2" containerName="proxy-httpd" Sep 30 07:51:36 crc kubenswrapper[4760]: E0930 07:51:36.788909 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="527f2de8-9fbb-4ecb-893d-ffe0db4d524b" containerName="mariadb-database-create" Sep 30 07:51:36 crc kubenswrapper[4760]: I0930 07:51:36.788920 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="527f2de8-9fbb-4ecb-893d-ffe0db4d524b" containerName="mariadb-database-create" Sep 30 07:51:36 crc kubenswrapper[4760]: I0930 07:51:36.789277 4760 memory_manager.go:354] "RemoveStaleState removing state" podUID="5bf1a6da-91bd-4965-b511-66774fa5d7d2" containerName="proxy-httpd" Sep 30 07:51:36 crc kubenswrapper[4760]: I0930 07:51:36.789296 4760 memory_manager.go:354] "RemoveStaleState removing state" podUID="c3eab693-7a8a-4ad2-a247-b8a79f178a87" containerName="mariadb-database-create" Sep 30 07:51:36 crc kubenswrapper[4760]: I0930 07:51:36.789520 4760 memory_manager.go:354] "RemoveStaleState removing state" podUID="5bf1a6da-91bd-4965-b511-66774fa5d7d2" containerName="sg-core" Sep 30 07:51:36 crc kubenswrapper[4760]: I0930 07:51:36.789534 4760 memory_manager.go:354] "RemoveStaleState removing state" podUID="51b8af4a-eb0e-48a7-885d-917c60d526d3" containerName="mariadb-database-create" Sep 30 07:51:36 crc kubenswrapper[4760]: I0930 07:51:36.789546 4760 memory_manager.go:354] "RemoveStaleState removing state" podUID="5bf1a6da-91bd-4965-b511-66774fa5d7d2" containerName="ceilometer-central-agent" Sep 30 07:51:36 crc kubenswrapper[4760]: I0930 07:51:36.789557 4760 memory_manager.go:354] "RemoveStaleState removing state" podUID="527f2de8-9fbb-4ecb-893d-ffe0db4d524b" containerName="mariadb-database-create" 
Sep 30 07:51:36 crc kubenswrapper[4760]: I0930 07:51:36.789566 4760 memory_manager.go:354] "RemoveStaleState removing state" podUID="5bf1a6da-91bd-4965-b511-66774fa5d7d2" containerName="ceilometer-notification-agent" Sep 30 07:51:36 crc kubenswrapper[4760]: I0930 07:51:36.791982 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Sep 30 07:51:36 crc kubenswrapper[4760]: I0930 07:51:36.798688 4760 scope.go:117] "RemoveContainer" containerID="3cc78b8683e481aa50d7fca9ab14f0b7c4528df03e45a92633a2b6bf375cc553" Sep 30 07:51:36 crc kubenswrapper[4760]: I0930 07:51:36.799856 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Sep 30 07:51:36 crc kubenswrapper[4760]: I0930 07:51:36.800041 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Sep 30 07:51:36 crc kubenswrapper[4760]: I0930 07:51:36.802635 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Sep 30 07:51:36 crc kubenswrapper[4760]: I0930 07:51:36.841628 4760 scope.go:117] "RemoveContainer" containerID="1f9df15fbf107b05d5c67eee6f78fc4edab2db34feca48802226d8a4ed835b25" Sep 30 07:51:36 crc kubenswrapper[4760]: I0930 07:51:36.893346 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/833abb8f-981c-489a-b60e-294479d780d8-run-httpd\") pod \"ceilometer-0\" (UID: \"833abb8f-981c-489a-b60e-294479d780d8\") " pod="openstack/ceilometer-0" Sep 30 07:51:36 crc kubenswrapper[4760]: I0930 07:51:36.893382 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/833abb8f-981c-489a-b60e-294479d780d8-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"833abb8f-981c-489a-b60e-294479d780d8\") " pod="openstack/ceilometer-0" Sep 30 07:51:36 crc 
kubenswrapper[4760]: I0930 07:51:36.893407 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-npmbq\" (UniqueName: \"kubernetes.io/projected/833abb8f-981c-489a-b60e-294479d780d8-kube-api-access-npmbq\") pod \"ceilometer-0\" (UID: \"833abb8f-981c-489a-b60e-294479d780d8\") " pod="openstack/ceilometer-0" Sep 30 07:51:36 crc kubenswrapper[4760]: I0930 07:51:36.893435 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/833abb8f-981c-489a-b60e-294479d780d8-config-data\") pod \"ceilometer-0\" (UID: \"833abb8f-981c-489a-b60e-294479d780d8\") " pod="openstack/ceilometer-0" Sep 30 07:51:36 crc kubenswrapper[4760]: I0930 07:51:36.893460 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/833abb8f-981c-489a-b60e-294479d780d8-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"833abb8f-981c-489a-b60e-294479d780d8\") " pod="openstack/ceilometer-0" Sep 30 07:51:36 crc kubenswrapper[4760]: I0930 07:51:36.893476 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/833abb8f-981c-489a-b60e-294479d780d8-log-httpd\") pod \"ceilometer-0\" (UID: \"833abb8f-981c-489a-b60e-294479d780d8\") " pod="openstack/ceilometer-0" Sep 30 07:51:36 crc kubenswrapper[4760]: I0930 07:51:36.893503 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/833abb8f-981c-489a-b60e-294479d780d8-scripts\") pod \"ceilometer-0\" (UID: \"833abb8f-981c-489a-b60e-294479d780d8\") " pod="openstack/ceilometer-0" Sep 30 07:51:36 crc kubenswrapper[4760]: I0930 07:51:36.895658 4760 scope.go:117] "RemoveContainer" 
containerID="6e6daf0b99936ad2a997f298b22fd0469ba8a863b548504377c4fd353a10ea7e" Sep 30 07:51:36 crc kubenswrapper[4760]: I0930 07:51:36.957278 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-55ffd7b5b9-x7zhf" Sep 30 07:51:37 crc kubenswrapper[4760]: I0930 07:51:36.999606 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/833abb8f-981c-489a-b60e-294479d780d8-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"833abb8f-981c-489a-b60e-294479d780d8\") " pod="openstack/ceilometer-0" Sep 30 07:51:37 crc kubenswrapper[4760]: I0930 07:51:37.000490 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/833abb8f-981c-489a-b60e-294479d780d8-run-httpd\") pod \"ceilometer-0\" (UID: \"833abb8f-981c-489a-b60e-294479d780d8\") " pod="openstack/ceilometer-0" Sep 30 07:51:37 crc kubenswrapper[4760]: I0930 07:51:37.000551 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-npmbq\" (UniqueName: \"kubernetes.io/projected/833abb8f-981c-489a-b60e-294479d780d8-kube-api-access-npmbq\") pod \"ceilometer-0\" (UID: \"833abb8f-981c-489a-b60e-294479d780d8\") " pod="openstack/ceilometer-0" Sep 30 07:51:37 crc kubenswrapper[4760]: I0930 07:51:37.000604 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/833abb8f-981c-489a-b60e-294479d780d8-config-data\") pod \"ceilometer-0\" (UID: \"833abb8f-981c-489a-b60e-294479d780d8\") " pod="openstack/ceilometer-0" Sep 30 07:51:37 crc kubenswrapper[4760]: I0930 07:51:37.000642 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/833abb8f-981c-489a-b60e-294479d780d8-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: 
\"833abb8f-981c-489a-b60e-294479d780d8\") " pod="openstack/ceilometer-0" Sep 30 07:51:37 crc kubenswrapper[4760]: I0930 07:51:37.000685 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/833abb8f-981c-489a-b60e-294479d780d8-log-httpd\") pod \"ceilometer-0\" (UID: \"833abb8f-981c-489a-b60e-294479d780d8\") " pod="openstack/ceilometer-0" Sep 30 07:51:37 crc kubenswrapper[4760]: I0930 07:51:37.000725 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/833abb8f-981c-489a-b60e-294479d780d8-scripts\") pod \"ceilometer-0\" (UID: \"833abb8f-981c-489a-b60e-294479d780d8\") " pod="openstack/ceilometer-0" Sep 30 07:51:37 crc kubenswrapper[4760]: I0930 07:51:37.004247 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/833abb8f-981c-489a-b60e-294479d780d8-log-httpd\") pod \"ceilometer-0\" (UID: \"833abb8f-981c-489a-b60e-294479d780d8\") " pod="openstack/ceilometer-0" Sep 30 07:51:37 crc kubenswrapper[4760]: I0930 07:51:37.004575 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/833abb8f-981c-489a-b60e-294479d780d8-run-httpd\") pod \"ceilometer-0\" (UID: \"833abb8f-981c-489a-b60e-294479d780d8\") " pod="openstack/ceilometer-0" Sep 30 07:51:37 crc kubenswrapper[4760]: I0930 07:51:37.016709 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/833abb8f-981c-489a-b60e-294479d780d8-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"833abb8f-981c-489a-b60e-294479d780d8\") " pod="openstack/ceilometer-0" Sep 30 07:51:37 crc kubenswrapper[4760]: I0930 07:51:37.017625 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/833abb8f-981c-489a-b60e-294479d780d8-scripts\") pod \"ceilometer-0\" (UID: \"833abb8f-981c-489a-b60e-294479d780d8\") " pod="openstack/ceilometer-0" Sep 30 07:51:37 crc kubenswrapper[4760]: I0930 07:51:37.018795 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/833abb8f-981c-489a-b60e-294479d780d8-config-data\") pod \"ceilometer-0\" (UID: \"833abb8f-981c-489a-b60e-294479d780d8\") " pod="openstack/ceilometer-0" Sep 30 07:51:37 crc kubenswrapper[4760]: I0930 07:51:37.033492 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-npmbq\" (UniqueName: \"kubernetes.io/projected/833abb8f-981c-489a-b60e-294479d780d8-kube-api-access-npmbq\") pod \"ceilometer-0\" (UID: \"833abb8f-981c-489a-b60e-294479d780d8\") " pod="openstack/ceilometer-0" Sep 30 07:51:37 crc kubenswrapper[4760]: I0930 07:51:37.060348 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/833abb8f-981c-489a-b60e-294479d780d8-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"833abb8f-981c-489a-b60e-294479d780d8\") " pod="openstack/ceilometer-0" Sep 30 07:51:37 crc kubenswrapper[4760]: I0930 07:51:37.100900 4760 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5bf1a6da-91bd-4965-b511-66774fa5d7d2" path="/var/lib/kubelet/pods/5bf1a6da-91bd-4965-b511-66774fa5d7d2/volumes" Sep 30 07:51:37 crc kubenswrapper[4760]: I0930 07:51:37.101850 4760 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-79bccb96b8-8wjx5"] Sep 30 07:51:37 crc kubenswrapper[4760]: I0930 07:51:37.102123 4760 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-79bccb96b8-8wjx5" podUID="f3ec89e4-64de-42eb-959a-c064d716f0f3" containerName="neutron-api" containerID="cri-o://6b934f7a242cca97ceaa8b52af243c4cd67feceb3bc8fcfd262e5081f8a11717" gracePeriod=30 Sep 30 
07:51:37 crc kubenswrapper[4760]: I0930 07:51:37.102585 4760 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-79bccb96b8-8wjx5" podUID="f3ec89e4-64de-42eb-959a-c064d716f0f3" containerName="neutron-httpd" containerID="cri-o://6ebf877d72f8fe806749d8c4fb727fcb9bbcc7b73819d031594b52f988949d46" gracePeriod=30
Sep 30 07:51:37 crc kubenswrapper[4760]: I0930 07:51:37.147068 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Sep 30 07:51:37 crc kubenswrapper[4760]: I0930 07:51:37.665538 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Sep 30 07:51:37 crc kubenswrapper[4760]: I0930 07:51:37.698537 4760 generic.go:334] "Generic (PLEG): container finished" podID="f3ec89e4-64de-42eb-959a-c064d716f0f3" containerID="6ebf877d72f8fe806749d8c4fb727fcb9bbcc7b73819d031594b52f988949d46" exitCode=0
Sep 30 07:51:37 crc kubenswrapper[4760]: I0930 07:51:37.698598 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-79bccb96b8-8wjx5" event={"ID":"f3ec89e4-64de-42eb-959a-c064d716f0f3","Type":"ContainerDied","Data":"6ebf877d72f8fe806749d8c4fb727fcb9bbcc7b73819d031594b52f988949d46"}
Sep 30 07:51:37 crc kubenswrapper[4760]: I0930 07:51:37.700228 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"40978ac5-870c-4273-8814-6c735435ca09","Type":"ContainerStarted","Data":"6b46cedfeed3f825bd3f1275c8c2908927493d56868f67bff2c999e1dbd9c1c5"}
Sep 30 07:51:37 crc kubenswrapper[4760]: I0930 07:51:37.700255 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"40978ac5-870c-4273-8814-6c735435ca09","Type":"ContainerStarted","Data":"061372a9fcd93827a2752dbcf01a54dd7e453ef744896f6a50abbf06dbd70c83"}
Sep 30 07:51:37 crc kubenswrapper[4760]: I0930 07:51:37.700428 4760 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="40978ac5-870c-4273-8814-6c735435ca09" containerName="cinder-api-log" containerID="cri-o://6b46cedfeed3f825bd3f1275c8c2908927493d56868f67bff2c999e1dbd9c1c5" gracePeriod=30
Sep 30 07:51:37 crc kubenswrapper[4760]: I0930 07:51:37.700685 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0"
Sep 30 07:51:37 crc kubenswrapper[4760]: I0930 07:51:37.700918 4760 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="40978ac5-870c-4273-8814-6c735435ca09" containerName="cinder-api" containerID="cri-o://061372a9fcd93827a2752dbcf01a54dd7e453ef744896f6a50abbf06dbd70c83" gracePeriod=30
Sep 30 07:51:37 crc kubenswrapper[4760]: I0930 07:51:37.703050 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"5c13e0bb-80b7-4846-b26c-5c60b234d6fa","Type":"ContainerStarted","Data":"3917dcda16f3ff66385559764fbb9f61e9e85cc4e3bdee25550c68b52bbb1056"}
Sep 30 07:51:37 crc kubenswrapper[4760]: I0930 07:51:37.712021 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"833abb8f-981c-489a-b60e-294479d780d8","Type":"ContainerStarted","Data":"4c7ecde806c27c25736f98a82044af1c94dc7c89ae6d8d636289b48b7c40b56b"}
Sep 30 07:51:37 crc kubenswrapper[4760]: I0930 07:51:37.721637 4760 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=4.721615865 podStartE2EDuration="4.721615865s" podCreationTimestamp="2025-09-30 07:51:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 07:51:37.717092639 +0000 UTC m=+1083.359999051" watchObservedRunningTime="2025-09-30 07:51:37.721615865 +0000 UTC m=+1083.364522277"
Sep 30 07:51:37 crc kubenswrapper[4760]: I0930 07:51:37.738518 4760 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=3.943204102 podStartE2EDuration="4.738495885s" podCreationTimestamp="2025-09-30 07:51:33 +0000 UTC" firstStartedPulling="2025-09-30 07:51:34.505122216 +0000 UTC m=+1080.148028628" lastFinishedPulling="2025-09-30 07:51:35.300413999 +0000 UTC m=+1080.943320411" observedRunningTime="2025-09-30 07:51:37.736117104 +0000 UTC m=+1083.379023516" watchObservedRunningTime="2025-09-30 07:51:37.738495885 +0000 UTC m=+1083.381402297"
Sep 30 07:51:38 crc kubenswrapper[4760]: I0930 07:51:38.721613 4760 generic.go:334] "Generic (PLEG): container finished" podID="40978ac5-870c-4273-8814-6c735435ca09" containerID="6b46cedfeed3f825bd3f1275c8c2908927493d56868f67bff2c999e1dbd9c1c5" exitCode=143
Sep 30 07:51:38 crc kubenswrapper[4760]: I0930 07:51:38.721714 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"40978ac5-870c-4273-8814-6c735435ca09","Type":"ContainerDied","Data":"6b46cedfeed3f825bd3f1275c8c2908927493d56868f67bff2c999e1dbd9c1c5"}
Sep 30 07:51:38 crc kubenswrapper[4760]: I0930 07:51:38.723665 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"833abb8f-981c-489a-b60e-294479d780d8","Type":"ContainerStarted","Data":"6d119f473a0247e9326baf6deeae15b8826c596bae99eaa9c536db192617ec8b"}
Sep 30 07:51:38 crc kubenswrapper[4760]: I0930 07:51:38.923625 4760 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0"
Sep 30 07:51:39 crc kubenswrapper[4760]: I0930 07:51:39.516529 4760 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"]
Sep 30 07:51:39 crc kubenswrapper[4760]: I0930 07:51:39.516978 4760 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="dec298fa-4de5-4a26-bc21-409707df4ddb" containerName="glance-log" containerID="cri-o://593e37b4d6f18506afadd01f354e5c33e3943695339a78fd18717f7241e269a9" gracePeriod=30
Sep 30 07:51:39 crc kubenswrapper[4760]: I0930 07:51:39.517096 4760 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="dec298fa-4de5-4a26-bc21-409707df4ddb" containerName="glance-httpd" containerID="cri-o://0e3510ee726f0a74838e5a325ae1a1d3dbee09c20c8722d032ba23ebd4c965ae" gracePeriod=30
Sep 30 07:51:39 crc kubenswrapper[4760]: I0930 07:51:39.735184 4760 generic.go:334] "Generic (PLEG): container finished" podID="dec298fa-4de5-4a26-bc21-409707df4ddb" containerID="593e37b4d6f18506afadd01f354e5c33e3943695339a78fd18717f7241e269a9" exitCode=143
Sep 30 07:51:39 crc kubenswrapper[4760]: I0930 07:51:39.735279 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"dec298fa-4de5-4a26-bc21-409707df4ddb","Type":"ContainerDied","Data":"593e37b4d6f18506afadd01f354e5c33e3943695339a78fd18717f7241e269a9"}
Sep 30 07:51:39 crc kubenswrapper[4760]: I0930 07:51:39.738201 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"833abb8f-981c-489a-b60e-294479d780d8","Type":"ContainerStarted","Data":"3c0bc4c2e06007b390ceb727d84382eba7463a9de49c4d1c74bb4f029f3e2c8f"}
Sep 30 07:51:40 crc kubenswrapper[4760]: I0930 07:51:40.363754 4760 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Sep 30 07:51:40 crc kubenswrapper[4760]: I0930 07:51:40.377955 4760 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"]
Sep 30 07:51:40 crc kubenswrapper[4760]: I0930 07:51:40.378179 4760 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="bbf081e8-29d3-46ed-8474-e027d6c28c1d" containerName="glance-log" containerID="cri-o://495694d991a90664ce4b5d0bd317dffe03e9480518f5da9230a6675bd99314b1" gracePeriod=30
Sep 30 07:51:40 crc kubenswrapper[4760]: I0930 07:51:40.378528 4760 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="bbf081e8-29d3-46ed-8474-e027d6c28c1d" containerName="glance-httpd" containerID="cri-o://15cfa7febc327d6ae0fa0a81ca83ab7a264a545a3590f385be1ee68176fe88f2" gracePeriod=30
Sep 30 07:51:40 crc kubenswrapper[4760]: I0930 07:51:40.577749 4760 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-694db87c64-qrwhp"
Sep 30 07:51:40 crc kubenswrapper[4760]: I0930 07:51:40.665871 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/e28d9015-ea18-4da0-bfd8-c2cc5dec4f37-horizon-secret-key\") pod \"e28d9015-ea18-4da0-bfd8-c2cc5dec4f37\" (UID: \"e28d9015-ea18-4da0-bfd8-c2cc5dec4f37\") "
Sep 30 07:51:40 crc kubenswrapper[4760]: I0930 07:51:40.665971 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e28d9015-ea18-4da0-bfd8-c2cc5dec4f37-logs\") pod \"e28d9015-ea18-4da0-bfd8-c2cc5dec4f37\" (UID: \"e28d9015-ea18-4da0-bfd8-c2cc5dec4f37\") "
Sep 30 07:51:40 crc kubenswrapper[4760]: I0930 07:51:40.666031 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/e28d9015-ea18-4da0-bfd8-c2cc5dec4f37-config-data\") pod \"e28d9015-ea18-4da0-bfd8-c2cc5dec4f37\" (UID: \"e28d9015-ea18-4da0-bfd8-c2cc5dec4f37\") "
Sep 30 07:51:40 crc kubenswrapper[4760]: I0930 07:51:40.666054 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/e28d9015-ea18-4da0-bfd8-c2cc5dec4f37-horizon-tls-certs\") pod \"e28d9015-ea18-4da0-bfd8-c2cc5dec4f37\" (UID: \"e28d9015-ea18-4da0-bfd8-c2cc5dec4f37\") "
Sep 30 07:51:40 crc kubenswrapper[4760]: I0930 07:51:40.666125 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e28d9015-ea18-4da0-bfd8-c2cc5dec4f37-combined-ca-bundle\") pod \"e28d9015-ea18-4da0-bfd8-c2cc5dec4f37\" (UID: \"e28d9015-ea18-4da0-bfd8-c2cc5dec4f37\") "
Sep 30 07:51:40 crc kubenswrapper[4760]: I0930 07:51:40.666180 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e28d9015-ea18-4da0-bfd8-c2cc5dec4f37-scripts\") pod \"e28d9015-ea18-4da0-bfd8-c2cc5dec4f37\" (UID: \"e28d9015-ea18-4da0-bfd8-c2cc5dec4f37\") "
Sep 30 07:51:40 crc kubenswrapper[4760]: I0930 07:51:40.666207 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pzls6\" (UniqueName: \"kubernetes.io/projected/e28d9015-ea18-4da0-bfd8-c2cc5dec4f37-kube-api-access-pzls6\") pod \"e28d9015-ea18-4da0-bfd8-c2cc5dec4f37\" (UID: \"e28d9015-ea18-4da0-bfd8-c2cc5dec4f37\") "
Sep 30 07:51:40 crc kubenswrapper[4760]: I0930 07:51:40.666745 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e28d9015-ea18-4da0-bfd8-c2cc5dec4f37-logs" (OuterVolumeSpecName: "logs") pod "e28d9015-ea18-4da0-bfd8-c2cc5dec4f37" (UID: "e28d9015-ea18-4da0-bfd8-c2cc5dec4f37"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Sep 30 07:51:40 crc kubenswrapper[4760]: I0930 07:51:40.672683 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e28d9015-ea18-4da0-bfd8-c2cc5dec4f37-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "e28d9015-ea18-4da0-bfd8-c2cc5dec4f37" (UID: "e28d9015-ea18-4da0-bfd8-c2cc5dec4f37"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue ""
Sep 30 07:51:40 crc kubenswrapper[4760]: I0930 07:51:40.680931 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e28d9015-ea18-4da0-bfd8-c2cc5dec4f37-kube-api-access-pzls6" (OuterVolumeSpecName: "kube-api-access-pzls6") pod "e28d9015-ea18-4da0-bfd8-c2cc5dec4f37" (UID: "e28d9015-ea18-4da0-bfd8-c2cc5dec4f37"). InnerVolumeSpecName "kube-api-access-pzls6". PluginName "kubernetes.io/projected", VolumeGidValue ""
Sep 30 07:51:40 crc kubenswrapper[4760]: I0930 07:51:40.699787 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e28d9015-ea18-4da0-bfd8-c2cc5dec4f37-scripts" (OuterVolumeSpecName: "scripts") pod "e28d9015-ea18-4da0-bfd8-c2cc5dec4f37" (UID: "e28d9015-ea18-4da0-bfd8-c2cc5dec4f37"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Sep 30 07:51:40 crc kubenswrapper[4760]: I0930 07:51:40.715421 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e28d9015-ea18-4da0-bfd8-c2cc5dec4f37-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e28d9015-ea18-4da0-bfd8-c2cc5dec4f37" (UID: "e28d9015-ea18-4da0-bfd8-c2cc5dec4f37"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Sep 30 07:51:40 crc kubenswrapper[4760]: I0930 07:51:40.726636 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e28d9015-ea18-4da0-bfd8-c2cc5dec4f37-config-data" (OuterVolumeSpecName: "config-data") pod "e28d9015-ea18-4da0-bfd8-c2cc5dec4f37" (UID: "e28d9015-ea18-4da0-bfd8-c2cc5dec4f37"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Sep 30 07:51:40 crc kubenswrapper[4760]: I0930 07:51:40.768400 4760 generic.go:334] "Generic (PLEG): container finished" podID="bbf081e8-29d3-46ed-8474-e027d6c28c1d" containerID="495694d991a90664ce4b5d0bd317dffe03e9480518f5da9230a6675bd99314b1" exitCode=143
Sep 30 07:51:40 crc kubenswrapper[4760]: I0930 07:51:40.768491 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"bbf081e8-29d3-46ed-8474-e027d6c28c1d","Type":"ContainerDied","Data":"495694d991a90664ce4b5d0bd317dffe03e9480518f5da9230a6675bd99314b1"}
Sep 30 07:51:40 crc kubenswrapper[4760]: I0930 07:51:40.769469 4760 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e28d9015-ea18-4da0-bfd8-c2cc5dec4f37-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Sep 30 07:51:40 crc kubenswrapper[4760]: I0930 07:51:40.769520 4760 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e28d9015-ea18-4da0-bfd8-c2cc5dec4f37-scripts\") on node \"crc\" DevicePath \"\""
Sep 30 07:51:40 crc kubenswrapper[4760]: I0930 07:51:40.769532 4760 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pzls6\" (UniqueName: \"kubernetes.io/projected/e28d9015-ea18-4da0-bfd8-c2cc5dec4f37-kube-api-access-pzls6\") on node \"crc\" DevicePath \"\""
Sep 30 07:51:40 crc kubenswrapper[4760]: I0930 07:51:40.769544 4760 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/e28d9015-ea18-4da0-bfd8-c2cc5dec4f37-horizon-secret-key\") on node \"crc\" DevicePath \"\""
Sep 30 07:51:40 crc kubenswrapper[4760]: I0930 07:51:40.769553 4760 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e28d9015-ea18-4da0-bfd8-c2cc5dec4f37-logs\") on node \"crc\" DevicePath \"\""
Sep 30 07:51:40 crc kubenswrapper[4760]: I0930 07:51:40.769565 4760 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/e28d9015-ea18-4da0-bfd8-c2cc5dec4f37-config-data\") on node \"crc\" DevicePath \"\""
Sep 30 07:51:40 crc kubenswrapper[4760]: I0930 07:51:40.771050 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"833abb8f-981c-489a-b60e-294479d780d8","Type":"ContainerStarted","Data":"2683d62502143d345d3dd678c2041a0b4ce99f568b193663b2d524e00632e462"}
Sep 30 07:51:40 crc kubenswrapper[4760]: I0930 07:51:40.773902 4760 generic.go:334] "Generic (PLEG): container finished" podID="e28d9015-ea18-4da0-bfd8-c2cc5dec4f37" containerID="5d579c31c3d1a80d0194fb2ea0a7ddf6de3190ad92735e3f6c6c498a6b3f551b" exitCode=137
Sep 30 07:51:40 crc kubenswrapper[4760]: I0930 07:51:40.773942 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-694db87c64-qrwhp" event={"ID":"e28d9015-ea18-4da0-bfd8-c2cc5dec4f37","Type":"ContainerDied","Data":"5d579c31c3d1a80d0194fb2ea0a7ddf6de3190ad92735e3f6c6c498a6b3f551b"}
Sep 30 07:51:40 crc kubenswrapper[4760]: I0930 07:51:40.773960 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-694db87c64-qrwhp" event={"ID":"e28d9015-ea18-4da0-bfd8-c2cc5dec4f37","Type":"ContainerDied","Data":"849b2badb825e078b2b6f0b3b70a51c8064145288c1cf2da692d15a9af8c3e09"}
Sep 30 07:51:40 crc kubenswrapper[4760]: I0930 07:51:40.773999 4760 scope.go:117] "RemoveContainer" containerID="ffd16f7b799eef3e712c57eb371aed74561eea257b32f299da023db303acb61b"
Sep 30 07:51:40 crc kubenswrapper[4760]: I0930 07:51:40.774265 4760 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-694db87c64-qrwhp"
Sep 30 07:51:40 crc kubenswrapper[4760]: I0930 07:51:40.789340 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e28d9015-ea18-4da0-bfd8-c2cc5dec4f37-horizon-tls-certs" (OuterVolumeSpecName: "horizon-tls-certs") pod "e28d9015-ea18-4da0-bfd8-c2cc5dec4f37" (UID: "e28d9015-ea18-4da0-bfd8-c2cc5dec4f37"). InnerVolumeSpecName "horizon-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Sep 30 07:51:40 crc kubenswrapper[4760]: I0930 07:51:40.871697 4760 reconciler_common.go:293] "Volume detached for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/e28d9015-ea18-4da0-bfd8-c2cc5dec4f37-horizon-tls-certs\") on node \"crc\" DevicePath \"\""
Sep 30 07:51:40 crc kubenswrapper[4760]: I0930 07:51:40.958257 4760 scope.go:117] "RemoveContainer" containerID="5d579c31c3d1a80d0194fb2ea0a7ddf6de3190ad92735e3f6c6c498a6b3f551b"
Sep 30 07:51:40 crc kubenswrapper[4760]: I0930 07:51:40.997202 4760 scope.go:117] "RemoveContainer" containerID="ffd16f7b799eef3e712c57eb371aed74561eea257b32f299da023db303acb61b"
Sep 30 07:51:40 crc kubenswrapper[4760]: E0930 07:51:40.997527 4760 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ffd16f7b799eef3e712c57eb371aed74561eea257b32f299da023db303acb61b\": container with ID starting with ffd16f7b799eef3e712c57eb371aed74561eea257b32f299da023db303acb61b not found: ID does not exist" containerID="ffd16f7b799eef3e712c57eb371aed74561eea257b32f299da023db303acb61b"
Sep 30 07:51:40 crc kubenswrapper[4760]: I0930 07:51:40.997559 4760 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ffd16f7b799eef3e712c57eb371aed74561eea257b32f299da023db303acb61b"} err="failed to get container status \"ffd16f7b799eef3e712c57eb371aed74561eea257b32f299da023db303acb61b\": rpc error: code = NotFound desc = could not find container \"ffd16f7b799eef3e712c57eb371aed74561eea257b32f299da023db303acb61b\": container with ID starting with ffd16f7b799eef3e712c57eb371aed74561eea257b32f299da023db303acb61b not found: ID does not exist"
Sep 30 07:51:40 crc kubenswrapper[4760]: I0930 07:51:40.997588 4760 scope.go:117] "RemoveContainer" containerID="5d579c31c3d1a80d0194fb2ea0a7ddf6de3190ad92735e3f6c6c498a6b3f551b"
Sep 30 07:51:40 crc kubenswrapper[4760]: E0930 07:51:40.997818 4760 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5d579c31c3d1a80d0194fb2ea0a7ddf6de3190ad92735e3f6c6c498a6b3f551b\": container with ID starting with 5d579c31c3d1a80d0194fb2ea0a7ddf6de3190ad92735e3f6c6c498a6b3f551b not found: ID does not exist" containerID="5d579c31c3d1a80d0194fb2ea0a7ddf6de3190ad92735e3f6c6c498a6b3f551b"
Sep 30 07:51:40 crc kubenswrapper[4760]: I0930 07:51:40.997841 4760 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5d579c31c3d1a80d0194fb2ea0a7ddf6de3190ad92735e3f6c6c498a6b3f551b"} err="failed to get container status \"5d579c31c3d1a80d0194fb2ea0a7ddf6de3190ad92735e3f6c6c498a6b3f551b\": rpc error: code = NotFound desc = could not find container \"5d579c31c3d1a80d0194fb2ea0a7ddf6de3190ad92735e3f6c6c498a6b3f551b\": container with ID starting with 5d579c31c3d1a80d0194fb2ea0a7ddf6de3190ad92735e3f6c6c498a6b3f551b not found: ID does not exist"
Sep 30 07:51:41 crc kubenswrapper[4760]: I0930 07:51:41.110430 4760 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-694db87c64-qrwhp"]
Sep 30 07:51:41 crc kubenswrapper[4760]: I0930 07:51:41.118788 4760 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-694db87c64-qrwhp"]
Sep 30 07:51:41 crc kubenswrapper[4760]: I0930 07:51:41.799289 4760 generic.go:334] "Generic (PLEG): container finished" podID="f3ec89e4-64de-42eb-959a-c064d716f0f3" containerID="6b934f7a242cca97ceaa8b52af243c4cd67feceb3bc8fcfd262e5081f8a11717" exitCode=0
Sep 30 07:51:41 crc kubenswrapper[4760]: I0930 07:51:41.799459 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-79bccb96b8-8wjx5" event={"ID":"f3ec89e4-64de-42eb-959a-c064d716f0f3","Type":"ContainerDied","Data":"6b934f7a242cca97ceaa8b52af243c4cd67feceb3bc8fcfd262e5081f8a11717"}
Sep 30 07:51:42 crc kubenswrapper[4760]: I0930 07:51:42.380335 4760 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-79bccb96b8-8wjx5"
Sep 30 07:51:42 crc kubenswrapper[4760]: I0930 07:51:42.505791 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/f3ec89e4-64de-42eb-959a-c064d716f0f3-config\") pod \"f3ec89e4-64de-42eb-959a-c064d716f0f3\" (UID: \"f3ec89e4-64de-42eb-959a-c064d716f0f3\") "
Sep 30 07:51:42 crc kubenswrapper[4760]: I0930 07:51:42.505953 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f3ec89e4-64de-42eb-959a-c064d716f0f3-combined-ca-bundle\") pod \"f3ec89e4-64de-42eb-959a-c064d716f0f3\" (UID: \"f3ec89e4-64de-42eb-959a-c064d716f0f3\") "
Sep 30 07:51:42 crc kubenswrapper[4760]: I0930 07:51:42.506034 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/f3ec89e4-64de-42eb-959a-c064d716f0f3-ovndb-tls-certs\") pod \"f3ec89e4-64de-42eb-959a-c064d716f0f3\" (UID: \"f3ec89e4-64de-42eb-959a-c064d716f0f3\") "
Sep 30 07:51:42 crc kubenswrapper[4760]: I0930 07:51:42.506199 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-crjt5\" (UniqueName: \"kubernetes.io/projected/f3ec89e4-64de-42eb-959a-c064d716f0f3-kube-api-access-crjt5\") pod \"f3ec89e4-64de-42eb-959a-c064d716f0f3\" (UID: \"f3ec89e4-64de-42eb-959a-c064d716f0f3\") "
Sep 30 07:51:42 crc kubenswrapper[4760]: I0930 07:51:42.506254 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/f3ec89e4-64de-42eb-959a-c064d716f0f3-httpd-config\") pod \"f3ec89e4-64de-42eb-959a-c064d716f0f3\" (UID: \"f3ec89e4-64de-42eb-959a-c064d716f0f3\") "
Sep 30 07:51:42 crc kubenswrapper[4760]: I0930 07:51:42.521158 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f3ec89e4-64de-42eb-959a-c064d716f0f3-kube-api-access-crjt5" (OuterVolumeSpecName: "kube-api-access-crjt5") pod "f3ec89e4-64de-42eb-959a-c064d716f0f3" (UID: "f3ec89e4-64de-42eb-959a-c064d716f0f3"). InnerVolumeSpecName "kube-api-access-crjt5". PluginName "kubernetes.io/projected", VolumeGidValue ""
Sep 30 07:51:42 crc kubenswrapper[4760]: I0930 07:51:42.525439 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f3ec89e4-64de-42eb-959a-c064d716f0f3-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "f3ec89e4-64de-42eb-959a-c064d716f0f3" (UID: "f3ec89e4-64de-42eb-959a-c064d716f0f3"). InnerVolumeSpecName "httpd-config". PluginName "kubernetes.io/secret", VolumeGidValue ""
Sep 30 07:51:42 crc kubenswrapper[4760]: I0930 07:51:42.570294 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f3ec89e4-64de-42eb-959a-c064d716f0f3-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f3ec89e4-64de-42eb-959a-c064d716f0f3" (UID: "f3ec89e4-64de-42eb-959a-c064d716f0f3"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Sep 30 07:51:42 crc kubenswrapper[4760]: I0930 07:51:42.585818 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f3ec89e4-64de-42eb-959a-c064d716f0f3-config" (OuterVolumeSpecName: "config") pod "f3ec89e4-64de-42eb-959a-c064d716f0f3" (UID: "f3ec89e4-64de-42eb-959a-c064d716f0f3"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue ""
Sep 30 07:51:42 crc kubenswrapper[4760]: I0930 07:51:42.587966 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f3ec89e4-64de-42eb-959a-c064d716f0f3-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "f3ec89e4-64de-42eb-959a-c064d716f0f3" (UID: "f3ec89e4-64de-42eb-959a-c064d716f0f3"). InnerVolumeSpecName "ovndb-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Sep 30 07:51:42 crc kubenswrapper[4760]: I0930 07:51:42.608050 4760 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f3ec89e4-64de-42eb-959a-c064d716f0f3-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Sep 30 07:51:42 crc kubenswrapper[4760]: I0930 07:51:42.608435 4760 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/f3ec89e4-64de-42eb-959a-c064d716f0f3-ovndb-tls-certs\") on node \"crc\" DevicePath \"\""
Sep 30 07:51:42 crc kubenswrapper[4760]: I0930 07:51:42.608451 4760 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-crjt5\" (UniqueName: \"kubernetes.io/projected/f3ec89e4-64de-42eb-959a-c064d716f0f3-kube-api-access-crjt5\") on node \"crc\" DevicePath \"\""
Sep 30 07:51:42 crc kubenswrapper[4760]: I0930 07:51:42.608463 4760 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/f3ec89e4-64de-42eb-959a-c064d716f0f3-httpd-config\") on node \"crc\" DevicePath \"\""
Sep 30 07:51:42 crc kubenswrapper[4760]: I0930 07:51:42.608476 4760 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/f3ec89e4-64de-42eb-959a-c064d716f0f3-config\") on node \"crc\" DevicePath \"\""
Sep 30 07:51:42 crc kubenswrapper[4760]: I0930 07:51:42.819555 4760 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-79bccb96b8-8wjx5"
Sep 30 07:51:42 crc kubenswrapper[4760]: I0930 07:51:42.819831 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-79bccb96b8-8wjx5" event={"ID":"f3ec89e4-64de-42eb-959a-c064d716f0f3","Type":"ContainerDied","Data":"c340b3235bd6b1753de898cb671d06b4037fecac4448f26d0d563dee3140e525"}
Sep 30 07:51:42 crc kubenswrapper[4760]: I0930 07:51:42.819908 4760 scope.go:117] "RemoveContainer" containerID="6ebf877d72f8fe806749d8c4fb727fcb9bbcc7b73819d031594b52f988949d46"
Sep 30 07:51:42 crc kubenswrapper[4760]: I0930 07:51:42.826359 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"833abb8f-981c-489a-b60e-294479d780d8","Type":"ContainerStarted","Data":"127513d66ce615a75bedeef4b732dbc8ab907e782d9ee6481deee835a4d059cb"}
Sep 30 07:51:42 crc kubenswrapper[4760]: I0930 07:51:42.826575 4760 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="833abb8f-981c-489a-b60e-294479d780d8" containerName="ceilometer-central-agent" containerID="cri-o://6d119f473a0247e9326baf6deeae15b8826c596bae99eaa9c536db192617ec8b" gracePeriod=30
Sep 30 07:51:42 crc kubenswrapper[4760]: I0930 07:51:42.826915 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0"
Sep 30 07:51:42 crc kubenswrapper[4760]: I0930 07:51:42.827336 4760 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="833abb8f-981c-489a-b60e-294479d780d8" containerName="proxy-httpd" containerID="cri-o://127513d66ce615a75bedeef4b732dbc8ab907e782d9ee6481deee835a4d059cb" gracePeriod=30
Sep 30 07:51:42 crc kubenswrapper[4760]: I0930 07:51:42.827429 4760 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="833abb8f-981c-489a-b60e-294479d780d8" containerName="sg-core" containerID="cri-o://2683d62502143d345d3dd678c2041a0b4ce99f568b193663b2d524e00632e462" gracePeriod=30
Sep 30 07:51:42 crc kubenswrapper[4760]: I0930 07:51:42.827485 4760 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="833abb8f-981c-489a-b60e-294479d780d8" containerName="ceilometer-notification-agent" containerID="cri-o://3c0bc4c2e06007b390ceb727d84382eba7463a9de49c4d1c74bb4f029f3e2c8f" gracePeriod=30
Sep 30 07:51:42 crc kubenswrapper[4760]: I0930 07:51:42.832445 4760 generic.go:334] "Generic (PLEG): container finished" podID="dec298fa-4de5-4a26-bc21-409707df4ddb" containerID="0e3510ee726f0a74838e5a325ae1a1d3dbee09c20c8722d032ba23ebd4c965ae" exitCode=0
Sep 30 07:51:42 crc kubenswrapper[4760]: I0930 07:51:42.832503 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"dec298fa-4de5-4a26-bc21-409707df4ddb","Type":"ContainerDied","Data":"0e3510ee726f0a74838e5a325ae1a1d3dbee09c20c8722d032ba23ebd4c965ae"}
Sep 30 07:51:42 crc kubenswrapper[4760]: I0930 07:51:42.853443 4760 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.822628126 podStartE2EDuration="6.853284518s" podCreationTimestamp="2025-09-30 07:51:36 +0000 UTC" firstStartedPulling="2025-09-30 07:51:37.673440307 +0000 UTC m=+1083.316346729" lastFinishedPulling="2025-09-30 07:51:41.704096679 +0000 UTC m=+1087.347003121" observedRunningTime="2025-09-30 07:51:42.843287383 +0000 UTC m=+1088.486193795" watchObservedRunningTime="2025-09-30 07:51:42.853284518 +0000 UTC m=+1088.496190930"
Sep 30 07:51:42 crc kubenswrapper[4760]: I0930 07:51:42.894426 4760 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-79bccb96b8-8wjx5"]
Sep 30 07:51:42 crc kubenswrapper[4760]: I0930 07:51:42.894622 4760 scope.go:117] "RemoveContainer" containerID="6b934f7a242cca97ceaa8b52af243c4cd67feceb3bc8fcfd262e5081f8a11717"
Sep 30 07:51:42 crc kubenswrapper[4760]: I0930 07:51:42.900941 4760 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-79bccb96b8-8wjx5"]
Sep 30 07:51:43 crc kubenswrapper[4760]: I0930 07:51:43.110130 4760 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e28d9015-ea18-4da0-bfd8-c2cc5dec4f37" path="/var/lib/kubelet/pods/e28d9015-ea18-4da0-bfd8-c2cc5dec4f37/volumes"
Sep 30 07:51:43 crc kubenswrapper[4760]: I0930 07:51:43.110765 4760 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f3ec89e4-64de-42eb-959a-c064d716f0f3" path="/var/lib/kubelet/pods/f3ec89e4-64de-42eb-959a-c064d716f0f3/volumes"
Sep 30 07:51:43 crc kubenswrapper[4760]: I0930 07:51:43.176269 4760 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0"
Sep 30 07:51:43 crc kubenswrapper[4760]: I0930 07:51:43.321845 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dec298fa-4de5-4a26-bc21-409707df4ddb-combined-ca-bundle\") pod \"dec298fa-4de5-4a26-bc21-409707df4ddb\" (UID: \"dec298fa-4de5-4a26-bc21-409707df4ddb\") "
Sep 30 07:51:43 crc kubenswrapper[4760]: I0930 07:51:43.322117 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"dec298fa-4de5-4a26-bc21-409707df4ddb\" (UID: \"dec298fa-4de5-4a26-bc21-409707df4ddb\") "
Sep 30 07:51:43 crc kubenswrapper[4760]: I0930 07:51:43.322164 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/dec298fa-4de5-4a26-bc21-409707df4ddb-public-tls-certs\") pod \"dec298fa-4de5-4a26-bc21-409707df4ddb\" (UID: \"dec298fa-4de5-4a26-bc21-409707df4ddb\") "
Sep 30 07:51:43 crc kubenswrapper[4760]: I0930 07:51:43.322234 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dec298fa-4de5-4a26-bc21-409707df4ddb-config-data\") pod \"dec298fa-4de5-4a26-bc21-409707df4ddb\" (UID: \"dec298fa-4de5-4a26-bc21-409707df4ddb\") "
Sep 30 07:51:43 crc kubenswrapper[4760]: I0930 07:51:43.322282 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dec298fa-4de5-4a26-bc21-409707df4ddb-scripts\") pod \"dec298fa-4de5-4a26-bc21-409707df4ddb\" (UID: \"dec298fa-4de5-4a26-bc21-409707df4ddb\") "
Sep 30 07:51:43 crc kubenswrapper[4760]: I0930 07:51:43.322344 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dpq5h\" (UniqueName: \"kubernetes.io/projected/dec298fa-4de5-4a26-bc21-409707df4ddb-kube-api-access-dpq5h\") pod \"dec298fa-4de5-4a26-bc21-409707df4ddb\" (UID: \"dec298fa-4de5-4a26-bc21-409707df4ddb\") "
Sep 30 07:51:43 crc kubenswrapper[4760]: I0930 07:51:43.322370 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/dec298fa-4de5-4a26-bc21-409707df4ddb-httpd-run\") pod \"dec298fa-4de5-4a26-bc21-409707df4ddb\" (UID: \"dec298fa-4de5-4a26-bc21-409707df4ddb\") "
Sep 30 07:51:43 crc kubenswrapper[4760]: I0930 07:51:43.322389 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dec298fa-4de5-4a26-bc21-409707df4ddb-logs\") pod \"dec298fa-4de5-4a26-bc21-409707df4ddb\" (UID: \"dec298fa-4de5-4a26-bc21-409707df4ddb\") "
Sep 30 07:51:43 crc kubenswrapper[4760]: I0930 07:51:43.323187 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dec298fa-4de5-4a26-bc21-409707df4ddb-logs" (OuterVolumeSpecName: "logs") pod "dec298fa-4de5-4a26-bc21-409707df4ddb" (UID: "dec298fa-4de5-4a26-bc21-409707df4ddb"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Sep 30 07:51:43 crc kubenswrapper[4760]: I0930 07:51:43.323440 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dec298fa-4de5-4a26-bc21-409707df4ddb-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "dec298fa-4de5-4a26-bc21-409707df4ddb" (UID: "dec298fa-4de5-4a26-bc21-409707df4ddb"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Sep 30 07:51:43 crc kubenswrapper[4760]: I0930 07:51:43.328932 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage07-crc" (OuterVolumeSpecName: "glance") pod "dec298fa-4de5-4a26-bc21-409707df4ddb" (UID: "dec298fa-4de5-4a26-bc21-409707df4ddb"). InnerVolumeSpecName "local-storage07-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue ""
Sep 30 07:51:43 crc kubenswrapper[4760]: I0930 07:51:43.328980 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dec298fa-4de5-4a26-bc21-409707df4ddb-kube-api-access-dpq5h" (OuterVolumeSpecName: "kube-api-access-dpq5h") pod "dec298fa-4de5-4a26-bc21-409707df4ddb" (UID: "dec298fa-4de5-4a26-bc21-409707df4ddb"). InnerVolumeSpecName "kube-api-access-dpq5h". PluginName "kubernetes.io/projected", VolumeGidValue ""
Sep 30 07:51:43 crc kubenswrapper[4760]: I0930 07:51:43.347546 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dec298fa-4de5-4a26-bc21-409707df4ddb-scripts" (OuterVolumeSpecName: "scripts") pod "dec298fa-4de5-4a26-bc21-409707df4ddb" (UID: "dec298fa-4de5-4a26-bc21-409707df4ddb"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Sep 30 07:51:43 crc kubenswrapper[4760]: I0930 07:51:43.354463 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dec298fa-4de5-4a26-bc21-409707df4ddb-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "dec298fa-4de5-4a26-bc21-409707df4ddb" (UID: "dec298fa-4de5-4a26-bc21-409707df4ddb"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Sep 30 07:51:43 crc kubenswrapper[4760]: I0930 07:51:43.383471 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dec298fa-4de5-4a26-bc21-409707df4ddb-config-data" (OuterVolumeSpecName: "config-data") pod "dec298fa-4de5-4a26-bc21-409707df4ddb" (UID: "dec298fa-4de5-4a26-bc21-409707df4ddb"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Sep 30 07:51:43 crc kubenswrapper[4760]: I0930 07:51:43.418387 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dec298fa-4de5-4a26-bc21-409707df4ddb-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "dec298fa-4de5-4a26-bc21-409707df4ddb" (UID: "dec298fa-4de5-4a26-bc21-409707df4ddb"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Sep 30 07:51:43 crc kubenswrapper[4760]: I0930 07:51:43.424681 4760 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dec298fa-4de5-4a26-bc21-409707df4ddb-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Sep 30 07:51:43 crc kubenswrapper[4760]: I0930 07:51:43.424735 4760 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") on node \"crc\" "
Sep 30 07:51:43 crc kubenswrapper[4760]: I0930 07:51:43.424748 4760 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/dec298fa-4de5-4a26-bc21-409707df4ddb-public-tls-certs\") on node \"crc\" DevicePath \"\""
Sep 30 07:51:43 crc kubenswrapper[4760]: I0930 07:51:43.424757 4760 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dec298fa-4de5-4a26-bc21-409707df4ddb-config-data\") on node \"crc\" DevicePath \"\""
Sep 30 07:51:43 crc kubenswrapper[4760]: I0930 07:51:43.424765 4760 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dec298fa-4de5-4a26-bc21-409707df4ddb-scripts\") on node \"crc\" DevicePath \"\""
Sep 30 07:51:43 crc kubenswrapper[4760]: I0930 07:51:43.424774 4760 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dpq5h\" (UniqueName: \"kubernetes.io/projected/dec298fa-4de5-4a26-bc21-409707df4ddb-kube-api-access-dpq5h\") on node \"crc\" DevicePath \"\""
Sep 30 07:51:43 crc kubenswrapper[4760]: I0930 07:51:43.424791 4760 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/dec298fa-4de5-4a26-bc21-409707df4ddb-httpd-run\") on node \"crc\" DevicePath \"\""
Sep 30 07:51:43 crc kubenswrapper[4760]: I0930 07:51:43.424799 4760 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dec298fa-4de5-4a26-bc21-409707df4ddb-logs\") on node \"crc\" DevicePath \"\""
Sep 30 07:51:43 crc kubenswrapper[4760]: I0930 07:51:43.446032 4760 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage07-crc" (UniqueName: "kubernetes.io/local-volume/local-storage07-crc") on node "crc"
Sep 30 07:51:43 crc kubenswrapper[4760]: I0930 07:51:43.526782 4760 reconciler_common.go:293] "Volume detached for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") on node \"crc\" DevicePath \"\""
Sep 30 07:51:43 crc kubenswrapper[4760]: I0930 07:51:43.842707 4760 generic.go:334] "Generic (PLEG): container finished" podID="833abb8f-981c-489a-b60e-294479d780d8" containerID="127513d66ce615a75bedeef4b732dbc8ab907e782d9ee6481deee835a4d059cb" exitCode=0
Sep 30 07:51:43 crc kubenswrapper[4760]: I0930 07:51:43.843628 4760 generic.go:334] "Generic (PLEG): container finished" podID="833abb8f-981c-489a-b60e-294479d780d8" containerID="2683d62502143d345d3dd678c2041a0b4ce99f568b193663b2d524e00632e462"
exitCode=2 Sep 30 07:51:43 crc kubenswrapper[4760]: I0930 07:51:43.843695 4760 generic.go:334] "Generic (PLEG): container finished" podID="833abb8f-981c-489a-b60e-294479d780d8" containerID="3c0bc4c2e06007b390ceb727d84382eba7463a9de49c4d1c74bb4f029f3e2c8f" exitCode=0 Sep 30 07:51:43 crc kubenswrapper[4760]: I0930 07:51:43.842865 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"833abb8f-981c-489a-b60e-294479d780d8","Type":"ContainerDied","Data":"127513d66ce615a75bedeef4b732dbc8ab907e782d9ee6481deee835a4d059cb"} Sep 30 07:51:43 crc kubenswrapper[4760]: I0930 07:51:43.843868 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"833abb8f-981c-489a-b60e-294479d780d8","Type":"ContainerDied","Data":"2683d62502143d345d3dd678c2041a0b4ce99f568b193663b2d524e00632e462"} Sep 30 07:51:43 crc kubenswrapper[4760]: I0930 07:51:43.843961 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"833abb8f-981c-489a-b60e-294479d780d8","Type":"ContainerDied","Data":"3c0bc4c2e06007b390ceb727d84382eba7463a9de49c4d1c74bb4f029f3e2c8f"} Sep 30 07:51:43 crc kubenswrapper[4760]: I0930 07:51:43.847514 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"dec298fa-4de5-4a26-bc21-409707df4ddb","Type":"ContainerDied","Data":"5be6b0e6208863a2eb05dc66dc0b70f59e0f635465838ba525a8e824b7a2cabe"} Sep 30 07:51:43 crc kubenswrapper[4760]: I0930 07:51:43.847680 4760 scope.go:117] "RemoveContainer" containerID="0e3510ee726f0a74838e5a325ae1a1d3dbee09c20c8722d032ba23ebd4c965ae" Sep 30 07:51:43 crc kubenswrapper[4760]: I0930 07:51:43.847701 4760 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Sep 30 07:51:43 crc kubenswrapper[4760]: I0930 07:51:43.879708 4760 scope.go:117] "RemoveContainer" containerID="593e37b4d6f18506afadd01f354e5c33e3943695339a78fd18717f7241e269a9" Sep 30 07:51:43 crc kubenswrapper[4760]: I0930 07:51:43.890994 4760 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Sep 30 07:51:43 crc kubenswrapper[4760]: I0930 07:51:43.903343 4760 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Sep 30 07:51:43 crc kubenswrapper[4760]: I0930 07:51:43.923692 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Sep 30 07:51:43 crc kubenswrapper[4760]: E0930 07:51:43.943494 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dec298fa-4de5-4a26-bc21-409707df4ddb" containerName="glance-httpd" Sep 30 07:51:43 crc kubenswrapper[4760]: I0930 07:51:43.943549 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="dec298fa-4de5-4a26-bc21-409707df4ddb" containerName="glance-httpd" Sep 30 07:51:43 crc kubenswrapper[4760]: E0930 07:51:43.943589 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f3ec89e4-64de-42eb-959a-c064d716f0f3" containerName="neutron-httpd" Sep 30 07:51:43 crc kubenswrapper[4760]: I0930 07:51:43.943603 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="f3ec89e4-64de-42eb-959a-c064d716f0f3" containerName="neutron-httpd" Sep 30 07:51:43 crc kubenswrapper[4760]: E0930 07:51:43.943627 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dec298fa-4de5-4a26-bc21-409707df4ddb" containerName="glance-log" Sep 30 07:51:43 crc kubenswrapper[4760]: I0930 07:51:43.943643 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="dec298fa-4de5-4a26-bc21-409707df4ddb" containerName="glance-log" Sep 30 07:51:43 crc kubenswrapper[4760]: E0930 07:51:43.943762 4760 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e28d9015-ea18-4da0-bfd8-c2cc5dec4f37" containerName="horizon" Sep 30 07:51:43 crc kubenswrapper[4760]: I0930 07:51:43.943778 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="e28d9015-ea18-4da0-bfd8-c2cc5dec4f37" containerName="horizon" Sep 30 07:51:43 crc kubenswrapper[4760]: E0930 07:51:43.943870 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e28d9015-ea18-4da0-bfd8-c2cc5dec4f37" containerName="horizon-log" Sep 30 07:51:43 crc kubenswrapper[4760]: I0930 07:51:43.943887 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="e28d9015-ea18-4da0-bfd8-c2cc5dec4f37" containerName="horizon-log" Sep 30 07:51:43 crc kubenswrapper[4760]: E0930 07:51:43.943936 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f3ec89e4-64de-42eb-959a-c064d716f0f3" containerName="neutron-api" Sep 30 07:51:43 crc kubenswrapper[4760]: I0930 07:51:43.943949 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="f3ec89e4-64de-42eb-959a-c064d716f0f3" containerName="neutron-api" Sep 30 07:51:43 crc kubenswrapper[4760]: I0930 07:51:43.948120 4760 memory_manager.go:354] "RemoveStaleState removing state" podUID="f3ec89e4-64de-42eb-959a-c064d716f0f3" containerName="neutron-httpd" Sep 30 07:51:43 crc kubenswrapper[4760]: I0930 07:51:43.948198 4760 memory_manager.go:354] "RemoveStaleState removing state" podUID="dec298fa-4de5-4a26-bc21-409707df4ddb" containerName="glance-httpd" Sep 30 07:51:43 crc kubenswrapper[4760]: I0930 07:51:43.948223 4760 memory_manager.go:354] "RemoveStaleState removing state" podUID="dec298fa-4de5-4a26-bc21-409707df4ddb" containerName="glance-log" Sep 30 07:51:43 crc kubenswrapper[4760]: I0930 07:51:43.948251 4760 memory_manager.go:354] "RemoveStaleState removing state" podUID="e28d9015-ea18-4da0-bfd8-c2cc5dec4f37" containerName="horizon-log" Sep 30 07:51:43 crc kubenswrapper[4760]: I0930 07:51:43.948287 4760 memory_manager.go:354] "RemoveStaleState 
removing state" podUID="e28d9015-ea18-4da0-bfd8-c2cc5dec4f37" containerName="horizon" Sep 30 07:51:43 crc kubenswrapper[4760]: I0930 07:51:43.948368 4760 memory_manager.go:354] "RemoveStaleState removing state" podUID="f3ec89e4-64de-42eb-959a-c064d716f0f3" containerName="neutron-api" Sep 30 07:51:43 crc kubenswrapper[4760]: I0930 07:51:43.956945 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Sep 30 07:51:43 crc kubenswrapper[4760]: I0930 07:51:43.961498 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Sep 30 07:51:43 crc kubenswrapper[4760]: I0930 07:51:43.961793 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Sep 30 07:51:43 crc kubenswrapper[4760]: I0930 07:51:43.969928 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Sep 30 07:51:43 crc kubenswrapper[4760]: I0930 07:51:43.973930 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/19ddce20-85ae-4537-86f4-33a6b35fef0b-config-data\") pod \"glance-default-external-api-0\" (UID: \"19ddce20-85ae-4537-86f4-33a6b35fef0b\") " pod="openstack/glance-default-external-api-0" Sep 30 07:51:43 crc kubenswrapper[4760]: I0930 07:51:43.974048 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/19ddce20-85ae-4537-86f4-33a6b35fef0b-logs\") pod \"glance-default-external-api-0\" (UID: \"19ddce20-85ae-4537-86f4-33a6b35fef0b\") " pod="openstack/glance-default-external-api-0" Sep 30 07:51:43 crc kubenswrapper[4760]: I0930 07:51:43.974090 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage07-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-external-api-0\" (UID: \"19ddce20-85ae-4537-86f4-33a6b35fef0b\") " pod="openstack/glance-default-external-api-0" Sep 30 07:51:43 crc kubenswrapper[4760]: I0930 07:51:43.974144 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6fft2\" (UniqueName: \"kubernetes.io/projected/19ddce20-85ae-4537-86f4-33a6b35fef0b-kube-api-access-6fft2\") pod \"glance-default-external-api-0\" (UID: \"19ddce20-85ae-4537-86f4-33a6b35fef0b\") " pod="openstack/glance-default-external-api-0" Sep 30 07:51:43 crc kubenswrapper[4760]: I0930 07:51:43.974177 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/19ddce20-85ae-4537-86f4-33a6b35fef0b-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"19ddce20-85ae-4537-86f4-33a6b35fef0b\") " pod="openstack/glance-default-external-api-0" Sep 30 07:51:43 crc kubenswrapper[4760]: I0930 07:51:43.974267 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/19ddce20-85ae-4537-86f4-33a6b35fef0b-scripts\") pod \"glance-default-external-api-0\" (UID: \"19ddce20-85ae-4537-86f4-33a6b35fef0b\") " pod="openstack/glance-default-external-api-0" Sep 30 07:51:43 crc kubenswrapper[4760]: I0930 07:51:43.974358 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/19ddce20-85ae-4537-86f4-33a6b35fef0b-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"19ddce20-85ae-4537-86f4-33a6b35fef0b\") " pod="openstack/glance-default-external-api-0" Sep 30 07:51:43 crc kubenswrapper[4760]: I0930 07:51:43.974461 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/19ddce20-85ae-4537-86f4-33a6b35fef0b-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"19ddce20-85ae-4537-86f4-33a6b35fef0b\") " pod="openstack/glance-default-external-api-0" Sep 30 07:51:44 crc kubenswrapper[4760]: I0930 07:51:44.075914 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/19ddce20-85ae-4537-86f4-33a6b35fef0b-config-data\") pod \"glance-default-external-api-0\" (UID: \"19ddce20-85ae-4537-86f4-33a6b35fef0b\") " pod="openstack/glance-default-external-api-0" Sep 30 07:51:44 crc kubenswrapper[4760]: I0930 07:51:44.075991 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/19ddce20-85ae-4537-86f4-33a6b35fef0b-logs\") pod \"glance-default-external-api-0\" (UID: \"19ddce20-85ae-4537-86f4-33a6b35fef0b\") " pod="openstack/glance-default-external-api-0" Sep 30 07:51:44 crc kubenswrapper[4760]: I0930 07:51:44.076017 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-external-api-0\" (UID: \"19ddce20-85ae-4537-86f4-33a6b35fef0b\") " pod="openstack/glance-default-external-api-0" Sep 30 07:51:44 crc kubenswrapper[4760]: I0930 07:51:44.076045 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6fft2\" (UniqueName: \"kubernetes.io/projected/19ddce20-85ae-4537-86f4-33a6b35fef0b-kube-api-access-6fft2\") pod \"glance-default-external-api-0\" (UID: \"19ddce20-85ae-4537-86f4-33a6b35fef0b\") " pod="openstack/glance-default-external-api-0" Sep 30 07:51:44 crc kubenswrapper[4760]: I0930 07:51:44.076067 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: 
\"kubernetes.io/empty-dir/19ddce20-85ae-4537-86f4-33a6b35fef0b-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"19ddce20-85ae-4537-86f4-33a6b35fef0b\") " pod="openstack/glance-default-external-api-0" Sep 30 07:51:44 crc kubenswrapper[4760]: I0930 07:51:44.076115 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/19ddce20-85ae-4537-86f4-33a6b35fef0b-scripts\") pod \"glance-default-external-api-0\" (UID: \"19ddce20-85ae-4537-86f4-33a6b35fef0b\") " pod="openstack/glance-default-external-api-0" Sep 30 07:51:44 crc kubenswrapper[4760]: I0930 07:51:44.076155 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/19ddce20-85ae-4537-86f4-33a6b35fef0b-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"19ddce20-85ae-4537-86f4-33a6b35fef0b\") " pod="openstack/glance-default-external-api-0" Sep 30 07:51:44 crc kubenswrapper[4760]: I0930 07:51:44.076207 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/19ddce20-85ae-4537-86f4-33a6b35fef0b-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"19ddce20-85ae-4537-86f4-33a6b35fef0b\") " pod="openstack/glance-default-external-api-0" Sep 30 07:51:44 crc kubenswrapper[4760]: I0930 07:51:44.076573 4760 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-external-api-0\" (UID: \"19ddce20-85ae-4537-86f4-33a6b35fef0b\") device mount path \"/mnt/openstack/pv07\"" pod="openstack/glance-default-external-api-0" Sep 30 07:51:44 crc kubenswrapper[4760]: I0930 07:51:44.076625 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/19ddce20-85ae-4537-86f4-33a6b35fef0b-logs\") pod \"glance-default-external-api-0\" (UID: \"19ddce20-85ae-4537-86f4-33a6b35fef0b\") " pod="openstack/glance-default-external-api-0" Sep 30 07:51:44 crc kubenswrapper[4760]: I0930 07:51:44.076826 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/19ddce20-85ae-4537-86f4-33a6b35fef0b-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"19ddce20-85ae-4537-86f4-33a6b35fef0b\") " pod="openstack/glance-default-external-api-0" Sep 30 07:51:44 crc kubenswrapper[4760]: I0930 07:51:44.082175 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/19ddce20-85ae-4537-86f4-33a6b35fef0b-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"19ddce20-85ae-4537-86f4-33a6b35fef0b\") " pod="openstack/glance-default-external-api-0" Sep 30 07:51:44 crc kubenswrapper[4760]: I0930 07:51:44.082925 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/19ddce20-85ae-4537-86f4-33a6b35fef0b-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"19ddce20-85ae-4537-86f4-33a6b35fef0b\") " pod="openstack/glance-default-external-api-0" Sep 30 07:51:44 crc kubenswrapper[4760]: I0930 07:51:44.090742 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/19ddce20-85ae-4537-86f4-33a6b35fef0b-scripts\") pod \"glance-default-external-api-0\" (UID: \"19ddce20-85ae-4537-86f4-33a6b35fef0b\") " pod="openstack/glance-default-external-api-0" Sep 30 07:51:44 crc kubenswrapper[4760]: I0930 07:51:44.094284 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/19ddce20-85ae-4537-86f4-33a6b35fef0b-config-data\") pod \"glance-default-external-api-0\" (UID: 
\"19ddce20-85ae-4537-86f4-33a6b35fef0b\") " pod="openstack/glance-default-external-api-0" Sep 30 07:51:44 crc kubenswrapper[4760]: I0930 07:51:44.099239 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6fft2\" (UniqueName: \"kubernetes.io/projected/19ddce20-85ae-4537-86f4-33a6b35fef0b-kube-api-access-6fft2\") pod \"glance-default-external-api-0\" (UID: \"19ddce20-85ae-4537-86f4-33a6b35fef0b\") " pod="openstack/glance-default-external-api-0" Sep 30 07:51:44 crc kubenswrapper[4760]: I0930 07:51:44.126875 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-external-api-0\" (UID: \"19ddce20-85ae-4537-86f4-33a6b35fef0b\") " pod="openstack/glance-default-external-api-0" Sep 30 07:51:44 crc kubenswrapper[4760]: I0930 07:51:44.184083 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-6bb4fc677f-pqzsz" Sep 30 07:51:44 crc kubenswrapper[4760]: I0930 07:51:44.259810 4760 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-688c87cc99-2v8mj"] Sep 30 07:51:44 crc kubenswrapper[4760]: I0930 07:51:44.260033 4760 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-688c87cc99-2v8mj" podUID="c3f46a87-7a28-45d7-ad4d-98b0ce508557" containerName="dnsmasq-dns" containerID="cri-o://4a40042be6d960c45e1eefa8089a41c216ac486c34b1cb31e9864e2d389a219b" gracePeriod=10 Sep 30 07:51:44 crc kubenswrapper[4760]: I0930 07:51:44.280014 4760 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Sep 30 07:51:44 crc kubenswrapper[4760]: I0930 07:51:44.317278 4760 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Sep 30 07:51:44 crc kubenswrapper[4760]: I0930 07:51:44.391093 4760 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Sep 30 07:51:44 crc kubenswrapper[4760]: I0930 07:51:44.726194 4760 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Sep 30 07:51:44 crc kubenswrapper[4760]: I0930 07:51:44.811973 4760 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-688c87cc99-2v8mj" Sep 30 07:51:44 crc kubenswrapper[4760]: I0930 07:51:44.869099 4760 generic.go:334] "Generic (PLEG): container finished" podID="bbf081e8-29d3-46ed-8474-e027d6c28c1d" containerID="15cfa7febc327d6ae0fa0a81ca83ab7a264a545a3590f385be1ee68176fe88f2" exitCode=0 Sep 30 07:51:44 crc kubenswrapper[4760]: I0930 07:51:44.869189 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"bbf081e8-29d3-46ed-8474-e027d6c28c1d","Type":"ContainerDied","Data":"15cfa7febc327d6ae0fa0a81ca83ab7a264a545a3590f385be1ee68176fe88f2"} Sep 30 07:51:44 crc kubenswrapper[4760]: I0930 07:51:44.869225 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"bbf081e8-29d3-46ed-8474-e027d6c28c1d","Type":"ContainerDied","Data":"0e948272245e33338d045716cc962b620641c70744289219859349abb8715afa"} Sep 30 07:51:44 crc kubenswrapper[4760]: I0930 07:51:44.869226 4760 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Sep 30 07:51:44 crc kubenswrapper[4760]: I0930 07:51:44.869258 4760 scope.go:117] "RemoveContainer" containerID="15cfa7febc327d6ae0fa0a81ca83ab7a264a545a3590f385be1ee68176fe88f2" Sep 30 07:51:44 crc kubenswrapper[4760]: I0930 07:51:44.890891 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/bbf081e8-29d3-46ed-8474-e027d6c28c1d-httpd-run\") pod \"bbf081e8-29d3-46ed-8474-e027d6c28c1d\" (UID: \"bbf081e8-29d3-46ed-8474-e027d6c28c1d\") " Sep 30 07:51:44 crc kubenswrapper[4760]: I0930 07:51:44.890986 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bbf081e8-29d3-46ed-8474-e027d6c28c1d-scripts\") pod \"bbf081e8-29d3-46ed-8474-e027d6c28c1d\" (UID: \"bbf081e8-29d3-46ed-8474-e027d6c28c1d\") " Sep 30 07:51:44 crc kubenswrapper[4760]: I0930 07:51:44.891067 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"bbf081e8-29d3-46ed-8474-e027d6c28c1d\" (UID: \"bbf081e8-29d3-46ed-8474-e027d6c28c1d\") " Sep 30 07:51:44 crc kubenswrapper[4760]: I0930 07:51:44.891165 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g7pf5\" (UniqueName: \"kubernetes.io/projected/bbf081e8-29d3-46ed-8474-e027d6c28c1d-kube-api-access-g7pf5\") pod \"bbf081e8-29d3-46ed-8474-e027d6c28c1d\" (UID: \"bbf081e8-29d3-46ed-8474-e027d6c28c1d\") " Sep 30 07:51:44 crc kubenswrapper[4760]: I0930 07:51:44.891191 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/bbf081e8-29d3-46ed-8474-e027d6c28c1d-internal-tls-certs\") pod \"bbf081e8-29d3-46ed-8474-e027d6c28c1d\" (UID: \"bbf081e8-29d3-46ed-8474-e027d6c28c1d\") " Sep 30 07:51:44 crc 
kubenswrapper[4760]: I0930 07:51:44.891215 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bbf081e8-29d3-46ed-8474-e027d6c28c1d-logs\") pod \"bbf081e8-29d3-46ed-8474-e027d6c28c1d\" (UID: \"bbf081e8-29d3-46ed-8474-e027d6c28c1d\") " Sep 30 07:51:44 crc kubenswrapper[4760]: I0930 07:51:44.891248 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bbf081e8-29d3-46ed-8474-e027d6c28c1d-combined-ca-bundle\") pod \"bbf081e8-29d3-46ed-8474-e027d6c28c1d\" (UID: \"bbf081e8-29d3-46ed-8474-e027d6c28c1d\") " Sep 30 07:51:44 crc kubenswrapper[4760]: I0930 07:51:44.891265 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bbf081e8-29d3-46ed-8474-e027d6c28c1d-config-data\") pod \"bbf081e8-29d3-46ed-8474-e027d6c28c1d\" (UID: \"bbf081e8-29d3-46ed-8474-e027d6c28c1d\") " Sep 30 07:51:44 crc kubenswrapper[4760]: I0930 07:51:44.891680 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bbf081e8-29d3-46ed-8474-e027d6c28c1d-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "bbf081e8-29d3-46ed-8474-e027d6c28c1d" (UID: "bbf081e8-29d3-46ed-8474-e027d6c28c1d"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 07:51:44 crc kubenswrapper[4760]: I0930 07:51:44.891788 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bbf081e8-29d3-46ed-8474-e027d6c28c1d-logs" (OuterVolumeSpecName: "logs") pod "bbf081e8-29d3-46ed-8474-e027d6c28c1d" (UID: "bbf081e8-29d3-46ed-8474-e027d6c28c1d"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 07:51:44 crc kubenswrapper[4760]: I0930 07:51:44.896472 4760 generic.go:334] "Generic (PLEG): container finished" podID="c3f46a87-7a28-45d7-ad4d-98b0ce508557" containerID="4a40042be6d960c45e1eefa8089a41c216ac486c34b1cb31e9864e2d389a219b" exitCode=0 Sep 30 07:51:44 crc kubenswrapper[4760]: I0930 07:51:44.896692 4760 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="5c13e0bb-80b7-4846-b26c-5c60b234d6fa" containerName="cinder-scheduler" containerID="cri-o://b60c617b36bf4b007f796291f01c1bdc27578b9cf8b6758ecc2d7f62076b4a43" gracePeriod=30 Sep 30 07:51:44 crc kubenswrapper[4760]: I0930 07:51:44.896980 4760 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-688c87cc99-2v8mj" Sep 30 07:51:44 crc kubenswrapper[4760]: I0930 07:51:44.897423 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-688c87cc99-2v8mj" event={"ID":"c3f46a87-7a28-45d7-ad4d-98b0ce508557","Type":"ContainerDied","Data":"4a40042be6d960c45e1eefa8089a41c216ac486c34b1cb31e9864e2d389a219b"} Sep 30 07:51:44 crc kubenswrapper[4760]: I0930 07:51:44.897681 4760 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="5c13e0bb-80b7-4846-b26c-5c60b234d6fa" containerName="probe" containerID="cri-o://3917dcda16f3ff66385559764fbb9f61e9e85cc4e3bdee25550c68b52bbb1056" gracePeriod=30 Sep 30 07:51:44 crc kubenswrapper[4760]: I0930 07:51:44.897453 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-688c87cc99-2v8mj" event={"ID":"c3f46a87-7a28-45d7-ad4d-98b0ce508557","Type":"ContainerDied","Data":"630c32481d53679d7f64e2c8b914c9022b001eb01a3a11b8609cb34fe07ec6a0"} Sep 30 07:51:44 crc kubenswrapper[4760]: I0930 07:51:44.905113 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/local-volume/local-storage06-crc" (OuterVolumeSpecName: "glance") pod "bbf081e8-29d3-46ed-8474-e027d6c28c1d" (UID: "bbf081e8-29d3-46ed-8474-e027d6c28c1d"). InnerVolumeSpecName "local-storage06-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Sep 30 07:51:44 crc kubenswrapper[4760]: I0930 07:51:44.906478 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bbf081e8-29d3-46ed-8474-e027d6c28c1d-scripts" (OuterVolumeSpecName: "scripts") pod "bbf081e8-29d3-46ed-8474-e027d6c28c1d" (UID: "bbf081e8-29d3-46ed-8474-e027d6c28c1d"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 07:51:44 crc kubenswrapper[4760]: I0930 07:51:44.906603 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bbf081e8-29d3-46ed-8474-e027d6c28c1d-kube-api-access-g7pf5" (OuterVolumeSpecName: "kube-api-access-g7pf5") pod "bbf081e8-29d3-46ed-8474-e027d6c28c1d" (UID: "bbf081e8-29d3-46ed-8474-e027d6c28c1d"). InnerVolumeSpecName "kube-api-access-g7pf5". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 07:51:44 crc kubenswrapper[4760]: I0930 07:51:44.911397 4760 scope.go:117] "RemoveContainer" containerID="495694d991a90664ce4b5d0bd317dffe03e9480518f5da9230a6675bd99314b1" Sep 30 07:51:44 crc kubenswrapper[4760]: I0930 07:51:44.931965 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bbf081e8-29d3-46ed-8474-e027d6c28c1d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "bbf081e8-29d3-46ed-8474-e027d6c28c1d" (UID: "bbf081e8-29d3-46ed-8474-e027d6c28c1d"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 07:51:44 crc kubenswrapper[4760]: I0930 07:51:44.951165 4760 scope.go:117] "RemoveContainer" containerID="15cfa7febc327d6ae0fa0a81ca83ab7a264a545a3590f385be1ee68176fe88f2" Sep 30 07:51:44 crc kubenswrapper[4760]: E0930 07:51:44.953121 4760 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"15cfa7febc327d6ae0fa0a81ca83ab7a264a545a3590f385be1ee68176fe88f2\": container with ID starting with 15cfa7febc327d6ae0fa0a81ca83ab7a264a545a3590f385be1ee68176fe88f2 not found: ID does not exist" containerID="15cfa7febc327d6ae0fa0a81ca83ab7a264a545a3590f385be1ee68176fe88f2" Sep 30 07:51:44 crc kubenswrapper[4760]: I0930 07:51:44.953427 4760 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"15cfa7febc327d6ae0fa0a81ca83ab7a264a545a3590f385be1ee68176fe88f2"} err="failed to get container status \"15cfa7febc327d6ae0fa0a81ca83ab7a264a545a3590f385be1ee68176fe88f2\": rpc error: code = NotFound desc = could not find container \"15cfa7febc327d6ae0fa0a81ca83ab7a264a545a3590f385be1ee68176fe88f2\": container with ID starting with 15cfa7febc327d6ae0fa0a81ca83ab7a264a545a3590f385be1ee68176fe88f2 not found: ID does not exist" Sep 30 07:51:44 crc kubenswrapper[4760]: I0930 07:51:44.953456 4760 scope.go:117] "RemoveContainer" containerID="495694d991a90664ce4b5d0bd317dffe03e9480518f5da9230a6675bd99314b1" Sep 30 07:51:44 crc kubenswrapper[4760]: E0930 07:51:44.953929 4760 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"495694d991a90664ce4b5d0bd317dffe03e9480518f5da9230a6675bd99314b1\": container with ID starting with 495694d991a90664ce4b5d0bd317dffe03e9480518f5da9230a6675bd99314b1 not found: ID does not exist" containerID="495694d991a90664ce4b5d0bd317dffe03e9480518f5da9230a6675bd99314b1" Sep 30 07:51:44 crc kubenswrapper[4760]: I0930 07:51:44.953971 
4760 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"495694d991a90664ce4b5d0bd317dffe03e9480518f5da9230a6675bd99314b1"} err="failed to get container status \"495694d991a90664ce4b5d0bd317dffe03e9480518f5da9230a6675bd99314b1\": rpc error: code = NotFound desc = could not find container \"495694d991a90664ce4b5d0bd317dffe03e9480518f5da9230a6675bd99314b1\": container with ID starting with 495694d991a90664ce4b5d0bd317dffe03e9480518f5da9230a6675bd99314b1 not found: ID does not exist" Sep 30 07:51:44 crc kubenswrapper[4760]: I0930 07:51:44.953999 4760 scope.go:117] "RemoveContainer" containerID="4a40042be6d960c45e1eefa8089a41c216ac486c34b1cb31e9864e2d389a219b" Sep 30 07:51:44 crc kubenswrapper[4760]: I0930 07:51:44.961428 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bbf081e8-29d3-46ed-8474-e027d6c28c1d-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "bbf081e8-29d3-46ed-8474-e027d6c28c1d" (UID: "bbf081e8-29d3-46ed-8474-e027d6c28c1d"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 07:51:44 crc kubenswrapper[4760]: I0930 07:51:44.970190 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bbf081e8-29d3-46ed-8474-e027d6c28c1d-config-data" (OuterVolumeSpecName: "config-data") pod "bbf081e8-29d3-46ed-8474-e027d6c28c1d" (UID: "bbf081e8-29d3-46ed-8474-e027d6c28c1d"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 07:51:44 crc kubenswrapper[4760]: I0930 07:51:44.983581 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Sep 30 07:51:44 crc kubenswrapper[4760]: I0930 07:51:44.984201 4760 scope.go:117] "RemoveContainer" containerID="8b87c0f157d670eb6185eca30b743e8ba33ecf4a6f22073bd94584171aae30bd" Sep 30 07:51:44 crc kubenswrapper[4760]: I0930 07:51:44.993326 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lz8sl\" (UniqueName: \"kubernetes.io/projected/c3f46a87-7a28-45d7-ad4d-98b0ce508557-kube-api-access-lz8sl\") pod \"c3f46a87-7a28-45d7-ad4d-98b0ce508557\" (UID: \"c3f46a87-7a28-45d7-ad4d-98b0ce508557\") " Sep 30 07:51:44 crc kubenswrapper[4760]: I0930 07:51:44.993410 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c3f46a87-7a28-45d7-ad4d-98b0ce508557-ovsdbserver-nb\") pod \"c3f46a87-7a28-45d7-ad4d-98b0ce508557\" (UID: \"c3f46a87-7a28-45d7-ad4d-98b0ce508557\") " Sep 30 07:51:44 crc kubenswrapper[4760]: I0930 07:51:44.993502 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c3f46a87-7a28-45d7-ad4d-98b0ce508557-ovsdbserver-sb\") pod \"c3f46a87-7a28-45d7-ad4d-98b0ce508557\" (UID: \"c3f46a87-7a28-45d7-ad4d-98b0ce508557\") " Sep 30 07:51:44 crc kubenswrapper[4760]: I0930 07:51:44.993534 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/c3f46a87-7a28-45d7-ad4d-98b0ce508557-dns-swift-storage-0\") pod \"c3f46a87-7a28-45d7-ad4d-98b0ce508557\" (UID: \"c3f46a87-7a28-45d7-ad4d-98b0ce508557\") " Sep 30 07:51:44 crc kubenswrapper[4760]: I0930 07:51:44.993561 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" 
(UniqueName: \"kubernetes.io/configmap/c3f46a87-7a28-45d7-ad4d-98b0ce508557-dns-svc\") pod \"c3f46a87-7a28-45d7-ad4d-98b0ce508557\" (UID: \"c3f46a87-7a28-45d7-ad4d-98b0ce508557\") " Sep 30 07:51:44 crc kubenswrapper[4760]: I0930 07:51:44.993698 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c3f46a87-7a28-45d7-ad4d-98b0ce508557-config\") pod \"c3f46a87-7a28-45d7-ad4d-98b0ce508557\" (UID: \"c3f46a87-7a28-45d7-ad4d-98b0ce508557\") " Sep 30 07:51:44 crc kubenswrapper[4760]: I0930 07:51:44.994176 4760 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bbf081e8-29d3-46ed-8474-e027d6c28c1d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 07:51:44 crc kubenswrapper[4760]: I0930 07:51:44.994199 4760 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bbf081e8-29d3-46ed-8474-e027d6c28c1d-config-data\") on node \"crc\" DevicePath \"\"" Sep 30 07:51:44 crc kubenswrapper[4760]: I0930 07:51:44.994211 4760 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/bbf081e8-29d3-46ed-8474-e027d6c28c1d-httpd-run\") on node \"crc\" DevicePath \"\"" Sep 30 07:51:44 crc kubenswrapper[4760]: I0930 07:51:44.994222 4760 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bbf081e8-29d3-46ed-8474-e027d6c28c1d-scripts\") on node \"crc\" DevicePath \"\"" Sep 30 07:51:44 crc kubenswrapper[4760]: I0930 07:51:44.994337 4760 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") on node \"crc\" " Sep 30 07:51:44 crc kubenswrapper[4760]: I0930 07:51:44.994353 4760 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g7pf5\" (UniqueName: 
\"kubernetes.io/projected/bbf081e8-29d3-46ed-8474-e027d6c28c1d-kube-api-access-g7pf5\") on node \"crc\" DevicePath \"\"" Sep 30 07:51:44 crc kubenswrapper[4760]: I0930 07:51:44.994366 4760 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/bbf081e8-29d3-46ed-8474-e027d6c28c1d-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Sep 30 07:51:44 crc kubenswrapper[4760]: I0930 07:51:44.994388 4760 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bbf081e8-29d3-46ed-8474-e027d6c28c1d-logs\") on node \"crc\" DevicePath \"\"" Sep 30 07:51:45 crc kubenswrapper[4760]: W0930 07:51:44.995719 4760 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod19ddce20_85ae_4537_86f4_33a6b35fef0b.slice/crio-bac78037532adb8021f4640cdf757802c6b84f98514d16de20644193912329a2 WatchSource:0}: Error finding container bac78037532adb8021f4640cdf757802c6b84f98514d16de20644193912329a2: Status 404 returned error can't find the container with id bac78037532adb8021f4640cdf757802c6b84f98514d16de20644193912329a2 Sep 30 07:51:45 crc kubenswrapper[4760]: I0930 07:51:44.998826 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c3f46a87-7a28-45d7-ad4d-98b0ce508557-kube-api-access-lz8sl" (OuterVolumeSpecName: "kube-api-access-lz8sl") pod "c3f46a87-7a28-45d7-ad4d-98b0ce508557" (UID: "c3f46a87-7a28-45d7-ad4d-98b0ce508557"). InnerVolumeSpecName "kube-api-access-lz8sl". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 07:51:45 crc kubenswrapper[4760]: I0930 07:51:45.017619 4760 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage06-crc" (UniqueName: "kubernetes.io/local-volume/local-storage06-crc") on node "crc" Sep 30 07:51:45 crc kubenswrapper[4760]: I0930 07:51:45.064483 4760 scope.go:117] "RemoveContainer" containerID="4a40042be6d960c45e1eefa8089a41c216ac486c34b1cb31e9864e2d389a219b" Sep 30 07:51:45 crc kubenswrapper[4760]: I0930 07:51:45.065024 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c3f46a87-7a28-45d7-ad4d-98b0ce508557-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "c3f46a87-7a28-45d7-ad4d-98b0ce508557" (UID: "c3f46a87-7a28-45d7-ad4d-98b0ce508557"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 07:51:45 crc kubenswrapper[4760]: I0930 07:51:45.065621 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c3f46a87-7a28-45d7-ad4d-98b0ce508557-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "c3f46a87-7a28-45d7-ad4d-98b0ce508557" (UID: "c3f46a87-7a28-45d7-ad4d-98b0ce508557"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 07:51:45 crc kubenswrapper[4760]: E0930 07:51:45.069452 4760 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4a40042be6d960c45e1eefa8089a41c216ac486c34b1cb31e9864e2d389a219b\": container with ID starting with 4a40042be6d960c45e1eefa8089a41c216ac486c34b1cb31e9864e2d389a219b not found: ID does not exist" containerID="4a40042be6d960c45e1eefa8089a41c216ac486c34b1cb31e9864e2d389a219b" Sep 30 07:51:45 crc kubenswrapper[4760]: I0930 07:51:45.069497 4760 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4a40042be6d960c45e1eefa8089a41c216ac486c34b1cb31e9864e2d389a219b"} err="failed to get container status \"4a40042be6d960c45e1eefa8089a41c216ac486c34b1cb31e9864e2d389a219b\": rpc error: code = NotFound desc = could not find container \"4a40042be6d960c45e1eefa8089a41c216ac486c34b1cb31e9864e2d389a219b\": container with ID starting with 4a40042be6d960c45e1eefa8089a41c216ac486c34b1cb31e9864e2d389a219b not found: ID does not exist" Sep 30 07:51:45 crc kubenswrapper[4760]: I0930 07:51:45.069525 4760 scope.go:117] "RemoveContainer" containerID="8b87c0f157d670eb6185eca30b743e8ba33ecf4a6f22073bd94584171aae30bd" Sep 30 07:51:45 crc kubenswrapper[4760]: E0930 07:51:45.071700 4760 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8b87c0f157d670eb6185eca30b743e8ba33ecf4a6f22073bd94584171aae30bd\": container with ID starting with 8b87c0f157d670eb6185eca30b743e8ba33ecf4a6f22073bd94584171aae30bd not found: ID does not exist" containerID="8b87c0f157d670eb6185eca30b743e8ba33ecf4a6f22073bd94584171aae30bd" Sep 30 07:51:45 crc kubenswrapper[4760]: I0930 07:51:45.071740 4760 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8b87c0f157d670eb6185eca30b743e8ba33ecf4a6f22073bd94584171aae30bd"} 
err="failed to get container status \"8b87c0f157d670eb6185eca30b743e8ba33ecf4a6f22073bd94584171aae30bd\": rpc error: code = NotFound desc = could not find container \"8b87c0f157d670eb6185eca30b743e8ba33ecf4a6f22073bd94584171aae30bd\": container with ID starting with 8b87c0f157d670eb6185eca30b743e8ba33ecf4a6f22073bd94584171aae30bd not found: ID does not exist" Sep 30 07:51:45 crc kubenswrapper[4760]: I0930 07:51:45.078259 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c3f46a87-7a28-45d7-ad4d-98b0ce508557-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "c3f46a87-7a28-45d7-ad4d-98b0ce508557" (UID: "c3f46a87-7a28-45d7-ad4d-98b0ce508557"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 07:51:45 crc kubenswrapper[4760]: I0930 07:51:45.084334 4760 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dec298fa-4de5-4a26-bc21-409707df4ddb" path="/var/lib/kubelet/pods/dec298fa-4de5-4a26-bc21-409707df4ddb/volumes" Sep 30 07:51:45 crc kubenswrapper[4760]: I0930 07:51:45.090962 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c3f46a87-7a28-45d7-ad4d-98b0ce508557-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "c3f46a87-7a28-45d7-ad4d-98b0ce508557" (UID: "c3f46a87-7a28-45d7-ad4d-98b0ce508557"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 07:51:45 crc kubenswrapper[4760]: I0930 07:51:45.097078 4760 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c3f46a87-7a28-45d7-ad4d-98b0ce508557-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Sep 30 07:51:45 crc kubenswrapper[4760]: I0930 07:51:45.097112 4760 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c3f46a87-7a28-45d7-ad4d-98b0ce508557-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Sep 30 07:51:45 crc kubenswrapper[4760]: I0930 07:51:45.097126 4760 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/c3f46a87-7a28-45d7-ad4d-98b0ce508557-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Sep 30 07:51:45 crc kubenswrapper[4760]: I0930 07:51:45.097137 4760 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c3f46a87-7a28-45d7-ad4d-98b0ce508557-dns-svc\") on node \"crc\" DevicePath \"\"" Sep 30 07:51:45 crc kubenswrapper[4760]: I0930 07:51:45.097149 4760 reconciler_common.go:293] "Volume detached for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") on node \"crc\" DevicePath \"\"" Sep 30 07:51:45 crc kubenswrapper[4760]: I0930 07:51:45.097159 4760 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lz8sl\" (UniqueName: \"kubernetes.io/projected/c3f46a87-7a28-45d7-ad4d-98b0ce508557-kube-api-access-lz8sl\") on node \"crc\" DevicePath \"\"" Sep 30 07:51:45 crc kubenswrapper[4760]: I0930 07:51:45.113245 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c3f46a87-7a28-45d7-ad4d-98b0ce508557-config" (OuterVolumeSpecName: "config") pod "c3f46a87-7a28-45d7-ad4d-98b0ce508557" (UID: "c3f46a87-7a28-45d7-ad4d-98b0ce508557"). 
InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 07:51:45 crc kubenswrapper[4760]: I0930 07:51:45.206340 4760 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c3f46a87-7a28-45d7-ad4d-98b0ce508557-config\") on node \"crc\" DevicePath \"\"" Sep 30 07:51:45 crc kubenswrapper[4760]: I0930 07:51:45.225344 4760 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Sep 30 07:51:45 crc kubenswrapper[4760]: I0930 07:51:45.242722 4760 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Sep 30 07:51:45 crc kubenswrapper[4760]: I0930 07:51:45.263092 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Sep 30 07:51:45 crc kubenswrapper[4760]: E0930 07:51:45.263547 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bbf081e8-29d3-46ed-8474-e027d6c28c1d" containerName="glance-httpd" Sep 30 07:51:45 crc kubenswrapper[4760]: I0930 07:51:45.263566 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="bbf081e8-29d3-46ed-8474-e027d6c28c1d" containerName="glance-httpd" Sep 30 07:51:45 crc kubenswrapper[4760]: E0930 07:51:45.263587 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c3f46a87-7a28-45d7-ad4d-98b0ce508557" containerName="init" Sep 30 07:51:45 crc kubenswrapper[4760]: I0930 07:51:45.263593 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="c3f46a87-7a28-45d7-ad4d-98b0ce508557" containerName="init" Sep 30 07:51:45 crc kubenswrapper[4760]: E0930 07:51:45.263608 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bbf081e8-29d3-46ed-8474-e027d6c28c1d" containerName="glance-log" Sep 30 07:51:45 crc kubenswrapper[4760]: I0930 07:51:45.263615 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="bbf081e8-29d3-46ed-8474-e027d6c28c1d" containerName="glance-log" Sep 30 07:51:45 
crc kubenswrapper[4760]: E0930 07:51:45.263641 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c3f46a87-7a28-45d7-ad4d-98b0ce508557" containerName="dnsmasq-dns" Sep 30 07:51:45 crc kubenswrapper[4760]: I0930 07:51:45.263646 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="c3f46a87-7a28-45d7-ad4d-98b0ce508557" containerName="dnsmasq-dns" Sep 30 07:51:45 crc kubenswrapper[4760]: I0930 07:51:45.263844 4760 memory_manager.go:354] "RemoveStaleState removing state" podUID="bbf081e8-29d3-46ed-8474-e027d6c28c1d" containerName="glance-httpd" Sep 30 07:51:45 crc kubenswrapper[4760]: I0930 07:51:45.263862 4760 memory_manager.go:354] "RemoveStaleState removing state" podUID="c3f46a87-7a28-45d7-ad4d-98b0ce508557" containerName="dnsmasq-dns" Sep 30 07:51:45 crc kubenswrapper[4760]: I0930 07:51:45.263874 4760 memory_manager.go:354] "RemoveStaleState removing state" podUID="bbf081e8-29d3-46ed-8474-e027d6c28c1d" containerName="glance-log" Sep 30 07:51:45 crc kubenswrapper[4760]: I0930 07:51:45.264870 4760 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Sep 30 07:51:45 crc kubenswrapper[4760]: I0930 07:51:45.269746 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Sep 30 07:51:45 crc kubenswrapper[4760]: I0930 07:51:45.269958 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Sep 30 07:51:45 crc kubenswrapper[4760]: I0930 07:51:45.274399 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Sep 30 07:51:45 crc kubenswrapper[4760]: I0930 07:51:45.329648 4760 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-688c87cc99-2v8mj"] Sep 30 07:51:45 crc kubenswrapper[4760]: I0930 07:51:45.360876 4760 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-688c87cc99-2v8mj"] Sep 30 07:51:45 crc kubenswrapper[4760]: E0930 07:51:45.411061 4760 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc3f46a87_7a28_45d7_ad4d_98b0ce508557.slice/crio-630c32481d53679d7f64e2c8b914c9022b001eb01a3a11b8609cb34fe07ec6a0\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbbf081e8_29d3_46ed_8474_e027d6c28c1d.slice/crio-0e948272245e33338d045716cc962b620641c70744289219859349abb8715afa\": RecentStats: unable to find data in memory cache]" Sep 30 07:51:45 crc kubenswrapper[4760]: I0930 07:51:45.424049 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/eec5d0ef-04f3-4a34-8575-45e2a88c519f-scripts\") pod \"glance-default-internal-api-0\" (UID: \"eec5d0ef-04f3-4a34-8575-45e2a88c519f\") " pod="openstack/glance-default-internal-api-0" Sep 30 07:51:45 crc kubenswrapper[4760]: 
I0930 07:51:45.424376 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-internal-api-0\" (UID: \"eec5d0ef-04f3-4a34-8575-45e2a88c519f\") " pod="openstack/glance-default-internal-api-0" Sep 30 07:51:45 crc kubenswrapper[4760]: I0930 07:51:45.424546 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/eec5d0ef-04f3-4a34-8575-45e2a88c519f-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"eec5d0ef-04f3-4a34-8575-45e2a88c519f\") " pod="openstack/glance-default-internal-api-0" Sep 30 07:51:45 crc kubenswrapper[4760]: I0930 07:51:45.425414 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/eec5d0ef-04f3-4a34-8575-45e2a88c519f-logs\") pod \"glance-default-internal-api-0\" (UID: \"eec5d0ef-04f3-4a34-8575-45e2a88c519f\") " pod="openstack/glance-default-internal-api-0" Sep 30 07:51:45 crc kubenswrapper[4760]: I0930 07:51:45.425770 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/eec5d0ef-04f3-4a34-8575-45e2a88c519f-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"eec5d0ef-04f3-4a34-8575-45e2a88c519f\") " pod="openstack/glance-default-internal-api-0" Sep 30 07:51:45 crc kubenswrapper[4760]: I0930 07:51:45.425859 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eec5d0ef-04f3-4a34-8575-45e2a88c519f-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"eec5d0ef-04f3-4a34-8575-45e2a88c519f\") " pod="openstack/glance-default-internal-api-0" Sep 30 07:51:45 crc 
kubenswrapper[4760]: I0930 07:51:45.426002 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c5hzn\" (UniqueName: \"kubernetes.io/projected/eec5d0ef-04f3-4a34-8575-45e2a88c519f-kube-api-access-c5hzn\") pod \"glance-default-internal-api-0\" (UID: \"eec5d0ef-04f3-4a34-8575-45e2a88c519f\") " pod="openstack/glance-default-internal-api-0" Sep 30 07:51:45 crc kubenswrapper[4760]: I0930 07:51:45.426578 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eec5d0ef-04f3-4a34-8575-45e2a88c519f-config-data\") pod \"glance-default-internal-api-0\" (UID: \"eec5d0ef-04f3-4a34-8575-45e2a88c519f\") " pod="openstack/glance-default-internal-api-0" Sep 30 07:51:45 crc kubenswrapper[4760]: I0930 07:51:45.528454 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/eec5d0ef-04f3-4a34-8575-45e2a88c519f-scripts\") pod \"glance-default-internal-api-0\" (UID: \"eec5d0ef-04f3-4a34-8575-45e2a88c519f\") " pod="openstack/glance-default-internal-api-0" Sep 30 07:51:45 crc kubenswrapper[4760]: I0930 07:51:45.528906 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-internal-api-0\" (UID: \"eec5d0ef-04f3-4a34-8575-45e2a88c519f\") " pod="openstack/glance-default-internal-api-0" Sep 30 07:51:45 crc kubenswrapper[4760]: I0930 07:51:45.529423 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/eec5d0ef-04f3-4a34-8575-45e2a88c519f-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"eec5d0ef-04f3-4a34-8575-45e2a88c519f\") " pod="openstack/glance-default-internal-api-0" Sep 30 07:51:45 crc kubenswrapper[4760]: I0930 
07:51:45.529772 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/eec5d0ef-04f3-4a34-8575-45e2a88c519f-logs\") pod \"glance-default-internal-api-0\" (UID: \"eec5d0ef-04f3-4a34-8575-45e2a88c519f\") " pod="openstack/glance-default-internal-api-0" Sep 30 07:51:45 crc kubenswrapper[4760]: I0930 07:51:45.530378 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/eec5d0ef-04f3-4a34-8575-45e2a88c519f-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"eec5d0ef-04f3-4a34-8575-45e2a88c519f\") " pod="openstack/glance-default-internal-api-0" Sep 30 07:51:45 crc kubenswrapper[4760]: I0930 07:51:45.530464 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eec5d0ef-04f3-4a34-8575-45e2a88c519f-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"eec5d0ef-04f3-4a34-8575-45e2a88c519f\") " pod="openstack/glance-default-internal-api-0" Sep 30 07:51:45 crc kubenswrapper[4760]: I0930 07:51:45.530541 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c5hzn\" (UniqueName: \"kubernetes.io/projected/eec5d0ef-04f3-4a34-8575-45e2a88c519f-kube-api-access-c5hzn\") pod \"glance-default-internal-api-0\" (UID: \"eec5d0ef-04f3-4a34-8575-45e2a88c519f\") " pod="openstack/glance-default-internal-api-0" Sep 30 07:51:45 crc kubenswrapper[4760]: I0930 07:51:45.530632 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eec5d0ef-04f3-4a34-8575-45e2a88c519f-config-data\") pod \"glance-default-internal-api-0\" (UID: \"eec5d0ef-04f3-4a34-8575-45e2a88c519f\") " pod="openstack/glance-default-internal-api-0" Sep 30 07:51:45 crc kubenswrapper[4760]: I0930 07:51:45.530178 4760 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/eec5d0ef-04f3-4a34-8575-45e2a88c519f-logs\") pod \"glance-default-internal-api-0\" (UID: \"eec5d0ef-04f3-4a34-8575-45e2a88c519f\") " pod="openstack/glance-default-internal-api-0" Sep 30 07:51:45 crc kubenswrapper[4760]: I0930 07:51:45.529325 4760 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-internal-api-0\" (UID: \"eec5d0ef-04f3-4a34-8575-45e2a88c519f\") device mount path \"/mnt/openstack/pv06\"" pod="openstack/glance-default-internal-api-0" Sep 30 07:51:45 crc kubenswrapper[4760]: I0930 07:51:45.532229 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/eec5d0ef-04f3-4a34-8575-45e2a88c519f-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"eec5d0ef-04f3-4a34-8575-45e2a88c519f\") " pod="openstack/glance-default-internal-api-0" Sep 30 07:51:45 crc kubenswrapper[4760]: I0930 07:51:45.533712 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/eec5d0ef-04f3-4a34-8575-45e2a88c519f-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"eec5d0ef-04f3-4a34-8575-45e2a88c519f\") " pod="openstack/glance-default-internal-api-0" Sep 30 07:51:45 crc kubenswrapper[4760]: I0930 07:51:45.537244 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/eec5d0ef-04f3-4a34-8575-45e2a88c519f-scripts\") pod \"glance-default-internal-api-0\" (UID: \"eec5d0ef-04f3-4a34-8575-45e2a88c519f\") " pod="openstack/glance-default-internal-api-0" Sep 30 07:51:45 crc kubenswrapper[4760]: I0930 07:51:45.537318 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/eec5d0ef-04f3-4a34-8575-45e2a88c519f-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"eec5d0ef-04f3-4a34-8575-45e2a88c519f\") " pod="openstack/glance-default-internal-api-0" Sep 30 07:51:45 crc kubenswrapper[4760]: I0930 07:51:45.543836 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eec5d0ef-04f3-4a34-8575-45e2a88c519f-config-data\") pod \"glance-default-internal-api-0\" (UID: \"eec5d0ef-04f3-4a34-8575-45e2a88c519f\") " pod="openstack/glance-default-internal-api-0" Sep 30 07:51:45 crc kubenswrapper[4760]: I0930 07:51:45.561900 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c5hzn\" (UniqueName: \"kubernetes.io/projected/eec5d0ef-04f3-4a34-8575-45e2a88c519f-kube-api-access-c5hzn\") pod \"glance-default-internal-api-0\" (UID: \"eec5d0ef-04f3-4a34-8575-45e2a88c519f\") " pod="openstack/glance-default-internal-api-0" Sep 30 07:51:45 crc kubenswrapper[4760]: I0930 07:51:45.575130 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-internal-api-0\" (UID: \"eec5d0ef-04f3-4a34-8575-45e2a88c519f\") " pod="openstack/glance-default-internal-api-0" Sep 30 07:51:45 crc kubenswrapper[4760]: I0930 07:51:45.663621 4760 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Sep 30 07:51:45 crc kubenswrapper[4760]: I0930 07:51:45.926200 4760 generic.go:334] "Generic (PLEG): container finished" podID="5c13e0bb-80b7-4846-b26c-5c60b234d6fa" containerID="3917dcda16f3ff66385559764fbb9f61e9e85cc4e3bdee25550c68b52bbb1056" exitCode=0 Sep 30 07:51:45 crc kubenswrapper[4760]: I0930 07:51:45.926650 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"5c13e0bb-80b7-4846-b26c-5c60b234d6fa","Type":"ContainerDied","Data":"3917dcda16f3ff66385559764fbb9f61e9e85cc4e3bdee25550c68b52bbb1056"} Sep 30 07:51:45 crc kubenswrapper[4760]: I0930 07:51:45.937521 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"19ddce20-85ae-4537-86f4-33a6b35fef0b","Type":"ContainerStarted","Data":"8092b0c084d52a9197c93f1be8226e72187a4eb72365d47b08edb237ed843131"} Sep 30 07:51:45 crc kubenswrapper[4760]: I0930 07:51:45.937571 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"19ddce20-85ae-4537-86f4-33a6b35fef0b","Type":"ContainerStarted","Data":"bac78037532adb8021f4640cdf757802c6b84f98514d16de20644193912329a2"} Sep 30 07:51:45 crc kubenswrapper[4760]: I0930 07:51:45.939702 4760 generic.go:334] "Generic (PLEG): container finished" podID="833abb8f-981c-489a-b60e-294479d780d8" containerID="6d119f473a0247e9326baf6deeae15b8826c596bae99eaa9c536db192617ec8b" exitCode=0 Sep 30 07:51:45 crc kubenswrapper[4760]: I0930 07:51:45.939745 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"833abb8f-981c-489a-b60e-294479d780d8","Type":"ContainerDied","Data":"6d119f473a0247e9326baf6deeae15b8826c596bae99eaa9c536db192617ec8b"} Sep 30 07:51:46 crc kubenswrapper[4760]: I0930 07:51:46.220089 4760 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Sep 30 07:51:46 crc kubenswrapper[4760]: I0930 07:51:46.231000 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Sep 30 07:51:46 crc kubenswrapper[4760]: W0930 07:51:46.241406 4760 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podeec5d0ef_04f3_4a34_8575_45e2a88c519f.slice/crio-c00f04ec2b6f8185650dcb3c76bc62e42e530e3ac03459f92ebc022b6eb03a97 WatchSource:0}: Error finding container c00f04ec2b6f8185650dcb3c76bc62e42e530e3ac03459f92ebc022b6eb03a97: Status 404 returned error can't find the container with id c00f04ec2b6f8185650dcb3c76bc62e42e530e3ac03459f92ebc022b6eb03a97 Sep 30 07:51:46 crc kubenswrapper[4760]: I0930 07:51:46.346495 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/833abb8f-981c-489a-b60e-294479d780d8-scripts\") pod \"833abb8f-981c-489a-b60e-294479d780d8\" (UID: \"833abb8f-981c-489a-b60e-294479d780d8\") " Sep 30 07:51:46 crc kubenswrapper[4760]: I0930 07:51:46.346639 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/833abb8f-981c-489a-b60e-294479d780d8-config-data\") pod \"833abb8f-981c-489a-b60e-294479d780d8\" (UID: \"833abb8f-981c-489a-b60e-294479d780d8\") " Sep 30 07:51:46 crc kubenswrapper[4760]: I0930 07:51:46.346686 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/833abb8f-981c-489a-b60e-294479d780d8-log-httpd\") pod \"833abb8f-981c-489a-b60e-294479d780d8\" (UID: \"833abb8f-981c-489a-b60e-294479d780d8\") " Sep 30 07:51:46 crc kubenswrapper[4760]: I0930 07:51:46.346724 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/833abb8f-981c-489a-b60e-294479d780d8-combined-ca-bundle\") pod \"833abb8f-981c-489a-b60e-294479d780d8\" (UID: \"833abb8f-981c-489a-b60e-294479d780d8\") " Sep 30 07:51:46 crc kubenswrapper[4760]: I0930 07:51:46.346826 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-npmbq\" (UniqueName: \"kubernetes.io/projected/833abb8f-981c-489a-b60e-294479d780d8-kube-api-access-npmbq\") pod \"833abb8f-981c-489a-b60e-294479d780d8\" (UID: \"833abb8f-981c-489a-b60e-294479d780d8\") " Sep 30 07:51:46 crc kubenswrapper[4760]: I0930 07:51:46.346843 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/833abb8f-981c-489a-b60e-294479d780d8-run-httpd\") pod \"833abb8f-981c-489a-b60e-294479d780d8\" (UID: \"833abb8f-981c-489a-b60e-294479d780d8\") " Sep 30 07:51:46 crc kubenswrapper[4760]: I0930 07:51:46.346897 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/833abb8f-981c-489a-b60e-294479d780d8-sg-core-conf-yaml\") pod \"833abb8f-981c-489a-b60e-294479d780d8\" (UID: \"833abb8f-981c-489a-b60e-294479d780d8\") " Sep 30 07:51:46 crc kubenswrapper[4760]: I0930 07:51:46.347420 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/833abb8f-981c-489a-b60e-294479d780d8-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "833abb8f-981c-489a-b60e-294479d780d8" (UID: "833abb8f-981c-489a-b60e-294479d780d8"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 07:51:46 crc kubenswrapper[4760]: I0930 07:51:46.347725 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/833abb8f-981c-489a-b60e-294479d780d8-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "833abb8f-981c-489a-b60e-294479d780d8" (UID: "833abb8f-981c-489a-b60e-294479d780d8"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 07:51:46 crc kubenswrapper[4760]: I0930 07:51:46.352639 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/833abb8f-981c-489a-b60e-294479d780d8-kube-api-access-npmbq" (OuterVolumeSpecName: "kube-api-access-npmbq") pod "833abb8f-981c-489a-b60e-294479d780d8" (UID: "833abb8f-981c-489a-b60e-294479d780d8"). InnerVolumeSpecName "kube-api-access-npmbq". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 07:51:46 crc kubenswrapper[4760]: I0930 07:51:46.357532 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/833abb8f-981c-489a-b60e-294479d780d8-scripts" (OuterVolumeSpecName: "scripts") pod "833abb8f-981c-489a-b60e-294479d780d8" (UID: "833abb8f-981c-489a-b60e-294479d780d8"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 07:51:46 crc kubenswrapper[4760]: I0930 07:51:46.432699 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/833abb8f-981c-489a-b60e-294479d780d8-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "833abb8f-981c-489a-b60e-294479d780d8" (UID: "833abb8f-981c-489a-b60e-294479d780d8"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 07:51:46 crc kubenswrapper[4760]: I0930 07:51:46.449887 4760 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/833abb8f-981c-489a-b60e-294479d780d8-scripts\") on node \"crc\" DevicePath \"\"" Sep 30 07:51:46 crc kubenswrapper[4760]: I0930 07:51:46.449917 4760 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/833abb8f-981c-489a-b60e-294479d780d8-log-httpd\") on node \"crc\" DevicePath \"\"" Sep 30 07:51:46 crc kubenswrapper[4760]: I0930 07:51:46.449926 4760 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-npmbq\" (UniqueName: \"kubernetes.io/projected/833abb8f-981c-489a-b60e-294479d780d8-kube-api-access-npmbq\") on node \"crc\" DevicePath \"\"" Sep 30 07:51:46 crc kubenswrapper[4760]: I0930 07:51:46.449937 4760 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/833abb8f-981c-489a-b60e-294479d780d8-run-httpd\") on node \"crc\" DevicePath \"\"" Sep 30 07:51:46 crc kubenswrapper[4760]: I0930 07:51:46.449946 4760 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/833abb8f-981c-489a-b60e-294479d780d8-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Sep 30 07:51:46 crc kubenswrapper[4760]: I0930 07:51:46.487061 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/833abb8f-981c-489a-b60e-294479d780d8-config-data" (OuterVolumeSpecName: "config-data") pod "833abb8f-981c-489a-b60e-294479d780d8" (UID: "833abb8f-981c-489a-b60e-294479d780d8"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 07:51:46 crc kubenswrapper[4760]: I0930 07:51:46.495390 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/833abb8f-981c-489a-b60e-294479d780d8-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "833abb8f-981c-489a-b60e-294479d780d8" (UID: "833abb8f-981c-489a-b60e-294479d780d8"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 07:51:46 crc kubenswrapper[4760]: I0930 07:51:46.518994 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cinder-api-0" Sep 30 07:51:46 crc kubenswrapper[4760]: I0930 07:51:46.551410 4760 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/833abb8f-981c-489a-b60e-294479d780d8-config-data\") on node \"crc\" DevicePath \"\"" Sep 30 07:51:46 crc kubenswrapper[4760]: I0930 07:51:46.551441 4760 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/833abb8f-981c-489a-b60e-294479d780d8-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 07:51:46 crc kubenswrapper[4760]: I0930 07:51:46.956035 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"833abb8f-981c-489a-b60e-294479d780d8","Type":"ContainerDied","Data":"4c7ecde806c27c25736f98a82044af1c94dc7c89ae6d8d636289b48b7c40b56b"} Sep 30 07:51:46 crc kubenswrapper[4760]: I0930 07:51:46.956378 4760 scope.go:117] "RemoveContainer" containerID="127513d66ce615a75bedeef4b732dbc8ab907e782d9ee6481deee835a4d059cb" Sep 30 07:51:46 crc kubenswrapper[4760]: I0930 07:51:46.956074 4760 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Sep 30 07:51:46 crc kubenswrapper[4760]: I0930 07:51:46.958871 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"eec5d0ef-04f3-4a34-8575-45e2a88c519f","Type":"ContainerStarted","Data":"afb1b17229984258f3b8a201352fecbfb6118183e2c5a1bd9b5b9da1067056c5"} Sep 30 07:51:46 crc kubenswrapper[4760]: I0930 07:51:46.958913 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"eec5d0ef-04f3-4a34-8575-45e2a88c519f","Type":"ContainerStarted","Data":"c00f04ec2b6f8185650dcb3c76bc62e42e530e3ac03459f92ebc022b6eb03a97"} Sep 30 07:51:46 crc kubenswrapper[4760]: I0930 07:51:46.962429 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"19ddce20-85ae-4537-86f4-33a6b35fef0b","Type":"ContainerStarted","Data":"91a5bf25840928abf4e7acd9a260ceb0979e2aecf439b0f6cf1ea0bd941207c3"} Sep 30 07:51:46 crc kubenswrapper[4760]: I0930 07:51:46.994384 4760 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=3.994365171 podStartE2EDuration="3.994365171s" podCreationTimestamp="2025-09-30 07:51:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 07:51:46.983594967 +0000 UTC m=+1092.626501379" watchObservedRunningTime="2025-09-30 07:51:46.994365171 +0000 UTC m=+1092.637271583" Sep 30 07:51:47 crc kubenswrapper[4760]: I0930 07:51:47.006935 4760 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Sep 30 07:51:47 crc kubenswrapper[4760]: I0930 07:51:47.027687 4760 scope.go:117] "RemoveContainer" containerID="2683d62502143d345d3dd678c2041a0b4ce99f568b193663b2d524e00632e462" Sep 30 07:51:47 crc kubenswrapper[4760]: I0930 07:51:47.032685 4760 kubelet.go:2431] "SyncLoop 
REMOVE" source="api" pods=["openstack/ceilometer-0"] Sep 30 07:51:47 crc kubenswrapper[4760]: I0930 07:51:47.042140 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Sep 30 07:51:47 crc kubenswrapper[4760]: E0930 07:51:47.042580 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="833abb8f-981c-489a-b60e-294479d780d8" containerName="ceilometer-notification-agent" Sep 30 07:51:47 crc kubenswrapper[4760]: I0930 07:51:47.042592 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="833abb8f-981c-489a-b60e-294479d780d8" containerName="ceilometer-notification-agent" Sep 30 07:51:47 crc kubenswrapper[4760]: E0930 07:51:47.042619 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="833abb8f-981c-489a-b60e-294479d780d8" containerName="sg-core" Sep 30 07:51:47 crc kubenswrapper[4760]: I0930 07:51:47.042625 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="833abb8f-981c-489a-b60e-294479d780d8" containerName="sg-core" Sep 30 07:51:47 crc kubenswrapper[4760]: E0930 07:51:47.042635 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="833abb8f-981c-489a-b60e-294479d780d8" containerName="proxy-httpd" Sep 30 07:51:47 crc kubenswrapper[4760]: I0930 07:51:47.042643 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="833abb8f-981c-489a-b60e-294479d780d8" containerName="proxy-httpd" Sep 30 07:51:47 crc kubenswrapper[4760]: E0930 07:51:47.042667 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="833abb8f-981c-489a-b60e-294479d780d8" containerName="ceilometer-central-agent" Sep 30 07:51:47 crc kubenswrapper[4760]: I0930 07:51:47.042673 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="833abb8f-981c-489a-b60e-294479d780d8" containerName="ceilometer-central-agent" Sep 30 07:51:47 crc kubenswrapper[4760]: I0930 07:51:47.042829 4760 memory_manager.go:354] "RemoveStaleState removing state" podUID="833abb8f-981c-489a-b60e-294479d780d8" 
containerName="ceilometer-central-agent" Sep 30 07:51:47 crc kubenswrapper[4760]: I0930 07:51:47.042848 4760 memory_manager.go:354] "RemoveStaleState removing state" podUID="833abb8f-981c-489a-b60e-294479d780d8" containerName="sg-core" Sep 30 07:51:47 crc kubenswrapper[4760]: I0930 07:51:47.042865 4760 memory_manager.go:354] "RemoveStaleState removing state" podUID="833abb8f-981c-489a-b60e-294479d780d8" containerName="proxy-httpd" Sep 30 07:51:47 crc kubenswrapper[4760]: I0930 07:51:47.042873 4760 memory_manager.go:354] "RemoveStaleState removing state" podUID="833abb8f-981c-489a-b60e-294479d780d8" containerName="ceilometer-notification-agent" Sep 30 07:51:47 crc kubenswrapper[4760]: I0930 07:51:47.044534 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Sep 30 07:51:47 crc kubenswrapper[4760]: I0930 07:51:47.048361 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Sep 30 07:51:47 crc kubenswrapper[4760]: I0930 07:51:47.048706 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Sep 30 07:51:47 crc kubenswrapper[4760]: I0930 07:51:47.055649 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Sep 30 07:51:47 crc kubenswrapper[4760]: I0930 07:51:47.072380 4760 scope.go:117] "RemoveContainer" containerID="3c0bc4c2e06007b390ceb727d84382eba7463a9de49c4d1c74bb4f029f3e2c8f" Sep 30 07:51:47 crc kubenswrapper[4760]: I0930 07:51:47.081200 4760 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="833abb8f-981c-489a-b60e-294479d780d8" path="/var/lib/kubelet/pods/833abb8f-981c-489a-b60e-294479d780d8/volumes" Sep 30 07:51:47 crc kubenswrapper[4760]: I0930 07:51:47.082364 4760 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bbf081e8-29d3-46ed-8474-e027d6c28c1d" path="/var/lib/kubelet/pods/bbf081e8-29d3-46ed-8474-e027d6c28c1d/volumes" Sep 30 07:51:47 crc 
kubenswrapper[4760]: I0930 07:51:47.083129 4760 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c3f46a87-7a28-45d7-ad4d-98b0ce508557" path="/var/lib/kubelet/pods/c3f46a87-7a28-45d7-ad4d-98b0ce508557/volumes" Sep 30 07:51:47 crc kubenswrapper[4760]: I0930 07:51:47.100560 4760 scope.go:117] "RemoveContainer" containerID="6d119f473a0247e9326baf6deeae15b8826c596bae99eaa9c536db192617ec8b" Sep 30 07:51:47 crc kubenswrapper[4760]: I0930 07:51:47.161627 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d410e17f-e92d-4b85-af7b-2d27431e4b75-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"d410e17f-e92d-4b85-af7b-2d27431e4b75\") " pod="openstack/ceilometer-0" Sep 30 07:51:47 crc kubenswrapper[4760]: I0930 07:51:47.161679 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d410e17f-e92d-4b85-af7b-2d27431e4b75-log-httpd\") pod \"ceilometer-0\" (UID: \"d410e17f-e92d-4b85-af7b-2d27431e4b75\") " pod="openstack/ceilometer-0" Sep 30 07:51:47 crc kubenswrapper[4760]: I0930 07:51:47.161714 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d410e17f-e92d-4b85-af7b-2d27431e4b75-run-httpd\") pod \"ceilometer-0\" (UID: \"d410e17f-e92d-4b85-af7b-2d27431e4b75\") " pod="openstack/ceilometer-0" Sep 30 07:51:47 crc kubenswrapper[4760]: I0930 07:51:47.161737 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d410e17f-e92d-4b85-af7b-2d27431e4b75-config-data\") pod \"ceilometer-0\" (UID: \"d410e17f-e92d-4b85-af7b-2d27431e4b75\") " pod="openstack/ceilometer-0" Sep 30 07:51:47 crc kubenswrapper[4760]: I0930 07:51:47.162411 4760 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2hc64\" (UniqueName: \"kubernetes.io/projected/d410e17f-e92d-4b85-af7b-2d27431e4b75-kube-api-access-2hc64\") pod \"ceilometer-0\" (UID: \"d410e17f-e92d-4b85-af7b-2d27431e4b75\") " pod="openstack/ceilometer-0" Sep 30 07:51:47 crc kubenswrapper[4760]: I0930 07:51:47.162454 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/d410e17f-e92d-4b85-af7b-2d27431e4b75-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"d410e17f-e92d-4b85-af7b-2d27431e4b75\") " pod="openstack/ceilometer-0" Sep 30 07:51:47 crc kubenswrapper[4760]: I0930 07:51:47.162482 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d410e17f-e92d-4b85-af7b-2d27431e4b75-scripts\") pod \"ceilometer-0\" (UID: \"d410e17f-e92d-4b85-af7b-2d27431e4b75\") " pod="openstack/ceilometer-0" Sep 30 07:51:47 crc kubenswrapper[4760]: I0930 07:51:47.263748 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2hc64\" (UniqueName: \"kubernetes.io/projected/d410e17f-e92d-4b85-af7b-2d27431e4b75-kube-api-access-2hc64\") pod \"ceilometer-0\" (UID: \"d410e17f-e92d-4b85-af7b-2d27431e4b75\") " pod="openstack/ceilometer-0" Sep 30 07:51:47 crc kubenswrapper[4760]: I0930 07:51:47.263797 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/d410e17f-e92d-4b85-af7b-2d27431e4b75-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"d410e17f-e92d-4b85-af7b-2d27431e4b75\") " pod="openstack/ceilometer-0" Sep 30 07:51:47 crc kubenswrapper[4760]: I0930 07:51:47.263833 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/d410e17f-e92d-4b85-af7b-2d27431e4b75-scripts\") pod \"ceilometer-0\" (UID: \"d410e17f-e92d-4b85-af7b-2d27431e4b75\") " pod="openstack/ceilometer-0" Sep 30 07:51:47 crc kubenswrapper[4760]: I0930 07:51:47.263885 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d410e17f-e92d-4b85-af7b-2d27431e4b75-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"d410e17f-e92d-4b85-af7b-2d27431e4b75\") " pod="openstack/ceilometer-0" Sep 30 07:51:47 crc kubenswrapper[4760]: I0930 07:51:47.263912 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d410e17f-e92d-4b85-af7b-2d27431e4b75-log-httpd\") pod \"ceilometer-0\" (UID: \"d410e17f-e92d-4b85-af7b-2d27431e4b75\") " pod="openstack/ceilometer-0" Sep 30 07:51:47 crc kubenswrapper[4760]: I0930 07:51:47.263940 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d410e17f-e92d-4b85-af7b-2d27431e4b75-run-httpd\") pod \"ceilometer-0\" (UID: \"d410e17f-e92d-4b85-af7b-2d27431e4b75\") " pod="openstack/ceilometer-0" Sep 30 07:51:47 crc kubenswrapper[4760]: I0930 07:51:47.263955 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d410e17f-e92d-4b85-af7b-2d27431e4b75-config-data\") pod \"ceilometer-0\" (UID: \"d410e17f-e92d-4b85-af7b-2d27431e4b75\") " pod="openstack/ceilometer-0" Sep 30 07:51:47 crc kubenswrapper[4760]: I0930 07:51:47.264596 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d410e17f-e92d-4b85-af7b-2d27431e4b75-log-httpd\") pod \"ceilometer-0\" (UID: \"d410e17f-e92d-4b85-af7b-2d27431e4b75\") " pod="openstack/ceilometer-0" Sep 30 07:51:47 crc kubenswrapper[4760]: I0930 07:51:47.264681 4760 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d410e17f-e92d-4b85-af7b-2d27431e4b75-run-httpd\") pod \"ceilometer-0\" (UID: \"d410e17f-e92d-4b85-af7b-2d27431e4b75\") " pod="openstack/ceilometer-0" Sep 30 07:51:47 crc kubenswrapper[4760]: I0930 07:51:47.268798 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d410e17f-e92d-4b85-af7b-2d27431e4b75-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"d410e17f-e92d-4b85-af7b-2d27431e4b75\") " pod="openstack/ceilometer-0" Sep 30 07:51:47 crc kubenswrapper[4760]: I0930 07:51:47.270634 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/d410e17f-e92d-4b85-af7b-2d27431e4b75-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"d410e17f-e92d-4b85-af7b-2d27431e4b75\") " pod="openstack/ceilometer-0" Sep 30 07:51:47 crc kubenswrapper[4760]: I0930 07:51:47.277969 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d410e17f-e92d-4b85-af7b-2d27431e4b75-scripts\") pod \"ceilometer-0\" (UID: \"d410e17f-e92d-4b85-af7b-2d27431e4b75\") " pod="openstack/ceilometer-0" Sep 30 07:51:47 crc kubenswrapper[4760]: I0930 07:51:47.277979 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d410e17f-e92d-4b85-af7b-2d27431e4b75-config-data\") pod \"ceilometer-0\" (UID: \"d410e17f-e92d-4b85-af7b-2d27431e4b75\") " pod="openstack/ceilometer-0" Sep 30 07:51:47 crc kubenswrapper[4760]: I0930 07:51:47.289704 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2hc64\" (UniqueName: \"kubernetes.io/projected/d410e17f-e92d-4b85-af7b-2d27431e4b75-kube-api-access-2hc64\") pod \"ceilometer-0\" (UID: \"d410e17f-e92d-4b85-af7b-2d27431e4b75\") " 
pod="openstack/ceilometer-0" Sep 30 07:51:47 crc kubenswrapper[4760]: I0930 07:51:47.373989 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Sep 30 07:51:47 crc kubenswrapper[4760]: I0930 07:51:47.783960 4760 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Sep 30 07:51:47 crc kubenswrapper[4760]: I0930 07:51:47.820379 4760 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Sep 30 07:51:47 crc kubenswrapper[4760]: I0930 07:51:47.974408 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d410e17f-e92d-4b85-af7b-2d27431e4b75","Type":"ContainerStarted","Data":"6fa6dd4fdc1f8e6a68bea1d00afc1668c115821b2bd1c4455a5277f57dfa2f08"} Sep 30 07:51:47 crc kubenswrapper[4760]: I0930 07:51:47.978486 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"eec5d0ef-04f3-4a34-8575-45e2a88c519f","Type":"ContainerStarted","Data":"5a3d6d07f0474a14be13e1bf17bfa0c3939c4e6df89ca18bbb0cb7e5f925dd93"} Sep 30 07:51:47 crc kubenswrapper[4760]: I0930 07:51:47.980869 4760 generic.go:334] "Generic (PLEG): container finished" podID="5c13e0bb-80b7-4846-b26c-5c60b234d6fa" containerID="b60c617b36bf4b007f796291f01c1bdc27578b9cf8b6758ecc2d7f62076b4a43" exitCode=0 Sep 30 07:51:47 crc kubenswrapper[4760]: I0930 07:51:47.980908 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"5c13e0bb-80b7-4846-b26c-5c60b234d6fa","Type":"ContainerDied","Data":"b60c617b36bf4b007f796291f01c1bdc27578b9cf8b6758ecc2d7f62076b4a43"} Sep 30 07:51:48 crc kubenswrapper[4760]: I0930 07:51:48.004021 4760 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=3.004001485 podStartE2EDuration="3.004001485s" podCreationTimestamp="2025-09-30 07:51:45 +0000 UTC" firstStartedPulling="0001-01-01 
00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 07:51:47.995123498 +0000 UTC m=+1093.638029910" watchObservedRunningTime="2025-09-30 07:51:48.004001485 +0000 UTC m=+1093.646907897" Sep 30 07:51:48 crc kubenswrapper[4760]: I0930 07:51:48.162618 4760 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Sep 30 07:51:48 crc kubenswrapper[4760]: I0930 07:51:48.283918 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/5c13e0bb-80b7-4846-b26c-5c60b234d6fa-etc-machine-id\") pod \"5c13e0bb-80b7-4846-b26c-5c60b234d6fa\" (UID: \"5c13e0bb-80b7-4846-b26c-5c60b234d6fa\") " Sep 30 07:51:48 crc kubenswrapper[4760]: I0930 07:51:48.283996 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5c13e0bb-80b7-4846-b26c-5c60b234d6fa-combined-ca-bundle\") pod \"5c13e0bb-80b7-4846-b26c-5c60b234d6fa\" (UID: \"5c13e0bb-80b7-4846-b26c-5c60b234d6fa\") " Sep 30 07:51:48 crc kubenswrapper[4760]: I0930 07:51:48.284039 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b9gxf\" (UniqueName: \"kubernetes.io/projected/5c13e0bb-80b7-4846-b26c-5c60b234d6fa-kube-api-access-b9gxf\") pod \"5c13e0bb-80b7-4846-b26c-5c60b234d6fa\" (UID: \"5c13e0bb-80b7-4846-b26c-5c60b234d6fa\") " Sep 30 07:51:48 crc kubenswrapper[4760]: I0930 07:51:48.284103 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5c13e0bb-80b7-4846-b26c-5c60b234d6fa-scripts\") pod \"5c13e0bb-80b7-4846-b26c-5c60b234d6fa\" (UID: \"5c13e0bb-80b7-4846-b26c-5c60b234d6fa\") " Sep 30 07:51:48 crc kubenswrapper[4760]: I0930 07:51:48.284114 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/host-path/5c13e0bb-80b7-4846-b26c-5c60b234d6fa-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "5c13e0bb-80b7-4846-b26c-5c60b234d6fa" (UID: "5c13e0bb-80b7-4846-b26c-5c60b234d6fa"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Sep 30 07:51:48 crc kubenswrapper[4760]: I0930 07:51:48.284184 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/5c13e0bb-80b7-4846-b26c-5c60b234d6fa-config-data-custom\") pod \"5c13e0bb-80b7-4846-b26c-5c60b234d6fa\" (UID: \"5c13e0bb-80b7-4846-b26c-5c60b234d6fa\") " Sep 30 07:51:48 crc kubenswrapper[4760]: I0930 07:51:48.284229 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5c13e0bb-80b7-4846-b26c-5c60b234d6fa-config-data\") pod \"5c13e0bb-80b7-4846-b26c-5c60b234d6fa\" (UID: \"5c13e0bb-80b7-4846-b26c-5c60b234d6fa\") " Sep 30 07:51:48 crc kubenswrapper[4760]: I0930 07:51:48.284575 4760 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/5c13e0bb-80b7-4846-b26c-5c60b234d6fa-etc-machine-id\") on node \"crc\" DevicePath \"\"" Sep 30 07:51:48 crc kubenswrapper[4760]: I0930 07:51:48.294222 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5c13e0bb-80b7-4846-b26c-5c60b234d6fa-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "5c13e0bb-80b7-4846-b26c-5c60b234d6fa" (UID: "5c13e0bb-80b7-4846-b26c-5c60b234d6fa"). InnerVolumeSpecName "config-data-custom". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 07:51:48 crc kubenswrapper[4760]: I0930 07:51:48.294345 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5c13e0bb-80b7-4846-b26c-5c60b234d6fa-scripts" (OuterVolumeSpecName: "scripts") pod "5c13e0bb-80b7-4846-b26c-5c60b234d6fa" (UID: "5c13e0bb-80b7-4846-b26c-5c60b234d6fa"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 07:51:48 crc kubenswrapper[4760]: I0930 07:51:48.295484 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5c13e0bb-80b7-4846-b26c-5c60b234d6fa-kube-api-access-b9gxf" (OuterVolumeSpecName: "kube-api-access-b9gxf") pod "5c13e0bb-80b7-4846-b26c-5c60b234d6fa" (UID: "5c13e0bb-80b7-4846-b26c-5c60b234d6fa"). InnerVolumeSpecName "kube-api-access-b9gxf". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 07:51:48 crc kubenswrapper[4760]: I0930 07:51:48.342168 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5c13e0bb-80b7-4846-b26c-5c60b234d6fa-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5c13e0bb-80b7-4846-b26c-5c60b234d6fa" (UID: "5c13e0bb-80b7-4846-b26c-5c60b234d6fa"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 07:51:48 crc kubenswrapper[4760]: I0930 07:51:48.386159 4760 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5c13e0bb-80b7-4846-b26c-5c60b234d6fa-scripts\") on node \"crc\" DevicePath \"\"" Sep 30 07:51:48 crc kubenswrapper[4760]: I0930 07:51:48.386190 4760 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/5c13e0bb-80b7-4846-b26c-5c60b234d6fa-config-data-custom\") on node \"crc\" DevicePath \"\"" Sep 30 07:51:48 crc kubenswrapper[4760]: I0930 07:51:48.386202 4760 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5c13e0bb-80b7-4846-b26c-5c60b234d6fa-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 07:51:48 crc kubenswrapper[4760]: I0930 07:51:48.386213 4760 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b9gxf\" (UniqueName: \"kubernetes.io/projected/5c13e0bb-80b7-4846-b26c-5c60b234d6fa-kube-api-access-b9gxf\") on node \"crc\" DevicePath \"\"" Sep 30 07:51:48 crc kubenswrapper[4760]: I0930 07:51:48.399953 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5c13e0bb-80b7-4846-b26c-5c60b234d6fa-config-data" (OuterVolumeSpecName: "config-data") pod "5c13e0bb-80b7-4846-b26c-5c60b234d6fa" (UID: "5c13e0bb-80b7-4846-b26c-5c60b234d6fa"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 07:51:48 crc kubenswrapper[4760]: I0930 07:51:48.488628 4760 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5c13e0bb-80b7-4846-b26c-5c60b234d6fa-config-data\") on node \"crc\" DevicePath \"\"" Sep 30 07:51:48 crc kubenswrapper[4760]: I0930 07:51:48.995148 4760 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Sep 30 07:51:48 crc kubenswrapper[4760]: I0930 07:51:48.995321 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"5c13e0bb-80b7-4846-b26c-5c60b234d6fa","Type":"ContainerDied","Data":"fe2eccf082a658deeb7f89306c40e01fb0ae73a5504dff438b4b149a18384cd7"} Sep 30 07:51:48 crc kubenswrapper[4760]: I0930 07:51:48.997452 4760 scope.go:117] "RemoveContainer" containerID="3917dcda16f3ff66385559764fbb9f61e9e85cc4e3bdee25550c68b52bbb1056" Sep 30 07:51:49 crc kubenswrapper[4760]: I0930 07:51:49.031672 4760 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Sep 30 07:51:49 crc kubenswrapper[4760]: I0930 07:51:49.048332 4760 scope.go:117] "RemoveContainer" containerID="b60c617b36bf4b007f796291f01c1bdc27578b9cf8b6758ecc2d7f62076b4a43" Sep 30 07:51:49 crc kubenswrapper[4760]: I0930 07:51:49.052259 4760 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-scheduler-0"] Sep 30 07:51:49 crc kubenswrapper[4760]: I0930 07:51:49.063941 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Sep 30 07:51:49 crc kubenswrapper[4760]: E0930 07:51:49.064526 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5c13e0bb-80b7-4846-b26c-5c60b234d6fa" containerName="cinder-scheduler" Sep 30 07:51:49 crc kubenswrapper[4760]: I0930 07:51:49.064544 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="5c13e0bb-80b7-4846-b26c-5c60b234d6fa" containerName="cinder-scheduler" Sep 30 07:51:49 crc kubenswrapper[4760]: E0930 07:51:49.064556 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5c13e0bb-80b7-4846-b26c-5c60b234d6fa" containerName="probe" Sep 30 07:51:49 crc kubenswrapper[4760]: I0930 07:51:49.064564 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="5c13e0bb-80b7-4846-b26c-5c60b234d6fa" containerName="probe" Sep 30 07:51:49 crc kubenswrapper[4760]: 
I0930 07:51:49.064789 4760 memory_manager.go:354] "RemoveStaleState removing state" podUID="5c13e0bb-80b7-4846-b26c-5c60b234d6fa" containerName="probe" Sep 30 07:51:49 crc kubenswrapper[4760]: I0930 07:51:49.064808 4760 memory_manager.go:354] "RemoveStaleState removing state" podUID="5c13e0bb-80b7-4846-b26c-5c60b234d6fa" containerName="cinder-scheduler" Sep 30 07:51:49 crc kubenswrapper[4760]: I0930 07:51:49.066074 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Sep 30 07:51:49 crc kubenswrapper[4760]: I0930 07:51:49.069593 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Sep 30 07:51:49 crc kubenswrapper[4760]: I0930 07:51:49.085562 4760 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5c13e0bb-80b7-4846-b26c-5c60b234d6fa" path="/var/lib/kubelet/pods/5c13e0bb-80b7-4846-b26c-5c60b234d6fa/volumes" Sep 30 07:51:49 crc kubenswrapper[4760]: I0930 07:51:49.086539 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Sep 30 07:51:49 crc kubenswrapper[4760]: I0930 07:51:49.112766 4760 patch_prober.go:28] interesting pod/machine-config-daemon-f2lrk container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 07:51:49 crc kubenswrapper[4760]: I0930 07:51:49.112820 4760 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-f2lrk" podUID="7a9c8270-6964-4886-87d0-227b05b76da4" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 30 07:51:49 crc kubenswrapper[4760]: I0930 07:51:49.202137 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"kube-api-access-5bbcd\" (UniqueName: \"kubernetes.io/projected/d3817395-40ad-472b-b4df-83a7386bb16f-kube-api-access-5bbcd\") pod \"cinder-scheduler-0\" (UID: \"d3817395-40ad-472b-b4df-83a7386bb16f\") " pod="openstack/cinder-scheduler-0" Sep 30 07:51:49 crc kubenswrapper[4760]: I0930 07:51:49.202216 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d3817395-40ad-472b-b4df-83a7386bb16f-config-data\") pod \"cinder-scheduler-0\" (UID: \"d3817395-40ad-472b-b4df-83a7386bb16f\") " pod="openstack/cinder-scheduler-0" Sep 30 07:51:49 crc kubenswrapper[4760]: I0930 07:51:49.202236 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d3817395-40ad-472b-b4df-83a7386bb16f-scripts\") pod \"cinder-scheduler-0\" (UID: \"d3817395-40ad-472b-b4df-83a7386bb16f\") " pod="openstack/cinder-scheduler-0" Sep 30 07:51:49 crc kubenswrapper[4760]: I0930 07:51:49.202342 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d3817395-40ad-472b-b4df-83a7386bb16f-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"d3817395-40ad-472b-b4df-83a7386bb16f\") " pod="openstack/cinder-scheduler-0" Sep 30 07:51:49 crc kubenswrapper[4760]: I0930 07:51:49.202569 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/d3817395-40ad-472b-b4df-83a7386bb16f-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"d3817395-40ad-472b-b4df-83a7386bb16f\") " pod="openstack/cinder-scheduler-0" Sep 30 07:51:49 crc kubenswrapper[4760]: I0930 07:51:49.202657 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/d3817395-40ad-472b-b4df-83a7386bb16f-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"d3817395-40ad-472b-b4df-83a7386bb16f\") " pod="openstack/cinder-scheduler-0" Sep 30 07:51:49 crc kubenswrapper[4760]: I0930 07:51:49.304460 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/d3817395-40ad-472b-b4df-83a7386bb16f-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"d3817395-40ad-472b-b4df-83a7386bb16f\") " pod="openstack/cinder-scheduler-0" Sep 30 07:51:49 crc kubenswrapper[4760]: I0930 07:51:49.304531 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d3817395-40ad-472b-b4df-83a7386bb16f-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"d3817395-40ad-472b-b4df-83a7386bb16f\") " pod="openstack/cinder-scheduler-0" Sep 30 07:51:49 crc kubenswrapper[4760]: I0930 07:51:49.304636 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5bbcd\" (UniqueName: \"kubernetes.io/projected/d3817395-40ad-472b-b4df-83a7386bb16f-kube-api-access-5bbcd\") pod \"cinder-scheduler-0\" (UID: \"d3817395-40ad-472b-b4df-83a7386bb16f\") " pod="openstack/cinder-scheduler-0" Sep 30 07:51:49 crc kubenswrapper[4760]: I0930 07:51:49.304648 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/d3817395-40ad-472b-b4df-83a7386bb16f-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"d3817395-40ad-472b-b4df-83a7386bb16f\") " pod="openstack/cinder-scheduler-0" Sep 30 07:51:49 crc kubenswrapper[4760]: I0930 07:51:49.304688 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d3817395-40ad-472b-b4df-83a7386bb16f-config-data\") pod \"cinder-scheduler-0\" (UID: 
\"d3817395-40ad-472b-b4df-83a7386bb16f\") " pod="openstack/cinder-scheduler-0" Sep 30 07:51:49 crc kubenswrapper[4760]: I0930 07:51:49.304768 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d3817395-40ad-472b-b4df-83a7386bb16f-scripts\") pod \"cinder-scheduler-0\" (UID: \"d3817395-40ad-472b-b4df-83a7386bb16f\") " pod="openstack/cinder-scheduler-0" Sep 30 07:51:49 crc kubenswrapper[4760]: I0930 07:51:49.304924 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d3817395-40ad-472b-b4df-83a7386bb16f-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"d3817395-40ad-472b-b4df-83a7386bb16f\") " pod="openstack/cinder-scheduler-0" Sep 30 07:51:49 crc kubenswrapper[4760]: I0930 07:51:49.310614 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d3817395-40ad-472b-b4df-83a7386bb16f-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"d3817395-40ad-472b-b4df-83a7386bb16f\") " pod="openstack/cinder-scheduler-0" Sep 30 07:51:49 crc kubenswrapper[4760]: I0930 07:51:49.310794 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d3817395-40ad-472b-b4df-83a7386bb16f-scripts\") pod \"cinder-scheduler-0\" (UID: \"d3817395-40ad-472b-b4df-83a7386bb16f\") " pod="openstack/cinder-scheduler-0" Sep 30 07:51:49 crc kubenswrapper[4760]: I0930 07:51:49.311130 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d3817395-40ad-472b-b4df-83a7386bb16f-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"d3817395-40ad-472b-b4df-83a7386bb16f\") " pod="openstack/cinder-scheduler-0" Sep 30 07:51:49 crc kubenswrapper[4760]: I0930 07:51:49.311604 4760 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d3817395-40ad-472b-b4df-83a7386bb16f-config-data\") pod \"cinder-scheduler-0\" (UID: \"d3817395-40ad-472b-b4df-83a7386bb16f\") " pod="openstack/cinder-scheduler-0" Sep 30 07:51:49 crc kubenswrapper[4760]: I0930 07:51:49.334792 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5bbcd\" (UniqueName: \"kubernetes.io/projected/d3817395-40ad-472b-b4df-83a7386bb16f-kube-api-access-5bbcd\") pod \"cinder-scheduler-0\" (UID: \"d3817395-40ad-472b-b4df-83a7386bb16f\") " pod="openstack/cinder-scheduler-0" Sep 30 07:51:49 crc kubenswrapper[4760]: I0930 07:51:49.393167 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Sep 30 07:51:49 crc kubenswrapper[4760]: I0930 07:51:49.615050 4760 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-688c87cc99-2v8mj" podUID="c3f46a87-7a28-45d7-ad4d-98b0ce508557" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.176:5353: i/o timeout" Sep 30 07:51:49 crc kubenswrapper[4760]: I0930 07:51:49.886178 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Sep 30 07:51:50 crc kubenswrapper[4760]: I0930 07:51:50.015915 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d410e17f-e92d-4b85-af7b-2d27431e4b75","Type":"ContainerStarted","Data":"0d4451fa8b93914d9523781207c03908e8c85a996636fd66bf9466a8c552cedb"} Sep 30 07:51:50 crc kubenswrapper[4760]: I0930 07:51:50.015960 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d410e17f-e92d-4b85-af7b-2d27431e4b75","Type":"ContainerStarted","Data":"a90a802e56eee080de1a520ad9bed255f68e7840b50fe4c20f1006ef90e84d41"} Sep 30 07:51:50 crc kubenswrapper[4760]: I0930 07:51:50.019387 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/cinder-scheduler-0" event={"ID":"d3817395-40ad-472b-b4df-83a7386bb16f","Type":"ContainerStarted","Data":"a076f647b3961b3d88168e71b35b8862428866c90cd1fc706962aea56a931a6d"} Sep 30 07:51:50 crc kubenswrapper[4760]: I0930 07:51:50.735992 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-7b12-account-create-59swt"] Sep 30 07:51:50 crc kubenswrapper[4760]: I0930 07:51:50.737992 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-7b12-account-create-59swt" Sep 30 07:51:50 crc kubenswrapper[4760]: I0930 07:51:50.741538 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-db-secret" Sep 30 07:51:50 crc kubenswrapper[4760]: I0930 07:51:50.743219 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-7b12-account-create-59swt"] Sep 30 07:51:50 crc kubenswrapper[4760]: I0930 07:51:50.844492 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-22lgc\" (UniqueName: \"kubernetes.io/projected/bb3cbe0b-da3a-48ff-8181-68a3fb8b9db5-kube-api-access-22lgc\") pod \"nova-api-7b12-account-create-59swt\" (UID: \"bb3cbe0b-da3a-48ff-8181-68a3fb8b9db5\") " pod="openstack/nova-api-7b12-account-create-59swt" Sep 30 07:51:50 crc kubenswrapper[4760]: I0930 07:51:50.933940 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-8b87-account-create-gpd5x"] Sep 30 07:51:50 crc kubenswrapper[4760]: I0930 07:51:50.935404 4760 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-8b87-account-create-gpd5x" Sep 30 07:51:50 crc kubenswrapper[4760]: I0930 07:51:50.942662 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-db-secret" Sep 30 07:51:50 crc kubenswrapper[4760]: I0930 07:51:50.945104 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-8b87-account-create-gpd5x"] Sep 30 07:51:50 crc kubenswrapper[4760]: I0930 07:51:50.946107 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-22lgc\" (UniqueName: \"kubernetes.io/projected/bb3cbe0b-da3a-48ff-8181-68a3fb8b9db5-kube-api-access-22lgc\") pod \"nova-api-7b12-account-create-59swt\" (UID: \"bb3cbe0b-da3a-48ff-8181-68a3fb8b9db5\") " pod="openstack/nova-api-7b12-account-create-59swt" Sep 30 07:51:50 crc kubenswrapper[4760]: I0930 07:51:50.964684 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-22lgc\" (UniqueName: \"kubernetes.io/projected/bb3cbe0b-da3a-48ff-8181-68a3fb8b9db5-kube-api-access-22lgc\") pod \"nova-api-7b12-account-create-59swt\" (UID: \"bb3cbe0b-da3a-48ff-8181-68a3fb8b9db5\") " pod="openstack/nova-api-7b12-account-create-59swt" Sep 30 07:51:51 crc kubenswrapper[4760]: I0930 07:51:51.035020 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d410e17f-e92d-4b85-af7b-2d27431e4b75","Type":"ContainerStarted","Data":"e481f4caccd87adea3b8c6b1aba73eda39dcfe230bef2c73a5a5f95a1a236567"} Sep 30 07:51:51 crc kubenswrapper[4760]: I0930 07:51:51.036681 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"d3817395-40ad-472b-b4df-83a7386bb16f","Type":"ContainerStarted","Data":"37d217051e145cd160c42d10fdcbf95498a9269e083178cbb66d0a6b92df73dc"} Sep 30 07:51:51 crc kubenswrapper[4760]: I0930 07:51:51.047429 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"kube-api-access-l4mrd\" (UniqueName: \"kubernetes.io/projected/d7cb900f-4fea-42ed-a186-c173a16463b6-kube-api-access-l4mrd\") pod \"nova-cell0-8b87-account-create-gpd5x\" (UID: \"d7cb900f-4fea-42ed-a186-c173a16463b6\") " pod="openstack/nova-cell0-8b87-account-create-gpd5x" Sep 30 07:51:51 crc kubenswrapper[4760]: I0930 07:51:51.064474 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-7b12-account-create-59swt" Sep 30 07:51:51 crc kubenswrapper[4760]: I0930 07:51:51.134158 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-7b10-account-create-nhsdp"] Sep 30 07:51:51 crc kubenswrapper[4760]: I0930 07:51:51.135896 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-7b10-account-create-nhsdp" Sep 30 07:51:51 crc kubenswrapper[4760]: I0930 07:51:51.138598 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-db-secret" Sep 30 07:51:51 crc kubenswrapper[4760]: I0930 07:51:51.145816 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-7b10-account-create-nhsdp"] Sep 30 07:51:51 crc kubenswrapper[4760]: I0930 07:51:51.154736 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l4mrd\" (UniqueName: \"kubernetes.io/projected/d7cb900f-4fea-42ed-a186-c173a16463b6-kube-api-access-l4mrd\") pod \"nova-cell0-8b87-account-create-gpd5x\" (UID: \"d7cb900f-4fea-42ed-a186-c173a16463b6\") " pod="openstack/nova-cell0-8b87-account-create-gpd5x" Sep 30 07:51:51 crc kubenswrapper[4760]: I0930 07:51:51.171170 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l4mrd\" (UniqueName: \"kubernetes.io/projected/d7cb900f-4fea-42ed-a186-c173a16463b6-kube-api-access-l4mrd\") pod \"nova-cell0-8b87-account-create-gpd5x\" (UID: \"d7cb900f-4fea-42ed-a186-c173a16463b6\") " pod="openstack/nova-cell0-8b87-account-create-gpd5x" 
Sep 30 07:51:51 crc kubenswrapper[4760]: I0930 07:51:51.257344 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xx62x\" (UniqueName: \"kubernetes.io/projected/d94a6d3c-5790-45ac-b75e-02cf0defd846-kube-api-access-xx62x\") pod \"nova-cell1-7b10-account-create-nhsdp\" (UID: \"d94a6d3c-5790-45ac-b75e-02cf0defd846\") " pod="openstack/nova-cell1-7b10-account-create-nhsdp" Sep 30 07:51:51 crc kubenswrapper[4760]: I0930 07:51:51.261293 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-8b87-account-create-gpd5x" Sep 30 07:51:51 crc kubenswrapper[4760]: I0930 07:51:51.359347 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xx62x\" (UniqueName: \"kubernetes.io/projected/d94a6d3c-5790-45ac-b75e-02cf0defd846-kube-api-access-xx62x\") pod \"nova-cell1-7b10-account-create-nhsdp\" (UID: \"d94a6d3c-5790-45ac-b75e-02cf0defd846\") " pod="openstack/nova-cell1-7b10-account-create-nhsdp" Sep 30 07:51:51 crc kubenswrapper[4760]: I0930 07:51:51.382972 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xx62x\" (UniqueName: \"kubernetes.io/projected/d94a6d3c-5790-45ac-b75e-02cf0defd846-kube-api-access-xx62x\") pod \"nova-cell1-7b10-account-create-nhsdp\" (UID: \"d94a6d3c-5790-45ac-b75e-02cf0defd846\") " pod="openstack/nova-cell1-7b10-account-create-nhsdp" Sep 30 07:51:51 crc kubenswrapper[4760]: I0930 07:51:51.462235 4760 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-7b10-account-create-nhsdp" Sep 30 07:51:51 crc kubenswrapper[4760]: I0930 07:51:51.584477 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-7b12-account-create-59swt"] Sep 30 07:51:51 crc kubenswrapper[4760]: W0930 07:51:51.604465 4760 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbb3cbe0b_da3a_48ff_8181_68a3fb8b9db5.slice/crio-62f7ccdb79741ba79cc90b55867c7c9f31872b2b95820d08f10ccf6247ef7157 WatchSource:0}: Error finding container 62f7ccdb79741ba79cc90b55867c7c9f31872b2b95820d08f10ccf6247ef7157: Status 404 returned error can't find the container with id 62f7ccdb79741ba79cc90b55867c7c9f31872b2b95820d08f10ccf6247ef7157 Sep 30 07:51:51 crc kubenswrapper[4760]: I0930 07:51:51.720811 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-8b87-account-create-gpd5x"] Sep 30 07:51:51 crc kubenswrapper[4760]: I0930 07:51:51.923050 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-7b10-account-create-nhsdp"] Sep 30 07:51:52 crc kubenswrapper[4760]: I0930 07:51:52.048395 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"d3817395-40ad-472b-b4df-83a7386bb16f","Type":"ContainerStarted","Data":"bffa484a6a1f13c51b0e246319730508169b02b3f6e575bb5da5783abc0cb9a6"} Sep 30 07:51:52 crc kubenswrapper[4760]: I0930 07:51:52.049746 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-7b10-account-create-nhsdp" event={"ID":"d94a6d3c-5790-45ac-b75e-02cf0defd846","Type":"ContainerStarted","Data":"fa634215ad7c2e04ad575d444199f0ac82e59d9bafbe46af608e480027621386"} Sep 30 07:51:52 crc kubenswrapper[4760]: I0930 07:51:52.051272 4760 generic.go:334] "Generic (PLEG): container finished" podID="d7cb900f-4fea-42ed-a186-c173a16463b6" 
containerID="ec711e8c3a7507a28e3a6d934552ff9831df2b9170f1ca7e50df57208ddf6955" exitCode=0 Sep 30 07:51:52 crc kubenswrapper[4760]: I0930 07:51:52.051437 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-8b87-account-create-gpd5x" event={"ID":"d7cb900f-4fea-42ed-a186-c173a16463b6","Type":"ContainerDied","Data":"ec711e8c3a7507a28e3a6d934552ff9831df2b9170f1ca7e50df57208ddf6955"} Sep 30 07:51:52 crc kubenswrapper[4760]: I0930 07:51:52.051488 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-8b87-account-create-gpd5x" event={"ID":"d7cb900f-4fea-42ed-a186-c173a16463b6","Type":"ContainerStarted","Data":"00a7f9b2e9ca22a04dfbefc171660ebfdb61bf020121a8b376adf5c0958e64ef"} Sep 30 07:51:52 crc kubenswrapper[4760]: I0930 07:51:52.054043 4760 generic.go:334] "Generic (PLEG): container finished" podID="bb3cbe0b-da3a-48ff-8181-68a3fb8b9db5" containerID="2f93dcb2a5671b6b0fc5400122ea9d2c23b57f214cd9ec9ae187a377b8d40206" exitCode=0 Sep 30 07:51:52 crc kubenswrapper[4760]: I0930 07:51:52.054084 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-7b12-account-create-59swt" event={"ID":"bb3cbe0b-da3a-48ff-8181-68a3fb8b9db5","Type":"ContainerDied","Data":"2f93dcb2a5671b6b0fc5400122ea9d2c23b57f214cd9ec9ae187a377b8d40206"} Sep 30 07:51:52 crc kubenswrapper[4760]: I0930 07:51:52.054113 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-7b12-account-create-59swt" event={"ID":"bb3cbe0b-da3a-48ff-8181-68a3fb8b9db5","Type":"ContainerStarted","Data":"62f7ccdb79741ba79cc90b55867c7c9f31872b2b95820d08f10ccf6247ef7157"} Sep 30 07:51:52 crc kubenswrapper[4760]: I0930 07:51:52.095435 4760 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=3.095415004 podStartE2EDuration="3.095415004s" podCreationTimestamp="2025-09-30 07:51:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 07:51:52.070684764 +0000 UTC m=+1097.713591176" watchObservedRunningTime="2025-09-30 07:51:52.095415004 +0000 UTC m=+1097.738321426" Sep 30 07:51:53 crc kubenswrapper[4760]: I0930 07:51:53.070823 4760 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="d410e17f-e92d-4b85-af7b-2d27431e4b75" containerName="ceilometer-central-agent" containerID="cri-o://a90a802e56eee080de1a520ad9bed255f68e7840b50fe4c20f1006ef90e84d41" gracePeriod=30 Sep 30 07:51:53 crc kubenswrapper[4760]: I0930 07:51:53.071748 4760 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="d410e17f-e92d-4b85-af7b-2d27431e4b75" containerName="proxy-httpd" containerID="cri-o://302c476965442aba60f1695567d4b8cbcfb43f5fa7c37b171d3ae7de291f862d" gracePeriod=30 Sep 30 07:51:53 crc kubenswrapper[4760]: I0930 07:51:53.071771 4760 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="d410e17f-e92d-4b85-af7b-2d27431e4b75" containerName="sg-core" containerID="cri-o://e481f4caccd87adea3b8c6b1aba73eda39dcfe230bef2c73a5a5f95a1a236567" gracePeriod=30 Sep 30 07:51:53 crc kubenswrapper[4760]: I0930 07:51:53.071785 4760 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="d410e17f-e92d-4b85-af7b-2d27431e4b75" containerName="ceilometer-notification-agent" containerID="cri-o://0d4451fa8b93914d9523781207c03908e8c85a996636fd66bf9466a8c552cedb" gracePeriod=30 Sep 30 07:51:53 crc kubenswrapper[4760]: I0930 07:51:53.082754 4760 generic.go:334] "Generic (PLEG): container finished" podID="d94a6d3c-5790-45ac-b75e-02cf0defd846" containerID="1d1c275a662b02cb610bdf134f74cd28cfc1debbed525b5ec2ee60f0662b2604" exitCode=0 Sep 30 07:51:53 crc kubenswrapper[4760]: I0930 07:51:53.090582 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="" pod="openstack/ceilometer-0" Sep 30 07:51:53 crc kubenswrapper[4760]: I0930 07:51:53.095057 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d410e17f-e92d-4b85-af7b-2d27431e4b75","Type":"ContainerStarted","Data":"302c476965442aba60f1695567d4b8cbcfb43f5fa7c37b171d3ae7de291f862d"} Sep 30 07:51:53 crc kubenswrapper[4760]: I0930 07:51:53.095209 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-7b10-account-create-nhsdp" event={"ID":"d94a6d3c-5790-45ac-b75e-02cf0defd846","Type":"ContainerDied","Data":"1d1c275a662b02cb610bdf134f74cd28cfc1debbed525b5ec2ee60f0662b2604"} Sep 30 07:51:53 crc kubenswrapper[4760]: I0930 07:51:53.117109 4760 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=1.869544847 podStartE2EDuration="6.117090414s" podCreationTimestamp="2025-09-30 07:51:47 +0000 UTC" firstStartedPulling="2025-09-30 07:51:47.845794424 +0000 UTC m=+1093.488700836" lastFinishedPulling="2025-09-30 07:51:52.093339991 +0000 UTC m=+1097.736246403" observedRunningTime="2025-09-30 07:51:53.095596027 +0000 UTC m=+1098.738502439" watchObservedRunningTime="2025-09-30 07:51:53.117090414 +0000 UTC m=+1098.759996826" Sep 30 07:51:53 crc kubenswrapper[4760]: I0930 07:51:53.631872 4760 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-7b12-account-create-59swt" Sep 30 07:51:53 crc kubenswrapper[4760]: I0930 07:51:53.641495 4760 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-8b87-account-create-gpd5x" Sep 30 07:51:53 crc kubenswrapper[4760]: I0930 07:51:53.729134 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l4mrd\" (UniqueName: \"kubernetes.io/projected/d7cb900f-4fea-42ed-a186-c173a16463b6-kube-api-access-l4mrd\") pod \"d7cb900f-4fea-42ed-a186-c173a16463b6\" (UID: \"d7cb900f-4fea-42ed-a186-c173a16463b6\") " Sep 30 07:51:53 crc kubenswrapper[4760]: I0930 07:51:53.729236 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-22lgc\" (UniqueName: \"kubernetes.io/projected/bb3cbe0b-da3a-48ff-8181-68a3fb8b9db5-kube-api-access-22lgc\") pod \"bb3cbe0b-da3a-48ff-8181-68a3fb8b9db5\" (UID: \"bb3cbe0b-da3a-48ff-8181-68a3fb8b9db5\") " Sep 30 07:51:53 crc kubenswrapper[4760]: I0930 07:51:53.735129 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d7cb900f-4fea-42ed-a186-c173a16463b6-kube-api-access-l4mrd" (OuterVolumeSpecName: "kube-api-access-l4mrd") pod "d7cb900f-4fea-42ed-a186-c173a16463b6" (UID: "d7cb900f-4fea-42ed-a186-c173a16463b6"). InnerVolumeSpecName "kube-api-access-l4mrd". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 07:51:53 crc kubenswrapper[4760]: I0930 07:51:53.735155 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bb3cbe0b-da3a-48ff-8181-68a3fb8b9db5-kube-api-access-22lgc" (OuterVolumeSpecName: "kube-api-access-22lgc") pod "bb3cbe0b-da3a-48ff-8181-68a3fb8b9db5" (UID: "bb3cbe0b-da3a-48ff-8181-68a3fb8b9db5"). InnerVolumeSpecName "kube-api-access-22lgc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 07:51:53 crc kubenswrapper[4760]: I0930 07:51:53.831628 4760 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l4mrd\" (UniqueName: \"kubernetes.io/projected/d7cb900f-4fea-42ed-a186-c173a16463b6-kube-api-access-l4mrd\") on node \"crc\" DevicePath \"\"" Sep 30 07:51:53 crc kubenswrapper[4760]: I0930 07:51:53.831683 4760 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-22lgc\" (UniqueName: \"kubernetes.io/projected/bb3cbe0b-da3a-48ff-8181-68a3fb8b9db5-kube-api-access-22lgc\") on node \"crc\" DevicePath \"\"" Sep 30 07:51:54 crc kubenswrapper[4760]: I0930 07:51:54.095901 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-8b87-account-create-gpd5x" event={"ID":"d7cb900f-4fea-42ed-a186-c173a16463b6","Type":"ContainerDied","Data":"00a7f9b2e9ca22a04dfbefc171660ebfdb61bf020121a8b376adf5c0958e64ef"} Sep 30 07:51:54 crc kubenswrapper[4760]: I0930 07:51:54.095938 4760 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="00a7f9b2e9ca22a04dfbefc171660ebfdb61bf020121a8b376adf5c0958e64ef" Sep 30 07:51:54 crc kubenswrapper[4760]: I0930 07:51:54.095937 4760 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-8b87-account-create-gpd5x" Sep 30 07:51:54 crc kubenswrapper[4760]: I0930 07:51:54.098937 4760 generic.go:334] "Generic (PLEG): container finished" podID="d410e17f-e92d-4b85-af7b-2d27431e4b75" containerID="302c476965442aba60f1695567d4b8cbcfb43f5fa7c37b171d3ae7de291f862d" exitCode=0 Sep 30 07:51:54 crc kubenswrapper[4760]: I0930 07:51:54.098981 4760 generic.go:334] "Generic (PLEG): container finished" podID="d410e17f-e92d-4b85-af7b-2d27431e4b75" containerID="e481f4caccd87adea3b8c6b1aba73eda39dcfe230bef2c73a5a5f95a1a236567" exitCode=2 Sep 30 07:51:54 crc kubenswrapper[4760]: I0930 07:51:54.098992 4760 generic.go:334] "Generic (PLEG): container finished" podID="d410e17f-e92d-4b85-af7b-2d27431e4b75" containerID="0d4451fa8b93914d9523781207c03908e8c85a996636fd66bf9466a8c552cedb" exitCode=0 Sep 30 07:51:54 crc kubenswrapper[4760]: I0930 07:51:54.099035 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d410e17f-e92d-4b85-af7b-2d27431e4b75","Type":"ContainerDied","Data":"302c476965442aba60f1695567d4b8cbcfb43f5fa7c37b171d3ae7de291f862d"} Sep 30 07:51:54 crc kubenswrapper[4760]: I0930 07:51:54.099074 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d410e17f-e92d-4b85-af7b-2d27431e4b75","Type":"ContainerDied","Data":"e481f4caccd87adea3b8c6b1aba73eda39dcfe230bef2c73a5a5f95a1a236567"} Sep 30 07:51:54 crc kubenswrapper[4760]: I0930 07:51:54.099088 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d410e17f-e92d-4b85-af7b-2d27431e4b75","Type":"ContainerDied","Data":"0d4451fa8b93914d9523781207c03908e8c85a996636fd66bf9466a8c552cedb"} Sep 30 07:51:54 crc kubenswrapper[4760]: I0930 07:51:54.101568 4760 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-7b12-account-create-59swt" Sep 30 07:51:54 crc kubenswrapper[4760]: I0930 07:51:54.109410 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-7b12-account-create-59swt" event={"ID":"bb3cbe0b-da3a-48ff-8181-68a3fb8b9db5","Type":"ContainerDied","Data":"62f7ccdb79741ba79cc90b55867c7c9f31872b2b95820d08f10ccf6247ef7157"} Sep 30 07:51:54 crc kubenswrapper[4760]: I0930 07:51:54.109482 4760 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="62f7ccdb79741ba79cc90b55867c7c9f31872b2b95820d08f10ccf6247ef7157" Sep 30 07:51:54 crc kubenswrapper[4760]: I0930 07:51:54.281380 4760 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Sep 30 07:51:54 crc kubenswrapper[4760]: I0930 07:51:54.281458 4760 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Sep 30 07:51:54 crc kubenswrapper[4760]: I0930 07:51:54.328884 4760 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Sep 30 07:51:54 crc kubenswrapper[4760]: I0930 07:51:54.351323 4760 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Sep 30 07:51:54 crc kubenswrapper[4760]: I0930 07:51:54.394237 4760 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Sep 30 07:51:54 crc kubenswrapper[4760]: I0930 07:51:54.504262 4760 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-7b10-account-create-nhsdp" Sep 30 07:51:54 crc kubenswrapper[4760]: I0930 07:51:54.648695 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xx62x\" (UniqueName: \"kubernetes.io/projected/d94a6d3c-5790-45ac-b75e-02cf0defd846-kube-api-access-xx62x\") pod \"d94a6d3c-5790-45ac-b75e-02cf0defd846\" (UID: \"d94a6d3c-5790-45ac-b75e-02cf0defd846\") " Sep 30 07:51:54 crc kubenswrapper[4760]: I0930 07:51:54.660745 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d94a6d3c-5790-45ac-b75e-02cf0defd846-kube-api-access-xx62x" (OuterVolumeSpecName: "kube-api-access-xx62x") pod "d94a6d3c-5790-45ac-b75e-02cf0defd846" (UID: "d94a6d3c-5790-45ac-b75e-02cf0defd846"). InnerVolumeSpecName "kube-api-access-xx62x". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 07:51:54 crc kubenswrapper[4760]: I0930 07:51:54.752533 4760 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xx62x\" (UniqueName: \"kubernetes.io/projected/d94a6d3c-5790-45ac-b75e-02cf0defd846-kube-api-access-xx62x\") on node \"crc\" DevicePath \"\"" Sep 30 07:51:54 crc kubenswrapper[4760]: I0930 07:51:54.761193 4760 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Sep 30 07:51:54 crc kubenswrapper[4760]: I0930 07:51:54.853959 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d410e17f-e92d-4b85-af7b-2d27431e4b75-run-httpd\") pod \"d410e17f-e92d-4b85-af7b-2d27431e4b75\" (UID: \"d410e17f-e92d-4b85-af7b-2d27431e4b75\") " Sep 30 07:51:54 crc kubenswrapper[4760]: I0930 07:51:54.854101 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/d410e17f-e92d-4b85-af7b-2d27431e4b75-sg-core-conf-yaml\") pod \"d410e17f-e92d-4b85-af7b-2d27431e4b75\" (UID: \"d410e17f-e92d-4b85-af7b-2d27431e4b75\") " Sep 30 07:51:54 crc kubenswrapper[4760]: I0930 07:51:54.854173 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d410e17f-e92d-4b85-af7b-2d27431e4b75-log-httpd\") pod \"d410e17f-e92d-4b85-af7b-2d27431e4b75\" (UID: \"d410e17f-e92d-4b85-af7b-2d27431e4b75\") " Sep 30 07:51:54 crc kubenswrapper[4760]: I0930 07:51:54.854193 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2hc64\" (UniqueName: \"kubernetes.io/projected/d410e17f-e92d-4b85-af7b-2d27431e4b75-kube-api-access-2hc64\") pod \"d410e17f-e92d-4b85-af7b-2d27431e4b75\" (UID: \"d410e17f-e92d-4b85-af7b-2d27431e4b75\") " Sep 30 07:51:54 crc kubenswrapper[4760]: I0930 07:51:54.854249 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d410e17f-e92d-4b85-af7b-2d27431e4b75-combined-ca-bundle\") pod \"d410e17f-e92d-4b85-af7b-2d27431e4b75\" (UID: \"d410e17f-e92d-4b85-af7b-2d27431e4b75\") " Sep 30 07:51:54 crc kubenswrapper[4760]: I0930 07:51:54.854327 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" 
(UniqueName: \"kubernetes.io/secret/d410e17f-e92d-4b85-af7b-2d27431e4b75-config-data\") pod \"d410e17f-e92d-4b85-af7b-2d27431e4b75\" (UID: \"d410e17f-e92d-4b85-af7b-2d27431e4b75\") " Sep 30 07:51:54 crc kubenswrapper[4760]: I0930 07:51:54.854372 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d410e17f-e92d-4b85-af7b-2d27431e4b75-scripts\") pod \"d410e17f-e92d-4b85-af7b-2d27431e4b75\" (UID: \"d410e17f-e92d-4b85-af7b-2d27431e4b75\") " Sep 30 07:51:54 crc kubenswrapper[4760]: I0930 07:51:54.854875 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d410e17f-e92d-4b85-af7b-2d27431e4b75-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "d410e17f-e92d-4b85-af7b-2d27431e4b75" (UID: "d410e17f-e92d-4b85-af7b-2d27431e4b75"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 07:51:54 crc kubenswrapper[4760]: I0930 07:51:54.855005 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d410e17f-e92d-4b85-af7b-2d27431e4b75-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "d410e17f-e92d-4b85-af7b-2d27431e4b75" (UID: "d410e17f-e92d-4b85-af7b-2d27431e4b75"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 07:51:54 crc kubenswrapper[4760]: I0930 07:51:54.858265 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d410e17f-e92d-4b85-af7b-2d27431e4b75-kube-api-access-2hc64" (OuterVolumeSpecName: "kube-api-access-2hc64") pod "d410e17f-e92d-4b85-af7b-2d27431e4b75" (UID: "d410e17f-e92d-4b85-af7b-2d27431e4b75"). InnerVolumeSpecName "kube-api-access-2hc64". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 07:51:54 crc kubenswrapper[4760]: I0930 07:51:54.858426 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d410e17f-e92d-4b85-af7b-2d27431e4b75-scripts" (OuterVolumeSpecName: "scripts") pod "d410e17f-e92d-4b85-af7b-2d27431e4b75" (UID: "d410e17f-e92d-4b85-af7b-2d27431e4b75"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 07:51:54 crc kubenswrapper[4760]: I0930 07:51:54.885343 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d410e17f-e92d-4b85-af7b-2d27431e4b75-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "d410e17f-e92d-4b85-af7b-2d27431e4b75" (UID: "d410e17f-e92d-4b85-af7b-2d27431e4b75"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 07:51:54 crc kubenswrapper[4760]: I0930 07:51:54.944099 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d410e17f-e92d-4b85-af7b-2d27431e4b75-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d410e17f-e92d-4b85-af7b-2d27431e4b75" (UID: "d410e17f-e92d-4b85-af7b-2d27431e4b75"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 07:51:54 crc kubenswrapper[4760]: I0930 07:51:54.956450 4760 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/d410e17f-e92d-4b85-af7b-2d27431e4b75-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Sep 30 07:51:54 crc kubenswrapper[4760]: I0930 07:51:54.956481 4760 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d410e17f-e92d-4b85-af7b-2d27431e4b75-log-httpd\") on node \"crc\" DevicePath \"\"" Sep 30 07:51:54 crc kubenswrapper[4760]: I0930 07:51:54.956491 4760 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2hc64\" (UniqueName: \"kubernetes.io/projected/d410e17f-e92d-4b85-af7b-2d27431e4b75-kube-api-access-2hc64\") on node \"crc\" DevicePath \"\"" Sep 30 07:51:54 crc kubenswrapper[4760]: I0930 07:51:54.956501 4760 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d410e17f-e92d-4b85-af7b-2d27431e4b75-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 07:51:54 crc kubenswrapper[4760]: I0930 07:51:54.956510 4760 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d410e17f-e92d-4b85-af7b-2d27431e4b75-scripts\") on node \"crc\" DevicePath \"\"" Sep 30 07:51:54 crc kubenswrapper[4760]: I0930 07:51:54.956519 4760 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d410e17f-e92d-4b85-af7b-2d27431e4b75-run-httpd\") on node \"crc\" DevicePath \"\"" Sep 30 07:51:54 crc kubenswrapper[4760]: I0930 07:51:54.988771 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d410e17f-e92d-4b85-af7b-2d27431e4b75-config-data" (OuterVolumeSpecName: "config-data") pod "d410e17f-e92d-4b85-af7b-2d27431e4b75" (UID: "d410e17f-e92d-4b85-af7b-2d27431e4b75"). 
InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 07:51:55 crc kubenswrapper[4760]: I0930 07:51:55.079156 4760 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d410e17f-e92d-4b85-af7b-2d27431e4b75-config-data\") on node \"crc\" DevicePath \"\"" Sep 30 07:51:55 crc kubenswrapper[4760]: I0930 07:51:55.113229 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-7b10-account-create-nhsdp" event={"ID":"d94a6d3c-5790-45ac-b75e-02cf0defd846","Type":"ContainerDied","Data":"fa634215ad7c2e04ad575d444199f0ac82e59d9bafbe46af608e480027621386"} Sep 30 07:51:55 crc kubenswrapper[4760]: I0930 07:51:55.114585 4760 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fa634215ad7c2e04ad575d444199f0ac82e59d9bafbe46af608e480027621386" Sep 30 07:51:55 crc kubenswrapper[4760]: I0930 07:51:55.113280 4760 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-7b10-account-create-nhsdp" Sep 30 07:51:55 crc kubenswrapper[4760]: I0930 07:51:55.120934 4760 generic.go:334] "Generic (PLEG): container finished" podID="d410e17f-e92d-4b85-af7b-2d27431e4b75" containerID="a90a802e56eee080de1a520ad9bed255f68e7840b50fe4c20f1006ef90e84d41" exitCode=0 Sep 30 07:51:55 crc kubenswrapper[4760]: I0930 07:51:55.121257 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d410e17f-e92d-4b85-af7b-2d27431e4b75","Type":"ContainerDied","Data":"a90a802e56eee080de1a520ad9bed255f68e7840b50fe4c20f1006ef90e84d41"} Sep 30 07:51:55 crc kubenswrapper[4760]: I0930 07:51:55.121353 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d410e17f-e92d-4b85-af7b-2d27431e4b75","Type":"ContainerDied","Data":"6fa6dd4fdc1f8e6a68bea1d00afc1668c115821b2bd1c4455a5277f57dfa2f08"} Sep 30 07:51:55 crc kubenswrapper[4760]: I0930 07:51:55.121380 4760 scope.go:117] "RemoveContainer" containerID="302c476965442aba60f1695567d4b8cbcfb43f5fa7c37b171d3ae7de291f862d" Sep 30 07:51:55 crc kubenswrapper[4760]: I0930 07:51:55.121616 4760 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Sep 30 07:51:55 crc kubenswrapper[4760]: I0930 07:51:55.122310 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Sep 30 07:51:55 crc kubenswrapper[4760]: I0930 07:51:55.122414 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Sep 30 07:51:55 crc kubenswrapper[4760]: I0930 07:51:55.173534 4760 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Sep 30 07:51:55 crc kubenswrapper[4760]: I0930 07:51:55.180156 4760 scope.go:117] "RemoveContainer" containerID="e481f4caccd87adea3b8c6b1aba73eda39dcfe230bef2c73a5a5f95a1a236567" Sep 30 07:51:55 crc kubenswrapper[4760]: I0930 07:51:55.193853 4760 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Sep 30 07:51:55 crc kubenswrapper[4760]: I0930 07:51:55.209499 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Sep 30 07:51:55 crc kubenswrapper[4760]: E0930 07:51:55.210027 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d410e17f-e92d-4b85-af7b-2d27431e4b75" containerName="ceilometer-notification-agent" Sep 30 07:51:55 crc kubenswrapper[4760]: I0930 07:51:55.210048 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="d410e17f-e92d-4b85-af7b-2d27431e4b75" containerName="ceilometer-notification-agent" Sep 30 07:51:55 crc kubenswrapper[4760]: E0930 07:51:55.210061 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d7cb900f-4fea-42ed-a186-c173a16463b6" containerName="mariadb-account-create" Sep 30 07:51:55 crc kubenswrapper[4760]: I0930 07:51:55.210071 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="d7cb900f-4fea-42ed-a186-c173a16463b6" containerName="mariadb-account-create" Sep 30 07:51:55 crc kubenswrapper[4760]: E0930 07:51:55.210115 4760 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="d410e17f-e92d-4b85-af7b-2d27431e4b75" containerName="proxy-httpd" Sep 30 07:51:55 crc kubenswrapper[4760]: I0930 07:51:55.210124 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="d410e17f-e92d-4b85-af7b-2d27431e4b75" containerName="proxy-httpd" Sep 30 07:51:55 crc kubenswrapper[4760]: E0930 07:51:55.210140 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bb3cbe0b-da3a-48ff-8181-68a3fb8b9db5" containerName="mariadb-account-create" Sep 30 07:51:55 crc kubenswrapper[4760]: I0930 07:51:55.210148 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="bb3cbe0b-da3a-48ff-8181-68a3fb8b9db5" containerName="mariadb-account-create" Sep 30 07:51:55 crc kubenswrapper[4760]: E0930 07:51:55.210178 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d410e17f-e92d-4b85-af7b-2d27431e4b75" containerName="sg-core" Sep 30 07:51:55 crc kubenswrapper[4760]: I0930 07:51:55.210186 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="d410e17f-e92d-4b85-af7b-2d27431e4b75" containerName="sg-core" Sep 30 07:51:55 crc kubenswrapper[4760]: E0930 07:51:55.210198 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d94a6d3c-5790-45ac-b75e-02cf0defd846" containerName="mariadb-account-create" Sep 30 07:51:55 crc kubenswrapper[4760]: I0930 07:51:55.210207 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="d94a6d3c-5790-45ac-b75e-02cf0defd846" containerName="mariadb-account-create" Sep 30 07:51:55 crc kubenswrapper[4760]: E0930 07:51:55.210218 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d410e17f-e92d-4b85-af7b-2d27431e4b75" containerName="ceilometer-central-agent" Sep 30 07:51:55 crc kubenswrapper[4760]: I0930 07:51:55.210226 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="d410e17f-e92d-4b85-af7b-2d27431e4b75" containerName="ceilometer-central-agent" Sep 30 07:51:55 crc kubenswrapper[4760]: I0930 07:51:55.210571 4760 memory_manager.go:354] "RemoveStaleState removing 
state" podUID="d7cb900f-4fea-42ed-a186-c173a16463b6" containerName="mariadb-account-create" Sep 30 07:51:55 crc kubenswrapper[4760]: I0930 07:51:55.210597 4760 memory_manager.go:354] "RemoveStaleState removing state" podUID="d410e17f-e92d-4b85-af7b-2d27431e4b75" containerName="proxy-httpd" Sep 30 07:51:55 crc kubenswrapper[4760]: I0930 07:51:55.210614 4760 memory_manager.go:354] "RemoveStaleState removing state" podUID="d410e17f-e92d-4b85-af7b-2d27431e4b75" containerName="ceilometer-central-agent" Sep 30 07:51:55 crc kubenswrapper[4760]: I0930 07:51:55.210632 4760 memory_manager.go:354] "RemoveStaleState removing state" podUID="d410e17f-e92d-4b85-af7b-2d27431e4b75" containerName="ceilometer-notification-agent" Sep 30 07:51:55 crc kubenswrapper[4760]: I0930 07:51:55.210644 4760 memory_manager.go:354] "RemoveStaleState removing state" podUID="d410e17f-e92d-4b85-af7b-2d27431e4b75" containerName="sg-core" Sep 30 07:51:55 crc kubenswrapper[4760]: I0930 07:51:55.210663 4760 memory_manager.go:354] "RemoveStaleState removing state" podUID="bb3cbe0b-da3a-48ff-8181-68a3fb8b9db5" containerName="mariadb-account-create" Sep 30 07:51:55 crc kubenswrapper[4760]: I0930 07:51:55.210676 4760 memory_manager.go:354] "RemoveStaleState removing state" podUID="d94a6d3c-5790-45ac-b75e-02cf0defd846" containerName="mariadb-account-create" Sep 30 07:51:55 crc kubenswrapper[4760]: I0930 07:51:55.213388 4760 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Sep 30 07:51:55 crc kubenswrapper[4760]: I0930 07:51:55.216252 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Sep 30 07:51:55 crc kubenswrapper[4760]: I0930 07:51:55.216566 4760 scope.go:117] "RemoveContainer" containerID="0d4451fa8b93914d9523781207c03908e8c85a996636fd66bf9466a8c552cedb" Sep 30 07:51:55 crc kubenswrapper[4760]: I0930 07:51:55.216748 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Sep 30 07:51:55 crc kubenswrapper[4760]: I0930 07:51:55.217928 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Sep 30 07:51:55 crc kubenswrapper[4760]: I0930 07:51:55.246639 4760 scope.go:117] "RemoveContainer" containerID="a90a802e56eee080de1a520ad9bed255f68e7840b50fe4c20f1006ef90e84d41" Sep 30 07:51:55 crc kubenswrapper[4760]: I0930 07:51:55.278445 4760 scope.go:117] "RemoveContainer" containerID="302c476965442aba60f1695567d4b8cbcfb43f5fa7c37b171d3ae7de291f862d" Sep 30 07:51:55 crc kubenswrapper[4760]: E0930 07:51:55.278760 4760 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"302c476965442aba60f1695567d4b8cbcfb43f5fa7c37b171d3ae7de291f862d\": container with ID starting with 302c476965442aba60f1695567d4b8cbcfb43f5fa7c37b171d3ae7de291f862d not found: ID does not exist" containerID="302c476965442aba60f1695567d4b8cbcfb43f5fa7c37b171d3ae7de291f862d" Sep 30 07:51:55 crc kubenswrapper[4760]: I0930 07:51:55.278794 4760 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"302c476965442aba60f1695567d4b8cbcfb43f5fa7c37b171d3ae7de291f862d"} err="failed to get container status \"302c476965442aba60f1695567d4b8cbcfb43f5fa7c37b171d3ae7de291f862d\": rpc error: code = NotFound desc = could not find container \"302c476965442aba60f1695567d4b8cbcfb43f5fa7c37b171d3ae7de291f862d\": 
container with ID starting with 302c476965442aba60f1695567d4b8cbcfb43f5fa7c37b171d3ae7de291f862d not found: ID does not exist" Sep 30 07:51:55 crc kubenswrapper[4760]: I0930 07:51:55.278814 4760 scope.go:117] "RemoveContainer" containerID="e481f4caccd87adea3b8c6b1aba73eda39dcfe230bef2c73a5a5f95a1a236567" Sep 30 07:51:55 crc kubenswrapper[4760]: E0930 07:51:55.279439 4760 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e481f4caccd87adea3b8c6b1aba73eda39dcfe230bef2c73a5a5f95a1a236567\": container with ID starting with e481f4caccd87adea3b8c6b1aba73eda39dcfe230bef2c73a5a5f95a1a236567 not found: ID does not exist" containerID="e481f4caccd87adea3b8c6b1aba73eda39dcfe230bef2c73a5a5f95a1a236567" Sep 30 07:51:55 crc kubenswrapper[4760]: I0930 07:51:55.279460 4760 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e481f4caccd87adea3b8c6b1aba73eda39dcfe230bef2c73a5a5f95a1a236567"} err="failed to get container status \"e481f4caccd87adea3b8c6b1aba73eda39dcfe230bef2c73a5a5f95a1a236567\": rpc error: code = NotFound desc = could not find container \"e481f4caccd87adea3b8c6b1aba73eda39dcfe230bef2c73a5a5f95a1a236567\": container with ID starting with e481f4caccd87adea3b8c6b1aba73eda39dcfe230bef2c73a5a5f95a1a236567 not found: ID does not exist" Sep 30 07:51:55 crc kubenswrapper[4760]: I0930 07:51:55.279472 4760 scope.go:117] "RemoveContainer" containerID="0d4451fa8b93914d9523781207c03908e8c85a996636fd66bf9466a8c552cedb" Sep 30 07:51:55 crc kubenswrapper[4760]: E0930 07:51:55.279701 4760 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0d4451fa8b93914d9523781207c03908e8c85a996636fd66bf9466a8c552cedb\": container with ID starting with 0d4451fa8b93914d9523781207c03908e8c85a996636fd66bf9466a8c552cedb not found: ID does not exist" 
containerID="0d4451fa8b93914d9523781207c03908e8c85a996636fd66bf9466a8c552cedb" Sep 30 07:51:55 crc kubenswrapper[4760]: I0930 07:51:55.279719 4760 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0d4451fa8b93914d9523781207c03908e8c85a996636fd66bf9466a8c552cedb"} err="failed to get container status \"0d4451fa8b93914d9523781207c03908e8c85a996636fd66bf9466a8c552cedb\": rpc error: code = NotFound desc = could not find container \"0d4451fa8b93914d9523781207c03908e8c85a996636fd66bf9466a8c552cedb\": container with ID starting with 0d4451fa8b93914d9523781207c03908e8c85a996636fd66bf9466a8c552cedb not found: ID does not exist" Sep 30 07:51:55 crc kubenswrapper[4760]: I0930 07:51:55.279729 4760 scope.go:117] "RemoveContainer" containerID="a90a802e56eee080de1a520ad9bed255f68e7840b50fe4c20f1006ef90e84d41" Sep 30 07:51:55 crc kubenswrapper[4760]: E0930 07:51:55.279953 4760 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a90a802e56eee080de1a520ad9bed255f68e7840b50fe4c20f1006ef90e84d41\": container with ID starting with a90a802e56eee080de1a520ad9bed255f68e7840b50fe4c20f1006ef90e84d41 not found: ID does not exist" containerID="a90a802e56eee080de1a520ad9bed255f68e7840b50fe4c20f1006ef90e84d41" Sep 30 07:51:55 crc kubenswrapper[4760]: I0930 07:51:55.279968 4760 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a90a802e56eee080de1a520ad9bed255f68e7840b50fe4c20f1006ef90e84d41"} err="failed to get container status \"a90a802e56eee080de1a520ad9bed255f68e7840b50fe4c20f1006ef90e84d41\": rpc error: code = NotFound desc = could not find container \"a90a802e56eee080de1a520ad9bed255f68e7840b50fe4c20f1006ef90e84d41\": container with ID starting with a90a802e56eee080de1a520ad9bed255f68e7840b50fe4c20f1006ef90e84d41 not found: ID does not exist" Sep 30 07:51:55 crc kubenswrapper[4760]: I0930 07:51:55.392705 4760 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/07118ef0-0fb4-4ea3-b4d2-6aa1021aa822-log-httpd\") pod \"ceilometer-0\" (UID: \"07118ef0-0fb4-4ea3-b4d2-6aa1021aa822\") " pod="openstack/ceilometer-0" Sep 30 07:51:55 crc kubenswrapper[4760]: I0930 07:51:55.392973 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/07118ef0-0fb4-4ea3-b4d2-6aa1021aa822-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"07118ef0-0fb4-4ea3-b4d2-6aa1021aa822\") " pod="openstack/ceilometer-0" Sep 30 07:51:55 crc kubenswrapper[4760]: I0930 07:51:55.393006 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/07118ef0-0fb4-4ea3-b4d2-6aa1021aa822-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"07118ef0-0fb4-4ea3-b4d2-6aa1021aa822\") " pod="openstack/ceilometer-0" Sep 30 07:51:55 crc kubenswrapper[4760]: I0930 07:51:55.393097 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/07118ef0-0fb4-4ea3-b4d2-6aa1021aa822-run-httpd\") pod \"ceilometer-0\" (UID: \"07118ef0-0fb4-4ea3-b4d2-6aa1021aa822\") " pod="openstack/ceilometer-0" Sep 30 07:51:55 crc kubenswrapper[4760]: I0930 07:51:55.393151 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/07118ef0-0fb4-4ea3-b4d2-6aa1021aa822-scripts\") pod \"ceilometer-0\" (UID: \"07118ef0-0fb4-4ea3-b4d2-6aa1021aa822\") " pod="openstack/ceilometer-0" Sep 30 07:51:55 crc kubenswrapper[4760]: I0930 07:51:55.393283 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7vv7f\" (UniqueName: 
\"kubernetes.io/projected/07118ef0-0fb4-4ea3-b4d2-6aa1021aa822-kube-api-access-7vv7f\") pod \"ceilometer-0\" (UID: \"07118ef0-0fb4-4ea3-b4d2-6aa1021aa822\") " pod="openstack/ceilometer-0" Sep 30 07:51:55 crc kubenswrapper[4760]: I0930 07:51:55.393352 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/07118ef0-0fb4-4ea3-b4d2-6aa1021aa822-config-data\") pod \"ceilometer-0\" (UID: \"07118ef0-0fb4-4ea3-b4d2-6aa1021aa822\") " pod="openstack/ceilometer-0" Sep 30 07:51:55 crc kubenswrapper[4760]: I0930 07:51:55.494877 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/07118ef0-0fb4-4ea3-b4d2-6aa1021aa822-scripts\") pod \"ceilometer-0\" (UID: \"07118ef0-0fb4-4ea3-b4d2-6aa1021aa822\") " pod="openstack/ceilometer-0" Sep 30 07:51:55 crc kubenswrapper[4760]: I0930 07:51:55.495054 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7vv7f\" (UniqueName: \"kubernetes.io/projected/07118ef0-0fb4-4ea3-b4d2-6aa1021aa822-kube-api-access-7vv7f\") pod \"ceilometer-0\" (UID: \"07118ef0-0fb4-4ea3-b4d2-6aa1021aa822\") " pod="openstack/ceilometer-0" Sep 30 07:51:55 crc kubenswrapper[4760]: I0930 07:51:55.495125 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/07118ef0-0fb4-4ea3-b4d2-6aa1021aa822-config-data\") pod \"ceilometer-0\" (UID: \"07118ef0-0fb4-4ea3-b4d2-6aa1021aa822\") " pod="openstack/ceilometer-0" Sep 30 07:51:55 crc kubenswrapper[4760]: I0930 07:51:55.495320 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/07118ef0-0fb4-4ea3-b4d2-6aa1021aa822-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"07118ef0-0fb4-4ea3-b4d2-6aa1021aa822\") " pod="openstack/ceilometer-0" Sep 30 07:51:55 crc 
kubenswrapper[4760]: I0930 07:51:55.495353 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/07118ef0-0fb4-4ea3-b4d2-6aa1021aa822-log-httpd\") pod \"ceilometer-0\" (UID: \"07118ef0-0fb4-4ea3-b4d2-6aa1021aa822\") " pod="openstack/ceilometer-0" Sep 30 07:51:55 crc kubenswrapper[4760]: I0930 07:51:55.495390 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/07118ef0-0fb4-4ea3-b4d2-6aa1021aa822-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"07118ef0-0fb4-4ea3-b4d2-6aa1021aa822\") " pod="openstack/ceilometer-0" Sep 30 07:51:55 crc kubenswrapper[4760]: I0930 07:51:55.495572 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/07118ef0-0fb4-4ea3-b4d2-6aa1021aa822-run-httpd\") pod \"ceilometer-0\" (UID: \"07118ef0-0fb4-4ea3-b4d2-6aa1021aa822\") " pod="openstack/ceilometer-0" Sep 30 07:51:55 crc kubenswrapper[4760]: I0930 07:51:55.495997 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/07118ef0-0fb4-4ea3-b4d2-6aa1021aa822-log-httpd\") pod \"ceilometer-0\" (UID: \"07118ef0-0fb4-4ea3-b4d2-6aa1021aa822\") " pod="openstack/ceilometer-0" Sep 30 07:51:55 crc kubenswrapper[4760]: I0930 07:51:55.496062 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/07118ef0-0fb4-4ea3-b4d2-6aa1021aa822-run-httpd\") pod \"ceilometer-0\" (UID: \"07118ef0-0fb4-4ea3-b4d2-6aa1021aa822\") " pod="openstack/ceilometer-0" Sep 30 07:51:55 crc kubenswrapper[4760]: I0930 07:51:55.500756 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/07118ef0-0fb4-4ea3-b4d2-6aa1021aa822-combined-ca-bundle\") pod \"ceilometer-0\" (UID: 
\"07118ef0-0fb4-4ea3-b4d2-6aa1021aa822\") " pod="openstack/ceilometer-0" Sep 30 07:51:55 crc kubenswrapper[4760]: I0930 07:51:55.500876 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/07118ef0-0fb4-4ea3-b4d2-6aa1021aa822-scripts\") pod \"ceilometer-0\" (UID: \"07118ef0-0fb4-4ea3-b4d2-6aa1021aa822\") " pod="openstack/ceilometer-0" Sep 30 07:51:55 crc kubenswrapper[4760]: I0930 07:51:55.503136 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/07118ef0-0fb4-4ea3-b4d2-6aa1021aa822-config-data\") pod \"ceilometer-0\" (UID: \"07118ef0-0fb4-4ea3-b4d2-6aa1021aa822\") " pod="openstack/ceilometer-0" Sep 30 07:51:55 crc kubenswrapper[4760]: I0930 07:51:55.504060 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/07118ef0-0fb4-4ea3-b4d2-6aa1021aa822-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"07118ef0-0fb4-4ea3-b4d2-6aa1021aa822\") " pod="openstack/ceilometer-0" Sep 30 07:51:55 crc kubenswrapper[4760]: I0930 07:51:55.511935 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7vv7f\" (UniqueName: \"kubernetes.io/projected/07118ef0-0fb4-4ea3-b4d2-6aa1021aa822-kube-api-access-7vv7f\") pod \"ceilometer-0\" (UID: \"07118ef0-0fb4-4ea3-b4d2-6aa1021aa822\") " pod="openstack/ceilometer-0" Sep 30 07:51:55 crc kubenswrapper[4760]: I0930 07:51:55.547445 4760 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Sep 30 07:51:55 crc kubenswrapper[4760]: I0930 07:51:55.663837 4760 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Sep 30 07:51:55 crc kubenswrapper[4760]: I0930 07:51:55.664235 4760 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Sep 30 07:51:55 crc kubenswrapper[4760]: I0930 07:51:55.704376 4760 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Sep 30 07:51:55 crc kubenswrapper[4760]: I0930 07:51:55.732376 4760 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Sep 30 07:51:56 crc kubenswrapper[4760]: I0930 07:51:56.020601 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Sep 30 07:51:56 crc kubenswrapper[4760]: I0930 07:51:56.136250 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"07118ef0-0fb4-4ea3-b4d2-6aa1021aa822","Type":"ContainerStarted","Data":"a5cd9d8a0d82b4fa95d6f19c8eeb8e7ab2b73aca62779c50279fa9c479507c36"} Sep 30 07:51:56 crc kubenswrapper[4760]: I0930 07:51:56.139421 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Sep 30 07:51:56 crc kubenswrapper[4760]: I0930 07:51:56.140112 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Sep 30 07:51:56 crc kubenswrapper[4760]: I0930 07:51:56.374971 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-db-sync-wsmzq"] Sep 30 07:51:56 crc kubenswrapper[4760]: I0930 07:51:56.376083 4760 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-wsmzq" Sep 30 07:51:56 crc kubenswrapper[4760]: I0930 07:51:56.378662 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-fw8rm" Sep 30 07:51:56 crc kubenswrapper[4760]: I0930 07:51:56.378669 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Sep 30 07:51:56 crc kubenswrapper[4760]: I0930 07:51:56.378955 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-scripts" Sep 30 07:51:56 crc kubenswrapper[4760]: I0930 07:51:56.405040 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-wsmzq"] Sep 30 07:51:56 crc kubenswrapper[4760]: I0930 07:51:56.517400 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ggzfm\" (UniqueName: \"kubernetes.io/projected/152b47cf-da92-44c1-9b68-90cc849f4b74-kube-api-access-ggzfm\") pod \"nova-cell0-conductor-db-sync-wsmzq\" (UID: \"152b47cf-da92-44c1-9b68-90cc849f4b74\") " pod="openstack/nova-cell0-conductor-db-sync-wsmzq" Sep 30 07:51:56 crc kubenswrapper[4760]: I0930 07:51:56.517640 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/152b47cf-da92-44c1-9b68-90cc849f4b74-scripts\") pod \"nova-cell0-conductor-db-sync-wsmzq\" (UID: \"152b47cf-da92-44c1-9b68-90cc849f4b74\") " pod="openstack/nova-cell0-conductor-db-sync-wsmzq" Sep 30 07:51:56 crc kubenswrapper[4760]: I0930 07:51:56.517684 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/152b47cf-da92-44c1-9b68-90cc849f4b74-config-data\") pod \"nova-cell0-conductor-db-sync-wsmzq\" (UID: \"152b47cf-da92-44c1-9b68-90cc849f4b74\") " 
pod="openstack/nova-cell0-conductor-db-sync-wsmzq" Sep 30 07:51:56 crc kubenswrapper[4760]: I0930 07:51:56.517759 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/152b47cf-da92-44c1-9b68-90cc849f4b74-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-wsmzq\" (UID: \"152b47cf-da92-44c1-9b68-90cc849f4b74\") " pod="openstack/nova-cell0-conductor-db-sync-wsmzq" Sep 30 07:51:56 crc kubenswrapper[4760]: I0930 07:51:56.619119 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ggzfm\" (UniqueName: \"kubernetes.io/projected/152b47cf-da92-44c1-9b68-90cc849f4b74-kube-api-access-ggzfm\") pod \"nova-cell0-conductor-db-sync-wsmzq\" (UID: \"152b47cf-da92-44c1-9b68-90cc849f4b74\") " pod="openstack/nova-cell0-conductor-db-sync-wsmzq" Sep 30 07:51:56 crc kubenswrapper[4760]: I0930 07:51:56.619165 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/152b47cf-da92-44c1-9b68-90cc849f4b74-scripts\") pod \"nova-cell0-conductor-db-sync-wsmzq\" (UID: \"152b47cf-da92-44c1-9b68-90cc849f4b74\") " pod="openstack/nova-cell0-conductor-db-sync-wsmzq" Sep 30 07:51:56 crc kubenswrapper[4760]: I0930 07:51:56.619212 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/152b47cf-da92-44c1-9b68-90cc849f4b74-config-data\") pod \"nova-cell0-conductor-db-sync-wsmzq\" (UID: \"152b47cf-da92-44c1-9b68-90cc849f4b74\") " pod="openstack/nova-cell0-conductor-db-sync-wsmzq" Sep 30 07:51:56 crc kubenswrapper[4760]: I0930 07:51:56.619318 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/152b47cf-da92-44c1-9b68-90cc849f4b74-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-wsmzq\" (UID: 
\"152b47cf-da92-44c1-9b68-90cc849f4b74\") " pod="openstack/nova-cell0-conductor-db-sync-wsmzq" Sep 30 07:51:56 crc kubenswrapper[4760]: I0930 07:51:56.623134 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/152b47cf-da92-44c1-9b68-90cc849f4b74-config-data\") pod \"nova-cell0-conductor-db-sync-wsmzq\" (UID: \"152b47cf-da92-44c1-9b68-90cc849f4b74\") " pod="openstack/nova-cell0-conductor-db-sync-wsmzq" Sep 30 07:51:56 crc kubenswrapper[4760]: I0930 07:51:56.623184 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/152b47cf-da92-44c1-9b68-90cc849f4b74-scripts\") pod \"nova-cell0-conductor-db-sync-wsmzq\" (UID: \"152b47cf-da92-44c1-9b68-90cc849f4b74\") " pod="openstack/nova-cell0-conductor-db-sync-wsmzq" Sep 30 07:51:56 crc kubenswrapper[4760]: I0930 07:51:56.623282 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/152b47cf-da92-44c1-9b68-90cc849f4b74-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-wsmzq\" (UID: \"152b47cf-da92-44c1-9b68-90cc849f4b74\") " pod="openstack/nova-cell0-conductor-db-sync-wsmzq" Sep 30 07:51:56 crc kubenswrapper[4760]: I0930 07:51:56.641752 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ggzfm\" (UniqueName: \"kubernetes.io/projected/152b47cf-da92-44c1-9b68-90cc849f4b74-kube-api-access-ggzfm\") pod \"nova-cell0-conductor-db-sync-wsmzq\" (UID: \"152b47cf-da92-44c1-9b68-90cc849f4b74\") " pod="openstack/nova-cell0-conductor-db-sync-wsmzq" Sep 30 07:51:56 crc kubenswrapper[4760]: I0930 07:51:56.698017 4760 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-wsmzq" Sep 30 07:51:57 crc kubenswrapper[4760]: I0930 07:51:57.080341 4760 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d410e17f-e92d-4b85-af7b-2d27431e4b75" path="/var/lib/kubelet/pods/d410e17f-e92d-4b85-af7b-2d27431e4b75/volumes" Sep 30 07:51:57 crc kubenswrapper[4760]: I0930 07:51:57.148481 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"07118ef0-0fb4-4ea3-b4d2-6aa1021aa822","Type":"ContainerStarted","Data":"45fc9ee790e2cbd08079379f05e81fc29acbe94ffd1605279ea8b52680b2323b"} Sep 30 07:51:57 crc kubenswrapper[4760]: I0930 07:51:57.148922 4760 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Sep 30 07:51:57 crc kubenswrapper[4760]: I0930 07:51:57.148942 4760 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Sep 30 07:51:57 crc kubenswrapper[4760]: I0930 07:51:57.288763 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-wsmzq"] Sep 30 07:51:57 crc kubenswrapper[4760]: I0930 07:51:57.456294 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Sep 30 07:51:57 crc kubenswrapper[4760]: I0930 07:51:57.458856 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Sep 30 07:51:58 crc kubenswrapper[4760]: I0930 07:51:58.195548 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"07118ef0-0fb4-4ea3-b4d2-6aa1021aa822","Type":"ContainerStarted","Data":"52b9b7ac724d0e6144c9b271ae11f93c0f6bad3eaa320c93fd6f739e7c494a52"} Sep 30 07:51:58 crc kubenswrapper[4760]: I0930 07:51:58.197062 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-wsmzq" 
event={"ID":"152b47cf-da92-44c1-9b68-90cc849f4b74","Type":"ContainerStarted","Data":"2c722d9351d7fd85839ca27f0b6cf5c80d5100388e75591584c6cdc3f69a6cc3"} Sep 30 07:51:58 crc kubenswrapper[4760]: I0930 07:51:58.197178 4760 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Sep 30 07:51:58 crc kubenswrapper[4760]: I0930 07:51:58.197199 4760 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Sep 30 07:51:58 crc kubenswrapper[4760]: I0930 07:51:58.322378 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Sep 30 07:51:58 crc kubenswrapper[4760]: I0930 07:51:58.373033 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Sep 30 07:51:59 crc kubenswrapper[4760]: I0930 07:51:59.214419 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"07118ef0-0fb4-4ea3-b4d2-6aa1021aa822","Type":"ContainerStarted","Data":"37c68203b847ab8c3b937c49eae5cda3e495e09802d4a39ea8acdbfa03449226"} Sep 30 07:51:59 crc kubenswrapper[4760]: I0930 07:51:59.649377 4760 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Sep 30 07:52:00 crc kubenswrapper[4760]: I0930 07:52:00.229815 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"07118ef0-0fb4-4ea3-b4d2-6aa1021aa822","Type":"ContainerStarted","Data":"9e8c190d132e0c7a6205ca300eb43f0b3906c1a1710f6820491c1bebc9daab13"} Sep 30 07:52:00 crc kubenswrapper[4760]: I0930 07:52:00.252472 4760 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=1.8711627320000002 podStartE2EDuration="5.252454868s" podCreationTimestamp="2025-09-30 07:51:55 +0000 UTC" firstStartedPulling="2025-09-30 07:51:56.025426441 +0000 UTC m=+1101.668332843" lastFinishedPulling="2025-09-30 07:51:59.406718567 
+0000 UTC m=+1105.049624979" observedRunningTime="2025-09-30 07:52:00.24902583 +0000 UTC m=+1105.891932242" watchObservedRunningTime="2025-09-30 07:52:00.252454868 +0000 UTC m=+1105.895361280" Sep 30 07:52:01 crc kubenswrapper[4760]: I0930 07:52:01.245648 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Sep 30 07:52:01 crc kubenswrapper[4760]: I0930 07:52:01.730091 4760 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Sep 30 07:52:03 crc kubenswrapper[4760]: I0930 07:52:03.265717 4760 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="07118ef0-0fb4-4ea3-b4d2-6aa1021aa822" containerName="ceilometer-central-agent" containerID="cri-o://45fc9ee790e2cbd08079379f05e81fc29acbe94ffd1605279ea8b52680b2323b" gracePeriod=30 Sep 30 07:52:03 crc kubenswrapper[4760]: I0930 07:52:03.266159 4760 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="07118ef0-0fb4-4ea3-b4d2-6aa1021aa822" containerName="proxy-httpd" containerID="cri-o://9e8c190d132e0c7a6205ca300eb43f0b3906c1a1710f6820491c1bebc9daab13" gracePeriod=30 Sep 30 07:52:03 crc kubenswrapper[4760]: I0930 07:52:03.266251 4760 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="07118ef0-0fb4-4ea3-b4d2-6aa1021aa822" containerName="sg-core" containerID="cri-o://37c68203b847ab8c3b937c49eae5cda3e495e09802d4a39ea8acdbfa03449226" gracePeriod=30 Sep 30 07:52:03 crc kubenswrapper[4760]: I0930 07:52:03.266341 4760 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="07118ef0-0fb4-4ea3-b4d2-6aa1021aa822" containerName="ceilometer-notification-agent" containerID="cri-o://52b9b7ac724d0e6144c9b271ae11f93c0f6bad3eaa320c93fd6f739e7c494a52" gracePeriod=30 Sep 30 07:52:04 crc kubenswrapper[4760]: I0930 07:52:04.280540 4760 generic.go:334] "Generic 
(PLEG): container finished" podID="07118ef0-0fb4-4ea3-b4d2-6aa1021aa822" containerID="9e8c190d132e0c7a6205ca300eb43f0b3906c1a1710f6820491c1bebc9daab13" exitCode=0 Sep 30 07:52:04 crc kubenswrapper[4760]: I0930 07:52:04.280891 4760 generic.go:334] "Generic (PLEG): container finished" podID="07118ef0-0fb4-4ea3-b4d2-6aa1021aa822" containerID="37c68203b847ab8c3b937c49eae5cda3e495e09802d4a39ea8acdbfa03449226" exitCode=2 Sep 30 07:52:04 crc kubenswrapper[4760]: I0930 07:52:04.280904 4760 generic.go:334] "Generic (PLEG): container finished" podID="07118ef0-0fb4-4ea3-b4d2-6aa1021aa822" containerID="52b9b7ac724d0e6144c9b271ae11f93c0f6bad3eaa320c93fd6f739e7c494a52" exitCode=0 Sep 30 07:52:04 crc kubenswrapper[4760]: I0930 07:52:04.280911 4760 generic.go:334] "Generic (PLEG): container finished" podID="07118ef0-0fb4-4ea3-b4d2-6aa1021aa822" containerID="45fc9ee790e2cbd08079379f05e81fc29acbe94ffd1605279ea8b52680b2323b" exitCode=0 Sep 30 07:52:04 crc kubenswrapper[4760]: I0930 07:52:04.280597 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"07118ef0-0fb4-4ea3-b4d2-6aa1021aa822","Type":"ContainerDied","Data":"9e8c190d132e0c7a6205ca300eb43f0b3906c1a1710f6820491c1bebc9daab13"} Sep 30 07:52:04 crc kubenswrapper[4760]: I0930 07:52:04.280954 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"07118ef0-0fb4-4ea3-b4d2-6aa1021aa822","Type":"ContainerDied","Data":"37c68203b847ab8c3b937c49eae5cda3e495e09802d4a39ea8acdbfa03449226"} Sep 30 07:52:04 crc kubenswrapper[4760]: I0930 07:52:04.280970 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"07118ef0-0fb4-4ea3-b4d2-6aa1021aa822","Type":"ContainerDied","Data":"52b9b7ac724d0e6144c9b271ae11f93c0f6bad3eaa320c93fd6f739e7c494a52"} Sep 30 07:52:04 crc kubenswrapper[4760]: I0930 07:52:04.280981 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"07118ef0-0fb4-4ea3-b4d2-6aa1021aa822","Type":"ContainerDied","Data":"45fc9ee790e2cbd08079379f05e81fc29acbe94ffd1605279ea8b52680b2323b"} Sep 30 07:52:05 crc kubenswrapper[4760]: I0930 07:52:05.992551 4760 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Sep 30 07:52:06 crc kubenswrapper[4760]: I0930 07:52:06.022866 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7vv7f\" (UniqueName: \"kubernetes.io/projected/07118ef0-0fb4-4ea3-b4d2-6aa1021aa822-kube-api-access-7vv7f\") pod \"07118ef0-0fb4-4ea3-b4d2-6aa1021aa822\" (UID: \"07118ef0-0fb4-4ea3-b4d2-6aa1021aa822\") " Sep 30 07:52:06 crc kubenswrapper[4760]: I0930 07:52:06.022941 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/07118ef0-0fb4-4ea3-b4d2-6aa1021aa822-combined-ca-bundle\") pod \"07118ef0-0fb4-4ea3-b4d2-6aa1021aa822\" (UID: \"07118ef0-0fb4-4ea3-b4d2-6aa1021aa822\") " Sep 30 07:52:06 crc kubenswrapper[4760]: I0930 07:52:06.023069 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/07118ef0-0fb4-4ea3-b4d2-6aa1021aa822-scripts\") pod \"07118ef0-0fb4-4ea3-b4d2-6aa1021aa822\" (UID: \"07118ef0-0fb4-4ea3-b4d2-6aa1021aa822\") " Sep 30 07:52:06 crc kubenswrapper[4760]: I0930 07:52:06.023165 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/07118ef0-0fb4-4ea3-b4d2-6aa1021aa822-sg-core-conf-yaml\") pod \"07118ef0-0fb4-4ea3-b4d2-6aa1021aa822\" (UID: \"07118ef0-0fb4-4ea3-b4d2-6aa1021aa822\") " Sep 30 07:52:06 crc kubenswrapper[4760]: I0930 07:52:06.023232 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/07118ef0-0fb4-4ea3-b4d2-6aa1021aa822-config-data\") pod 
\"07118ef0-0fb4-4ea3-b4d2-6aa1021aa822\" (UID: \"07118ef0-0fb4-4ea3-b4d2-6aa1021aa822\") " Sep 30 07:52:06 crc kubenswrapper[4760]: I0930 07:52:06.023283 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/07118ef0-0fb4-4ea3-b4d2-6aa1021aa822-log-httpd\") pod \"07118ef0-0fb4-4ea3-b4d2-6aa1021aa822\" (UID: \"07118ef0-0fb4-4ea3-b4d2-6aa1021aa822\") " Sep 30 07:52:06 crc kubenswrapper[4760]: I0930 07:52:06.023342 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/07118ef0-0fb4-4ea3-b4d2-6aa1021aa822-run-httpd\") pod \"07118ef0-0fb4-4ea3-b4d2-6aa1021aa822\" (UID: \"07118ef0-0fb4-4ea3-b4d2-6aa1021aa822\") " Sep 30 07:52:06 crc kubenswrapper[4760]: I0930 07:52:06.024591 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/07118ef0-0fb4-4ea3-b4d2-6aa1021aa822-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "07118ef0-0fb4-4ea3-b4d2-6aa1021aa822" (UID: "07118ef0-0fb4-4ea3-b4d2-6aa1021aa822"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 07:52:06 crc kubenswrapper[4760]: I0930 07:52:06.024664 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/07118ef0-0fb4-4ea3-b4d2-6aa1021aa822-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "07118ef0-0fb4-4ea3-b4d2-6aa1021aa822" (UID: "07118ef0-0fb4-4ea3-b4d2-6aa1021aa822"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 07:52:06 crc kubenswrapper[4760]: I0930 07:52:06.030653 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/07118ef0-0fb4-4ea3-b4d2-6aa1021aa822-kube-api-access-7vv7f" (OuterVolumeSpecName: "kube-api-access-7vv7f") pod "07118ef0-0fb4-4ea3-b4d2-6aa1021aa822" (UID: "07118ef0-0fb4-4ea3-b4d2-6aa1021aa822"). 
InnerVolumeSpecName "kube-api-access-7vv7f". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 07:52:06 crc kubenswrapper[4760]: I0930 07:52:06.030793 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/07118ef0-0fb4-4ea3-b4d2-6aa1021aa822-scripts" (OuterVolumeSpecName: "scripts") pod "07118ef0-0fb4-4ea3-b4d2-6aa1021aa822" (UID: "07118ef0-0fb4-4ea3-b4d2-6aa1021aa822"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 07:52:06 crc kubenswrapper[4760]: I0930 07:52:06.051646 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/07118ef0-0fb4-4ea3-b4d2-6aa1021aa822-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "07118ef0-0fb4-4ea3-b4d2-6aa1021aa822" (UID: "07118ef0-0fb4-4ea3-b4d2-6aa1021aa822"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 07:52:06 crc kubenswrapper[4760]: I0930 07:52:06.099078 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/07118ef0-0fb4-4ea3-b4d2-6aa1021aa822-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "07118ef0-0fb4-4ea3-b4d2-6aa1021aa822" (UID: "07118ef0-0fb4-4ea3-b4d2-6aa1021aa822"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 07:52:06 crc kubenswrapper[4760]: I0930 07:52:06.125770 4760 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/07118ef0-0fb4-4ea3-b4d2-6aa1021aa822-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Sep 30 07:52:06 crc kubenswrapper[4760]: I0930 07:52:06.125833 4760 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/07118ef0-0fb4-4ea3-b4d2-6aa1021aa822-log-httpd\") on node \"crc\" DevicePath \"\"" Sep 30 07:52:06 crc kubenswrapper[4760]: I0930 07:52:06.125904 4760 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/07118ef0-0fb4-4ea3-b4d2-6aa1021aa822-run-httpd\") on node \"crc\" DevicePath \"\"" Sep 30 07:52:06 crc kubenswrapper[4760]: I0930 07:52:06.125918 4760 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7vv7f\" (UniqueName: \"kubernetes.io/projected/07118ef0-0fb4-4ea3-b4d2-6aa1021aa822-kube-api-access-7vv7f\") on node \"crc\" DevicePath \"\"" Sep 30 07:52:06 crc kubenswrapper[4760]: I0930 07:52:06.125934 4760 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/07118ef0-0fb4-4ea3-b4d2-6aa1021aa822-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 07:52:06 crc kubenswrapper[4760]: I0930 07:52:06.125947 4760 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/07118ef0-0fb4-4ea3-b4d2-6aa1021aa822-scripts\") on node \"crc\" DevicePath \"\"" Sep 30 07:52:06 crc kubenswrapper[4760]: I0930 07:52:06.155776 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/07118ef0-0fb4-4ea3-b4d2-6aa1021aa822-config-data" (OuterVolumeSpecName: "config-data") pod "07118ef0-0fb4-4ea3-b4d2-6aa1021aa822" (UID: "07118ef0-0fb4-4ea3-b4d2-6aa1021aa822"). 
InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 07:52:06 crc kubenswrapper[4760]: I0930 07:52:06.228594 4760 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/07118ef0-0fb4-4ea3-b4d2-6aa1021aa822-config-data\") on node \"crc\" DevicePath \"\"" Sep 30 07:52:06 crc kubenswrapper[4760]: I0930 07:52:06.306741 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"07118ef0-0fb4-4ea3-b4d2-6aa1021aa822","Type":"ContainerDied","Data":"a5cd9d8a0d82b4fa95d6f19c8eeb8e7ab2b73aca62779c50279fa9c479507c36"} Sep 30 07:52:06 crc kubenswrapper[4760]: I0930 07:52:06.306812 4760 scope.go:117] "RemoveContainer" containerID="9e8c190d132e0c7a6205ca300eb43f0b3906c1a1710f6820491c1bebc9daab13" Sep 30 07:52:06 crc kubenswrapper[4760]: I0930 07:52:06.307026 4760 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Sep 30 07:52:06 crc kubenswrapper[4760]: I0930 07:52:06.310084 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-wsmzq" event={"ID":"152b47cf-da92-44c1-9b68-90cc849f4b74","Type":"ContainerStarted","Data":"478b02568ea9ae9e32dccd85f91feb1777b134982af0cc2b37f6ebb0703477c8"} Sep 30 07:52:06 crc kubenswrapper[4760]: I0930 07:52:06.349486 4760 scope.go:117] "RemoveContainer" containerID="37c68203b847ab8c3b937c49eae5cda3e495e09802d4a39ea8acdbfa03449226" Sep 30 07:52:06 crc kubenswrapper[4760]: I0930 07:52:06.361577 4760 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-db-sync-wsmzq" podStartSLOduration=1.9924577879999998 podStartE2EDuration="10.361545186s" podCreationTimestamp="2025-09-30 07:51:56 +0000 UTC" firstStartedPulling="2025-09-30 07:51:57.303863537 +0000 UTC m=+1102.946769949" lastFinishedPulling="2025-09-30 07:52:05.672950935 +0000 UTC m=+1111.315857347" 
observedRunningTime="2025-09-30 07:52:06.339973036 +0000 UTC m=+1111.982879528" watchObservedRunningTime="2025-09-30 07:52:06.361545186 +0000 UTC m=+1112.004451638" Sep 30 07:52:06 crc kubenswrapper[4760]: I0930 07:52:06.372550 4760 scope.go:117] "RemoveContainer" containerID="52b9b7ac724d0e6144c9b271ae11f93c0f6bad3eaa320c93fd6f739e7c494a52" Sep 30 07:52:06 crc kubenswrapper[4760]: I0930 07:52:06.376012 4760 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Sep 30 07:52:06 crc kubenswrapper[4760]: I0930 07:52:06.385327 4760 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Sep 30 07:52:06 crc kubenswrapper[4760]: I0930 07:52:06.397439 4760 scope.go:117] "RemoveContainer" containerID="45fc9ee790e2cbd08079379f05e81fc29acbe94ffd1605279ea8b52680b2323b" Sep 30 07:52:06 crc kubenswrapper[4760]: I0930 07:52:06.423767 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Sep 30 07:52:06 crc kubenswrapper[4760]: E0930 07:52:06.424267 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="07118ef0-0fb4-4ea3-b4d2-6aa1021aa822" containerName="proxy-httpd" Sep 30 07:52:06 crc kubenswrapper[4760]: I0930 07:52:06.424287 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="07118ef0-0fb4-4ea3-b4d2-6aa1021aa822" containerName="proxy-httpd" Sep 30 07:52:06 crc kubenswrapper[4760]: E0930 07:52:06.424338 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="07118ef0-0fb4-4ea3-b4d2-6aa1021aa822" containerName="sg-core" Sep 30 07:52:06 crc kubenswrapper[4760]: I0930 07:52:06.424353 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="07118ef0-0fb4-4ea3-b4d2-6aa1021aa822" containerName="sg-core" Sep 30 07:52:06 crc kubenswrapper[4760]: E0930 07:52:06.424388 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="07118ef0-0fb4-4ea3-b4d2-6aa1021aa822" containerName="ceilometer-notification-agent" Sep 30 07:52:06 crc kubenswrapper[4760]: I0930 
07:52:06.424399 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="07118ef0-0fb4-4ea3-b4d2-6aa1021aa822" containerName="ceilometer-notification-agent" Sep 30 07:52:06 crc kubenswrapper[4760]: E0930 07:52:06.424420 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="07118ef0-0fb4-4ea3-b4d2-6aa1021aa822" containerName="ceilometer-central-agent" Sep 30 07:52:06 crc kubenswrapper[4760]: I0930 07:52:06.424430 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="07118ef0-0fb4-4ea3-b4d2-6aa1021aa822" containerName="ceilometer-central-agent" Sep 30 07:52:06 crc kubenswrapper[4760]: I0930 07:52:06.424870 4760 memory_manager.go:354] "RemoveStaleState removing state" podUID="07118ef0-0fb4-4ea3-b4d2-6aa1021aa822" containerName="sg-core" Sep 30 07:52:06 crc kubenswrapper[4760]: I0930 07:52:06.424900 4760 memory_manager.go:354] "RemoveStaleState removing state" podUID="07118ef0-0fb4-4ea3-b4d2-6aa1021aa822" containerName="ceilometer-notification-agent" Sep 30 07:52:06 crc kubenswrapper[4760]: I0930 07:52:06.424916 4760 memory_manager.go:354] "RemoveStaleState removing state" podUID="07118ef0-0fb4-4ea3-b4d2-6aa1021aa822" containerName="proxy-httpd" Sep 30 07:52:06 crc kubenswrapper[4760]: I0930 07:52:06.424942 4760 memory_manager.go:354] "RemoveStaleState removing state" podUID="07118ef0-0fb4-4ea3-b4d2-6aa1021aa822" containerName="ceilometer-central-agent" Sep 30 07:52:06 crc kubenswrapper[4760]: I0930 07:52:06.427570 4760 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Sep 30 07:52:06 crc kubenswrapper[4760]: I0930 07:52:06.431912 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Sep 30 07:52:06 crc kubenswrapper[4760]: I0930 07:52:06.432665 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Sep 30 07:52:06 crc kubenswrapper[4760]: I0930 07:52:06.452035 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Sep 30 07:52:06 crc kubenswrapper[4760]: I0930 07:52:06.536093 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/eaa10928-35f2-46a8-82e2-b6569a81187d-scripts\") pod \"ceilometer-0\" (UID: \"eaa10928-35f2-46a8-82e2-b6569a81187d\") " pod="openstack/ceilometer-0" Sep 30 07:52:06 crc kubenswrapper[4760]: I0930 07:52:06.536213 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eaa10928-35f2-46a8-82e2-b6569a81187d-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"eaa10928-35f2-46a8-82e2-b6569a81187d\") " pod="openstack/ceilometer-0" Sep 30 07:52:06 crc kubenswrapper[4760]: I0930 07:52:06.536241 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/eaa10928-35f2-46a8-82e2-b6569a81187d-log-httpd\") pod \"ceilometer-0\" (UID: \"eaa10928-35f2-46a8-82e2-b6569a81187d\") " pod="openstack/ceilometer-0" Sep 30 07:52:06 crc kubenswrapper[4760]: I0930 07:52:06.536275 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/eaa10928-35f2-46a8-82e2-b6569a81187d-run-httpd\") pod \"ceilometer-0\" (UID: \"eaa10928-35f2-46a8-82e2-b6569a81187d\") " 
pod="openstack/ceilometer-0" Sep 30 07:52:06 crc kubenswrapper[4760]: I0930 07:52:06.536559 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eaa10928-35f2-46a8-82e2-b6569a81187d-config-data\") pod \"ceilometer-0\" (UID: \"eaa10928-35f2-46a8-82e2-b6569a81187d\") " pod="openstack/ceilometer-0" Sep 30 07:52:06 crc kubenswrapper[4760]: I0930 07:52:06.536669 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ck9pm\" (UniqueName: \"kubernetes.io/projected/eaa10928-35f2-46a8-82e2-b6569a81187d-kube-api-access-ck9pm\") pod \"ceilometer-0\" (UID: \"eaa10928-35f2-46a8-82e2-b6569a81187d\") " pod="openstack/ceilometer-0" Sep 30 07:52:06 crc kubenswrapper[4760]: I0930 07:52:06.536723 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/eaa10928-35f2-46a8-82e2-b6569a81187d-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"eaa10928-35f2-46a8-82e2-b6569a81187d\") " pod="openstack/ceilometer-0" Sep 30 07:52:06 crc kubenswrapper[4760]: I0930 07:52:06.638482 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ck9pm\" (UniqueName: \"kubernetes.io/projected/eaa10928-35f2-46a8-82e2-b6569a81187d-kube-api-access-ck9pm\") pod \"ceilometer-0\" (UID: \"eaa10928-35f2-46a8-82e2-b6569a81187d\") " pod="openstack/ceilometer-0" Sep 30 07:52:06 crc kubenswrapper[4760]: I0930 07:52:06.638532 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/eaa10928-35f2-46a8-82e2-b6569a81187d-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"eaa10928-35f2-46a8-82e2-b6569a81187d\") " pod="openstack/ceilometer-0" Sep 30 07:52:06 crc kubenswrapper[4760]: I0930 07:52:06.638697 4760 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/eaa10928-35f2-46a8-82e2-b6569a81187d-scripts\") pod \"ceilometer-0\" (UID: \"eaa10928-35f2-46a8-82e2-b6569a81187d\") " pod="openstack/ceilometer-0" Sep 30 07:52:06 crc kubenswrapper[4760]: I0930 07:52:06.638776 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eaa10928-35f2-46a8-82e2-b6569a81187d-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"eaa10928-35f2-46a8-82e2-b6569a81187d\") " pod="openstack/ceilometer-0" Sep 30 07:52:06 crc kubenswrapper[4760]: I0930 07:52:06.638804 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/eaa10928-35f2-46a8-82e2-b6569a81187d-log-httpd\") pod \"ceilometer-0\" (UID: \"eaa10928-35f2-46a8-82e2-b6569a81187d\") " pod="openstack/ceilometer-0" Sep 30 07:52:06 crc kubenswrapper[4760]: I0930 07:52:06.638841 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/eaa10928-35f2-46a8-82e2-b6569a81187d-run-httpd\") pod \"ceilometer-0\" (UID: \"eaa10928-35f2-46a8-82e2-b6569a81187d\") " pod="openstack/ceilometer-0" Sep 30 07:52:06 crc kubenswrapper[4760]: I0930 07:52:06.638870 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eaa10928-35f2-46a8-82e2-b6569a81187d-config-data\") pod \"ceilometer-0\" (UID: \"eaa10928-35f2-46a8-82e2-b6569a81187d\") " pod="openstack/ceilometer-0" Sep 30 07:52:06 crc kubenswrapper[4760]: I0930 07:52:06.643199 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/eaa10928-35f2-46a8-82e2-b6569a81187d-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"eaa10928-35f2-46a8-82e2-b6569a81187d\") " 
pod="openstack/ceilometer-0" Sep 30 07:52:06 crc kubenswrapper[4760]: I0930 07:52:06.643915 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/eaa10928-35f2-46a8-82e2-b6569a81187d-log-httpd\") pod \"ceilometer-0\" (UID: \"eaa10928-35f2-46a8-82e2-b6569a81187d\") " pod="openstack/ceilometer-0" Sep 30 07:52:06 crc kubenswrapper[4760]: I0930 07:52:06.644467 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/eaa10928-35f2-46a8-82e2-b6569a81187d-run-httpd\") pod \"ceilometer-0\" (UID: \"eaa10928-35f2-46a8-82e2-b6569a81187d\") " pod="openstack/ceilometer-0" Sep 30 07:52:06 crc kubenswrapper[4760]: I0930 07:52:06.644601 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eaa10928-35f2-46a8-82e2-b6569a81187d-config-data\") pod \"ceilometer-0\" (UID: \"eaa10928-35f2-46a8-82e2-b6569a81187d\") " pod="openstack/ceilometer-0" Sep 30 07:52:06 crc kubenswrapper[4760]: I0930 07:52:06.645794 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eaa10928-35f2-46a8-82e2-b6569a81187d-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"eaa10928-35f2-46a8-82e2-b6569a81187d\") " pod="openstack/ceilometer-0" Sep 30 07:52:06 crc kubenswrapper[4760]: I0930 07:52:06.648832 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/eaa10928-35f2-46a8-82e2-b6569a81187d-scripts\") pod \"ceilometer-0\" (UID: \"eaa10928-35f2-46a8-82e2-b6569a81187d\") " pod="openstack/ceilometer-0" Sep 30 07:52:06 crc kubenswrapper[4760]: I0930 07:52:06.659052 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ck9pm\" (UniqueName: \"kubernetes.io/projected/eaa10928-35f2-46a8-82e2-b6569a81187d-kube-api-access-ck9pm\") pod 
\"ceilometer-0\" (UID: \"eaa10928-35f2-46a8-82e2-b6569a81187d\") " pod="openstack/ceilometer-0" Sep 30 07:52:06 crc kubenswrapper[4760]: I0930 07:52:06.758645 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Sep 30 07:52:07 crc kubenswrapper[4760]: I0930 07:52:07.080424 4760 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="07118ef0-0fb4-4ea3-b4d2-6aa1021aa822" path="/var/lib/kubelet/pods/07118ef0-0fb4-4ea3-b4d2-6aa1021aa822/volumes" Sep 30 07:52:07 crc kubenswrapper[4760]: I0930 07:52:07.278835 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Sep 30 07:52:07 crc kubenswrapper[4760]: W0930 07:52:07.290673 4760 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podeaa10928_35f2_46a8_82e2_b6569a81187d.slice/crio-79acad6e9216cec8637889a3750e606d0ecb648ab63484ed37662c246f1169d5 WatchSource:0}: Error finding container 79acad6e9216cec8637889a3750e606d0ecb648ab63484ed37662c246f1169d5: Status 404 returned error can't find the container with id 79acad6e9216cec8637889a3750e606d0ecb648ab63484ed37662c246f1169d5 Sep 30 07:52:07 crc kubenswrapper[4760]: I0930 07:52:07.325461 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"eaa10928-35f2-46a8-82e2-b6569a81187d","Type":"ContainerStarted","Data":"79acad6e9216cec8637889a3750e606d0ecb648ab63484ed37662c246f1169d5"} Sep 30 07:52:08 crc kubenswrapper[4760]: I0930 07:52:08.293900 4760 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Sep 30 07:52:08 crc kubenswrapper[4760]: I0930 07:52:08.344405 4760 generic.go:334] "Generic (PLEG): container finished" podID="40978ac5-870c-4273-8814-6c735435ca09" containerID="061372a9fcd93827a2752dbcf01a54dd7e453ef744896f6a50abbf06dbd70c83" exitCode=137 Sep 30 07:52:08 crc kubenswrapper[4760]: I0930 07:52:08.344455 4760 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Sep 30 07:52:08 crc kubenswrapper[4760]: I0930 07:52:08.344446 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"40978ac5-870c-4273-8814-6c735435ca09","Type":"ContainerDied","Data":"061372a9fcd93827a2752dbcf01a54dd7e453ef744896f6a50abbf06dbd70c83"} Sep 30 07:52:08 crc kubenswrapper[4760]: I0930 07:52:08.344584 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"40978ac5-870c-4273-8814-6c735435ca09","Type":"ContainerDied","Data":"4807dcc5098f4e4f8b16f109246c0e564c9f5bc46bb6a764dc45c34b7a0beec8"} Sep 30 07:52:08 crc kubenswrapper[4760]: I0930 07:52:08.344611 4760 scope.go:117] "RemoveContainer" containerID="061372a9fcd93827a2752dbcf01a54dd7e453ef744896f6a50abbf06dbd70c83" Sep 30 07:52:08 crc kubenswrapper[4760]: I0930 07:52:08.348914 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"eaa10928-35f2-46a8-82e2-b6569a81187d","Type":"ContainerStarted","Data":"0b7daf12868ebf9d853e0914a69d76d36dd87684c2dd01f4ab43ba3d656af8f8"} Sep 30 07:52:08 crc kubenswrapper[4760]: I0930 07:52:08.374983 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/40978ac5-870c-4273-8814-6c735435ca09-combined-ca-bundle\") pod \"40978ac5-870c-4273-8814-6c735435ca09\" (UID: \"40978ac5-870c-4273-8814-6c735435ca09\") " Sep 30 07:52:08 crc kubenswrapper[4760]: I0930 07:52:08.375929 4760 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/40978ac5-870c-4273-8814-6c735435ca09-etc-machine-id\") pod \"40978ac5-870c-4273-8814-6c735435ca09\" (UID: \"40978ac5-870c-4273-8814-6c735435ca09\") " Sep 30 07:52:08 crc kubenswrapper[4760]: I0930 07:52:08.376216 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/40978ac5-870c-4273-8814-6c735435ca09-config-data-custom\") pod \"40978ac5-870c-4273-8814-6c735435ca09\" (UID: \"40978ac5-870c-4273-8814-6c735435ca09\") " Sep 30 07:52:08 crc kubenswrapper[4760]: I0930 07:52:08.376343 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9wwpc\" (UniqueName: \"kubernetes.io/projected/40978ac5-870c-4273-8814-6c735435ca09-kube-api-access-9wwpc\") pod \"40978ac5-870c-4273-8814-6c735435ca09\" (UID: \"40978ac5-870c-4273-8814-6c735435ca09\") " Sep 30 07:52:08 crc kubenswrapper[4760]: I0930 07:52:08.376417 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/40978ac5-870c-4273-8814-6c735435ca09-logs\") pod \"40978ac5-870c-4273-8814-6c735435ca09\" (UID: \"40978ac5-870c-4273-8814-6c735435ca09\") " Sep 30 07:52:08 crc kubenswrapper[4760]: I0930 07:52:08.376458 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/40978ac5-870c-4273-8814-6c735435ca09-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "40978ac5-870c-4273-8814-6c735435ca09" (UID: "40978ac5-870c-4273-8814-6c735435ca09"). InnerVolumeSpecName "etc-machine-id". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Sep 30 07:52:08 crc kubenswrapper[4760]: I0930 07:52:08.376475 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/40978ac5-870c-4273-8814-6c735435ca09-config-data\") pod \"40978ac5-870c-4273-8814-6c735435ca09\" (UID: \"40978ac5-870c-4273-8814-6c735435ca09\") " Sep 30 07:52:08 crc kubenswrapper[4760]: I0930 07:52:08.376572 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/40978ac5-870c-4273-8814-6c735435ca09-scripts\") pod \"40978ac5-870c-4273-8814-6c735435ca09\" (UID: \"40978ac5-870c-4273-8814-6c735435ca09\") " Sep 30 07:52:08 crc kubenswrapper[4760]: I0930 07:52:08.377494 4760 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/40978ac5-870c-4273-8814-6c735435ca09-etc-machine-id\") on node \"crc\" DevicePath \"\"" Sep 30 07:52:08 crc kubenswrapper[4760]: I0930 07:52:08.377567 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/40978ac5-870c-4273-8814-6c735435ca09-logs" (OuterVolumeSpecName: "logs") pod "40978ac5-870c-4273-8814-6c735435ca09" (UID: "40978ac5-870c-4273-8814-6c735435ca09"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 07:52:08 crc kubenswrapper[4760]: I0930 07:52:08.382327 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/40978ac5-870c-4273-8814-6c735435ca09-kube-api-access-9wwpc" (OuterVolumeSpecName: "kube-api-access-9wwpc") pod "40978ac5-870c-4273-8814-6c735435ca09" (UID: "40978ac5-870c-4273-8814-6c735435ca09"). InnerVolumeSpecName "kube-api-access-9wwpc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 07:52:08 crc kubenswrapper[4760]: I0930 07:52:08.382437 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/40978ac5-870c-4273-8814-6c735435ca09-scripts" (OuterVolumeSpecName: "scripts") pod "40978ac5-870c-4273-8814-6c735435ca09" (UID: "40978ac5-870c-4273-8814-6c735435ca09"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 07:52:08 crc kubenswrapper[4760]: I0930 07:52:08.385487 4760 scope.go:117] "RemoveContainer" containerID="6b46cedfeed3f825bd3f1275c8c2908927493d56868f67bff2c999e1dbd9c1c5" Sep 30 07:52:08 crc kubenswrapper[4760]: I0930 07:52:08.385692 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/40978ac5-870c-4273-8814-6c735435ca09-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "40978ac5-870c-4273-8814-6c735435ca09" (UID: "40978ac5-870c-4273-8814-6c735435ca09"). InnerVolumeSpecName "config-data-custom". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 07:52:08 crc kubenswrapper[4760]: I0930 07:52:08.421709 4760 scope.go:117] "RemoveContainer" containerID="061372a9fcd93827a2752dbcf01a54dd7e453ef744896f6a50abbf06dbd70c83" Sep 30 07:52:08 crc kubenswrapper[4760]: E0930 07:52:08.422768 4760 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"061372a9fcd93827a2752dbcf01a54dd7e453ef744896f6a50abbf06dbd70c83\": container with ID starting with 061372a9fcd93827a2752dbcf01a54dd7e453ef744896f6a50abbf06dbd70c83 not found: ID does not exist" containerID="061372a9fcd93827a2752dbcf01a54dd7e453ef744896f6a50abbf06dbd70c83" Sep 30 07:52:08 crc kubenswrapper[4760]: I0930 07:52:08.422819 4760 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"061372a9fcd93827a2752dbcf01a54dd7e453ef744896f6a50abbf06dbd70c83"} err="failed to get container status \"061372a9fcd93827a2752dbcf01a54dd7e453ef744896f6a50abbf06dbd70c83\": rpc error: code = NotFound desc = could not find container \"061372a9fcd93827a2752dbcf01a54dd7e453ef744896f6a50abbf06dbd70c83\": container with ID starting with 061372a9fcd93827a2752dbcf01a54dd7e453ef744896f6a50abbf06dbd70c83 not found: ID does not exist" Sep 30 07:52:08 crc kubenswrapper[4760]: I0930 07:52:08.422846 4760 scope.go:117] "RemoveContainer" containerID="6b46cedfeed3f825bd3f1275c8c2908927493d56868f67bff2c999e1dbd9c1c5" Sep 30 07:52:08 crc kubenswrapper[4760]: E0930 07:52:08.423805 4760 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6b46cedfeed3f825bd3f1275c8c2908927493d56868f67bff2c999e1dbd9c1c5\": container with ID starting with 6b46cedfeed3f825bd3f1275c8c2908927493d56868f67bff2c999e1dbd9c1c5 not found: ID does not exist" containerID="6b46cedfeed3f825bd3f1275c8c2908927493d56868f67bff2c999e1dbd9c1c5" Sep 30 07:52:08 crc kubenswrapper[4760]: I0930 07:52:08.423949 
4760 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6b46cedfeed3f825bd3f1275c8c2908927493d56868f67bff2c999e1dbd9c1c5"} err="failed to get container status \"6b46cedfeed3f825bd3f1275c8c2908927493d56868f67bff2c999e1dbd9c1c5\": rpc error: code = NotFound desc = could not find container \"6b46cedfeed3f825bd3f1275c8c2908927493d56868f67bff2c999e1dbd9c1c5\": container with ID starting with 6b46cedfeed3f825bd3f1275c8c2908927493d56868f67bff2c999e1dbd9c1c5 not found: ID does not exist" Sep 30 07:52:08 crc kubenswrapper[4760]: I0930 07:52:08.424556 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/40978ac5-870c-4273-8814-6c735435ca09-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "40978ac5-870c-4273-8814-6c735435ca09" (UID: "40978ac5-870c-4273-8814-6c735435ca09"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 07:52:08 crc kubenswrapper[4760]: I0930 07:52:08.468866 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/40978ac5-870c-4273-8814-6c735435ca09-config-data" (OuterVolumeSpecName: "config-data") pod "40978ac5-870c-4273-8814-6c735435ca09" (UID: "40978ac5-870c-4273-8814-6c735435ca09"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 07:52:08 crc kubenswrapper[4760]: I0930 07:52:08.479750 4760 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/40978ac5-870c-4273-8814-6c735435ca09-config-data\") on node \"crc\" DevicePath \"\"" Sep 30 07:52:08 crc kubenswrapper[4760]: I0930 07:52:08.479914 4760 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/40978ac5-870c-4273-8814-6c735435ca09-scripts\") on node \"crc\" DevicePath \"\"" Sep 30 07:52:08 crc kubenswrapper[4760]: I0930 07:52:08.479972 4760 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/40978ac5-870c-4273-8814-6c735435ca09-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 07:52:08 crc kubenswrapper[4760]: I0930 07:52:08.480044 4760 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/40978ac5-870c-4273-8814-6c735435ca09-config-data-custom\") on node \"crc\" DevicePath \"\"" Sep 30 07:52:08 crc kubenswrapper[4760]: I0930 07:52:08.480132 4760 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9wwpc\" (UniqueName: \"kubernetes.io/projected/40978ac5-870c-4273-8814-6c735435ca09-kube-api-access-9wwpc\") on node \"crc\" DevicePath \"\"" Sep 30 07:52:08 crc kubenswrapper[4760]: I0930 07:52:08.480210 4760 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/40978ac5-870c-4273-8814-6c735435ca09-logs\") on node \"crc\" DevicePath \"\"" Sep 30 07:52:08 crc kubenswrapper[4760]: I0930 07:52:08.673178 4760 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Sep 30 07:52:08 crc kubenswrapper[4760]: I0930 07:52:08.679945 4760 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-api-0"] Sep 30 07:52:08 crc kubenswrapper[4760]: I0930 07:52:08.693340 
4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Sep 30 07:52:08 crc kubenswrapper[4760]: E0930 07:52:08.693738 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="40978ac5-870c-4273-8814-6c735435ca09" containerName="cinder-api-log" Sep 30 07:52:08 crc kubenswrapper[4760]: I0930 07:52:08.693753 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="40978ac5-870c-4273-8814-6c735435ca09" containerName="cinder-api-log" Sep 30 07:52:08 crc kubenswrapper[4760]: E0930 07:52:08.693768 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="40978ac5-870c-4273-8814-6c735435ca09" containerName="cinder-api" Sep 30 07:52:08 crc kubenswrapper[4760]: I0930 07:52:08.693775 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="40978ac5-870c-4273-8814-6c735435ca09" containerName="cinder-api" Sep 30 07:52:08 crc kubenswrapper[4760]: I0930 07:52:08.693955 4760 memory_manager.go:354] "RemoveStaleState removing state" podUID="40978ac5-870c-4273-8814-6c735435ca09" containerName="cinder-api-log" Sep 30 07:52:08 crc kubenswrapper[4760]: I0930 07:52:08.693977 4760 memory_manager.go:354] "RemoveStaleState removing state" podUID="40978ac5-870c-4273-8814-6c735435ca09" containerName="cinder-api" Sep 30 07:52:08 crc kubenswrapper[4760]: I0930 07:52:08.694910 4760 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Sep 30 07:52:08 crc kubenswrapper[4760]: I0930 07:52:08.696396 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-internal-svc" Sep 30 07:52:08 crc kubenswrapper[4760]: I0930 07:52:08.696456 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-public-svc" Sep 30 07:52:08 crc kubenswrapper[4760]: I0930 07:52:08.699763 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Sep 30 07:52:08 crc kubenswrapper[4760]: I0930 07:52:08.724367 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Sep 30 07:52:08 crc kubenswrapper[4760]: I0930 07:52:08.786255 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ce9e61a5-93ea-4bc8-bb73-0578fe123aae-config-data\") pod \"cinder-api-0\" (UID: \"ce9e61a5-93ea-4bc8-bb73-0578fe123aae\") " pod="openstack/cinder-api-0" Sep 30 07:52:08 crc kubenswrapper[4760]: I0930 07:52:08.786418 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ce9e61a5-93ea-4bc8-bb73-0578fe123aae-public-tls-certs\") pod \"cinder-api-0\" (UID: \"ce9e61a5-93ea-4bc8-bb73-0578fe123aae\") " pod="openstack/cinder-api-0" Sep 30 07:52:08 crc kubenswrapper[4760]: I0930 07:52:08.786452 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ce9e61a5-93ea-4bc8-bb73-0578fe123aae-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"ce9e61a5-93ea-4bc8-bb73-0578fe123aae\") " pod="openstack/cinder-api-0" Sep 30 07:52:08 crc kubenswrapper[4760]: I0930 07:52:08.786484 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ce9e61a5-93ea-4bc8-bb73-0578fe123aae-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"ce9e61a5-93ea-4bc8-bb73-0578fe123aae\") " pod="openstack/cinder-api-0" Sep 30 07:52:08 crc kubenswrapper[4760]: I0930 07:52:08.786522 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/ce9e61a5-93ea-4bc8-bb73-0578fe123aae-etc-machine-id\") pod \"cinder-api-0\" (UID: \"ce9e61a5-93ea-4bc8-bb73-0578fe123aae\") " pod="openstack/cinder-api-0" Sep 30 07:52:08 crc kubenswrapper[4760]: I0930 07:52:08.786552 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cj9qg\" (UniqueName: \"kubernetes.io/projected/ce9e61a5-93ea-4bc8-bb73-0578fe123aae-kube-api-access-cj9qg\") pod \"cinder-api-0\" (UID: \"ce9e61a5-93ea-4bc8-bb73-0578fe123aae\") " pod="openstack/cinder-api-0" Sep 30 07:52:08 crc kubenswrapper[4760]: I0930 07:52:08.786632 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ce9e61a5-93ea-4bc8-bb73-0578fe123aae-logs\") pod \"cinder-api-0\" (UID: \"ce9e61a5-93ea-4bc8-bb73-0578fe123aae\") " pod="openstack/cinder-api-0" Sep 30 07:52:08 crc kubenswrapper[4760]: I0930 07:52:08.786662 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ce9e61a5-93ea-4bc8-bb73-0578fe123aae-config-data-custom\") pod \"cinder-api-0\" (UID: \"ce9e61a5-93ea-4bc8-bb73-0578fe123aae\") " pod="openstack/cinder-api-0" Sep 30 07:52:08 crc kubenswrapper[4760]: I0930 07:52:08.786700 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ce9e61a5-93ea-4bc8-bb73-0578fe123aae-scripts\") pod 
\"cinder-api-0\" (UID: \"ce9e61a5-93ea-4bc8-bb73-0578fe123aae\") " pod="openstack/cinder-api-0" Sep 30 07:52:08 crc kubenswrapper[4760]: I0930 07:52:08.887910 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ce9e61a5-93ea-4bc8-bb73-0578fe123aae-config-data\") pod \"cinder-api-0\" (UID: \"ce9e61a5-93ea-4bc8-bb73-0578fe123aae\") " pod="openstack/cinder-api-0" Sep 30 07:52:08 crc kubenswrapper[4760]: I0930 07:52:08.887971 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ce9e61a5-93ea-4bc8-bb73-0578fe123aae-public-tls-certs\") pod \"cinder-api-0\" (UID: \"ce9e61a5-93ea-4bc8-bb73-0578fe123aae\") " pod="openstack/cinder-api-0" Sep 30 07:52:08 crc kubenswrapper[4760]: I0930 07:52:08.888000 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ce9e61a5-93ea-4bc8-bb73-0578fe123aae-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"ce9e61a5-93ea-4bc8-bb73-0578fe123aae\") " pod="openstack/cinder-api-0" Sep 30 07:52:08 crc kubenswrapper[4760]: I0930 07:52:08.888023 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ce9e61a5-93ea-4bc8-bb73-0578fe123aae-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"ce9e61a5-93ea-4bc8-bb73-0578fe123aae\") " pod="openstack/cinder-api-0" Sep 30 07:52:08 crc kubenswrapper[4760]: I0930 07:52:08.888054 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/ce9e61a5-93ea-4bc8-bb73-0578fe123aae-etc-machine-id\") pod \"cinder-api-0\" (UID: \"ce9e61a5-93ea-4bc8-bb73-0578fe123aae\") " pod="openstack/cinder-api-0" Sep 30 07:52:08 crc kubenswrapper[4760]: I0930 07:52:08.888077 4760 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-cj9qg\" (UniqueName: \"kubernetes.io/projected/ce9e61a5-93ea-4bc8-bb73-0578fe123aae-kube-api-access-cj9qg\") pod \"cinder-api-0\" (UID: \"ce9e61a5-93ea-4bc8-bb73-0578fe123aae\") " pod="openstack/cinder-api-0" Sep 30 07:52:08 crc kubenswrapper[4760]: I0930 07:52:08.888144 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ce9e61a5-93ea-4bc8-bb73-0578fe123aae-logs\") pod \"cinder-api-0\" (UID: \"ce9e61a5-93ea-4bc8-bb73-0578fe123aae\") " pod="openstack/cinder-api-0" Sep 30 07:52:08 crc kubenswrapper[4760]: I0930 07:52:08.888176 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/ce9e61a5-93ea-4bc8-bb73-0578fe123aae-etc-machine-id\") pod \"cinder-api-0\" (UID: \"ce9e61a5-93ea-4bc8-bb73-0578fe123aae\") " pod="openstack/cinder-api-0" Sep 30 07:52:08 crc kubenswrapper[4760]: I0930 07:52:08.888202 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ce9e61a5-93ea-4bc8-bb73-0578fe123aae-config-data-custom\") pod \"cinder-api-0\" (UID: \"ce9e61a5-93ea-4bc8-bb73-0578fe123aae\") " pod="openstack/cinder-api-0" Sep 30 07:52:08 crc kubenswrapper[4760]: I0930 07:52:08.888388 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ce9e61a5-93ea-4bc8-bb73-0578fe123aae-scripts\") pod \"cinder-api-0\" (UID: \"ce9e61a5-93ea-4bc8-bb73-0578fe123aae\") " pod="openstack/cinder-api-0" Sep 30 07:52:08 crc kubenswrapper[4760]: I0930 07:52:08.889046 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ce9e61a5-93ea-4bc8-bb73-0578fe123aae-logs\") pod \"cinder-api-0\" (UID: \"ce9e61a5-93ea-4bc8-bb73-0578fe123aae\") " pod="openstack/cinder-api-0" Sep 
30 07:52:08 crc kubenswrapper[4760]: I0930 07:52:08.893760 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ce9e61a5-93ea-4bc8-bb73-0578fe123aae-scripts\") pod \"cinder-api-0\" (UID: \"ce9e61a5-93ea-4bc8-bb73-0578fe123aae\") " pod="openstack/cinder-api-0" Sep 30 07:52:08 crc kubenswrapper[4760]: I0930 07:52:08.894321 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ce9e61a5-93ea-4bc8-bb73-0578fe123aae-config-data-custom\") pod \"cinder-api-0\" (UID: \"ce9e61a5-93ea-4bc8-bb73-0578fe123aae\") " pod="openstack/cinder-api-0" Sep 30 07:52:08 crc kubenswrapper[4760]: I0930 07:52:08.894636 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ce9e61a5-93ea-4bc8-bb73-0578fe123aae-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"ce9e61a5-93ea-4bc8-bb73-0578fe123aae\") " pod="openstack/cinder-api-0" Sep 30 07:52:08 crc kubenswrapper[4760]: I0930 07:52:08.894813 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ce9e61a5-93ea-4bc8-bb73-0578fe123aae-public-tls-certs\") pod \"cinder-api-0\" (UID: \"ce9e61a5-93ea-4bc8-bb73-0578fe123aae\") " pod="openstack/cinder-api-0" Sep 30 07:52:08 crc kubenswrapper[4760]: I0930 07:52:08.895369 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ce9e61a5-93ea-4bc8-bb73-0578fe123aae-config-data\") pod \"cinder-api-0\" (UID: \"ce9e61a5-93ea-4bc8-bb73-0578fe123aae\") " pod="openstack/cinder-api-0" Sep 30 07:52:08 crc kubenswrapper[4760]: I0930 07:52:08.895768 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ce9e61a5-93ea-4bc8-bb73-0578fe123aae-combined-ca-bundle\") pod 
\"cinder-api-0\" (UID: \"ce9e61a5-93ea-4bc8-bb73-0578fe123aae\") " pod="openstack/cinder-api-0" Sep 30 07:52:08 crc kubenswrapper[4760]: I0930 07:52:08.915505 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cj9qg\" (UniqueName: \"kubernetes.io/projected/ce9e61a5-93ea-4bc8-bb73-0578fe123aae-kube-api-access-cj9qg\") pod \"cinder-api-0\" (UID: \"ce9e61a5-93ea-4bc8-bb73-0578fe123aae\") " pod="openstack/cinder-api-0" Sep 30 07:52:09 crc kubenswrapper[4760]: I0930 07:52:09.015730 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Sep 30 07:52:09 crc kubenswrapper[4760]: I0930 07:52:09.081147 4760 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="40978ac5-870c-4273-8814-6c735435ca09" path="/var/lib/kubelet/pods/40978ac5-870c-4273-8814-6c735435ca09/volumes" Sep 30 07:52:09 crc kubenswrapper[4760]: I0930 07:52:09.378355 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"eaa10928-35f2-46a8-82e2-b6569a81187d","Type":"ContainerStarted","Data":"8ad2037f68588817853afa08aec9a4e75fd5f1270a2fe780394aa365a27b5a75"} Sep 30 07:52:09 crc kubenswrapper[4760]: I0930 07:52:09.514908 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Sep 30 07:52:09 crc kubenswrapper[4760]: W0930 07:52:09.526246 4760 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podce9e61a5_93ea_4bc8_bb73_0578fe123aae.slice/crio-357febde0bc9f326f805df57276962dfd2f13f4ff5f45bd8f24a064753dd24b4 WatchSource:0}: Error finding container 357febde0bc9f326f805df57276962dfd2f13f4ff5f45bd8f24a064753dd24b4: Status 404 returned error can't find the container with id 357febde0bc9f326f805df57276962dfd2f13f4ff5f45bd8f24a064753dd24b4 Sep 30 07:52:10 crc kubenswrapper[4760]: I0930 07:52:10.393447 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/cinder-api-0" event={"ID":"ce9e61a5-93ea-4bc8-bb73-0578fe123aae","Type":"ContainerStarted","Data":"78245103fe83fc426c37a967121330c62dffb0a623cb2c00c64151d1fcc661af"} Sep 30 07:52:10 crc kubenswrapper[4760]: I0930 07:52:10.394042 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"ce9e61a5-93ea-4bc8-bb73-0578fe123aae","Type":"ContainerStarted","Data":"357febde0bc9f326f805df57276962dfd2f13f4ff5f45bd8f24a064753dd24b4"} Sep 30 07:52:10 crc kubenswrapper[4760]: I0930 07:52:10.395594 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"eaa10928-35f2-46a8-82e2-b6569a81187d","Type":"ContainerStarted","Data":"72d895c026a810a814b8f0274f0762c27f77f307451a38e805851d97c6a3b4cb"} Sep 30 07:52:11 crc kubenswrapper[4760]: I0930 07:52:11.414733 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"eaa10928-35f2-46a8-82e2-b6569a81187d","Type":"ContainerStarted","Data":"1773ca8a16bacd865cd3b9aadd12895a310b3ea1cef9d4618ee501f943a6d75c"} Sep 30 07:52:11 crc kubenswrapper[4760]: I0930 07:52:11.415253 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Sep 30 07:52:11 crc kubenswrapper[4760]: I0930 07:52:11.419242 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"ce9e61a5-93ea-4bc8-bb73-0578fe123aae","Type":"ContainerStarted","Data":"6ad21537fd62fb93d02261c668cafeea7d265a7cdfee68cf0a5f1aa146ab7ab2"} Sep 30 07:52:11 crc kubenswrapper[4760]: I0930 07:52:11.458625 4760 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.301221082 podStartE2EDuration="5.458605401s" podCreationTimestamp="2025-09-30 07:52:06 +0000 UTC" firstStartedPulling="2025-09-30 07:52:07.293044186 +0000 UTC m=+1112.935950608" lastFinishedPulling="2025-09-30 07:52:10.450428515 +0000 UTC m=+1116.093334927" 
observedRunningTime="2025-09-30 07:52:11.453005948 +0000 UTC m=+1117.095912430" watchObservedRunningTime="2025-09-30 07:52:11.458605401 +0000 UTC m=+1117.101511813" Sep 30 07:52:11 crc kubenswrapper[4760]: I0930 07:52:11.489516 4760 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=3.4894780389999998 podStartE2EDuration="3.489478039s" podCreationTimestamp="2025-09-30 07:52:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 07:52:11.477397541 +0000 UTC m=+1117.120303973" watchObservedRunningTime="2025-09-30 07:52:11.489478039 +0000 UTC m=+1117.132384501" Sep 30 07:52:12 crc kubenswrapper[4760]: I0930 07:52:12.441667 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Sep 30 07:52:15 crc kubenswrapper[4760]: I0930 07:52:15.478235 4760 generic.go:334] "Generic (PLEG): container finished" podID="152b47cf-da92-44c1-9b68-90cc849f4b74" containerID="478b02568ea9ae9e32dccd85f91feb1777b134982af0cc2b37f6ebb0703477c8" exitCode=0 Sep 30 07:52:15 crc kubenswrapper[4760]: I0930 07:52:15.478328 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-wsmzq" event={"ID":"152b47cf-da92-44c1-9b68-90cc849f4b74","Type":"ContainerDied","Data":"478b02568ea9ae9e32dccd85f91feb1777b134982af0cc2b37f6ebb0703477c8"} Sep 30 07:52:16 crc kubenswrapper[4760]: I0930 07:52:16.972422 4760 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-wsmzq" Sep 30 07:52:17 crc kubenswrapper[4760]: I0930 07:52:17.164861 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/152b47cf-da92-44c1-9b68-90cc849f4b74-combined-ca-bundle\") pod \"152b47cf-da92-44c1-9b68-90cc849f4b74\" (UID: \"152b47cf-da92-44c1-9b68-90cc849f4b74\") " Sep 30 07:52:17 crc kubenswrapper[4760]: I0930 07:52:17.164915 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ggzfm\" (UniqueName: \"kubernetes.io/projected/152b47cf-da92-44c1-9b68-90cc849f4b74-kube-api-access-ggzfm\") pod \"152b47cf-da92-44c1-9b68-90cc849f4b74\" (UID: \"152b47cf-da92-44c1-9b68-90cc849f4b74\") " Sep 30 07:52:17 crc kubenswrapper[4760]: I0930 07:52:17.165150 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/152b47cf-da92-44c1-9b68-90cc849f4b74-scripts\") pod \"152b47cf-da92-44c1-9b68-90cc849f4b74\" (UID: \"152b47cf-da92-44c1-9b68-90cc849f4b74\") " Sep 30 07:52:17 crc kubenswrapper[4760]: I0930 07:52:17.165225 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/152b47cf-da92-44c1-9b68-90cc849f4b74-config-data\") pod \"152b47cf-da92-44c1-9b68-90cc849f4b74\" (UID: \"152b47cf-da92-44c1-9b68-90cc849f4b74\") " Sep 30 07:52:17 crc kubenswrapper[4760]: I0930 07:52:17.172481 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/152b47cf-da92-44c1-9b68-90cc849f4b74-scripts" (OuterVolumeSpecName: "scripts") pod "152b47cf-da92-44c1-9b68-90cc849f4b74" (UID: "152b47cf-da92-44c1-9b68-90cc849f4b74"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 07:52:17 crc kubenswrapper[4760]: I0930 07:52:17.177274 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/152b47cf-da92-44c1-9b68-90cc849f4b74-kube-api-access-ggzfm" (OuterVolumeSpecName: "kube-api-access-ggzfm") pod "152b47cf-da92-44c1-9b68-90cc849f4b74" (UID: "152b47cf-da92-44c1-9b68-90cc849f4b74"). InnerVolumeSpecName "kube-api-access-ggzfm". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 07:52:17 crc kubenswrapper[4760]: I0930 07:52:17.202548 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/152b47cf-da92-44c1-9b68-90cc849f4b74-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "152b47cf-da92-44c1-9b68-90cc849f4b74" (UID: "152b47cf-da92-44c1-9b68-90cc849f4b74"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 07:52:17 crc kubenswrapper[4760]: I0930 07:52:17.218410 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/152b47cf-da92-44c1-9b68-90cc849f4b74-config-data" (OuterVolumeSpecName: "config-data") pod "152b47cf-da92-44c1-9b68-90cc849f4b74" (UID: "152b47cf-da92-44c1-9b68-90cc849f4b74"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 07:52:17 crc kubenswrapper[4760]: I0930 07:52:17.268192 4760 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/152b47cf-da92-44c1-9b68-90cc849f4b74-scripts\") on node \"crc\" DevicePath \"\"" Sep 30 07:52:17 crc kubenswrapper[4760]: I0930 07:52:17.268248 4760 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/152b47cf-da92-44c1-9b68-90cc849f4b74-config-data\") on node \"crc\" DevicePath \"\"" Sep 30 07:52:17 crc kubenswrapper[4760]: I0930 07:52:17.268269 4760 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/152b47cf-da92-44c1-9b68-90cc849f4b74-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 07:52:17 crc kubenswrapper[4760]: I0930 07:52:17.268289 4760 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ggzfm\" (UniqueName: \"kubernetes.io/projected/152b47cf-da92-44c1-9b68-90cc849f4b74-kube-api-access-ggzfm\") on node \"crc\" DevicePath \"\"" Sep 30 07:52:17 crc kubenswrapper[4760]: I0930 07:52:17.505217 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-wsmzq" event={"ID":"152b47cf-da92-44c1-9b68-90cc849f4b74","Type":"ContainerDied","Data":"2c722d9351d7fd85839ca27f0b6cf5c80d5100388e75591584c6cdc3f69a6cc3"} Sep 30 07:52:17 crc kubenswrapper[4760]: I0930 07:52:17.505264 4760 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2c722d9351d7fd85839ca27f0b6cf5c80d5100388e75591584c6cdc3f69a6cc3" Sep 30 07:52:17 crc kubenswrapper[4760]: I0930 07:52:17.505279 4760 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-wsmzq" Sep 30 07:52:17 crc kubenswrapper[4760]: I0930 07:52:17.623151 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-0"] Sep 30 07:52:17 crc kubenswrapper[4760]: E0930 07:52:17.623624 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="152b47cf-da92-44c1-9b68-90cc849f4b74" containerName="nova-cell0-conductor-db-sync" Sep 30 07:52:17 crc kubenswrapper[4760]: I0930 07:52:17.623645 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="152b47cf-da92-44c1-9b68-90cc849f4b74" containerName="nova-cell0-conductor-db-sync" Sep 30 07:52:17 crc kubenswrapper[4760]: I0930 07:52:17.623856 4760 memory_manager.go:354] "RemoveStaleState removing state" podUID="152b47cf-da92-44c1-9b68-90cc849f4b74" containerName="nova-cell0-conductor-db-sync" Sep 30 07:52:17 crc kubenswrapper[4760]: I0930 07:52:17.624576 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Sep 30 07:52:17 crc kubenswrapper[4760]: I0930 07:52:17.628927 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Sep 30 07:52:17 crc kubenswrapper[4760]: I0930 07:52:17.629113 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-fw8rm" Sep 30 07:52:17 crc kubenswrapper[4760]: I0930 07:52:17.638263 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Sep 30 07:52:17 crc kubenswrapper[4760]: I0930 07:52:17.778534 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d1669957-f301-409c-8f6b-e1b87dfadeb7-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"d1669957-f301-409c-8f6b-e1b87dfadeb7\") " pod="openstack/nova-cell0-conductor-0" Sep 30 07:52:17 crc kubenswrapper[4760]: I0930 
07:52:17.778603 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z5sds\" (UniqueName: \"kubernetes.io/projected/d1669957-f301-409c-8f6b-e1b87dfadeb7-kube-api-access-z5sds\") pod \"nova-cell0-conductor-0\" (UID: \"d1669957-f301-409c-8f6b-e1b87dfadeb7\") " pod="openstack/nova-cell0-conductor-0" Sep 30 07:52:17 crc kubenswrapper[4760]: I0930 07:52:17.778724 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d1669957-f301-409c-8f6b-e1b87dfadeb7-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"d1669957-f301-409c-8f6b-e1b87dfadeb7\") " pod="openstack/nova-cell0-conductor-0" Sep 30 07:52:17 crc kubenswrapper[4760]: I0930 07:52:17.880499 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d1669957-f301-409c-8f6b-e1b87dfadeb7-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"d1669957-f301-409c-8f6b-e1b87dfadeb7\") " pod="openstack/nova-cell0-conductor-0" Sep 30 07:52:17 crc kubenswrapper[4760]: I0930 07:52:17.880553 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z5sds\" (UniqueName: \"kubernetes.io/projected/d1669957-f301-409c-8f6b-e1b87dfadeb7-kube-api-access-z5sds\") pod \"nova-cell0-conductor-0\" (UID: \"d1669957-f301-409c-8f6b-e1b87dfadeb7\") " pod="openstack/nova-cell0-conductor-0" Sep 30 07:52:17 crc kubenswrapper[4760]: I0930 07:52:17.880658 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d1669957-f301-409c-8f6b-e1b87dfadeb7-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"d1669957-f301-409c-8f6b-e1b87dfadeb7\") " pod="openstack/nova-cell0-conductor-0" Sep 30 07:52:17 crc kubenswrapper[4760]: I0930 07:52:17.890268 4760 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d1669957-f301-409c-8f6b-e1b87dfadeb7-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"d1669957-f301-409c-8f6b-e1b87dfadeb7\") " pod="openstack/nova-cell0-conductor-0" Sep 30 07:52:17 crc kubenswrapper[4760]: I0930 07:52:17.890875 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d1669957-f301-409c-8f6b-e1b87dfadeb7-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"d1669957-f301-409c-8f6b-e1b87dfadeb7\") " pod="openstack/nova-cell0-conductor-0" Sep 30 07:52:17 crc kubenswrapper[4760]: I0930 07:52:17.904141 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z5sds\" (UniqueName: \"kubernetes.io/projected/d1669957-f301-409c-8f6b-e1b87dfadeb7-kube-api-access-z5sds\") pod \"nova-cell0-conductor-0\" (UID: \"d1669957-f301-409c-8f6b-e1b87dfadeb7\") " pod="openstack/nova-cell0-conductor-0" Sep 30 07:52:17 crc kubenswrapper[4760]: I0930 07:52:17.943162 4760 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-0" Sep 30 07:52:18 crc kubenswrapper[4760]: I0930 07:52:18.419399 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Sep 30 07:52:18 crc kubenswrapper[4760]: I0930 07:52:18.516608 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"d1669957-f301-409c-8f6b-e1b87dfadeb7","Type":"ContainerStarted","Data":"253c43a59544b6408d6d39fb2db64147d40e964eab2368af93bcd0194cdbe828"} Sep 30 07:52:19 crc kubenswrapper[4760]: I0930 07:52:19.112762 4760 patch_prober.go:28] interesting pod/machine-config-daemon-f2lrk container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 07:52:19 crc kubenswrapper[4760]: I0930 07:52:19.113207 4760 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-f2lrk" podUID="7a9c8270-6964-4886-87d0-227b05b76da4" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 30 07:52:19 crc kubenswrapper[4760]: I0930 07:52:19.113280 4760 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-f2lrk" Sep 30 07:52:19 crc kubenswrapper[4760]: I0930 07:52:19.114443 4760 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"e08631326d3db4f9e31ecd2756775d73d9783f49875cc3f66b5e516f36754f34"} pod="openshift-machine-config-operator/machine-config-daemon-f2lrk" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Sep 30 07:52:19 crc kubenswrapper[4760]: I0930 07:52:19.114514 4760 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-f2lrk" podUID="7a9c8270-6964-4886-87d0-227b05b76da4" containerName="machine-config-daemon" containerID="cri-o://e08631326d3db4f9e31ecd2756775d73d9783f49875cc3f66b5e516f36754f34" gracePeriod=600 Sep 30 07:52:19 crc kubenswrapper[4760]: I0930 07:52:19.552966 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"d1669957-f301-409c-8f6b-e1b87dfadeb7","Type":"ContainerStarted","Data":"e612e5a8d27ed1fdc307f159bf32d3f723e470608b99799478aad1ca98dda985"} Sep 30 07:52:19 crc kubenswrapper[4760]: I0930 07:52:19.553596 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell0-conductor-0" Sep 30 07:52:19 crc kubenswrapper[4760]: I0930 07:52:19.558442 4760 generic.go:334] "Generic (PLEG): container finished" podID="7a9c8270-6964-4886-87d0-227b05b76da4" containerID="e08631326d3db4f9e31ecd2756775d73d9783f49875cc3f66b5e516f36754f34" exitCode=0 Sep 30 07:52:19 crc kubenswrapper[4760]: I0930 07:52:19.558616 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-f2lrk" event={"ID":"7a9c8270-6964-4886-87d0-227b05b76da4","Type":"ContainerDied","Data":"e08631326d3db4f9e31ecd2756775d73d9783f49875cc3f66b5e516f36754f34"} Sep 30 07:52:19 crc kubenswrapper[4760]: I0930 07:52:19.558728 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-f2lrk" event={"ID":"7a9c8270-6964-4886-87d0-227b05b76da4","Type":"ContainerStarted","Data":"dd30dd4d28eef306568f8541bb8d83a7c0af086ec623d77ff729c59fba19ae20"} Sep 30 07:52:19 crc kubenswrapper[4760]: I0930 07:52:19.558832 4760 scope.go:117] "RemoveContainer" containerID="9ca37c299871442165175723aa160e44f88bbaa555a2a12583d4390e53262571" Sep 30 07:52:19 crc kubenswrapper[4760]: I0930 07:52:19.574773 4760 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-0" podStartSLOduration=2.574755084 podStartE2EDuration="2.574755084s" podCreationTimestamp="2025-09-30 07:52:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 07:52:19.5679038 +0000 UTC m=+1125.210810222" watchObservedRunningTime="2025-09-30 07:52:19.574755084 +0000 UTC m=+1125.217661506" Sep 30 07:52:21 crc kubenswrapper[4760]: I0930 07:52:21.043198 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cinder-api-0" Sep 30 07:52:27 crc kubenswrapper[4760]: I0930 07:52:27.971106 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell0-conductor-0" Sep 30 07:52:28 crc kubenswrapper[4760]: I0930 07:52:28.433182 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-cell-mapping-9t7c8"] Sep 30 07:52:28 crc kubenswrapper[4760]: I0930 07:52:28.434652 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-9t7c8" Sep 30 07:52:28 crc kubenswrapper[4760]: I0930 07:52:28.436267 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-scripts" Sep 30 07:52:28 crc kubenswrapper[4760]: I0930 07:52:28.437930 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-config-data" Sep 30 07:52:28 crc kubenswrapper[4760]: I0930 07:52:28.451115 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-9t7c8"] Sep 30 07:52:28 crc kubenswrapper[4760]: I0930 07:52:28.578906 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Sep 30 07:52:28 crc kubenswrapper[4760]: I0930 07:52:28.580810 4760 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Sep 30 07:52:28 crc kubenswrapper[4760]: I0930 07:52:28.582792 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Sep 30 07:52:28 crc kubenswrapper[4760]: I0930 07:52:28.593383 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Sep 30 07:52:28 crc kubenswrapper[4760]: I0930 07:52:28.602523 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c34c9e7c-e366-4179-ab98-c01d5b2cfc3d-config-data\") pod \"nova-cell0-cell-mapping-9t7c8\" (UID: \"c34c9e7c-e366-4179-ab98-c01d5b2cfc3d\") " pod="openstack/nova-cell0-cell-mapping-9t7c8" Sep 30 07:52:28 crc kubenswrapper[4760]: I0930 07:52:28.602655 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jl2vf\" (UniqueName: \"kubernetes.io/projected/c34c9e7c-e366-4179-ab98-c01d5b2cfc3d-kube-api-access-jl2vf\") pod \"nova-cell0-cell-mapping-9t7c8\" (UID: \"c34c9e7c-e366-4179-ab98-c01d5b2cfc3d\") " pod="openstack/nova-cell0-cell-mapping-9t7c8" Sep 30 07:52:28 crc kubenswrapper[4760]: I0930 07:52:28.602742 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c34c9e7c-e366-4179-ab98-c01d5b2cfc3d-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-9t7c8\" (UID: \"c34c9e7c-e366-4179-ab98-c01d5b2cfc3d\") " pod="openstack/nova-cell0-cell-mapping-9t7c8" Sep 30 07:52:28 crc kubenswrapper[4760]: I0930 07:52:28.602776 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c34c9e7c-e366-4179-ab98-c01d5b2cfc3d-scripts\") pod \"nova-cell0-cell-mapping-9t7c8\" (UID: \"c34c9e7c-e366-4179-ab98-c01d5b2cfc3d\") " 
pod="openstack/nova-cell0-cell-mapping-9t7c8" Sep 30 07:52:28 crc kubenswrapper[4760]: I0930 07:52:28.704774 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jl2vf\" (UniqueName: \"kubernetes.io/projected/c34c9e7c-e366-4179-ab98-c01d5b2cfc3d-kube-api-access-jl2vf\") pod \"nova-cell0-cell-mapping-9t7c8\" (UID: \"c34c9e7c-e366-4179-ab98-c01d5b2cfc3d\") " pod="openstack/nova-cell0-cell-mapping-9t7c8" Sep 30 07:52:28 crc kubenswrapper[4760]: I0930 07:52:28.704904 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f0976286-8bdd-43e7-894c-3e899a323ee7-config-data\") pod \"nova-scheduler-0\" (UID: \"f0976286-8bdd-43e7-894c-3e899a323ee7\") " pod="openstack/nova-scheduler-0" Sep 30 07:52:28 crc kubenswrapper[4760]: I0930 07:52:28.704946 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c34c9e7c-e366-4179-ab98-c01d5b2cfc3d-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-9t7c8\" (UID: \"c34c9e7c-e366-4179-ab98-c01d5b2cfc3d\") " pod="openstack/nova-cell0-cell-mapping-9t7c8" Sep 30 07:52:28 crc kubenswrapper[4760]: I0930 07:52:28.705016 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c34c9e7c-e366-4179-ab98-c01d5b2cfc3d-scripts\") pod \"nova-cell0-cell-mapping-9t7c8\" (UID: \"c34c9e7c-e366-4179-ab98-c01d5b2cfc3d\") " pod="openstack/nova-cell0-cell-mapping-9t7c8" Sep 30 07:52:28 crc kubenswrapper[4760]: I0930 07:52:28.705081 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c34c9e7c-e366-4179-ab98-c01d5b2cfc3d-config-data\") pod \"nova-cell0-cell-mapping-9t7c8\" (UID: \"c34c9e7c-e366-4179-ab98-c01d5b2cfc3d\") " pod="openstack/nova-cell0-cell-mapping-9t7c8" Sep 30 07:52:28 crc 
kubenswrapper[4760]: I0930 07:52:28.705147 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n5n2v\" (UniqueName: \"kubernetes.io/projected/f0976286-8bdd-43e7-894c-3e899a323ee7-kube-api-access-n5n2v\") pod \"nova-scheduler-0\" (UID: \"f0976286-8bdd-43e7-894c-3e899a323ee7\") " pod="openstack/nova-scheduler-0" Sep 30 07:52:28 crc kubenswrapper[4760]: I0930 07:52:28.705172 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f0976286-8bdd-43e7-894c-3e899a323ee7-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"f0976286-8bdd-43e7-894c-3e899a323ee7\") " pod="openstack/nova-scheduler-0" Sep 30 07:52:28 crc kubenswrapper[4760]: I0930 07:52:28.716757 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c34c9e7c-e366-4179-ab98-c01d5b2cfc3d-scripts\") pod \"nova-cell0-cell-mapping-9t7c8\" (UID: \"c34c9e7c-e366-4179-ab98-c01d5b2cfc3d\") " pod="openstack/nova-cell0-cell-mapping-9t7c8" Sep 30 07:52:28 crc kubenswrapper[4760]: I0930 07:52:28.716777 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c34c9e7c-e366-4179-ab98-c01d5b2cfc3d-config-data\") pod \"nova-cell0-cell-mapping-9t7c8\" (UID: \"c34c9e7c-e366-4179-ab98-c01d5b2cfc3d\") " pod="openstack/nova-cell0-cell-mapping-9t7c8" Sep 30 07:52:28 crc kubenswrapper[4760]: I0930 07:52:28.730897 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c34c9e7c-e366-4179-ab98-c01d5b2cfc3d-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-9t7c8\" (UID: \"c34c9e7c-e366-4179-ab98-c01d5b2cfc3d\") " pod="openstack/nova-cell0-cell-mapping-9t7c8" Sep 30 07:52:28 crc kubenswrapper[4760]: I0930 07:52:28.737864 4760 kubelet.go:2421] "SyncLoop ADD" 
source="api" pods=["openstack/nova-api-0"] Sep 30 07:52:28 crc kubenswrapper[4760]: I0930 07:52:28.739441 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Sep 30 07:52:28 crc kubenswrapper[4760]: I0930 07:52:28.741550 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Sep 30 07:52:28 crc kubenswrapper[4760]: I0930 07:52:28.757714 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jl2vf\" (UniqueName: \"kubernetes.io/projected/c34c9e7c-e366-4179-ab98-c01d5b2cfc3d-kube-api-access-jl2vf\") pod \"nova-cell0-cell-mapping-9t7c8\" (UID: \"c34c9e7c-e366-4179-ab98-c01d5b2cfc3d\") " pod="openstack/nova-cell0-cell-mapping-9t7c8" Sep 30 07:52:28 crc kubenswrapper[4760]: I0930 07:52:28.758855 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-9t7c8" Sep 30 07:52:28 crc kubenswrapper[4760]: I0930 07:52:28.768350 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Sep 30 07:52:28 crc kubenswrapper[4760]: I0930 07:52:28.794456 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Sep 30 07:52:28 crc kubenswrapper[4760]: I0930 07:52:28.796047 4760 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Sep 30 07:52:28 crc kubenswrapper[4760]: I0930 07:52:28.804965 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Sep 30 07:52:28 crc kubenswrapper[4760]: I0930 07:52:28.806550 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f0976286-8bdd-43e7-894c-3e899a323ee7-config-data\") pod \"nova-scheduler-0\" (UID: \"f0976286-8bdd-43e7-894c-3e899a323ee7\") " pod="openstack/nova-scheduler-0" Sep 30 07:52:28 crc kubenswrapper[4760]: I0930 07:52:28.806682 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n5n2v\" (UniqueName: \"kubernetes.io/projected/f0976286-8bdd-43e7-894c-3e899a323ee7-kube-api-access-n5n2v\") pod \"nova-scheduler-0\" (UID: \"f0976286-8bdd-43e7-894c-3e899a323ee7\") " pod="openstack/nova-scheduler-0" Sep 30 07:52:28 crc kubenswrapper[4760]: I0930 07:52:28.806704 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f0976286-8bdd-43e7-894c-3e899a323ee7-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"f0976286-8bdd-43e7-894c-3e899a323ee7\") " pod="openstack/nova-scheduler-0" Sep 30 07:52:28 crc kubenswrapper[4760]: I0930 07:52:28.809795 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f0976286-8bdd-43e7-894c-3e899a323ee7-config-data\") pod \"nova-scheduler-0\" (UID: \"f0976286-8bdd-43e7-894c-3e899a323ee7\") " pod="openstack/nova-scheduler-0" Sep 30 07:52:28 crc kubenswrapper[4760]: I0930 07:52:28.832707 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f0976286-8bdd-43e7-894c-3e899a323ee7-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: 
\"f0976286-8bdd-43e7-894c-3e899a323ee7\") " pod="openstack/nova-scheduler-0" Sep 30 07:52:28 crc kubenswrapper[4760]: I0930 07:52:28.836702 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Sep 30 07:52:28 crc kubenswrapper[4760]: I0930 07:52:28.847089 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n5n2v\" (UniqueName: \"kubernetes.io/projected/f0976286-8bdd-43e7-894c-3e899a323ee7-kube-api-access-n5n2v\") pod \"nova-scheduler-0\" (UID: \"f0976286-8bdd-43e7-894c-3e899a323ee7\") " pod="openstack/nova-scheduler-0" Sep 30 07:52:28 crc kubenswrapper[4760]: I0930 07:52:28.852711 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Sep 30 07:52:28 crc kubenswrapper[4760]: I0930 07:52:28.854250 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Sep 30 07:52:28 crc kubenswrapper[4760]: I0930 07:52:28.858599 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Sep 30 07:52:28 crc kubenswrapper[4760]: I0930 07:52:28.873813 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Sep 30 07:52:28 crc kubenswrapper[4760]: I0930 07:52:28.895027 4760 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Sep 30 07:52:28 crc kubenswrapper[4760]: I0930 07:52:28.908793 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/56921bc4-752f-4605-9bf5-af56dbd217d4-config-data\") pod \"nova-metadata-0\" (UID: \"56921bc4-752f-4605-9bf5-af56dbd217d4\") " pod="openstack/nova-metadata-0" Sep 30 07:52:28 crc kubenswrapper[4760]: I0930 07:52:28.909089 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/56921bc4-752f-4605-9bf5-af56dbd217d4-logs\") pod \"nova-metadata-0\" (UID: \"56921bc4-752f-4605-9bf5-af56dbd217d4\") " pod="openstack/nova-metadata-0" Sep 30 07:52:28 crc kubenswrapper[4760]: I0930 07:52:28.909120 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lkbnp\" (UniqueName: \"kubernetes.io/projected/93652508-9731-4349-b689-3d7cda972bda-kube-api-access-lkbnp\") pod \"nova-api-0\" (UID: \"93652508-9731-4349-b689-3d7cda972bda\") " pod="openstack/nova-api-0" Sep 30 07:52:28 crc kubenswrapper[4760]: I0930 07:52:28.909139 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/56921bc4-752f-4605-9bf5-af56dbd217d4-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"56921bc4-752f-4605-9bf5-af56dbd217d4\") " pod="openstack/nova-metadata-0" Sep 30 07:52:28 crc kubenswrapper[4760]: I0930 07:52:28.909163 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/93652508-9731-4349-b689-3d7cda972bda-config-data\") pod \"nova-api-0\" (UID: \"93652508-9731-4349-b689-3d7cda972bda\") " pod="openstack/nova-api-0" Sep 30 07:52:28 crc kubenswrapper[4760]: I0930 07:52:28.909194 4760 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/93652508-9731-4349-b689-3d7cda972bda-logs\") pod \"nova-api-0\" (UID: \"93652508-9731-4349-b689-3d7cda972bda\") " pod="openstack/nova-api-0" Sep 30 07:52:28 crc kubenswrapper[4760]: I0930 07:52:28.909222 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/93652508-9731-4349-b689-3d7cda972bda-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"93652508-9731-4349-b689-3d7cda972bda\") " pod="openstack/nova-api-0" Sep 30 07:52:28 crc kubenswrapper[4760]: I0930 07:52:28.909474 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7btjc\" (UniqueName: \"kubernetes.io/projected/56921bc4-752f-4605-9bf5-af56dbd217d4-kube-api-access-7btjc\") pod \"nova-metadata-0\" (UID: \"56921bc4-752f-4605-9bf5-af56dbd217d4\") " pod="openstack/nova-metadata-0" Sep 30 07:52:28 crc kubenswrapper[4760]: I0930 07:52:28.962475 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-865f5d856f-g6596"] Sep 30 07:52:28 crc kubenswrapper[4760]: I0930 07:52:28.964085 4760 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-865f5d856f-g6596" Sep 30 07:52:28 crc kubenswrapper[4760]: I0930 07:52:28.989353 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-865f5d856f-g6596"] Sep 30 07:52:29 crc kubenswrapper[4760]: I0930 07:52:29.011462 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d5fec5af-a95f-4659-bf0f-e57806fa05c7-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"d5fec5af-a95f-4659-bf0f-e57806fa05c7\") " pod="openstack/nova-cell1-novncproxy-0" Sep 30 07:52:29 crc kubenswrapper[4760]: I0930 07:52:29.011563 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d5fec5af-a95f-4659-bf0f-e57806fa05c7-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"d5fec5af-a95f-4659-bf0f-e57806fa05c7\") " pod="openstack/nova-cell1-novncproxy-0" Sep 30 07:52:29 crc kubenswrapper[4760]: I0930 07:52:29.011595 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/56921bc4-752f-4605-9bf5-af56dbd217d4-config-data\") pod \"nova-metadata-0\" (UID: \"56921bc4-752f-4605-9bf5-af56dbd217d4\") " pod="openstack/nova-metadata-0" Sep 30 07:52:29 crc kubenswrapper[4760]: I0930 07:52:29.011628 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/56921bc4-752f-4605-9bf5-af56dbd217d4-logs\") pod \"nova-metadata-0\" (UID: \"56921bc4-752f-4605-9bf5-af56dbd217d4\") " pod="openstack/nova-metadata-0" Sep 30 07:52:29 crc kubenswrapper[4760]: I0930 07:52:29.011663 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lkbnp\" (UniqueName: \"kubernetes.io/projected/93652508-9731-4349-b689-3d7cda972bda-kube-api-access-lkbnp\") pod 
\"nova-api-0\" (UID: \"93652508-9731-4349-b689-3d7cda972bda\") " pod="openstack/nova-api-0" Sep 30 07:52:29 crc kubenswrapper[4760]: I0930 07:52:29.011682 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/56921bc4-752f-4605-9bf5-af56dbd217d4-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"56921bc4-752f-4605-9bf5-af56dbd217d4\") " pod="openstack/nova-metadata-0" Sep 30 07:52:29 crc kubenswrapper[4760]: I0930 07:52:29.011710 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/93652508-9731-4349-b689-3d7cda972bda-config-data\") pod \"nova-api-0\" (UID: \"93652508-9731-4349-b689-3d7cda972bda\") " pod="openstack/nova-api-0" Sep 30 07:52:29 crc kubenswrapper[4760]: I0930 07:52:29.011749 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/93652508-9731-4349-b689-3d7cda972bda-logs\") pod \"nova-api-0\" (UID: \"93652508-9731-4349-b689-3d7cda972bda\") " pod="openstack/nova-api-0" Sep 30 07:52:29 crc kubenswrapper[4760]: I0930 07:52:29.011768 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pvzx2\" (UniqueName: \"kubernetes.io/projected/d5fec5af-a95f-4659-bf0f-e57806fa05c7-kube-api-access-pvzx2\") pod \"nova-cell1-novncproxy-0\" (UID: \"d5fec5af-a95f-4659-bf0f-e57806fa05c7\") " pod="openstack/nova-cell1-novncproxy-0" Sep 30 07:52:29 crc kubenswrapper[4760]: I0930 07:52:29.011803 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/93652508-9731-4349-b689-3d7cda972bda-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"93652508-9731-4349-b689-3d7cda972bda\") " pod="openstack/nova-api-0" Sep 30 07:52:29 crc kubenswrapper[4760]: I0930 07:52:29.011898 4760 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7btjc\" (UniqueName: \"kubernetes.io/projected/56921bc4-752f-4605-9bf5-af56dbd217d4-kube-api-access-7btjc\") pod \"nova-metadata-0\" (UID: \"56921bc4-752f-4605-9bf5-af56dbd217d4\") " pod="openstack/nova-metadata-0" Sep 30 07:52:29 crc kubenswrapper[4760]: I0930 07:52:29.018218 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/56921bc4-752f-4605-9bf5-af56dbd217d4-logs\") pod \"nova-metadata-0\" (UID: \"56921bc4-752f-4605-9bf5-af56dbd217d4\") " pod="openstack/nova-metadata-0" Sep 30 07:52:29 crc kubenswrapper[4760]: I0930 07:52:29.018508 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/93652508-9731-4349-b689-3d7cda972bda-logs\") pod \"nova-api-0\" (UID: \"93652508-9731-4349-b689-3d7cda972bda\") " pod="openstack/nova-api-0" Sep 30 07:52:29 crc kubenswrapper[4760]: I0930 07:52:29.024339 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/56921bc4-752f-4605-9bf5-af56dbd217d4-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"56921bc4-752f-4605-9bf5-af56dbd217d4\") " pod="openstack/nova-metadata-0" Sep 30 07:52:29 crc kubenswrapper[4760]: I0930 07:52:29.025524 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/93652508-9731-4349-b689-3d7cda972bda-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"93652508-9731-4349-b689-3d7cda972bda\") " pod="openstack/nova-api-0" Sep 30 07:52:29 crc kubenswrapper[4760]: I0930 07:52:29.029769 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/56921bc4-752f-4605-9bf5-af56dbd217d4-config-data\") pod \"nova-metadata-0\" (UID: \"56921bc4-752f-4605-9bf5-af56dbd217d4\") " 
pod="openstack/nova-metadata-0" Sep 30 07:52:29 crc kubenswrapper[4760]: I0930 07:52:29.030343 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/93652508-9731-4349-b689-3d7cda972bda-config-data\") pod \"nova-api-0\" (UID: \"93652508-9731-4349-b689-3d7cda972bda\") " pod="openstack/nova-api-0" Sep 30 07:52:29 crc kubenswrapper[4760]: I0930 07:52:29.037067 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7btjc\" (UniqueName: \"kubernetes.io/projected/56921bc4-752f-4605-9bf5-af56dbd217d4-kube-api-access-7btjc\") pod \"nova-metadata-0\" (UID: \"56921bc4-752f-4605-9bf5-af56dbd217d4\") " pod="openstack/nova-metadata-0" Sep 30 07:52:29 crc kubenswrapper[4760]: I0930 07:52:29.040818 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lkbnp\" (UniqueName: \"kubernetes.io/projected/93652508-9731-4349-b689-3d7cda972bda-kube-api-access-lkbnp\") pod \"nova-api-0\" (UID: \"93652508-9731-4349-b689-3d7cda972bda\") " pod="openstack/nova-api-0" Sep 30 07:52:29 crc kubenswrapper[4760]: I0930 07:52:29.115392 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ce9663f0-6f75-4dd2-bb60-faf428321df0-ovsdbserver-sb\") pod \"dnsmasq-dns-865f5d856f-g6596\" (UID: \"ce9663f0-6f75-4dd2-bb60-faf428321df0\") " pod="openstack/dnsmasq-dns-865f5d856f-g6596" Sep 30 07:52:29 crc kubenswrapper[4760]: I0930 07:52:29.115668 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pvzx2\" (UniqueName: \"kubernetes.io/projected/d5fec5af-a95f-4659-bf0f-e57806fa05c7-kube-api-access-pvzx2\") pod \"nova-cell1-novncproxy-0\" (UID: \"d5fec5af-a95f-4659-bf0f-e57806fa05c7\") " pod="openstack/nova-cell1-novncproxy-0" Sep 30 07:52:29 crc kubenswrapper[4760]: I0930 07:52:29.115718 4760 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ce9663f0-6f75-4dd2-bb60-faf428321df0-dns-swift-storage-0\") pod \"dnsmasq-dns-865f5d856f-g6596\" (UID: \"ce9663f0-6f75-4dd2-bb60-faf428321df0\") " pod="openstack/dnsmasq-dns-865f5d856f-g6596" Sep 30 07:52:29 crc kubenswrapper[4760]: I0930 07:52:29.115777 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ce9663f0-6f75-4dd2-bb60-faf428321df0-ovsdbserver-nb\") pod \"dnsmasq-dns-865f5d856f-g6596\" (UID: \"ce9663f0-6f75-4dd2-bb60-faf428321df0\") " pod="openstack/dnsmasq-dns-865f5d856f-g6596" Sep 30 07:52:29 crc kubenswrapper[4760]: I0930 07:52:29.115812 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d5fec5af-a95f-4659-bf0f-e57806fa05c7-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"d5fec5af-a95f-4659-bf0f-e57806fa05c7\") " pod="openstack/nova-cell1-novncproxy-0" Sep 30 07:52:29 crc kubenswrapper[4760]: I0930 07:52:29.115835 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f24ph\" (UniqueName: \"kubernetes.io/projected/ce9663f0-6f75-4dd2-bb60-faf428321df0-kube-api-access-f24ph\") pod \"dnsmasq-dns-865f5d856f-g6596\" (UID: \"ce9663f0-6f75-4dd2-bb60-faf428321df0\") " pod="openstack/dnsmasq-dns-865f5d856f-g6596" Sep 30 07:52:29 crc kubenswrapper[4760]: I0930 07:52:29.115858 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ce9663f0-6f75-4dd2-bb60-faf428321df0-dns-svc\") pod \"dnsmasq-dns-865f5d856f-g6596\" (UID: \"ce9663f0-6f75-4dd2-bb60-faf428321df0\") " pod="openstack/dnsmasq-dns-865f5d856f-g6596" Sep 30 07:52:29 crc kubenswrapper[4760]: I0930 07:52:29.115876 4760 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d5fec5af-a95f-4659-bf0f-e57806fa05c7-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"d5fec5af-a95f-4659-bf0f-e57806fa05c7\") " pod="openstack/nova-cell1-novncproxy-0" Sep 30 07:52:29 crc kubenswrapper[4760]: I0930 07:52:29.115892 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ce9663f0-6f75-4dd2-bb60-faf428321df0-config\") pod \"dnsmasq-dns-865f5d856f-g6596\" (UID: \"ce9663f0-6f75-4dd2-bb60-faf428321df0\") " pod="openstack/dnsmasq-dns-865f5d856f-g6596" Sep 30 07:52:29 crc kubenswrapper[4760]: I0930 07:52:29.127937 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d5fec5af-a95f-4659-bf0f-e57806fa05c7-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"d5fec5af-a95f-4659-bf0f-e57806fa05c7\") " pod="openstack/nova-cell1-novncproxy-0" Sep 30 07:52:29 crc kubenswrapper[4760]: I0930 07:52:29.128068 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d5fec5af-a95f-4659-bf0f-e57806fa05c7-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"d5fec5af-a95f-4659-bf0f-e57806fa05c7\") " pod="openstack/nova-cell1-novncproxy-0" Sep 30 07:52:29 crc kubenswrapper[4760]: I0930 07:52:29.131108 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pvzx2\" (UniqueName: \"kubernetes.io/projected/d5fec5af-a95f-4659-bf0f-e57806fa05c7-kube-api-access-pvzx2\") pod \"nova-cell1-novncproxy-0\" (UID: \"d5fec5af-a95f-4659-bf0f-e57806fa05c7\") " pod="openstack/nova-cell1-novncproxy-0" Sep 30 07:52:29 crc kubenswrapper[4760]: I0930 07:52:29.219167 4760 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Sep 30 07:52:29 crc kubenswrapper[4760]: I0930 07:52:29.219410 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ce9663f0-6f75-4dd2-bb60-faf428321df0-dns-swift-storage-0\") pod \"dnsmasq-dns-865f5d856f-g6596\" (UID: \"ce9663f0-6f75-4dd2-bb60-faf428321df0\") " pod="openstack/dnsmasq-dns-865f5d856f-g6596" Sep 30 07:52:29 crc kubenswrapper[4760]: I0930 07:52:29.219503 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ce9663f0-6f75-4dd2-bb60-faf428321df0-ovsdbserver-nb\") pod \"dnsmasq-dns-865f5d856f-g6596\" (UID: \"ce9663f0-6f75-4dd2-bb60-faf428321df0\") " pod="openstack/dnsmasq-dns-865f5d856f-g6596" Sep 30 07:52:29 crc kubenswrapper[4760]: I0930 07:52:29.219541 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f24ph\" (UniqueName: \"kubernetes.io/projected/ce9663f0-6f75-4dd2-bb60-faf428321df0-kube-api-access-f24ph\") pod \"dnsmasq-dns-865f5d856f-g6596\" (UID: \"ce9663f0-6f75-4dd2-bb60-faf428321df0\") " pod="openstack/dnsmasq-dns-865f5d856f-g6596" Sep 30 07:52:29 crc kubenswrapper[4760]: I0930 07:52:29.219561 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ce9663f0-6f75-4dd2-bb60-faf428321df0-dns-svc\") pod \"dnsmasq-dns-865f5d856f-g6596\" (UID: \"ce9663f0-6f75-4dd2-bb60-faf428321df0\") " pod="openstack/dnsmasq-dns-865f5d856f-g6596" Sep 30 07:52:29 crc kubenswrapper[4760]: I0930 07:52:29.219580 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ce9663f0-6f75-4dd2-bb60-faf428321df0-config\") pod \"dnsmasq-dns-865f5d856f-g6596\" (UID: \"ce9663f0-6f75-4dd2-bb60-faf428321df0\") " pod="openstack/dnsmasq-dns-865f5d856f-g6596" Sep 30 
07:52:29 crc kubenswrapper[4760]: I0930 07:52:29.219648 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ce9663f0-6f75-4dd2-bb60-faf428321df0-ovsdbserver-sb\") pod \"dnsmasq-dns-865f5d856f-g6596\" (UID: \"ce9663f0-6f75-4dd2-bb60-faf428321df0\") " pod="openstack/dnsmasq-dns-865f5d856f-g6596" Sep 30 07:52:29 crc kubenswrapper[4760]: I0930 07:52:29.221114 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ce9663f0-6f75-4dd2-bb60-faf428321df0-ovsdbserver-nb\") pod \"dnsmasq-dns-865f5d856f-g6596\" (UID: \"ce9663f0-6f75-4dd2-bb60-faf428321df0\") " pod="openstack/dnsmasq-dns-865f5d856f-g6596" Sep 30 07:52:29 crc kubenswrapper[4760]: I0930 07:52:29.221162 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ce9663f0-6f75-4dd2-bb60-faf428321df0-dns-swift-storage-0\") pod \"dnsmasq-dns-865f5d856f-g6596\" (UID: \"ce9663f0-6f75-4dd2-bb60-faf428321df0\") " pod="openstack/dnsmasq-dns-865f5d856f-g6596" Sep 30 07:52:29 crc kubenswrapper[4760]: I0930 07:52:29.222535 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ce9663f0-6f75-4dd2-bb60-faf428321df0-config\") pod \"dnsmasq-dns-865f5d856f-g6596\" (UID: \"ce9663f0-6f75-4dd2-bb60-faf428321df0\") " pod="openstack/dnsmasq-dns-865f5d856f-g6596" Sep 30 07:52:29 crc kubenswrapper[4760]: I0930 07:52:29.223832 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ce9663f0-6f75-4dd2-bb60-faf428321df0-ovsdbserver-sb\") pod \"dnsmasq-dns-865f5d856f-g6596\" (UID: \"ce9663f0-6f75-4dd2-bb60-faf428321df0\") " pod="openstack/dnsmasq-dns-865f5d856f-g6596" Sep 30 07:52:29 crc kubenswrapper[4760]: I0930 07:52:29.224704 4760 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ce9663f0-6f75-4dd2-bb60-faf428321df0-dns-svc\") pod \"dnsmasq-dns-865f5d856f-g6596\" (UID: \"ce9663f0-6f75-4dd2-bb60-faf428321df0\") " pod="openstack/dnsmasq-dns-865f5d856f-g6596" Sep 30 07:52:29 crc kubenswrapper[4760]: I0930 07:52:29.231810 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Sep 30 07:52:29 crc kubenswrapper[4760]: I0930 07:52:29.238982 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f24ph\" (UniqueName: \"kubernetes.io/projected/ce9663f0-6f75-4dd2-bb60-faf428321df0-kube-api-access-f24ph\") pod \"dnsmasq-dns-865f5d856f-g6596\" (UID: \"ce9663f0-6f75-4dd2-bb60-faf428321df0\") " pod="openstack/dnsmasq-dns-865f5d856f-g6596" Sep 30 07:52:29 crc kubenswrapper[4760]: I0930 07:52:29.293771 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Sep 30 07:52:29 crc kubenswrapper[4760]: I0930 07:52:29.300464 4760 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-865f5d856f-g6596" Sep 30 07:52:29 crc kubenswrapper[4760]: I0930 07:52:29.464902 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-9t7c8"] Sep 30 07:52:29 crc kubenswrapper[4760]: W0930 07:52:29.585448 4760 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf0976286_8bdd_43e7_894c_3e899a323ee7.slice/crio-bf39715a3c6ae59463830e381824b5489226d4113d86548f053ed6cc39c6f69d WatchSource:0}: Error finding container bf39715a3c6ae59463830e381824b5489226d4113d86548f053ed6cc39c6f69d: Status 404 returned error can't find the container with id bf39715a3c6ae59463830e381824b5489226d4113d86548f053ed6cc39c6f69d Sep 30 07:52:29 crc kubenswrapper[4760]: I0930 07:52:29.595103 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Sep 30 07:52:29 crc kubenswrapper[4760]: I0930 07:52:29.595702 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-db-sync-7jv2q"] Sep 30 07:52:29 crc kubenswrapper[4760]: I0930 07:52:29.597626 4760 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-7jv2q" Sep 30 07:52:29 crc kubenswrapper[4760]: I0930 07:52:29.600462 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Sep 30 07:52:29 crc kubenswrapper[4760]: I0930 07:52:29.600614 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-scripts" Sep 30 07:52:29 crc kubenswrapper[4760]: I0930 07:52:29.603706 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-7jv2q"] Sep 30 07:52:29 crc kubenswrapper[4760]: I0930 07:52:29.710918 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-9t7c8" event={"ID":"c34c9e7c-e366-4179-ab98-c01d5b2cfc3d","Type":"ContainerStarted","Data":"e2fd7aa4296d95973edc65e9e3cc427d6fbf89c4fe786780f01c1a70f4e046b5"} Sep 30 07:52:29 crc kubenswrapper[4760]: I0930 07:52:29.712479 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"f0976286-8bdd-43e7-894c-3e899a323ee7","Type":"ContainerStarted","Data":"bf39715a3c6ae59463830e381824b5489226d4113d86548f053ed6cc39c6f69d"} Sep 30 07:52:29 crc kubenswrapper[4760]: I0930 07:52:29.745239 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xvd4x\" (UniqueName: \"kubernetes.io/projected/462eb909-9878-48ca-a8fc-93b0e6b6d7f7-kube-api-access-xvd4x\") pod \"nova-cell1-conductor-db-sync-7jv2q\" (UID: \"462eb909-9878-48ca-a8fc-93b0e6b6d7f7\") " pod="openstack/nova-cell1-conductor-db-sync-7jv2q" Sep 30 07:52:29 crc kubenswrapper[4760]: I0930 07:52:29.745330 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/462eb909-9878-48ca-a8fc-93b0e6b6d7f7-config-data\") pod \"nova-cell1-conductor-db-sync-7jv2q\" (UID: 
\"462eb909-9878-48ca-a8fc-93b0e6b6d7f7\") " pod="openstack/nova-cell1-conductor-db-sync-7jv2q" Sep 30 07:52:29 crc kubenswrapper[4760]: I0930 07:52:29.745385 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/462eb909-9878-48ca-a8fc-93b0e6b6d7f7-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-7jv2q\" (UID: \"462eb909-9878-48ca-a8fc-93b0e6b6d7f7\") " pod="openstack/nova-cell1-conductor-db-sync-7jv2q" Sep 30 07:52:29 crc kubenswrapper[4760]: I0930 07:52:29.745410 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/462eb909-9878-48ca-a8fc-93b0e6b6d7f7-scripts\") pod \"nova-cell1-conductor-db-sync-7jv2q\" (UID: \"462eb909-9878-48ca-a8fc-93b0e6b6d7f7\") " pod="openstack/nova-cell1-conductor-db-sync-7jv2q" Sep 30 07:52:29 crc kubenswrapper[4760]: I0930 07:52:29.750908 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Sep 30 07:52:29 crc kubenswrapper[4760]: I0930 07:52:29.848027 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xvd4x\" (UniqueName: \"kubernetes.io/projected/462eb909-9878-48ca-a8fc-93b0e6b6d7f7-kube-api-access-xvd4x\") pod \"nova-cell1-conductor-db-sync-7jv2q\" (UID: \"462eb909-9878-48ca-a8fc-93b0e6b6d7f7\") " pod="openstack/nova-cell1-conductor-db-sync-7jv2q" Sep 30 07:52:29 crc kubenswrapper[4760]: I0930 07:52:29.848371 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/462eb909-9878-48ca-a8fc-93b0e6b6d7f7-config-data\") pod \"nova-cell1-conductor-db-sync-7jv2q\" (UID: \"462eb909-9878-48ca-a8fc-93b0e6b6d7f7\") " pod="openstack/nova-cell1-conductor-db-sync-7jv2q" Sep 30 07:52:29 crc kubenswrapper[4760]: I0930 07:52:29.848417 4760 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/462eb909-9878-48ca-a8fc-93b0e6b6d7f7-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-7jv2q\" (UID: \"462eb909-9878-48ca-a8fc-93b0e6b6d7f7\") " pod="openstack/nova-cell1-conductor-db-sync-7jv2q" Sep 30 07:52:29 crc kubenswrapper[4760]: I0930 07:52:29.848439 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/462eb909-9878-48ca-a8fc-93b0e6b6d7f7-scripts\") pod \"nova-cell1-conductor-db-sync-7jv2q\" (UID: \"462eb909-9878-48ca-a8fc-93b0e6b6d7f7\") " pod="openstack/nova-cell1-conductor-db-sync-7jv2q" Sep 30 07:52:29 crc kubenswrapper[4760]: I0930 07:52:29.853459 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/462eb909-9878-48ca-a8fc-93b0e6b6d7f7-config-data\") pod \"nova-cell1-conductor-db-sync-7jv2q\" (UID: \"462eb909-9878-48ca-a8fc-93b0e6b6d7f7\") " pod="openstack/nova-cell1-conductor-db-sync-7jv2q" Sep 30 07:52:29 crc kubenswrapper[4760]: I0930 07:52:29.864572 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/462eb909-9878-48ca-a8fc-93b0e6b6d7f7-scripts\") pod \"nova-cell1-conductor-db-sync-7jv2q\" (UID: \"462eb909-9878-48ca-a8fc-93b0e6b6d7f7\") " pod="openstack/nova-cell1-conductor-db-sync-7jv2q" Sep 30 07:52:29 crc kubenswrapper[4760]: I0930 07:52:29.865638 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/462eb909-9878-48ca-a8fc-93b0e6b6d7f7-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-7jv2q\" (UID: \"462eb909-9878-48ca-a8fc-93b0e6b6d7f7\") " pod="openstack/nova-cell1-conductor-db-sync-7jv2q" Sep 30 07:52:29 crc kubenswrapper[4760]: I0930 07:52:29.866448 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-xvd4x\" (UniqueName: \"kubernetes.io/projected/462eb909-9878-48ca-a8fc-93b0e6b6d7f7-kube-api-access-xvd4x\") pod \"nova-cell1-conductor-db-sync-7jv2q\" (UID: \"462eb909-9878-48ca-a8fc-93b0e6b6d7f7\") " pod="openstack/nova-cell1-conductor-db-sync-7jv2q" Sep 30 07:52:29 crc kubenswrapper[4760]: I0930 07:52:29.920231 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-7jv2q" Sep 30 07:52:30 crc kubenswrapper[4760]: I0930 07:52:29.998419 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-865f5d856f-g6596"] Sep 30 07:52:30 crc kubenswrapper[4760]: I0930 07:52:30.163495 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Sep 30 07:52:30 crc kubenswrapper[4760]: I0930 07:52:30.185524 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Sep 30 07:52:30 crc kubenswrapper[4760]: W0930 07:52:30.252803 4760 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod56921bc4_752f_4605_9bf5_af56dbd217d4.slice/crio-750be7609d607a2155869459923ab798128b6d19a18b0f849ce6f0327b67b9fe WatchSource:0}: Error finding container 750be7609d607a2155869459923ab798128b6d19a18b0f849ce6f0327b67b9fe: Status 404 returned error can't find the container with id 750be7609d607a2155869459923ab798128b6d19a18b0f849ce6f0327b67b9fe Sep 30 07:52:30 crc kubenswrapper[4760]: I0930 07:52:30.615720 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-7jv2q"] Sep 30 07:52:30 crc kubenswrapper[4760]: W0930 07:52:30.618333 4760 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod462eb909_9878_48ca_a8fc_93b0e6b6d7f7.slice/crio-18246020b391b2a154a5ebe2bf559b9531bc8b274a2b19a66732ba5812233c41 WatchSource:0}: Error finding container 
18246020b391b2a154a5ebe2bf559b9531bc8b274a2b19a66732ba5812233c41: Status 404 returned error can't find the container with id 18246020b391b2a154a5ebe2bf559b9531bc8b274a2b19a66732ba5812233c41 Sep 30 07:52:30 crc kubenswrapper[4760]: I0930 07:52:30.753905 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"93652508-9731-4349-b689-3d7cda972bda","Type":"ContainerStarted","Data":"a75ba9a1c8b4a3580da473e55cf5e4264126e35640f9a972a09cdd9611a768c5"} Sep 30 07:52:30 crc kubenswrapper[4760]: I0930 07:52:30.757762 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-9t7c8" event={"ID":"c34c9e7c-e366-4179-ab98-c01d5b2cfc3d","Type":"ContainerStarted","Data":"b97542524e5f69ec62652adadc07d21dc16b82f150191c1c3d5136520b73aba7"} Sep 30 07:52:30 crc kubenswrapper[4760]: I0930 07:52:30.764130 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"56921bc4-752f-4605-9bf5-af56dbd217d4","Type":"ContainerStarted","Data":"750be7609d607a2155869459923ab798128b6d19a18b0f849ce6f0327b67b9fe"} Sep 30 07:52:30 crc kubenswrapper[4760]: I0930 07:52:30.775764 4760 generic.go:334] "Generic (PLEG): container finished" podID="ce9663f0-6f75-4dd2-bb60-faf428321df0" containerID="852c3e852888c560a155ec12eaca30e1d600594862e74176527525d120547e75" exitCode=0 Sep 30 07:52:30 crc kubenswrapper[4760]: I0930 07:52:30.775826 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-865f5d856f-g6596" event={"ID":"ce9663f0-6f75-4dd2-bb60-faf428321df0","Type":"ContainerDied","Data":"852c3e852888c560a155ec12eaca30e1d600594862e74176527525d120547e75"} Sep 30 07:52:30 crc kubenswrapper[4760]: I0930 07:52:30.775850 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-865f5d856f-g6596" event={"ID":"ce9663f0-6f75-4dd2-bb60-faf428321df0","Type":"ContainerStarted","Data":"7c8618ed1e723e71d30c9a74423ce6a114f6d7a851e78b24819d14b1647bb0e7"} Sep 30 
07:52:30 crc kubenswrapper[4760]: I0930 07:52:30.783213 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-7jv2q" event={"ID":"462eb909-9878-48ca-a8fc-93b0e6b6d7f7","Type":"ContainerStarted","Data":"18246020b391b2a154a5ebe2bf559b9531bc8b274a2b19a66732ba5812233c41"} Sep 30 07:52:30 crc kubenswrapper[4760]: I0930 07:52:30.790767 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"d5fec5af-a95f-4659-bf0f-e57806fa05c7","Type":"ContainerStarted","Data":"33db13328c7baffaa81aca9f9823ec821246f06d983647243775dbb9ca0dfed3"} Sep 30 07:52:30 crc kubenswrapper[4760]: I0930 07:52:30.801141 4760 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-cell-mapping-9t7c8" podStartSLOduration=2.801121073 podStartE2EDuration="2.801121073s" podCreationTimestamp="2025-09-30 07:52:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 07:52:30.777064989 +0000 UTC m=+1136.419971401" watchObservedRunningTime="2025-09-30 07:52:30.801121073 +0000 UTC m=+1136.444027485" Sep 30 07:52:31 crc kubenswrapper[4760]: I0930 07:52:31.810440 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-865f5d856f-g6596" event={"ID":"ce9663f0-6f75-4dd2-bb60-faf428321df0","Type":"ContainerStarted","Data":"891ea8af8a08e1995052ece449e6d3ad7ac43b37140fd454cb1de02f6360e8de"} Sep 30 07:52:31 crc kubenswrapper[4760]: I0930 07:52:31.810758 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-865f5d856f-g6596" Sep 30 07:52:31 crc kubenswrapper[4760]: I0930 07:52:31.818473 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-7jv2q" 
event={"ID":"462eb909-9878-48ca-a8fc-93b0e6b6d7f7","Type":"ContainerStarted","Data":"4e75feedf602ccc119e27c65423fb9dd78adee09529c1a10d43048a36aa5ea6f"} Sep 30 07:52:31 crc kubenswrapper[4760]: I0930 07:52:31.841981 4760 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-865f5d856f-g6596" podStartSLOduration=3.841962933 podStartE2EDuration="3.841962933s" podCreationTimestamp="2025-09-30 07:52:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 07:52:31.834275217 +0000 UTC m=+1137.477181629" watchObservedRunningTime="2025-09-30 07:52:31.841962933 +0000 UTC m=+1137.484869365" Sep 30 07:52:31 crc kubenswrapper[4760]: I0930 07:52:31.866040 4760 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-db-sync-7jv2q" podStartSLOduration=2.866017517 podStartE2EDuration="2.866017517s" podCreationTimestamp="2025-09-30 07:52:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 07:52:31.850602654 +0000 UTC m=+1137.493509066" watchObservedRunningTime="2025-09-30 07:52:31.866017517 +0000 UTC m=+1137.508923929" Sep 30 07:52:32 crc kubenswrapper[4760]: I0930 07:52:32.557644 4760 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Sep 30 07:52:32 crc kubenswrapper[4760]: I0930 07:52:32.569231 4760 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Sep 30 07:52:33 crc kubenswrapper[4760]: I0930 07:52:33.843928 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"56921bc4-752f-4605-9bf5-af56dbd217d4","Type":"ContainerStarted","Data":"22d78f6325b5cbcaeb5aee35041eaeded67bfd03fd7c20d2b14b4614a1f95ee4"} Sep 30 07:52:33 crc kubenswrapper[4760]: I0930 07:52:33.844541 4760 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"56921bc4-752f-4605-9bf5-af56dbd217d4","Type":"ContainerStarted","Data":"024d726c57758210a0b9cd176dec0dbf8a1feaa6e2df4cefe80f29d2802cf72e"} Sep 30 07:52:33 crc kubenswrapper[4760]: I0930 07:52:33.844045 4760 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="56921bc4-752f-4605-9bf5-af56dbd217d4" containerName="nova-metadata-metadata" containerID="cri-o://22d78f6325b5cbcaeb5aee35041eaeded67bfd03fd7c20d2b14b4614a1f95ee4" gracePeriod=30 Sep 30 07:52:33 crc kubenswrapper[4760]: I0930 07:52:33.844005 4760 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="56921bc4-752f-4605-9bf5-af56dbd217d4" containerName="nova-metadata-log" containerID="cri-o://024d726c57758210a0b9cd176dec0dbf8a1feaa6e2df4cefe80f29d2802cf72e" gracePeriod=30 Sep 30 07:52:33 crc kubenswrapper[4760]: I0930 07:52:33.846489 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"f0976286-8bdd-43e7-894c-3e899a323ee7","Type":"ContainerStarted","Data":"1d3d1f847bbdb09ebb495e4366300be15fc716ea055c4f1090d869713fd4a638"} Sep 30 07:52:33 crc kubenswrapper[4760]: I0930 07:52:33.856063 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"d5fec5af-a95f-4659-bf0f-e57806fa05c7","Type":"ContainerStarted","Data":"e5d0927efdc95d1021664f825a01aba7d3eec632230ec3412f5085ea4af22e0e"} Sep 30 07:52:33 crc kubenswrapper[4760]: I0930 07:52:33.856245 4760 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-novncproxy-0" podUID="d5fec5af-a95f-4659-bf0f-e57806fa05c7" containerName="nova-cell1-novncproxy-novncproxy" containerID="cri-o://e5d0927efdc95d1021664f825a01aba7d3eec632230ec3412f5085ea4af22e0e" gracePeriod=30 Sep 30 07:52:33 crc kubenswrapper[4760]: I0930 07:52:33.858435 4760 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"93652508-9731-4349-b689-3d7cda972bda","Type":"ContainerStarted","Data":"897c1c308c3fba3c7cfa72f91e90072ab60375c98b82465d97aec0982d58a466"} Sep 30 07:52:33 crc kubenswrapper[4760]: I0930 07:52:33.858462 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"93652508-9731-4349-b689-3d7cda972bda","Type":"ContainerStarted","Data":"50b30a83024ea4144118dad1a713f5a7170e709e15d89cbae2ff7f08087ab03a"} Sep 30 07:52:33 crc kubenswrapper[4760]: I0930 07:52:33.869618 4760 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=3.020026909 podStartE2EDuration="5.869592243s" podCreationTimestamp="2025-09-30 07:52:28 +0000 UTC" firstStartedPulling="2025-09-30 07:52:30.263537605 +0000 UTC m=+1135.906444017" lastFinishedPulling="2025-09-30 07:52:33.113102929 +0000 UTC m=+1138.756009351" observedRunningTime="2025-09-30 07:52:33.865265502 +0000 UTC m=+1139.508171914" watchObservedRunningTime="2025-09-30 07:52:33.869592243 +0000 UTC m=+1139.512498655" Sep 30 07:52:33 crc kubenswrapper[4760]: I0930 07:52:33.890329 4760 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=3.03966867 podStartE2EDuration="5.890300671s" podCreationTimestamp="2025-09-30 07:52:28 +0000 UTC" firstStartedPulling="2025-09-30 07:52:30.263502584 +0000 UTC m=+1135.906408996" lastFinishedPulling="2025-09-30 07:52:33.114134585 +0000 UTC m=+1138.757040997" observedRunningTime="2025-09-30 07:52:33.888141396 +0000 UTC m=+1139.531047808" watchObservedRunningTime="2025-09-30 07:52:33.890300671 +0000 UTC m=+1139.533207083" Sep 30 07:52:33 crc kubenswrapper[4760]: I0930 07:52:33.896017 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Sep 30 07:52:33 crc kubenswrapper[4760]: I0930 07:52:33.915687 4760 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.398558521 podStartE2EDuration="5.915668459s" podCreationTimestamp="2025-09-30 07:52:28 +0000 UTC" firstStartedPulling="2025-09-30 07:52:29.595587941 +0000 UTC m=+1135.238494353" lastFinishedPulling="2025-09-30 07:52:33.112697879 +0000 UTC m=+1138.755604291" observedRunningTime="2025-09-30 07:52:33.902870652 +0000 UTC m=+1139.545777074" watchObservedRunningTime="2025-09-30 07:52:33.915668459 +0000 UTC m=+1139.558574871" Sep 30 07:52:33 crc kubenswrapper[4760]: I0930 07:52:33.928096 4760 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.574564352 podStartE2EDuration="5.928074995s" podCreationTimestamp="2025-09-30 07:52:28 +0000 UTC" firstStartedPulling="2025-09-30 07:52:29.763958597 +0000 UTC m=+1135.406865009" lastFinishedPulling="2025-09-30 07:52:33.11746923 +0000 UTC m=+1138.760375652" observedRunningTime="2025-09-30 07:52:33.925185451 +0000 UTC m=+1139.568091863" watchObservedRunningTime="2025-09-30 07:52:33.928074995 +0000 UTC m=+1139.570981407" Sep 30 07:52:34 crc kubenswrapper[4760]: I0930 07:52:34.232639 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Sep 30 07:52:34 crc kubenswrapper[4760]: I0930 07:52:34.232687 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Sep 30 07:52:34 crc kubenswrapper[4760]: I0930 07:52:34.294707 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Sep 30 07:52:34 crc kubenswrapper[4760]: I0930 07:52:34.869007 4760 generic.go:334] "Generic (PLEG): container finished" podID="56921bc4-752f-4605-9bf5-af56dbd217d4" containerID="024d726c57758210a0b9cd176dec0dbf8a1feaa6e2df4cefe80f29d2802cf72e" exitCode=143 Sep 30 07:52:34 crc kubenswrapper[4760]: I0930 07:52:34.869049 4760 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"56921bc4-752f-4605-9bf5-af56dbd217d4","Type":"ContainerDied","Data":"024d726c57758210a0b9cd176dec0dbf8a1feaa6e2df4cefe80f29d2802cf72e"} Sep 30 07:52:36 crc kubenswrapper[4760]: I0930 07:52:36.771917 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Sep 30 07:52:37 crc kubenswrapper[4760]: I0930 07:52:37.906941 4760 generic.go:334] "Generic (PLEG): container finished" podID="c34c9e7c-e366-4179-ab98-c01d5b2cfc3d" containerID="b97542524e5f69ec62652adadc07d21dc16b82f150191c1c3d5136520b73aba7" exitCode=0 Sep 30 07:52:37 crc kubenswrapper[4760]: I0930 07:52:37.908290 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-9t7c8" event={"ID":"c34c9e7c-e366-4179-ab98-c01d5b2cfc3d","Type":"ContainerDied","Data":"b97542524e5f69ec62652adadc07d21dc16b82f150191c1c3d5136520b73aba7"} Sep 30 07:52:38 crc kubenswrapper[4760]: I0930 07:52:38.896428 4760 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Sep 30 07:52:38 crc kubenswrapper[4760]: I0930 07:52:38.923859 4760 generic.go:334] "Generic (PLEG): container finished" podID="462eb909-9878-48ca-a8fc-93b0e6b6d7f7" containerID="4e75feedf602ccc119e27c65423fb9dd78adee09529c1a10d43048a36aa5ea6f" exitCode=0 Sep 30 07:52:38 crc kubenswrapper[4760]: I0930 07:52:38.923927 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-7jv2q" event={"ID":"462eb909-9878-48ca-a8fc-93b0e6b6d7f7","Type":"ContainerDied","Data":"4e75feedf602ccc119e27c65423fb9dd78adee09529c1a10d43048a36aa5ea6f"} Sep 30 07:52:38 crc kubenswrapper[4760]: I0930 07:52:38.932207 4760 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Sep 30 07:52:38 crc kubenswrapper[4760]: I0930 07:52:38.986646 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="ready" pod="openstack/nova-scheduler-0" Sep 30 07:52:39 crc kubenswrapper[4760]: I0930 07:52:39.220213 4760 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Sep 30 07:52:39 crc kubenswrapper[4760]: I0930 07:52:39.220251 4760 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Sep 30 07:52:39 crc kubenswrapper[4760]: I0930 07:52:39.303153 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-865f5d856f-g6596" Sep 30 07:52:39 crc kubenswrapper[4760]: I0930 07:52:39.343836 4760 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-9t7c8" Sep 30 07:52:39 crc kubenswrapper[4760]: I0930 07:52:39.384646 4760 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6bb4fc677f-pqzsz"] Sep 30 07:52:39 crc kubenswrapper[4760]: I0930 07:52:39.385330 4760 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-6bb4fc677f-pqzsz" podUID="a40ada2e-c162-4cdc-a530-e6708fccae5c" containerName="dnsmasq-dns" containerID="cri-o://b531b3ff6b4f7f017c89b9fc274d05538564829cf62fdd550e88eca675b3088f" gracePeriod=10 Sep 30 07:52:39 crc kubenswrapper[4760]: I0930 07:52:39.504552 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c34c9e7c-e366-4179-ab98-c01d5b2cfc3d-scripts\") pod \"c34c9e7c-e366-4179-ab98-c01d5b2cfc3d\" (UID: \"c34c9e7c-e366-4179-ab98-c01d5b2cfc3d\") " Sep 30 07:52:39 crc kubenswrapper[4760]: I0930 07:52:39.504690 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c34c9e7c-e366-4179-ab98-c01d5b2cfc3d-combined-ca-bundle\") pod \"c34c9e7c-e366-4179-ab98-c01d5b2cfc3d\" (UID: \"c34c9e7c-e366-4179-ab98-c01d5b2cfc3d\") " Sep 30 07:52:39 crc 
kubenswrapper[4760]: I0930 07:52:39.504763 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jl2vf\" (UniqueName: \"kubernetes.io/projected/c34c9e7c-e366-4179-ab98-c01d5b2cfc3d-kube-api-access-jl2vf\") pod \"c34c9e7c-e366-4179-ab98-c01d5b2cfc3d\" (UID: \"c34c9e7c-e366-4179-ab98-c01d5b2cfc3d\") " Sep 30 07:52:39 crc kubenswrapper[4760]: I0930 07:52:39.504859 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c34c9e7c-e366-4179-ab98-c01d5b2cfc3d-config-data\") pod \"c34c9e7c-e366-4179-ab98-c01d5b2cfc3d\" (UID: \"c34c9e7c-e366-4179-ab98-c01d5b2cfc3d\") " Sep 30 07:52:39 crc kubenswrapper[4760]: I0930 07:52:39.519961 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c34c9e7c-e366-4179-ab98-c01d5b2cfc3d-kube-api-access-jl2vf" (OuterVolumeSpecName: "kube-api-access-jl2vf") pod "c34c9e7c-e366-4179-ab98-c01d5b2cfc3d" (UID: "c34c9e7c-e366-4179-ab98-c01d5b2cfc3d"). InnerVolumeSpecName "kube-api-access-jl2vf". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 07:52:39 crc kubenswrapper[4760]: I0930 07:52:39.521591 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c34c9e7c-e366-4179-ab98-c01d5b2cfc3d-scripts" (OuterVolumeSpecName: "scripts") pod "c34c9e7c-e366-4179-ab98-c01d5b2cfc3d" (UID: "c34c9e7c-e366-4179-ab98-c01d5b2cfc3d"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 07:52:39 crc kubenswrapper[4760]: I0930 07:52:39.560626 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c34c9e7c-e366-4179-ab98-c01d5b2cfc3d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c34c9e7c-e366-4179-ab98-c01d5b2cfc3d" (UID: "c34c9e7c-e366-4179-ab98-c01d5b2cfc3d"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 07:52:39 crc kubenswrapper[4760]: I0930 07:52:39.579012 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c34c9e7c-e366-4179-ab98-c01d5b2cfc3d-config-data" (OuterVolumeSpecName: "config-data") pod "c34c9e7c-e366-4179-ab98-c01d5b2cfc3d" (UID: "c34c9e7c-e366-4179-ab98-c01d5b2cfc3d"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 07:52:39 crc kubenswrapper[4760]: I0930 07:52:39.606835 4760 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c34c9e7c-e366-4179-ab98-c01d5b2cfc3d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 07:52:39 crc kubenswrapper[4760]: I0930 07:52:39.606881 4760 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jl2vf\" (UniqueName: \"kubernetes.io/projected/c34c9e7c-e366-4179-ab98-c01d5b2cfc3d-kube-api-access-jl2vf\") on node \"crc\" DevicePath \"\"" Sep 30 07:52:39 crc kubenswrapper[4760]: I0930 07:52:39.606896 4760 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c34c9e7c-e366-4179-ab98-c01d5b2cfc3d-config-data\") on node \"crc\" DevicePath \"\"" Sep 30 07:52:39 crc kubenswrapper[4760]: I0930 07:52:39.606908 4760 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c34c9e7c-e366-4179-ab98-c01d5b2cfc3d-scripts\") on node \"crc\" DevicePath \"\"" Sep 30 07:52:39 crc kubenswrapper[4760]: I0930 07:52:39.940731 4760 generic.go:334] "Generic (PLEG): container finished" podID="a40ada2e-c162-4cdc-a530-e6708fccae5c" containerID="b531b3ff6b4f7f017c89b9fc274d05538564829cf62fdd550e88eca675b3088f" exitCode=0 Sep 30 07:52:39 crc kubenswrapper[4760]: I0930 07:52:39.940797 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6bb4fc677f-pqzsz" 
event={"ID":"a40ada2e-c162-4cdc-a530-e6708fccae5c","Type":"ContainerDied","Data":"b531b3ff6b4f7f017c89b9fc274d05538564829cf62fdd550e88eca675b3088f"} Sep 30 07:52:39 crc kubenswrapper[4760]: I0930 07:52:39.948072 4760 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-9t7c8" Sep 30 07:52:39 crc kubenswrapper[4760]: I0930 07:52:39.951505 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-9t7c8" event={"ID":"c34c9e7c-e366-4179-ab98-c01d5b2cfc3d","Type":"ContainerDied","Data":"e2fd7aa4296d95973edc65e9e3cc427d6fbf89c4fe786780f01c1a70f4e046b5"} Sep 30 07:52:39 crc kubenswrapper[4760]: I0930 07:52:39.951539 4760 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e2fd7aa4296d95973edc65e9e3cc427d6fbf89c4fe786780f01c1a70f4e046b5" Sep 30 07:52:40 crc kubenswrapper[4760]: I0930 07:52:40.137898 4760 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6bb4fc677f-pqzsz" Sep 30 07:52:40 crc kubenswrapper[4760]: I0930 07:52:40.147852 4760 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Sep 30 07:52:40 crc kubenswrapper[4760]: I0930 07:52:40.148051 4760 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="93652508-9731-4349-b689-3d7cda972bda" containerName="nova-api-log" containerID="cri-o://50b30a83024ea4144118dad1a713f5a7170e709e15d89cbae2ff7f08087ab03a" gracePeriod=30 Sep 30 07:52:40 crc kubenswrapper[4760]: I0930 07:52:40.148178 4760 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="93652508-9731-4349-b689-3d7cda972bda" containerName="nova-api-api" containerID="cri-o://897c1c308c3fba3c7cfa72f91e90072ab60375c98b82465d97aec0982d58a466" gracePeriod=30 Sep 30 07:52:40 crc kubenswrapper[4760]: I0930 07:52:40.194160 4760 prober.go:107] "Probe failed" 
probeType="Startup" pod="openstack/nova-api-0" podUID="93652508-9731-4349-b689-3d7cda972bda" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.203:8774/\": EOF" Sep 30 07:52:40 crc kubenswrapper[4760]: I0930 07:52:40.194457 4760 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="93652508-9731-4349-b689-3d7cda972bda" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.203:8774/\": EOF" Sep 30 07:52:40 crc kubenswrapper[4760]: I0930 07:52:40.202606 4760 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Sep 30 07:52:40 crc kubenswrapper[4760]: I0930 07:52:40.220945 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a40ada2e-c162-4cdc-a530-e6708fccae5c-dns-svc\") pod \"a40ada2e-c162-4cdc-a530-e6708fccae5c\" (UID: \"a40ada2e-c162-4cdc-a530-e6708fccae5c\") " Sep 30 07:52:40 crc kubenswrapper[4760]: I0930 07:52:40.220997 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d82gs\" (UniqueName: \"kubernetes.io/projected/a40ada2e-c162-4cdc-a530-e6708fccae5c-kube-api-access-d82gs\") pod \"a40ada2e-c162-4cdc-a530-e6708fccae5c\" (UID: \"a40ada2e-c162-4cdc-a530-e6708fccae5c\") " Sep 30 07:52:40 crc kubenswrapper[4760]: I0930 07:52:40.221033 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/a40ada2e-c162-4cdc-a530-e6708fccae5c-dns-swift-storage-0\") pod \"a40ada2e-c162-4cdc-a530-e6708fccae5c\" (UID: \"a40ada2e-c162-4cdc-a530-e6708fccae5c\") " Sep 30 07:52:40 crc kubenswrapper[4760]: I0930 07:52:40.221075 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a40ada2e-c162-4cdc-a530-e6708fccae5c-ovsdbserver-sb\") pod 
\"a40ada2e-c162-4cdc-a530-e6708fccae5c\" (UID: \"a40ada2e-c162-4cdc-a530-e6708fccae5c\") " Sep 30 07:52:40 crc kubenswrapper[4760]: I0930 07:52:40.221191 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a40ada2e-c162-4cdc-a530-e6708fccae5c-config\") pod \"a40ada2e-c162-4cdc-a530-e6708fccae5c\" (UID: \"a40ada2e-c162-4cdc-a530-e6708fccae5c\") " Sep 30 07:52:40 crc kubenswrapper[4760]: I0930 07:52:40.221220 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a40ada2e-c162-4cdc-a530-e6708fccae5c-ovsdbserver-nb\") pod \"a40ada2e-c162-4cdc-a530-e6708fccae5c\" (UID: \"a40ada2e-c162-4cdc-a530-e6708fccae5c\") " Sep 30 07:52:40 crc kubenswrapper[4760]: I0930 07:52:40.282175 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a40ada2e-c162-4cdc-a530-e6708fccae5c-kube-api-access-d82gs" (OuterVolumeSpecName: "kube-api-access-d82gs") pod "a40ada2e-c162-4cdc-a530-e6708fccae5c" (UID: "a40ada2e-c162-4cdc-a530-e6708fccae5c"). InnerVolumeSpecName "kube-api-access-d82gs". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 07:52:40 crc kubenswrapper[4760]: I0930 07:52:40.341272 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a40ada2e-c162-4cdc-a530-e6708fccae5c-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "a40ada2e-c162-4cdc-a530-e6708fccae5c" (UID: "a40ada2e-c162-4cdc-a530-e6708fccae5c"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 07:52:40 crc kubenswrapper[4760]: I0930 07:52:40.342769 4760 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d82gs\" (UniqueName: \"kubernetes.io/projected/a40ada2e-c162-4cdc-a530-e6708fccae5c-kube-api-access-d82gs\") on node \"crc\" DevicePath \"\"" Sep 30 07:52:40 crc kubenswrapper[4760]: I0930 07:52:40.342804 4760 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/a40ada2e-c162-4cdc-a530-e6708fccae5c-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Sep 30 07:52:40 crc kubenswrapper[4760]: I0930 07:52:40.402760 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a40ada2e-c162-4cdc-a530-e6708fccae5c-config" (OuterVolumeSpecName: "config") pod "a40ada2e-c162-4cdc-a530-e6708fccae5c" (UID: "a40ada2e-c162-4cdc-a530-e6708fccae5c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 07:52:40 crc kubenswrapper[4760]: I0930 07:52:40.442828 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a40ada2e-c162-4cdc-a530-e6708fccae5c-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "a40ada2e-c162-4cdc-a530-e6708fccae5c" (UID: "a40ada2e-c162-4cdc-a530-e6708fccae5c"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 07:52:40 crc kubenswrapper[4760]: I0930 07:52:40.444239 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a40ada2e-c162-4cdc-a530-e6708fccae5c-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "a40ada2e-c162-4cdc-a530-e6708fccae5c" (UID: "a40ada2e-c162-4cdc-a530-e6708fccae5c"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 07:52:40 crc kubenswrapper[4760]: I0930 07:52:40.444560 4760 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a40ada2e-c162-4cdc-a530-e6708fccae5c-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Sep 30 07:52:40 crc kubenswrapper[4760]: I0930 07:52:40.444592 4760 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a40ada2e-c162-4cdc-a530-e6708fccae5c-config\") on node \"crc\" DevicePath \"\"" Sep 30 07:52:40 crc kubenswrapper[4760]: I0930 07:52:40.444601 4760 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a40ada2e-c162-4cdc-a530-e6708fccae5c-dns-svc\") on node \"crc\" DevicePath \"\"" Sep 30 07:52:40 crc kubenswrapper[4760]: I0930 07:52:40.511222 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a40ada2e-c162-4cdc-a530-e6708fccae5c-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "a40ada2e-c162-4cdc-a530-e6708fccae5c" (UID: "a40ada2e-c162-4cdc-a530-e6708fccae5c"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 07:52:40 crc kubenswrapper[4760]: I0930 07:52:40.531657 4760 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-7jv2q" Sep 30 07:52:40 crc kubenswrapper[4760]: I0930 07:52:40.550595 4760 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a40ada2e-c162-4cdc-a530-e6708fccae5c-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Sep 30 07:52:40 crc kubenswrapper[4760]: I0930 07:52:40.651727 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xvd4x\" (UniqueName: \"kubernetes.io/projected/462eb909-9878-48ca-a8fc-93b0e6b6d7f7-kube-api-access-xvd4x\") pod \"462eb909-9878-48ca-a8fc-93b0e6b6d7f7\" (UID: \"462eb909-9878-48ca-a8fc-93b0e6b6d7f7\") " Sep 30 07:52:40 crc kubenswrapper[4760]: I0930 07:52:40.651820 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/462eb909-9878-48ca-a8fc-93b0e6b6d7f7-config-data\") pod \"462eb909-9878-48ca-a8fc-93b0e6b6d7f7\" (UID: \"462eb909-9878-48ca-a8fc-93b0e6b6d7f7\") " Sep 30 07:52:40 crc kubenswrapper[4760]: I0930 07:52:40.651880 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/462eb909-9878-48ca-a8fc-93b0e6b6d7f7-scripts\") pod \"462eb909-9878-48ca-a8fc-93b0e6b6d7f7\" (UID: \"462eb909-9878-48ca-a8fc-93b0e6b6d7f7\") " Sep 30 07:52:40 crc kubenswrapper[4760]: I0930 07:52:40.651980 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/462eb909-9878-48ca-a8fc-93b0e6b6d7f7-combined-ca-bundle\") pod \"462eb909-9878-48ca-a8fc-93b0e6b6d7f7\" (UID: \"462eb909-9878-48ca-a8fc-93b0e6b6d7f7\") " Sep 30 07:52:40 crc kubenswrapper[4760]: I0930 07:52:40.659870 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/462eb909-9878-48ca-a8fc-93b0e6b6d7f7-scripts" (OuterVolumeSpecName: "scripts") pod 
"462eb909-9878-48ca-a8fc-93b0e6b6d7f7" (UID: "462eb909-9878-48ca-a8fc-93b0e6b6d7f7"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 07:52:40 crc kubenswrapper[4760]: I0930 07:52:40.660008 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/462eb909-9878-48ca-a8fc-93b0e6b6d7f7-kube-api-access-xvd4x" (OuterVolumeSpecName: "kube-api-access-xvd4x") pod "462eb909-9878-48ca-a8fc-93b0e6b6d7f7" (UID: "462eb909-9878-48ca-a8fc-93b0e6b6d7f7"). InnerVolumeSpecName "kube-api-access-xvd4x". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 07:52:40 crc kubenswrapper[4760]: I0930 07:52:40.682011 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/462eb909-9878-48ca-a8fc-93b0e6b6d7f7-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "462eb909-9878-48ca-a8fc-93b0e6b6d7f7" (UID: "462eb909-9878-48ca-a8fc-93b0e6b6d7f7"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 07:52:40 crc kubenswrapper[4760]: I0930 07:52:40.697037 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/462eb909-9878-48ca-a8fc-93b0e6b6d7f7-config-data" (OuterVolumeSpecName: "config-data") pod "462eb909-9878-48ca-a8fc-93b0e6b6d7f7" (UID: "462eb909-9878-48ca-a8fc-93b0e6b6d7f7"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 07:52:40 crc kubenswrapper[4760]: I0930 07:52:40.753744 4760 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/462eb909-9878-48ca-a8fc-93b0e6b6d7f7-config-data\") on node \"crc\" DevicePath \"\"" Sep 30 07:52:40 crc kubenswrapper[4760]: I0930 07:52:40.753778 4760 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/462eb909-9878-48ca-a8fc-93b0e6b6d7f7-scripts\") on node \"crc\" DevicePath \"\"" Sep 30 07:52:40 crc kubenswrapper[4760]: I0930 07:52:40.753789 4760 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/462eb909-9878-48ca-a8fc-93b0e6b6d7f7-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 07:52:40 crc kubenswrapper[4760]: I0930 07:52:40.753800 4760 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xvd4x\" (UniqueName: \"kubernetes.io/projected/462eb909-9878-48ca-a8fc-93b0e6b6d7f7-kube-api-access-xvd4x\") on node \"crc\" DevicePath \"\"" Sep 30 07:52:40 crc kubenswrapper[4760]: I0930 07:52:40.959437 4760 generic.go:334] "Generic (PLEG): container finished" podID="93652508-9731-4349-b689-3d7cda972bda" containerID="50b30a83024ea4144118dad1a713f5a7170e709e15d89cbae2ff7f08087ab03a" exitCode=143 Sep 30 07:52:40 crc kubenswrapper[4760]: I0930 07:52:40.959526 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"93652508-9731-4349-b689-3d7cda972bda","Type":"ContainerDied","Data":"50b30a83024ea4144118dad1a713f5a7170e709e15d89cbae2ff7f08087ab03a"} Sep 30 07:52:40 crc kubenswrapper[4760]: I0930 07:52:40.962699 4760 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6bb4fc677f-pqzsz" Sep 30 07:52:40 crc kubenswrapper[4760]: I0930 07:52:40.962711 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6bb4fc677f-pqzsz" event={"ID":"a40ada2e-c162-4cdc-a530-e6708fccae5c","Type":"ContainerDied","Data":"39125463dc37a32d1c1e163c1517ee202542ce0b2382d73fa449f64e0a292794"} Sep 30 07:52:40 crc kubenswrapper[4760]: I0930 07:52:40.962775 4760 scope.go:117] "RemoveContainer" containerID="b531b3ff6b4f7f017c89b9fc274d05538564829cf62fdd550e88eca675b3088f" Sep 30 07:52:40 crc kubenswrapper[4760]: I0930 07:52:40.976378 4760 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="f0976286-8bdd-43e7-894c-3e899a323ee7" containerName="nova-scheduler-scheduler" containerID="cri-o://1d3d1f847bbdb09ebb495e4366300be15fc716ea055c4f1090d869713fd4a638" gracePeriod=30 Sep 30 07:52:40 crc kubenswrapper[4760]: I0930 07:52:40.977741 4760 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-7jv2q" Sep 30 07:52:40 crc kubenswrapper[4760]: I0930 07:52:40.978710 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-7jv2q" event={"ID":"462eb909-9878-48ca-a8fc-93b0e6b6d7f7","Type":"ContainerDied","Data":"18246020b391b2a154a5ebe2bf559b9531bc8b274a2b19a66732ba5812233c41"} Sep 30 07:52:40 crc kubenswrapper[4760]: I0930 07:52:40.978759 4760 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="18246020b391b2a154a5ebe2bf559b9531bc8b274a2b19a66732ba5812233c41" Sep 30 07:52:41 crc kubenswrapper[4760]: I0930 07:52:41.012854 4760 scope.go:117] "RemoveContainer" containerID="f5a769dfc86cc49adfc7c5088d3d6f188ba956f8fbb6f1ce51f35650c51b3d4a" Sep 30 07:52:41 crc kubenswrapper[4760]: I0930 07:52:41.042613 4760 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6bb4fc677f-pqzsz"] Sep 30 07:52:41 crc kubenswrapper[4760]: I0930 07:52:41.116269 4760 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6bb4fc677f-pqzsz"] Sep 30 07:52:41 crc kubenswrapper[4760]: I0930 07:52:41.135065 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-0"] Sep 30 07:52:41 crc kubenswrapper[4760]: E0930 07:52:41.135456 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="462eb909-9878-48ca-a8fc-93b0e6b6d7f7" containerName="nova-cell1-conductor-db-sync" Sep 30 07:52:41 crc kubenswrapper[4760]: I0930 07:52:41.135469 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="462eb909-9878-48ca-a8fc-93b0e6b6d7f7" containerName="nova-cell1-conductor-db-sync" Sep 30 07:52:41 crc kubenswrapper[4760]: E0930 07:52:41.135481 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a40ada2e-c162-4cdc-a530-e6708fccae5c" containerName="dnsmasq-dns" Sep 30 07:52:41 crc kubenswrapper[4760]: I0930 07:52:41.135486 4760 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="a40ada2e-c162-4cdc-a530-e6708fccae5c" containerName="dnsmasq-dns" Sep 30 07:52:41 crc kubenswrapper[4760]: E0930 07:52:41.135504 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a40ada2e-c162-4cdc-a530-e6708fccae5c" containerName="init" Sep 30 07:52:41 crc kubenswrapper[4760]: I0930 07:52:41.135511 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="a40ada2e-c162-4cdc-a530-e6708fccae5c" containerName="init" Sep 30 07:52:41 crc kubenswrapper[4760]: E0930 07:52:41.135520 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c34c9e7c-e366-4179-ab98-c01d5b2cfc3d" containerName="nova-manage" Sep 30 07:52:41 crc kubenswrapper[4760]: I0930 07:52:41.135526 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="c34c9e7c-e366-4179-ab98-c01d5b2cfc3d" containerName="nova-manage" Sep 30 07:52:41 crc kubenswrapper[4760]: I0930 07:52:41.138243 4760 memory_manager.go:354] "RemoveStaleState removing state" podUID="462eb909-9878-48ca-a8fc-93b0e6b6d7f7" containerName="nova-cell1-conductor-db-sync" Sep 30 07:52:41 crc kubenswrapper[4760]: I0930 07:52:41.138276 4760 memory_manager.go:354] "RemoveStaleState removing state" podUID="a40ada2e-c162-4cdc-a530-e6708fccae5c" containerName="dnsmasq-dns" Sep 30 07:52:41 crc kubenswrapper[4760]: I0930 07:52:41.138294 4760 memory_manager.go:354] "RemoveStaleState removing state" podUID="c34c9e7c-e366-4179-ab98-c01d5b2cfc3d" containerName="nova-manage" Sep 30 07:52:41 crc kubenswrapper[4760]: I0930 07:52:41.138974 4760 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-0" Sep 30 07:52:41 crc kubenswrapper[4760]: I0930 07:52:41.141378 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Sep 30 07:52:41 crc kubenswrapper[4760]: I0930 07:52:41.157367 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Sep 30 07:52:41 crc kubenswrapper[4760]: I0930 07:52:41.188283 4760 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Sep 30 07:52:41 crc kubenswrapper[4760]: I0930 07:52:41.188515 4760 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/kube-state-metrics-0" podUID="23bad0af-c21e-4ba1-bc39-39c48f0fea56" containerName="kube-state-metrics" containerID="cri-o://9c0afca8ae3aeebf6add76b7fc525a9757fa0cf9f077654066903473a02b7265" gracePeriod=30 Sep 30 07:52:41 crc kubenswrapper[4760]: I0930 07:52:41.263694 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bk7bm\" (UniqueName: \"kubernetes.io/projected/998c9109-185a-425a-bda1-12eb13c83ca7-kube-api-access-bk7bm\") pod \"nova-cell1-conductor-0\" (UID: \"998c9109-185a-425a-bda1-12eb13c83ca7\") " pod="openstack/nova-cell1-conductor-0" Sep 30 07:52:41 crc kubenswrapper[4760]: I0930 07:52:41.263745 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/998c9109-185a-425a-bda1-12eb13c83ca7-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"998c9109-185a-425a-bda1-12eb13c83ca7\") " pod="openstack/nova-cell1-conductor-0" Sep 30 07:52:41 crc kubenswrapper[4760]: I0930 07:52:41.264151 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/998c9109-185a-425a-bda1-12eb13c83ca7-config-data\") pod 
\"nova-cell1-conductor-0\" (UID: \"998c9109-185a-425a-bda1-12eb13c83ca7\") " pod="openstack/nova-cell1-conductor-0" Sep 30 07:52:41 crc kubenswrapper[4760]: I0930 07:52:41.365211 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/998c9109-185a-425a-bda1-12eb13c83ca7-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"998c9109-185a-425a-bda1-12eb13c83ca7\") " pod="openstack/nova-cell1-conductor-0" Sep 30 07:52:41 crc kubenswrapper[4760]: I0930 07:52:41.365529 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bk7bm\" (UniqueName: \"kubernetes.io/projected/998c9109-185a-425a-bda1-12eb13c83ca7-kube-api-access-bk7bm\") pod \"nova-cell1-conductor-0\" (UID: \"998c9109-185a-425a-bda1-12eb13c83ca7\") " pod="openstack/nova-cell1-conductor-0" Sep 30 07:52:41 crc kubenswrapper[4760]: I0930 07:52:41.365561 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/998c9109-185a-425a-bda1-12eb13c83ca7-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"998c9109-185a-425a-bda1-12eb13c83ca7\") " pod="openstack/nova-cell1-conductor-0" Sep 30 07:52:41 crc kubenswrapper[4760]: I0930 07:52:41.371951 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/998c9109-185a-425a-bda1-12eb13c83ca7-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"998c9109-185a-425a-bda1-12eb13c83ca7\") " pod="openstack/nova-cell1-conductor-0" Sep 30 07:52:41 crc kubenswrapper[4760]: I0930 07:52:41.371974 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/998c9109-185a-425a-bda1-12eb13c83ca7-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"998c9109-185a-425a-bda1-12eb13c83ca7\") " pod="openstack/nova-cell1-conductor-0" Sep 30 
07:52:41 crc kubenswrapper[4760]: I0930 07:52:41.383193 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bk7bm\" (UniqueName: \"kubernetes.io/projected/998c9109-185a-425a-bda1-12eb13c83ca7-kube-api-access-bk7bm\") pod \"nova-cell1-conductor-0\" (UID: \"998c9109-185a-425a-bda1-12eb13c83ca7\") " pod="openstack/nova-cell1-conductor-0" Sep 30 07:52:41 crc kubenswrapper[4760]: I0930 07:52:41.476912 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0" Sep 30 07:52:41 crc kubenswrapper[4760]: I0930 07:52:41.619358 4760 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Sep 30 07:52:41 crc kubenswrapper[4760]: I0930 07:52:41.772593 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mxnfd\" (UniqueName: \"kubernetes.io/projected/23bad0af-c21e-4ba1-bc39-39c48f0fea56-kube-api-access-mxnfd\") pod \"23bad0af-c21e-4ba1-bc39-39c48f0fea56\" (UID: \"23bad0af-c21e-4ba1-bc39-39c48f0fea56\") " Sep 30 07:52:41 crc kubenswrapper[4760]: I0930 07:52:41.787242 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/23bad0af-c21e-4ba1-bc39-39c48f0fea56-kube-api-access-mxnfd" (OuterVolumeSpecName: "kube-api-access-mxnfd") pod "23bad0af-c21e-4ba1-bc39-39c48f0fea56" (UID: "23bad0af-c21e-4ba1-bc39-39c48f0fea56"). InnerVolumeSpecName "kube-api-access-mxnfd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 07:52:41 crc kubenswrapper[4760]: I0930 07:52:41.875828 4760 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mxnfd\" (UniqueName: \"kubernetes.io/projected/23bad0af-c21e-4ba1-bc39-39c48f0fea56-kube-api-access-mxnfd\") on node \"crc\" DevicePath \"\"" Sep 30 07:52:41 crc kubenswrapper[4760]: I0930 07:52:41.967525 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Sep 30 07:52:41 crc kubenswrapper[4760]: I0930 07:52:41.992843 4760 generic.go:334] "Generic (PLEG): container finished" podID="23bad0af-c21e-4ba1-bc39-39c48f0fea56" containerID="9c0afca8ae3aeebf6add76b7fc525a9757fa0cf9f077654066903473a02b7265" exitCode=2 Sep 30 07:52:41 crc kubenswrapper[4760]: I0930 07:52:41.992907 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"23bad0af-c21e-4ba1-bc39-39c48f0fea56","Type":"ContainerDied","Data":"9c0afca8ae3aeebf6add76b7fc525a9757fa0cf9f077654066903473a02b7265"} Sep 30 07:52:41 crc kubenswrapper[4760]: I0930 07:52:41.992932 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"23bad0af-c21e-4ba1-bc39-39c48f0fea56","Type":"ContainerDied","Data":"a7978639ab6b98dc70f658a0441f8e021e96ea6fab411af0b2ccce15d232ff62"} Sep 30 07:52:41 crc kubenswrapper[4760]: I0930 07:52:41.992947 4760 scope.go:117] "RemoveContainer" containerID="9c0afca8ae3aeebf6add76b7fc525a9757fa0cf9f077654066903473a02b7265" Sep 30 07:52:41 crc kubenswrapper[4760]: I0930 07:52:41.993028 4760 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Sep 30 07:52:42 crc kubenswrapper[4760]: I0930 07:52:42.003785 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"998c9109-185a-425a-bda1-12eb13c83ca7","Type":"ContainerStarted","Data":"d070b88c8b406eff7b431cb98671c6841dcf9ba78fcf317315478123b12716b2"} Sep 30 07:52:42 crc kubenswrapper[4760]: I0930 07:52:42.030699 4760 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Sep 30 07:52:42 crc kubenswrapper[4760]: I0930 07:52:42.032696 4760 scope.go:117] "RemoveContainer" containerID="9c0afca8ae3aeebf6add76b7fc525a9757fa0cf9f077654066903473a02b7265" Sep 30 07:52:42 crc kubenswrapper[4760]: E0930 07:52:42.034933 4760 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9c0afca8ae3aeebf6add76b7fc525a9757fa0cf9f077654066903473a02b7265\": container with ID starting with 9c0afca8ae3aeebf6add76b7fc525a9757fa0cf9f077654066903473a02b7265 not found: ID does not exist" containerID="9c0afca8ae3aeebf6add76b7fc525a9757fa0cf9f077654066903473a02b7265" Sep 30 07:52:42 crc kubenswrapper[4760]: I0930 07:52:42.034978 4760 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9c0afca8ae3aeebf6add76b7fc525a9757fa0cf9f077654066903473a02b7265"} err="failed to get container status \"9c0afca8ae3aeebf6add76b7fc525a9757fa0cf9f077654066903473a02b7265\": rpc error: code = NotFound desc = could not find container \"9c0afca8ae3aeebf6add76b7fc525a9757fa0cf9f077654066903473a02b7265\": container with ID starting with 9c0afca8ae3aeebf6add76b7fc525a9757fa0cf9f077654066903473a02b7265 not found: ID does not exist" Sep 30 07:52:42 crc kubenswrapper[4760]: I0930 07:52:42.044729 4760 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/kube-state-metrics-0"] Sep 30 07:52:42 crc kubenswrapper[4760]: I0930 07:52:42.079198 4760 
kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Sep 30 07:52:42 crc kubenswrapper[4760]: E0930 07:52:42.080935 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="23bad0af-c21e-4ba1-bc39-39c48f0fea56" containerName="kube-state-metrics" Sep 30 07:52:42 crc kubenswrapper[4760]: I0930 07:52:42.080958 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="23bad0af-c21e-4ba1-bc39-39c48f0fea56" containerName="kube-state-metrics" Sep 30 07:52:42 crc kubenswrapper[4760]: I0930 07:52:42.081329 4760 memory_manager.go:354] "RemoveStaleState removing state" podUID="23bad0af-c21e-4ba1-bc39-39c48f0fea56" containerName="kube-state-metrics" Sep 30 07:52:42 crc kubenswrapper[4760]: I0930 07:52:42.087147 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Sep 30 07:52:42 crc kubenswrapper[4760]: I0930 07:52:42.090504 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-kube-state-metrics-svc" Sep 30 07:52:42 crc kubenswrapper[4760]: I0930 07:52:42.090677 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"kube-state-metrics-tls-config" Sep 30 07:52:42 crc kubenswrapper[4760]: I0930 07:52:42.110890 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Sep 30 07:52:42 crc kubenswrapper[4760]: I0930 07:52:42.183727 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/4fc405db-b61d-4077-b14d-ef2b4eea924c-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"4fc405db-b61d-4077-b14d-ef2b4eea924c\") " pod="openstack/kube-state-metrics-0" Sep 30 07:52:42 crc kubenswrapper[4760]: I0930 07:52:42.183831 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/4fc405db-b61d-4077-b14d-ef2b4eea924c-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"4fc405db-b61d-4077-b14d-ef2b4eea924c\") " pod="openstack/kube-state-metrics-0" Sep 30 07:52:42 crc kubenswrapper[4760]: I0930 07:52:42.184000 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4fc405db-b61d-4077-b14d-ef2b4eea924c-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"4fc405db-b61d-4077-b14d-ef2b4eea924c\") " pod="openstack/kube-state-metrics-0" Sep 30 07:52:42 crc kubenswrapper[4760]: I0930 07:52:42.184050 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qxz2j\" (UniqueName: \"kubernetes.io/projected/4fc405db-b61d-4077-b14d-ef2b4eea924c-kube-api-access-qxz2j\") pod \"kube-state-metrics-0\" (UID: \"4fc405db-b61d-4077-b14d-ef2b4eea924c\") " pod="openstack/kube-state-metrics-0" Sep 30 07:52:42 crc kubenswrapper[4760]: I0930 07:52:42.285852 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/4fc405db-b61d-4077-b14d-ef2b4eea924c-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"4fc405db-b61d-4077-b14d-ef2b4eea924c\") " pod="openstack/kube-state-metrics-0" Sep 30 07:52:42 crc kubenswrapper[4760]: I0930 07:52:42.285927 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/4fc405db-b61d-4077-b14d-ef2b4eea924c-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"4fc405db-b61d-4077-b14d-ef2b4eea924c\") " pod="openstack/kube-state-metrics-0" Sep 30 07:52:42 crc kubenswrapper[4760]: I0930 07:52:42.286010 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/4fc405db-b61d-4077-b14d-ef2b4eea924c-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"4fc405db-b61d-4077-b14d-ef2b4eea924c\") " pod="openstack/kube-state-metrics-0" Sep 30 07:52:42 crc kubenswrapper[4760]: I0930 07:52:42.286044 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qxz2j\" (UniqueName: \"kubernetes.io/projected/4fc405db-b61d-4077-b14d-ef2b4eea924c-kube-api-access-qxz2j\") pod \"kube-state-metrics-0\" (UID: \"4fc405db-b61d-4077-b14d-ef2b4eea924c\") " pod="openstack/kube-state-metrics-0" Sep 30 07:52:42 crc kubenswrapper[4760]: I0930 07:52:42.291871 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/4fc405db-b61d-4077-b14d-ef2b4eea924c-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"4fc405db-b61d-4077-b14d-ef2b4eea924c\") " pod="openstack/kube-state-metrics-0" Sep 30 07:52:42 crc kubenswrapper[4760]: I0930 07:52:42.292671 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4fc405db-b61d-4077-b14d-ef2b4eea924c-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"4fc405db-b61d-4077-b14d-ef2b4eea924c\") " pod="openstack/kube-state-metrics-0" Sep 30 07:52:42 crc kubenswrapper[4760]: I0930 07:52:42.294870 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/4fc405db-b61d-4077-b14d-ef2b4eea924c-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"4fc405db-b61d-4077-b14d-ef2b4eea924c\") " pod="openstack/kube-state-metrics-0" Sep 30 07:52:42 crc kubenswrapper[4760]: I0930 07:52:42.314940 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qxz2j\" (UniqueName: \"kubernetes.io/projected/4fc405db-b61d-4077-b14d-ef2b4eea924c-kube-api-access-qxz2j\") 
pod \"kube-state-metrics-0\" (UID: \"4fc405db-b61d-4077-b14d-ef2b4eea924c\") " pod="openstack/kube-state-metrics-0" Sep 30 07:52:42 crc kubenswrapper[4760]: I0930 07:52:42.429159 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Sep 30 07:52:43 crc kubenswrapper[4760]: I0930 07:52:43.016235 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"998c9109-185a-425a-bda1-12eb13c83ca7","Type":"ContainerStarted","Data":"f985e80517117c5426a5f1c530e5dd64e051804024d35492b47562fb675ae2b8"} Sep 30 07:52:43 crc kubenswrapper[4760]: I0930 07:52:43.016700 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-conductor-0" Sep 30 07:52:43 crc kubenswrapper[4760]: I0930 07:52:43.035404 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Sep 30 07:52:43 crc kubenswrapper[4760]: I0930 07:52:43.051843 4760 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-0" podStartSLOduration=2.051823721 podStartE2EDuration="2.051823721s" podCreationTimestamp="2025-09-30 07:52:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 07:52:43.045848669 +0000 UTC m=+1148.688755121" watchObservedRunningTime="2025-09-30 07:52:43.051823721 +0000 UTC m=+1148.694730133" Sep 30 07:52:43 crc kubenswrapper[4760]: I0930 07:52:43.080525 4760 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="23bad0af-c21e-4ba1-bc39-39c48f0fea56" path="/var/lib/kubelet/pods/23bad0af-c21e-4ba1-bc39-39c48f0fea56/volumes" Sep 30 07:52:43 crc kubenswrapper[4760]: I0930 07:52:43.081264 4760 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a40ada2e-c162-4cdc-a530-e6708fccae5c" path="/var/lib/kubelet/pods/a40ada2e-c162-4cdc-a530-e6708fccae5c/volumes" Sep 30 
07:52:43 crc kubenswrapper[4760]: I0930 07:52:43.767739 4760 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Sep 30 07:52:43 crc kubenswrapper[4760]: I0930 07:52:43.768833 4760 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="eaa10928-35f2-46a8-82e2-b6569a81187d" containerName="sg-core" containerID="cri-o://72d895c026a810a814b8f0274f0762c27f77f307451a38e805851d97c6a3b4cb" gracePeriod=30 Sep 30 07:52:43 crc kubenswrapper[4760]: I0930 07:52:43.768885 4760 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="eaa10928-35f2-46a8-82e2-b6569a81187d" containerName="ceilometer-notification-agent" containerID="cri-o://8ad2037f68588817853afa08aec9a4e75fd5f1270a2fe780394aa365a27b5a75" gracePeriod=30 Sep 30 07:52:43 crc kubenswrapper[4760]: I0930 07:52:43.768920 4760 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="eaa10928-35f2-46a8-82e2-b6569a81187d" containerName="proxy-httpd" containerID="cri-o://1773ca8a16bacd865cd3b9aadd12895a310b3ea1cef9d4618ee501f943a6d75c" gracePeriod=30 Sep 30 07:52:43 crc kubenswrapper[4760]: I0930 07:52:43.768785 4760 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="eaa10928-35f2-46a8-82e2-b6569a81187d" containerName="ceilometer-central-agent" containerID="cri-o://0b7daf12868ebf9d853e0914a69d76d36dd87684c2dd01f4ab43ba3d656af8f8" gracePeriod=30 Sep 30 07:52:43 crc kubenswrapper[4760]: E0930 07:52:43.897422 4760 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="1d3d1f847bbdb09ebb495e4366300be15fc716ea055c4f1090d869713fd4a638" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Sep 30 07:52:43 crc kubenswrapper[4760]: E0930 07:52:43.900891 4760 
log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="1d3d1f847bbdb09ebb495e4366300be15fc716ea055c4f1090d869713fd4a638" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Sep 30 07:52:43 crc kubenswrapper[4760]: E0930 07:52:43.902428 4760 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="1d3d1f847bbdb09ebb495e4366300be15fc716ea055c4f1090d869713fd4a638" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Sep 30 07:52:43 crc kubenswrapper[4760]: E0930 07:52:43.902488 4760 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="f0976286-8bdd-43e7-894c-3e899a323ee7" containerName="nova-scheduler-scheduler" Sep 30 07:52:44 crc kubenswrapper[4760]: I0930 07:52:44.028360 4760 generic.go:334] "Generic (PLEG): container finished" podID="eaa10928-35f2-46a8-82e2-b6569a81187d" containerID="1773ca8a16bacd865cd3b9aadd12895a310b3ea1cef9d4618ee501f943a6d75c" exitCode=0 Sep 30 07:52:44 crc kubenswrapper[4760]: I0930 07:52:44.029476 4760 generic.go:334] "Generic (PLEG): container finished" podID="eaa10928-35f2-46a8-82e2-b6569a81187d" containerID="72d895c026a810a814b8f0274f0762c27f77f307451a38e805851d97c6a3b4cb" exitCode=2 Sep 30 07:52:44 crc kubenswrapper[4760]: I0930 07:52:44.028440 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"eaa10928-35f2-46a8-82e2-b6569a81187d","Type":"ContainerDied","Data":"1773ca8a16bacd865cd3b9aadd12895a310b3ea1cef9d4618ee501f943a6d75c"} Sep 30 07:52:44 crc kubenswrapper[4760]: I0930 07:52:44.029716 4760 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"eaa10928-35f2-46a8-82e2-b6569a81187d","Type":"ContainerDied","Data":"72d895c026a810a814b8f0274f0762c27f77f307451a38e805851d97c6a3b4cb"} Sep 30 07:52:44 crc kubenswrapper[4760]: I0930 07:52:44.032077 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"4fc405db-b61d-4077-b14d-ef2b4eea924c","Type":"ContainerStarted","Data":"93be6cfdb8bde317ae9582ed0ce47b73ab1ceee50f5d91ed037ce9a99a2351c1"} Sep 30 07:52:44 crc kubenswrapper[4760]: I0930 07:52:44.032138 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"4fc405db-b61d-4077-b14d-ef2b4eea924c","Type":"ContainerStarted","Data":"7f7e8004cad337794b4dd15dd648bc3e4103998a53ae0d9613bbf6791120b481"} Sep 30 07:52:44 crc kubenswrapper[4760]: I0930 07:52:44.032402 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Sep 30 07:52:44 crc kubenswrapper[4760]: I0930 07:52:44.058192 4760 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=1.710198431 podStartE2EDuration="2.05817231s" podCreationTimestamp="2025-09-30 07:52:42 +0000 UTC" firstStartedPulling="2025-09-30 07:52:43.038986574 +0000 UTC m=+1148.681892986" lastFinishedPulling="2025-09-30 07:52:43.386960423 +0000 UTC m=+1149.029866865" observedRunningTime="2025-09-30 07:52:44.047675672 +0000 UTC m=+1149.690582084" watchObservedRunningTime="2025-09-30 07:52:44.05817231 +0000 UTC m=+1149.701078722" Sep 30 07:52:44 crc kubenswrapper[4760]: I0930 07:52:44.704138 4760 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Sep 30 07:52:44 crc kubenswrapper[4760]: I0930 07:52:44.852631 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/eaa10928-35f2-46a8-82e2-b6569a81187d-scripts\") pod \"eaa10928-35f2-46a8-82e2-b6569a81187d\" (UID: \"eaa10928-35f2-46a8-82e2-b6569a81187d\") " Sep 30 07:52:44 crc kubenswrapper[4760]: I0930 07:52:44.852773 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/eaa10928-35f2-46a8-82e2-b6569a81187d-log-httpd\") pod \"eaa10928-35f2-46a8-82e2-b6569a81187d\" (UID: \"eaa10928-35f2-46a8-82e2-b6569a81187d\") " Sep 30 07:52:44 crc kubenswrapper[4760]: I0930 07:52:44.852856 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/eaa10928-35f2-46a8-82e2-b6569a81187d-run-httpd\") pod \"eaa10928-35f2-46a8-82e2-b6569a81187d\" (UID: \"eaa10928-35f2-46a8-82e2-b6569a81187d\") " Sep 30 07:52:44 crc kubenswrapper[4760]: I0930 07:52:44.852920 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ck9pm\" (UniqueName: \"kubernetes.io/projected/eaa10928-35f2-46a8-82e2-b6569a81187d-kube-api-access-ck9pm\") pod \"eaa10928-35f2-46a8-82e2-b6569a81187d\" (UID: \"eaa10928-35f2-46a8-82e2-b6569a81187d\") " Sep 30 07:52:44 crc kubenswrapper[4760]: I0930 07:52:44.852969 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eaa10928-35f2-46a8-82e2-b6569a81187d-config-data\") pod \"eaa10928-35f2-46a8-82e2-b6569a81187d\" (UID: \"eaa10928-35f2-46a8-82e2-b6569a81187d\") " Sep 30 07:52:44 crc kubenswrapper[4760]: I0930 07:52:44.853009 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/eaa10928-35f2-46a8-82e2-b6569a81187d-combined-ca-bundle\") pod \"eaa10928-35f2-46a8-82e2-b6569a81187d\" (UID: \"eaa10928-35f2-46a8-82e2-b6569a81187d\") " Sep 30 07:52:44 crc kubenswrapper[4760]: I0930 07:52:44.853062 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/eaa10928-35f2-46a8-82e2-b6569a81187d-sg-core-conf-yaml\") pod \"eaa10928-35f2-46a8-82e2-b6569a81187d\" (UID: \"eaa10928-35f2-46a8-82e2-b6569a81187d\") " Sep 30 07:52:44 crc kubenswrapper[4760]: I0930 07:52:44.853277 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/eaa10928-35f2-46a8-82e2-b6569a81187d-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "eaa10928-35f2-46a8-82e2-b6569a81187d" (UID: "eaa10928-35f2-46a8-82e2-b6569a81187d"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 07:52:44 crc kubenswrapper[4760]: I0930 07:52:44.853334 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/eaa10928-35f2-46a8-82e2-b6569a81187d-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "eaa10928-35f2-46a8-82e2-b6569a81187d" (UID: "eaa10928-35f2-46a8-82e2-b6569a81187d"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 07:52:44 crc kubenswrapper[4760]: I0930 07:52:44.853888 4760 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/eaa10928-35f2-46a8-82e2-b6569a81187d-run-httpd\") on node \"crc\" DevicePath \"\"" Sep 30 07:52:44 crc kubenswrapper[4760]: I0930 07:52:44.853907 4760 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/eaa10928-35f2-46a8-82e2-b6569a81187d-log-httpd\") on node \"crc\" DevicePath \"\"" Sep 30 07:52:44 crc kubenswrapper[4760]: I0930 07:52:44.859280 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eaa10928-35f2-46a8-82e2-b6569a81187d-scripts" (OuterVolumeSpecName: "scripts") pod "eaa10928-35f2-46a8-82e2-b6569a81187d" (UID: "eaa10928-35f2-46a8-82e2-b6569a81187d"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 07:52:44 crc kubenswrapper[4760]: I0930 07:52:44.876800 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/eaa10928-35f2-46a8-82e2-b6569a81187d-kube-api-access-ck9pm" (OuterVolumeSpecName: "kube-api-access-ck9pm") pod "eaa10928-35f2-46a8-82e2-b6569a81187d" (UID: "eaa10928-35f2-46a8-82e2-b6569a81187d"). InnerVolumeSpecName "kube-api-access-ck9pm". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 07:52:44 crc kubenswrapper[4760]: I0930 07:52:44.899499 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eaa10928-35f2-46a8-82e2-b6569a81187d-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "eaa10928-35f2-46a8-82e2-b6569a81187d" (UID: "eaa10928-35f2-46a8-82e2-b6569a81187d"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 07:52:44 crc kubenswrapper[4760]: I0930 07:52:44.955963 4760 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/eaa10928-35f2-46a8-82e2-b6569a81187d-scripts\") on node \"crc\" DevicePath \"\"" Sep 30 07:52:44 crc kubenswrapper[4760]: I0930 07:52:44.956003 4760 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ck9pm\" (UniqueName: \"kubernetes.io/projected/eaa10928-35f2-46a8-82e2-b6569a81187d-kube-api-access-ck9pm\") on node \"crc\" DevicePath \"\"" Sep 30 07:52:44 crc kubenswrapper[4760]: I0930 07:52:44.956013 4760 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/eaa10928-35f2-46a8-82e2-b6569a81187d-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Sep 30 07:52:44 crc kubenswrapper[4760]: I0930 07:52:44.974017 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eaa10928-35f2-46a8-82e2-b6569a81187d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "eaa10928-35f2-46a8-82e2-b6569a81187d" (UID: "eaa10928-35f2-46a8-82e2-b6569a81187d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 07:52:44 crc kubenswrapper[4760]: I0930 07:52:44.984442 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eaa10928-35f2-46a8-82e2-b6569a81187d-config-data" (OuterVolumeSpecName: "config-data") pod "eaa10928-35f2-46a8-82e2-b6569a81187d" (UID: "eaa10928-35f2-46a8-82e2-b6569a81187d"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 07:52:45 crc kubenswrapper[4760]: I0930 07:52:45.054640 4760 generic.go:334] "Generic (PLEG): container finished" podID="eaa10928-35f2-46a8-82e2-b6569a81187d" containerID="8ad2037f68588817853afa08aec9a4e75fd5f1270a2fe780394aa365a27b5a75" exitCode=0 Sep 30 07:52:45 crc kubenswrapper[4760]: I0930 07:52:45.054668 4760 generic.go:334] "Generic (PLEG): container finished" podID="eaa10928-35f2-46a8-82e2-b6569a81187d" containerID="0b7daf12868ebf9d853e0914a69d76d36dd87684c2dd01f4ab43ba3d656af8f8" exitCode=0 Sep 30 07:52:45 crc kubenswrapper[4760]: I0930 07:52:45.054698 4760 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Sep 30 07:52:45 crc kubenswrapper[4760]: I0930 07:52:45.054750 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"eaa10928-35f2-46a8-82e2-b6569a81187d","Type":"ContainerDied","Data":"8ad2037f68588817853afa08aec9a4e75fd5f1270a2fe780394aa365a27b5a75"} Sep 30 07:52:45 crc kubenswrapper[4760]: I0930 07:52:45.054818 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"eaa10928-35f2-46a8-82e2-b6569a81187d","Type":"ContainerDied","Data":"0b7daf12868ebf9d853e0914a69d76d36dd87684c2dd01f4ab43ba3d656af8f8"} Sep 30 07:52:45 crc kubenswrapper[4760]: I0930 07:52:45.054831 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"eaa10928-35f2-46a8-82e2-b6569a81187d","Type":"ContainerDied","Data":"79acad6e9216cec8637889a3750e606d0ecb648ab63484ed37662c246f1169d5"} Sep 30 07:52:45 crc kubenswrapper[4760]: I0930 07:52:45.054852 4760 scope.go:117] "RemoveContainer" containerID="1773ca8a16bacd865cd3b9aadd12895a310b3ea1cef9d4618ee501f943a6d75c" Sep 30 07:52:45 crc kubenswrapper[4760]: I0930 07:52:45.058363 4760 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/eaa10928-35f2-46a8-82e2-b6569a81187d-config-data\") on node \"crc\" DevicePath \"\"" Sep 30 07:52:45 crc kubenswrapper[4760]: I0930 07:52:45.058426 4760 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eaa10928-35f2-46a8-82e2-b6569a81187d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 07:52:45 crc kubenswrapper[4760]: I0930 07:52:45.101403 4760 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Sep 30 07:52:45 crc kubenswrapper[4760]: I0930 07:52:45.117968 4760 scope.go:117] "RemoveContainer" containerID="72d895c026a810a814b8f0274f0762c27f77f307451a38e805851d97c6a3b4cb" Sep 30 07:52:45 crc kubenswrapper[4760]: I0930 07:52:45.118071 4760 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Sep 30 07:52:45 crc kubenswrapper[4760]: I0930 07:52:45.134763 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Sep 30 07:52:45 crc kubenswrapper[4760]: E0930 07:52:45.135775 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eaa10928-35f2-46a8-82e2-b6569a81187d" containerName="ceilometer-notification-agent" Sep 30 07:52:45 crc kubenswrapper[4760]: I0930 07:52:45.135792 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="eaa10928-35f2-46a8-82e2-b6569a81187d" containerName="ceilometer-notification-agent" Sep 30 07:52:45 crc kubenswrapper[4760]: E0930 07:52:45.135832 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eaa10928-35f2-46a8-82e2-b6569a81187d" containerName="ceilometer-central-agent" Sep 30 07:52:45 crc kubenswrapper[4760]: I0930 07:52:45.135840 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="eaa10928-35f2-46a8-82e2-b6569a81187d" containerName="ceilometer-central-agent" Sep 30 07:52:45 crc kubenswrapper[4760]: E0930 07:52:45.135856 4760 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="eaa10928-35f2-46a8-82e2-b6569a81187d" containerName="proxy-httpd" Sep 30 07:52:45 crc kubenswrapper[4760]: I0930 07:52:45.135862 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="eaa10928-35f2-46a8-82e2-b6569a81187d" containerName="proxy-httpd" Sep 30 07:52:45 crc kubenswrapper[4760]: E0930 07:52:45.135879 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eaa10928-35f2-46a8-82e2-b6569a81187d" containerName="sg-core" Sep 30 07:52:45 crc kubenswrapper[4760]: I0930 07:52:45.135885 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="eaa10928-35f2-46a8-82e2-b6569a81187d" containerName="sg-core" Sep 30 07:52:45 crc kubenswrapper[4760]: I0930 07:52:45.137670 4760 memory_manager.go:354] "RemoveStaleState removing state" podUID="eaa10928-35f2-46a8-82e2-b6569a81187d" containerName="proxy-httpd" Sep 30 07:52:45 crc kubenswrapper[4760]: I0930 07:52:45.137701 4760 memory_manager.go:354] "RemoveStaleState removing state" podUID="eaa10928-35f2-46a8-82e2-b6569a81187d" containerName="ceilometer-notification-agent" Sep 30 07:52:45 crc kubenswrapper[4760]: I0930 07:52:45.137711 4760 memory_manager.go:354] "RemoveStaleState removing state" podUID="eaa10928-35f2-46a8-82e2-b6569a81187d" containerName="sg-core" Sep 30 07:52:45 crc kubenswrapper[4760]: I0930 07:52:45.137736 4760 memory_manager.go:354] "RemoveStaleState removing state" podUID="eaa10928-35f2-46a8-82e2-b6569a81187d" containerName="ceilometer-central-agent" Sep 30 07:52:45 crc kubenswrapper[4760]: I0930 07:52:45.152142 4760 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Sep 30 07:52:45 crc kubenswrapper[4760]: I0930 07:52:45.156073 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Sep 30 07:52:45 crc kubenswrapper[4760]: I0930 07:52:45.159652 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Sep 30 07:52:45 crc kubenswrapper[4760]: I0930 07:52:45.160097 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Sep 30 07:52:45 crc kubenswrapper[4760]: I0930 07:52:45.168758 4760 scope.go:117] "RemoveContainer" containerID="8ad2037f68588817853afa08aec9a4e75fd5f1270a2fe780394aa365a27b5a75" Sep 30 07:52:45 crc kubenswrapper[4760]: I0930 07:52:45.189382 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Sep 30 07:52:45 crc kubenswrapper[4760]: I0930 07:52:45.263517 4760 scope.go:117] "RemoveContainer" containerID="0b7daf12868ebf9d853e0914a69d76d36dd87684c2dd01f4ab43ba3d656af8f8" Sep 30 07:52:45 crc kubenswrapper[4760]: I0930 07:52:45.264647 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/0ce11abc-b049-4106-b431-7cb6d620a2f3-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"0ce11abc-b049-4106-b431-7cb6d620a2f3\") " pod="openstack/ceilometer-0" Sep 30 07:52:45 crc kubenswrapper[4760]: I0930 07:52:45.264679 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/0ce11abc-b049-4106-b431-7cb6d620a2f3-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"0ce11abc-b049-4106-b431-7cb6d620a2f3\") " pod="openstack/ceilometer-0" Sep 30 07:52:45 crc kubenswrapper[4760]: I0930 07:52:45.264722 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0ce11abc-b049-4106-b431-7cb6d620a2f3-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"0ce11abc-b049-4106-b431-7cb6d620a2f3\") " pod="openstack/ceilometer-0" Sep 30 07:52:45 crc kubenswrapper[4760]: I0930 07:52:45.264740 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0ce11abc-b049-4106-b431-7cb6d620a2f3-log-httpd\") pod \"ceilometer-0\" (UID: \"0ce11abc-b049-4106-b431-7cb6d620a2f3\") " pod="openstack/ceilometer-0" Sep 30 07:52:45 crc kubenswrapper[4760]: I0930 07:52:45.264780 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0ce11abc-b049-4106-b431-7cb6d620a2f3-scripts\") pod \"ceilometer-0\" (UID: \"0ce11abc-b049-4106-b431-7cb6d620a2f3\") " pod="openstack/ceilometer-0" Sep 30 07:52:45 crc kubenswrapper[4760]: I0930 07:52:45.264815 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0ce11abc-b049-4106-b431-7cb6d620a2f3-run-httpd\") pod \"ceilometer-0\" (UID: \"0ce11abc-b049-4106-b431-7cb6d620a2f3\") " pod="openstack/ceilometer-0" Sep 30 07:52:45 crc kubenswrapper[4760]: I0930 07:52:45.264888 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jqfg8\" (UniqueName: \"kubernetes.io/projected/0ce11abc-b049-4106-b431-7cb6d620a2f3-kube-api-access-jqfg8\") pod \"ceilometer-0\" (UID: \"0ce11abc-b049-4106-b431-7cb6d620a2f3\") " pod="openstack/ceilometer-0" Sep 30 07:52:45 crc kubenswrapper[4760]: I0930 07:52:45.264902 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0ce11abc-b049-4106-b431-7cb6d620a2f3-config-data\") pod \"ceilometer-0\" (UID: 
\"0ce11abc-b049-4106-b431-7cb6d620a2f3\") " pod="openstack/ceilometer-0" Sep 30 07:52:45 crc kubenswrapper[4760]: I0930 07:52:45.297349 4760 scope.go:117] "RemoveContainer" containerID="1773ca8a16bacd865cd3b9aadd12895a310b3ea1cef9d4618ee501f943a6d75c" Sep 30 07:52:45 crc kubenswrapper[4760]: E0930 07:52:45.301812 4760 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1773ca8a16bacd865cd3b9aadd12895a310b3ea1cef9d4618ee501f943a6d75c\": container with ID starting with 1773ca8a16bacd865cd3b9aadd12895a310b3ea1cef9d4618ee501f943a6d75c not found: ID does not exist" containerID="1773ca8a16bacd865cd3b9aadd12895a310b3ea1cef9d4618ee501f943a6d75c" Sep 30 07:52:45 crc kubenswrapper[4760]: I0930 07:52:45.301854 4760 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1773ca8a16bacd865cd3b9aadd12895a310b3ea1cef9d4618ee501f943a6d75c"} err="failed to get container status \"1773ca8a16bacd865cd3b9aadd12895a310b3ea1cef9d4618ee501f943a6d75c\": rpc error: code = NotFound desc = could not find container \"1773ca8a16bacd865cd3b9aadd12895a310b3ea1cef9d4618ee501f943a6d75c\": container with ID starting with 1773ca8a16bacd865cd3b9aadd12895a310b3ea1cef9d4618ee501f943a6d75c not found: ID does not exist" Sep 30 07:52:45 crc kubenswrapper[4760]: I0930 07:52:45.301879 4760 scope.go:117] "RemoveContainer" containerID="72d895c026a810a814b8f0274f0762c27f77f307451a38e805851d97c6a3b4cb" Sep 30 07:52:45 crc kubenswrapper[4760]: E0930 07:52:45.302324 4760 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"72d895c026a810a814b8f0274f0762c27f77f307451a38e805851d97c6a3b4cb\": container with ID starting with 72d895c026a810a814b8f0274f0762c27f77f307451a38e805851d97c6a3b4cb not found: ID does not exist" containerID="72d895c026a810a814b8f0274f0762c27f77f307451a38e805851d97c6a3b4cb" Sep 30 07:52:45 crc kubenswrapper[4760]: 
I0930 07:52:45.302341 4760 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"72d895c026a810a814b8f0274f0762c27f77f307451a38e805851d97c6a3b4cb"} err="failed to get container status \"72d895c026a810a814b8f0274f0762c27f77f307451a38e805851d97c6a3b4cb\": rpc error: code = NotFound desc = could not find container \"72d895c026a810a814b8f0274f0762c27f77f307451a38e805851d97c6a3b4cb\": container with ID starting with 72d895c026a810a814b8f0274f0762c27f77f307451a38e805851d97c6a3b4cb not found: ID does not exist" Sep 30 07:52:45 crc kubenswrapper[4760]: I0930 07:52:45.302352 4760 scope.go:117] "RemoveContainer" containerID="8ad2037f68588817853afa08aec9a4e75fd5f1270a2fe780394aa365a27b5a75" Sep 30 07:52:45 crc kubenswrapper[4760]: E0930 07:52:45.302590 4760 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8ad2037f68588817853afa08aec9a4e75fd5f1270a2fe780394aa365a27b5a75\": container with ID starting with 8ad2037f68588817853afa08aec9a4e75fd5f1270a2fe780394aa365a27b5a75 not found: ID does not exist" containerID="8ad2037f68588817853afa08aec9a4e75fd5f1270a2fe780394aa365a27b5a75" Sep 30 07:52:45 crc kubenswrapper[4760]: I0930 07:52:45.302605 4760 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8ad2037f68588817853afa08aec9a4e75fd5f1270a2fe780394aa365a27b5a75"} err="failed to get container status \"8ad2037f68588817853afa08aec9a4e75fd5f1270a2fe780394aa365a27b5a75\": rpc error: code = NotFound desc = could not find container \"8ad2037f68588817853afa08aec9a4e75fd5f1270a2fe780394aa365a27b5a75\": container with ID starting with 8ad2037f68588817853afa08aec9a4e75fd5f1270a2fe780394aa365a27b5a75 not found: ID does not exist" Sep 30 07:52:45 crc kubenswrapper[4760]: I0930 07:52:45.302617 4760 scope.go:117] "RemoveContainer" containerID="0b7daf12868ebf9d853e0914a69d76d36dd87684c2dd01f4ab43ba3d656af8f8" Sep 30 07:52:45 crc 
kubenswrapper[4760]: E0930 07:52:45.302821 4760 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0b7daf12868ebf9d853e0914a69d76d36dd87684c2dd01f4ab43ba3d656af8f8\": container with ID starting with 0b7daf12868ebf9d853e0914a69d76d36dd87684c2dd01f4ab43ba3d656af8f8 not found: ID does not exist" containerID="0b7daf12868ebf9d853e0914a69d76d36dd87684c2dd01f4ab43ba3d656af8f8" Sep 30 07:52:45 crc kubenswrapper[4760]: I0930 07:52:45.302852 4760 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0b7daf12868ebf9d853e0914a69d76d36dd87684c2dd01f4ab43ba3d656af8f8"} err="failed to get container status \"0b7daf12868ebf9d853e0914a69d76d36dd87684c2dd01f4ab43ba3d656af8f8\": rpc error: code = NotFound desc = could not find container \"0b7daf12868ebf9d853e0914a69d76d36dd87684c2dd01f4ab43ba3d656af8f8\": container with ID starting with 0b7daf12868ebf9d853e0914a69d76d36dd87684c2dd01f4ab43ba3d656af8f8 not found: ID does not exist" Sep 30 07:52:45 crc kubenswrapper[4760]: I0930 07:52:45.302864 4760 scope.go:117] "RemoveContainer" containerID="1773ca8a16bacd865cd3b9aadd12895a310b3ea1cef9d4618ee501f943a6d75c" Sep 30 07:52:45 crc kubenswrapper[4760]: I0930 07:52:45.303098 4760 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1773ca8a16bacd865cd3b9aadd12895a310b3ea1cef9d4618ee501f943a6d75c"} err="failed to get container status \"1773ca8a16bacd865cd3b9aadd12895a310b3ea1cef9d4618ee501f943a6d75c\": rpc error: code = NotFound desc = could not find container \"1773ca8a16bacd865cd3b9aadd12895a310b3ea1cef9d4618ee501f943a6d75c\": container with ID starting with 1773ca8a16bacd865cd3b9aadd12895a310b3ea1cef9d4618ee501f943a6d75c not found: ID does not exist" Sep 30 07:52:45 crc kubenswrapper[4760]: I0930 07:52:45.303110 4760 scope.go:117] "RemoveContainer" containerID="72d895c026a810a814b8f0274f0762c27f77f307451a38e805851d97c6a3b4cb" Sep 30 
07:52:45 crc kubenswrapper[4760]: I0930 07:52:45.303411 4760 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"72d895c026a810a814b8f0274f0762c27f77f307451a38e805851d97c6a3b4cb"} err="failed to get container status \"72d895c026a810a814b8f0274f0762c27f77f307451a38e805851d97c6a3b4cb\": rpc error: code = NotFound desc = could not find container \"72d895c026a810a814b8f0274f0762c27f77f307451a38e805851d97c6a3b4cb\": container with ID starting with 72d895c026a810a814b8f0274f0762c27f77f307451a38e805851d97c6a3b4cb not found: ID does not exist" Sep 30 07:52:45 crc kubenswrapper[4760]: I0930 07:52:45.303424 4760 scope.go:117] "RemoveContainer" containerID="8ad2037f68588817853afa08aec9a4e75fd5f1270a2fe780394aa365a27b5a75" Sep 30 07:52:45 crc kubenswrapper[4760]: I0930 07:52:45.303603 4760 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8ad2037f68588817853afa08aec9a4e75fd5f1270a2fe780394aa365a27b5a75"} err="failed to get container status \"8ad2037f68588817853afa08aec9a4e75fd5f1270a2fe780394aa365a27b5a75\": rpc error: code = NotFound desc = could not find container \"8ad2037f68588817853afa08aec9a4e75fd5f1270a2fe780394aa365a27b5a75\": container with ID starting with 8ad2037f68588817853afa08aec9a4e75fd5f1270a2fe780394aa365a27b5a75 not found: ID does not exist" Sep 30 07:52:45 crc kubenswrapper[4760]: I0930 07:52:45.303615 4760 scope.go:117] "RemoveContainer" containerID="0b7daf12868ebf9d853e0914a69d76d36dd87684c2dd01f4ab43ba3d656af8f8" Sep 30 07:52:45 crc kubenswrapper[4760]: I0930 07:52:45.303836 4760 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0b7daf12868ebf9d853e0914a69d76d36dd87684c2dd01f4ab43ba3d656af8f8"} err="failed to get container status \"0b7daf12868ebf9d853e0914a69d76d36dd87684c2dd01f4ab43ba3d656af8f8\": rpc error: code = NotFound desc = could not find container 
\"0b7daf12868ebf9d853e0914a69d76d36dd87684c2dd01f4ab43ba3d656af8f8\": container with ID starting with 0b7daf12868ebf9d853e0914a69d76d36dd87684c2dd01f4ab43ba3d656af8f8 not found: ID does not exist" Sep 30 07:52:45 crc kubenswrapper[4760]: I0930 07:52:45.366739 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0ce11abc-b049-4106-b431-7cb6d620a2f3-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"0ce11abc-b049-4106-b431-7cb6d620a2f3\") " pod="openstack/ceilometer-0" Sep 30 07:52:45 crc kubenswrapper[4760]: I0930 07:52:45.366775 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0ce11abc-b049-4106-b431-7cb6d620a2f3-log-httpd\") pod \"ceilometer-0\" (UID: \"0ce11abc-b049-4106-b431-7cb6d620a2f3\") " pod="openstack/ceilometer-0" Sep 30 07:52:45 crc kubenswrapper[4760]: I0930 07:52:45.366822 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0ce11abc-b049-4106-b431-7cb6d620a2f3-scripts\") pod \"ceilometer-0\" (UID: \"0ce11abc-b049-4106-b431-7cb6d620a2f3\") " pod="openstack/ceilometer-0" Sep 30 07:52:45 crc kubenswrapper[4760]: I0930 07:52:45.366859 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0ce11abc-b049-4106-b431-7cb6d620a2f3-run-httpd\") pod \"ceilometer-0\" (UID: \"0ce11abc-b049-4106-b431-7cb6d620a2f3\") " pod="openstack/ceilometer-0" Sep 30 07:52:45 crc kubenswrapper[4760]: I0930 07:52:45.366936 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0ce11abc-b049-4106-b431-7cb6d620a2f3-config-data\") pod \"ceilometer-0\" (UID: \"0ce11abc-b049-4106-b431-7cb6d620a2f3\") " pod="openstack/ceilometer-0" Sep 30 07:52:45 crc kubenswrapper[4760]: I0930 
07:52:45.366954 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jqfg8\" (UniqueName: \"kubernetes.io/projected/0ce11abc-b049-4106-b431-7cb6d620a2f3-kube-api-access-jqfg8\") pod \"ceilometer-0\" (UID: \"0ce11abc-b049-4106-b431-7cb6d620a2f3\") " pod="openstack/ceilometer-0" Sep 30 07:52:45 crc kubenswrapper[4760]: I0930 07:52:45.366981 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/0ce11abc-b049-4106-b431-7cb6d620a2f3-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"0ce11abc-b049-4106-b431-7cb6d620a2f3\") " pod="openstack/ceilometer-0" Sep 30 07:52:45 crc kubenswrapper[4760]: I0930 07:52:45.366997 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/0ce11abc-b049-4106-b431-7cb6d620a2f3-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"0ce11abc-b049-4106-b431-7cb6d620a2f3\") " pod="openstack/ceilometer-0" Sep 30 07:52:45 crc kubenswrapper[4760]: I0930 07:52:45.367502 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0ce11abc-b049-4106-b431-7cb6d620a2f3-log-httpd\") pod \"ceilometer-0\" (UID: \"0ce11abc-b049-4106-b431-7cb6d620a2f3\") " pod="openstack/ceilometer-0" Sep 30 07:52:45 crc kubenswrapper[4760]: I0930 07:52:45.367600 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0ce11abc-b049-4106-b431-7cb6d620a2f3-run-httpd\") pod \"ceilometer-0\" (UID: \"0ce11abc-b049-4106-b431-7cb6d620a2f3\") " pod="openstack/ceilometer-0" Sep 30 07:52:45 crc kubenswrapper[4760]: I0930 07:52:45.372337 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/0ce11abc-b049-4106-b431-7cb6d620a2f3-sg-core-conf-yaml\") pod 
\"ceilometer-0\" (UID: \"0ce11abc-b049-4106-b431-7cb6d620a2f3\") " pod="openstack/ceilometer-0" Sep 30 07:52:45 crc kubenswrapper[4760]: I0930 07:52:45.373238 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0ce11abc-b049-4106-b431-7cb6d620a2f3-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"0ce11abc-b049-4106-b431-7cb6d620a2f3\") " pod="openstack/ceilometer-0" Sep 30 07:52:45 crc kubenswrapper[4760]: I0930 07:52:45.384229 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/0ce11abc-b049-4106-b431-7cb6d620a2f3-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"0ce11abc-b049-4106-b431-7cb6d620a2f3\") " pod="openstack/ceilometer-0" Sep 30 07:52:45 crc kubenswrapper[4760]: I0930 07:52:45.384246 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0ce11abc-b049-4106-b431-7cb6d620a2f3-scripts\") pod \"ceilometer-0\" (UID: \"0ce11abc-b049-4106-b431-7cb6d620a2f3\") " pod="openstack/ceilometer-0" Sep 30 07:52:45 crc kubenswrapper[4760]: I0930 07:52:45.384871 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0ce11abc-b049-4106-b431-7cb6d620a2f3-config-data\") pod \"ceilometer-0\" (UID: \"0ce11abc-b049-4106-b431-7cb6d620a2f3\") " pod="openstack/ceilometer-0" Sep 30 07:52:45 crc kubenswrapper[4760]: I0930 07:52:45.386951 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jqfg8\" (UniqueName: \"kubernetes.io/projected/0ce11abc-b049-4106-b431-7cb6d620a2f3-kube-api-access-jqfg8\") pod \"ceilometer-0\" (UID: \"0ce11abc-b049-4106-b431-7cb6d620a2f3\") " pod="openstack/ceilometer-0" Sep 30 07:52:45 crc kubenswrapper[4760]: I0930 07:52:45.556888 4760 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Sep 30 07:52:45 crc kubenswrapper[4760]: I0930 07:52:45.687251 4760 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Sep 30 07:52:45 crc kubenswrapper[4760]: I0930 07:52:45.774834 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n5n2v\" (UniqueName: \"kubernetes.io/projected/f0976286-8bdd-43e7-894c-3e899a323ee7-kube-api-access-n5n2v\") pod \"f0976286-8bdd-43e7-894c-3e899a323ee7\" (UID: \"f0976286-8bdd-43e7-894c-3e899a323ee7\") " Sep 30 07:52:45 crc kubenswrapper[4760]: I0930 07:52:45.775384 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f0976286-8bdd-43e7-894c-3e899a323ee7-combined-ca-bundle\") pod \"f0976286-8bdd-43e7-894c-3e899a323ee7\" (UID: \"f0976286-8bdd-43e7-894c-3e899a323ee7\") " Sep 30 07:52:45 crc kubenswrapper[4760]: I0930 07:52:45.775419 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f0976286-8bdd-43e7-894c-3e899a323ee7-config-data\") pod \"f0976286-8bdd-43e7-894c-3e899a323ee7\" (UID: \"f0976286-8bdd-43e7-894c-3e899a323ee7\") " Sep 30 07:52:45 crc kubenswrapper[4760]: I0930 07:52:45.816599 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f0976286-8bdd-43e7-894c-3e899a323ee7-kube-api-access-n5n2v" (OuterVolumeSpecName: "kube-api-access-n5n2v") pod "f0976286-8bdd-43e7-894c-3e899a323ee7" (UID: "f0976286-8bdd-43e7-894c-3e899a323ee7"). InnerVolumeSpecName "kube-api-access-n5n2v". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 07:52:45 crc kubenswrapper[4760]: I0930 07:52:45.840659 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f0976286-8bdd-43e7-894c-3e899a323ee7-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f0976286-8bdd-43e7-894c-3e899a323ee7" (UID: "f0976286-8bdd-43e7-894c-3e899a323ee7"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 07:52:45 crc kubenswrapper[4760]: I0930 07:52:45.862351 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f0976286-8bdd-43e7-894c-3e899a323ee7-config-data" (OuterVolumeSpecName: "config-data") pod "f0976286-8bdd-43e7-894c-3e899a323ee7" (UID: "f0976286-8bdd-43e7-894c-3e899a323ee7"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 07:52:45 crc kubenswrapper[4760]: I0930 07:52:45.878134 4760 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f0976286-8bdd-43e7-894c-3e899a323ee7-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 07:52:45 crc kubenswrapper[4760]: I0930 07:52:45.878161 4760 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f0976286-8bdd-43e7-894c-3e899a323ee7-config-data\") on node \"crc\" DevicePath \"\"" Sep 30 07:52:45 crc kubenswrapper[4760]: I0930 07:52:45.878172 4760 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n5n2v\" (UniqueName: \"kubernetes.io/projected/f0976286-8bdd-43e7-894c-3e899a323ee7-kube-api-access-n5n2v\") on node \"crc\" DevicePath \"\"" Sep 30 07:52:46 crc kubenswrapper[4760]: I0930 07:52:46.068533 4760 generic.go:334] "Generic (PLEG): container finished" podID="f0976286-8bdd-43e7-894c-3e899a323ee7" containerID="1d3d1f847bbdb09ebb495e4366300be15fc716ea055c4f1090d869713fd4a638" 
exitCode=0 Sep 30 07:52:46 crc kubenswrapper[4760]: I0930 07:52:46.068575 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"f0976286-8bdd-43e7-894c-3e899a323ee7","Type":"ContainerDied","Data":"1d3d1f847bbdb09ebb495e4366300be15fc716ea055c4f1090d869713fd4a638"} Sep 30 07:52:46 crc kubenswrapper[4760]: I0930 07:52:46.068604 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"f0976286-8bdd-43e7-894c-3e899a323ee7","Type":"ContainerDied","Data":"bf39715a3c6ae59463830e381824b5489226d4113d86548f053ed6cc39c6f69d"} Sep 30 07:52:46 crc kubenswrapper[4760]: I0930 07:52:46.068626 4760 scope.go:117] "RemoveContainer" containerID="1d3d1f847bbdb09ebb495e4366300be15fc716ea055c4f1090d869713fd4a638" Sep 30 07:52:46 crc kubenswrapper[4760]: I0930 07:52:46.068686 4760 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Sep 30 07:52:46 crc kubenswrapper[4760]: I0930 07:52:46.094692 4760 scope.go:117] "RemoveContainer" containerID="1d3d1f847bbdb09ebb495e4366300be15fc716ea055c4f1090d869713fd4a638" Sep 30 07:52:46 crc kubenswrapper[4760]: E0930 07:52:46.098266 4760 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1d3d1f847bbdb09ebb495e4366300be15fc716ea055c4f1090d869713fd4a638\": container with ID starting with 1d3d1f847bbdb09ebb495e4366300be15fc716ea055c4f1090d869713fd4a638 not found: ID does not exist" containerID="1d3d1f847bbdb09ebb495e4366300be15fc716ea055c4f1090d869713fd4a638" Sep 30 07:52:46 crc kubenswrapper[4760]: I0930 07:52:46.098345 4760 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1d3d1f847bbdb09ebb495e4366300be15fc716ea055c4f1090d869713fd4a638"} err="failed to get container status \"1d3d1f847bbdb09ebb495e4366300be15fc716ea055c4f1090d869713fd4a638\": rpc error: code = NotFound desc = could not find 
container \"1d3d1f847bbdb09ebb495e4366300be15fc716ea055c4f1090d869713fd4a638\": container with ID starting with 1d3d1f847bbdb09ebb495e4366300be15fc716ea055c4f1090d869713fd4a638 not found: ID does not exist" Sep 30 07:52:46 crc kubenswrapper[4760]: I0930 07:52:46.127469 4760 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Sep 30 07:52:46 crc kubenswrapper[4760]: I0930 07:52:46.138996 4760 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Sep 30 07:52:46 crc kubenswrapper[4760]: I0930 07:52:46.153484 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Sep 30 07:52:46 crc kubenswrapper[4760]: E0930 07:52:46.154073 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f0976286-8bdd-43e7-894c-3e899a323ee7" containerName="nova-scheduler-scheduler" Sep 30 07:52:46 crc kubenswrapper[4760]: I0930 07:52:46.154098 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="f0976286-8bdd-43e7-894c-3e899a323ee7" containerName="nova-scheduler-scheduler" Sep 30 07:52:46 crc kubenswrapper[4760]: I0930 07:52:46.154382 4760 memory_manager.go:354] "RemoveStaleState removing state" podUID="f0976286-8bdd-43e7-894c-3e899a323ee7" containerName="nova-scheduler-scheduler" Sep 30 07:52:46 crc kubenswrapper[4760]: I0930 07:52:46.155181 4760 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Sep 30 07:52:46 crc kubenswrapper[4760]: I0930 07:52:46.161555 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Sep 30 07:52:46 crc kubenswrapper[4760]: I0930 07:52:46.170691 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Sep 30 07:52:46 crc kubenswrapper[4760]: I0930 07:52:46.235600 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Sep 30 07:52:46 crc kubenswrapper[4760]: I0930 07:52:46.284245 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6f8814ba-633a-440c-b866-98025a753fb1-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"6f8814ba-633a-440c-b866-98025a753fb1\") " pod="openstack/nova-scheduler-0" Sep 30 07:52:46 crc kubenswrapper[4760]: I0930 07:52:46.284390 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8mvc2\" (UniqueName: \"kubernetes.io/projected/6f8814ba-633a-440c-b866-98025a753fb1-kube-api-access-8mvc2\") pod \"nova-scheduler-0\" (UID: \"6f8814ba-633a-440c-b866-98025a753fb1\") " pod="openstack/nova-scheduler-0" Sep 30 07:52:46 crc kubenswrapper[4760]: I0930 07:52:46.284411 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6f8814ba-633a-440c-b866-98025a753fb1-config-data\") pod \"nova-scheduler-0\" (UID: \"6f8814ba-633a-440c-b866-98025a753fb1\") " pod="openstack/nova-scheduler-0" Sep 30 07:52:46 crc kubenswrapper[4760]: I0930 07:52:46.386251 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8mvc2\" (UniqueName: \"kubernetes.io/projected/6f8814ba-633a-440c-b866-98025a753fb1-kube-api-access-8mvc2\") pod \"nova-scheduler-0\" (UID: 
\"6f8814ba-633a-440c-b866-98025a753fb1\") " pod="openstack/nova-scheduler-0" Sep 30 07:52:46 crc kubenswrapper[4760]: I0930 07:52:46.386330 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6f8814ba-633a-440c-b866-98025a753fb1-config-data\") pod \"nova-scheduler-0\" (UID: \"6f8814ba-633a-440c-b866-98025a753fb1\") " pod="openstack/nova-scheduler-0" Sep 30 07:52:46 crc kubenswrapper[4760]: I0930 07:52:46.386453 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6f8814ba-633a-440c-b866-98025a753fb1-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"6f8814ba-633a-440c-b866-98025a753fb1\") " pod="openstack/nova-scheduler-0" Sep 30 07:52:46 crc kubenswrapper[4760]: I0930 07:52:46.392515 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6f8814ba-633a-440c-b866-98025a753fb1-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"6f8814ba-633a-440c-b866-98025a753fb1\") " pod="openstack/nova-scheduler-0" Sep 30 07:52:46 crc kubenswrapper[4760]: I0930 07:52:46.393029 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6f8814ba-633a-440c-b866-98025a753fb1-config-data\") pod \"nova-scheduler-0\" (UID: \"6f8814ba-633a-440c-b866-98025a753fb1\") " pod="openstack/nova-scheduler-0" Sep 30 07:52:46 crc kubenswrapper[4760]: I0930 07:52:46.418757 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8mvc2\" (UniqueName: \"kubernetes.io/projected/6f8814ba-633a-440c-b866-98025a753fb1-kube-api-access-8mvc2\") pod \"nova-scheduler-0\" (UID: \"6f8814ba-633a-440c-b866-98025a753fb1\") " pod="openstack/nova-scheduler-0" Sep 30 07:52:46 crc kubenswrapper[4760]: I0930 07:52:46.470501 4760 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Sep 30 07:52:47 crc kubenswrapper[4760]: I0930 07:52:47.022398 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Sep 30 07:52:47 crc kubenswrapper[4760]: I0930 07:52:47.099991 4760 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="eaa10928-35f2-46a8-82e2-b6569a81187d" path="/var/lib/kubelet/pods/eaa10928-35f2-46a8-82e2-b6569a81187d/volumes" Sep 30 07:52:47 crc kubenswrapper[4760]: I0930 07:52:47.101282 4760 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f0976286-8bdd-43e7-894c-3e899a323ee7" path="/var/lib/kubelet/pods/f0976286-8bdd-43e7-894c-3e899a323ee7/volumes" Sep 30 07:52:47 crc kubenswrapper[4760]: I0930 07:52:47.102051 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0ce11abc-b049-4106-b431-7cb6d620a2f3","Type":"ContainerStarted","Data":"af19916cab8a292013e130272970d4bf4e0d3924b353de3349aba95e7a0ac52d"} Sep 30 07:52:47 crc kubenswrapper[4760]: I0930 07:52:47.103319 4760 generic.go:334] "Generic (PLEG): container finished" podID="93652508-9731-4349-b689-3d7cda972bda" containerID="897c1c308c3fba3c7cfa72f91e90072ab60375c98b82465d97aec0982d58a466" exitCode=0 Sep 30 07:52:47 crc kubenswrapper[4760]: I0930 07:52:47.103380 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"93652508-9731-4349-b689-3d7cda972bda","Type":"ContainerDied","Data":"897c1c308c3fba3c7cfa72f91e90072ab60375c98b82465d97aec0982d58a466"} Sep 30 07:52:47 crc kubenswrapper[4760]: I0930 07:52:47.108705 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"6f8814ba-633a-440c-b866-98025a753fb1","Type":"ContainerStarted","Data":"2217fe83f9b931e95eb76f4ac62657717ded36c809318295414b722ce58d630e"} Sep 30 07:52:47 crc kubenswrapper[4760]: I0930 07:52:47.647177 4760 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Sep 30 07:52:47 crc kubenswrapper[4760]: I0930 07:52:47.713903 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/93652508-9731-4349-b689-3d7cda972bda-config-data\") pod \"93652508-9731-4349-b689-3d7cda972bda\" (UID: \"93652508-9731-4349-b689-3d7cda972bda\") " Sep 30 07:52:47 crc kubenswrapper[4760]: I0930 07:52:47.714012 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/93652508-9731-4349-b689-3d7cda972bda-logs\") pod \"93652508-9731-4349-b689-3d7cda972bda\" (UID: \"93652508-9731-4349-b689-3d7cda972bda\") " Sep 30 07:52:47 crc kubenswrapper[4760]: I0930 07:52:47.714176 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/93652508-9731-4349-b689-3d7cda972bda-combined-ca-bundle\") pod \"93652508-9731-4349-b689-3d7cda972bda\" (UID: \"93652508-9731-4349-b689-3d7cda972bda\") " Sep 30 07:52:47 crc kubenswrapper[4760]: I0930 07:52:47.714224 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lkbnp\" (UniqueName: \"kubernetes.io/projected/93652508-9731-4349-b689-3d7cda972bda-kube-api-access-lkbnp\") pod \"93652508-9731-4349-b689-3d7cda972bda\" (UID: \"93652508-9731-4349-b689-3d7cda972bda\") " Sep 30 07:52:47 crc kubenswrapper[4760]: I0930 07:52:47.714805 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/93652508-9731-4349-b689-3d7cda972bda-logs" (OuterVolumeSpecName: "logs") pod "93652508-9731-4349-b689-3d7cda972bda" (UID: "93652508-9731-4349-b689-3d7cda972bda"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 07:52:47 crc kubenswrapper[4760]: I0930 07:52:47.720042 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/93652508-9731-4349-b689-3d7cda972bda-kube-api-access-lkbnp" (OuterVolumeSpecName: "kube-api-access-lkbnp") pod "93652508-9731-4349-b689-3d7cda972bda" (UID: "93652508-9731-4349-b689-3d7cda972bda"). InnerVolumeSpecName "kube-api-access-lkbnp". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 07:52:47 crc kubenswrapper[4760]: I0930 07:52:47.745777 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/93652508-9731-4349-b689-3d7cda972bda-config-data" (OuterVolumeSpecName: "config-data") pod "93652508-9731-4349-b689-3d7cda972bda" (UID: "93652508-9731-4349-b689-3d7cda972bda"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 07:52:47 crc kubenswrapper[4760]: I0930 07:52:47.761222 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/93652508-9731-4349-b689-3d7cda972bda-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "93652508-9731-4349-b689-3d7cda972bda" (UID: "93652508-9731-4349-b689-3d7cda972bda"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 07:52:47 crc kubenswrapper[4760]: I0930 07:52:47.816720 4760 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/93652508-9731-4349-b689-3d7cda972bda-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 07:52:47 crc kubenswrapper[4760]: I0930 07:52:47.816766 4760 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lkbnp\" (UniqueName: \"kubernetes.io/projected/93652508-9731-4349-b689-3d7cda972bda-kube-api-access-lkbnp\") on node \"crc\" DevicePath \"\"" Sep 30 07:52:47 crc kubenswrapper[4760]: I0930 07:52:47.816779 4760 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/93652508-9731-4349-b689-3d7cda972bda-config-data\") on node \"crc\" DevicePath \"\"" Sep 30 07:52:47 crc kubenswrapper[4760]: I0930 07:52:47.816787 4760 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/93652508-9731-4349-b689-3d7cda972bda-logs\") on node \"crc\" DevicePath \"\"" Sep 30 07:52:48 crc kubenswrapper[4760]: I0930 07:52:48.124771 4760 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Sep 30 07:52:48 crc kubenswrapper[4760]: I0930 07:52:48.124781 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"93652508-9731-4349-b689-3d7cda972bda","Type":"ContainerDied","Data":"a75ba9a1c8b4a3580da473e55cf5e4264126e35640f9a972a09cdd9611a768c5"} Sep 30 07:52:48 crc kubenswrapper[4760]: I0930 07:52:48.125196 4760 scope.go:117] "RemoveContainer" containerID="897c1c308c3fba3c7cfa72f91e90072ab60375c98b82465d97aec0982d58a466" Sep 30 07:52:48 crc kubenswrapper[4760]: I0930 07:52:48.127291 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"6f8814ba-633a-440c-b866-98025a753fb1","Type":"ContainerStarted","Data":"04f40ee3c2eaa6a66fd5a6ed291a04d424bb7a6a2fdb158edfeb7c6e0045e8cf"} Sep 30 07:52:48 crc kubenswrapper[4760]: I0930 07:52:48.131615 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0ce11abc-b049-4106-b431-7cb6d620a2f3","Type":"ContainerStarted","Data":"02ba7e93747e249be16d4aa877f2f2c1f41b767ae79a05cc139c2571fb2e0073"} Sep 30 07:52:48 crc kubenswrapper[4760]: I0930 07:52:48.131651 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0ce11abc-b049-4106-b431-7cb6d620a2f3","Type":"ContainerStarted","Data":"fecbb134fb073373a1423db95dff2f1c1141a7db0ca2ef3eb7873949e18f9742"} Sep 30 07:52:48 crc kubenswrapper[4760]: I0930 07:52:48.148506 4760 scope.go:117] "RemoveContainer" containerID="50b30a83024ea4144118dad1a713f5a7170e709e15d89cbae2ff7f08087ab03a" Sep 30 07:52:48 crc kubenswrapper[4760]: I0930 07:52:48.204014 4760 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.203986011 podStartE2EDuration="2.203986011s" podCreationTimestamp="2025-09-30 07:52:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2025-09-30 07:52:48.150956028 +0000 UTC m=+1153.793862440" watchObservedRunningTime="2025-09-30 07:52:48.203986011 +0000 UTC m=+1153.846892433" Sep 30 07:52:48 crc kubenswrapper[4760]: I0930 07:52:48.227661 4760 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Sep 30 07:52:48 crc kubenswrapper[4760]: I0930 07:52:48.244845 4760 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Sep 30 07:52:48 crc kubenswrapper[4760]: I0930 07:52:48.252948 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Sep 30 07:52:48 crc kubenswrapper[4760]: E0930 07:52:48.253470 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="93652508-9731-4349-b689-3d7cda972bda" containerName="nova-api-api" Sep 30 07:52:48 crc kubenswrapper[4760]: I0930 07:52:48.253495 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="93652508-9731-4349-b689-3d7cda972bda" containerName="nova-api-api" Sep 30 07:52:48 crc kubenswrapper[4760]: E0930 07:52:48.253518 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="93652508-9731-4349-b689-3d7cda972bda" containerName="nova-api-log" Sep 30 07:52:48 crc kubenswrapper[4760]: I0930 07:52:48.253564 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="93652508-9731-4349-b689-3d7cda972bda" containerName="nova-api-log" Sep 30 07:52:48 crc kubenswrapper[4760]: I0930 07:52:48.253810 4760 memory_manager.go:354] "RemoveStaleState removing state" podUID="93652508-9731-4349-b689-3d7cda972bda" containerName="nova-api-api" Sep 30 07:52:48 crc kubenswrapper[4760]: I0930 07:52:48.253832 4760 memory_manager.go:354] "RemoveStaleState removing state" podUID="93652508-9731-4349-b689-3d7cda972bda" containerName="nova-api-log" Sep 30 07:52:48 crc kubenswrapper[4760]: I0930 07:52:48.255137 4760 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Sep 30 07:52:48 crc kubenswrapper[4760]: I0930 07:52:48.257821 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Sep 30 07:52:48 crc kubenswrapper[4760]: I0930 07:52:48.262501 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Sep 30 07:52:48 crc kubenswrapper[4760]: I0930 07:52:48.326414 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f5411630-4844-48cd-b1be-daebd1ee3e0e-logs\") pod \"nova-api-0\" (UID: \"f5411630-4844-48cd-b1be-daebd1ee3e0e\") " pod="openstack/nova-api-0" Sep 30 07:52:48 crc kubenswrapper[4760]: I0930 07:52:48.326457 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f5411630-4844-48cd-b1be-daebd1ee3e0e-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"f5411630-4844-48cd-b1be-daebd1ee3e0e\") " pod="openstack/nova-api-0" Sep 30 07:52:48 crc kubenswrapper[4760]: I0930 07:52:48.326490 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f5411630-4844-48cd-b1be-daebd1ee3e0e-config-data\") pod \"nova-api-0\" (UID: \"f5411630-4844-48cd-b1be-daebd1ee3e0e\") " pod="openstack/nova-api-0" Sep 30 07:52:48 crc kubenswrapper[4760]: I0930 07:52:48.326564 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5b7c4\" (UniqueName: \"kubernetes.io/projected/f5411630-4844-48cd-b1be-daebd1ee3e0e-kube-api-access-5b7c4\") pod \"nova-api-0\" (UID: \"f5411630-4844-48cd-b1be-daebd1ee3e0e\") " pod="openstack/nova-api-0" Sep 30 07:52:48 crc kubenswrapper[4760]: I0930 07:52:48.428630 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" 
(UniqueName: \"kubernetes.io/empty-dir/f5411630-4844-48cd-b1be-daebd1ee3e0e-logs\") pod \"nova-api-0\" (UID: \"f5411630-4844-48cd-b1be-daebd1ee3e0e\") " pod="openstack/nova-api-0" Sep 30 07:52:48 crc kubenswrapper[4760]: I0930 07:52:48.428676 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f5411630-4844-48cd-b1be-daebd1ee3e0e-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"f5411630-4844-48cd-b1be-daebd1ee3e0e\") " pod="openstack/nova-api-0" Sep 30 07:52:48 crc kubenswrapper[4760]: I0930 07:52:48.428703 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f5411630-4844-48cd-b1be-daebd1ee3e0e-config-data\") pod \"nova-api-0\" (UID: \"f5411630-4844-48cd-b1be-daebd1ee3e0e\") " pod="openstack/nova-api-0" Sep 30 07:52:48 crc kubenswrapper[4760]: I0930 07:52:48.428769 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5b7c4\" (UniqueName: \"kubernetes.io/projected/f5411630-4844-48cd-b1be-daebd1ee3e0e-kube-api-access-5b7c4\") pod \"nova-api-0\" (UID: \"f5411630-4844-48cd-b1be-daebd1ee3e0e\") " pod="openstack/nova-api-0" Sep 30 07:52:48 crc kubenswrapper[4760]: I0930 07:52:48.429413 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f5411630-4844-48cd-b1be-daebd1ee3e0e-logs\") pod \"nova-api-0\" (UID: \"f5411630-4844-48cd-b1be-daebd1ee3e0e\") " pod="openstack/nova-api-0" Sep 30 07:52:48 crc kubenswrapper[4760]: I0930 07:52:48.434345 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f5411630-4844-48cd-b1be-daebd1ee3e0e-config-data\") pod \"nova-api-0\" (UID: \"f5411630-4844-48cd-b1be-daebd1ee3e0e\") " pod="openstack/nova-api-0" Sep 30 07:52:48 crc kubenswrapper[4760]: I0930 07:52:48.436813 4760 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f5411630-4844-48cd-b1be-daebd1ee3e0e-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"f5411630-4844-48cd-b1be-daebd1ee3e0e\") " pod="openstack/nova-api-0" Sep 30 07:52:48 crc kubenswrapper[4760]: I0930 07:52:48.449579 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5b7c4\" (UniqueName: \"kubernetes.io/projected/f5411630-4844-48cd-b1be-daebd1ee3e0e-kube-api-access-5b7c4\") pod \"nova-api-0\" (UID: \"f5411630-4844-48cd-b1be-daebd1ee3e0e\") " pod="openstack/nova-api-0" Sep 30 07:52:48 crc kubenswrapper[4760]: I0930 07:52:48.575795 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Sep 30 07:52:49 crc kubenswrapper[4760]: I0930 07:52:49.022422 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Sep 30 07:52:49 crc kubenswrapper[4760]: I0930 07:52:49.092734 4760 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="93652508-9731-4349-b689-3d7cda972bda" path="/var/lib/kubelet/pods/93652508-9731-4349-b689-3d7cda972bda/volumes" Sep 30 07:52:49 crc kubenswrapper[4760]: I0930 07:52:49.144558 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"f5411630-4844-48cd-b1be-daebd1ee3e0e","Type":"ContainerStarted","Data":"0098ab2f333aada2629dd5464fb468f4308a0180759dc3bb1c44350f3b869cf3"} Sep 30 07:52:49 crc kubenswrapper[4760]: I0930 07:52:49.148534 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0ce11abc-b049-4106-b431-7cb6d620a2f3","Type":"ContainerStarted","Data":"024f0ff9f41ca6b0b8458be611a149892d4390d5ead651580c001b06a18012bd"} Sep 30 07:52:50 crc kubenswrapper[4760]: I0930 07:52:50.158778 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" 
event={"ID":"f5411630-4844-48cd-b1be-daebd1ee3e0e","Type":"ContainerStarted","Data":"851691d291f5e37354823883abe9b5439fd28670aab4d745b3d899cb068b41a0"} Sep 30 07:52:50 crc kubenswrapper[4760]: I0930 07:52:50.159244 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"f5411630-4844-48cd-b1be-daebd1ee3e0e","Type":"ContainerStarted","Data":"9f8c8c5ba0d01cf0e648da859dd4514075e80f58673a32f4026e75d8efa717ef"} Sep 30 07:52:50 crc kubenswrapper[4760]: I0930 07:52:50.181120 4760 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.181095133 podStartE2EDuration="2.181095133s" podCreationTimestamp="2025-09-30 07:52:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 07:52:50.174214497 +0000 UTC m=+1155.817120909" watchObservedRunningTime="2025-09-30 07:52:50.181095133 +0000 UTC m=+1155.824001545" Sep 30 07:52:51 crc kubenswrapper[4760]: I0930 07:52:51.177450 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0ce11abc-b049-4106-b431-7cb6d620a2f3","Type":"ContainerStarted","Data":"57b5645477b0f739e66f95c6f03d57d10ad43a729c03c9ec2ee15ed9600d3c68"} Sep 30 07:52:51 crc kubenswrapper[4760]: I0930 07:52:51.178857 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Sep 30 07:52:51 crc kubenswrapper[4760]: I0930 07:52:51.203067 4760 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.209487254 podStartE2EDuration="6.203036499s" podCreationTimestamp="2025-09-30 07:52:45 +0000 UTC" firstStartedPulling="2025-09-30 07:52:46.226992174 +0000 UTC m=+1151.869898606" lastFinishedPulling="2025-09-30 07:52:50.220541429 +0000 UTC m=+1155.863447851" observedRunningTime="2025-09-30 07:52:51.198002701 +0000 UTC m=+1156.840909123" 
watchObservedRunningTime="2025-09-30 07:52:51.203036499 +0000 UTC m=+1156.845942911" Sep 30 07:52:51 crc kubenswrapper[4760]: I0930 07:52:51.471027 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Sep 30 07:52:51 crc kubenswrapper[4760]: I0930 07:52:51.508960 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-conductor-0" Sep 30 07:52:52 crc kubenswrapper[4760]: I0930 07:52:52.448156 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Sep 30 07:52:56 crc kubenswrapper[4760]: I0930 07:52:56.471535 4760 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Sep 30 07:52:56 crc kubenswrapper[4760]: I0930 07:52:56.520812 4760 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Sep 30 07:52:57 crc kubenswrapper[4760]: I0930 07:52:57.265654 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Sep 30 07:52:58 crc kubenswrapper[4760]: I0930 07:52:58.576362 4760 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Sep 30 07:52:58 crc kubenswrapper[4760]: I0930 07:52:58.576458 4760 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Sep 30 07:52:59 crc kubenswrapper[4760]: I0930 07:52:59.658642 4760 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="f5411630-4844-48cd-b1be-daebd1ee3e0e" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.212:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Sep 30 07:52:59 crc kubenswrapper[4760]: I0930 07:52:59.658732 4760 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="f5411630-4844-48cd-b1be-daebd1ee3e0e" 
containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.212:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Sep 30 07:53:00 crc kubenswrapper[4760]: E0930 07:53:00.727631 4760 fsHandler.go:119] failed to collect filesystem stats - rootDiskErr: could not stat "/var/lib/containers/storage/overlay/4dc8df026d2575a6045fa80eecd332b33969f5609f03f1bbad34318840060afd/diff" to get inode usage: stat /var/lib/containers/storage/overlay/4dc8df026d2575a6045fa80eecd332b33969f5609f03f1bbad34318840060afd/diff: no such file or directory, extraDiskErr: Sep 30 07:53:04 crc kubenswrapper[4760]: E0930 07:53:04.215940 4760 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod56921bc4_752f_4605_9bf5_af56dbd217d4.slice/crio-22d78f6325b5cbcaeb5aee35041eaeded67bfd03fd7c20d2b14b4614a1f95ee4.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod93652508_9731_4349_b689_3d7cda972bda.slice/crio-conmon-897c1c308c3fba3c7cfa72f91e90072ab60375c98b82465d97aec0982d58a466.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod93652508_9731_4349_b689_3d7cda972bda.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod93652508_9731_4349_b689_3d7cda972bda.slice/crio-a75ba9a1c8b4a3580da473e55cf5e4264126e35640f9a972a09cdd9611a768c5\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd5fec5af_a95f_4659_bf0f_e57806fa05c7.slice/crio-e5d0927efdc95d1021664f825a01aba7d3eec632230ec3412f5085ea4af22e0e.scope\": RecentStats: unable to find data in memory cache], 
[\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd5fec5af_a95f_4659_bf0f_e57806fa05c7.slice/crio-conmon-e5d0927efdc95d1021664f825a01aba7d3eec632230ec3412f5085ea4af22e0e.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod93652508_9731_4349_b689_3d7cda972bda.slice/crio-897c1c308c3fba3c7cfa72f91e90072ab60375c98b82465d97aec0982d58a466.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod56921bc4_752f_4605_9bf5_af56dbd217d4.slice/crio-conmon-22d78f6325b5cbcaeb5aee35041eaeded67bfd03fd7c20d2b14b4614a1f95ee4.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf0976286_8bdd_43e7_894c_3e899a323ee7.slice\": RecentStats: unable to find data in memory cache]" Sep 30 07:53:04 crc kubenswrapper[4760]: I0930 07:53:04.312680 4760 generic.go:334] "Generic (PLEG): container finished" podID="56921bc4-752f-4605-9bf5-af56dbd217d4" containerID="22d78f6325b5cbcaeb5aee35041eaeded67bfd03fd7c20d2b14b4614a1f95ee4" exitCode=137 Sep 30 07:53:04 crc kubenswrapper[4760]: I0930 07:53:04.312828 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"56921bc4-752f-4605-9bf5-af56dbd217d4","Type":"ContainerDied","Data":"22d78f6325b5cbcaeb5aee35041eaeded67bfd03fd7c20d2b14b4614a1f95ee4"} Sep 30 07:53:04 crc kubenswrapper[4760]: I0930 07:53:04.314498 4760 generic.go:334] "Generic (PLEG): container finished" podID="d5fec5af-a95f-4659-bf0f-e57806fa05c7" containerID="e5d0927efdc95d1021664f825a01aba7d3eec632230ec3412f5085ea4af22e0e" exitCode=137 Sep 30 07:53:04 crc kubenswrapper[4760]: I0930 07:53:04.314533 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" 
event={"ID":"d5fec5af-a95f-4659-bf0f-e57806fa05c7","Type":"ContainerDied","Data":"e5d0927efdc95d1021664f825a01aba7d3eec632230ec3412f5085ea4af22e0e"} Sep 30 07:53:04 crc kubenswrapper[4760]: I0930 07:53:04.455350 4760 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Sep 30 07:53:04 crc kubenswrapper[4760]: I0930 07:53:04.460656 4760 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Sep 30 07:53:04 crc kubenswrapper[4760]: I0930 07:53:04.645785 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7btjc\" (UniqueName: \"kubernetes.io/projected/56921bc4-752f-4605-9bf5-af56dbd217d4-kube-api-access-7btjc\") pod \"56921bc4-752f-4605-9bf5-af56dbd217d4\" (UID: \"56921bc4-752f-4605-9bf5-af56dbd217d4\") " Sep 30 07:53:04 crc kubenswrapper[4760]: I0930 07:53:04.645876 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d5fec5af-a95f-4659-bf0f-e57806fa05c7-config-data\") pod \"d5fec5af-a95f-4659-bf0f-e57806fa05c7\" (UID: \"d5fec5af-a95f-4659-bf0f-e57806fa05c7\") " Sep 30 07:53:04 crc kubenswrapper[4760]: I0930 07:53:04.646201 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pvzx2\" (UniqueName: \"kubernetes.io/projected/d5fec5af-a95f-4659-bf0f-e57806fa05c7-kube-api-access-pvzx2\") pod \"d5fec5af-a95f-4659-bf0f-e57806fa05c7\" (UID: \"d5fec5af-a95f-4659-bf0f-e57806fa05c7\") " Sep 30 07:53:04 crc kubenswrapper[4760]: I0930 07:53:04.646279 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/56921bc4-752f-4605-9bf5-af56dbd217d4-config-data\") pod \"56921bc4-752f-4605-9bf5-af56dbd217d4\" (UID: \"56921bc4-752f-4605-9bf5-af56dbd217d4\") " Sep 30 07:53:04 crc kubenswrapper[4760]: I0930 07:53:04.646343 
4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/56921bc4-752f-4605-9bf5-af56dbd217d4-combined-ca-bundle\") pod \"56921bc4-752f-4605-9bf5-af56dbd217d4\" (UID: \"56921bc4-752f-4605-9bf5-af56dbd217d4\") " Sep 30 07:53:04 crc kubenswrapper[4760]: I0930 07:53:04.646424 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d5fec5af-a95f-4659-bf0f-e57806fa05c7-combined-ca-bundle\") pod \"d5fec5af-a95f-4659-bf0f-e57806fa05c7\" (UID: \"d5fec5af-a95f-4659-bf0f-e57806fa05c7\") " Sep 30 07:53:04 crc kubenswrapper[4760]: I0930 07:53:04.646511 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/56921bc4-752f-4605-9bf5-af56dbd217d4-logs\") pod \"56921bc4-752f-4605-9bf5-af56dbd217d4\" (UID: \"56921bc4-752f-4605-9bf5-af56dbd217d4\") " Sep 30 07:53:04 crc kubenswrapper[4760]: I0930 07:53:04.647531 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/56921bc4-752f-4605-9bf5-af56dbd217d4-logs" (OuterVolumeSpecName: "logs") pod "56921bc4-752f-4605-9bf5-af56dbd217d4" (UID: "56921bc4-752f-4605-9bf5-af56dbd217d4"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 07:53:04 crc kubenswrapper[4760]: I0930 07:53:04.652469 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/56921bc4-752f-4605-9bf5-af56dbd217d4-kube-api-access-7btjc" (OuterVolumeSpecName: "kube-api-access-7btjc") pod "56921bc4-752f-4605-9bf5-af56dbd217d4" (UID: "56921bc4-752f-4605-9bf5-af56dbd217d4"). InnerVolumeSpecName "kube-api-access-7btjc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 07:53:04 crc kubenswrapper[4760]: I0930 07:53:04.654033 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d5fec5af-a95f-4659-bf0f-e57806fa05c7-kube-api-access-pvzx2" (OuterVolumeSpecName: "kube-api-access-pvzx2") pod "d5fec5af-a95f-4659-bf0f-e57806fa05c7" (UID: "d5fec5af-a95f-4659-bf0f-e57806fa05c7"). InnerVolumeSpecName "kube-api-access-pvzx2". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 07:53:04 crc kubenswrapper[4760]: I0930 07:53:04.676608 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d5fec5af-a95f-4659-bf0f-e57806fa05c7-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d5fec5af-a95f-4659-bf0f-e57806fa05c7" (UID: "d5fec5af-a95f-4659-bf0f-e57806fa05c7"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 07:53:04 crc kubenswrapper[4760]: I0930 07:53:04.681634 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/56921bc4-752f-4605-9bf5-af56dbd217d4-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "56921bc4-752f-4605-9bf5-af56dbd217d4" (UID: "56921bc4-752f-4605-9bf5-af56dbd217d4"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 07:53:04 crc kubenswrapper[4760]: I0930 07:53:04.684511 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d5fec5af-a95f-4659-bf0f-e57806fa05c7-config-data" (OuterVolumeSpecName: "config-data") pod "d5fec5af-a95f-4659-bf0f-e57806fa05c7" (UID: "d5fec5af-a95f-4659-bf0f-e57806fa05c7"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 07:53:04 crc kubenswrapper[4760]: I0930 07:53:04.687261 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/56921bc4-752f-4605-9bf5-af56dbd217d4-config-data" (OuterVolumeSpecName: "config-data") pod "56921bc4-752f-4605-9bf5-af56dbd217d4" (UID: "56921bc4-752f-4605-9bf5-af56dbd217d4"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 07:53:04 crc kubenswrapper[4760]: I0930 07:53:04.749037 4760 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pvzx2\" (UniqueName: \"kubernetes.io/projected/d5fec5af-a95f-4659-bf0f-e57806fa05c7-kube-api-access-pvzx2\") on node \"crc\" DevicePath \"\"" Sep 30 07:53:04 crc kubenswrapper[4760]: I0930 07:53:04.749073 4760 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/56921bc4-752f-4605-9bf5-af56dbd217d4-config-data\") on node \"crc\" DevicePath \"\"" Sep 30 07:53:04 crc kubenswrapper[4760]: I0930 07:53:04.749086 4760 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/56921bc4-752f-4605-9bf5-af56dbd217d4-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 07:53:04 crc kubenswrapper[4760]: I0930 07:53:04.749100 4760 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d5fec5af-a95f-4659-bf0f-e57806fa05c7-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 07:53:04 crc kubenswrapper[4760]: I0930 07:53:04.749111 4760 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/56921bc4-752f-4605-9bf5-af56dbd217d4-logs\") on node \"crc\" DevicePath \"\"" Sep 30 07:53:04 crc kubenswrapper[4760]: I0930 07:53:04.749124 4760 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7btjc\" (UniqueName: 
\"kubernetes.io/projected/56921bc4-752f-4605-9bf5-af56dbd217d4-kube-api-access-7btjc\") on node \"crc\" DevicePath \"\"" Sep 30 07:53:04 crc kubenswrapper[4760]: I0930 07:53:04.749138 4760 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d5fec5af-a95f-4659-bf0f-e57806fa05c7-config-data\") on node \"crc\" DevicePath \"\"" Sep 30 07:53:05 crc kubenswrapper[4760]: I0930 07:53:05.329960 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"56921bc4-752f-4605-9bf5-af56dbd217d4","Type":"ContainerDied","Data":"750be7609d607a2155869459923ab798128b6d19a18b0f849ce6f0327b67b9fe"} Sep 30 07:53:05 crc kubenswrapper[4760]: I0930 07:53:05.330016 4760 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Sep 30 07:53:05 crc kubenswrapper[4760]: I0930 07:53:05.330469 4760 scope.go:117] "RemoveContainer" containerID="22d78f6325b5cbcaeb5aee35041eaeded67bfd03fd7c20d2b14b4614a1f95ee4" Sep 30 07:53:05 crc kubenswrapper[4760]: I0930 07:53:05.331580 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"d5fec5af-a95f-4659-bf0f-e57806fa05c7","Type":"ContainerDied","Data":"33db13328c7baffaa81aca9f9823ec821246f06d983647243775dbb9ca0dfed3"} Sep 30 07:53:05 crc kubenswrapper[4760]: I0930 07:53:05.331667 4760 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Sep 30 07:53:05 crc kubenswrapper[4760]: I0930 07:53:05.367134 4760 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Sep 30 07:53:05 crc kubenswrapper[4760]: I0930 07:53:05.380477 4760 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Sep 30 07:53:05 crc kubenswrapper[4760]: I0930 07:53:05.382521 4760 scope.go:117] "RemoveContainer" containerID="024d726c57758210a0b9cd176dec0dbf8a1feaa6e2df4cefe80f29d2802cf72e" Sep 30 07:53:05 crc kubenswrapper[4760]: I0930 07:53:05.398418 4760 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Sep 30 07:53:05 crc kubenswrapper[4760]: I0930 07:53:05.406907 4760 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Sep 30 07:53:05 crc kubenswrapper[4760]: I0930 07:53:05.409920 4760 scope.go:117] "RemoveContainer" containerID="e5d0927efdc95d1021664f825a01aba7d3eec632230ec3412f5085ea4af22e0e" Sep 30 07:53:05 crc kubenswrapper[4760]: I0930 07:53:05.429435 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Sep 30 07:53:05 crc kubenswrapper[4760]: E0930 07:53:05.430112 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="56921bc4-752f-4605-9bf5-af56dbd217d4" containerName="nova-metadata-metadata" Sep 30 07:53:05 crc kubenswrapper[4760]: I0930 07:53:05.430146 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="56921bc4-752f-4605-9bf5-af56dbd217d4" containerName="nova-metadata-metadata" Sep 30 07:53:05 crc kubenswrapper[4760]: E0930 07:53:05.430176 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d5fec5af-a95f-4659-bf0f-e57806fa05c7" containerName="nova-cell1-novncproxy-novncproxy" Sep 30 07:53:05 crc kubenswrapper[4760]: I0930 07:53:05.430188 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="d5fec5af-a95f-4659-bf0f-e57806fa05c7" 
containerName="nova-cell1-novncproxy-novncproxy" Sep 30 07:53:05 crc kubenswrapper[4760]: E0930 07:53:05.430212 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="56921bc4-752f-4605-9bf5-af56dbd217d4" containerName="nova-metadata-log" Sep 30 07:53:05 crc kubenswrapper[4760]: I0930 07:53:05.430223 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="56921bc4-752f-4605-9bf5-af56dbd217d4" containerName="nova-metadata-log" Sep 30 07:53:05 crc kubenswrapper[4760]: I0930 07:53:05.430565 4760 memory_manager.go:354] "RemoveStaleState removing state" podUID="56921bc4-752f-4605-9bf5-af56dbd217d4" containerName="nova-metadata-log" Sep 30 07:53:05 crc kubenswrapper[4760]: I0930 07:53:05.430610 4760 memory_manager.go:354] "RemoveStaleState removing state" podUID="d5fec5af-a95f-4659-bf0f-e57806fa05c7" containerName="nova-cell1-novncproxy-novncproxy" Sep 30 07:53:05 crc kubenswrapper[4760]: I0930 07:53:05.430634 4760 memory_manager.go:354] "RemoveStaleState removing state" podUID="56921bc4-752f-4605-9bf5-af56dbd217d4" containerName="nova-metadata-metadata" Sep 30 07:53:05 crc kubenswrapper[4760]: I0930 07:53:05.431695 4760 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Sep 30 07:53:05 crc kubenswrapper[4760]: I0930 07:53:05.436531 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-public-svc" Sep 30 07:53:05 crc kubenswrapper[4760]: I0930 07:53:05.436672 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-vencrypt" Sep 30 07:53:05 crc kubenswrapper[4760]: I0930 07:53:05.437083 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Sep 30 07:53:05 crc kubenswrapper[4760]: I0930 07:53:05.444970 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Sep 30 07:53:05 crc kubenswrapper[4760]: I0930 07:53:05.459410 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Sep 30 07:53:05 crc kubenswrapper[4760]: I0930 07:53:05.461387 4760 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Sep 30 07:53:05 crc kubenswrapper[4760]: I0930 07:53:05.468835 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Sep 30 07:53:05 crc kubenswrapper[4760]: I0930 07:53:05.469235 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Sep 30 07:53:05 crc kubenswrapper[4760]: I0930 07:53:05.469278 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Sep 30 07:53:05 crc kubenswrapper[4760]: I0930 07:53:05.564187 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f7ab1b57-8aaa-4360-b024-fa2142ebd994-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"f7ab1b57-8aaa-4360-b024-fa2142ebd994\") " pod="openstack/nova-cell1-novncproxy-0" Sep 30 07:53:05 crc kubenswrapper[4760]: I0930 07:53:05.564256 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/1d3f6c3d-070f-494e-8b47-856d54de039c-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"1d3f6c3d-070f-494e-8b47-856d54de039c\") " pod="openstack/nova-metadata-0" Sep 30 07:53:05 crc kubenswrapper[4760]: I0930 07:53:05.564296 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1d3f6c3d-070f-494e-8b47-856d54de039c-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"1d3f6c3d-070f-494e-8b47-856d54de039c\") " pod="openstack/nova-metadata-0" Sep 30 07:53:05 crc kubenswrapper[4760]: I0930 07:53:05.564380 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/f7ab1b57-8aaa-4360-b024-fa2142ebd994-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"f7ab1b57-8aaa-4360-b024-fa2142ebd994\") " pod="openstack/nova-cell1-novncproxy-0" Sep 30 07:53:05 crc kubenswrapper[4760]: I0930 07:53:05.564701 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1d3f6c3d-070f-494e-8b47-856d54de039c-config-data\") pod \"nova-metadata-0\" (UID: \"1d3f6c3d-070f-494e-8b47-856d54de039c\") " pod="openstack/nova-metadata-0" Sep 30 07:53:05 crc kubenswrapper[4760]: I0930 07:53:05.564856 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1d3f6c3d-070f-494e-8b47-856d54de039c-logs\") pod \"nova-metadata-0\" (UID: \"1d3f6c3d-070f-494e-8b47-856d54de039c\") " pod="openstack/nova-metadata-0" Sep 30 07:53:05 crc kubenswrapper[4760]: I0930 07:53:05.564890 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8ffm4\" (UniqueName: \"kubernetes.io/projected/1d3f6c3d-070f-494e-8b47-856d54de039c-kube-api-access-8ffm4\") pod \"nova-metadata-0\" (UID: \"1d3f6c3d-070f-494e-8b47-856d54de039c\") " pod="openstack/nova-metadata-0" Sep 30 07:53:05 crc kubenswrapper[4760]: I0930 07:53:05.564917 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/f7ab1b57-8aaa-4360-b024-fa2142ebd994-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"f7ab1b57-8aaa-4360-b024-fa2142ebd994\") " pod="openstack/nova-cell1-novncproxy-0" Sep 30 07:53:05 crc kubenswrapper[4760]: I0930 07:53:05.564964 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h6jfv\" (UniqueName: 
\"kubernetes.io/projected/f7ab1b57-8aaa-4360-b024-fa2142ebd994-kube-api-access-h6jfv\") pod \"nova-cell1-novncproxy-0\" (UID: \"f7ab1b57-8aaa-4360-b024-fa2142ebd994\") " pod="openstack/nova-cell1-novncproxy-0" Sep 30 07:53:05 crc kubenswrapper[4760]: I0930 07:53:05.565008 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f7ab1b57-8aaa-4360-b024-fa2142ebd994-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"f7ab1b57-8aaa-4360-b024-fa2142ebd994\") " pod="openstack/nova-cell1-novncproxy-0" Sep 30 07:53:05 crc kubenswrapper[4760]: I0930 07:53:05.666681 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1d3f6c3d-070f-494e-8b47-856d54de039c-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"1d3f6c3d-070f-494e-8b47-856d54de039c\") " pod="openstack/nova-metadata-0" Sep 30 07:53:05 crc kubenswrapper[4760]: I0930 07:53:05.666744 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/f7ab1b57-8aaa-4360-b024-fa2142ebd994-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"f7ab1b57-8aaa-4360-b024-fa2142ebd994\") " pod="openstack/nova-cell1-novncproxy-0" Sep 30 07:53:05 crc kubenswrapper[4760]: I0930 07:53:05.666890 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1d3f6c3d-070f-494e-8b47-856d54de039c-config-data\") pod \"nova-metadata-0\" (UID: \"1d3f6c3d-070f-494e-8b47-856d54de039c\") " pod="openstack/nova-metadata-0" Sep 30 07:53:05 crc kubenswrapper[4760]: I0930 07:53:05.666981 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1d3f6c3d-070f-494e-8b47-856d54de039c-logs\") pod \"nova-metadata-0\" (UID: 
\"1d3f6c3d-070f-494e-8b47-856d54de039c\") " pod="openstack/nova-metadata-0" Sep 30 07:53:05 crc kubenswrapper[4760]: I0930 07:53:05.667026 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8ffm4\" (UniqueName: \"kubernetes.io/projected/1d3f6c3d-070f-494e-8b47-856d54de039c-kube-api-access-8ffm4\") pod \"nova-metadata-0\" (UID: \"1d3f6c3d-070f-494e-8b47-856d54de039c\") " pod="openstack/nova-metadata-0" Sep 30 07:53:05 crc kubenswrapper[4760]: I0930 07:53:05.667079 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/f7ab1b57-8aaa-4360-b024-fa2142ebd994-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"f7ab1b57-8aaa-4360-b024-fa2142ebd994\") " pod="openstack/nova-cell1-novncproxy-0" Sep 30 07:53:05 crc kubenswrapper[4760]: I0930 07:53:05.667141 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h6jfv\" (UniqueName: \"kubernetes.io/projected/f7ab1b57-8aaa-4360-b024-fa2142ebd994-kube-api-access-h6jfv\") pod \"nova-cell1-novncproxy-0\" (UID: \"f7ab1b57-8aaa-4360-b024-fa2142ebd994\") " pod="openstack/nova-cell1-novncproxy-0" Sep 30 07:53:05 crc kubenswrapper[4760]: I0930 07:53:05.667183 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f7ab1b57-8aaa-4360-b024-fa2142ebd994-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"f7ab1b57-8aaa-4360-b024-fa2142ebd994\") " pod="openstack/nova-cell1-novncproxy-0" Sep 30 07:53:05 crc kubenswrapper[4760]: I0930 07:53:05.667259 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f7ab1b57-8aaa-4360-b024-fa2142ebd994-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"f7ab1b57-8aaa-4360-b024-fa2142ebd994\") " pod="openstack/nova-cell1-novncproxy-0" Sep 30 07:53:05 
crc kubenswrapper[4760]: I0930 07:53:05.667346 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/1d3f6c3d-070f-494e-8b47-856d54de039c-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"1d3f6c3d-070f-494e-8b47-856d54de039c\") " pod="openstack/nova-metadata-0" Sep 30 07:53:05 crc kubenswrapper[4760]: I0930 07:53:05.668890 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1d3f6c3d-070f-494e-8b47-856d54de039c-logs\") pod \"nova-metadata-0\" (UID: \"1d3f6c3d-070f-494e-8b47-856d54de039c\") " pod="openstack/nova-metadata-0" Sep 30 07:53:05 crc kubenswrapper[4760]: I0930 07:53:05.676552 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1d3f6c3d-070f-494e-8b47-856d54de039c-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"1d3f6c3d-070f-494e-8b47-856d54de039c\") " pod="openstack/nova-metadata-0" Sep 30 07:53:05 crc kubenswrapper[4760]: I0930 07:53:05.681270 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f7ab1b57-8aaa-4360-b024-fa2142ebd994-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"f7ab1b57-8aaa-4360-b024-fa2142ebd994\") " pod="openstack/nova-cell1-novncproxy-0" Sep 30 07:53:05 crc kubenswrapper[4760]: I0930 07:53:05.682415 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/1d3f6c3d-070f-494e-8b47-856d54de039c-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"1d3f6c3d-070f-494e-8b47-856d54de039c\") " pod="openstack/nova-metadata-0" Sep 30 07:53:05 crc kubenswrapper[4760]: I0930 07:53:05.682981 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"vencrypt-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/f7ab1b57-8aaa-4360-b024-fa2142ebd994-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"f7ab1b57-8aaa-4360-b024-fa2142ebd994\") " pod="openstack/nova-cell1-novncproxy-0" Sep 30 07:53:05 crc kubenswrapper[4760]: I0930 07:53:05.683394 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1d3f6c3d-070f-494e-8b47-856d54de039c-config-data\") pod \"nova-metadata-0\" (UID: \"1d3f6c3d-070f-494e-8b47-856d54de039c\") " pod="openstack/nova-metadata-0" Sep 30 07:53:05 crc kubenswrapper[4760]: I0930 07:53:05.685266 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/f7ab1b57-8aaa-4360-b024-fa2142ebd994-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"f7ab1b57-8aaa-4360-b024-fa2142ebd994\") " pod="openstack/nova-cell1-novncproxy-0" Sep 30 07:53:05 crc kubenswrapper[4760]: I0930 07:53:05.688933 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f7ab1b57-8aaa-4360-b024-fa2142ebd994-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"f7ab1b57-8aaa-4360-b024-fa2142ebd994\") " pod="openstack/nova-cell1-novncproxy-0" Sep 30 07:53:05 crc kubenswrapper[4760]: I0930 07:53:05.694620 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8ffm4\" (UniqueName: \"kubernetes.io/projected/1d3f6c3d-070f-494e-8b47-856d54de039c-kube-api-access-8ffm4\") pod \"nova-metadata-0\" (UID: \"1d3f6c3d-070f-494e-8b47-856d54de039c\") " pod="openstack/nova-metadata-0" Sep 30 07:53:05 crc kubenswrapper[4760]: I0930 07:53:05.696507 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h6jfv\" (UniqueName: \"kubernetes.io/projected/f7ab1b57-8aaa-4360-b024-fa2142ebd994-kube-api-access-h6jfv\") pod \"nova-cell1-novncproxy-0\" (UID: 
\"f7ab1b57-8aaa-4360-b024-fa2142ebd994\") " pod="openstack/nova-cell1-novncproxy-0" Sep 30 07:53:05 crc kubenswrapper[4760]: I0930 07:53:05.763869 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Sep 30 07:53:05 crc kubenswrapper[4760]: I0930 07:53:05.781254 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Sep 30 07:53:06 crc kubenswrapper[4760]: I0930 07:53:06.338782 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Sep 30 07:53:06 crc kubenswrapper[4760]: I0930 07:53:06.431665 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Sep 30 07:53:07 crc kubenswrapper[4760]: I0930 07:53:07.084146 4760 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="56921bc4-752f-4605-9bf5-af56dbd217d4" path="/var/lib/kubelet/pods/56921bc4-752f-4605-9bf5-af56dbd217d4/volumes" Sep 30 07:53:07 crc kubenswrapper[4760]: I0930 07:53:07.085117 4760 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d5fec5af-a95f-4659-bf0f-e57806fa05c7" path="/var/lib/kubelet/pods/d5fec5af-a95f-4659-bf0f-e57806fa05c7/volumes" Sep 30 07:53:07 crc kubenswrapper[4760]: I0930 07:53:07.357399 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"1d3f6c3d-070f-494e-8b47-856d54de039c","Type":"ContainerStarted","Data":"827afd0fe268d79933117c8266e31aabbecdf70bc47cc8ed1d11a50e306fd65f"} Sep 30 07:53:07 crc kubenswrapper[4760]: I0930 07:53:07.357441 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"1d3f6c3d-070f-494e-8b47-856d54de039c","Type":"ContainerStarted","Data":"096ff25098794e61c05471779e2b1da5def0724c467733da81ab266ab873f450"} Sep 30 07:53:07 crc kubenswrapper[4760]: I0930 07:53:07.357452 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" 
event={"ID":"1d3f6c3d-070f-494e-8b47-856d54de039c","Type":"ContainerStarted","Data":"c3405b9dcfc01a70300d6bdf8a1d5458b69a7cf40cc3f07434b140729c5d2245"} Sep 30 07:53:07 crc kubenswrapper[4760]: I0930 07:53:07.359879 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"f7ab1b57-8aaa-4360-b024-fa2142ebd994","Type":"ContainerStarted","Data":"ae466bd73187c04de00e6bc62fba90b6307b3afb72e42ec156aee2651470afeb"} Sep 30 07:53:07 crc kubenswrapper[4760]: I0930 07:53:07.360207 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"f7ab1b57-8aaa-4360-b024-fa2142ebd994","Type":"ContainerStarted","Data":"03d983dd393ceb10130633217131a3a81977ca30e868737b06170ea966c3c77c"} Sep 30 07:53:07 crc kubenswrapper[4760]: I0930 07:53:07.389037 4760 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.389013355 podStartE2EDuration="2.389013355s" podCreationTimestamp="2025-09-30 07:53:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 07:53:07.380139158 +0000 UTC m=+1173.023045560" watchObservedRunningTime="2025-09-30 07:53:07.389013355 +0000 UTC m=+1173.031919787" Sep 30 07:53:07 crc kubenswrapper[4760]: I0930 07:53:07.413268 4760 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=2.413247444 podStartE2EDuration="2.413247444s" podCreationTimestamp="2025-09-30 07:53:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 07:53:07.412904525 +0000 UTC m=+1173.055810947" watchObservedRunningTime="2025-09-30 07:53:07.413247444 +0000 UTC m=+1173.056153856" Sep 30 07:53:08 crc kubenswrapper[4760]: I0930 07:53:08.579962 4760 kubelet.go:2542] "SyncLoop (probe)" 
probe="startup" status="started" pod="openstack/nova-api-0" Sep 30 07:53:08 crc kubenswrapper[4760]: I0930 07:53:08.580780 4760 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Sep 30 07:53:08 crc kubenswrapper[4760]: I0930 07:53:08.581529 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Sep 30 07:53:08 crc kubenswrapper[4760]: I0930 07:53:08.581906 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Sep 30 07:53:08 crc kubenswrapper[4760]: I0930 07:53:08.583974 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Sep 30 07:53:08 crc kubenswrapper[4760]: I0930 07:53:08.584617 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Sep 30 07:53:08 crc kubenswrapper[4760]: I0930 07:53:08.862494 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5c7b6c5df9-fdkh2"] Sep 30 07:53:08 crc kubenswrapper[4760]: I0930 07:53:08.868573 4760 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5c7b6c5df9-fdkh2" Sep 30 07:53:08 crc kubenswrapper[4760]: I0930 07:53:08.892703 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5c7b6c5df9-fdkh2"] Sep 30 07:53:09 crc kubenswrapper[4760]: I0930 07:53:09.067120 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ptnqk\" (UniqueName: \"kubernetes.io/projected/82d1309a-4ecf-430e-84ad-69622bb9d9a6-kube-api-access-ptnqk\") pod \"dnsmasq-dns-5c7b6c5df9-fdkh2\" (UID: \"82d1309a-4ecf-430e-84ad-69622bb9d9a6\") " pod="openstack/dnsmasq-dns-5c7b6c5df9-fdkh2" Sep 30 07:53:09 crc kubenswrapper[4760]: I0930 07:53:09.067258 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/82d1309a-4ecf-430e-84ad-69622bb9d9a6-dns-swift-storage-0\") pod \"dnsmasq-dns-5c7b6c5df9-fdkh2\" (UID: \"82d1309a-4ecf-430e-84ad-69622bb9d9a6\") " pod="openstack/dnsmasq-dns-5c7b6c5df9-fdkh2" Sep 30 07:53:09 crc kubenswrapper[4760]: I0930 07:53:09.067293 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/82d1309a-4ecf-430e-84ad-69622bb9d9a6-ovsdbserver-nb\") pod \"dnsmasq-dns-5c7b6c5df9-fdkh2\" (UID: \"82d1309a-4ecf-430e-84ad-69622bb9d9a6\") " pod="openstack/dnsmasq-dns-5c7b6c5df9-fdkh2" Sep 30 07:53:09 crc kubenswrapper[4760]: I0930 07:53:09.067409 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/82d1309a-4ecf-430e-84ad-69622bb9d9a6-dns-svc\") pod \"dnsmasq-dns-5c7b6c5df9-fdkh2\" (UID: \"82d1309a-4ecf-430e-84ad-69622bb9d9a6\") " pod="openstack/dnsmasq-dns-5c7b6c5df9-fdkh2" Sep 30 07:53:09 crc kubenswrapper[4760]: I0930 07:53:09.067449 4760 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/82d1309a-4ecf-430e-84ad-69622bb9d9a6-config\") pod \"dnsmasq-dns-5c7b6c5df9-fdkh2\" (UID: \"82d1309a-4ecf-430e-84ad-69622bb9d9a6\") " pod="openstack/dnsmasq-dns-5c7b6c5df9-fdkh2" Sep 30 07:53:09 crc kubenswrapper[4760]: I0930 07:53:09.067512 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/82d1309a-4ecf-430e-84ad-69622bb9d9a6-ovsdbserver-sb\") pod \"dnsmasq-dns-5c7b6c5df9-fdkh2\" (UID: \"82d1309a-4ecf-430e-84ad-69622bb9d9a6\") " pod="openstack/dnsmasq-dns-5c7b6c5df9-fdkh2" Sep 30 07:53:09 crc kubenswrapper[4760]: I0930 07:53:09.168971 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/82d1309a-4ecf-430e-84ad-69622bb9d9a6-dns-swift-storage-0\") pod \"dnsmasq-dns-5c7b6c5df9-fdkh2\" (UID: \"82d1309a-4ecf-430e-84ad-69622bb9d9a6\") " pod="openstack/dnsmasq-dns-5c7b6c5df9-fdkh2" Sep 30 07:53:09 crc kubenswrapper[4760]: I0930 07:53:09.169016 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/82d1309a-4ecf-430e-84ad-69622bb9d9a6-ovsdbserver-nb\") pod \"dnsmasq-dns-5c7b6c5df9-fdkh2\" (UID: \"82d1309a-4ecf-430e-84ad-69622bb9d9a6\") " pod="openstack/dnsmasq-dns-5c7b6c5df9-fdkh2" Sep 30 07:53:09 crc kubenswrapper[4760]: I0930 07:53:09.169090 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/82d1309a-4ecf-430e-84ad-69622bb9d9a6-dns-svc\") pod \"dnsmasq-dns-5c7b6c5df9-fdkh2\" (UID: \"82d1309a-4ecf-430e-84ad-69622bb9d9a6\") " pod="openstack/dnsmasq-dns-5c7b6c5df9-fdkh2" Sep 30 07:53:09 crc kubenswrapper[4760]: I0930 07:53:09.169120 4760 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/82d1309a-4ecf-430e-84ad-69622bb9d9a6-config\") pod \"dnsmasq-dns-5c7b6c5df9-fdkh2\" (UID: \"82d1309a-4ecf-430e-84ad-69622bb9d9a6\") " pod="openstack/dnsmasq-dns-5c7b6c5df9-fdkh2" Sep 30 07:53:09 crc kubenswrapper[4760]: I0930 07:53:09.169919 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/82d1309a-4ecf-430e-84ad-69622bb9d9a6-dns-swift-storage-0\") pod \"dnsmasq-dns-5c7b6c5df9-fdkh2\" (UID: \"82d1309a-4ecf-430e-84ad-69622bb9d9a6\") " pod="openstack/dnsmasq-dns-5c7b6c5df9-fdkh2" Sep 30 07:53:09 crc kubenswrapper[4760]: I0930 07:53:09.169938 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/82d1309a-4ecf-430e-84ad-69622bb9d9a6-ovsdbserver-nb\") pod \"dnsmasq-dns-5c7b6c5df9-fdkh2\" (UID: \"82d1309a-4ecf-430e-84ad-69622bb9d9a6\") " pod="openstack/dnsmasq-dns-5c7b6c5df9-fdkh2" Sep 30 07:53:09 crc kubenswrapper[4760]: I0930 07:53:09.170114 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/82d1309a-4ecf-430e-84ad-69622bb9d9a6-ovsdbserver-sb\") pod \"dnsmasq-dns-5c7b6c5df9-fdkh2\" (UID: \"82d1309a-4ecf-430e-84ad-69622bb9d9a6\") " pod="openstack/dnsmasq-dns-5c7b6c5df9-fdkh2" Sep 30 07:53:09 crc kubenswrapper[4760]: I0930 07:53:09.170162 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ptnqk\" (UniqueName: \"kubernetes.io/projected/82d1309a-4ecf-430e-84ad-69622bb9d9a6-kube-api-access-ptnqk\") pod \"dnsmasq-dns-5c7b6c5df9-fdkh2\" (UID: \"82d1309a-4ecf-430e-84ad-69622bb9d9a6\") " pod="openstack/dnsmasq-dns-5c7b6c5df9-fdkh2" Sep 30 07:53:09 crc kubenswrapper[4760]: I0930 07:53:09.170163 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/82d1309a-4ecf-430e-84ad-69622bb9d9a6-config\") pod \"dnsmasq-dns-5c7b6c5df9-fdkh2\" (UID: \"82d1309a-4ecf-430e-84ad-69622bb9d9a6\") " pod="openstack/dnsmasq-dns-5c7b6c5df9-fdkh2" Sep 30 07:53:09 crc kubenswrapper[4760]: I0930 07:53:09.170243 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/82d1309a-4ecf-430e-84ad-69622bb9d9a6-dns-svc\") pod \"dnsmasq-dns-5c7b6c5df9-fdkh2\" (UID: \"82d1309a-4ecf-430e-84ad-69622bb9d9a6\") " pod="openstack/dnsmasq-dns-5c7b6c5df9-fdkh2" Sep 30 07:53:09 crc kubenswrapper[4760]: I0930 07:53:09.170803 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/82d1309a-4ecf-430e-84ad-69622bb9d9a6-ovsdbserver-sb\") pod \"dnsmasq-dns-5c7b6c5df9-fdkh2\" (UID: \"82d1309a-4ecf-430e-84ad-69622bb9d9a6\") " pod="openstack/dnsmasq-dns-5c7b6c5df9-fdkh2" Sep 30 07:53:09 crc kubenswrapper[4760]: I0930 07:53:09.192869 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ptnqk\" (UniqueName: \"kubernetes.io/projected/82d1309a-4ecf-430e-84ad-69622bb9d9a6-kube-api-access-ptnqk\") pod \"dnsmasq-dns-5c7b6c5df9-fdkh2\" (UID: \"82d1309a-4ecf-430e-84ad-69622bb9d9a6\") " pod="openstack/dnsmasq-dns-5c7b6c5df9-fdkh2" Sep 30 07:53:09 crc kubenswrapper[4760]: I0930 07:53:09.222870 4760 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5c7b6c5df9-fdkh2" Sep 30 07:53:09 crc kubenswrapper[4760]: I0930 07:53:09.738032 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5c7b6c5df9-fdkh2"] Sep 30 07:53:09 crc kubenswrapper[4760]: W0930 07:53:09.743749 4760 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod82d1309a_4ecf_430e_84ad_69622bb9d9a6.slice/crio-d04cb692e2f61b5eeaddeb2cd80a94a32ddb5e3d8c1f16d488d78d402b8315e0 WatchSource:0}: Error finding container d04cb692e2f61b5eeaddeb2cd80a94a32ddb5e3d8c1f16d488d78d402b8315e0: Status 404 returned error can't find the container with id d04cb692e2f61b5eeaddeb2cd80a94a32ddb5e3d8c1f16d488d78d402b8315e0 Sep 30 07:53:10 crc kubenswrapper[4760]: I0930 07:53:10.396551 4760 generic.go:334] "Generic (PLEG): container finished" podID="82d1309a-4ecf-430e-84ad-69622bb9d9a6" containerID="bdc5eecdf5d13a7cc6cdb9204f36b34e053fbd9e48a24fd4491f5e1ebe5b269d" exitCode=0 Sep 30 07:53:10 crc kubenswrapper[4760]: I0930 07:53:10.396612 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c7b6c5df9-fdkh2" event={"ID":"82d1309a-4ecf-430e-84ad-69622bb9d9a6","Type":"ContainerDied","Data":"bdc5eecdf5d13a7cc6cdb9204f36b34e053fbd9e48a24fd4491f5e1ebe5b269d"} Sep 30 07:53:10 crc kubenswrapper[4760]: I0930 07:53:10.396859 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c7b6c5df9-fdkh2" event={"ID":"82d1309a-4ecf-430e-84ad-69622bb9d9a6","Type":"ContainerStarted","Data":"d04cb692e2f61b5eeaddeb2cd80a94a32ddb5e3d8c1f16d488d78d402b8315e0"} Sep 30 07:53:10 crc kubenswrapper[4760]: I0930 07:53:10.764060 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Sep 30 07:53:10 crc kubenswrapper[4760]: I0930 07:53:10.782919 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Sep 
30 07:53:10 crc kubenswrapper[4760]: I0930 07:53:10.782963 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Sep 30 07:53:11 crc kubenswrapper[4760]: I0930 07:53:11.120234 4760 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Sep 30 07:53:11 crc kubenswrapper[4760]: I0930 07:53:11.121151 4760 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="0ce11abc-b049-4106-b431-7cb6d620a2f3" containerName="ceilometer-notification-agent" containerID="cri-o://02ba7e93747e249be16d4aa877f2f2c1f41b767ae79a05cc139c2571fb2e0073" gracePeriod=30 Sep 30 07:53:11 crc kubenswrapper[4760]: I0930 07:53:11.121148 4760 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="0ce11abc-b049-4106-b431-7cb6d620a2f3" containerName="sg-core" containerID="cri-o://024f0ff9f41ca6b0b8458be611a149892d4390d5ead651580c001b06a18012bd" gracePeriod=30 Sep 30 07:53:11 crc kubenswrapper[4760]: I0930 07:53:11.121232 4760 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="0ce11abc-b049-4106-b431-7cb6d620a2f3" containerName="proxy-httpd" containerID="cri-o://57b5645477b0f739e66f95c6f03d57d10ad43a729c03c9ec2ee15ed9600d3c68" gracePeriod=30 Sep 30 07:53:11 crc kubenswrapper[4760]: I0930 07:53:11.121363 4760 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="0ce11abc-b049-4106-b431-7cb6d620a2f3" containerName="ceilometer-central-agent" containerID="cri-o://fecbb134fb073373a1423db95dff2f1c1141a7db0ca2ef3eb7873949e18f9742" gracePeriod=30 Sep 30 07:53:11 crc kubenswrapper[4760]: I0930 07:53:11.152611 4760 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ceilometer-0" podUID="0ce11abc-b049-4106-b431-7cb6d620a2f3" containerName="proxy-httpd" probeResult="failure" output="HTTP probe failed with statuscode: 503" 
Sep 30 07:53:11 crc kubenswrapper[4760]: I0930 07:53:11.425993 4760 generic.go:334] "Generic (PLEG): container finished" podID="0ce11abc-b049-4106-b431-7cb6d620a2f3" containerID="57b5645477b0f739e66f95c6f03d57d10ad43a729c03c9ec2ee15ed9600d3c68" exitCode=0 Sep 30 07:53:11 crc kubenswrapper[4760]: I0930 07:53:11.426038 4760 generic.go:334] "Generic (PLEG): container finished" podID="0ce11abc-b049-4106-b431-7cb6d620a2f3" containerID="024f0ff9f41ca6b0b8458be611a149892d4390d5ead651580c001b06a18012bd" exitCode=2 Sep 30 07:53:11 crc kubenswrapper[4760]: I0930 07:53:11.426047 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0ce11abc-b049-4106-b431-7cb6d620a2f3","Type":"ContainerDied","Data":"57b5645477b0f739e66f95c6f03d57d10ad43a729c03c9ec2ee15ed9600d3c68"} Sep 30 07:53:11 crc kubenswrapper[4760]: I0930 07:53:11.426109 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0ce11abc-b049-4106-b431-7cb6d620a2f3","Type":"ContainerDied","Data":"024f0ff9f41ca6b0b8458be611a149892d4390d5ead651580c001b06a18012bd"} Sep 30 07:53:11 crc kubenswrapper[4760]: I0930 07:53:11.430877 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c7b6c5df9-fdkh2" event={"ID":"82d1309a-4ecf-430e-84ad-69622bb9d9a6","Type":"ContainerStarted","Data":"74a99f7cbd2e546d79211602bf2a603b54e89b44cf72780a50b070ef800208ae"} Sep 30 07:53:11 crc kubenswrapper[4760]: I0930 07:53:11.431037 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5c7b6c5df9-fdkh2" Sep 30 07:53:11 crc kubenswrapper[4760]: I0930 07:53:11.466509 4760 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5c7b6c5df9-fdkh2" podStartSLOduration=3.466484767 podStartE2EDuration="3.466484767s" podCreationTimestamp="2025-09-30 07:53:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 
UTC" observedRunningTime="2025-09-30 07:53:11.461563771 +0000 UTC m=+1177.104470173" watchObservedRunningTime="2025-09-30 07:53:11.466484767 +0000 UTC m=+1177.109391209" Sep 30 07:53:11 crc kubenswrapper[4760]: I0930 07:53:11.548512 4760 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Sep 30 07:53:11 crc kubenswrapper[4760]: I0930 07:53:11.549229 4760 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="f5411630-4844-48cd-b1be-daebd1ee3e0e" containerName="nova-api-api" containerID="cri-o://851691d291f5e37354823883abe9b5439fd28670aab4d745b3d899cb068b41a0" gracePeriod=30 Sep 30 07:53:11 crc kubenswrapper[4760]: I0930 07:53:11.549400 4760 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="f5411630-4844-48cd-b1be-daebd1ee3e0e" containerName="nova-api-log" containerID="cri-o://9f8c8c5ba0d01cf0e648da859dd4514075e80f58673a32f4026e75d8efa717ef" gracePeriod=30 Sep 30 07:53:12 crc kubenswrapper[4760]: I0930 07:53:12.440570 4760 generic.go:334] "Generic (PLEG): container finished" podID="0ce11abc-b049-4106-b431-7cb6d620a2f3" containerID="fecbb134fb073373a1423db95dff2f1c1141a7db0ca2ef3eb7873949e18f9742" exitCode=0 Sep 30 07:53:12 crc kubenswrapper[4760]: I0930 07:53:12.440896 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0ce11abc-b049-4106-b431-7cb6d620a2f3","Type":"ContainerDied","Data":"fecbb134fb073373a1423db95dff2f1c1141a7db0ca2ef3eb7873949e18f9742"} Sep 30 07:53:12 crc kubenswrapper[4760]: I0930 07:53:12.444761 4760 generic.go:334] "Generic (PLEG): container finished" podID="f5411630-4844-48cd-b1be-daebd1ee3e0e" containerID="9f8c8c5ba0d01cf0e648da859dd4514075e80f58673a32f4026e75d8efa717ef" exitCode=143 Sep 30 07:53:12 crc kubenswrapper[4760]: I0930 07:53:12.445587 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" 
event={"ID":"f5411630-4844-48cd-b1be-daebd1ee3e0e","Type":"ContainerDied","Data":"9f8c8c5ba0d01cf0e648da859dd4514075e80f58673a32f4026e75d8efa717ef"} Sep 30 07:53:14 crc kubenswrapper[4760]: I0930 07:53:14.484456 4760 generic.go:334] "Generic (PLEG): container finished" podID="0ce11abc-b049-4106-b431-7cb6d620a2f3" containerID="02ba7e93747e249be16d4aa877f2f2c1f41b767ae79a05cc139c2571fb2e0073" exitCode=0 Sep 30 07:53:14 crc kubenswrapper[4760]: I0930 07:53:14.484805 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0ce11abc-b049-4106-b431-7cb6d620a2f3","Type":"ContainerDied","Data":"02ba7e93747e249be16d4aa877f2f2c1f41b767ae79a05cc139c2571fb2e0073"} Sep 30 07:53:15 crc kubenswrapper[4760]: I0930 07:53:15.198504 4760 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Sep 30 07:53:15 crc kubenswrapper[4760]: I0930 07:53:15.206650 4760 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Sep 30 07:53:15 crc kubenswrapper[4760]: I0930 07:53:15.309022 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0ce11abc-b049-4106-b431-7cb6d620a2f3-run-httpd\") pod \"0ce11abc-b049-4106-b431-7cb6d620a2f3\" (UID: \"0ce11abc-b049-4106-b431-7cb6d620a2f3\") " Sep 30 07:53:15 crc kubenswrapper[4760]: I0930 07:53:15.309058 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0ce11abc-b049-4106-b431-7cb6d620a2f3-log-httpd\") pod \"0ce11abc-b049-4106-b431-7cb6d620a2f3\" (UID: \"0ce11abc-b049-4106-b431-7cb6d620a2f3\") " Sep 30 07:53:15 crc kubenswrapper[4760]: I0930 07:53:15.309127 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f5411630-4844-48cd-b1be-daebd1ee3e0e-config-data\") pod 
\"f5411630-4844-48cd-b1be-daebd1ee3e0e\" (UID: \"f5411630-4844-48cd-b1be-daebd1ee3e0e\") " Sep 30 07:53:15 crc kubenswrapper[4760]: I0930 07:53:15.309149 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/0ce11abc-b049-4106-b431-7cb6d620a2f3-ceilometer-tls-certs\") pod \"0ce11abc-b049-4106-b431-7cb6d620a2f3\" (UID: \"0ce11abc-b049-4106-b431-7cb6d620a2f3\") " Sep 30 07:53:15 crc kubenswrapper[4760]: I0930 07:53:15.309186 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f5411630-4844-48cd-b1be-daebd1ee3e0e-combined-ca-bundle\") pod \"f5411630-4844-48cd-b1be-daebd1ee3e0e\" (UID: \"f5411630-4844-48cd-b1be-daebd1ee3e0e\") " Sep 30 07:53:15 crc kubenswrapper[4760]: I0930 07:53:15.309216 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f5411630-4844-48cd-b1be-daebd1ee3e0e-logs\") pod \"f5411630-4844-48cd-b1be-daebd1ee3e0e\" (UID: \"f5411630-4844-48cd-b1be-daebd1ee3e0e\") " Sep 30 07:53:15 crc kubenswrapper[4760]: I0930 07:53:15.309269 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0ce11abc-b049-4106-b431-7cb6d620a2f3-config-data\") pod \"0ce11abc-b049-4106-b431-7cb6d620a2f3\" (UID: \"0ce11abc-b049-4106-b431-7cb6d620a2f3\") " Sep 30 07:53:15 crc kubenswrapper[4760]: I0930 07:53:15.309320 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0ce11abc-b049-4106-b431-7cb6d620a2f3-combined-ca-bundle\") pod \"0ce11abc-b049-4106-b431-7cb6d620a2f3\" (UID: \"0ce11abc-b049-4106-b431-7cb6d620a2f3\") " Sep 30 07:53:15 crc kubenswrapper[4760]: I0930 07:53:15.309338 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"scripts\" (UniqueName: \"kubernetes.io/secret/0ce11abc-b049-4106-b431-7cb6d620a2f3-scripts\") pod \"0ce11abc-b049-4106-b431-7cb6d620a2f3\" (UID: \"0ce11abc-b049-4106-b431-7cb6d620a2f3\") " Sep 30 07:53:15 crc kubenswrapper[4760]: I0930 07:53:15.309376 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5b7c4\" (UniqueName: \"kubernetes.io/projected/f5411630-4844-48cd-b1be-daebd1ee3e0e-kube-api-access-5b7c4\") pod \"f5411630-4844-48cd-b1be-daebd1ee3e0e\" (UID: \"f5411630-4844-48cd-b1be-daebd1ee3e0e\") " Sep 30 07:53:15 crc kubenswrapper[4760]: I0930 07:53:15.309404 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jqfg8\" (UniqueName: \"kubernetes.io/projected/0ce11abc-b049-4106-b431-7cb6d620a2f3-kube-api-access-jqfg8\") pod \"0ce11abc-b049-4106-b431-7cb6d620a2f3\" (UID: \"0ce11abc-b049-4106-b431-7cb6d620a2f3\") " Sep 30 07:53:15 crc kubenswrapper[4760]: I0930 07:53:15.309424 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/0ce11abc-b049-4106-b431-7cb6d620a2f3-sg-core-conf-yaml\") pod \"0ce11abc-b049-4106-b431-7cb6d620a2f3\" (UID: \"0ce11abc-b049-4106-b431-7cb6d620a2f3\") " Sep 30 07:53:15 crc kubenswrapper[4760]: I0930 07:53:15.309621 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0ce11abc-b049-4106-b431-7cb6d620a2f3-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "0ce11abc-b049-4106-b431-7cb6d620a2f3" (UID: "0ce11abc-b049-4106-b431-7cb6d620a2f3"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 07:53:15 crc kubenswrapper[4760]: I0930 07:53:15.309769 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0ce11abc-b049-4106-b431-7cb6d620a2f3-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "0ce11abc-b049-4106-b431-7cb6d620a2f3" (UID: "0ce11abc-b049-4106-b431-7cb6d620a2f3"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 07:53:15 crc kubenswrapper[4760]: I0930 07:53:15.310630 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f5411630-4844-48cd-b1be-daebd1ee3e0e-logs" (OuterVolumeSpecName: "logs") pod "f5411630-4844-48cd-b1be-daebd1ee3e0e" (UID: "f5411630-4844-48cd-b1be-daebd1ee3e0e"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 07:53:15 crc kubenswrapper[4760]: I0930 07:53:15.311066 4760 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0ce11abc-b049-4106-b431-7cb6d620a2f3-run-httpd\") on node \"crc\" DevicePath \"\"" Sep 30 07:53:15 crc kubenswrapper[4760]: I0930 07:53:15.311087 4760 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0ce11abc-b049-4106-b431-7cb6d620a2f3-log-httpd\") on node \"crc\" DevicePath \"\"" Sep 30 07:53:15 crc kubenswrapper[4760]: I0930 07:53:15.311098 4760 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f5411630-4844-48cd-b1be-daebd1ee3e0e-logs\") on node \"crc\" DevicePath \"\"" Sep 30 07:53:15 crc kubenswrapper[4760]: I0930 07:53:15.314995 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0ce11abc-b049-4106-b431-7cb6d620a2f3-scripts" (OuterVolumeSpecName: "scripts") pod "0ce11abc-b049-4106-b431-7cb6d620a2f3" (UID: "0ce11abc-b049-4106-b431-7cb6d620a2f3"). 
InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 07:53:15 crc kubenswrapper[4760]: I0930 07:53:15.333201 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0ce11abc-b049-4106-b431-7cb6d620a2f3-kube-api-access-jqfg8" (OuterVolumeSpecName: "kube-api-access-jqfg8") pod "0ce11abc-b049-4106-b431-7cb6d620a2f3" (UID: "0ce11abc-b049-4106-b431-7cb6d620a2f3"). InnerVolumeSpecName "kube-api-access-jqfg8". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 07:53:15 crc kubenswrapper[4760]: I0930 07:53:15.355833 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f5411630-4844-48cd-b1be-daebd1ee3e0e-kube-api-access-5b7c4" (OuterVolumeSpecName: "kube-api-access-5b7c4") pod "f5411630-4844-48cd-b1be-daebd1ee3e0e" (UID: "f5411630-4844-48cd-b1be-daebd1ee3e0e"). InnerVolumeSpecName "kube-api-access-5b7c4". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 07:53:15 crc kubenswrapper[4760]: I0930 07:53:15.370968 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f5411630-4844-48cd-b1be-daebd1ee3e0e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f5411630-4844-48cd-b1be-daebd1ee3e0e" (UID: "f5411630-4844-48cd-b1be-daebd1ee3e0e"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 07:53:15 crc kubenswrapper[4760]: I0930 07:53:15.382884 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f5411630-4844-48cd-b1be-daebd1ee3e0e-config-data" (OuterVolumeSpecName: "config-data") pod "f5411630-4844-48cd-b1be-daebd1ee3e0e" (UID: "f5411630-4844-48cd-b1be-daebd1ee3e0e"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 07:53:15 crc kubenswrapper[4760]: I0930 07:53:15.413279 4760 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0ce11abc-b049-4106-b431-7cb6d620a2f3-scripts\") on node \"crc\" DevicePath \"\"" Sep 30 07:53:15 crc kubenswrapper[4760]: I0930 07:53:15.413338 4760 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5b7c4\" (UniqueName: \"kubernetes.io/projected/f5411630-4844-48cd-b1be-daebd1ee3e0e-kube-api-access-5b7c4\") on node \"crc\" DevicePath \"\"" Sep 30 07:53:15 crc kubenswrapper[4760]: I0930 07:53:15.413349 4760 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jqfg8\" (UniqueName: \"kubernetes.io/projected/0ce11abc-b049-4106-b431-7cb6d620a2f3-kube-api-access-jqfg8\") on node \"crc\" DevicePath \"\"" Sep 30 07:53:15 crc kubenswrapper[4760]: I0930 07:53:15.413357 4760 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f5411630-4844-48cd-b1be-daebd1ee3e0e-config-data\") on node \"crc\" DevicePath \"\"" Sep 30 07:53:15 crc kubenswrapper[4760]: I0930 07:53:15.413366 4760 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f5411630-4844-48cd-b1be-daebd1ee3e0e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 07:53:15 crc kubenswrapper[4760]: I0930 07:53:15.414379 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0ce11abc-b049-4106-b431-7cb6d620a2f3-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "0ce11abc-b049-4106-b431-7cb6d620a2f3" (UID: "0ce11abc-b049-4106-b431-7cb6d620a2f3"). InnerVolumeSpecName "ceilometer-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 07:53:15 crc kubenswrapper[4760]: I0930 07:53:15.417685 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0ce11abc-b049-4106-b431-7cb6d620a2f3-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "0ce11abc-b049-4106-b431-7cb6d620a2f3" (UID: "0ce11abc-b049-4106-b431-7cb6d620a2f3"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 07:53:15 crc kubenswrapper[4760]: I0930 07:53:15.466406 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0ce11abc-b049-4106-b431-7cb6d620a2f3-config-data" (OuterVolumeSpecName: "config-data") pod "0ce11abc-b049-4106-b431-7cb6d620a2f3" (UID: "0ce11abc-b049-4106-b431-7cb6d620a2f3"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 07:53:15 crc kubenswrapper[4760]: I0930 07:53:15.495565 4760 generic.go:334] "Generic (PLEG): container finished" podID="f5411630-4844-48cd-b1be-daebd1ee3e0e" containerID="851691d291f5e37354823883abe9b5439fd28670aab4d745b3d899cb068b41a0" exitCode=0 Sep 30 07:53:15 crc kubenswrapper[4760]: I0930 07:53:15.495622 4760 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Sep 30 07:53:15 crc kubenswrapper[4760]: I0930 07:53:15.495655 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"f5411630-4844-48cd-b1be-daebd1ee3e0e","Type":"ContainerDied","Data":"851691d291f5e37354823883abe9b5439fd28670aab4d745b3d899cb068b41a0"} Sep 30 07:53:15 crc kubenswrapper[4760]: I0930 07:53:15.495700 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"f5411630-4844-48cd-b1be-daebd1ee3e0e","Type":"ContainerDied","Data":"0098ab2f333aada2629dd5464fb468f4308a0180759dc3bb1c44350f3b869cf3"} Sep 30 07:53:15 crc kubenswrapper[4760]: I0930 07:53:15.495716 4760 scope.go:117] "RemoveContainer" containerID="851691d291f5e37354823883abe9b5439fd28670aab4d745b3d899cb068b41a0" Sep 30 07:53:15 crc kubenswrapper[4760]: I0930 07:53:15.499474 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0ce11abc-b049-4106-b431-7cb6d620a2f3-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "0ce11abc-b049-4106-b431-7cb6d620a2f3" (UID: "0ce11abc-b049-4106-b431-7cb6d620a2f3"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 07:53:15 crc kubenswrapper[4760]: I0930 07:53:15.499580 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0ce11abc-b049-4106-b431-7cb6d620a2f3","Type":"ContainerDied","Data":"af19916cab8a292013e130272970d4bf4e0d3924b353de3349aba95e7a0ac52d"} Sep 30 07:53:15 crc kubenswrapper[4760]: I0930 07:53:15.500342 4760 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Sep 30 07:53:15 crc kubenswrapper[4760]: I0930 07:53:15.515330 4760 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0ce11abc-b049-4106-b431-7cb6d620a2f3-config-data\") on node \"crc\" DevicePath \"\"" Sep 30 07:53:15 crc kubenswrapper[4760]: I0930 07:53:15.515351 4760 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0ce11abc-b049-4106-b431-7cb6d620a2f3-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 07:53:15 crc kubenswrapper[4760]: I0930 07:53:15.515361 4760 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/0ce11abc-b049-4106-b431-7cb6d620a2f3-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Sep 30 07:53:15 crc kubenswrapper[4760]: I0930 07:53:15.515370 4760 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/0ce11abc-b049-4106-b431-7cb6d620a2f3-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Sep 30 07:53:15 crc kubenswrapper[4760]: I0930 07:53:15.575573 4760 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Sep 30 07:53:15 crc kubenswrapper[4760]: I0930 07:53:15.583987 4760 scope.go:117] "RemoveContainer" containerID="9f8c8c5ba0d01cf0e648da859dd4514075e80f58673a32f4026e75d8efa717ef" Sep 30 07:53:15 crc kubenswrapper[4760]: I0930 07:53:15.601425 4760 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Sep 30 07:53:15 crc kubenswrapper[4760]: I0930 07:53:15.611651 4760 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Sep 30 07:53:15 crc kubenswrapper[4760]: I0930 07:53:15.612467 4760 scope.go:117] "RemoveContainer" containerID="851691d291f5e37354823883abe9b5439fd28670aab4d745b3d899cb068b41a0" Sep 30 07:53:15 crc kubenswrapper[4760]: E0930 07:53:15.613245 4760 
log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"851691d291f5e37354823883abe9b5439fd28670aab4d745b3d899cb068b41a0\": container with ID starting with 851691d291f5e37354823883abe9b5439fd28670aab4d745b3d899cb068b41a0 not found: ID does not exist" containerID="851691d291f5e37354823883abe9b5439fd28670aab4d745b3d899cb068b41a0" Sep 30 07:53:15 crc kubenswrapper[4760]: I0930 07:53:15.613270 4760 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"851691d291f5e37354823883abe9b5439fd28670aab4d745b3d899cb068b41a0"} err="failed to get container status \"851691d291f5e37354823883abe9b5439fd28670aab4d745b3d899cb068b41a0\": rpc error: code = NotFound desc = could not find container \"851691d291f5e37354823883abe9b5439fd28670aab4d745b3d899cb068b41a0\": container with ID starting with 851691d291f5e37354823883abe9b5439fd28670aab4d745b3d899cb068b41a0 not found: ID does not exist" Sep 30 07:53:15 crc kubenswrapper[4760]: I0930 07:53:15.613291 4760 scope.go:117] "RemoveContainer" containerID="9f8c8c5ba0d01cf0e648da859dd4514075e80f58673a32f4026e75d8efa717ef" Sep 30 07:53:15 crc kubenswrapper[4760]: E0930 07:53:15.619966 4760 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9f8c8c5ba0d01cf0e648da859dd4514075e80f58673a32f4026e75d8efa717ef\": container with ID starting with 9f8c8c5ba0d01cf0e648da859dd4514075e80f58673a32f4026e75d8efa717ef not found: ID does not exist" containerID="9f8c8c5ba0d01cf0e648da859dd4514075e80f58673a32f4026e75d8efa717ef" Sep 30 07:53:15 crc kubenswrapper[4760]: I0930 07:53:15.620024 4760 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9f8c8c5ba0d01cf0e648da859dd4514075e80f58673a32f4026e75d8efa717ef"} err="failed to get container status \"9f8c8c5ba0d01cf0e648da859dd4514075e80f58673a32f4026e75d8efa717ef\": rpc error: code = NotFound 
desc = could not find container \"9f8c8c5ba0d01cf0e648da859dd4514075e80f58673a32f4026e75d8efa717ef\": container with ID starting with 9f8c8c5ba0d01cf0e648da859dd4514075e80f58673a32f4026e75d8efa717ef not found: ID does not exist" Sep 30 07:53:15 crc kubenswrapper[4760]: I0930 07:53:15.620057 4760 scope.go:117] "RemoveContainer" containerID="57b5645477b0f739e66f95c6f03d57d10ad43a729c03c9ec2ee15ed9600d3c68" Sep 30 07:53:15 crc kubenswrapper[4760]: I0930 07:53:15.624951 4760 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Sep 30 07:53:15 crc kubenswrapper[4760]: I0930 07:53:15.643722 4760 scope.go:117] "RemoveContainer" containerID="024f0ff9f41ca6b0b8458be611a149892d4390d5ead651580c001b06a18012bd" Sep 30 07:53:15 crc kubenswrapper[4760]: I0930 07:53:15.659880 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Sep 30 07:53:15 crc kubenswrapper[4760]: E0930 07:53:15.660360 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0ce11abc-b049-4106-b431-7cb6d620a2f3" containerName="sg-core" Sep 30 07:53:15 crc kubenswrapper[4760]: I0930 07:53:15.660372 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="0ce11abc-b049-4106-b431-7cb6d620a2f3" containerName="sg-core" Sep 30 07:53:15 crc kubenswrapper[4760]: E0930 07:53:15.660383 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f5411630-4844-48cd-b1be-daebd1ee3e0e" containerName="nova-api-log" Sep 30 07:53:15 crc kubenswrapper[4760]: I0930 07:53:15.660389 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="f5411630-4844-48cd-b1be-daebd1ee3e0e" containerName="nova-api-log" Sep 30 07:53:15 crc kubenswrapper[4760]: E0930 07:53:15.660404 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0ce11abc-b049-4106-b431-7cb6d620a2f3" containerName="ceilometer-central-agent" Sep 30 07:53:15 crc kubenswrapper[4760]: I0930 07:53:15.660410 4760 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="0ce11abc-b049-4106-b431-7cb6d620a2f3" containerName="ceilometer-central-agent" Sep 30 07:53:15 crc kubenswrapper[4760]: E0930 07:53:15.660426 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0ce11abc-b049-4106-b431-7cb6d620a2f3" containerName="ceilometer-notification-agent" Sep 30 07:53:15 crc kubenswrapper[4760]: I0930 07:53:15.660432 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="0ce11abc-b049-4106-b431-7cb6d620a2f3" containerName="ceilometer-notification-agent" Sep 30 07:53:15 crc kubenswrapper[4760]: E0930 07:53:15.660454 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0ce11abc-b049-4106-b431-7cb6d620a2f3" containerName="proxy-httpd" Sep 30 07:53:15 crc kubenswrapper[4760]: I0930 07:53:15.660459 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="0ce11abc-b049-4106-b431-7cb6d620a2f3" containerName="proxy-httpd" Sep 30 07:53:15 crc kubenswrapper[4760]: E0930 07:53:15.660467 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f5411630-4844-48cd-b1be-daebd1ee3e0e" containerName="nova-api-api" Sep 30 07:53:15 crc kubenswrapper[4760]: I0930 07:53:15.660472 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="f5411630-4844-48cd-b1be-daebd1ee3e0e" containerName="nova-api-api" Sep 30 07:53:15 crc kubenswrapper[4760]: I0930 07:53:15.660659 4760 memory_manager.go:354] "RemoveStaleState removing state" podUID="f5411630-4844-48cd-b1be-daebd1ee3e0e" containerName="nova-api-log" Sep 30 07:53:15 crc kubenswrapper[4760]: I0930 07:53:15.660672 4760 memory_manager.go:354] "RemoveStaleState removing state" podUID="0ce11abc-b049-4106-b431-7cb6d620a2f3" containerName="sg-core" Sep 30 07:53:15 crc kubenswrapper[4760]: I0930 07:53:15.660680 4760 memory_manager.go:354] "RemoveStaleState removing state" podUID="0ce11abc-b049-4106-b431-7cb6d620a2f3" containerName="ceilometer-central-agent" Sep 30 07:53:15 crc kubenswrapper[4760]: I0930 07:53:15.660692 4760 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="f5411630-4844-48cd-b1be-daebd1ee3e0e" containerName="nova-api-api" Sep 30 07:53:15 crc kubenswrapper[4760]: I0930 07:53:15.660708 4760 memory_manager.go:354] "RemoveStaleState removing state" podUID="0ce11abc-b049-4106-b431-7cb6d620a2f3" containerName="ceilometer-notification-agent" Sep 30 07:53:15 crc kubenswrapper[4760]: I0930 07:53:15.660717 4760 memory_manager.go:354] "RemoveStaleState removing state" podUID="0ce11abc-b049-4106-b431-7cb6d620a2f3" containerName="proxy-httpd" Sep 30 07:53:15 crc kubenswrapper[4760]: I0930 07:53:15.661833 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Sep 30 07:53:15 crc kubenswrapper[4760]: I0930 07:53:15.665276 4760 scope.go:117] "RemoveContainer" containerID="02ba7e93747e249be16d4aa877f2f2c1f41b767ae79a05cc139c2571fb2e0073" Sep 30 07:53:15 crc kubenswrapper[4760]: I0930 07:53:15.665820 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Sep 30 07:53:15 crc kubenswrapper[4760]: I0930 07:53:15.666008 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc" Sep 30 07:53:15 crc kubenswrapper[4760]: I0930 07:53:15.666059 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Sep 30 07:53:15 crc kubenswrapper[4760]: I0930 07:53:15.668275 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Sep 30 07:53:15 crc kubenswrapper[4760]: I0930 07:53:15.675366 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Sep 30 07:53:15 crc kubenswrapper[4760]: I0930 07:53:15.678004 4760 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Sep 30 07:53:15 crc kubenswrapper[4760]: I0930 07:53:15.683612 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Sep 30 07:53:15 crc kubenswrapper[4760]: I0930 07:53:15.683825 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Sep 30 07:53:15 crc kubenswrapper[4760]: I0930 07:53:15.685724 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Sep 30 07:53:15 crc kubenswrapper[4760]: I0930 07:53:15.687434 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Sep 30 07:53:15 crc kubenswrapper[4760]: I0930 07:53:15.705324 4760 scope.go:117] "RemoveContainer" containerID="fecbb134fb073373a1423db95dff2f1c1141a7db0ca2ef3eb7873949e18f9742" Sep 30 07:53:15 crc kubenswrapper[4760]: I0930 07:53:15.719443 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5c70743c-2be6-4c97-aaa8-fe22bd306c7d-log-httpd\") pod \"ceilometer-0\" (UID: \"5c70743c-2be6-4c97-aaa8-fe22bd306c7d\") " pod="openstack/ceilometer-0" Sep 30 07:53:15 crc kubenswrapper[4760]: I0930 07:53:15.720107 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/5c70743c-2be6-4c97-aaa8-fe22bd306c7d-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"5c70743c-2be6-4c97-aaa8-fe22bd306c7d\") " pod="openstack/ceilometer-0" Sep 30 07:53:15 crc kubenswrapper[4760]: I0930 07:53:15.720140 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5c70743c-2be6-4c97-aaa8-fe22bd306c7d-run-httpd\") pod \"ceilometer-0\" (UID: \"5c70743c-2be6-4c97-aaa8-fe22bd306c7d\") " 
pod="openstack/ceilometer-0" Sep 30 07:53:15 crc kubenswrapper[4760]: I0930 07:53:15.720466 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cf057c97-896f-4227-981e-fe789d69a119-logs\") pod \"nova-api-0\" (UID: \"cf057c97-896f-4227-981e-fe789d69a119\") " pod="openstack/nova-api-0" Sep 30 07:53:15 crc kubenswrapper[4760]: I0930 07:53:15.720572 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cf057c97-896f-4227-981e-fe789d69a119-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"cf057c97-896f-4227-981e-fe789d69a119\") " pod="openstack/nova-api-0" Sep 30 07:53:15 crc kubenswrapper[4760]: I0930 07:53:15.720628 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/cf057c97-896f-4227-981e-fe789d69a119-internal-tls-certs\") pod \"nova-api-0\" (UID: \"cf057c97-896f-4227-981e-fe789d69a119\") " pod="openstack/nova-api-0" Sep 30 07:53:15 crc kubenswrapper[4760]: I0930 07:53:15.720657 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5c70743c-2be6-4c97-aaa8-fe22bd306c7d-scripts\") pod \"ceilometer-0\" (UID: \"5c70743c-2be6-4c97-aaa8-fe22bd306c7d\") " pod="openstack/ceilometer-0" Sep 30 07:53:15 crc kubenswrapper[4760]: I0930 07:53:15.720674 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dpwmf\" (UniqueName: \"kubernetes.io/projected/cf057c97-896f-4227-981e-fe789d69a119-kube-api-access-dpwmf\") pod \"nova-api-0\" (UID: \"cf057c97-896f-4227-981e-fe789d69a119\") " pod="openstack/nova-api-0" Sep 30 07:53:15 crc kubenswrapper[4760]: I0930 07:53:15.720864 4760 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/cf057c97-896f-4227-981e-fe789d69a119-public-tls-certs\") pod \"nova-api-0\" (UID: \"cf057c97-896f-4227-981e-fe789d69a119\") " pod="openstack/nova-api-0" Sep 30 07:53:15 crc kubenswrapper[4760]: I0930 07:53:15.720979 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/5c70743c-2be6-4c97-aaa8-fe22bd306c7d-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"5c70743c-2be6-4c97-aaa8-fe22bd306c7d\") " pod="openstack/ceilometer-0" Sep 30 07:53:15 crc kubenswrapper[4760]: I0930 07:53:15.721011 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5c70743c-2be6-4c97-aaa8-fe22bd306c7d-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"5c70743c-2be6-4c97-aaa8-fe22bd306c7d\") " pod="openstack/ceilometer-0" Sep 30 07:53:15 crc kubenswrapper[4760]: I0930 07:53:15.721110 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5c70743c-2be6-4c97-aaa8-fe22bd306c7d-config-data\") pod \"ceilometer-0\" (UID: \"5c70743c-2be6-4c97-aaa8-fe22bd306c7d\") " pod="openstack/ceilometer-0" Sep 30 07:53:15 crc kubenswrapper[4760]: I0930 07:53:15.721197 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j67st\" (UniqueName: \"kubernetes.io/projected/5c70743c-2be6-4c97-aaa8-fe22bd306c7d-kube-api-access-j67st\") pod \"ceilometer-0\" (UID: \"5c70743c-2be6-4c97-aaa8-fe22bd306c7d\") " pod="openstack/ceilometer-0" Sep 30 07:53:15 crc kubenswrapper[4760]: I0930 07:53:15.721342 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/cf057c97-896f-4227-981e-fe789d69a119-config-data\") pod \"nova-api-0\" (UID: \"cf057c97-896f-4227-981e-fe789d69a119\") " pod="openstack/nova-api-0" Sep 30 07:53:15 crc kubenswrapper[4760]: I0930 07:53:15.763995 4760 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-cell1-novncproxy-0" Sep 30 07:53:15 crc kubenswrapper[4760]: I0930 07:53:15.779816 4760 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-cell1-novncproxy-0" Sep 30 07:53:15 crc kubenswrapper[4760]: I0930 07:53:15.783196 4760 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Sep 30 07:53:15 crc kubenswrapper[4760]: I0930 07:53:15.783288 4760 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Sep 30 07:53:15 crc kubenswrapper[4760]: I0930 07:53:15.822777 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/5c70743c-2be6-4c97-aaa8-fe22bd306c7d-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"5c70743c-2be6-4c97-aaa8-fe22bd306c7d\") " pod="openstack/ceilometer-0" Sep 30 07:53:15 crc kubenswrapper[4760]: I0930 07:53:15.822825 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5c70743c-2be6-4c97-aaa8-fe22bd306c7d-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"5c70743c-2be6-4c97-aaa8-fe22bd306c7d\") " pod="openstack/ceilometer-0" Sep 30 07:53:15 crc kubenswrapper[4760]: I0930 07:53:15.822887 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5c70743c-2be6-4c97-aaa8-fe22bd306c7d-config-data\") pod \"ceilometer-0\" (UID: \"5c70743c-2be6-4c97-aaa8-fe22bd306c7d\") " pod="openstack/ceilometer-0" Sep 30 07:53:15 crc kubenswrapper[4760]: I0930 
07:53:15.822925 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j67st\" (UniqueName: \"kubernetes.io/projected/5c70743c-2be6-4c97-aaa8-fe22bd306c7d-kube-api-access-j67st\") pod \"ceilometer-0\" (UID: \"5c70743c-2be6-4c97-aaa8-fe22bd306c7d\") " pod="openstack/ceilometer-0" Sep 30 07:53:15 crc kubenswrapper[4760]: I0930 07:53:15.822948 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cf057c97-896f-4227-981e-fe789d69a119-config-data\") pod \"nova-api-0\" (UID: \"cf057c97-896f-4227-981e-fe789d69a119\") " pod="openstack/nova-api-0" Sep 30 07:53:15 crc kubenswrapper[4760]: I0930 07:53:15.822980 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5c70743c-2be6-4c97-aaa8-fe22bd306c7d-log-httpd\") pod \"ceilometer-0\" (UID: \"5c70743c-2be6-4c97-aaa8-fe22bd306c7d\") " pod="openstack/ceilometer-0" Sep 30 07:53:15 crc kubenswrapper[4760]: I0930 07:53:15.823015 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/5c70743c-2be6-4c97-aaa8-fe22bd306c7d-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"5c70743c-2be6-4c97-aaa8-fe22bd306c7d\") " pod="openstack/ceilometer-0" Sep 30 07:53:15 crc kubenswrapper[4760]: I0930 07:53:15.823041 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5c70743c-2be6-4c97-aaa8-fe22bd306c7d-run-httpd\") pod \"ceilometer-0\" (UID: \"5c70743c-2be6-4c97-aaa8-fe22bd306c7d\") " pod="openstack/ceilometer-0" Sep 30 07:53:15 crc kubenswrapper[4760]: I0930 07:53:15.823093 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cf057c97-896f-4227-981e-fe789d69a119-logs\") pod \"nova-api-0\" (UID: 
\"cf057c97-896f-4227-981e-fe789d69a119\") " pod="openstack/nova-api-0" Sep 30 07:53:15 crc kubenswrapper[4760]: I0930 07:53:15.823167 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cf057c97-896f-4227-981e-fe789d69a119-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"cf057c97-896f-4227-981e-fe789d69a119\") " pod="openstack/nova-api-0" Sep 30 07:53:15 crc kubenswrapper[4760]: I0930 07:53:15.823208 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/cf057c97-896f-4227-981e-fe789d69a119-internal-tls-certs\") pod \"nova-api-0\" (UID: \"cf057c97-896f-4227-981e-fe789d69a119\") " pod="openstack/nova-api-0" Sep 30 07:53:15 crc kubenswrapper[4760]: I0930 07:53:15.823238 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dpwmf\" (UniqueName: \"kubernetes.io/projected/cf057c97-896f-4227-981e-fe789d69a119-kube-api-access-dpwmf\") pod \"nova-api-0\" (UID: \"cf057c97-896f-4227-981e-fe789d69a119\") " pod="openstack/nova-api-0" Sep 30 07:53:15 crc kubenswrapper[4760]: I0930 07:53:15.823258 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5c70743c-2be6-4c97-aaa8-fe22bd306c7d-scripts\") pod \"ceilometer-0\" (UID: \"5c70743c-2be6-4c97-aaa8-fe22bd306c7d\") " pod="openstack/ceilometer-0" Sep 30 07:53:15 crc kubenswrapper[4760]: I0930 07:53:15.823295 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/cf057c97-896f-4227-981e-fe789d69a119-public-tls-certs\") pod \"nova-api-0\" (UID: \"cf057c97-896f-4227-981e-fe789d69a119\") " pod="openstack/nova-api-0" Sep 30 07:53:15 crc kubenswrapper[4760]: I0930 07:53:15.824360 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" 
(UniqueName: \"kubernetes.io/empty-dir/5c70743c-2be6-4c97-aaa8-fe22bd306c7d-run-httpd\") pod \"ceilometer-0\" (UID: \"5c70743c-2be6-4c97-aaa8-fe22bd306c7d\") " pod="openstack/ceilometer-0" Sep 30 07:53:15 crc kubenswrapper[4760]: I0930 07:53:15.826704 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/cf057c97-896f-4227-981e-fe789d69a119-public-tls-certs\") pod \"nova-api-0\" (UID: \"cf057c97-896f-4227-981e-fe789d69a119\") " pod="openstack/nova-api-0" Sep 30 07:53:15 crc kubenswrapper[4760]: I0930 07:53:15.827469 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cf057c97-896f-4227-981e-fe789d69a119-logs\") pod \"nova-api-0\" (UID: \"cf057c97-896f-4227-981e-fe789d69a119\") " pod="openstack/nova-api-0" Sep 30 07:53:15 crc kubenswrapper[4760]: I0930 07:53:15.828044 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/5c70743c-2be6-4c97-aaa8-fe22bd306c7d-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"5c70743c-2be6-4c97-aaa8-fe22bd306c7d\") " pod="openstack/ceilometer-0" Sep 30 07:53:15 crc kubenswrapper[4760]: I0930 07:53:15.830355 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5c70743c-2be6-4c97-aaa8-fe22bd306c7d-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"5c70743c-2be6-4c97-aaa8-fe22bd306c7d\") " pod="openstack/ceilometer-0" Sep 30 07:53:15 crc kubenswrapper[4760]: I0930 07:53:15.830690 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cf057c97-896f-4227-981e-fe789d69a119-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"cf057c97-896f-4227-981e-fe789d69a119\") " pod="openstack/nova-api-0" Sep 30 07:53:15 crc kubenswrapper[4760]: I0930 07:53:15.831267 4760 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5c70743c-2be6-4c97-aaa8-fe22bd306c7d-log-httpd\") pod \"ceilometer-0\" (UID: \"5c70743c-2be6-4c97-aaa8-fe22bd306c7d\") " pod="openstack/ceilometer-0" Sep 30 07:53:15 crc kubenswrapper[4760]: I0930 07:53:15.833582 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5c70743c-2be6-4c97-aaa8-fe22bd306c7d-scripts\") pod \"ceilometer-0\" (UID: \"5c70743c-2be6-4c97-aaa8-fe22bd306c7d\") " pod="openstack/ceilometer-0" Sep 30 07:53:15 crc kubenswrapper[4760]: I0930 07:53:15.834356 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/cf057c97-896f-4227-981e-fe789d69a119-internal-tls-certs\") pod \"nova-api-0\" (UID: \"cf057c97-896f-4227-981e-fe789d69a119\") " pod="openstack/nova-api-0" Sep 30 07:53:15 crc kubenswrapper[4760]: I0930 07:53:15.835854 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/5c70743c-2be6-4c97-aaa8-fe22bd306c7d-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"5c70743c-2be6-4c97-aaa8-fe22bd306c7d\") " pod="openstack/ceilometer-0" Sep 30 07:53:15 crc kubenswrapper[4760]: I0930 07:53:15.835949 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cf057c97-896f-4227-981e-fe789d69a119-config-data\") pod \"nova-api-0\" (UID: \"cf057c97-896f-4227-981e-fe789d69a119\") " pod="openstack/nova-api-0" Sep 30 07:53:15 crc kubenswrapper[4760]: I0930 07:53:15.837131 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5c70743c-2be6-4c97-aaa8-fe22bd306c7d-config-data\") pod \"ceilometer-0\" (UID: \"5c70743c-2be6-4c97-aaa8-fe22bd306c7d\") " pod="openstack/ceilometer-0" Sep 30 
07:53:15 crc kubenswrapper[4760]: I0930 07:53:15.846787 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j67st\" (UniqueName: \"kubernetes.io/projected/5c70743c-2be6-4c97-aaa8-fe22bd306c7d-kube-api-access-j67st\") pod \"ceilometer-0\" (UID: \"5c70743c-2be6-4c97-aaa8-fe22bd306c7d\") " pod="openstack/ceilometer-0" Sep 30 07:53:15 crc kubenswrapper[4760]: I0930 07:53:15.849960 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dpwmf\" (UniqueName: \"kubernetes.io/projected/cf057c97-896f-4227-981e-fe789d69a119-kube-api-access-dpwmf\") pod \"nova-api-0\" (UID: \"cf057c97-896f-4227-981e-fe789d69a119\") " pod="openstack/nova-api-0" Sep 30 07:53:15 crc kubenswrapper[4760]: I0930 07:53:15.984765 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Sep 30 07:53:15 crc kubenswrapper[4760]: I0930 07:53:15.996236 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Sep 30 07:53:16 crc kubenswrapper[4760]: I0930 07:53:16.535290 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-novncproxy-0" Sep 30 07:53:16 crc kubenswrapper[4760]: I0930 07:53:16.535851 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Sep 30 07:53:16 crc kubenswrapper[4760]: I0930 07:53:16.648102 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Sep 30 07:53:16 crc kubenswrapper[4760]: I0930 07:53:16.684096 4760 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Sep 30 07:53:16 crc kubenswrapper[4760]: I0930 07:53:16.797509 4760 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="1d3f6c3d-070f-494e-8b47-856d54de039c" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.214:8775/\": net/http: request 
canceled (Client.Timeout exceeded while awaiting headers)" Sep 30 07:53:16 crc kubenswrapper[4760]: I0930 07:53:16.797509 4760 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="1d3f6c3d-070f-494e-8b47-856d54de039c" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.214:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Sep 30 07:53:17 crc kubenswrapper[4760]: I0930 07:53:17.019725 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-cell-mapping-blsm9"] Sep 30 07:53:17 crc kubenswrapper[4760]: I0930 07:53:17.021340 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-blsm9" Sep 30 07:53:17 crc kubenswrapper[4760]: I0930 07:53:17.027978 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-scripts" Sep 30 07:53:17 crc kubenswrapper[4760]: I0930 07:53:17.028270 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-config-data" Sep 30 07:53:17 crc kubenswrapper[4760]: I0930 07:53:17.054583 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-blsm9"] Sep 30 07:53:17 crc kubenswrapper[4760]: I0930 07:53:17.056113 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-chvzz\" (UniqueName: \"kubernetes.io/projected/cb3756b4-ea8a-41f4-a8db-351088780965-kube-api-access-chvzz\") pod \"nova-cell1-cell-mapping-blsm9\" (UID: \"cb3756b4-ea8a-41f4-a8db-351088780965\") " pod="openstack/nova-cell1-cell-mapping-blsm9" Sep 30 07:53:17 crc kubenswrapper[4760]: I0930 07:53:17.056182 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cb3756b4-ea8a-41f4-a8db-351088780965-combined-ca-bundle\") 
pod \"nova-cell1-cell-mapping-blsm9\" (UID: \"cb3756b4-ea8a-41f4-a8db-351088780965\") " pod="openstack/nova-cell1-cell-mapping-blsm9" Sep 30 07:53:17 crc kubenswrapper[4760]: I0930 07:53:17.056261 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cb3756b4-ea8a-41f4-a8db-351088780965-scripts\") pod \"nova-cell1-cell-mapping-blsm9\" (UID: \"cb3756b4-ea8a-41f4-a8db-351088780965\") " pod="openstack/nova-cell1-cell-mapping-blsm9" Sep 30 07:53:17 crc kubenswrapper[4760]: I0930 07:53:17.056359 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cb3756b4-ea8a-41f4-a8db-351088780965-config-data\") pod \"nova-cell1-cell-mapping-blsm9\" (UID: \"cb3756b4-ea8a-41f4-a8db-351088780965\") " pod="openstack/nova-cell1-cell-mapping-blsm9" Sep 30 07:53:17 crc kubenswrapper[4760]: I0930 07:53:17.086235 4760 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0ce11abc-b049-4106-b431-7cb6d620a2f3" path="/var/lib/kubelet/pods/0ce11abc-b049-4106-b431-7cb6d620a2f3/volumes" Sep 30 07:53:17 crc kubenswrapper[4760]: I0930 07:53:17.087962 4760 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f5411630-4844-48cd-b1be-daebd1ee3e0e" path="/var/lib/kubelet/pods/f5411630-4844-48cd-b1be-daebd1ee3e0e/volumes" Sep 30 07:53:17 crc kubenswrapper[4760]: I0930 07:53:17.158487 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cb3756b4-ea8a-41f4-a8db-351088780965-scripts\") pod \"nova-cell1-cell-mapping-blsm9\" (UID: \"cb3756b4-ea8a-41f4-a8db-351088780965\") " pod="openstack/nova-cell1-cell-mapping-blsm9" Sep 30 07:53:17 crc kubenswrapper[4760]: I0930 07:53:17.158631 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/cb3756b4-ea8a-41f4-a8db-351088780965-config-data\") pod \"nova-cell1-cell-mapping-blsm9\" (UID: \"cb3756b4-ea8a-41f4-a8db-351088780965\") " pod="openstack/nova-cell1-cell-mapping-blsm9" Sep 30 07:53:17 crc kubenswrapper[4760]: I0930 07:53:17.158813 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-chvzz\" (UniqueName: \"kubernetes.io/projected/cb3756b4-ea8a-41f4-a8db-351088780965-kube-api-access-chvzz\") pod \"nova-cell1-cell-mapping-blsm9\" (UID: \"cb3756b4-ea8a-41f4-a8db-351088780965\") " pod="openstack/nova-cell1-cell-mapping-blsm9" Sep 30 07:53:17 crc kubenswrapper[4760]: I0930 07:53:17.158884 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cb3756b4-ea8a-41f4-a8db-351088780965-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-blsm9\" (UID: \"cb3756b4-ea8a-41f4-a8db-351088780965\") " pod="openstack/nova-cell1-cell-mapping-blsm9" Sep 30 07:53:17 crc kubenswrapper[4760]: I0930 07:53:17.162998 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cb3756b4-ea8a-41f4-a8db-351088780965-config-data\") pod \"nova-cell1-cell-mapping-blsm9\" (UID: \"cb3756b4-ea8a-41f4-a8db-351088780965\") " pod="openstack/nova-cell1-cell-mapping-blsm9" Sep 30 07:53:17 crc kubenswrapper[4760]: I0930 07:53:17.163999 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cb3756b4-ea8a-41f4-a8db-351088780965-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-blsm9\" (UID: \"cb3756b4-ea8a-41f4-a8db-351088780965\") " pod="openstack/nova-cell1-cell-mapping-blsm9" Sep 30 07:53:17 crc kubenswrapper[4760]: I0930 07:53:17.164150 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/cb3756b4-ea8a-41f4-a8db-351088780965-scripts\") pod \"nova-cell1-cell-mapping-blsm9\" (UID: \"cb3756b4-ea8a-41f4-a8db-351088780965\") " pod="openstack/nova-cell1-cell-mapping-blsm9" Sep 30 07:53:17 crc kubenswrapper[4760]: I0930 07:53:17.177135 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-chvzz\" (UniqueName: \"kubernetes.io/projected/cb3756b4-ea8a-41f4-a8db-351088780965-kube-api-access-chvzz\") pod \"nova-cell1-cell-mapping-blsm9\" (UID: \"cb3756b4-ea8a-41f4-a8db-351088780965\") " pod="openstack/nova-cell1-cell-mapping-blsm9" Sep 30 07:53:17 crc kubenswrapper[4760]: I0930 07:53:17.343990 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-blsm9" Sep 30 07:53:17 crc kubenswrapper[4760]: I0930 07:53:17.568866 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"cf057c97-896f-4227-981e-fe789d69a119","Type":"ContainerStarted","Data":"39d618b7372fbcb067cdc0531416c5a050f84171d9e57ca750ac8aca8597aa8b"} Sep 30 07:53:17 crc kubenswrapper[4760]: I0930 07:53:17.568917 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"cf057c97-896f-4227-981e-fe789d69a119","Type":"ContainerStarted","Data":"3518b000cfdc4498222581aaea93f2afcf87cc47f121f59d651915b47477ebd7"} Sep 30 07:53:17 crc kubenswrapper[4760]: I0930 07:53:17.568931 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"cf057c97-896f-4227-981e-fe789d69a119","Type":"ContainerStarted","Data":"6765182e6fe0b2eb312b9c3a3c7457cf8f91060c3d38d0cadbf227d892f4c1ad"} Sep 30 07:53:17 crc kubenswrapper[4760]: I0930 07:53:17.582917 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5c70743c-2be6-4c97-aaa8-fe22bd306c7d","Type":"ContainerStarted","Data":"e33e2881a0008f26616817e2bf79f5286653ab75ac34ef21d98cdc0f74f53204"} Sep 30 07:53:17 crc 
kubenswrapper[4760]: I0930 07:53:17.608623 4760 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.608597477 podStartE2EDuration="2.608597477s" podCreationTimestamp="2025-09-30 07:53:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 07:53:17.590904305 +0000 UTC m=+1183.233810717" watchObservedRunningTime="2025-09-30 07:53:17.608597477 +0000 UTC m=+1183.251503879" Sep 30 07:53:17 crc kubenswrapper[4760]: I0930 07:53:17.840402 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-blsm9"] Sep 30 07:53:17 crc kubenswrapper[4760]: W0930 07:53:17.847946 4760 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podcb3756b4_ea8a_41f4_a8db_351088780965.slice/crio-581cdc6b31f075f8448f56d7de74a1ec4fcbe262162f1ef4a7ecd30bf9d93511 WatchSource:0}: Error finding container 581cdc6b31f075f8448f56d7de74a1ec4fcbe262162f1ef4a7ecd30bf9d93511: Status 404 returned error can't find the container with id 581cdc6b31f075f8448f56d7de74a1ec4fcbe262162f1ef4a7ecd30bf9d93511 Sep 30 07:53:18 crc kubenswrapper[4760]: I0930 07:53:18.597641 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-blsm9" event={"ID":"cb3756b4-ea8a-41f4-a8db-351088780965","Type":"ContainerStarted","Data":"2cb9abb39970e150113471fc805983549a28fa77d5535df31719c6613410f635"} Sep 30 07:53:18 crc kubenswrapper[4760]: I0930 07:53:18.597971 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-blsm9" event={"ID":"cb3756b4-ea8a-41f4-a8db-351088780965","Type":"ContainerStarted","Data":"581cdc6b31f075f8448f56d7de74a1ec4fcbe262162f1ef4a7ecd30bf9d93511"} Sep 30 07:53:18 crc kubenswrapper[4760]: I0930 07:53:18.605542 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/ceilometer-0" event={"ID":"5c70743c-2be6-4c97-aaa8-fe22bd306c7d","Type":"ContainerStarted","Data":"1b3eaa302342ebbe9cec960b9356f49406e6f05bc99a7cea37dd735d9b7a2dac"} Sep 30 07:53:18 crc kubenswrapper[4760]: I0930 07:53:18.617071 4760 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-cell-mapping-blsm9" podStartSLOduration=2.617053557 podStartE2EDuration="2.617053557s" podCreationTimestamp="2025-09-30 07:53:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 07:53:18.611362642 +0000 UTC m=+1184.254269054" watchObservedRunningTime="2025-09-30 07:53:18.617053557 +0000 UTC m=+1184.259959969" Sep 30 07:53:19 crc kubenswrapper[4760]: I0930 07:53:19.225219 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-5c7b6c5df9-fdkh2" Sep 30 07:53:19 crc kubenswrapper[4760]: I0930 07:53:19.280200 4760 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-865f5d856f-g6596"] Sep 30 07:53:19 crc kubenswrapper[4760]: I0930 07:53:19.280483 4760 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-865f5d856f-g6596" podUID="ce9663f0-6f75-4dd2-bb60-faf428321df0" containerName="dnsmasq-dns" containerID="cri-o://891ea8af8a08e1995052ece449e6d3ad7ac43b37140fd454cb1de02f6360e8de" gracePeriod=10 Sep 30 07:53:19 crc kubenswrapper[4760]: I0930 07:53:19.301856 4760 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-865f5d856f-g6596" podUID="ce9663f0-6f75-4dd2-bb60-faf428321df0" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.206:5353: connect: connection refused" Sep 30 07:53:19 crc kubenswrapper[4760]: I0930 07:53:19.649665 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"5c70743c-2be6-4c97-aaa8-fe22bd306c7d","Type":"ContainerStarted","Data":"c3fa15fffaa1b928d916fcc3546c7133f870df9e02ee3d23707021c518f515de"} Sep 30 07:53:19 crc kubenswrapper[4760]: I0930 07:53:19.656875 4760 generic.go:334] "Generic (PLEG): container finished" podID="ce9663f0-6f75-4dd2-bb60-faf428321df0" containerID="891ea8af8a08e1995052ece449e6d3ad7ac43b37140fd454cb1de02f6360e8de" exitCode=0 Sep 30 07:53:19 crc kubenswrapper[4760]: I0930 07:53:19.657736 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-865f5d856f-g6596" event={"ID":"ce9663f0-6f75-4dd2-bb60-faf428321df0","Type":"ContainerDied","Data":"891ea8af8a08e1995052ece449e6d3ad7ac43b37140fd454cb1de02f6360e8de"} Sep 30 07:53:19 crc kubenswrapper[4760]: I0930 07:53:19.948539 4760 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-865f5d856f-g6596" Sep 30 07:53:20 crc kubenswrapper[4760]: I0930 07:53:20.039601 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ce9663f0-6f75-4dd2-bb60-faf428321df0-config\") pod \"ce9663f0-6f75-4dd2-bb60-faf428321df0\" (UID: \"ce9663f0-6f75-4dd2-bb60-faf428321df0\") " Sep 30 07:53:20 crc kubenswrapper[4760]: I0930 07:53:20.039765 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ce9663f0-6f75-4dd2-bb60-faf428321df0-ovsdbserver-nb\") pod \"ce9663f0-6f75-4dd2-bb60-faf428321df0\" (UID: \"ce9663f0-6f75-4dd2-bb60-faf428321df0\") " Sep 30 07:53:20 crc kubenswrapper[4760]: I0930 07:53:20.039812 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ce9663f0-6f75-4dd2-bb60-faf428321df0-dns-svc\") pod \"ce9663f0-6f75-4dd2-bb60-faf428321df0\" (UID: \"ce9663f0-6f75-4dd2-bb60-faf428321df0\") " Sep 30 07:53:20 crc kubenswrapper[4760]: I0930 
07:53:20.039917 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f24ph\" (UniqueName: \"kubernetes.io/projected/ce9663f0-6f75-4dd2-bb60-faf428321df0-kube-api-access-f24ph\") pod \"ce9663f0-6f75-4dd2-bb60-faf428321df0\" (UID: \"ce9663f0-6f75-4dd2-bb60-faf428321df0\") " Sep 30 07:53:20 crc kubenswrapper[4760]: I0930 07:53:20.040000 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ce9663f0-6f75-4dd2-bb60-faf428321df0-dns-swift-storage-0\") pod \"ce9663f0-6f75-4dd2-bb60-faf428321df0\" (UID: \"ce9663f0-6f75-4dd2-bb60-faf428321df0\") " Sep 30 07:53:20 crc kubenswrapper[4760]: I0930 07:53:20.040067 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ce9663f0-6f75-4dd2-bb60-faf428321df0-ovsdbserver-sb\") pod \"ce9663f0-6f75-4dd2-bb60-faf428321df0\" (UID: \"ce9663f0-6f75-4dd2-bb60-faf428321df0\") " Sep 30 07:53:20 crc kubenswrapper[4760]: I0930 07:53:20.048750 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ce9663f0-6f75-4dd2-bb60-faf428321df0-kube-api-access-f24ph" (OuterVolumeSpecName: "kube-api-access-f24ph") pod "ce9663f0-6f75-4dd2-bb60-faf428321df0" (UID: "ce9663f0-6f75-4dd2-bb60-faf428321df0"). InnerVolumeSpecName "kube-api-access-f24ph". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 07:53:20 crc kubenswrapper[4760]: I0930 07:53:20.116775 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ce9663f0-6f75-4dd2-bb60-faf428321df0-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "ce9663f0-6f75-4dd2-bb60-faf428321df0" (UID: "ce9663f0-6f75-4dd2-bb60-faf428321df0"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 07:53:20 crc kubenswrapper[4760]: I0930 07:53:20.133184 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ce9663f0-6f75-4dd2-bb60-faf428321df0-config" (OuterVolumeSpecName: "config") pod "ce9663f0-6f75-4dd2-bb60-faf428321df0" (UID: "ce9663f0-6f75-4dd2-bb60-faf428321df0"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 07:53:20 crc kubenswrapper[4760]: I0930 07:53:20.142643 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ce9663f0-6f75-4dd2-bb60-faf428321df0-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "ce9663f0-6f75-4dd2-bb60-faf428321df0" (UID: "ce9663f0-6f75-4dd2-bb60-faf428321df0"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 07:53:20 crc kubenswrapper[4760]: I0930 07:53:20.145241 4760 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ce9663f0-6f75-4dd2-bb60-faf428321df0-config\") on node \"crc\" DevicePath \"\"" Sep 30 07:53:20 crc kubenswrapper[4760]: I0930 07:53:20.145273 4760 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ce9663f0-6f75-4dd2-bb60-faf428321df0-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Sep 30 07:53:20 crc kubenswrapper[4760]: I0930 07:53:20.145285 4760 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ce9663f0-6f75-4dd2-bb60-faf428321df0-dns-svc\") on node \"crc\" DevicePath \"\"" Sep 30 07:53:20 crc kubenswrapper[4760]: I0930 07:53:20.145295 4760 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f24ph\" (UniqueName: \"kubernetes.io/projected/ce9663f0-6f75-4dd2-bb60-faf428321df0-kube-api-access-f24ph\") on node \"crc\" DevicePath \"\"" Sep 30 07:53:20 crc 
kubenswrapper[4760]: I0930 07:53:20.147969 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ce9663f0-6f75-4dd2-bb60-faf428321df0-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "ce9663f0-6f75-4dd2-bb60-faf428321df0" (UID: "ce9663f0-6f75-4dd2-bb60-faf428321df0"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 07:53:20 crc kubenswrapper[4760]: I0930 07:53:20.151572 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ce9663f0-6f75-4dd2-bb60-faf428321df0-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "ce9663f0-6f75-4dd2-bb60-faf428321df0" (UID: "ce9663f0-6f75-4dd2-bb60-faf428321df0"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 07:53:20 crc kubenswrapper[4760]: I0930 07:53:20.247914 4760 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ce9663f0-6f75-4dd2-bb60-faf428321df0-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Sep 30 07:53:20 crc kubenswrapper[4760]: I0930 07:53:20.248203 4760 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ce9663f0-6f75-4dd2-bb60-faf428321df0-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Sep 30 07:53:20 crc kubenswrapper[4760]: I0930 07:53:20.668335 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-865f5d856f-g6596" event={"ID":"ce9663f0-6f75-4dd2-bb60-faf428321df0","Type":"ContainerDied","Data":"7c8618ed1e723e71d30c9a74423ce6a114f6d7a851e78b24819d14b1647bb0e7"} Sep 30 07:53:20 crc kubenswrapper[4760]: I0930 07:53:20.668645 4760 scope.go:117] "RemoveContainer" containerID="891ea8af8a08e1995052ece449e6d3ad7ac43b37140fd454cb1de02f6360e8de" Sep 30 07:53:20 crc kubenswrapper[4760]: I0930 07:53:20.668389 4760 util.go:48] "No 
ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-865f5d856f-g6596" Sep 30 07:53:20 crc kubenswrapper[4760]: I0930 07:53:20.670645 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5c70743c-2be6-4c97-aaa8-fe22bd306c7d","Type":"ContainerStarted","Data":"0b6a66e27586a9fd31a2d67d97ff136802780294e290f3c1e20b60beee63f2cf"} Sep 30 07:53:20 crc kubenswrapper[4760]: I0930 07:53:20.702871 4760 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-865f5d856f-g6596"] Sep 30 07:53:20 crc kubenswrapper[4760]: I0930 07:53:20.703813 4760 scope.go:117] "RemoveContainer" containerID="852c3e852888c560a155ec12eaca30e1d600594862e74176527525d120547e75" Sep 30 07:53:20 crc kubenswrapper[4760]: I0930 07:53:20.712281 4760 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-865f5d856f-g6596"] Sep 30 07:53:21 crc kubenswrapper[4760]: I0930 07:53:21.080897 4760 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ce9663f0-6f75-4dd2-bb60-faf428321df0" path="/var/lib/kubelet/pods/ce9663f0-6f75-4dd2-bb60-faf428321df0/volumes" Sep 30 07:53:21 crc kubenswrapper[4760]: I0930 07:53:21.682877 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5c70743c-2be6-4c97-aaa8-fe22bd306c7d","Type":"ContainerStarted","Data":"ab700e1b6c01bdb1d4772225e414db7a9d6d6af7ec9df9d6a5f46dd24924657e"} Sep 30 07:53:21 crc kubenswrapper[4760]: I0930 07:53:21.683279 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Sep 30 07:53:21 crc kubenswrapper[4760]: I0930 07:53:21.710492 4760 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.530009207 podStartE2EDuration="6.710474983s" podCreationTimestamp="2025-09-30 07:53:15 +0000 UTC" firstStartedPulling="2025-09-30 07:53:16.683509105 +0000 UTC m=+1182.326415517" 
lastFinishedPulling="2025-09-30 07:53:20.863974881 +0000 UTC m=+1186.506881293" observedRunningTime="2025-09-30 07:53:21.706679286 +0000 UTC m=+1187.349585708" watchObservedRunningTime="2025-09-30 07:53:21.710474983 +0000 UTC m=+1187.353381395" Sep 30 07:53:23 crc kubenswrapper[4760]: I0930 07:53:23.711039 4760 generic.go:334] "Generic (PLEG): container finished" podID="cb3756b4-ea8a-41f4-a8db-351088780965" containerID="2cb9abb39970e150113471fc805983549a28fa77d5535df31719c6613410f635" exitCode=0 Sep 30 07:53:23 crc kubenswrapper[4760]: I0930 07:53:23.711456 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-blsm9" event={"ID":"cb3756b4-ea8a-41f4-a8db-351088780965","Type":"ContainerDied","Data":"2cb9abb39970e150113471fc805983549a28fa77d5535df31719c6613410f635"} Sep 30 07:53:25 crc kubenswrapper[4760]: I0930 07:53:25.147525 4760 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-blsm9" Sep 30 07:53:25 crc kubenswrapper[4760]: I0930 07:53:25.161704 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-chvzz\" (UniqueName: \"kubernetes.io/projected/cb3756b4-ea8a-41f4-a8db-351088780965-kube-api-access-chvzz\") pod \"cb3756b4-ea8a-41f4-a8db-351088780965\" (UID: \"cb3756b4-ea8a-41f4-a8db-351088780965\") " Sep 30 07:53:25 crc kubenswrapper[4760]: I0930 07:53:25.161971 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cb3756b4-ea8a-41f4-a8db-351088780965-config-data\") pod \"cb3756b4-ea8a-41f4-a8db-351088780965\" (UID: \"cb3756b4-ea8a-41f4-a8db-351088780965\") " Sep 30 07:53:25 crc kubenswrapper[4760]: I0930 07:53:25.162088 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cb3756b4-ea8a-41f4-a8db-351088780965-combined-ca-bundle\") pod 
\"cb3756b4-ea8a-41f4-a8db-351088780965\" (UID: \"cb3756b4-ea8a-41f4-a8db-351088780965\") " Sep 30 07:53:25 crc kubenswrapper[4760]: I0930 07:53:25.162166 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cb3756b4-ea8a-41f4-a8db-351088780965-scripts\") pod \"cb3756b4-ea8a-41f4-a8db-351088780965\" (UID: \"cb3756b4-ea8a-41f4-a8db-351088780965\") " Sep 30 07:53:25 crc kubenswrapper[4760]: I0930 07:53:25.168771 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cb3756b4-ea8a-41f4-a8db-351088780965-scripts" (OuterVolumeSpecName: "scripts") pod "cb3756b4-ea8a-41f4-a8db-351088780965" (UID: "cb3756b4-ea8a-41f4-a8db-351088780965"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 07:53:25 crc kubenswrapper[4760]: I0930 07:53:25.170466 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cb3756b4-ea8a-41f4-a8db-351088780965-kube-api-access-chvzz" (OuterVolumeSpecName: "kube-api-access-chvzz") pod "cb3756b4-ea8a-41f4-a8db-351088780965" (UID: "cb3756b4-ea8a-41f4-a8db-351088780965"). InnerVolumeSpecName "kube-api-access-chvzz". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 07:53:25 crc kubenswrapper[4760]: I0930 07:53:25.200521 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cb3756b4-ea8a-41f4-a8db-351088780965-config-data" (OuterVolumeSpecName: "config-data") pod "cb3756b4-ea8a-41f4-a8db-351088780965" (UID: "cb3756b4-ea8a-41f4-a8db-351088780965"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue ""
Sep 30 07:53:25 crc kubenswrapper[4760]: I0930 07:53:25.202629 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cb3756b4-ea8a-41f4-a8db-351088780965-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "cb3756b4-ea8a-41f4-a8db-351088780965" (UID: "cb3756b4-ea8a-41f4-a8db-351088780965"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Sep 30 07:53:25 crc kubenswrapper[4760]: I0930 07:53:25.263753 4760 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-chvzz\" (UniqueName: \"kubernetes.io/projected/cb3756b4-ea8a-41f4-a8db-351088780965-kube-api-access-chvzz\") on node \"crc\" DevicePath \"\""
Sep 30 07:53:25 crc kubenswrapper[4760]: I0930 07:53:25.263795 4760 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cb3756b4-ea8a-41f4-a8db-351088780965-config-data\") on node \"crc\" DevicePath \"\""
Sep 30 07:53:25 crc kubenswrapper[4760]: I0930 07:53:25.263806 4760 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cb3756b4-ea8a-41f4-a8db-351088780965-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Sep 30 07:53:25 crc kubenswrapper[4760]: I0930 07:53:25.263813 4760 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cb3756b4-ea8a-41f4-a8db-351088780965-scripts\") on node \"crc\" DevicePath \"\""
Sep 30 07:53:25 crc kubenswrapper[4760]: I0930 07:53:25.739880 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-blsm9" event={"ID":"cb3756b4-ea8a-41f4-a8db-351088780965","Type":"ContainerDied","Data":"581cdc6b31f075f8448f56d7de74a1ec4fcbe262162f1ef4a7ecd30bf9d93511"}
Sep 30 07:53:25 crc kubenswrapper[4760]: I0930 07:53:25.739958 4760 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="581cdc6b31f075f8448f56d7de74a1ec4fcbe262162f1ef4a7ecd30bf9d93511"
Sep 30 07:53:25 crc kubenswrapper[4760]: I0930 07:53:25.740059 4760 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-blsm9"
Sep 30 07:53:25 crc kubenswrapper[4760]: I0930 07:53:25.791743 4760 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0"
Sep 30 07:53:25 crc kubenswrapper[4760]: I0930 07:53:25.794030 4760 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0"
Sep 30 07:53:25 crc kubenswrapper[4760]: I0930 07:53:25.801466 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0"
Sep 30 07:53:25 crc kubenswrapper[4760]: I0930 07:53:25.920977 4760 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"]
Sep 30 07:53:25 crc kubenswrapper[4760]: I0930 07:53:25.921492 4760 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="cf057c97-896f-4227-981e-fe789d69a119" containerName="nova-api-log" containerID="cri-o://3518b000cfdc4498222581aaea93f2afcf87cc47f121f59d651915b47477ebd7" gracePeriod=30
Sep 30 07:53:25 crc kubenswrapper[4760]: I0930 07:53:25.921558 4760 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="cf057c97-896f-4227-981e-fe789d69a119" containerName="nova-api-api" containerID="cri-o://39d618b7372fbcb067cdc0531416c5a050f84171d9e57ca750ac8aca8597aa8b" gracePeriod=30
Sep 30 07:53:25 crc kubenswrapper[4760]: I0930 07:53:25.938985 4760 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"]
Sep 30 07:53:25 crc kubenswrapper[4760]: I0930 07:53:25.939219 4760 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="6f8814ba-633a-440c-b866-98025a753fb1" containerName="nova-scheduler-scheduler" containerID="cri-o://04f40ee3c2eaa6a66fd5a6ed291a04d424bb7a6a2fdb158edfeb7c6e0045e8cf" gracePeriod=30
Sep 30 07:53:26 crc kubenswrapper[4760]: I0930 07:53:26.017012 4760 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"]
Sep 30 07:53:26 crc kubenswrapper[4760]: E0930 07:53:26.471929 4760 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="04f40ee3c2eaa6a66fd5a6ed291a04d424bb7a6a2fdb158edfeb7c6e0045e8cf" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"]
Sep 30 07:53:26 crc kubenswrapper[4760]: E0930 07:53:26.473730 4760 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="04f40ee3c2eaa6a66fd5a6ed291a04d424bb7a6a2fdb158edfeb7c6e0045e8cf" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"]
Sep 30 07:53:26 crc kubenswrapper[4760]: E0930 07:53:26.475051 4760 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="04f40ee3c2eaa6a66fd5a6ed291a04d424bb7a6a2fdb158edfeb7c6e0045e8cf" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"]
Sep 30 07:53:26 crc kubenswrapper[4760]: E0930 07:53:26.475085 4760 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="6f8814ba-633a-440c-b866-98025a753fb1" containerName="nova-scheduler-scheduler"
Sep 30 07:53:26 crc kubenswrapper[4760]: I0930 07:53:26.572610 4760 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Sep 30 07:53:26 crc kubenswrapper[4760]: I0930 07:53:26.592173 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dpwmf\" (UniqueName: \"kubernetes.io/projected/cf057c97-896f-4227-981e-fe789d69a119-kube-api-access-dpwmf\") pod \"cf057c97-896f-4227-981e-fe789d69a119\" (UID: \"cf057c97-896f-4227-981e-fe789d69a119\") "
Sep 30 07:53:26 crc kubenswrapper[4760]: I0930 07:53:26.592267 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cf057c97-896f-4227-981e-fe789d69a119-combined-ca-bundle\") pod \"cf057c97-896f-4227-981e-fe789d69a119\" (UID: \"cf057c97-896f-4227-981e-fe789d69a119\") "
Sep 30 07:53:26 crc kubenswrapper[4760]: I0930 07:53:26.592458 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/cf057c97-896f-4227-981e-fe789d69a119-internal-tls-certs\") pod \"cf057c97-896f-4227-981e-fe789d69a119\" (UID: \"cf057c97-896f-4227-981e-fe789d69a119\") "
Sep 30 07:53:26 crc kubenswrapper[4760]: I0930 07:53:26.592565 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cf057c97-896f-4227-981e-fe789d69a119-logs\") pod \"cf057c97-896f-4227-981e-fe789d69a119\" (UID: \"cf057c97-896f-4227-981e-fe789d69a119\") "
Sep 30 07:53:26 crc kubenswrapper[4760]: I0930 07:53:26.592610 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/cf057c97-896f-4227-981e-fe789d69a119-public-tls-certs\") pod \"cf057c97-896f-4227-981e-fe789d69a119\" (UID: \"cf057c97-896f-4227-981e-fe789d69a119\") "
Sep 30 07:53:26 crc kubenswrapper[4760]: I0930 07:53:26.592677 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cf057c97-896f-4227-981e-fe789d69a119-config-data\") pod \"cf057c97-896f-4227-981e-fe789d69a119\" (UID: \"cf057c97-896f-4227-981e-fe789d69a119\") "
Sep 30 07:53:26 crc kubenswrapper[4760]: I0930 07:53:26.595026 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cf057c97-896f-4227-981e-fe789d69a119-logs" (OuterVolumeSpecName: "logs") pod "cf057c97-896f-4227-981e-fe789d69a119" (UID: "cf057c97-896f-4227-981e-fe789d69a119"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Sep 30 07:53:26 crc kubenswrapper[4760]: I0930 07:53:26.607491 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cf057c97-896f-4227-981e-fe789d69a119-kube-api-access-dpwmf" (OuterVolumeSpecName: "kube-api-access-dpwmf") pod "cf057c97-896f-4227-981e-fe789d69a119" (UID: "cf057c97-896f-4227-981e-fe789d69a119"). InnerVolumeSpecName "kube-api-access-dpwmf". PluginName "kubernetes.io/projected", VolumeGidValue ""
Sep 30 07:53:26 crc kubenswrapper[4760]: I0930 07:53:26.620809 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cf057c97-896f-4227-981e-fe789d69a119-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "cf057c97-896f-4227-981e-fe789d69a119" (UID: "cf057c97-896f-4227-981e-fe789d69a119"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Sep 30 07:53:26 crc kubenswrapper[4760]: I0930 07:53:26.636232 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cf057c97-896f-4227-981e-fe789d69a119-config-data" (OuterVolumeSpecName: "config-data") pod "cf057c97-896f-4227-981e-fe789d69a119" (UID: "cf057c97-896f-4227-981e-fe789d69a119"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Sep 30 07:53:26 crc kubenswrapper[4760]: I0930 07:53:26.670826 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cf057c97-896f-4227-981e-fe789d69a119-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "cf057c97-896f-4227-981e-fe789d69a119" (UID: "cf057c97-896f-4227-981e-fe789d69a119"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Sep 30 07:53:26 crc kubenswrapper[4760]: I0930 07:53:26.679384 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cf057c97-896f-4227-981e-fe789d69a119-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "cf057c97-896f-4227-981e-fe789d69a119" (UID: "cf057c97-896f-4227-981e-fe789d69a119"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Sep 30 07:53:26 crc kubenswrapper[4760]: I0930 07:53:26.695147 4760 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cf057c97-896f-4227-981e-fe789d69a119-logs\") on node \"crc\" DevicePath \"\""
Sep 30 07:53:26 crc kubenswrapper[4760]: I0930 07:53:26.695177 4760 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/cf057c97-896f-4227-981e-fe789d69a119-public-tls-certs\") on node \"crc\" DevicePath \"\""
Sep 30 07:53:26 crc kubenswrapper[4760]: I0930 07:53:26.695207 4760 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cf057c97-896f-4227-981e-fe789d69a119-config-data\") on node \"crc\" DevicePath \"\""
Sep 30 07:53:26 crc kubenswrapper[4760]: I0930 07:53:26.695218 4760 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dpwmf\" (UniqueName: \"kubernetes.io/projected/cf057c97-896f-4227-981e-fe789d69a119-kube-api-access-dpwmf\") on node \"crc\" DevicePath \"\""
Sep 30 07:53:26 crc kubenswrapper[4760]: I0930 07:53:26.695228 4760 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cf057c97-896f-4227-981e-fe789d69a119-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Sep 30 07:53:26 crc kubenswrapper[4760]: I0930 07:53:26.695235 4760 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/cf057c97-896f-4227-981e-fe789d69a119-internal-tls-certs\") on node \"crc\" DevicePath \"\""
Sep 30 07:53:26 crc kubenswrapper[4760]: I0930 07:53:26.749814 4760 generic.go:334] "Generic (PLEG): container finished" podID="cf057c97-896f-4227-981e-fe789d69a119" containerID="39d618b7372fbcb067cdc0531416c5a050f84171d9e57ca750ac8aca8597aa8b" exitCode=0
Sep 30 07:53:26 crc kubenswrapper[4760]: I0930 07:53:26.749840 4760 generic.go:334] "Generic (PLEG): container finished" podID="cf057c97-896f-4227-981e-fe789d69a119" containerID="3518b000cfdc4498222581aaea93f2afcf87cc47f121f59d651915b47477ebd7" exitCode=143
Sep 30 07:53:26 crc kubenswrapper[4760]: I0930 07:53:26.750899 4760 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Sep 30 07:53:26 crc kubenswrapper[4760]: I0930 07:53:26.751597 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"cf057c97-896f-4227-981e-fe789d69a119","Type":"ContainerDied","Data":"39d618b7372fbcb067cdc0531416c5a050f84171d9e57ca750ac8aca8597aa8b"}
Sep 30 07:53:26 crc kubenswrapper[4760]: I0930 07:53:26.751652 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"cf057c97-896f-4227-981e-fe789d69a119","Type":"ContainerDied","Data":"3518b000cfdc4498222581aaea93f2afcf87cc47f121f59d651915b47477ebd7"}
Sep 30 07:53:26 crc kubenswrapper[4760]: I0930 07:53:26.751661 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"cf057c97-896f-4227-981e-fe789d69a119","Type":"ContainerDied","Data":"6765182e6fe0b2eb312b9c3a3c7457cf8f91060c3d38d0cadbf227d892f4c1ad"}
Sep 30 07:53:26 crc kubenswrapper[4760]: I0930 07:53:26.751689 4760 scope.go:117] "RemoveContainer" containerID="39d618b7372fbcb067cdc0531416c5a050f84171d9e57ca750ac8aca8597aa8b"
Sep 30 07:53:26 crc kubenswrapper[4760]: I0930 07:53:26.755455 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0"
Sep 30 07:53:26 crc kubenswrapper[4760]: I0930 07:53:26.817476 4760 scope.go:117] "RemoveContainer" containerID="3518b000cfdc4498222581aaea93f2afcf87cc47f121f59d651915b47477ebd7"
Sep 30 07:53:26 crc kubenswrapper[4760]: I0930 07:53:26.817620 4760 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"]
Sep 30 07:53:26 crc kubenswrapper[4760]: I0930 07:53:26.846363 4760 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"]
Sep 30 07:53:26 crc kubenswrapper[4760]: I0930 07:53:26.874768 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"]
Sep 30 07:53:26 crc kubenswrapper[4760]: E0930 07:53:26.875238 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cf057c97-896f-4227-981e-fe789d69a119" containerName="nova-api-log"
Sep 30 07:53:26 crc kubenswrapper[4760]: I0930 07:53:26.875252 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="cf057c97-896f-4227-981e-fe789d69a119" containerName="nova-api-log"
Sep 30 07:53:26 crc kubenswrapper[4760]: E0930 07:53:26.875291 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cf057c97-896f-4227-981e-fe789d69a119" containerName="nova-api-api"
Sep 30 07:53:26 crc kubenswrapper[4760]: I0930 07:53:26.875320 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="cf057c97-896f-4227-981e-fe789d69a119" containerName="nova-api-api"
Sep 30 07:53:26 crc kubenswrapper[4760]: E0930 07:53:26.875352 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ce9663f0-6f75-4dd2-bb60-faf428321df0" containerName="init"
Sep 30 07:53:26 crc kubenswrapper[4760]: I0930 07:53:26.875362 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="ce9663f0-6f75-4dd2-bb60-faf428321df0" containerName="init"
Sep 30 07:53:26 crc kubenswrapper[4760]: E0930 07:53:26.875383 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ce9663f0-6f75-4dd2-bb60-faf428321df0" containerName="dnsmasq-dns"
Sep 30 07:53:26 crc kubenswrapper[4760]: I0930 07:53:26.875392 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="ce9663f0-6f75-4dd2-bb60-faf428321df0" containerName="dnsmasq-dns"
Sep 30 07:53:26 crc kubenswrapper[4760]: E0930 07:53:26.875411 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cb3756b4-ea8a-41f4-a8db-351088780965" containerName="nova-manage"
Sep 30 07:53:26 crc kubenswrapper[4760]: I0930 07:53:26.875420 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="cb3756b4-ea8a-41f4-a8db-351088780965" containerName="nova-manage"
Sep 30 07:53:26 crc kubenswrapper[4760]: I0930 07:53:26.875651 4760 memory_manager.go:354] "RemoveStaleState removing state" podUID="cb3756b4-ea8a-41f4-a8db-351088780965" containerName="nova-manage"
Sep 30 07:53:26 crc kubenswrapper[4760]: I0930 07:53:26.875667 4760 memory_manager.go:354] "RemoveStaleState removing state" podUID="ce9663f0-6f75-4dd2-bb60-faf428321df0" containerName="dnsmasq-dns"
Sep 30 07:53:26 crc kubenswrapper[4760]: I0930 07:53:26.875687 4760 memory_manager.go:354] "RemoveStaleState removing state" podUID="cf057c97-896f-4227-981e-fe789d69a119" containerName="nova-api-api"
Sep 30 07:53:26 crc kubenswrapper[4760]: I0930 07:53:26.875709 4760 memory_manager.go:354] "RemoveStaleState removing state" podUID="cf057c97-896f-4227-981e-fe789d69a119" containerName="nova-api-log"
Sep 30 07:53:26 crc kubenswrapper[4760]: I0930 07:53:26.877031 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Sep 30 07:53:26 crc kubenswrapper[4760]: I0930 07:53:26.888382 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"]
Sep 30 07:53:26 crc kubenswrapper[4760]: I0930 07:53:26.889829 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc"
Sep 30 07:53:26 crc kubenswrapper[4760]: I0930 07:53:26.890155 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc"
Sep 30 07:53:26 crc kubenswrapper[4760]: I0930 07:53:26.890409 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data"
Sep 30 07:53:26 crc kubenswrapper[4760]: I0930 07:53:26.892925 4760 scope.go:117] "RemoveContainer" containerID="39d618b7372fbcb067cdc0531416c5a050f84171d9e57ca750ac8aca8597aa8b"
Sep 30 07:53:26 crc kubenswrapper[4760]: E0930 07:53:26.901729 4760 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"39d618b7372fbcb067cdc0531416c5a050f84171d9e57ca750ac8aca8597aa8b\": container with ID starting with 39d618b7372fbcb067cdc0531416c5a050f84171d9e57ca750ac8aca8597aa8b not found: ID does not exist" containerID="39d618b7372fbcb067cdc0531416c5a050f84171d9e57ca750ac8aca8597aa8b"
Sep 30 07:53:26 crc kubenswrapper[4760]: I0930 07:53:26.901768 4760 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"39d618b7372fbcb067cdc0531416c5a050f84171d9e57ca750ac8aca8597aa8b"} err="failed to get container status \"39d618b7372fbcb067cdc0531416c5a050f84171d9e57ca750ac8aca8597aa8b\": rpc error: code = NotFound desc = could not find container \"39d618b7372fbcb067cdc0531416c5a050f84171d9e57ca750ac8aca8597aa8b\": container with ID starting with 39d618b7372fbcb067cdc0531416c5a050f84171d9e57ca750ac8aca8597aa8b not found: ID does not exist"
Sep 30 07:53:26 crc kubenswrapper[4760]: I0930 07:53:26.901792 4760 scope.go:117] "RemoveContainer" containerID="3518b000cfdc4498222581aaea93f2afcf87cc47f121f59d651915b47477ebd7"
Sep 30 07:53:26 crc kubenswrapper[4760]: I0930 07:53:26.903613 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c27b43ce-27fb-4163-b55a-98a7e9ee7d71-public-tls-certs\") pod \"nova-api-0\" (UID: \"c27b43ce-27fb-4163-b55a-98a7e9ee7d71\") " pod="openstack/nova-api-0"
Sep 30 07:53:26 crc kubenswrapper[4760]: I0930 07:53:26.903667 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c27b43ce-27fb-4163-b55a-98a7e9ee7d71-internal-tls-certs\") pod \"nova-api-0\" (UID: \"c27b43ce-27fb-4163-b55a-98a7e9ee7d71\") " pod="openstack/nova-api-0"
Sep 30 07:53:26 crc kubenswrapper[4760]: I0930 07:53:26.903705 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c27b43ce-27fb-4163-b55a-98a7e9ee7d71-config-data\") pod \"nova-api-0\" (UID: \"c27b43ce-27fb-4163-b55a-98a7e9ee7d71\") " pod="openstack/nova-api-0"
Sep 30 07:53:26 crc kubenswrapper[4760]: I0930 07:53:26.903746 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c27b43ce-27fb-4163-b55a-98a7e9ee7d71-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"c27b43ce-27fb-4163-b55a-98a7e9ee7d71\") " pod="openstack/nova-api-0"
Sep 30 07:53:26 crc kubenswrapper[4760]: I0930 07:53:26.903799 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w7ddc\" (UniqueName: \"kubernetes.io/projected/c27b43ce-27fb-4163-b55a-98a7e9ee7d71-kube-api-access-w7ddc\") pod \"nova-api-0\" (UID: \"c27b43ce-27fb-4163-b55a-98a7e9ee7d71\") " pod="openstack/nova-api-0"
Sep 30 07:53:26 crc kubenswrapper[4760]: I0930 07:53:26.903854 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c27b43ce-27fb-4163-b55a-98a7e9ee7d71-logs\") pod \"nova-api-0\" (UID: \"c27b43ce-27fb-4163-b55a-98a7e9ee7d71\") " pod="openstack/nova-api-0"
Sep 30 07:53:26 crc kubenswrapper[4760]: E0930 07:53:26.922212 4760 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3518b000cfdc4498222581aaea93f2afcf87cc47f121f59d651915b47477ebd7\": container with ID starting with 3518b000cfdc4498222581aaea93f2afcf87cc47f121f59d651915b47477ebd7 not found: ID does not exist" containerID="3518b000cfdc4498222581aaea93f2afcf87cc47f121f59d651915b47477ebd7"
Sep 30 07:53:26 crc kubenswrapper[4760]: I0930 07:53:26.922254 4760 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3518b000cfdc4498222581aaea93f2afcf87cc47f121f59d651915b47477ebd7"} err="failed to get container status \"3518b000cfdc4498222581aaea93f2afcf87cc47f121f59d651915b47477ebd7\": rpc error: code = NotFound desc = could not find container \"3518b000cfdc4498222581aaea93f2afcf87cc47f121f59d651915b47477ebd7\": container with ID starting with 3518b000cfdc4498222581aaea93f2afcf87cc47f121f59d651915b47477ebd7 not found: ID does not exist"
Sep 30 07:53:26 crc kubenswrapper[4760]: I0930 07:53:26.922295 4760 scope.go:117] "RemoveContainer" containerID="39d618b7372fbcb067cdc0531416c5a050f84171d9e57ca750ac8aca8597aa8b"
Sep 30 07:53:26 crc kubenswrapper[4760]: I0930 07:53:26.925598 4760 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"39d618b7372fbcb067cdc0531416c5a050f84171d9e57ca750ac8aca8597aa8b"} err="failed to get container status \"39d618b7372fbcb067cdc0531416c5a050f84171d9e57ca750ac8aca8597aa8b\": rpc error: code = NotFound desc = could not find container \"39d618b7372fbcb067cdc0531416c5a050f84171d9e57ca750ac8aca8597aa8b\": container with ID starting with 39d618b7372fbcb067cdc0531416c5a050f84171d9e57ca750ac8aca8597aa8b not found: ID does not exist"
Sep 30 07:53:26 crc kubenswrapper[4760]: I0930 07:53:26.925814 4760 scope.go:117] "RemoveContainer" containerID="3518b000cfdc4498222581aaea93f2afcf87cc47f121f59d651915b47477ebd7"
Sep 30 07:53:26 crc kubenswrapper[4760]: I0930 07:53:26.932383 4760 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3518b000cfdc4498222581aaea93f2afcf87cc47f121f59d651915b47477ebd7"} err="failed to get container status \"3518b000cfdc4498222581aaea93f2afcf87cc47f121f59d651915b47477ebd7\": rpc error: code = NotFound desc = could not find container \"3518b000cfdc4498222581aaea93f2afcf87cc47f121f59d651915b47477ebd7\": container with ID starting with 3518b000cfdc4498222581aaea93f2afcf87cc47f121f59d651915b47477ebd7 not found: ID does not exist"
Sep 30 07:53:27 crc kubenswrapper[4760]: I0930 07:53:27.013461 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c27b43ce-27fb-4163-b55a-98a7e9ee7d71-public-tls-certs\") pod \"nova-api-0\" (UID: \"c27b43ce-27fb-4163-b55a-98a7e9ee7d71\") " pod="openstack/nova-api-0"
Sep 30 07:53:27 crc kubenswrapper[4760]: I0930 07:53:27.013519 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c27b43ce-27fb-4163-b55a-98a7e9ee7d71-internal-tls-certs\") pod \"nova-api-0\" (UID: \"c27b43ce-27fb-4163-b55a-98a7e9ee7d71\") " pod="openstack/nova-api-0"
Sep 30 07:53:27 crc kubenswrapper[4760]: I0930 07:53:27.013548 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c27b43ce-27fb-4163-b55a-98a7e9ee7d71-config-data\") pod \"nova-api-0\" (UID: \"c27b43ce-27fb-4163-b55a-98a7e9ee7d71\") " pod="openstack/nova-api-0"
Sep 30 07:53:27 crc kubenswrapper[4760]: I0930 07:53:27.013580 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c27b43ce-27fb-4163-b55a-98a7e9ee7d71-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"c27b43ce-27fb-4163-b55a-98a7e9ee7d71\") " pod="openstack/nova-api-0"
Sep 30 07:53:27 crc kubenswrapper[4760]: I0930 07:53:27.013617 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w7ddc\" (UniqueName: \"kubernetes.io/projected/c27b43ce-27fb-4163-b55a-98a7e9ee7d71-kube-api-access-w7ddc\") pod \"nova-api-0\" (UID: \"c27b43ce-27fb-4163-b55a-98a7e9ee7d71\") " pod="openstack/nova-api-0"
Sep 30 07:53:27 crc kubenswrapper[4760]: I0930 07:53:27.013656 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c27b43ce-27fb-4163-b55a-98a7e9ee7d71-logs\") pod \"nova-api-0\" (UID: \"c27b43ce-27fb-4163-b55a-98a7e9ee7d71\") " pod="openstack/nova-api-0"
Sep 30 07:53:27 crc kubenswrapper[4760]: I0930 07:53:27.014112 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c27b43ce-27fb-4163-b55a-98a7e9ee7d71-logs\") pod \"nova-api-0\" (UID: \"c27b43ce-27fb-4163-b55a-98a7e9ee7d71\") " pod="openstack/nova-api-0"
Sep 30 07:53:27 crc kubenswrapper[4760]: I0930 07:53:27.020769 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c27b43ce-27fb-4163-b55a-98a7e9ee7d71-config-data\") pod \"nova-api-0\" (UID: \"c27b43ce-27fb-4163-b55a-98a7e9ee7d71\") " pod="openstack/nova-api-0"
Sep 30 07:53:27 crc kubenswrapper[4760]: I0930 07:53:27.032882 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c27b43ce-27fb-4163-b55a-98a7e9ee7d71-internal-tls-certs\") pod \"nova-api-0\" (UID: \"c27b43ce-27fb-4163-b55a-98a7e9ee7d71\") " pod="openstack/nova-api-0"
Sep 30 07:53:27 crc kubenswrapper[4760]: I0930 07:53:27.038874 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c27b43ce-27fb-4163-b55a-98a7e9ee7d71-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"c27b43ce-27fb-4163-b55a-98a7e9ee7d71\") " pod="openstack/nova-api-0"
Sep 30 07:53:27 crc kubenswrapper[4760]: I0930 07:53:27.039816 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c27b43ce-27fb-4163-b55a-98a7e9ee7d71-public-tls-certs\") pod \"nova-api-0\" (UID: \"c27b43ce-27fb-4163-b55a-98a7e9ee7d71\") " pod="openstack/nova-api-0"
Sep 30 07:53:27 crc kubenswrapper[4760]: I0930 07:53:27.054273 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w7ddc\" (UniqueName: \"kubernetes.io/projected/c27b43ce-27fb-4163-b55a-98a7e9ee7d71-kube-api-access-w7ddc\") pod \"nova-api-0\" (UID: \"c27b43ce-27fb-4163-b55a-98a7e9ee7d71\") " pod="openstack/nova-api-0"
Sep 30 07:53:27 crc kubenswrapper[4760]: I0930 07:53:27.080900 4760 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cf057c97-896f-4227-981e-fe789d69a119" path="/var/lib/kubelet/pods/cf057c97-896f-4227-981e-fe789d69a119/volumes"
Sep 30 07:53:27 crc kubenswrapper[4760]: I0930 07:53:27.206513 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Sep 30 07:53:27 crc kubenswrapper[4760]: I0930 07:53:27.710777 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"]
Sep 30 07:53:27 crc kubenswrapper[4760]: W0930 07:53:27.717160 4760 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc27b43ce_27fb_4163_b55a_98a7e9ee7d71.slice/crio-be523f6bdef3684e07716a1a486dc305a9f01a510ce3325a6fdf37a72a4af90e WatchSource:0}: Error finding container be523f6bdef3684e07716a1a486dc305a9f01a510ce3325a6fdf37a72a4af90e: Status 404 returned error can't find the container with id be523f6bdef3684e07716a1a486dc305a9f01a510ce3325a6fdf37a72a4af90e
Sep 30 07:53:27 crc kubenswrapper[4760]: I0930 07:53:27.759223 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"c27b43ce-27fb-4163-b55a-98a7e9ee7d71","Type":"ContainerStarted","Data":"be523f6bdef3684e07716a1a486dc305a9f01a510ce3325a6fdf37a72a4af90e"}
Sep 30 07:53:27 crc kubenswrapper[4760]: I0930 07:53:27.760537 4760 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="1d3f6c3d-070f-494e-8b47-856d54de039c" containerName="nova-metadata-log" containerID="cri-o://096ff25098794e61c05471779e2b1da5def0724c467733da81ab266ab873f450" gracePeriod=30
Sep 30 07:53:27 crc kubenswrapper[4760]: I0930 07:53:27.760664 4760 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="1d3f6c3d-070f-494e-8b47-856d54de039c" containerName="nova-metadata-metadata" containerID="cri-o://827afd0fe268d79933117c8266e31aabbecdf70bc47cc8ed1d11a50e306fd65f" gracePeriod=30
Sep 30 07:53:28 crc kubenswrapper[4760]: I0930 07:53:28.774240 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"c27b43ce-27fb-4163-b55a-98a7e9ee7d71","Type":"ContainerStarted","Data":"978f85a80eeafae214ffb76063956ac02831b175ac9b646a52f1f358ad29ea17"}
Sep 30 07:53:28 crc kubenswrapper[4760]: I0930 07:53:28.776082 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"c27b43ce-27fb-4163-b55a-98a7e9ee7d71","Type":"ContainerStarted","Data":"14fd0cc08d844badbfda448ceb16587611a8079daeb2a46a4f7dd7ccaed54a8a"}
Sep 30 07:53:28 crc kubenswrapper[4760]: I0930 07:53:28.776376 4760 generic.go:334] "Generic (PLEG): container finished" podID="1d3f6c3d-070f-494e-8b47-856d54de039c" containerID="096ff25098794e61c05471779e2b1da5def0724c467733da81ab266ab873f450" exitCode=143
Sep 30 07:53:28 crc kubenswrapper[4760]: I0930 07:53:28.776433 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"1d3f6c3d-070f-494e-8b47-856d54de039c","Type":"ContainerDied","Data":"096ff25098794e61c05471779e2b1da5def0724c467733da81ab266ab873f450"}
Sep 30 07:53:28 crc kubenswrapper[4760]: I0930 07:53:28.815082 4760 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.815055167 podStartE2EDuration="2.815055167s" podCreationTimestamp="2025-09-30 07:53:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 07:53:28.797089158 +0000 UTC m=+1194.439995580" watchObservedRunningTime="2025-09-30 07:53:28.815055167 +0000 UTC m=+1194.457961589"
Sep 30 07:53:30 crc kubenswrapper[4760]: I0930 07:53:30.741280 4760 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Sep 30 07:53:30 crc kubenswrapper[4760]: I0930 07:53:30.809688 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6f8814ba-633a-440c-b866-98025a753fb1-config-data\") pod \"6f8814ba-633a-440c-b866-98025a753fb1\" (UID: \"6f8814ba-633a-440c-b866-98025a753fb1\") "
Sep 30 07:53:30 crc kubenswrapper[4760]: I0930 07:53:30.809954 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8mvc2\" (UniqueName: \"kubernetes.io/projected/6f8814ba-633a-440c-b866-98025a753fb1-kube-api-access-8mvc2\") pod \"6f8814ba-633a-440c-b866-98025a753fb1\" (UID: \"6f8814ba-633a-440c-b866-98025a753fb1\") "
Sep 30 07:53:30 crc kubenswrapper[4760]: I0930 07:53:30.810030 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6f8814ba-633a-440c-b866-98025a753fb1-combined-ca-bundle\") pod \"6f8814ba-633a-440c-b866-98025a753fb1\" (UID: \"6f8814ba-633a-440c-b866-98025a753fb1\") "
Sep 30 07:53:30 crc kubenswrapper[4760]: I0930 07:53:30.825750 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6f8814ba-633a-440c-b866-98025a753fb1-kube-api-access-8mvc2" (OuterVolumeSpecName: "kube-api-access-8mvc2") pod "6f8814ba-633a-440c-b866-98025a753fb1" (UID: "6f8814ba-633a-440c-b866-98025a753fb1"). InnerVolumeSpecName "kube-api-access-8mvc2". PluginName "kubernetes.io/projected", VolumeGidValue ""
Sep 30 07:53:30 crc kubenswrapper[4760]: I0930 07:53:30.833320 4760 generic.go:334] "Generic (PLEG): container finished" podID="6f8814ba-633a-440c-b866-98025a753fb1" containerID="04f40ee3c2eaa6a66fd5a6ed291a04d424bb7a6a2fdb158edfeb7c6e0045e8cf" exitCode=0
Sep 30 07:53:30 crc kubenswrapper[4760]: I0930 07:53:30.833380 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"6f8814ba-633a-440c-b866-98025a753fb1","Type":"ContainerDied","Data":"04f40ee3c2eaa6a66fd5a6ed291a04d424bb7a6a2fdb158edfeb7c6e0045e8cf"}
Sep 30 07:53:30 crc kubenswrapper[4760]: I0930 07:53:30.833416 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"6f8814ba-633a-440c-b866-98025a753fb1","Type":"ContainerDied","Data":"2217fe83f9b931e95eb76f4ac62657717ded36c809318295414b722ce58d630e"}
Sep 30 07:53:30 crc kubenswrapper[4760]: I0930 07:53:30.833435 4760 scope.go:117] "RemoveContainer" containerID="04f40ee3c2eaa6a66fd5a6ed291a04d424bb7a6a2fdb158edfeb7c6e0045e8cf"
Sep 30 07:53:30 crc kubenswrapper[4760]: I0930 07:53:30.833617 4760 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Sep 30 07:53:30 crc kubenswrapper[4760]: I0930 07:53:30.852837 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6f8814ba-633a-440c-b866-98025a753fb1-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6f8814ba-633a-440c-b866-98025a753fb1" (UID: "6f8814ba-633a-440c-b866-98025a753fb1"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Sep 30 07:53:30 crc kubenswrapper[4760]: I0930 07:53:30.860551 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6f8814ba-633a-440c-b866-98025a753fb1-config-data" (OuterVolumeSpecName: "config-data") pod "6f8814ba-633a-440c-b866-98025a753fb1" (UID: "6f8814ba-633a-440c-b866-98025a753fb1"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Sep 30 07:53:30 crc kubenswrapper[4760]: I0930 07:53:30.894285 4760 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="1d3f6c3d-070f-494e-8b47-856d54de039c" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.214:8775/\": read tcp 10.217.0.2:46716->10.217.0.214:8775: read: connection reset by peer"
Sep 30 07:53:30 crc kubenswrapper[4760]: I0930 07:53:30.894289 4760 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="1d3f6c3d-070f-494e-8b47-856d54de039c" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.214:8775/\": read tcp 10.217.0.2:46700->10.217.0.214:8775: read: connection reset by peer"
Sep 30 07:53:30 crc kubenswrapper[4760]: I0930 07:53:30.912317 4760 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6f8814ba-633a-440c-b866-98025a753fb1-config-data\") on node \"crc\" DevicePath \"\""
Sep 30 07:53:30 crc kubenswrapper[4760]: I0930 07:53:30.912357 4760 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8mvc2\" (UniqueName: \"kubernetes.io/projected/6f8814ba-633a-440c-b866-98025a753fb1-kube-api-access-8mvc2\") on node \"crc\" DevicePath \"\""
Sep 30 07:53:30 crc kubenswrapper[4760]: I0930 07:53:30.912370 4760 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6f8814ba-633a-440c-b866-98025a753fb1-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Sep 30 07:53:30 crc kubenswrapper[4760]: I0930 07:53:30.924488 4760 scope.go:117] "RemoveContainer" containerID="04f40ee3c2eaa6a66fd5a6ed291a04d424bb7a6a2fdb158edfeb7c6e0045e8cf"
Sep 30 07:53:30 crc kubenswrapper[4760]: E0930 07:53:30.925219 4760 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"04f40ee3c2eaa6a66fd5a6ed291a04d424bb7a6a2fdb158edfeb7c6e0045e8cf\": container with ID starting with 04f40ee3c2eaa6a66fd5a6ed291a04d424bb7a6a2fdb158edfeb7c6e0045e8cf not found: ID does not exist" containerID="04f40ee3c2eaa6a66fd5a6ed291a04d424bb7a6a2fdb158edfeb7c6e0045e8cf"
Sep 30 07:53:30 crc kubenswrapper[4760]: I0930 07:53:30.925249 4760 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"04f40ee3c2eaa6a66fd5a6ed291a04d424bb7a6a2fdb158edfeb7c6e0045e8cf"} err="failed to get container status \"04f40ee3c2eaa6a66fd5a6ed291a04d424bb7a6a2fdb158edfeb7c6e0045e8cf\": rpc error: code = NotFound desc = could not find container \"04f40ee3c2eaa6a66fd5a6ed291a04d424bb7a6a2fdb158edfeb7c6e0045e8cf\": container with ID starting with 04f40ee3c2eaa6a66fd5a6ed291a04d424bb7a6a2fdb158edfeb7c6e0045e8cf not found: ID does not exist"
Sep 30 07:53:31 crc kubenswrapper[4760]: I0930 07:53:31.172380 4760 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"]
Sep 30 07:53:31 crc kubenswrapper[4760]: I0930 07:53:31.226976 4760 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"]
Sep 30 07:53:31 crc kubenswrapper[4760]: I0930 07:53:31.238463 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"]
Sep 30 07:53:31 crc kubenswrapper[4760]: E0930 07:53:31.238985 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6f8814ba-633a-440c-b866-98025a753fb1"
containerName="nova-scheduler-scheduler" Sep 30 07:53:31 crc kubenswrapper[4760]: I0930 07:53:31.239011 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="6f8814ba-633a-440c-b866-98025a753fb1" containerName="nova-scheduler-scheduler" Sep 30 07:53:31 crc kubenswrapper[4760]: I0930 07:53:31.239271 4760 memory_manager.go:354] "RemoveStaleState removing state" podUID="6f8814ba-633a-440c-b866-98025a753fb1" containerName="nova-scheduler-scheduler" Sep 30 07:53:31 crc kubenswrapper[4760]: I0930 07:53:31.240375 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Sep 30 07:53:31 crc kubenswrapper[4760]: I0930 07:53:31.242452 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Sep 30 07:53:31 crc kubenswrapper[4760]: I0930 07:53:31.250112 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Sep 30 07:53:31 crc kubenswrapper[4760]: I0930 07:53:31.327170 4760 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Sep 30 07:53:31 crc kubenswrapper[4760]: I0930 07:53:31.327219 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6nwd2\" (UniqueName: \"kubernetes.io/projected/868868bd-3879-4d24-9dd1-62218a15844c-kube-api-access-6nwd2\") pod \"nova-scheduler-0\" (UID: \"868868bd-3879-4d24-9dd1-62218a15844c\") " pod="openstack/nova-scheduler-0" Sep 30 07:53:31 crc kubenswrapper[4760]: I0930 07:53:31.327533 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/868868bd-3879-4d24-9dd1-62218a15844c-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"868868bd-3879-4d24-9dd1-62218a15844c\") " pod="openstack/nova-scheduler-0" Sep 30 07:53:31 crc kubenswrapper[4760]: I0930 07:53:31.327608 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/868868bd-3879-4d24-9dd1-62218a15844c-config-data\") pod \"nova-scheduler-0\" (UID: \"868868bd-3879-4d24-9dd1-62218a15844c\") " pod="openstack/nova-scheduler-0" Sep 30 07:53:31 crc kubenswrapper[4760]: I0930 07:53:31.429132 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/1d3f6c3d-070f-494e-8b47-856d54de039c-nova-metadata-tls-certs\") pod \"1d3f6c3d-070f-494e-8b47-856d54de039c\" (UID: \"1d3f6c3d-070f-494e-8b47-856d54de039c\") " Sep 30 07:53:31 crc kubenswrapper[4760]: I0930 07:53:31.429293 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1d3f6c3d-070f-494e-8b47-856d54de039c-config-data\") pod \"1d3f6c3d-070f-494e-8b47-856d54de039c\" (UID: \"1d3f6c3d-070f-494e-8b47-856d54de039c\") " Sep 30 07:53:31 crc kubenswrapper[4760]: I0930 07:53:31.429377 
4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1d3f6c3d-070f-494e-8b47-856d54de039c-combined-ca-bundle\") pod \"1d3f6c3d-070f-494e-8b47-856d54de039c\" (UID: \"1d3f6c3d-070f-494e-8b47-856d54de039c\") " Sep 30 07:53:31 crc kubenswrapper[4760]: I0930 07:53:31.429406 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1d3f6c3d-070f-494e-8b47-856d54de039c-logs\") pod \"1d3f6c3d-070f-494e-8b47-856d54de039c\" (UID: \"1d3f6c3d-070f-494e-8b47-856d54de039c\") " Sep 30 07:53:31 crc kubenswrapper[4760]: I0930 07:53:31.429481 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8ffm4\" (UniqueName: \"kubernetes.io/projected/1d3f6c3d-070f-494e-8b47-856d54de039c-kube-api-access-8ffm4\") pod \"1d3f6c3d-070f-494e-8b47-856d54de039c\" (UID: \"1d3f6c3d-070f-494e-8b47-856d54de039c\") " Sep 30 07:53:31 crc kubenswrapper[4760]: I0930 07:53:31.429791 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/868868bd-3879-4d24-9dd1-62218a15844c-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"868868bd-3879-4d24-9dd1-62218a15844c\") " pod="openstack/nova-scheduler-0" Sep 30 07:53:31 crc kubenswrapper[4760]: I0930 07:53:31.429849 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/868868bd-3879-4d24-9dd1-62218a15844c-config-data\") pod \"nova-scheduler-0\" (UID: \"868868bd-3879-4d24-9dd1-62218a15844c\") " pod="openstack/nova-scheduler-0" Sep 30 07:53:31 crc kubenswrapper[4760]: I0930 07:53:31.429983 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6nwd2\" (UniqueName: \"kubernetes.io/projected/868868bd-3879-4d24-9dd1-62218a15844c-kube-api-access-6nwd2\") pod 
\"nova-scheduler-0\" (UID: \"868868bd-3879-4d24-9dd1-62218a15844c\") " pod="openstack/nova-scheduler-0" Sep 30 07:53:31 crc kubenswrapper[4760]: I0930 07:53:31.430921 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d3f6c3d-070f-494e-8b47-856d54de039c-logs" (OuterVolumeSpecName: "logs") pod "1d3f6c3d-070f-494e-8b47-856d54de039c" (UID: "1d3f6c3d-070f-494e-8b47-856d54de039c"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 07:53:31 crc kubenswrapper[4760]: I0930 07:53:31.448275 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/868868bd-3879-4d24-9dd1-62218a15844c-config-data\") pod \"nova-scheduler-0\" (UID: \"868868bd-3879-4d24-9dd1-62218a15844c\") " pod="openstack/nova-scheduler-0" Sep 30 07:53:31 crc kubenswrapper[4760]: I0930 07:53:31.448340 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/868868bd-3879-4d24-9dd1-62218a15844c-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"868868bd-3879-4d24-9dd1-62218a15844c\") " pod="openstack/nova-scheduler-0" Sep 30 07:53:31 crc kubenswrapper[4760]: I0930 07:53:31.472585 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d3f6c3d-070f-494e-8b47-856d54de039c-kube-api-access-8ffm4" (OuterVolumeSpecName: "kube-api-access-8ffm4") pod "1d3f6c3d-070f-494e-8b47-856d54de039c" (UID: "1d3f6c3d-070f-494e-8b47-856d54de039c"). InnerVolumeSpecName "kube-api-access-8ffm4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 07:53:31 crc kubenswrapper[4760]: I0930 07:53:31.522353 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6nwd2\" (UniqueName: \"kubernetes.io/projected/868868bd-3879-4d24-9dd1-62218a15844c-kube-api-access-6nwd2\") pod \"nova-scheduler-0\" (UID: \"868868bd-3879-4d24-9dd1-62218a15844c\") " pod="openstack/nova-scheduler-0" Sep 30 07:53:31 crc kubenswrapper[4760]: I0930 07:53:31.542887 4760 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1d3f6c3d-070f-494e-8b47-856d54de039c-logs\") on node \"crc\" DevicePath \"\"" Sep 30 07:53:31 crc kubenswrapper[4760]: I0930 07:53:31.543108 4760 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8ffm4\" (UniqueName: \"kubernetes.io/projected/1d3f6c3d-070f-494e-8b47-856d54de039c-kube-api-access-8ffm4\") on node \"crc\" DevicePath \"\"" Sep 30 07:53:31 crc kubenswrapper[4760]: I0930 07:53:31.555130 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1d3f6c3d-070f-494e-8b47-856d54de039c-config-data" (OuterVolumeSpecName: "config-data") pod "1d3f6c3d-070f-494e-8b47-856d54de039c" (UID: "1d3f6c3d-070f-494e-8b47-856d54de039c"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 07:53:31 crc kubenswrapper[4760]: I0930 07:53:31.567744 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Sep 30 07:53:31 crc kubenswrapper[4760]: I0930 07:53:31.633003 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1d3f6c3d-070f-494e-8b47-856d54de039c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1d3f6c3d-070f-494e-8b47-856d54de039c" (UID: "1d3f6c3d-070f-494e-8b47-856d54de039c"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 07:53:31 crc kubenswrapper[4760]: I0930 07:53:31.638491 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1d3f6c3d-070f-494e-8b47-856d54de039c-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "1d3f6c3d-070f-494e-8b47-856d54de039c" (UID: "1d3f6c3d-070f-494e-8b47-856d54de039c"). InnerVolumeSpecName "nova-metadata-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 07:53:31 crc kubenswrapper[4760]: I0930 07:53:31.650408 4760 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/1d3f6c3d-070f-494e-8b47-856d54de039c-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Sep 30 07:53:31 crc kubenswrapper[4760]: I0930 07:53:31.650443 4760 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1d3f6c3d-070f-494e-8b47-856d54de039c-config-data\") on node \"crc\" DevicePath \"\"" Sep 30 07:53:31 crc kubenswrapper[4760]: I0930 07:53:31.650453 4760 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1d3f6c3d-070f-494e-8b47-856d54de039c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 07:53:31 crc kubenswrapper[4760]: I0930 07:53:31.849355 4760 generic.go:334] "Generic (PLEG): container finished" podID="1d3f6c3d-070f-494e-8b47-856d54de039c" containerID="827afd0fe268d79933117c8266e31aabbecdf70bc47cc8ed1d11a50e306fd65f" exitCode=0 Sep 30 07:53:31 crc kubenswrapper[4760]: I0930 07:53:31.849441 4760 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Sep 30 07:53:31 crc kubenswrapper[4760]: I0930 07:53:31.849452 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"1d3f6c3d-070f-494e-8b47-856d54de039c","Type":"ContainerDied","Data":"827afd0fe268d79933117c8266e31aabbecdf70bc47cc8ed1d11a50e306fd65f"} Sep 30 07:53:31 crc kubenswrapper[4760]: I0930 07:53:31.849476 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"1d3f6c3d-070f-494e-8b47-856d54de039c","Type":"ContainerDied","Data":"c3405b9dcfc01a70300d6bdf8a1d5458b69a7cf40cc3f07434b140729c5d2245"} Sep 30 07:53:31 crc kubenswrapper[4760]: I0930 07:53:31.849491 4760 scope.go:117] "RemoveContainer" containerID="827afd0fe268d79933117c8266e31aabbecdf70bc47cc8ed1d11a50e306fd65f" Sep 30 07:53:31 crc kubenswrapper[4760]: I0930 07:53:31.905160 4760 scope.go:117] "RemoveContainer" containerID="096ff25098794e61c05471779e2b1da5def0724c467733da81ab266ab873f450" Sep 30 07:53:31 crc kubenswrapper[4760]: I0930 07:53:31.909311 4760 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Sep 30 07:53:31 crc kubenswrapper[4760]: I0930 07:53:31.933475 4760 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Sep 30 07:53:31 crc kubenswrapper[4760]: I0930 07:53:31.948211 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Sep 30 07:53:31 crc kubenswrapper[4760]: E0930 07:53:31.949462 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1d3f6c3d-070f-494e-8b47-856d54de039c" containerName="nova-metadata-log" Sep 30 07:53:31 crc kubenswrapper[4760]: I0930 07:53:31.949486 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="1d3f6c3d-070f-494e-8b47-856d54de039c" containerName="nova-metadata-log" Sep 30 07:53:31 crc kubenswrapper[4760]: E0930 07:53:31.949518 4760 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="1d3f6c3d-070f-494e-8b47-856d54de039c" containerName="nova-metadata-metadata" Sep 30 07:53:31 crc kubenswrapper[4760]: I0930 07:53:31.949526 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="1d3f6c3d-070f-494e-8b47-856d54de039c" containerName="nova-metadata-metadata" Sep 30 07:53:31 crc kubenswrapper[4760]: I0930 07:53:31.949695 4760 memory_manager.go:354] "RemoveStaleState removing state" podUID="1d3f6c3d-070f-494e-8b47-856d54de039c" containerName="nova-metadata-metadata" Sep 30 07:53:31 crc kubenswrapper[4760]: I0930 07:53:31.949717 4760 memory_manager.go:354] "RemoveStaleState removing state" podUID="1d3f6c3d-070f-494e-8b47-856d54de039c" containerName="nova-metadata-log" Sep 30 07:53:31 crc kubenswrapper[4760]: I0930 07:53:31.950911 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Sep 30 07:53:31 crc kubenswrapper[4760]: I0930 07:53:31.954752 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Sep 30 07:53:31 crc kubenswrapper[4760]: I0930 07:53:31.955020 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Sep 30 07:53:31 crc kubenswrapper[4760]: I0930 07:53:31.962827 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Sep 30 07:53:31 crc kubenswrapper[4760]: I0930 07:53:31.980121 4760 scope.go:117] "RemoveContainer" containerID="827afd0fe268d79933117c8266e31aabbecdf70bc47cc8ed1d11a50e306fd65f" Sep 30 07:53:31 crc kubenswrapper[4760]: E0930 07:53:31.984665 4760 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"827afd0fe268d79933117c8266e31aabbecdf70bc47cc8ed1d11a50e306fd65f\": container with ID starting with 827afd0fe268d79933117c8266e31aabbecdf70bc47cc8ed1d11a50e306fd65f not found: ID does not exist" 
containerID="827afd0fe268d79933117c8266e31aabbecdf70bc47cc8ed1d11a50e306fd65f" Sep 30 07:53:31 crc kubenswrapper[4760]: I0930 07:53:31.984717 4760 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"827afd0fe268d79933117c8266e31aabbecdf70bc47cc8ed1d11a50e306fd65f"} err="failed to get container status \"827afd0fe268d79933117c8266e31aabbecdf70bc47cc8ed1d11a50e306fd65f\": rpc error: code = NotFound desc = could not find container \"827afd0fe268d79933117c8266e31aabbecdf70bc47cc8ed1d11a50e306fd65f\": container with ID starting with 827afd0fe268d79933117c8266e31aabbecdf70bc47cc8ed1d11a50e306fd65f not found: ID does not exist" Sep 30 07:53:31 crc kubenswrapper[4760]: I0930 07:53:31.984763 4760 scope.go:117] "RemoveContainer" containerID="096ff25098794e61c05471779e2b1da5def0724c467733da81ab266ab873f450" Sep 30 07:53:31 crc kubenswrapper[4760]: E0930 07:53:31.985199 4760 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"096ff25098794e61c05471779e2b1da5def0724c467733da81ab266ab873f450\": container with ID starting with 096ff25098794e61c05471779e2b1da5def0724c467733da81ab266ab873f450 not found: ID does not exist" containerID="096ff25098794e61c05471779e2b1da5def0724c467733da81ab266ab873f450" Sep 30 07:53:31 crc kubenswrapper[4760]: I0930 07:53:31.985220 4760 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"096ff25098794e61c05471779e2b1da5def0724c467733da81ab266ab873f450"} err="failed to get container status \"096ff25098794e61c05471779e2b1da5def0724c467733da81ab266ab873f450\": rpc error: code = NotFound desc = could not find container \"096ff25098794e61c05471779e2b1da5def0724c467733da81ab266ab873f450\": container with ID starting with 096ff25098794e61c05471779e2b1da5def0724c467733da81ab266ab873f450 not found: ID does not exist" Sep 30 07:53:32 crc kubenswrapper[4760]: I0930 07:53:32.058239 4760 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m74bp\" (UniqueName: \"kubernetes.io/projected/10427d42-4cfc-486e-931c-fd62a2a5b1e5-kube-api-access-m74bp\") pod \"nova-metadata-0\" (UID: \"10427d42-4cfc-486e-931c-fd62a2a5b1e5\") " pod="openstack/nova-metadata-0" Sep 30 07:53:32 crc kubenswrapper[4760]: I0930 07:53:32.058320 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/10427d42-4cfc-486e-931c-fd62a2a5b1e5-config-data\") pod \"nova-metadata-0\" (UID: \"10427d42-4cfc-486e-931c-fd62a2a5b1e5\") " pod="openstack/nova-metadata-0" Sep 30 07:53:32 crc kubenswrapper[4760]: I0930 07:53:32.058373 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/10427d42-4cfc-486e-931c-fd62a2a5b1e5-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"10427d42-4cfc-486e-931c-fd62a2a5b1e5\") " pod="openstack/nova-metadata-0" Sep 30 07:53:32 crc kubenswrapper[4760]: I0930 07:53:32.058394 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/10427d42-4cfc-486e-931c-fd62a2a5b1e5-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"10427d42-4cfc-486e-931c-fd62a2a5b1e5\") " pod="openstack/nova-metadata-0" Sep 30 07:53:32 crc kubenswrapper[4760]: I0930 07:53:32.058428 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/10427d42-4cfc-486e-931c-fd62a2a5b1e5-logs\") pod \"nova-metadata-0\" (UID: \"10427d42-4cfc-486e-931c-fd62a2a5b1e5\") " pod="openstack/nova-metadata-0" Sep 30 07:53:32 crc kubenswrapper[4760]: I0930 07:53:32.102794 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] 
Sep 30 07:53:32 crc kubenswrapper[4760]: W0930 07:53:32.104117 4760 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod868868bd_3879_4d24_9dd1_62218a15844c.slice/crio-ac18bdc459e935621c9b0154431b76c78c31ae8e4e73a6f6bff85eafda9fabae WatchSource:0}: Error finding container ac18bdc459e935621c9b0154431b76c78c31ae8e4e73a6f6bff85eafda9fabae: Status 404 returned error can't find the container with id ac18bdc459e935621c9b0154431b76c78c31ae8e4e73a6f6bff85eafda9fabae Sep 30 07:53:32 crc kubenswrapper[4760]: I0930 07:53:32.160496 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m74bp\" (UniqueName: \"kubernetes.io/projected/10427d42-4cfc-486e-931c-fd62a2a5b1e5-kube-api-access-m74bp\") pod \"nova-metadata-0\" (UID: \"10427d42-4cfc-486e-931c-fd62a2a5b1e5\") " pod="openstack/nova-metadata-0" Sep 30 07:53:32 crc kubenswrapper[4760]: I0930 07:53:32.160585 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/10427d42-4cfc-486e-931c-fd62a2a5b1e5-config-data\") pod \"nova-metadata-0\" (UID: \"10427d42-4cfc-486e-931c-fd62a2a5b1e5\") " pod="openstack/nova-metadata-0" Sep 30 07:53:32 crc kubenswrapper[4760]: I0930 07:53:32.160673 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/10427d42-4cfc-486e-931c-fd62a2a5b1e5-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"10427d42-4cfc-486e-931c-fd62a2a5b1e5\") " pod="openstack/nova-metadata-0" Sep 30 07:53:32 crc kubenswrapper[4760]: I0930 07:53:32.160707 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/10427d42-4cfc-486e-931c-fd62a2a5b1e5-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"10427d42-4cfc-486e-931c-fd62a2a5b1e5\") " 
pod="openstack/nova-metadata-0" Sep 30 07:53:32 crc kubenswrapper[4760]: I0930 07:53:32.160807 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/10427d42-4cfc-486e-931c-fd62a2a5b1e5-logs\") pod \"nova-metadata-0\" (UID: \"10427d42-4cfc-486e-931c-fd62a2a5b1e5\") " pod="openstack/nova-metadata-0" Sep 30 07:53:32 crc kubenswrapper[4760]: I0930 07:53:32.161439 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/10427d42-4cfc-486e-931c-fd62a2a5b1e5-logs\") pod \"nova-metadata-0\" (UID: \"10427d42-4cfc-486e-931c-fd62a2a5b1e5\") " pod="openstack/nova-metadata-0" Sep 30 07:53:32 crc kubenswrapper[4760]: I0930 07:53:32.164514 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/10427d42-4cfc-486e-931c-fd62a2a5b1e5-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"10427d42-4cfc-486e-931c-fd62a2a5b1e5\") " pod="openstack/nova-metadata-0" Sep 30 07:53:32 crc kubenswrapper[4760]: I0930 07:53:32.165466 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/10427d42-4cfc-486e-931c-fd62a2a5b1e5-config-data\") pod \"nova-metadata-0\" (UID: \"10427d42-4cfc-486e-931c-fd62a2a5b1e5\") " pod="openstack/nova-metadata-0" Sep 30 07:53:32 crc kubenswrapper[4760]: I0930 07:53:32.165526 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/10427d42-4cfc-486e-931c-fd62a2a5b1e5-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"10427d42-4cfc-486e-931c-fd62a2a5b1e5\") " pod="openstack/nova-metadata-0" Sep 30 07:53:32 crc kubenswrapper[4760]: I0930 07:53:32.187953 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m74bp\" (UniqueName: 
\"kubernetes.io/projected/10427d42-4cfc-486e-931c-fd62a2a5b1e5-kube-api-access-m74bp\") pod \"nova-metadata-0\" (UID: \"10427d42-4cfc-486e-931c-fd62a2a5b1e5\") " pod="openstack/nova-metadata-0" Sep 30 07:53:32 crc kubenswrapper[4760]: I0930 07:53:32.273504 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Sep 30 07:53:32 crc kubenswrapper[4760]: I0930 07:53:32.768386 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Sep 30 07:53:32 crc kubenswrapper[4760]: I0930 07:53:32.865990 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"10427d42-4cfc-486e-931c-fd62a2a5b1e5","Type":"ContainerStarted","Data":"1a63f3cbf002096492ab097267dcebd578403e3eeb730f8ebb779c0463dc53a1"} Sep 30 07:53:32 crc kubenswrapper[4760]: I0930 07:53:32.868177 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"868868bd-3879-4d24-9dd1-62218a15844c","Type":"ContainerStarted","Data":"cba822d1ca6358c245b37087819f35cff1ac22ac04181b6eaf76a43f9d7eeec9"} Sep 30 07:53:32 crc kubenswrapper[4760]: I0930 07:53:32.868211 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"868868bd-3879-4d24-9dd1-62218a15844c","Type":"ContainerStarted","Data":"ac18bdc459e935621c9b0154431b76c78c31ae8e4e73a6f6bff85eafda9fabae"} Sep 30 07:53:32 crc kubenswrapper[4760]: I0930 07:53:32.890985 4760 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=1.890960408 podStartE2EDuration="1.890960408s" podCreationTimestamp="2025-09-30 07:53:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 07:53:32.887027868 +0000 UTC m=+1198.529934280" watchObservedRunningTime="2025-09-30 07:53:32.890960408 +0000 UTC m=+1198.533866830" Sep 30 07:53:33 
crc kubenswrapper[4760]: I0930 07:53:33.078349 4760 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1d3f6c3d-070f-494e-8b47-856d54de039c" path="/var/lib/kubelet/pods/1d3f6c3d-070f-494e-8b47-856d54de039c/volumes"
Sep 30 07:53:33 crc kubenswrapper[4760]: I0930 07:53:33.079203 4760 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6f8814ba-633a-440c-b866-98025a753fb1" path="/var/lib/kubelet/pods/6f8814ba-633a-440c-b866-98025a753fb1/volumes"
Sep 30 07:53:33 crc kubenswrapper[4760]: I0930 07:53:33.886674 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"10427d42-4cfc-486e-931c-fd62a2a5b1e5","Type":"ContainerStarted","Data":"775f54e28ef4f129e0cf13919882c2588142b917ec641eb8b8edfceae2fa9914"}
Sep 30 07:53:33 crc kubenswrapper[4760]: I0930 07:53:33.887767 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"10427d42-4cfc-486e-931c-fd62a2a5b1e5","Type":"ContainerStarted","Data":"091fdb6ffba9f8cc8a6d4f90935f99439b984e29acc54796a1f42d58f1e75bcd"}
Sep 30 07:53:33 crc kubenswrapper[4760]: I0930 07:53:33.903420 4760 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.903399899 podStartE2EDuration="2.903399899s" podCreationTimestamp="2025-09-30 07:53:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 07:53:33.902934928 +0000 UTC m=+1199.545841340" watchObservedRunningTime="2025-09-30 07:53:33.903399899 +0000 UTC m=+1199.546306311"
Sep 30 07:53:36 crc kubenswrapper[4760]: I0930 07:53:36.569439 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0"
Sep 30 07:53:37 crc kubenswrapper[4760]: I0930 07:53:37.207494 4760 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0"
Sep 30 07:53:37 crc kubenswrapper[4760]: I0930 07:53:37.207745 4760 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0"
Sep 30 07:53:37 crc kubenswrapper[4760]: I0930 07:53:37.274609 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0"
Sep 30 07:53:37 crc kubenswrapper[4760]: I0930 07:53:37.274659 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0"
Sep 30 07:53:38 crc kubenswrapper[4760]: I0930 07:53:38.222456 4760 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="c27b43ce-27fb-4163-b55a-98a7e9ee7d71" containerName="nova-api-log" probeResult="failure" output="Get \"https://10.217.0.219:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Sep 30 07:53:38 crc kubenswrapper[4760]: I0930 07:53:38.222469 4760 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="c27b43ce-27fb-4163-b55a-98a7e9ee7d71" containerName="nova-api-api" probeResult="failure" output="Get \"https://10.217.0.219:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)"
Sep 30 07:53:41 crc kubenswrapper[4760]: I0930 07:53:41.568847 4760 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0"
Sep 30 07:53:41 crc kubenswrapper[4760]: I0930 07:53:41.605144 4760 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0"
Sep 30 07:53:42 crc kubenswrapper[4760]: I0930 07:53:42.031269 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0"
Sep 30 07:53:42 crc kubenswrapper[4760]: I0930 07:53:42.274271 4760 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0"
Sep 30 07:53:42 crc kubenswrapper[4760]: I0930 07:53:42.274345 4760 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0"
Sep 30 07:53:43 crc kubenswrapper[4760]: I0930 07:53:43.286483 4760 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="10427d42-4cfc-486e-931c-fd62a2a5b1e5" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.221:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)"
Sep 30 07:53:43 crc kubenswrapper[4760]: I0930 07:53:43.287085 4760 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="10427d42-4cfc-486e-931c-fd62a2a5b1e5" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.221:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)"
Sep 30 07:53:46 crc kubenswrapper[4760]: I0930 07:53:46.010490 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0"
Sep 30 07:53:47 crc kubenswrapper[4760]: I0930 07:53:47.218020 4760 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0"
Sep 30 07:53:47 crc kubenswrapper[4760]: I0930 07:53:47.219178 4760 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0"
Sep 30 07:53:47 crc kubenswrapper[4760]: I0930 07:53:47.220761 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0"
Sep 30 07:53:47 crc kubenswrapper[4760]: I0930 07:53:47.221160 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0"
Sep 30 07:53:47 crc kubenswrapper[4760]: I0930 07:53:47.231952 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0"
Sep 30 07:53:47 crc kubenswrapper[4760]: I0930 07:53:47.232751 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0"
Sep 30 07:53:52 crc kubenswrapper[4760]: I0930 07:53:52.279329 4760 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0"
Sep 30 07:53:52 crc kubenswrapper[4760]: I0930 07:53:52.281099 4760 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0"
Sep 30 07:53:52 crc kubenswrapper[4760]: I0930 07:53:52.284685 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0"
Sep 30 07:53:53 crc kubenswrapper[4760]: I0930 07:53:53.116443 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0"
Sep 30 07:54:01 crc kubenswrapper[4760]: I0930 07:54:01.402449 4760 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"]
Sep 30 07:54:02 crc kubenswrapper[4760]: I0930 07:54:02.521842 4760 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"]
Sep 30 07:54:05 crc kubenswrapper[4760]: I0930 07:54:05.575536 4760 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-server-0" podUID="82b71e6c-ab34-447e-87e0-a95a9f070efe" containerName="rabbitmq" containerID="cri-o://86110cd0f7f1cd09512744889040451c585b3eae2d496a2583a787962daaf5e9" gracePeriod=604796
Sep 30 07:54:06 crc kubenswrapper[4760]: I0930 07:54:06.084648 4760 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-server-0" podUID="82b71e6c-ab34-447e-87e0-a95a9f070efe" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.107:5671: connect: connection refused"
Sep 30 07:54:06 crc kubenswrapper[4760]: I0930 07:54:06.879812 4760 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-cell1-server-0" podUID="888bbd15-0d32-47ca-9f81-94eaf8f3c4df" containerName="rabbitmq" containerID="cri-o://9301f648fb76f95a8ef0ac897f3fc18acc757fd88a0050167f03a86da422b738" gracePeriod=604796
Sep 30 07:54:12 crc kubenswrapper[4760]: I0930 07:54:12.337054 4760 generic.go:334] "Generic (PLEG): container finished" podID="82b71e6c-ab34-447e-87e0-a95a9f070efe" containerID="86110cd0f7f1cd09512744889040451c585b3eae2d496a2583a787962daaf5e9" exitCode=0
Sep 30 07:54:12 crc kubenswrapper[4760]: I0930 07:54:12.337278 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"82b71e6c-ab34-447e-87e0-a95a9f070efe","Type":"ContainerDied","Data":"86110cd0f7f1cd09512744889040451c585b3eae2d496a2583a787962daaf5e9"}
Sep 30 07:54:12 crc kubenswrapper[4760]: I0930 07:54:12.496904 4760 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0"
Sep 30 07:54:12 crc kubenswrapper[4760]: I0930 07:54:12.647660 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/82b71e6c-ab34-447e-87e0-a95a9f070efe-rabbitmq-erlang-cookie\") pod \"82b71e6c-ab34-447e-87e0-a95a9f070efe\" (UID: \"82b71e6c-ab34-447e-87e0-a95a9f070efe\") "
Sep 30 07:54:12 crc kubenswrapper[4760]: I0930 07:54:12.647746 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/82b71e6c-ab34-447e-87e0-a95a9f070efe-pod-info\") pod \"82b71e6c-ab34-447e-87e0-a95a9f070efe\" (UID: \"82b71e6c-ab34-447e-87e0-a95a9f070efe\") "
Sep 30 07:54:12 crc kubenswrapper[4760]: I0930 07:54:12.647898 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/82b71e6c-ab34-447e-87e0-a95a9f070efe-rabbitmq-tls\") pod \"82b71e6c-ab34-447e-87e0-a95a9f070efe\" (UID: \"82b71e6c-ab34-447e-87e0-a95a9f070efe\") "
Sep 30 07:54:12 crc kubenswrapper[4760]: I0930 07:54:12.648341 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/82b71e6c-ab34-447e-87e0-a95a9f070efe-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "82b71e6c-ab34-447e-87e0-a95a9f070efe" (UID: "82b71e6c-ab34-447e-87e0-a95a9f070efe"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Sep 30 07:54:12 crc kubenswrapper[4760]: I0930 07:54:12.648908 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g6zbk\" (UniqueName: \"kubernetes.io/projected/82b71e6c-ab34-447e-87e0-a95a9f070efe-kube-api-access-g6zbk\") pod \"82b71e6c-ab34-447e-87e0-a95a9f070efe\" (UID: \"82b71e6c-ab34-447e-87e0-a95a9f070efe\") "
Sep 30 07:54:12 crc kubenswrapper[4760]: I0930 07:54:12.648942 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"82b71e6c-ab34-447e-87e0-a95a9f070efe\" (UID: \"82b71e6c-ab34-447e-87e0-a95a9f070efe\") "
Sep 30 07:54:12 crc kubenswrapper[4760]: I0930 07:54:12.648992 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/82b71e6c-ab34-447e-87e0-a95a9f070efe-server-conf\") pod \"82b71e6c-ab34-447e-87e0-a95a9f070efe\" (UID: \"82b71e6c-ab34-447e-87e0-a95a9f070efe\") "
Sep 30 07:54:12 crc kubenswrapper[4760]: I0930 07:54:12.649044 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/82b71e6c-ab34-447e-87e0-a95a9f070efe-rabbitmq-plugins\") pod \"82b71e6c-ab34-447e-87e0-a95a9f070efe\" (UID: \"82b71e6c-ab34-447e-87e0-a95a9f070efe\") "
Sep 30 07:54:12 crc kubenswrapper[4760]: I0930 07:54:12.649104 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/82b71e6c-ab34-447e-87e0-a95a9f070efe-rabbitmq-confd\") pod \"82b71e6c-ab34-447e-87e0-a95a9f070efe\" (UID: \"82b71e6c-ab34-447e-87e0-a95a9f070efe\") "
Sep 30 07:54:12 crc kubenswrapper[4760]: I0930 07:54:12.649205 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/82b71e6c-ab34-447e-87e0-a95a9f070efe-plugins-conf\") pod \"82b71e6c-ab34-447e-87e0-a95a9f070efe\" (UID: \"82b71e6c-ab34-447e-87e0-a95a9f070efe\") "
Sep 30 07:54:12 crc kubenswrapper[4760]: I0930 07:54:12.649259 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/82b71e6c-ab34-447e-87e0-a95a9f070efe-erlang-cookie-secret\") pod \"82b71e6c-ab34-447e-87e0-a95a9f070efe\" (UID: \"82b71e6c-ab34-447e-87e0-a95a9f070efe\") "
Sep 30 07:54:12 crc kubenswrapper[4760]: I0930 07:54:12.649379 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/82b71e6c-ab34-447e-87e0-a95a9f070efe-config-data\") pod \"82b71e6c-ab34-447e-87e0-a95a9f070efe\" (UID: \"82b71e6c-ab34-447e-87e0-a95a9f070efe\") "
Sep 30 07:54:12 crc kubenswrapper[4760]: I0930 07:54:12.649761 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/82b71e6c-ab34-447e-87e0-a95a9f070efe-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "82b71e6c-ab34-447e-87e0-a95a9f070efe" (UID: "82b71e6c-ab34-447e-87e0-a95a9f070efe"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Sep 30 07:54:12 crc kubenswrapper[4760]: I0930 07:54:12.649898 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/82b71e6c-ab34-447e-87e0-a95a9f070efe-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "82b71e6c-ab34-447e-87e0-a95a9f070efe" (UID: "82b71e6c-ab34-447e-87e0-a95a9f070efe"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Sep 30 07:54:12 crc kubenswrapper[4760]: I0930 07:54:12.650348 4760 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/82b71e6c-ab34-447e-87e0-a95a9f070efe-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\""
Sep 30 07:54:12 crc kubenswrapper[4760]: I0930 07:54:12.650369 4760 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/82b71e6c-ab34-447e-87e0-a95a9f070efe-rabbitmq-plugins\") on node \"crc\" DevicePath \"\""
Sep 30 07:54:12 crc kubenswrapper[4760]: I0930 07:54:12.650379 4760 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/82b71e6c-ab34-447e-87e0-a95a9f070efe-plugins-conf\") on node \"crc\" DevicePath \"\""
Sep 30 07:54:12 crc kubenswrapper[4760]: I0930 07:54:12.653916 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/82b71e6c-ab34-447e-87e0-a95a9f070efe-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "82b71e6c-ab34-447e-87e0-a95a9f070efe" (UID: "82b71e6c-ab34-447e-87e0-a95a9f070efe"). InnerVolumeSpecName "rabbitmq-tls". PluginName "kubernetes.io/projected", VolumeGidValue ""
Sep 30 07:54:12 crc kubenswrapper[4760]: I0930 07:54:12.654162 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/82b71e6c-ab34-447e-87e0-a95a9f070efe-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "82b71e6c-ab34-447e-87e0-a95a9f070efe" (UID: "82b71e6c-ab34-447e-87e0-a95a9f070efe"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue ""
Sep 30 07:54:12 crc kubenswrapper[4760]: I0930 07:54:12.672034 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/82b71e6c-ab34-447e-87e0-a95a9f070efe-kube-api-access-g6zbk" (OuterVolumeSpecName: "kube-api-access-g6zbk") pod "82b71e6c-ab34-447e-87e0-a95a9f070efe" (UID: "82b71e6c-ab34-447e-87e0-a95a9f070efe"). InnerVolumeSpecName "kube-api-access-g6zbk". PluginName "kubernetes.io/projected", VolumeGidValue ""
Sep 30 07:54:12 crc kubenswrapper[4760]: I0930 07:54:12.673003 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/82b71e6c-ab34-447e-87e0-a95a9f070efe-pod-info" (OuterVolumeSpecName: "pod-info") pod "82b71e6c-ab34-447e-87e0-a95a9f070efe" (UID: "82b71e6c-ab34-447e-87e0-a95a9f070efe"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue ""
Sep 30 07:54:12 crc kubenswrapper[4760]: I0930 07:54:12.673799 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage03-crc" (OuterVolumeSpecName: "persistence") pod "82b71e6c-ab34-447e-87e0-a95a9f070efe" (UID: "82b71e6c-ab34-447e-87e0-a95a9f070efe"). InnerVolumeSpecName "local-storage03-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue ""
Sep 30 07:54:12 crc kubenswrapper[4760]: I0930 07:54:12.705575 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/82b71e6c-ab34-447e-87e0-a95a9f070efe-config-data" (OuterVolumeSpecName: "config-data") pod "82b71e6c-ab34-447e-87e0-a95a9f070efe" (UID: "82b71e6c-ab34-447e-87e0-a95a9f070efe"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Sep 30 07:54:12 crc kubenswrapper[4760]: I0930 07:54:12.725289 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/82b71e6c-ab34-447e-87e0-a95a9f070efe-server-conf" (OuterVolumeSpecName: "server-conf") pod "82b71e6c-ab34-447e-87e0-a95a9f070efe" (UID: "82b71e6c-ab34-447e-87e0-a95a9f070efe"). InnerVolumeSpecName "server-conf". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Sep 30 07:54:12 crc kubenswrapper[4760]: I0930 07:54:12.756640 4760 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/82b71e6c-ab34-447e-87e0-a95a9f070efe-pod-info\") on node \"crc\" DevicePath \"\""
Sep 30 07:54:12 crc kubenswrapper[4760]: I0930 07:54:12.756680 4760 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/82b71e6c-ab34-447e-87e0-a95a9f070efe-rabbitmq-tls\") on node \"crc\" DevicePath \"\""
Sep 30 07:54:12 crc kubenswrapper[4760]: I0930 07:54:12.756696 4760 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g6zbk\" (UniqueName: \"kubernetes.io/projected/82b71e6c-ab34-447e-87e0-a95a9f070efe-kube-api-access-g6zbk\") on node \"crc\" DevicePath \"\""
Sep 30 07:54:12 crc kubenswrapper[4760]: I0930 07:54:12.756724 4760 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") on node \"crc\" "
Sep 30 07:54:12 crc kubenswrapper[4760]: I0930 07:54:12.756737 4760 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/82b71e6c-ab34-447e-87e0-a95a9f070efe-server-conf\") on node \"crc\" DevicePath \"\""
Sep 30 07:54:12 crc kubenswrapper[4760]: I0930 07:54:12.756749 4760 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/82b71e6c-ab34-447e-87e0-a95a9f070efe-erlang-cookie-secret\") on node \"crc\" DevicePath \"\""
Sep 30 07:54:12 crc kubenswrapper[4760]: I0930 07:54:12.756760 4760 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/82b71e6c-ab34-447e-87e0-a95a9f070efe-config-data\") on node \"crc\" DevicePath \"\""
Sep 30 07:54:12 crc kubenswrapper[4760]: I0930 07:54:12.784778 4760 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage03-crc" (UniqueName: "kubernetes.io/local-volume/local-storage03-crc") on node "crc"
Sep 30 07:54:12 crc kubenswrapper[4760]: I0930 07:54:12.798340 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/82b71e6c-ab34-447e-87e0-a95a9f070efe-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "82b71e6c-ab34-447e-87e0-a95a9f070efe" (UID: "82b71e6c-ab34-447e-87e0-a95a9f070efe"). InnerVolumeSpecName "rabbitmq-confd". PluginName "kubernetes.io/projected", VolumeGidValue ""
Sep 30 07:54:12 crc kubenswrapper[4760]: I0930 07:54:12.864069 4760 reconciler_common.go:293] "Volume detached for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") on node \"crc\" DevicePath \"\""
Sep 30 07:54:12 crc kubenswrapper[4760]: I0930 07:54:12.864105 4760 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/82b71e6c-ab34-447e-87e0-a95a9f070efe-rabbitmq-confd\") on node \"crc\" DevicePath \"\""
Sep 30 07:54:13 crc kubenswrapper[4760]: I0930 07:54:13.365478 4760 generic.go:334] "Generic (PLEG): container finished" podID="888bbd15-0d32-47ca-9f81-94eaf8f3c4df" containerID="9301f648fb76f95a8ef0ac897f3fc18acc757fd88a0050167f03a86da422b738" exitCode=0
Sep 30 07:54:13 crc kubenswrapper[4760]: I0930 07:54:13.365827 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"888bbd15-0d32-47ca-9f81-94eaf8f3c4df","Type":"ContainerDied","Data":"9301f648fb76f95a8ef0ac897f3fc18acc757fd88a0050167f03a86da422b738"}
Sep 30 07:54:13 crc kubenswrapper[4760]: I0930 07:54:13.371007 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"82b71e6c-ab34-447e-87e0-a95a9f070efe","Type":"ContainerDied","Data":"1773a95e636378477ff3ac5f0e1367a058d156fbd8e076ab13e602f4bbee54e8"}
Sep 30 07:54:13 crc kubenswrapper[4760]: I0930 07:54:13.371067 4760 scope.go:117] "RemoveContainer" containerID="86110cd0f7f1cd09512744889040451c585b3eae2d496a2583a787962daaf5e9"
Sep 30 07:54:13 crc kubenswrapper[4760]: I0930 07:54:13.371227 4760 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0"
Sep 30 07:54:13 crc kubenswrapper[4760]: I0930 07:54:13.410780 4760 scope.go:117] "RemoveContainer" containerID="6b118dc533d1475f2056129842cbda4e9708447c504c86b0c22d38fcd2a5b9b2"
Sep 30 07:54:13 crc kubenswrapper[4760]: I0930 07:54:13.422635 4760 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"]
Sep 30 07:54:13 crc kubenswrapper[4760]: I0930 07:54:13.440434 4760 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-server-0"]
Sep 30 07:54:13 crc kubenswrapper[4760]: I0930 07:54:13.466371 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"]
Sep 30 07:54:13 crc kubenswrapper[4760]: E0930 07:54:13.466840 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="82b71e6c-ab34-447e-87e0-a95a9f070efe" containerName="setup-container"
Sep 30 07:54:13 crc kubenswrapper[4760]: I0930 07:54:13.466853 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="82b71e6c-ab34-447e-87e0-a95a9f070efe" containerName="setup-container"
Sep 30 07:54:13 crc kubenswrapper[4760]: E0930 07:54:13.466871 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="82b71e6c-ab34-447e-87e0-a95a9f070efe" containerName="rabbitmq"
Sep 30 07:54:13 crc kubenswrapper[4760]: I0930 07:54:13.466877 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="82b71e6c-ab34-447e-87e0-a95a9f070efe" containerName="rabbitmq"
Sep 30 07:54:13 crc kubenswrapper[4760]: I0930 07:54:13.467094 4760 memory_manager.go:354] "RemoveStaleState removing state" podUID="82b71e6c-ab34-447e-87e0-a95a9f070efe" containerName="rabbitmq"
Sep 30 07:54:13 crc kubenswrapper[4760]: I0930 07:54:13.468131 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0"
Sep 30 07:54:13 crc kubenswrapper[4760]: I0930 07:54:13.470177 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user"
Sep 30 07:54:13 crc kubenswrapper[4760]: I0930 07:54:13.473433 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie"
Sep 30 07:54:13 crc kubenswrapper[4760]: I0930 07:54:13.473667 4760 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf"
Sep 30 07:54:13 crc kubenswrapper[4760]: I0930 07:54:13.473668 4760 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data"
Sep 30 07:54:13 crc kubenswrapper[4760]: I0930 07:54:13.474727 4760 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf"
Sep 30 07:54:13 crc kubenswrapper[4760]: I0930 07:54:13.478997 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"]
Sep 30 07:54:13 crc kubenswrapper[4760]: I0930 07:54:13.480744 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-ptvxd"
Sep 30 07:54:13 crc kubenswrapper[4760]: I0930 07:54:13.480907 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc"
Sep 30 07:54:13 crc kubenswrapper[4760]: I0930 07:54:13.553848 4760 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0"
Sep 30 07:54:13 crc kubenswrapper[4760]: I0930 07:54:13.581427 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/7e7f8143-b3b0-473b-b058-1c2fd9eaa5ac-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"7e7f8143-b3b0-473b-b058-1c2fd9eaa5ac\") " pod="openstack/rabbitmq-server-0"
Sep 30 07:54:13 crc kubenswrapper[4760]: I0930 07:54:13.581496 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/7e7f8143-b3b0-473b-b058-1c2fd9eaa5ac-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"7e7f8143-b3b0-473b-b058-1c2fd9eaa5ac\") " pod="openstack/rabbitmq-server-0"
Sep 30 07:54:13 crc kubenswrapper[4760]: I0930 07:54:13.581527 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"rabbitmq-server-0\" (UID: \"7e7f8143-b3b0-473b-b058-1c2fd9eaa5ac\") " pod="openstack/rabbitmq-server-0"
Sep 30 07:54:13 crc kubenswrapper[4760]: I0930 07:54:13.581578 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/7e7f8143-b3b0-473b-b058-1c2fd9eaa5ac-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"7e7f8143-b3b0-473b-b058-1c2fd9eaa5ac\") " pod="openstack/rabbitmq-server-0"
Sep 30 07:54:13 crc kubenswrapper[4760]: I0930 07:54:13.581609 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/7e7f8143-b3b0-473b-b058-1c2fd9eaa5ac-config-data\") pod \"rabbitmq-server-0\" (UID: \"7e7f8143-b3b0-473b-b058-1c2fd9eaa5ac\") " pod="openstack/rabbitmq-server-0"
Sep 30 07:54:13 crc kubenswrapper[4760]: I0930 07:54:13.581638 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/7e7f8143-b3b0-473b-b058-1c2fd9eaa5ac-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"7e7f8143-b3b0-473b-b058-1c2fd9eaa5ac\") " pod="openstack/rabbitmq-server-0"
Sep 30 07:54:13 crc kubenswrapper[4760]: I0930 07:54:13.581677 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/7e7f8143-b3b0-473b-b058-1c2fd9eaa5ac-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"7e7f8143-b3b0-473b-b058-1c2fd9eaa5ac\") " pod="openstack/rabbitmq-server-0"
Sep 30 07:54:13 crc kubenswrapper[4760]: I0930 07:54:13.581706 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/7e7f8143-b3b0-473b-b058-1c2fd9eaa5ac-server-conf\") pod \"rabbitmq-server-0\" (UID: \"7e7f8143-b3b0-473b-b058-1c2fd9eaa5ac\") " pod="openstack/rabbitmq-server-0"
Sep 30 07:54:13 crc kubenswrapper[4760]: I0930 07:54:13.581724 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/7e7f8143-b3b0-473b-b058-1c2fd9eaa5ac-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"7e7f8143-b3b0-473b-b058-1c2fd9eaa5ac\") " pod="openstack/rabbitmq-server-0"
Sep 30 07:54:13 crc kubenswrapper[4760]: I0930 07:54:13.581755 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/7e7f8143-b3b0-473b-b058-1c2fd9eaa5ac-pod-info\") pod \"rabbitmq-server-0\" (UID: \"7e7f8143-b3b0-473b-b058-1c2fd9eaa5ac\") " pod="openstack/rabbitmq-server-0"
Sep 30 07:54:13 crc kubenswrapper[4760]: I0930 07:54:13.581893 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vftmn\" (UniqueName: \"kubernetes.io/projected/7e7f8143-b3b0-473b-b058-1c2fd9eaa5ac-kube-api-access-vftmn\") pod \"rabbitmq-server-0\" (UID: \"7e7f8143-b3b0-473b-b058-1c2fd9eaa5ac\") " pod="openstack/rabbitmq-server-0"
Sep 30 07:54:13 crc kubenswrapper[4760]: I0930 07:54:13.682844 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lcjvh\" (UniqueName: \"kubernetes.io/projected/888bbd15-0d32-47ca-9f81-94eaf8f3c4df-kube-api-access-lcjvh\") pod \"888bbd15-0d32-47ca-9f81-94eaf8f3c4df\" (UID: \"888bbd15-0d32-47ca-9f81-94eaf8f3c4df\") "
Sep 30 07:54:13 crc kubenswrapper[4760]: I0930 07:54:13.682953 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/888bbd15-0d32-47ca-9f81-94eaf8f3c4df-config-data\") pod \"888bbd15-0d32-47ca-9f81-94eaf8f3c4df\" (UID: \"888bbd15-0d32-47ca-9f81-94eaf8f3c4df\") "
Sep 30 07:54:13 crc kubenswrapper[4760]: I0930 07:54:13.683043 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/888bbd15-0d32-47ca-9f81-94eaf8f3c4df-erlang-cookie-secret\") pod \"888bbd15-0d32-47ca-9f81-94eaf8f3c4df\" (UID: \"888bbd15-0d32-47ca-9f81-94eaf8f3c4df\") "
Sep 30 07:54:13 crc kubenswrapper[4760]: I0930 07:54:13.683076 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/888bbd15-0d32-47ca-9f81-94eaf8f3c4df-rabbitmq-plugins\") pod \"888bbd15-0d32-47ca-9f81-94eaf8f3c4df\" (UID: \"888bbd15-0d32-47ca-9f81-94eaf8f3c4df\") "
Sep 30 07:54:13 crc kubenswrapper[4760]: I0930 07:54:13.683575 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/888bbd15-0d32-47ca-9f81-94eaf8f3c4df-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "888bbd15-0d32-47ca-9f81-94eaf8f3c4df" (UID: "888bbd15-0d32-47ca-9f81-94eaf8f3c4df"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Sep 30 07:54:13 crc kubenswrapper[4760]: I0930 07:54:13.684040 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/888bbd15-0d32-47ca-9f81-94eaf8f3c4df-rabbitmq-erlang-cookie\") pod \"888bbd15-0d32-47ca-9f81-94eaf8f3c4df\" (UID: \"888bbd15-0d32-47ca-9f81-94eaf8f3c4df\") "
Sep 30 07:54:13 crc kubenswrapper[4760]: I0930 07:54:13.684067 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/888bbd15-0d32-47ca-9f81-94eaf8f3c4df-rabbitmq-tls\") pod \"888bbd15-0d32-47ca-9f81-94eaf8f3c4df\" (UID: \"888bbd15-0d32-47ca-9f81-94eaf8f3c4df\") "
Sep 30 07:54:13 crc kubenswrapper[4760]: I0930 07:54:13.684110 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/888bbd15-0d32-47ca-9f81-94eaf8f3c4df-plugins-conf\") pod \"888bbd15-0d32-47ca-9f81-94eaf8f3c4df\" (UID: \"888bbd15-0d32-47ca-9f81-94eaf8f3c4df\") "
Sep 30 07:54:13 crc kubenswrapper[4760]: I0930 07:54:13.684142 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"888bbd15-0d32-47ca-9f81-94eaf8f3c4df\" (UID: \"888bbd15-0d32-47ca-9f81-94eaf8f3c4df\") "
Sep 30 07:54:13 crc kubenswrapper[4760]: I0930 07:54:13.684228 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/888bbd15-0d32-47ca-9f81-94eaf8f3c4df-server-conf\") pod \"888bbd15-0d32-47ca-9f81-94eaf8f3c4df\" (UID: \"888bbd15-0d32-47ca-9f81-94eaf8f3c4df\") "
Sep 30 07:54:13 crc kubenswrapper[4760]: I0930 07:54:13.684256 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/888bbd15-0d32-47ca-9f81-94eaf8f3c4df-pod-info\") pod \"888bbd15-0d32-47ca-9f81-94eaf8f3c4df\" (UID: \"888bbd15-0d32-47ca-9f81-94eaf8f3c4df\") "
Sep 30 07:54:13 crc kubenswrapper[4760]: I0930 07:54:13.684311 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/888bbd15-0d32-47ca-9f81-94eaf8f3c4df-rabbitmq-confd\") pod \"888bbd15-0d32-47ca-9f81-94eaf8f3c4df\" (UID: \"888bbd15-0d32-47ca-9f81-94eaf8f3c4df\") "
Sep 30 07:54:13 crc kubenswrapper[4760]: I0930 07:54:13.684414 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/888bbd15-0d32-47ca-9f81-94eaf8f3c4df-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "888bbd15-0d32-47ca-9f81-94eaf8f3c4df" (UID: "888bbd15-0d32-47ca-9f81-94eaf8f3c4df"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Sep 30 07:54:13 crc kubenswrapper[4760]: I0930 07:54:13.684560 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/7e7f8143-b3b0-473b-b058-1c2fd9eaa5ac-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"7e7f8143-b3b0-473b-b058-1c2fd9eaa5ac\") " pod="openstack/rabbitmq-server-0"
Sep 30 07:54:13 crc kubenswrapper[4760]: I0930 07:54:13.684612 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/7e7f8143-b3b0-473b-b058-1c2fd9eaa5ac-config-data\") pod \"rabbitmq-server-0\" (UID: \"7e7f8143-b3b0-473b-b058-1c2fd9eaa5ac\") " pod="openstack/rabbitmq-server-0"
Sep 30 07:54:13 crc kubenswrapper[4760]: I0930 07:54:13.684687 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/7e7f8143-b3b0-473b-b058-1c2fd9eaa5ac-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"7e7f8143-b3b0-473b-b058-1c2fd9eaa5ac\") " pod="openstack/rabbitmq-server-0"
Sep 30 07:54:13 crc kubenswrapper[4760]: I0930 07:54:13.684704 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/888bbd15-0d32-47ca-9f81-94eaf8f3c4df-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "888bbd15-0d32-47ca-9f81-94eaf8f3c4df" (UID: "888bbd15-0d32-47ca-9f81-94eaf8f3c4df"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Sep 30 07:54:13 crc kubenswrapper[4760]: I0930 07:54:13.685431 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/7e7f8143-b3b0-473b-b058-1c2fd9eaa5ac-config-data\") pod \"rabbitmq-server-0\" (UID: \"7e7f8143-b3b0-473b-b058-1c2fd9eaa5ac\") " pod="openstack/rabbitmq-server-0"
Sep 30 07:54:13 crc kubenswrapper[4760]: I0930 07:54:13.685750 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/7e7f8143-b3b0-473b-b058-1c2fd9eaa5ac-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"7e7f8143-b3b0-473b-b058-1c2fd9eaa5ac\") " pod="openstack/rabbitmq-server-0"
Sep 30 07:54:13 crc kubenswrapper[4760]: I0930 07:54:13.685791 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/7e7f8143-b3b0-473b-b058-1c2fd9eaa5ac-server-conf\") pod \"rabbitmq-server-0\" (UID: \"7e7f8143-b3b0-473b-b058-1c2fd9eaa5ac\") " pod="openstack/rabbitmq-server-0"
Sep 30 07:54:13 crc kubenswrapper[4760]: I0930 07:54:13.685817 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/7e7f8143-b3b0-473b-b058-1c2fd9eaa5ac-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"7e7f8143-b3b0-473b-b058-1c2fd9eaa5ac\") " pod="openstack/rabbitmq-server-0"
Sep 30 07:54:13 crc kubenswrapper[4760]: I0930 07:54:13.685861 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/7e7f8143-b3b0-473b-b058-1c2fd9eaa5ac-pod-info\") pod \"rabbitmq-server-0\" (UID: \"7e7f8143-b3b0-473b-b058-1c2fd9eaa5ac\") " pod="openstack/rabbitmq-server-0"
Sep 30 07:54:13 crc kubenswrapper[4760]: I0930 07:54:13.685885 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vftmn\" (UniqueName: \"kubernetes.io/projected/7e7f8143-b3b0-473b-b058-1c2fd9eaa5ac-kube-api-access-vftmn\") pod \"rabbitmq-server-0\" (UID: \"7e7f8143-b3b0-473b-b058-1c2fd9eaa5ac\") " pod="openstack/rabbitmq-server-0"
Sep 30 07:54:13 crc kubenswrapper[4760]: I0930 07:54:13.685935 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/7e7f8143-b3b0-473b-b058-1c2fd9eaa5ac-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"7e7f8143-b3b0-473b-b058-1c2fd9eaa5ac\") " pod="openstack/rabbitmq-server-0"
Sep 30 07:54:13 crc kubenswrapper[4760]: I0930 07:54:13.685998 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/7e7f8143-b3b0-473b-b058-1c2fd9eaa5ac-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"7e7f8143-b3b0-473b-b058-1c2fd9eaa5ac\") " pod="openstack/rabbitmq-server-0"
Sep 30 07:54:13 crc kubenswrapper[4760]: I0930 07:54:13.686029 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"rabbitmq-server-0\" (UID: \"7e7f8143-b3b0-473b-b058-1c2fd9eaa5ac\") " pod="openstack/rabbitmq-server-0"
Sep 30 07:54:13 crc kubenswrapper[4760]: I0930 07:54:13.686092 4760 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/888bbd15-0d32-47ca-9f81-94eaf8f3c4df-rabbitmq-plugins\") on node \"crc\" DevicePath \"\""
Sep 30 07:54:13 crc kubenswrapper[4760]: I0930 07:54:13.686103 4760 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/888bbd15-0d32-47ca-9f81-94eaf8f3c4df-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\""
Sep 30 07:54:13 crc kubenswrapper[4760]: I0930 07:54:13.686114 4760 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/888bbd15-0d32-47ca-9f81-94eaf8f3c4df-plugins-conf\") on node \"crc\" DevicePath \"\""
Sep 30 07:54:13 crc kubenswrapper[4760]: I0930 07:54:13.686248 4760 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"rabbitmq-server-0\" (UID: \"7e7f8143-b3b0-473b-b058-1c2fd9eaa5ac\") device mount path \"/mnt/openstack/pv03\"" pod="openstack/rabbitmq-server-0"
Sep 30 07:54:13 crc kubenswrapper[4760]: I0930 07:54:13.687632 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/7e7f8143-b3b0-473b-b058-1c2fd9eaa5ac-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"7e7f8143-b3b0-473b-b058-1c2fd9eaa5ac\") " pod="openstack/rabbitmq-server-0"
Sep 30 07:54:13 crc kubenswrapper[4760]: I0930 07:54:13.687805 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/7e7f8143-b3b0-473b-b058-1c2fd9eaa5ac-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"7e7f8143-b3b0-473b-b058-1c2fd9eaa5ac\") " pod="openstack/rabbitmq-server-0"
Sep 30 07:54:13 crc kubenswrapper[4760]: I0930 07:54:13.689543 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/7e7f8143-b3b0-473b-b058-1c2fd9eaa5ac-server-conf\") pod \"rabbitmq-server-0\" (UID: \"7e7f8143-b3b0-473b-b058-1c2fd9eaa5ac\") " pod="openstack/rabbitmq-server-0"
Sep 30 07:54:13 crc kubenswrapper[4760]: I0930 07:54:13.689839 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/888bbd15-0d32-47ca-9f81-94eaf8f3c4df-pod-info" (OuterVolumeSpecName: "pod-info") pod "888bbd15-0d32-47ca-9f81-94eaf8f3c4df" (UID: "888bbd15-0d32-47ca-9f81-94eaf8f3c4df"). InnerVolumeSpecName "pod-info". 
PluginName "kubernetes.io/downward-api", VolumeGidValue "" Sep 30 07:54:13 crc kubenswrapper[4760]: I0930 07:54:13.690646 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage12-crc" (OuterVolumeSpecName: "persistence") pod "888bbd15-0d32-47ca-9f81-94eaf8f3c4df" (UID: "888bbd15-0d32-47ca-9f81-94eaf8f3c4df"). InnerVolumeSpecName "local-storage12-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Sep 30 07:54:13 crc kubenswrapper[4760]: I0930 07:54:13.690676 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/888bbd15-0d32-47ca-9f81-94eaf8f3c4df-kube-api-access-lcjvh" (OuterVolumeSpecName: "kube-api-access-lcjvh") pod "888bbd15-0d32-47ca-9f81-94eaf8f3c4df" (UID: "888bbd15-0d32-47ca-9f81-94eaf8f3c4df"). InnerVolumeSpecName "kube-api-access-lcjvh". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 07:54:13 crc kubenswrapper[4760]: I0930 07:54:13.690788 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/7e7f8143-b3b0-473b-b058-1c2fd9eaa5ac-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"7e7f8143-b3b0-473b-b058-1c2fd9eaa5ac\") " pod="openstack/rabbitmq-server-0" Sep 30 07:54:13 crc kubenswrapper[4760]: I0930 07:54:13.693038 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/888bbd15-0d32-47ca-9f81-94eaf8f3c4df-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "888bbd15-0d32-47ca-9f81-94eaf8f3c4df" (UID: "888bbd15-0d32-47ca-9f81-94eaf8f3c4df"). InnerVolumeSpecName "rabbitmq-tls". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 07:54:13 crc kubenswrapper[4760]: I0930 07:54:13.693550 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/7e7f8143-b3b0-473b-b058-1c2fd9eaa5ac-pod-info\") pod \"rabbitmq-server-0\" (UID: \"7e7f8143-b3b0-473b-b058-1c2fd9eaa5ac\") " pod="openstack/rabbitmq-server-0" Sep 30 07:54:13 crc kubenswrapper[4760]: I0930 07:54:13.693716 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/888bbd15-0d32-47ca-9f81-94eaf8f3c4df-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "888bbd15-0d32-47ca-9f81-94eaf8f3c4df" (UID: "888bbd15-0d32-47ca-9f81-94eaf8f3c4df"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 07:54:13 crc kubenswrapper[4760]: I0930 07:54:13.694843 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/7e7f8143-b3b0-473b-b058-1c2fd9eaa5ac-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"7e7f8143-b3b0-473b-b058-1c2fd9eaa5ac\") " pod="openstack/rabbitmq-server-0" Sep 30 07:54:13 crc kubenswrapper[4760]: I0930 07:54:13.696979 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/7e7f8143-b3b0-473b-b058-1c2fd9eaa5ac-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"7e7f8143-b3b0-473b-b058-1c2fd9eaa5ac\") " pod="openstack/rabbitmq-server-0" Sep 30 07:54:13 crc kubenswrapper[4760]: I0930 07:54:13.698698 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/7e7f8143-b3b0-473b-b058-1c2fd9eaa5ac-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"7e7f8143-b3b0-473b-b058-1c2fd9eaa5ac\") " pod="openstack/rabbitmq-server-0" Sep 30 07:54:13 crc kubenswrapper[4760]: I0930 
07:54:13.708050 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vftmn\" (UniqueName: \"kubernetes.io/projected/7e7f8143-b3b0-473b-b058-1c2fd9eaa5ac-kube-api-access-vftmn\") pod \"rabbitmq-server-0\" (UID: \"7e7f8143-b3b0-473b-b058-1c2fd9eaa5ac\") " pod="openstack/rabbitmq-server-0" Sep 30 07:54:13 crc kubenswrapper[4760]: I0930 07:54:13.737774 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/888bbd15-0d32-47ca-9f81-94eaf8f3c4df-config-data" (OuterVolumeSpecName: "config-data") pod "888bbd15-0d32-47ca-9f81-94eaf8f3c4df" (UID: "888bbd15-0d32-47ca-9f81-94eaf8f3c4df"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 07:54:13 crc kubenswrapper[4760]: I0930 07:54:13.738497 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"rabbitmq-server-0\" (UID: \"7e7f8143-b3b0-473b-b058-1c2fd9eaa5ac\") " pod="openstack/rabbitmq-server-0" Sep 30 07:54:13 crc kubenswrapper[4760]: I0930 07:54:13.758757 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/888bbd15-0d32-47ca-9f81-94eaf8f3c4df-server-conf" (OuterVolumeSpecName: "server-conf") pod "888bbd15-0d32-47ca-9f81-94eaf8f3c4df" (UID: "888bbd15-0d32-47ca-9f81-94eaf8f3c4df"). InnerVolumeSpecName "server-conf". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 07:54:13 crc kubenswrapper[4760]: I0930 07:54:13.788794 4760 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/888bbd15-0d32-47ca-9f81-94eaf8f3c4df-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Sep 30 07:54:13 crc kubenswrapper[4760]: I0930 07:54:13.788836 4760 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/888bbd15-0d32-47ca-9f81-94eaf8f3c4df-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Sep 30 07:54:13 crc kubenswrapper[4760]: I0930 07:54:13.789101 4760 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") on node \"crc\" " Sep 30 07:54:13 crc kubenswrapper[4760]: I0930 07:54:13.789193 4760 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/888bbd15-0d32-47ca-9f81-94eaf8f3c4df-server-conf\") on node \"crc\" DevicePath \"\"" Sep 30 07:54:13 crc kubenswrapper[4760]: I0930 07:54:13.789248 4760 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/888bbd15-0d32-47ca-9f81-94eaf8f3c4df-pod-info\") on node \"crc\" DevicePath \"\"" Sep 30 07:54:13 crc kubenswrapper[4760]: I0930 07:54:13.789318 4760 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lcjvh\" (UniqueName: \"kubernetes.io/projected/888bbd15-0d32-47ca-9f81-94eaf8f3c4df-kube-api-access-lcjvh\") on node \"crc\" DevicePath \"\"" Sep 30 07:54:13 crc kubenswrapper[4760]: I0930 07:54:13.789407 4760 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/888bbd15-0d32-47ca-9f81-94eaf8f3c4df-config-data\") on node \"crc\" DevicePath \"\"" Sep 30 07:54:13 crc kubenswrapper[4760]: I0930 07:54:13.826404 4760 
operation_generator.go:917] UnmountDevice succeeded for volume "local-storage12-crc" (UniqueName: "kubernetes.io/local-volume/local-storage12-crc") on node "crc" Sep 30 07:54:13 crc kubenswrapper[4760]: I0930 07:54:13.841644 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/888bbd15-0d32-47ca-9f81-94eaf8f3c4df-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "888bbd15-0d32-47ca-9f81-94eaf8f3c4df" (UID: "888bbd15-0d32-47ca-9f81-94eaf8f3c4df"). InnerVolumeSpecName "rabbitmq-confd". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 07:54:13 crc kubenswrapper[4760]: I0930 07:54:13.853437 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Sep 30 07:54:13 crc kubenswrapper[4760]: I0930 07:54:13.891089 4760 reconciler_common.go:293] "Volume detached for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") on node \"crc\" DevicePath \"\"" Sep 30 07:54:13 crc kubenswrapper[4760]: I0930 07:54:13.891125 4760 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/888bbd15-0d32-47ca-9f81-94eaf8f3c4df-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Sep 30 07:54:14 crc kubenswrapper[4760]: I0930 07:54:14.314250 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Sep 30 07:54:14 crc kubenswrapper[4760]: I0930 07:54:14.381330 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"888bbd15-0d32-47ca-9f81-94eaf8f3c4df","Type":"ContainerDied","Data":"0a065cdb91978c59cd10d49cd43725b135a52e29266c8c13fece95c6c52b40e2"} Sep 30 07:54:14 crc kubenswrapper[4760]: I0930 07:54:14.381393 4760 scope.go:117] "RemoveContainer" containerID="9301f648fb76f95a8ef0ac897f3fc18acc757fd88a0050167f03a86da422b738" Sep 30 07:54:14 crc kubenswrapper[4760]: I0930 07:54:14.381390 4760 
util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Sep 30 07:54:14 crc kubenswrapper[4760]: I0930 07:54:14.385595 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"7e7f8143-b3b0-473b-b058-1c2fd9eaa5ac","Type":"ContainerStarted","Data":"b0f609280a8fda6433372c72b83088bf825da059da9b801ab57794a195de1688"} Sep 30 07:54:14 crc kubenswrapper[4760]: I0930 07:54:14.452509 4760 scope.go:117] "RemoveContainer" containerID="039b885c04db48743019cb8fd332d719e0236e3f107ba92ef62a4f193fe33d92" Sep 30 07:54:14 crc kubenswrapper[4760]: I0930 07:54:14.501384 4760 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Sep 30 07:54:14 crc kubenswrapper[4760]: I0930 07:54:14.532068 4760 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Sep 30 07:54:14 crc kubenswrapper[4760]: I0930 07:54:14.546505 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Sep 30 07:54:14 crc kubenswrapper[4760]: E0930 07:54:14.546966 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="888bbd15-0d32-47ca-9f81-94eaf8f3c4df" containerName="setup-container" Sep 30 07:54:14 crc kubenswrapper[4760]: I0930 07:54:14.546989 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="888bbd15-0d32-47ca-9f81-94eaf8f3c4df" containerName="setup-container" Sep 30 07:54:14 crc kubenswrapper[4760]: E0930 07:54:14.547027 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="888bbd15-0d32-47ca-9f81-94eaf8f3c4df" containerName="rabbitmq" Sep 30 07:54:14 crc kubenswrapper[4760]: I0930 07:54:14.547036 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="888bbd15-0d32-47ca-9f81-94eaf8f3c4df" containerName="rabbitmq" Sep 30 07:54:14 crc kubenswrapper[4760]: I0930 07:54:14.547281 4760 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="888bbd15-0d32-47ca-9f81-94eaf8f3c4df" containerName="rabbitmq" Sep 30 07:54:14 crc kubenswrapper[4760]: I0930 07:54:14.549582 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Sep 30 07:54:14 crc kubenswrapper[4760]: I0930 07:54:14.553793 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-mbsbc" Sep 30 07:54:14 crc kubenswrapper[4760]: I0930 07:54:14.553992 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Sep 30 07:54:14 crc kubenswrapper[4760]: I0930 07:54:14.554035 4760 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Sep 30 07:54:14 crc kubenswrapper[4760]: I0930 07:54:14.554138 4760 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Sep 30 07:54:14 crc kubenswrapper[4760]: I0930 07:54:14.554273 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Sep 30 07:54:14 crc kubenswrapper[4760]: I0930 07:54:14.554831 4760 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data" Sep 30 07:54:14 crc kubenswrapper[4760]: I0930 07:54:14.554987 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc" Sep 30 07:54:14 crc kubenswrapper[4760]: I0930 07:54:14.559486 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Sep 30 07:54:14 crc kubenswrapper[4760]: I0930 07:54:14.716891 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/2750016d-97a4-4e2b-a0e8-a03ddd6d64bb-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"2750016d-97a4-4e2b-a0e8-a03ddd6d64bb\") " 
pod="openstack/rabbitmq-cell1-server-0" Sep 30 07:54:14 crc kubenswrapper[4760]: I0930 07:54:14.716961 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/2750016d-97a4-4e2b-a0e8-a03ddd6d64bb-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"2750016d-97a4-4e2b-a0e8-a03ddd6d64bb\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 07:54:14 crc kubenswrapper[4760]: I0930 07:54:14.717173 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/2750016d-97a4-4e2b-a0e8-a03ddd6d64bb-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"2750016d-97a4-4e2b-a0e8-a03ddd6d64bb\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 07:54:14 crc kubenswrapper[4760]: I0930 07:54:14.717272 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/2750016d-97a4-4e2b-a0e8-a03ddd6d64bb-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"2750016d-97a4-4e2b-a0e8-a03ddd6d64bb\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 07:54:14 crc kubenswrapper[4760]: I0930 07:54:14.717502 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"2750016d-97a4-4e2b-a0e8-a03ddd6d64bb\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 07:54:14 crc kubenswrapper[4760]: I0930 07:54:14.717926 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r68sz\" (UniqueName: \"kubernetes.io/projected/2750016d-97a4-4e2b-a0e8-a03ddd6d64bb-kube-api-access-r68sz\") pod \"rabbitmq-cell1-server-0\" (UID: \"2750016d-97a4-4e2b-a0e8-a03ddd6d64bb\") " 
pod="openstack/rabbitmq-cell1-server-0" Sep 30 07:54:14 crc kubenswrapper[4760]: I0930 07:54:14.718107 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/2750016d-97a4-4e2b-a0e8-a03ddd6d64bb-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"2750016d-97a4-4e2b-a0e8-a03ddd6d64bb\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 07:54:14 crc kubenswrapper[4760]: I0930 07:54:14.718265 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/2750016d-97a4-4e2b-a0e8-a03ddd6d64bb-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"2750016d-97a4-4e2b-a0e8-a03ddd6d64bb\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 07:54:14 crc kubenswrapper[4760]: I0930 07:54:14.718367 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/2750016d-97a4-4e2b-a0e8-a03ddd6d64bb-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"2750016d-97a4-4e2b-a0e8-a03ddd6d64bb\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 07:54:14 crc kubenswrapper[4760]: I0930 07:54:14.718431 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/2750016d-97a4-4e2b-a0e8-a03ddd6d64bb-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"2750016d-97a4-4e2b-a0e8-a03ddd6d64bb\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 07:54:14 crc kubenswrapper[4760]: I0930 07:54:14.718537 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/2750016d-97a4-4e2b-a0e8-a03ddd6d64bb-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"2750016d-97a4-4e2b-a0e8-a03ddd6d64bb\") " 
pod="openstack/rabbitmq-cell1-server-0" Sep 30 07:54:14 crc kubenswrapper[4760]: I0930 07:54:14.820524 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"2750016d-97a4-4e2b-a0e8-a03ddd6d64bb\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 07:54:14 crc kubenswrapper[4760]: I0930 07:54:14.820587 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r68sz\" (UniqueName: \"kubernetes.io/projected/2750016d-97a4-4e2b-a0e8-a03ddd6d64bb-kube-api-access-r68sz\") pod \"rabbitmq-cell1-server-0\" (UID: \"2750016d-97a4-4e2b-a0e8-a03ddd6d64bb\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 07:54:14 crc kubenswrapper[4760]: I0930 07:54:14.820631 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/2750016d-97a4-4e2b-a0e8-a03ddd6d64bb-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"2750016d-97a4-4e2b-a0e8-a03ddd6d64bb\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 07:54:14 crc kubenswrapper[4760]: I0930 07:54:14.820662 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/2750016d-97a4-4e2b-a0e8-a03ddd6d64bb-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"2750016d-97a4-4e2b-a0e8-a03ddd6d64bb\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 07:54:14 crc kubenswrapper[4760]: I0930 07:54:14.820690 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/2750016d-97a4-4e2b-a0e8-a03ddd6d64bb-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"2750016d-97a4-4e2b-a0e8-a03ddd6d64bb\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 07:54:14 crc kubenswrapper[4760]: I0930 
07:54:14.820729 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/2750016d-97a4-4e2b-a0e8-a03ddd6d64bb-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"2750016d-97a4-4e2b-a0e8-a03ddd6d64bb\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 07:54:14 crc kubenswrapper[4760]: I0930 07:54:14.820757 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/2750016d-97a4-4e2b-a0e8-a03ddd6d64bb-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"2750016d-97a4-4e2b-a0e8-a03ddd6d64bb\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 07:54:14 crc kubenswrapper[4760]: I0930 07:54:14.820955 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/2750016d-97a4-4e2b-a0e8-a03ddd6d64bb-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"2750016d-97a4-4e2b-a0e8-a03ddd6d64bb\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 07:54:14 crc kubenswrapper[4760]: I0930 07:54:14.820980 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/2750016d-97a4-4e2b-a0e8-a03ddd6d64bb-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"2750016d-97a4-4e2b-a0e8-a03ddd6d64bb\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 07:54:14 crc kubenswrapper[4760]: I0930 07:54:14.820995 4760 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"2750016d-97a4-4e2b-a0e8-a03ddd6d64bb\") device mount path \"/mnt/openstack/pv12\"" pod="openstack/rabbitmq-cell1-server-0" Sep 30 07:54:14 crc kubenswrapper[4760]: I0930 07:54:14.821370 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/2750016d-97a4-4e2b-a0e8-a03ddd6d64bb-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"2750016d-97a4-4e2b-a0e8-a03ddd6d64bb\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 07:54:14 crc kubenswrapper[4760]: I0930 07:54:14.821520 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/2750016d-97a4-4e2b-a0e8-a03ddd6d64bb-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"2750016d-97a4-4e2b-a0e8-a03ddd6d64bb\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 07:54:14 crc kubenswrapper[4760]: I0930 07:54:14.821575 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/2750016d-97a4-4e2b-a0e8-a03ddd6d64bb-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"2750016d-97a4-4e2b-a0e8-a03ddd6d64bb\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 07:54:14 crc kubenswrapper[4760]: I0930 07:54:14.821629 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/2750016d-97a4-4e2b-a0e8-a03ddd6d64bb-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"2750016d-97a4-4e2b-a0e8-a03ddd6d64bb\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 07:54:14 crc kubenswrapper[4760]: I0930 07:54:14.821770 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/2750016d-97a4-4e2b-a0e8-a03ddd6d64bb-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"2750016d-97a4-4e2b-a0e8-a03ddd6d64bb\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 07:54:14 crc kubenswrapper[4760]: I0930 07:54:14.822280 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/2750016d-97a4-4e2b-a0e8-a03ddd6d64bb-plugins-conf\") pod 
\"rabbitmq-cell1-server-0\" (UID: \"2750016d-97a4-4e2b-a0e8-a03ddd6d64bb\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 07:54:14 crc kubenswrapper[4760]: I0930 07:54:14.822601 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/2750016d-97a4-4e2b-a0e8-a03ddd6d64bb-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"2750016d-97a4-4e2b-a0e8-a03ddd6d64bb\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 07:54:14 crc kubenswrapper[4760]: I0930 07:54:14.826484 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/2750016d-97a4-4e2b-a0e8-a03ddd6d64bb-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"2750016d-97a4-4e2b-a0e8-a03ddd6d64bb\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 07:54:14 crc kubenswrapper[4760]: I0930 07:54:14.827607 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/2750016d-97a4-4e2b-a0e8-a03ddd6d64bb-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"2750016d-97a4-4e2b-a0e8-a03ddd6d64bb\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 07:54:14 crc kubenswrapper[4760]: I0930 07:54:14.827970 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/2750016d-97a4-4e2b-a0e8-a03ddd6d64bb-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"2750016d-97a4-4e2b-a0e8-a03ddd6d64bb\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 07:54:14 crc kubenswrapper[4760]: I0930 07:54:14.827992 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/2750016d-97a4-4e2b-a0e8-a03ddd6d64bb-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"2750016d-97a4-4e2b-a0e8-a03ddd6d64bb\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 07:54:14 crc 
kubenswrapper[4760]: I0930 07:54:14.841054 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r68sz\" (UniqueName: \"kubernetes.io/projected/2750016d-97a4-4e2b-a0e8-a03ddd6d64bb-kube-api-access-r68sz\") pod \"rabbitmq-cell1-server-0\" (UID: \"2750016d-97a4-4e2b-a0e8-a03ddd6d64bb\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 07:54:14 crc kubenswrapper[4760]: I0930 07:54:14.854009 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"2750016d-97a4-4e2b-a0e8-a03ddd6d64bb\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 07:54:14 crc kubenswrapper[4760]: I0930 07:54:14.883163 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Sep 30 07:54:15 crc kubenswrapper[4760]: I0930 07:54:15.082657 4760 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="82b71e6c-ab34-447e-87e0-a95a9f070efe" path="/var/lib/kubelet/pods/82b71e6c-ab34-447e-87e0-a95a9f070efe/volumes" Sep 30 07:54:15 crc kubenswrapper[4760]: I0930 07:54:15.084194 4760 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="888bbd15-0d32-47ca-9f81-94eaf8f3c4df" path="/var/lib/kubelet/pods/888bbd15-0d32-47ca-9f81-94eaf8f3c4df/volumes" Sep 30 07:54:15 crc kubenswrapper[4760]: I0930 07:54:15.397581 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"7e7f8143-b3b0-473b-b058-1c2fd9eaa5ac","Type":"ContainerStarted","Data":"19e050112983eb55da6e785daba7de7c6cf9a59af4cd4c49cc450752e3b2b2f8"} Sep 30 07:54:15 crc kubenswrapper[4760]: I0930 07:54:15.433902 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Sep 30 07:54:16 crc kubenswrapper[4760]: I0930 07:54:16.409903 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/rabbitmq-cell1-server-0" event={"ID":"2750016d-97a4-4e2b-a0e8-a03ddd6d64bb","Type":"ContainerStarted","Data":"7b94ed839ce69f2e063b82db1a98aaad8ce1fc6eeb358af912177c982043ee4c"} Sep 30 07:54:16 crc kubenswrapper[4760]: I0930 07:54:16.410243 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"2750016d-97a4-4e2b-a0e8-a03ddd6d64bb","Type":"ContainerStarted","Data":"7dbb803ca9cde579d262d3a5b2da29c63b1c4764b4ad6a063aa2d8388b38b101"} Sep 30 07:54:16 crc kubenswrapper[4760]: I0930 07:54:16.869315 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5576978c7c-h8jc7"] Sep 30 07:54:16 crc kubenswrapper[4760]: I0930 07:54:16.871259 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5576978c7c-h8jc7" Sep 30 07:54:16 crc kubenswrapper[4760]: I0930 07:54:16.888979 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5576978c7c-h8jc7"] Sep 30 07:54:16 crc kubenswrapper[4760]: I0930 07:54:16.896417 4760 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-edpm-ipam" Sep 30 07:54:16 crc kubenswrapper[4760]: I0930 07:54:16.988810 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8tvkq\" (UniqueName: \"kubernetes.io/projected/776888c0-31ec-4da6-833f-eb693b220b39-kube-api-access-8tvkq\") pod \"dnsmasq-dns-5576978c7c-h8jc7\" (UID: \"776888c0-31ec-4da6-833f-eb693b220b39\") " pod="openstack/dnsmasq-dns-5576978c7c-h8jc7" Sep 30 07:54:16 crc kubenswrapper[4760]: I0930 07:54:16.988912 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/776888c0-31ec-4da6-833f-eb693b220b39-openstack-edpm-ipam\") pod \"dnsmasq-dns-5576978c7c-h8jc7\" (UID: \"776888c0-31ec-4da6-833f-eb693b220b39\") " 
pod="openstack/dnsmasq-dns-5576978c7c-h8jc7" Sep 30 07:54:16 crc kubenswrapper[4760]: I0930 07:54:16.989019 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/776888c0-31ec-4da6-833f-eb693b220b39-config\") pod \"dnsmasq-dns-5576978c7c-h8jc7\" (UID: \"776888c0-31ec-4da6-833f-eb693b220b39\") " pod="openstack/dnsmasq-dns-5576978c7c-h8jc7" Sep 30 07:54:16 crc kubenswrapper[4760]: I0930 07:54:16.989068 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/776888c0-31ec-4da6-833f-eb693b220b39-ovsdbserver-sb\") pod \"dnsmasq-dns-5576978c7c-h8jc7\" (UID: \"776888c0-31ec-4da6-833f-eb693b220b39\") " pod="openstack/dnsmasq-dns-5576978c7c-h8jc7" Sep 30 07:54:16 crc kubenswrapper[4760]: I0930 07:54:16.989107 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/776888c0-31ec-4da6-833f-eb693b220b39-dns-svc\") pod \"dnsmasq-dns-5576978c7c-h8jc7\" (UID: \"776888c0-31ec-4da6-833f-eb693b220b39\") " pod="openstack/dnsmasq-dns-5576978c7c-h8jc7" Sep 30 07:54:16 crc kubenswrapper[4760]: I0930 07:54:16.989138 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/776888c0-31ec-4da6-833f-eb693b220b39-dns-swift-storage-0\") pod \"dnsmasq-dns-5576978c7c-h8jc7\" (UID: \"776888c0-31ec-4da6-833f-eb693b220b39\") " pod="openstack/dnsmasq-dns-5576978c7c-h8jc7" Sep 30 07:54:16 crc kubenswrapper[4760]: I0930 07:54:16.989165 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/776888c0-31ec-4da6-833f-eb693b220b39-ovsdbserver-nb\") pod \"dnsmasq-dns-5576978c7c-h8jc7\" (UID: 
\"776888c0-31ec-4da6-833f-eb693b220b39\") " pod="openstack/dnsmasq-dns-5576978c7c-h8jc7" Sep 30 07:54:17 crc kubenswrapper[4760]: I0930 07:54:17.091389 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8tvkq\" (UniqueName: \"kubernetes.io/projected/776888c0-31ec-4da6-833f-eb693b220b39-kube-api-access-8tvkq\") pod \"dnsmasq-dns-5576978c7c-h8jc7\" (UID: \"776888c0-31ec-4da6-833f-eb693b220b39\") " pod="openstack/dnsmasq-dns-5576978c7c-h8jc7" Sep 30 07:54:17 crc kubenswrapper[4760]: I0930 07:54:17.091476 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/776888c0-31ec-4da6-833f-eb693b220b39-openstack-edpm-ipam\") pod \"dnsmasq-dns-5576978c7c-h8jc7\" (UID: \"776888c0-31ec-4da6-833f-eb693b220b39\") " pod="openstack/dnsmasq-dns-5576978c7c-h8jc7" Sep 30 07:54:17 crc kubenswrapper[4760]: I0930 07:54:17.091521 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/776888c0-31ec-4da6-833f-eb693b220b39-config\") pod \"dnsmasq-dns-5576978c7c-h8jc7\" (UID: \"776888c0-31ec-4da6-833f-eb693b220b39\") " pod="openstack/dnsmasq-dns-5576978c7c-h8jc7" Sep 30 07:54:17 crc kubenswrapper[4760]: I0930 07:54:17.091556 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/776888c0-31ec-4da6-833f-eb693b220b39-ovsdbserver-sb\") pod \"dnsmasq-dns-5576978c7c-h8jc7\" (UID: \"776888c0-31ec-4da6-833f-eb693b220b39\") " pod="openstack/dnsmasq-dns-5576978c7c-h8jc7" Sep 30 07:54:17 crc kubenswrapper[4760]: I0930 07:54:17.091594 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/776888c0-31ec-4da6-833f-eb693b220b39-dns-svc\") pod \"dnsmasq-dns-5576978c7c-h8jc7\" (UID: \"776888c0-31ec-4da6-833f-eb693b220b39\") " 
pod="openstack/dnsmasq-dns-5576978c7c-h8jc7" Sep 30 07:54:17 crc kubenswrapper[4760]: I0930 07:54:17.091625 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/776888c0-31ec-4da6-833f-eb693b220b39-dns-swift-storage-0\") pod \"dnsmasq-dns-5576978c7c-h8jc7\" (UID: \"776888c0-31ec-4da6-833f-eb693b220b39\") " pod="openstack/dnsmasq-dns-5576978c7c-h8jc7" Sep 30 07:54:17 crc kubenswrapper[4760]: I0930 07:54:17.091648 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/776888c0-31ec-4da6-833f-eb693b220b39-ovsdbserver-nb\") pod \"dnsmasq-dns-5576978c7c-h8jc7\" (UID: \"776888c0-31ec-4da6-833f-eb693b220b39\") " pod="openstack/dnsmasq-dns-5576978c7c-h8jc7" Sep 30 07:54:17 crc kubenswrapper[4760]: I0930 07:54:17.093214 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/776888c0-31ec-4da6-833f-eb693b220b39-ovsdbserver-nb\") pod \"dnsmasq-dns-5576978c7c-h8jc7\" (UID: \"776888c0-31ec-4da6-833f-eb693b220b39\") " pod="openstack/dnsmasq-dns-5576978c7c-h8jc7" Sep 30 07:54:17 crc kubenswrapper[4760]: I0930 07:54:17.093346 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/776888c0-31ec-4da6-833f-eb693b220b39-dns-svc\") pod \"dnsmasq-dns-5576978c7c-h8jc7\" (UID: \"776888c0-31ec-4da6-833f-eb693b220b39\") " pod="openstack/dnsmasq-dns-5576978c7c-h8jc7" Sep 30 07:54:17 crc kubenswrapper[4760]: I0930 07:54:17.093361 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/776888c0-31ec-4da6-833f-eb693b220b39-ovsdbserver-sb\") pod \"dnsmasq-dns-5576978c7c-h8jc7\" (UID: \"776888c0-31ec-4da6-833f-eb693b220b39\") " pod="openstack/dnsmasq-dns-5576978c7c-h8jc7" Sep 30 07:54:17 crc 
kubenswrapper[4760]: I0930 07:54:17.093490 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/776888c0-31ec-4da6-833f-eb693b220b39-config\") pod \"dnsmasq-dns-5576978c7c-h8jc7\" (UID: \"776888c0-31ec-4da6-833f-eb693b220b39\") " pod="openstack/dnsmasq-dns-5576978c7c-h8jc7" Sep 30 07:54:17 crc kubenswrapper[4760]: I0930 07:54:17.093512 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/776888c0-31ec-4da6-833f-eb693b220b39-openstack-edpm-ipam\") pod \"dnsmasq-dns-5576978c7c-h8jc7\" (UID: \"776888c0-31ec-4da6-833f-eb693b220b39\") " pod="openstack/dnsmasq-dns-5576978c7c-h8jc7" Sep 30 07:54:17 crc kubenswrapper[4760]: I0930 07:54:17.093587 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/776888c0-31ec-4da6-833f-eb693b220b39-dns-swift-storage-0\") pod \"dnsmasq-dns-5576978c7c-h8jc7\" (UID: \"776888c0-31ec-4da6-833f-eb693b220b39\") " pod="openstack/dnsmasq-dns-5576978c7c-h8jc7" Sep 30 07:54:17 crc kubenswrapper[4760]: I0930 07:54:17.113315 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8tvkq\" (UniqueName: \"kubernetes.io/projected/776888c0-31ec-4da6-833f-eb693b220b39-kube-api-access-8tvkq\") pod \"dnsmasq-dns-5576978c7c-h8jc7\" (UID: \"776888c0-31ec-4da6-833f-eb693b220b39\") " pod="openstack/dnsmasq-dns-5576978c7c-h8jc7" Sep 30 07:54:17 crc kubenswrapper[4760]: I0930 07:54:17.231901 4760 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5576978c7c-h8jc7" Sep 30 07:54:17 crc kubenswrapper[4760]: I0930 07:54:17.768619 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5576978c7c-h8jc7"] Sep 30 07:54:18 crc kubenswrapper[4760]: I0930 07:54:18.434072 4760 generic.go:334] "Generic (PLEG): container finished" podID="776888c0-31ec-4da6-833f-eb693b220b39" containerID="2a57d3fb2154214236af93320d15c60d6f91b76a855069ba15e0e932e483c677" exitCode=0 Sep 30 07:54:18 crc kubenswrapper[4760]: I0930 07:54:18.434126 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5576978c7c-h8jc7" event={"ID":"776888c0-31ec-4da6-833f-eb693b220b39","Type":"ContainerDied","Data":"2a57d3fb2154214236af93320d15c60d6f91b76a855069ba15e0e932e483c677"} Sep 30 07:54:18 crc kubenswrapper[4760]: I0930 07:54:18.434345 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5576978c7c-h8jc7" event={"ID":"776888c0-31ec-4da6-833f-eb693b220b39","Type":"ContainerStarted","Data":"23d681a172e00a907dfafe06f431805a1d69f51e94a0cdd054438d82e5a80cfe"} Sep 30 07:54:19 crc kubenswrapper[4760]: I0930 07:54:19.113745 4760 patch_prober.go:28] interesting pod/machine-config-daemon-f2lrk container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 07:54:19 crc kubenswrapper[4760]: I0930 07:54:19.114183 4760 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-f2lrk" podUID="7a9c8270-6964-4886-87d0-227b05b76da4" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 30 07:54:19 crc kubenswrapper[4760]: I0930 07:54:19.448977 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/dnsmasq-dns-5576978c7c-h8jc7" event={"ID":"776888c0-31ec-4da6-833f-eb693b220b39","Type":"ContainerStarted","Data":"31964a842333505571a8b8bc29538421cab6081e8ec0f597f832b5cc650d5852"} Sep 30 07:54:19 crc kubenswrapper[4760]: I0930 07:54:19.449171 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5576978c7c-h8jc7" Sep 30 07:54:19 crc kubenswrapper[4760]: I0930 07:54:19.484217 4760 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5576978c7c-h8jc7" podStartSLOduration=3.484201901 podStartE2EDuration="3.484201901s" podCreationTimestamp="2025-09-30 07:54:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 07:54:19.472756319 +0000 UTC m=+1245.115662731" watchObservedRunningTime="2025-09-30 07:54:19.484201901 +0000 UTC m=+1245.127108313" Sep 30 07:54:27 crc kubenswrapper[4760]: I0930 07:54:27.234671 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-5576978c7c-h8jc7" Sep 30 07:54:27 crc kubenswrapper[4760]: I0930 07:54:27.319658 4760 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5c7b6c5df9-fdkh2"] Sep 30 07:54:27 crc kubenswrapper[4760]: I0930 07:54:27.319923 4760 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5c7b6c5df9-fdkh2" podUID="82d1309a-4ecf-430e-84ad-69622bb9d9a6" containerName="dnsmasq-dns" containerID="cri-o://74a99f7cbd2e546d79211602bf2a603b54e89b44cf72780a50b070ef800208ae" gracePeriod=10 Sep 30 07:54:27 crc kubenswrapper[4760]: I0930 07:54:27.470350 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7c8665b49f-cp9sh"] Sep 30 07:54:27 crc kubenswrapper[4760]: I0930 07:54:27.471945 4760 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7c8665b49f-cp9sh" Sep 30 07:54:27 crc kubenswrapper[4760]: I0930 07:54:27.485955 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7c8665b49f-cp9sh"] Sep 30 07:54:27 crc kubenswrapper[4760]: I0930 07:54:27.545239 4760 generic.go:334] "Generic (PLEG): container finished" podID="82d1309a-4ecf-430e-84ad-69622bb9d9a6" containerID="74a99f7cbd2e546d79211602bf2a603b54e89b44cf72780a50b070ef800208ae" exitCode=0 Sep 30 07:54:27 crc kubenswrapper[4760]: I0930 07:54:27.545288 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c7b6c5df9-fdkh2" event={"ID":"82d1309a-4ecf-430e-84ad-69622bb9d9a6","Type":"ContainerDied","Data":"74a99f7cbd2e546d79211602bf2a603b54e89b44cf72780a50b070ef800208ae"} Sep 30 07:54:27 crc kubenswrapper[4760]: I0930 07:54:27.626498 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5e0ff1e1-cda6-4574-a353-f4a7406326e7-ovsdbserver-sb\") pod \"dnsmasq-dns-7c8665b49f-cp9sh\" (UID: \"5e0ff1e1-cda6-4574-a353-f4a7406326e7\") " pod="openstack/dnsmasq-dns-7c8665b49f-cp9sh" Sep 30 07:54:27 crc kubenswrapper[4760]: I0930 07:54:27.626582 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/5e0ff1e1-cda6-4574-a353-f4a7406326e7-openstack-edpm-ipam\") pod \"dnsmasq-dns-7c8665b49f-cp9sh\" (UID: \"5e0ff1e1-cda6-4574-a353-f4a7406326e7\") " pod="openstack/dnsmasq-dns-7c8665b49f-cp9sh" Sep 30 07:54:27 crc kubenswrapper[4760]: I0930 07:54:27.626667 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/5e0ff1e1-cda6-4574-a353-f4a7406326e7-dns-swift-storage-0\") pod \"dnsmasq-dns-7c8665b49f-cp9sh\" (UID: \"5e0ff1e1-cda6-4574-a353-f4a7406326e7\") 
" pod="openstack/dnsmasq-dns-7c8665b49f-cp9sh" Sep 30 07:54:27 crc kubenswrapper[4760]: I0930 07:54:27.626737 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sxqpd\" (UniqueName: \"kubernetes.io/projected/5e0ff1e1-cda6-4574-a353-f4a7406326e7-kube-api-access-sxqpd\") pod \"dnsmasq-dns-7c8665b49f-cp9sh\" (UID: \"5e0ff1e1-cda6-4574-a353-f4a7406326e7\") " pod="openstack/dnsmasq-dns-7c8665b49f-cp9sh" Sep 30 07:54:27 crc kubenswrapper[4760]: I0930 07:54:27.626762 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5e0ff1e1-cda6-4574-a353-f4a7406326e7-config\") pod \"dnsmasq-dns-7c8665b49f-cp9sh\" (UID: \"5e0ff1e1-cda6-4574-a353-f4a7406326e7\") " pod="openstack/dnsmasq-dns-7c8665b49f-cp9sh" Sep 30 07:54:27 crc kubenswrapper[4760]: I0930 07:54:27.626795 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5e0ff1e1-cda6-4574-a353-f4a7406326e7-dns-svc\") pod \"dnsmasq-dns-7c8665b49f-cp9sh\" (UID: \"5e0ff1e1-cda6-4574-a353-f4a7406326e7\") " pod="openstack/dnsmasq-dns-7c8665b49f-cp9sh" Sep 30 07:54:27 crc kubenswrapper[4760]: I0930 07:54:27.626852 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5e0ff1e1-cda6-4574-a353-f4a7406326e7-ovsdbserver-nb\") pod \"dnsmasq-dns-7c8665b49f-cp9sh\" (UID: \"5e0ff1e1-cda6-4574-a353-f4a7406326e7\") " pod="openstack/dnsmasq-dns-7c8665b49f-cp9sh" Sep 30 07:54:27 crc kubenswrapper[4760]: I0930 07:54:27.728658 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5e0ff1e1-cda6-4574-a353-f4a7406326e7-ovsdbserver-nb\") pod \"dnsmasq-dns-7c8665b49f-cp9sh\" (UID: 
\"5e0ff1e1-cda6-4574-a353-f4a7406326e7\") " pod="openstack/dnsmasq-dns-7c8665b49f-cp9sh" Sep 30 07:54:27 crc kubenswrapper[4760]: I0930 07:54:27.728780 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5e0ff1e1-cda6-4574-a353-f4a7406326e7-ovsdbserver-sb\") pod \"dnsmasq-dns-7c8665b49f-cp9sh\" (UID: \"5e0ff1e1-cda6-4574-a353-f4a7406326e7\") " pod="openstack/dnsmasq-dns-7c8665b49f-cp9sh" Sep 30 07:54:27 crc kubenswrapper[4760]: I0930 07:54:27.729592 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5e0ff1e1-cda6-4574-a353-f4a7406326e7-ovsdbserver-nb\") pod \"dnsmasq-dns-7c8665b49f-cp9sh\" (UID: \"5e0ff1e1-cda6-4574-a353-f4a7406326e7\") " pod="openstack/dnsmasq-dns-7c8665b49f-cp9sh" Sep 30 07:54:27 crc kubenswrapper[4760]: I0930 07:54:27.729726 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5e0ff1e1-cda6-4574-a353-f4a7406326e7-ovsdbserver-sb\") pod \"dnsmasq-dns-7c8665b49f-cp9sh\" (UID: \"5e0ff1e1-cda6-4574-a353-f4a7406326e7\") " pod="openstack/dnsmasq-dns-7c8665b49f-cp9sh" Sep 30 07:54:27 crc kubenswrapper[4760]: I0930 07:54:27.729842 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/5e0ff1e1-cda6-4574-a353-f4a7406326e7-openstack-edpm-ipam\") pod \"dnsmasq-dns-7c8665b49f-cp9sh\" (UID: \"5e0ff1e1-cda6-4574-a353-f4a7406326e7\") " pod="openstack/dnsmasq-dns-7c8665b49f-cp9sh" Sep 30 07:54:27 crc kubenswrapper[4760]: I0930 07:54:27.730631 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/5e0ff1e1-cda6-4574-a353-f4a7406326e7-openstack-edpm-ipam\") pod \"dnsmasq-dns-7c8665b49f-cp9sh\" (UID: \"5e0ff1e1-cda6-4574-a353-f4a7406326e7\") " 
pod="openstack/dnsmasq-dns-7c8665b49f-cp9sh" Sep 30 07:54:27 crc kubenswrapper[4760]: I0930 07:54:27.730818 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/5e0ff1e1-cda6-4574-a353-f4a7406326e7-dns-swift-storage-0\") pod \"dnsmasq-dns-7c8665b49f-cp9sh\" (UID: \"5e0ff1e1-cda6-4574-a353-f4a7406326e7\") " pod="openstack/dnsmasq-dns-7c8665b49f-cp9sh" Sep 30 07:54:27 crc kubenswrapper[4760]: I0930 07:54:27.731525 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/5e0ff1e1-cda6-4574-a353-f4a7406326e7-dns-swift-storage-0\") pod \"dnsmasq-dns-7c8665b49f-cp9sh\" (UID: \"5e0ff1e1-cda6-4574-a353-f4a7406326e7\") " pod="openstack/dnsmasq-dns-7c8665b49f-cp9sh" Sep 30 07:54:27 crc kubenswrapper[4760]: I0930 07:54:27.731692 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sxqpd\" (UniqueName: \"kubernetes.io/projected/5e0ff1e1-cda6-4574-a353-f4a7406326e7-kube-api-access-sxqpd\") pod \"dnsmasq-dns-7c8665b49f-cp9sh\" (UID: \"5e0ff1e1-cda6-4574-a353-f4a7406326e7\") " pod="openstack/dnsmasq-dns-7c8665b49f-cp9sh" Sep 30 07:54:27 crc kubenswrapper[4760]: I0930 07:54:27.731726 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5e0ff1e1-cda6-4574-a353-f4a7406326e7-config\") pod \"dnsmasq-dns-7c8665b49f-cp9sh\" (UID: \"5e0ff1e1-cda6-4574-a353-f4a7406326e7\") " pod="openstack/dnsmasq-dns-7c8665b49f-cp9sh" Sep 30 07:54:27 crc kubenswrapper[4760]: I0930 07:54:27.732679 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5e0ff1e1-cda6-4574-a353-f4a7406326e7-config\") pod \"dnsmasq-dns-7c8665b49f-cp9sh\" (UID: \"5e0ff1e1-cda6-4574-a353-f4a7406326e7\") " pod="openstack/dnsmasq-dns-7c8665b49f-cp9sh" Sep 30 07:54:27 crc 
kubenswrapper[4760]: I0930 07:54:27.732814 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5e0ff1e1-cda6-4574-a353-f4a7406326e7-dns-svc\") pod \"dnsmasq-dns-7c8665b49f-cp9sh\" (UID: \"5e0ff1e1-cda6-4574-a353-f4a7406326e7\") " pod="openstack/dnsmasq-dns-7c8665b49f-cp9sh" Sep 30 07:54:27 crc kubenswrapper[4760]: I0930 07:54:27.733513 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5e0ff1e1-cda6-4574-a353-f4a7406326e7-dns-svc\") pod \"dnsmasq-dns-7c8665b49f-cp9sh\" (UID: \"5e0ff1e1-cda6-4574-a353-f4a7406326e7\") " pod="openstack/dnsmasq-dns-7c8665b49f-cp9sh" Sep 30 07:54:27 crc kubenswrapper[4760]: I0930 07:54:27.759008 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sxqpd\" (UniqueName: \"kubernetes.io/projected/5e0ff1e1-cda6-4574-a353-f4a7406326e7-kube-api-access-sxqpd\") pod \"dnsmasq-dns-7c8665b49f-cp9sh\" (UID: \"5e0ff1e1-cda6-4574-a353-f4a7406326e7\") " pod="openstack/dnsmasq-dns-7c8665b49f-cp9sh" Sep 30 07:54:27 crc kubenswrapper[4760]: I0930 07:54:27.888615 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7c8665b49f-cp9sh" Sep 30 07:54:28 crc kubenswrapper[4760]: I0930 07:54:28.366874 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7c8665b49f-cp9sh"] Sep 30 07:54:28 crc kubenswrapper[4760]: I0930 07:54:28.418702 4760 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5c7b6c5df9-fdkh2" Sep 30 07:54:28 crc kubenswrapper[4760]: I0930 07:54:28.559691 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/82d1309a-4ecf-430e-84ad-69622bb9d9a6-ovsdbserver-nb\") pod \"82d1309a-4ecf-430e-84ad-69622bb9d9a6\" (UID: \"82d1309a-4ecf-430e-84ad-69622bb9d9a6\") " Sep 30 07:54:28 crc kubenswrapper[4760]: I0930 07:54:28.559729 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/82d1309a-4ecf-430e-84ad-69622bb9d9a6-dns-svc\") pod \"82d1309a-4ecf-430e-84ad-69622bb9d9a6\" (UID: \"82d1309a-4ecf-430e-84ad-69622bb9d9a6\") " Sep 30 07:54:28 crc kubenswrapper[4760]: I0930 07:54:28.559761 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/82d1309a-4ecf-430e-84ad-69622bb9d9a6-dns-swift-storage-0\") pod \"82d1309a-4ecf-430e-84ad-69622bb9d9a6\" (UID: \"82d1309a-4ecf-430e-84ad-69622bb9d9a6\") " Sep 30 07:54:28 crc kubenswrapper[4760]: I0930 07:54:28.559793 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/82d1309a-4ecf-430e-84ad-69622bb9d9a6-ovsdbserver-sb\") pod \"82d1309a-4ecf-430e-84ad-69622bb9d9a6\" (UID: \"82d1309a-4ecf-430e-84ad-69622bb9d9a6\") " Sep 30 07:54:28 crc kubenswrapper[4760]: I0930 07:54:28.559887 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/82d1309a-4ecf-430e-84ad-69622bb9d9a6-config\") pod \"82d1309a-4ecf-430e-84ad-69622bb9d9a6\" (UID: \"82d1309a-4ecf-430e-84ad-69622bb9d9a6\") " Sep 30 07:54:28 crc kubenswrapper[4760]: I0930 07:54:28.559937 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ptnqk\" 
(UniqueName: \"kubernetes.io/projected/82d1309a-4ecf-430e-84ad-69622bb9d9a6-kube-api-access-ptnqk\") pod \"82d1309a-4ecf-430e-84ad-69622bb9d9a6\" (UID: \"82d1309a-4ecf-430e-84ad-69622bb9d9a6\") " Sep 30 07:54:28 crc kubenswrapper[4760]: I0930 07:54:28.583524 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7c8665b49f-cp9sh" event={"ID":"5e0ff1e1-cda6-4574-a353-f4a7406326e7","Type":"ContainerStarted","Data":"b39aacde73870491f33363ee9bfb45585b1658e503612a9a811caf69300db0f0"} Sep 30 07:54:28 crc kubenswrapper[4760]: I0930 07:54:28.596663 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/82d1309a-4ecf-430e-84ad-69622bb9d9a6-kube-api-access-ptnqk" (OuterVolumeSpecName: "kube-api-access-ptnqk") pod "82d1309a-4ecf-430e-84ad-69622bb9d9a6" (UID: "82d1309a-4ecf-430e-84ad-69622bb9d9a6"). InnerVolumeSpecName "kube-api-access-ptnqk". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 07:54:28 crc kubenswrapper[4760]: I0930 07:54:28.608719 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c7b6c5df9-fdkh2" event={"ID":"82d1309a-4ecf-430e-84ad-69622bb9d9a6","Type":"ContainerDied","Data":"d04cb692e2f61b5eeaddeb2cd80a94a32ddb5e3d8c1f16d488d78d402b8315e0"} Sep 30 07:54:28 crc kubenswrapper[4760]: I0930 07:54:28.608783 4760 scope.go:117] "RemoveContainer" containerID="74a99f7cbd2e546d79211602bf2a603b54e89b44cf72780a50b070ef800208ae" Sep 30 07:54:28 crc kubenswrapper[4760]: I0930 07:54:28.608964 4760 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5c7b6c5df9-fdkh2" Sep 30 07:54:28 crc kubenswrapper[4760]: I0930 07:54:28.637946 4760 scope.go:117] "RemoveContainer" containerID="bdc5eecdf5d13a7cc6cdb9204f36b34e053fbd9e48a24fd4491f5e1ebe5b269d" Sep 30 07:54:28 crc kubenswrapper[4760]: I0930 07:54:28.663606 4760 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ptnqk\" (UniqueName: \"kubernetes.io/projected/82d1309a-4ecf-430e-84ad-69622bb9d9a6-kube-api-access-ptnqk\") on node \"crc\" DevicePath \"\"" Sep 30 07:54:28 crc kubenswrapper[4760]: I0930 07:54:28.735113 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/82d1309a-4ecf-430e-84ad-69622bb9d9a6-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "82d1309a-4ecf-430e-84ad-69622bb9d9a6" (UID: "82d1309a-4ecf-430e-84ad-69622bb9d9a6"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 07:54:28 crc kubenswrapper[4760]: I0930 07:54:28.737391 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/82d1309a-4ecf-430e-84ad-69622bb9d9a6-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "82d1309a-4ecf-430e-84ad-69622bb9d9a6" (UID: "82d1309a-4ecf-430e-84ad-69622bb9d9a6"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 07:54:28 crc kubenswrapper[4760]: I0930 07:54:28.737704 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/82d1309a-4ecf-430e-84ad-69622bb9d9a6-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "82d1309a-4ecf-430e-84ad-69622bb9d9a6" (UID: "82d1309a-4ecf-430e-84ad-69622bb9d9a6"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 07:54:28 crc kubenswrapper[4760]: I0930 07:54:28.743798 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/82d1309a-4ecf-430e-84ad-69622bb9d9a6-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "82d1309a-4ecf-430e-84ad-69622bb9d9a6" (UID: "82d1309a-4ecf-430e-84ad-69622bb9d9a6"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 07:54:28 crc kubenswrapper[4760]: I0930 07:54:28.746145 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/82d1309a-4ecf-430e-84ad-69622bb9d9a6-config" (OuterVolumeSpecName: "config") pod "82d1309a-4ecf-430e-84ad-69622bb9d9a6" (UID: "82d1309a-4ecf-430e-84ad-69622bb9d9a6"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 07:54:28 crc kubenswrapper[4760]: I0930 07:54:28.765835 4760 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/82d1309a-4ecf-430e-84ad-69622bb9d9a6-config\") on node \"crc\" DevicePath \"\"" Sep 30 07:54:28 crc kubenswrapper[4760]: I0930 07:54:28.765883 4760 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/82d1309a-4ecf-430e-84ad-69622bb9d9a6-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Sep 30 07:54:28 crc kubenswrapper[4760]: I0930 07:54:28.765894 4760 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/82d1309a-4ecf-430e-84ad-69622bb9d9a6-dns-svc\") on node \"crc\" DevicePath \"\"" Sep 30 07:54:28 crc kubenswrapper[4760]: I0930 07:54:28.765904 4760 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/82d1309a-4ecf-430e-84ad-69622bb9d9a6-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Sep 30 07:54:28 crc 
kubenswrapper[4760]: I0930 07:54:28.765912 4760 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/82d1309a-4ecf-430e-84ad-69622bb9d9a6-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Sep 30 07:54:28 crc kubenswrapper[4760]: I0930 07:54:28.967002 4760 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5c7b6c5df9-fdkh2"] Sep 30 07:54:28 crc kubenswrapper[4760]: I0930 07:54:28.980214 4760 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5c7b6c5df9-fdkh2"] Sep 30 07:54:29 crc kubenswrapper[4760]: I0930 07:54:29.088453 4760 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="82d1309a-4ecf-430e-84ad-69622bb9d9a6" path="/var/lib/kubelet/pods/82d1309a-4ecf-430e-84ad-69622bb9d9a6/volumes" Sep 30 07:54:29 crc kubenswrapper[4760]: I0930 07:54:29.621426 4760 generic.go:334] "Generic (PLEG): container finished" podID="5e0ff1e1-cda6-4574-a353-f4a7406326e7" containerID="f5ee3a0a30eb7eefd0ccc001c0d6d0ea36656b31f0be53d3f2965ea13a43b20e" exitCode=0 Sep 30 07:54:29 crc kubenswrapper[4760]: I0930 07:54:29.621553 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7c8665b49f-cp9sh" event={"ID":"5e0ff1e1-cda6-4574-a353-f4a7406326e7","Type":"ContainerDied","Data":"f5ee3a0a30eb7eefd0ccc001c0d6d0ea36656b31f0be53d3f2965ea13a43b20e"} Sep 30 07:54:30 crc kubenswrapper[4760]: I0930 07:54:30.636235 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7c8665b49f-cp9sh" event={"ID":"5e0ff1e1-cda6-4574-a353-f4a7406326e7","Type":"ContainerStarted","Data":"d27fafd83d43758a989f9820d05399c68089770e35f61b521fedbe78af1302d1"} Sep 30 07:54:30 crc kubenswrapper[4760]: I0930 07:54:30.637343 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-7c8665b49f-cp9sh" Sep 30 07:54:30 crc kubenswrapper[4760]: I0930 07:54:30.663091 4760 pod_startup_latency_tracker.go:104] "Observed 
pod startup duration" pod="openstack/dnsmasq-dns-7c8665b49f-cp9sh" podStartSLOduration=3.663072788 podStartE2EDuration="3.663072788s" podCreationTimestamp="2025-09-30 07:54:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 07:54:30.652559929 +0000 UTC m=+1256.295466381" watchObservedRunningTime="2025-09-30 07:54:30.663072788 +0000 UTC m=+1256.305979200" Sep 30 07:54:37 crc kubenswrapper[4760]: I0930 07:54:37.890463 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-7c8665b49f-cp9sh" Sep 30 07:54:37 crc kubenswrapper[4760]: I0930 07:54:37.967357 4760 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5576978c7c-h8jc7"] Sep 30 07:54:37 crc kubenswrapper[4760]: I0930 07:54:37.967620 4760 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5576978c7c-h8jc7" podUID="776888c0-31ec-4da6-833f-eb693b220b39" containerName="dnsmasq-dns" containerID="cri-o://31964a842333505571a8b8bc29538421cab6081e8ec0f597f832b5cc650d5852" gracePeriod=10 Sep 30 07:54:38 crc kubenswrapper[4760]: I0930 07:54:38.729988 4760 generic.go:334] "Generic (PLEG): container finished" podID="776888c0-31ec-4da6-833f-eb693b220b39" containerID="31964a842333505571a8b8bc29538421cab6081e8ec0f597f832b5cc650d5852" exitCode=0 Sep 30 07:54:38 crc kubenswrapper[4760]: I0930 07:54:38.730039 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5576978c7c-h8jc7" event={"ID":"776888c0-31ec-4da6-833f-eb693b220b39","Type":"ContainerDied","Data":"31964a842333505571a8b8bc29538421cab6081e8ec0f597f832b5cc650d5852"} Sep 30 07:54:39 crc kubenswrapper[4760]: I0930 07:54:39.019238 4760 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5576978c7c-h8jc7" Sep 30 07:54:39 crc kubenswrapper[4760]: I0930 07:54:39.193108 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/776888c0-31ec-4da6-833f-eb693b220b39-dns-swift-storage-0\") pod \"776888c0-31ec-4da6-833f-eb693b220b39\" (UID: \"776888c0-31ec-4da6-833f-eb693b220b39\") " Sep 30 07:54:39 crc kubenswrapper[4760]: I0930 07:54:39.193223 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/776888c0-31ec-4da6-833f-eb693b220b39-ovsdbserver-nb\") pod \"776888c0-31ec-4da6-833f-eb693b220b39\" (UID: \"776888c0-31ec-4da6-833f-eb693b220b39\") " Sep 30 07:54:39 crc kubenswrapper[4760]: I0930 07:54:39.193387 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/776888c0-31ec-4da6-833f-eb693b220b39-config\") pod \"776888c0-31ec-4da6-833f-eb693b220b39\" (UID: \"776888c0-31ec-4da6-833f-eb693b220b39\") " Sep 30 07:54:39 crc kubenswrapper[4760]: I0930 07:54:39.193443 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/776888c0-31ec-4da6-833f-eb693b220b39-openstack-edpm-ipam\") pod \"776888c0-31ec-4da6-833f-eb693b220b39\" (UID: \"776888c0-31ec-4da6-833f-eb693b220b39\") " Sep 30 07:54:39 crc kubenswrapper[4760]: I0930 07:54:39.193551 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/776888c0-31ec-4da6-833f-eb693b220b39-dns-svc\") pod \"776888c0-31ec-4da6-833f-eb693b220b39\" (UID: \"776888c0-31ec-4da6-833f-eb693b220b39\") " Sep 30 07:54:39 crc kubenswrapper[4760]: I0930 07:54:39.193648 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"kube-api-access-8tvkq\" (UniqueName: \"kubernetes.io/projected/776888c0-31ec-4da6-833f-eb693b220b39-kube-api-access-8tvkq\") pod \"776888c0-31ec-4da6-833f-eb693b220b39\" (UID: \"776888c0-31ec-4da6-833f-eb693b220b39\") " Sep 30 07:54:39 crc kubenswrapper[4760]: I0930 07:54:39.193749 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/776888c0-31ec-4da6-833f-eb693b220b39-ovsdbserver-sb\") pod \"776888c0-31ec-4da6-833f-eb693b220b39\" (UID: \"776888c0-31ec-4da6-833f-eb693b220b39\") " Sep 30 07:54:39 crc kubenswrapper[4760]: I0930 07:54:39.200731 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/776888c0-31ec-4da6-833f-eb693b220b39-kube-api-access-8tvkq" (OuterVolumeSpecName: "kube-api-access-8tvkq") pod "776888c0-31ec-4da6-833f-eb693b220b39" (UID: "776888c0-31ec-4da6-833f-eb693b220b39"). InnerVolumeSpecName "kube-api-access-8tvkq". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 07:54:39 crc kubenswrapper[4760]: I0930 07:54:39.253545 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/776888c0-31ec-4da6-833f-eb693b220b39-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "776888c0-31ec-4da6-833f-eb693b220b39" (UID: "776888c0-31ec-4da6-833f-eb693b220b39"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 07:54:39 crc kubenswrapper[4760]: I0930 07:54:39.253778 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/776888c0-31ec-4da6-833f-eb693b220b39-openstack-edpm-ipam" (OuterVolumeSpecName: "openstack-edpm-ipam") pod "776888c0-31ec-4da6-833f-eb693b220b39" (UID: "776888c0-31ec-4da6-833f-eb693b220b39"). InnerVolumeSpecName "openstack-edpm-ipam". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 07:54:39 crc kubenswrapper[4760]: I0930 07:54:39.255336 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/776888c0-31ec-4da6-833f-eb693b220b39-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "776888c0-31ec-4da6-833f-eb693b220b39" (UID: "776888c0-31ec-4da6-833f-eb693b220b39"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 07:54:39 crc kubenswrapper[4760]: I0930 07:54:39.264474 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/776888c0-31ec-4da6-833f-eb693b220b39-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "776888c0-31ec-4da6-833f-eb693b220b39" (UID: "776888c0-31ec-4da6-833f-eb693b220b39"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 07:54:39 crc kubenswrapper[4760]: I0930 07:54:39.270122 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/776888c0-31ec-4da6-833f-eb693b220b39-config" (OuterVolumeSpecName: "config") pod "776888c0-31ec-4da6-833f-eb693b220b39" (UID: "776888c0-31ec-4da6-833f-eb693b220b39"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 07:54:39 crc kubenswrapper[4760]: I0930 07:54:39.296554 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/776888c0-31ec-4da6-833f-eb693b220b39-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "776888c0-31ec-4da6-833f-eb693b220b39" (UID: "776888c0-31ec-4da6-833f-eb693b220b39"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 07:54:39 crc kubenswrapper[4760]: I0930 07:54:39.296950 4760 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/776888c0-31ec-4da6-833f-eb693b220b39-dns-svc\") on node \"crc\" DevicePath \"\"" Sep 30 07:54:39 crc kubenswrapper[4760]: I0930 07:54:39.296976 4760 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8tvkq\" (UniqueName: \"kubernetes.io/projected/776888c0-31ec-4da6-833f-eb693b220b39-kube-api-access-8tvkq\") on node \"crc\" DevicePath \"\"" Sep 30 07:54:39 crc kubenswrapper[4760]: I0930 07:54:39.296990 4760 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/776888c0-31ec-4da6-833f-eb693b220b39-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Sep 30 07:54:39 crc kubenswrapper[4760]: I0930 07:54:39.297004 4760 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/776888c0-31ec-4da6-833f-eb693b220b39-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Sep 30 07:54:39 crc kubenswrapper[4760]: I0930 07:54:39.297014 4760 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/776888c0-31ec-4da6-833f-eb693b220b39-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Sep 30 07:54:39 crc kubenswrapper[4760]: I0930 07:54:39.297025 4760 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/776888c0-31ec-4da6-833f-eb693b220b39-config\") on node \"crc\" DevicePath \"\"" Sep 30 07:54:39 crc kubenswrapper[4760]: I0930 07:54:39.297036 4760 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/776888c0-31ec-4da6-833f-eb693b220b39-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Sep 30 07:54:39 crc kubenswrapper[4760]: I0930 07:54:39.756131 
4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5576978c7c-h8jc7" event={"ID":"776888c0-31ec-4da6-833f-eb693b220b39","Type":"ContainerDied","Data":"23d681a172e00a907dfafe06f431805a1d69f51e94a0cdd054438d82e5a80cfe"} Sep 30 07:54:39 crc kubenswrapper[4760]: I0930 07:54:39.756385 4760 scope.go:117] "RemoveContainer" containerID="31964a842333505571a8b8bc29538421cab6081e8ec0f597f832b5cc650d5852" Sep 30 07:54:39 crc kubenswrapper[4760]: I0930 07:54:39.756554 4760 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5576978c7c-h8jc7" Sep 30 07:54:39 crc kubenswrapper[4760]: I0930 07:54:39.792272 4760 scope.go:117] "RemoveContainer" containerID="2a57d3fb2154214236af93320d15c60d6f91b76a855069ba15e0e932e483c677" Sep 30 07:54:39 crc kubenswrapper[4760]: I0930 07:54:39.798415 4760 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5576978c7c-h8jc7"] Sep 30 07:54:39 crc kubenswrapper[4760]: I0930 07:54:39.807223 4760 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5576978c7c-h8jc7"] Sep 30 07:54:41 crc kubenswrapper[4760]: I0930 07:54:41.088731 4760 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="776888c0-31ec-4da6-833f-eb693b220b39" path="/var/lib/kubelet/pods/776888c0-31ec-4da6-833f-eb693b220b39/volumes" Sep 30 07:54:45 crc kubenswrapper[4760]: I0930 07:54:45.840352 4760 generic.go:334] "Generic (PLEG): container finished" podID="7e7f8143-b3b0-473b-b058-1c2fd9eaa5ac" containerID="19e050112983eb55da6e785daba7de7c6cf9a59af4cd4c49cc450752e3b2b2f8" exitCode=0 Sep 30 07:54:45 crc kubenswrapper[4760]: I0930 07:54:45.840416 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"7e7f8143-b3b0-473b-b058-1c2fd9eaa5ac","Type":"ContainerDied","Data":"19e050112983eb55da6e785daba7de7c6cf9a59af4cd4c49cc450752e3b2b2f8"} Sep 30 07:54:46 crc kubenswrapper[4760]: I0930 07:54:46.862547 4760 
generic.go:334] "Generic (PLEG): container finished" podID="2750016d-97a4-4e2b-a0e8-a03ddd6d64bb" containerID="7b94ed839ce69f2e063b82db1a98aaad8ce1fc6eeb358af912177c982043ee4c" exitCode=0 Sep 30 07:54:46 crc kubenswrapper[4760]: I0930 07:54:46.862644 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"2750016d-97a4-4e2b-a0e8-a03ddd6d64bb","Type":"ContainerDied","Data":"7b94ed839ce69f2e063b82db1a98aaad8ce1fc6eeb358af912177c982043ee4c"} Sep 30 07:54:46 crc kubenswrapper[4760]: I0930 07:54:46.869935 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"7e7f8143-b3b0-473b-b058-1c2fd9eaa5ac","Type":"ContainerStarted","Data":"4706d7f9598e0fdb26f8fb747ef8926e3fd1e1f09fe1f3f9bff3042fc8f59061"} Sep 30 07:54:46 crc kubenswrapper[4760]: I0930 07:54:46.940927 4760 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=33.940903154 podStartE2EDuration="33.940903154s" podCreationTimestamp="2025-09-30 07:54:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 07:54:46.930511718 +0000 UTC m=+1272.573418130" watchObservedRunningTime="2025-09-30 07:54:46.940903154 +0000 UTC m=+1272.583809566" Sep 30 07:54:47 crc kubenswrapper[4760]: I0930 07:54:47.883583 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"2750016d-97a4-4e2b-a0e8-a03ddd6d64bb","Type":"ContainerStarted","Data":"9abc2ee32f8d708a52b99946cc4f989c2bdc4ea6dc0bd84c0d52abf14ee609bd"} Sep 30 07:54:47 crc kubenswrapper[4760]: I0930 07:54:47.885660 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Sep 30 07:54:47 crc kubenswrapper[4760]: I0930 07:54:47.919893 4760 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=33.919873734 podStartE2EDuration="33.919873734s" podCreationTimestamp="2025-09-30 07:54:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 07:54:47.913368748 +0000 UTC m=+1273.556275190" watchObservedRunningTime="2025-09-30 07:54:47.919873734 +0000 UTC m=+1273.562780146" Sep 30 07:54:49 crc kubenswrapper[4760]: I0930 07:54:49.112822 4760 patch_prober.go:28] interesting pod/machine-config-daemon-f2lrk container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 07:54:49 crc kubenswrapper[4760]: I0930 07:54:49.113214 4760 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-f2lrk" podUID="7a9c8270-6964-4886-87d0-227b05b76da4" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 30 07:54:53 crc kubenswrapper[4760]: I0930 07:54:53.853915 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0" Sep 30 07:54:56 crc kubenswrapper[4760]: I0930 07:54:56.491922 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-hl6sj"] Sep 30 07:54:56 crc kubenswrapper[4760]: E0930 07:54:56.492904 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="776888c0-31ec-4da6-833f-eb693b220b39" containerName="dnsmasq-dns" Sep 30 07:54:56 crc kubenswrapper[4760]: I0930 07:54:56.492920 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="776888c0-31ec-4da6-833f-eb693b220b39" containerName="dnsmasq-dns" Sep 30 07:54:56 crc kubenswrapper[4760]: E0930 07:54:56.492938 4760 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="82d1309a-4ecf-430e-84ad-69622bb9d9a6" containerName="dnsmasq-dns" Sep 30 07:54:56 crc kubenswrapper[4760]: I0930 07:54:56.492947 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="82d1309a-4ecf-430e-84ad-69622bb9d9a6" containerName="dnsmasq-dns" Sep 30 07:54:56 crc kubenswrapper[4760]: E0930 07:54:56.492990 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="82d1309a-4ecf-430e-84ad-69622bb9d9a6" containerName="init" Sep 30 07:54:56 crc kubenswrapper[4760]: I0930 07:54:56.493000 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="82d1309a-4ecf-430e-84ad-69622bb9d9a6" containerName="init" Sep 30 07:54:56 crc kubenswrapper[4760]: E0930 07:54:56.493013 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="776888c0-31ec-4da6-833f-eb693b220b39" containerName="init" Sep 30 07:54:56 crc kubenswrapper[4760]: I0930 07:54:56.493021 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="776888c0-31ec-4da6-833f-eb693b220b39" containerName="init" Sep 30 07:54:56 crc kubenswrapper[4760]: I0930 07:54:56.493263 4760 memory_manager.go:354] "RemoveStaleState removing state" podUID="82d1309a-4ecf-430e-84ad-69622bb9d9a6" containerName="dnsmasq-dns" Sep 30 07:54:56 crc kubenswrapper[4760]: I0930 07:54:56.493277 4760 memory_manager.go:354] "RemoveStaleState removing state" podUID="776888c0-31ec-4da6-833f-eb693b220b39" containerName="dnsmasq-dns" Sep 30 07:54:56 crc kubenswrapper[4760]: I0930 07:54:56.494395 4760 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-hl6sj" Sep 30 07:54:56 crc kubenswrapper[4760]: I0930 07:54:56.496011 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Sep 30 07:54:56 crc kubenswrapper[4760]: I0930 07:54:56.497835 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-8gxrf" Sep 30 07:54:56 crc kubenswrapper[4760]: I0930 07:54:56.497836 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Sep 30 07:54:56 crc kubenswrapper[4760]: I0930 07:54:56.498291 4760 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Sep 30 07:54:56 crc kubenswrapper[4760]: I0930 07:54:56.517843 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-hl6sj"] Sep 30 07:54:56 crc kubenswrapper[4760]: I0930 07:54:56.649736 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qnx4d\" (UniqueName: \"kubernetes.io/projected/65500975-80f6-4dae-a528-33950d370831-kube-api-access-qnx4d\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-hl6sj\" (UID: \"65500975-80f6-4dae-a528-33950d370831\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-hl6sj" Sep 30 07:54:56 crc kubenswrapper[4760]: I0930 07:54:56.649909 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/65500975-80f6-4dae-a528-33950d370831-ssh-key\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-hl6sj\" (UID: \"65500975-80f6-4dae-a528-33950d370831\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-hl6sj" Sep 30 07:54:56 crc kubenswrapper[4760]: I0930 07:54:56.650146 4760 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/65500975-80f6-4dae-a528-33950d370831-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-hl6sj\" (UID: \"65500975-80f6-4dae-a528-33950d370831\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-hl6sj" Sep 30 07:54:56 crc kubenswrapper[4760]: I0930 07:54:56.650504 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/65500975-80f6-4dae-a528-33950d370831-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-hl6sj\" (UID: \"65500975-80f6-4dae-a528-33950d370831\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-hl6sj" Sep 30 07:54:56 crc kubenswrapper[4760]: I0930 07:54:56.752607 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/65500975-80f6-4dae-a528-33950d370831-ssh-key\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-hl6sj\" (UID: \"65500975-80f6-4dae-a528-33950d370831\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-hl6sj" Sep 30 07:54:56 crc kubenswrapper[4760]: I0930 07:54:56.752690 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/65500975-80f6-4dae-a528-33950d370831-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-hl6sj\" (UID: \"65500975-80f6-4dae-a528-33950d370831\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-hl6sj" Sep 30 07:54:56 crc kubenswrapper[4760]: I0930 07:54:56.752772 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/65500975-80f6-4dae-a528-33950d370831-inventory\") pod 
\"repo-setup-edpm-deployment-openstack-edpm-ipam-hl6sj\" (UID: \"65500975-80f6-4dae-a528-33950d370831\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-hl6sj" Sep 30 07:54:56 crc kubenswrapper[4760]: I0930 07:54:56.752811 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qnx4d\" (UniqueName: \"kubernetes.io/projected/65500975-80f6-4dae-a528-33950d370831-kube-api-access-qnx4d\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-hl6sj\" (UID: \"65500975-80f6-4dae-a528-33950d370831\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-hl6sj" Sep 30 07:54:56 crc kubenswrapper[4760]: I0930 07:54:56.762260 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/65500975-80f6-4dae-a528-33950d370831-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-hl6sj\" (UID: \"65500975-80f6-4dae-a528-33950d370831\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-hl6sj" Sep 30 07:54:56 crc kubenswrapper[4760]: I0930 07:54:56.762547 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/65500975-80f6-4dae-a528-33950d370831-ssh-key\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-hl6sj\" (UID: \"65500975-80f6-4dae-a528-33950d370831\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-hl6sj" Sep 30 07:54:56 crc kubenswrapper[4760]: I0930 07:54:56.762801 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/65500975-80f6-4dae-a528-33950d370831-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-hl6sj\" (UID: \"65500975-80f6-4dae-a528-33950d370831\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-hl6sj" Sep 30 07:54:56 crc kubenswrapper[4760]: I0930 07:54:56.767890 4760 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qnx4d\" (UniqueName: \"kubernetes.io/projected/65500975-80f6-4dae-a528-33950d370831-kube-api-access-qnx4d\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-hl6sj\" (UID: \"65500975-80f6-4dae-a528-33950d370831\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-hl6sj" Sep 30 07:54:56 crc kubenswrapper[4760]: I0930 07:54:56.827060 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-hl6sj" Sep 30 07:54:57 crc kubenswrapper[4760]: I0930 07:54:57.432771 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-hl6sj"] Sep 30 07:54:57 crc kubenswrapper[4760]: I0930 07:54:57.996928 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-hl6sj" event={"ID":"65500975-80f6-4dae-a528-33950d370831","Type":"ContainerStarted","Data":"7fc5cfeb70e37799a3e963736bfc0fb4f8a2ab8f3502d61e61b5c976ab1e9ea6"} Sep 30 07:55:03 crc kubenswrapper[4760]: I0930 07:55:03.857118 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0" Sep 30 07:55:04 crc kubenswrapper[4760]: I0930 07:55:04.886538 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Sep 30 07:55:07 crc kubenswrapper[4760]: I0930 07:55:07.122438 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-hl6sj" event={"ID":"65500975-80f6-4dae-a528-33950d370831","Type":"ContainerStarted","Data":"bbdfcbef04fa812e245e4342346c021102e50562d7c5c5ce73e3014b3fcbb6be"} Sep 30 07:55:07 crc kubenswrapper[4760]: I0930 07:55:07.147607 4760 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-hl6sj" podStartSLOduration=2.066301583 podStartE2EDuration="11.14758575s" podCreationTimestamp="2025-09-30 07:54:56 +0000 UTC" firstStartedPulling="2025-09-30 07:54:57.443369903 +0000 UTC m=+1283.086276315" lastFinishedPulling="2025-09-30 07:55:06.52465407 +0000 UTC m=+1292.167560482" observedRunningTime="2025-09-30 07:55:07.138598901 +0000 UTC m=+1292.781505353" watchObservedRunningTime="2025-09-30 07:55:07.14758575 +0000 UTC m=+1292.790492162" Sep 30 07:55:18 crc kubenswrapper[4760]: I0930 07:55:18.257349 4760 generic.go:334] "Generic (PLEG): container finished" podID="65500975-80f6-4dae-a528-33950d370831" containerID="bbdfcbef04fa812e245e4342346c021102e50562d7c5c5ce73e3014b3fcbb6be" exitCode=0 Sep 30 07:55:18 crc kubenswrapper[4760]: I0930 07:55:18.257481 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-hl6sj" event={"ID":"65500975-80f6-4dae-a528-33950d370831","Type":"ContainerDied","Data":"bbdfcbef04fa812e245e4342346c021102e50562d7c5c5ce73e3014b3fcbb6be"} Sep 30 07:55:19 crc kubenswrapper[4760]: I0930 07:55:19.113230 4760 patch_prober.go:28] interesting pod/machine-config-daemon-f2lrk container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 07:55:19 crc kubenswrapper[4760]: I0930 07:55:19.113497 4760 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-f2lrk" podUID="7a9c8270-6964-4886-87d0-227b05b76da4" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 30 07:55:19 crc kubenswrapper[4760]: I0930 07:55:19.113709 4760 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" 
pod="openshift-machine-config-operator/machine-config-daemon-f2lrk" Sep 30 07:55:19 crc kubenswrapper[4760]: I0930 07:55:19.115067 4760 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"dd30dd4d28eef306568f8541bb8d83a7c0af086ec623d77ff729c59fba19ae20"} pod="openshift-machine-config-operator/machine-config-daemon-f2lrk" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Sep 30 07:55:19 crc kubenswrapper[4760]: I0930 07:55:19.115211 4760 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-f2lrk" podUID="7a9c8270-6964-4886-87d0-227b05b76da4" containerName="machine-config-daemon" containerID="cri-o://dd30dd4d28eef306568f8541bb8d83a7c0af086ec623d77ff729c59fba19ae20" gracePeriod=600 Sep 30 07:55:19 crc kubenswrapper[4760]: I0930 07:55:19.284798 4760 generic.go:334] "Generic (PLEG): container finished" podID="7a9c8270-6964-4886-87d0-227b05b76da4" containerID="dd30dd4d28eef306568f8541bb8d83a7c0af086ec623d77ff729c59fba19ae20" exitCode=0 Sep 30 07:55:19 crc kubenswrapper[4760]: I0930 07:55:19.284864 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-f2lrk" event={"ID":"7a9c8270-6964-4886-87d0-227b05b76da4","Type":"ContainerDied","Data":"dd30dd4d28eef306568f8541bb8d83a7c0af086ec623d77ff729c59fba19ae20"} Sep 30 07:55:19 crc kubenswrapper[4760]: I0930 07:55:19.284919 4760 scope.go:117] "RemoveContainer" containerID="e08631326d3db4f9e31ecd2756775d73d9783f49875cc3f66b5e516f36754f34" Sep 30 07:55:19 crc kubenswrapper[4760]: I0930 07:55:19.703144 4760 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-hl6sj" Sep 30 07:55:19 crc kubenswrapper[4760]: I0930 07:55:19.746957 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qnx4d\" (UniqueName: \"kubernetes.io/projected/65500975-80f6-4dae-a528-33950d370831-kube-api-access-qnx4d\") pod \"65500975-80f6-4dae-a528-33950d370831\" (UID: \"65500975-80f6-4dae-a528-33950d370831\") " Sep 30 07:55:19 crc kubenswrapper[4760]: I0930 07:55:19.747091 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/65500975-80f6-4dae-a528-33950d370831-ssh-key\") pod \"65500975-80f6-4dae-a528-33950d370831\" (UID: \"65500975-80f6-4dae-a528-33950d370831\") " Sep 30 07:55:19 crc kubenswrapper[4760]: I0930 07:55:19.747123 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/65500975-80f6-4dae-a528-33950d370831-inventory\") pod \"65500975-80f6-4dae-a528-33950d370831\" (UID: \"65500975-80f6-4dae-a528-33950d370831\") " Sep 30 07:55:19 crc kubenswrapper[4760]: I0930 07:55:19.747286 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/65500975-80f6-4dae-a528-33950d370831-repo-setup-combined-ca-bundle\") pod \"65500975-80f6-4dae-a528-33950d370831\" (UID: \"65500975-80f6-4dae-a528-33950d370831\") " Sep 30 07:55:19 crc kubenswrapper[4760]: I0930 07:55:19.771238 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/65500975-80f6-4dae-a528-33950d370831-repo-setup-combined-ca-bundle" (OuterVolumeSpecName: "repo-setup-combined-ca-bundle") pod "65500975-80f6-4dae-a528-33950d370831" (UID: "65500975-80f6-4dae-a528-33950d370831"). InnerVolumeSpecName "repo-setup-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 07:55:19 crc kubenswrapper[4760]: I0930 07:55:19.771318 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/65500975-80f6-4dae-a528-33950d370831-kube-api-access-qnx4d" (OuterVolumeSpecName: "kube-api-access-qnx4d") pod "65500975-80f6-4dae-a528-33950d370831" (UID: "65500975-80f6-4dae-a528-33950d370831"). InnerVolumeSpecName "kube-api-access-qnx4d". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 07:55:19 crc kubenswrapper[4760]: I0930 07:55:19.779440 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/65500975-80f6-4dae-a528-33950d370831-inventory" (OuterVolumeSpecName: "inventory") pod "65500975-80f6-4dae-a528-33950d370831" (UID: "65500975-80f6-4dae-a528-33950d370831"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 07:55:19 crc kubenswrapper[4760]: I0930 07:55:19.782966 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/65500975-80f6-4dae-a528-33950d370831-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "65500975-80f6-4dae-a528-33950d370831" (UID: "65500975-80f6-4dae-a528-33950d370831"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 07:55:19 crc kubenswrapper[4760]: I0930 07:55:19.849695 4760 reconciler_common.go:293] "Volume detached for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/65500975-80f6-4dae-a528-33950d370831-repo-setup-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 07:55:19 crc kubenswrapper[4760]: I0930 07:55:19.849738 4760 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qnx4d\" (UniqueName: \"kubernetes.io/projected/65500975-80f6-4dae-a528-33950d370831-kube-api-access-qnx4d\") on node \"crc\" DevicePath \"\"" Sep 30 07:55:19 crc kubenswrapper[4760]: I0930 07:55:19.849752 4760 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/65500975-80f6-4dae-a528-33950d370831-ssh-key\") on node \"crc\" DevicePath \"\"" Sep 30 07:55:19 crc kubenswrapper[4760]: I0930 07:55:19.849765 4760 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/65500975-80f6-4dae-a528-33950d370831-inventory\") on node \"crc\" DevicePath \"\"" Sep 30 07:55:20 crc kubenswrapper[4760]: I0930 07:55:20.301120 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-f2lrk" event={"ID":"7a9c8270-6964-4886-87d0-227b05b76da4","Type":"ContainerStarted","Data":"6e360b7dc465c02adf99299c3bccec940fec7c45f12ed3788ad43373bef9d4f8"} Sep 30 07:55:20 crc kubenswrapper[4760]: I0930 07:55:20.310217 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-hl6sj" event={"ID":"65500975-80f6-4dae-a528-33950d370831","Type":"ContainerDied","Data":"7fc5cfeb70e37799a3e963736bfc0fb4f8a2ab8f3502d61e61b5c976ab1e9ea6"} Sep 30 07:55:20 crc kubenswrapper[4760]: I0930 07:55:20.310271 4760 pod_container_deletor.go:80] "Container not found in pod's containers" 
containerID="7fc5cfeb70e37799a3e963736bfc0fb4f8a2ab8f3502d61e61b5c976ab1e9ea6" Sep 30 07:55:20 crc kubenswrapper[4760]: I0930 07:55:20.310381 4760 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-hl6sj" Sep 30 07:55:20 crc kubenswrapper[4760]: I0930 07:55:20.434196 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/redhat-edpm-deployment-openstack-edpm-ipam-wld9k"] Sep 30 07:55:20 crc kubenswrapper[4760]: E0930 07:55:20.434769 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="65500975-80f6-4dae-a528-33950d370831" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Sep 30 07:55:20 crc kubenswrapper[4760]: I0930 07:55:20.434793 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="65500975-80f6-4dae-a528-33950d370831" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Sep 30 07:55:20 crc kubenswrapper[4760]: I0930 07:55:20.435045 4760 memory_manager.go:354] "RemoveStaleState removing state" podUID="65500975-80f6-4dae-a528-33950d370831" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Sep 30 07:55:20 crc kubenswrapper[4760]: I0930 07:55:20.435944 4760 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-wld9k" Sep 30 07:55:20 crc kubenswrapper[4760]: I0930 07:55:20.438853 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-8gxrf" Sep 30 07:55:20 crc kubenswrapper[4760]: I0930 07:55:20.438853 4760 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Sep 30 07:55:20 crc kubenswrapper[4760]: I0930 07:55:20.439215 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Sep 30 07:55:20 crc kubenswrapper[4760]: I0930 07:55:20.439798 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Sep 30 07:55:20 crc kubenswrapper[4760]: I0930 07:55:20.445617 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/redhat-edpm-deployment-openstack-edpm-ipam-wld9k"] Sep 30 07:55:20 crc kubenswrapper[4760]: I0930 07:55:20.566396 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qm2gx\" (UniqueName: \"kubernetes.io/projected/36b213a9-6e12-4215-be85-b1a0c647558f-kube-api-access-qm2gx\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-wld9k\" (UID: \"36b213a9-6e12-4215-be85-b1a0c647558f\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-wld9k" Sep 30 07:55:20 crc kubenswrapper[4760]: I0930 07:55:20.566471 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/36b213a9-6e12-4215-be85-b1a0c647558f-inventory\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-wld9k\" (UID: \"36b213a9-6e12-4215-be85-b1a0c647558f\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-wld9k" Sep 30 07:55:20 crc kubenswrapper[4760]: I0930 07:55:20.566585 4760 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/36b213a9-6e12-4215-be85-b1a0c647558f-ssh-key\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-wld9k\" (UID: \"36b213a9-6e12-4215-be85-b1a0c647558f\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-wld9k" Sep 30 07:55:20 crc kubenswrapper[4760]: I0930 07:55:20.668433 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/36b213a9-6e12-4215-be85-b1a0c647558f-inventory\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-wld9k\" (UID: \"36b213a9-6e12-4215-be85-b1a0c647558f\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-wld9k" Sep 30 07:55:20 crc kubenswrapper[4760]: I0930 07:55:20.668629 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/36b213a9-6e12-4215-be85-b1a0c647558f-ssh-key\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-wld9k\" (UID: \"36b213a9-6e12-4215-be85-b1a0c647558f\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-wld9k" Sep 30 07:55:20 crc kubenswrapper[4760]: I0930 07:55:20.668796 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qm2gx\" (UniqueName: \"kubernetes.io/projected/36b213a9-6e12-4215-be85-b1a0c647558f-kube-api-access-qm2gx\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-wld9k\" (UID: \"36b213a9-6e12-4215-be85-b1a0c647558f\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-wld9k" Sep 30 07:55:20 crc kubenswrapper[4760]: I0930 07:55:20.678582 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/36b213a9-6e12-4215-be85-b1a0c647558f-ssh-key\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-wld9k\" (UID: \"36b213a9-6e12-4215-be85-b1a0c647558f\") " 
pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-wld9k" Sep 30 07:55:20 crc kubenswrapper[4760]: I0930 07:55:20.678865 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/36b213a9-6e12-4215-be85-b1a0c647558f-inventory\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-wld9k\" (UID: \"36b213a9-6e12-4215-be85-b1a0c647558f\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-wld9k" Sep 30 07:55:20 crc kubenswrapper[4760]: I0930 07:55:20.696138 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qm2gx\" (UniqueName: \"kubernetes.io/projected/36b213a9-6e12-4215-be85-b1a0c647558f-kube-api-access-qm2gx\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-wld9k\" (UID: \"36b213a9-6e12-4215-be85-b1a0c647558f\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-wld9k" Sep 30 07:55:20 crc kubenswrapper[4760]: I0930 07:55:20.754930 4760 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-wld9k" Sep 30 07:55:21 crc kubenswrapper[4760]: I0930 07:55:21.379648 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/redhat-edpm-deployment-openstack-edpm-ipam-wld9k"] Sep 30 07:55:22 crc kubenswrapper[4760]: I0930 07:55:22.356926 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-wld9k" event={"ID":"36b213a9-6e12-4215-be85-b1a0c647558f","Type":"ContainerStarted","Data":"ef764fa929d8bcf18741df9265c776107666e5be0a5edef76d0c8197461c70c6"} Sep 30 07:55:22 crc kubenswrapper[4760]: I0930 07:55:22.358689 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-wld9k" event={"ID":"36b213a9-6e12-4215-be85-b1a0c647558f","Type":"ContainerStarted","Data":"349f9b285d1d009749da80d4ef460c1f6f095121d3d97a67b6cef33d6710d0cb"} Sep 30 07:55:22 crc kubenswrapper[4760]: I0930 07:55:22.387829 4760 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-wld9k" podStartSLOduration=1.910080249 podStartE2EDuration="2.387807293s" podCreationTimestamp="2025-09-30 07:55:20 +0000 UTC" firstStartedPulling="2025-09-30 07:55:21.373211903 +0000 UTC m=+1307.016118315" lastFinishedPulling="2025-09-30 07:55:21.850938907 +0000 UTC m=+1307.493845359" observedRunningTime="2025-09-30 07:55:22.383916354 +0000 UTC m=+1308.026822776" watchObservedRunningTime="2025-09-30 07:55:22.387807293 +0000 UTC m=+1308.030713725" Sep 30 07:55:25 crc kubenswrapper[4760]: I0930 07:55:25.390235 4760 generic.go:334] "Generic (PLEG): container finished" podID="36b213a9-6e12-4215-be85-b1a0c647558f" containerID="ef764fa929d8bcf18741df9265c776107666e5be0a5edef76d0c8197461c70c6" exitCode=0 Sep 30 07:55:25 crc kubenswrapper[4760]: I0930 07:55:25.390423 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-wld9k" event={"ID":"36b213a9-6e12-4215-be85-b1a0c647558f","Type":"ContainerDied","Data":"ef764fa929d8bcf18741df9265c776107666e5be0a5edef76d0c8197461c70c6"} Sep 30 07:55:26 crc kubenswrapper[4760]: I0930 07:55:26.846087 4760 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-wld9k" Sep 30 07:55:26 crc kubenswrapper[4760]: I0930 07:55:26.907261 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qm2gx\" (UniqueName: \"kubernetes.io/projected/36b213a9-6e12-4215-be85-b1a0c647558f-kube-api-access-qm2gx\") pod \"36b213a9-6e12-4215-be85-b1a0c647558f\" (UID: \"36b213a9-6e12-4215-be85-b1a0c647558f\") " Sep 30 07:55:26 crc kubenswrapper[4760]: I0930 07:55:26.907404 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/36b213a9-6e12-4215-be85-b1a0c647558f-ssh-key\") pod \"36b213a9-6e12-4215-be85-b1a0c647558f\" (UID: \"36b213a9-6e12-4215-be85-b1a0c647558f\") " Sep 30 07:55:26 crc kubenswrapper[4760]: I0930 07:55:26.907466 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/36b213a9-6e12-4215-be85-b1a0c647558f-inventory\") pod \"36b213a9-6e12-4215-be85-b1a0c647558f\" (UID: \"36b213a9-6e12-4215-be85-b1a0c647558f\") " Sep 30 07:55:26 crc kubenswrapper[4760]: I0930 07:55:26.915574 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/36b213a9-6e12-4215-be85-b1a0c647558f-kube-api-access-qm2gx" (OuterVolumeSpecName: "kube-api-access-qm2gx") pod "36b213a9-6e12-4215-be85-b1a0c647558f" (UID: "36b213a9-6e12-4215-be85-b1a0c647558f"). InnerVolumeSpecName "kube-api-access-qm2gx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 07:55:26 crc kubenswrapper[4760]: I0930 07:55:26.940504 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/36b213a9-6e12-4215-be85-b1a0c647558f-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "36b213a9-6e12-4215-be85-b1a0c647558f" (UID: "36b213a9-6e12-4215-be85-b1a0c647558f"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 07:55:26 crc kubenswrapper[4760]: I0930 07:55:26.961075 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/36b213a9-6e12-4215-be85-b1a0c647558f-inventory" (OuterVolumeSpecName: "inventory") pod "36b213a9-6e12-4215-be85-b1a0c647558f" (UID: "36b213a9-6e12-4215-be85-b1a0c647558f"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 07:55:27 crc kubenswrapper[4760]: I0930 07:55:27.011199 4760 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qm2gx\" (UniqueName: \"kubernetes.io/projected/36b213a9-6e12-4215-be85-b1a0c647558f-kube-api-access-qm2gx\") on node \"crc\" DevicePath \"\"" Sep 30 07:55:27 crc kubenswrapper[4760]: I0930 07:55:27.011250 4760 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/36b213a9-6e12-4215-be85-b1a0c647558f-ssh-key\") on node \"crc\" DevicePath \"\"" Sep 30 07:55:27 crc kubenswrapper[4760]: I0930 07:55:27.011272 4760 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/36b213a9-6e12-4215-be85-b1a0c647558f-inventory\") on node \"crc\" DevicePath \"\"" Sep 30 07:55:27 crc kubenswrapper[4760]: I0930 07:55:27.412568 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-wld9k" 
event={"ID":"36b213a9-6e12-4215-be85-b1a0c647558f","Type":"ContainerDied","Data":"349f9b285d1d009749da80d4ef460c1f6f095121d3d97a67b6cef33d6710d0cb"} Sep 30 07:55:27 crc kubenswrapper[4760]: I0930 07:55:27.412960 4760 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="349f9b285d1d009749da80d4ef460c1f6f095121d3d97a67b6cef33d6710d0cb" Sep 30 07:55:27 crc kubenswrapper[4760]: I0930 07:55:27.412660 4760 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-wld9k" Sep 30 07:55:27 crc kubenswrapper[4760]: I0930 07:55:27.482248 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-fnxp6"] Sep 30 07:55:27 crc kubenswrapper[4760]: E0930 07:55:27.482732 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="36b213a9-6e12-4215-be85-b1a0c647558f" containerName="redhat-edpm-deployment-openstack-edpm-ipam" Sep 30 07:55:27 crc kubenswrapper[4760]: I0930 07:55:27.482756 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="36b213a9-6e12-4215-be85-b1a0c647558f" containerName="redhat-edpm-deployment-openstack-edpm-ipam" Sep 30 07:55:27 crc kubenswrapper[4760]: I0930 07:55:27.483053 4760 memory_manager.go:354] "RemoveStaleState removing state" podUID="36b213a9-6e12-4215-be85-b1a0c647558f" containerName="redhat-edpm-deployment-openstack-edpm-ipam" Sep 30 07:55:27 crc kubenswrapper[4760]: I0930 07:55:27.483892 4760 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-fnxp6" Sep 30 07:55:27 crc kubenswrapper[4760]: I0930 07:55:27.491933 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Sep 30 07:55:27 crc kubenswrapper[4760]: I0930 07:55:27.492402 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-8gxrf" Sep 30 07:55:27 crc kubenswrapper[4760]: I0930 07:55:27.492425 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Sep 30 07:55:27 crc kubenswrapper[4760]: I0930 07:55:27.492912 4760 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Sep 30 07:55:27 crc kubenswrapper[4760]: I0930 07:55:27.493986 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-fnxp6"] Sep 30 07:55:27 crc kubenswrapper[4760]: I0930 07:55:27.525505 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/64e019eb-1763-4e9e-8c00-c4312d782981-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-fnxp6\" (UID: \"64e019eb-1763-4e9e-8c00-c4312d782981\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-fnxp6" Sep 30 07:55:27 crc kubenswrapper[4760]: I0930 07:55:27.525566 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/64e019eb-1763-4e9e-8c00-c4312d782981-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-fnxp6\" (UID: \"64e019eb-1763-4e9e-8c00-c4312d782981\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-fnxp6" Sep 30 07:55:27 crc kubenswrapper[4760]: I0930 07:55:27.525639 4760 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/64e019eb-1763-4e9e-8c00-c4312d782981-ssh-key\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-fnxp6\" (UID: \"64e019eb-1763-4e9e-8c00-c4312d782981\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-fnxp6" Sep 30 07:55:27 crc kubenswrapper[4760]: I0930 07:55:27.525677 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2gcbx\" (UniqueName: \"kubernetes.io/projected/64e019eb-1763-4e9e-8c00-c4312d782981-kube-api-access-2gcbx\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-fnxp6\" (UID: \"64e019eb-1763-4e9e-8c00-c4312d782981\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-fnxp6" Sep 30 07:55:27 crc kubenswrapper[4760]: I0930 07:55:27.628243 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2gcbx\" (UniqueName: \"kubernetes.io/projected/64e019eb-1763-4e9e-8c00-c4312d782981-kube-api-access-2gcbx\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-fnxp6\" (UID: \"64e019eb-1763-4e9e-8c00-c4312d782981\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-fnxp6" Sep 30 07:55:27 crc kubenswrapper[4760]: I0930 07:55:27.628504 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/64e019eb-1763-4e9e-8c00-c4312d782981-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-fnxp6\" (UID: \"64e019eb-1763-4e9e-8c00-c4312d782981\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-fnxp6" Sep 30 07:55:27 crc kubenswrapper[4760]: I0930 07:55:27.628577 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/64e019eb-1763-4e9e-8c00-c4312d782981-bootstrap-combined-ca-bundle\") pod 
\"bootstrap-edpm-deployment-openstack-edpm-ipam-fnxp6\" (UID: \"64e019eb-1763-4e9e-8c00-c4312d782981\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-fnxp6" Sep 30 07:55:27 crc kubenswrapper[4760]: I0930 07:55:27.628832 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/64e019eb-1763-4e9e-8c00-c4312d782981-ssh-key\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-fnxp6\" (UID: \"64e019eb-1763-4e9e-8c00-c4312d782981\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-fnxp6" Sep 30 07:55:27 crc kubenswrapper[4760]: I0930 07:55:27.633511 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/64e019eb-1763-4e9e-8c00-c4312d782981-ssh-key\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-fnxp6\" (UID: \"64e019eb-1763-4e9e-8c00-c4312d782981\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-fnxp6" Sep 30 07:55:27 crc kubenswrapper[4760]: I0930 07:55:27.633703 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/64e019eb-1763-4e9e-8c00-c4312d782981-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-fnxp6\" (UID: \"64e019eb-1763-4e9e-8c00-c4312d782981\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-fnxp6" Sep 30 07:55:27 crc kubenswrapper[4760]: I0930 07:55:27.633857 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/64e019eb-1763-4e9e-8c00-c4312d782981-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-fnxp6\" (UID: \"64e019eb-1763-4e9e-8c00-c4312d782981\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-fnxp6" Sep 30 07:55:27 crc kubenswrapper[4760]: I0930 07:55:27.646410 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"kube-api-access-2gcbx\" (UniqueName: \"kubernetes.io/projected/64e019eb-1763-4e9e-8c00-c4312d782981-kube-api-access-2gcbx\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-fnxp6\" (UID: \"64e019eb-1763-4e9e-8c00-c4312d782981\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-fnxp6" Sep 30 07:55:27 crc kubenswrapper[4760]: I0930 07:55:27.801375 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-fnxp6" Sep 30 07:55:28 crc kubenswrapper[4760]: I0930 07:55:28.350356 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-fnxp6"] Sep 30 07:55:28 crc kubenswrapper[4760]: I0930 07:55:28.425407 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-fnxp6" event={"ID":"64e019eb-1763-4e9e-8c00-c4312d782981","Type":"ContainerStarted","Data":"b46223e1e52c22fcff461a7c5bac7cbaed4b8217a453bfb76065fb7063e10408"} Sep 30 07:55:29 crc kubenswrapper[4760]: I0930 07:55:29.438570 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-fnxp6" event={"ID":"64e019eb-1763-4e9e-8c00-c4312d782981","Type":"ContainerStarted","Data":"8a24a09c17ff3ec08bd0602228486459b7734168ba81f82d5aeccfa7c73016f9"} Sep 30 07:55:29 crc kubenswrapper[4760]: I0930 07:55:29.459290 4760 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-fnxp6" podStartSLOduration=2.054484942 podStartE2EDuration="2.459261122s" podCreationTimestamp="2025-09-30 07:55:27 +0000 UTC" firstStartedPulling="2025-09-30 07:55:28.347062219 +0000 UTC m=+1313.989968641" lastFinishedPulling="2025-09-30 07:55:28.751838409 +0000 UTC m=+1314.394744821" observedRunningTime="2025-09-30 07:55:29.457318452 +0000 UTC m=+1315.100224874" watchObservedRunningTime="2025-09-30 
07:55:29.459261122 +0000 UTC m=+1315.102167564" Sep 30 07:55:42 crc kubenswrapper[4760]: I0930 07:55:42.998378 4760 scope.go:117] "RemoveContainer" containerID="48be93d0fea046f3d8ada9837f967d2d4455be9eb009f74eef391a84dce05a7a" Sep 30 07:55:43 crc kubenswrapper[4760]: I0930 07:55:43.052384 4760 scope.go:117] "RemoveContainer" containerID="b2bb2029864022dfad325772a095e2328e30a0a3c9ffd58158203cedd2b8bef6" Sep 30 07:56:43 crc kubenswrapper[4760]: I0930 07:56:43.173960 4760 scope.go:117] "RemoveContainer" containerID="36d6105918e0e3dba49282bea8e38b9720ce0cff8c55ddd63be3ab273613a242" Sep 30 07:56:43 crc kubenswrapper[4760]: I0930 07:56:43.216546 4760 scope.go:117] "RemoveContainer" containerID="f80d5f35ff3c46d2c909cd75de71a5cd1d0f5af795b416a56386816baed4e83a" Sep 30 07:56:43 crc kubenswrapper[4760]: I0930 07:56:43.290112 4760 scope.go:117] "RemoveContainer" containerID="6ec9d5a22f841fdef7de549baa0a72a7a08951cc585d48d88bdbfc47d7ac96e1" Sep 30 07:57:02 crc kubenswrapper[4760]: I0930 07:57:02.523220 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-wm5td"] Sep 30 07:57:02 crc kubenswrapper[4760]: I0930 07:57:02.526387 4760 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-wm5td" Sep 30 07:57:02 crc kubenswrapper[4760]: I0930 07:57:02.551533 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-wm5td"] Sep 30 07:57:02 crc kubenswrapper[4760]: I0930 07:57:02.628955 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/41d1c3fd-9820-46b9-9dc1-45fc6d1a66f4-utilities\") pod \"community-operators-wm5td\" (UID: \"41d1c3fd-9820-46b9-9dc1-45fc6d1a66f4\") " pod="openshift-marketplace/community-operators-wm5td" Sep 30 07:57:02 crc kubenswrapper[4760]: I0930 07:57:02.629044 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/41d1c3fd-9820-46b9-9dc1-45fc6d1a66f4-catalog-content\") pod \"community-operators-wm5td\" (UID: \"41d1c3fd-9820-46b9-9dc1-45fc6d1a66f4\") " pod="openshift-marketplace/community-operators-wm5td" Sep 30 07:57:02 crc kubenswrapper[4760]: I0930 07:57:02.629343 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m4bcw\" (UniqueName: \"kubernetes.io/projected/41d1c3fd-9820-46b9-9dc1-45fc6d1a66f4-kube-api-access-m4bcw\") pod \"community-operators-wm5td\" (UID: \"41d1c3fd-9820-46b9-9dc1-45fc6d1a66f4\") " pod="openshift-marketplace/community-operators-wm5td" Sep 30 07:57:02 crc kubenswrapper[4760]: I0930 07:57:02.731591 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/41d1c3fd-9820-46b9-9dc1-45fc6d1a66f4-utilities\") pod \"community-operators-wm5td\" (UID: \"41d1c3fd-9820-46b9-9dc1-45fc6d1a66f4\") " pod="openshift-marketplace/community-operators-wm5td" Sep 30 07:57:02 crc kubenswrapper[4760]: I0930 07:57:02.731667 4760 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/41d1c3fd-9820-46b9-9dc1-45fc6d1a66f4-catalog-content\") pod \"community-operators-wm5td\" (UID: \"41d1c3fd-9820-46b9-9dc1-45fc6d1a66f4\") " pod="openshift-marketplace/community-operators-wm5td" Sep 30 07:57:02 crc kubenswrapper[4760]: I0930 07:57:02.731808 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m4bcw\" (UniqueName: \"kubernetes.io/projected/41d1c3fd-9820-46b9-9dc1-45fc6d1a66f4-kube-api-access-m4bcw\") pod \"community-operators-wm5td\" (UID: \"41d1c3fd-9820-46b9-9dc1-45fc6d1a66f4\") " pod="openshift-marketplace/community-operators-wm5td" Sep 30 07:57:02 crc kubenswrapper[4760]: I0930 07:57:02.732108 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/41d1c3fd-9820-46b9-9dc1-45fc6d1a66f4-utilities\") pod \"community-operators-wm5td\" (UID: \"41d1c3fd-9820-46b9-9dc1-45fc6d1a66f4\") " pod="openshift-marketplace/community-operators-wm5td" Sep 30 07:57:02 crc kubenswrapper[4760]: I0930 07:57:02.732337 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/41d1c3fd-9820-46b9-9dc1-45fc6d1a66f4-catalog-content\") pod \"community-operators-wm5td\" (UID: \"41d1c3fd-9820-46b9-9dc1-45fc6d1a66f4\") " pod="openshift-marketplace/community-operators-wm5td" Sep 30 07:57:02 crc kubenswrapper[4760]: I0930 07:57:02.758268 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m4bcw\" (UniqueName: \"kubernetes.io/projected/41d1c3fd-9820-46b9-9dc1-45fc6d1a66f4-kube-api-access-m4bcw\") pod \"community-operators-wm5td\" (UID: \"41d1c3fd-9820-46b9-9dc1-45fc6d1a66f4\") " pod="openshift-marketplace/community-operators-wm5td" Sep 30 07:57:02 crc kubenswrapper[4760]: I0930 07:57:02.865166 4760 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-wm5td" Sep 30 07:57:03 crc kubenswrapper[4760]: I0930 07:57:03.333129 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-wm5td"] Sep 30 07:57:03 crc kubenswrapper[4760]: I0930 07:57:03.588954 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wm5td" event={"ID":"41d1c3fd-9820-46b9-9dc1-45fc6d1a66f4","Type":"ContainerStarted","Data":"7af22bef7214b153cd77c8ef8660265a441c8d8971e434b14c57495013fbc2e7"} Sep 30 07:57:03 crc kubenswrapper[4760]: I0930 07:57:03.589252 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wm5td" event={"ID":"41d1c3fd-9820-46b9-9dc1-45fc6d1a66f4","Type":"ContainerStarted","Data":"57340137d430070922e62a8b9549694e343a765a347d09d7c6f438fcba2f19fd"} Sep 30 07:57:04 crc kubenswrapper[4760]: I0930 07:57:04.599294 4760 generic.go:334] "Generic (PLEG): container finished" podID="41d1c3fd-9820-46b9-9dc1-45fc6d1a66f4" containerID="7af22bef7214b153cd77c8ef8660265a441c8d8971e434b14c57495013fbc2e7" exitCode=0 Sep 30 07:57:04 crc kubenswrapper[4760]: I0930 07:57:04.599354 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wm5td" event={"ID":"41d1c3fd-9820-46b9-9dc1-45fc6d1a66f4","Type":"ContainerDied","Data":"7af22bef7214b153cd77c8ef8660265a441c8d8971e434b14c57495013fbc2e7"} Sep 30 07:57:05 crc kubenswrapper[4760]: I0930 07:57:05.612446 4760 generic.go:334] "Generic (PLEG): container finished" podID="41d1c3fd-9820-46b9-9dc1-45fc6d1a66f4" containerID="b0f9009d4dd6082f146cda9e759fb2212d33456c903ccb73c40f6dbfd3b48cc2" exitCode=0 Sep 30 07:57:05 crc kubenswrapper[4760]: I0930 07:57:05.612532 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wm5td" 
event={"ID":"41d1c3fd-9820-46b9-9dc1-45fc6d1a66f4","Type":"ContainerDied","Data":"b0f9009d4dd6082f146cda9e759fb2212d33456c903ccb73c40f6dbfd3b48cc2"} Sep 30 07:57:06 crc kubenswrapper[4760]: I0930 07:57:06.628541 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wm5td" event={"ID":"41d1c3fd-9820-46b9-9dc1-45fc6d1a66f4","Type":"ContainerStarted","Data":"1fd3827d3f6c19ac19420d2533212ddd09e85819645b7ca87e2d3c29d23cde5d"} Sep 30 07:57:06 crc kubenswrapper[4760]: I0930 07:57:06.661198 4760 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-wm5td" podStartSLOduration=2.23022837 podStartE2EDuration="4.661178854s" podCreationTimestamp="2025-09-30 07:57:02 +0000 UTC" firstStartedPulling="2025-09-30 07:57:03.590564088 +0000 UTC m=+1409.233470500" lastFinishedPulling="2025-09-30 07:57:06.021514562 +0000 UTC m=+1411.664420984" observedRunningTime="2025-09-30 07:57:06.650029909 +0000 UTC m=+1412.292936331" watchObservedRunningTime="2025-09-30 07:57:06.661178854 +0000 UTC m=+1412.304085266" Sep 30 07:57:12 crc kubenswrapper[4760]: I0930 07:57:12.865472 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-wm5td" Sep 30 07:57:12 crc kubenswrapper[4760]: I0930 07:57:12.866518 4760 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-wm5td" Sep 30 07:57:12 crc kubenswrapper[4760]: I0930 07:57:12.931739 4760 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-wm5td" Sep 30 07:57:13 crc kubenswrapper[4760]: I0930 07:57:13.770486 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-wm5td" Sep 30 07:57:13 crc kubenswrapper[4760]: I0930 07:57:13.854837 4760 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/community-operators-wm5td"] Sep 30 07:57:15 crc kubenswrapper[4760]: I0930 07:57:15.723884 4760 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-wm5td" podUID="41d1c3fd-9820-46b9-9dc1-45fc6d1a66f4" containerName="registry-server" containerID="cri-o://1fd3827d3f6c19ac19420d2533212ddd09e85819645b7ca87e2d3c29d23cde5d" gracePeriod=2 Sep 30 07:57:16 crc kubenswrapper[4760]: I0930 07:57:16.207795 4760 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-wm5td" Sep 30 07:57:16 crc kubenswrapper[4760]: I0930 07:57:16.313132 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m4bcw\" (UniqueName: \"kubernetes.io/projected/41d1c3fd-9820-46b9-9dc1-45fc6d1a66f4-kube-api-access-m4bcw\") pod \"41d1c3fd-9820-46b9-9dc1-45fc6d1a66f4\" (UID: \"41d1c3fd-9820-46b9-9dc1-45fc6d1a66f4\") " Sep 30 07:57:16 crc kubenswrapper[4760]: I0930 07:57:16.313974 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/41d1c3fd-9820-46b9-9dc1-45fc6d1a66f4-catalog-content\") pod \"41d1c3fd-9820-46b9-9dc1-45fc6d1a66f4\" (UID: \"41d1c3fd-9820-46b9-9dc1-45fc6d1a66f4\") " Sep 30 07:57:16 crc kubenswrapper[4760]: I0930 07:57:16.314048 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/41d1c3fd-9820-46b9-9dc1-45fc6d1a66f4-utilities\") pod \"41d1c3fd-9820-46b9-9dc1-45fc6d1a66f4\" (UID: \"41d1c3fd-9820-46b9-9dc1-45fc6d1a66f4\") " Sep 30 07:57:16 crc kubenswrapper[4760]: I0930 07:57:16.314913 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/41d1c3fd-9820-46b9-9dc1-45fc6d1a66f4-utilities" (OuterVolumeSpecName: "utilities") pod "41d1c3fd-9820-46b9-9dc1-45fc6d1a66f4" (UID: 
"41d1c3fd-9820-46b9-9dc1-45fc6d1a66f4"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 07:57:16 crc kubenswrapper[4760]: I0930 07:57:16.321336 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/41d1c3fd-9820-46b9-9dc1-45fc6d1a66f4-kube-api-access-m4bcw" (OuterVolumeSpecName: "kube-api-access-m4bcw") pod "41d1c3fd-9820-46b9-9dc1-45fc6d1a66f4" (UID: "41d1c3fd-9820-46b9-9dc1-45fc6d1a66f4"). InnerVolumeSpecName "kube-api-access-m4bcw". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 07:57:16 crc kubenswrapper[4760]: I0930 07:57:16.394590 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/41d1c3fd-9820-46b9-9dc1-45fc6d1a66f4-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "41d1c3fd-9820-46b9-9dc1-45fc6d1a66f4" (UID: "41d1c3fd-9820-46b9-9dc1-45fc6d1a66f4"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 07:57:16 crc kubenswrapper[4760]: I0930 07:57:16.417649 4760 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/41d1c3fd-9820-46b9-9dc1-45fc6d1a66f4-utilities\") on node \"crc\" DevicePath \"\"" Sep 30 07:57:16 crc kubenswrapper[4760]: I0930 07:57:16.417722 4760 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m4bcw\" (UniqueName: \"kubernetes.io/projected/41d1c3fd-9820-46b9-9dc1-45fc6d1a66f4-kube-api-access-m4bcw\") on node \"crc\" DevicePath \"\"" Sep 30 07:57:16 crc kubenswrapper[4760]: I0930 07:57:16.417920 4760 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/41d1c3fd-9820-46b9-9dc1-45fc6d1a66f4-catalog-content\") on node \"crc\" DevicePath \"\"" Sep 30 07:57:16 crc kubenswrapper[4760]: I0930 07:57:16.736169 4760 generic.go:334] "Generic (PLEG): container finished" 
podID="41d1c3fd-9820-46b9-9dc1-45fc6d1a66f4" containerID="1fd3827d3f6c19ac19420d2533212ddd09e85819645b7ca87e2d3c29d23cde5d" exitCode=0 Sep 30 07:57:16 crc kubenswrapper[4760]: I0930 07:57:16.736212 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wm5td" event={"ID":"41d1c3fd-9820-46b9-9dc1-45fc6d1a66f4","Type":"ContainerDied","Data":"1fd3827d3f6c19ac19420d2533212ddd09e85819645b7ca87e2d3c29d23cde5d"} Sep 30 07:57:16 crc kubenswrapper[4760]: I0930 07:57:16.736268 4760 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-wm5td" Sep 30 07:57:16 crc kubenswrapper[4760]: I0930 07:57:16.736317 4760 scope.go:117] "RemoveContainer" containerID="1fd3827d3f6c19ac19420d2533212ddd09e85819645b7ca87e2d3c29d23cde5d" Sep 30 07:57:16 crc kubenswrapper[4760]: I0930 07:57:16.736284 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wm5td" event={"ID":"41d1c3fd-9820-46b9-9dc1-45fc6d1a66f4","Type":"ContainerDied","Data":"57340137d430070922e62a8b9549694e343a765a347d09d7c6f438fcba2f19fd"} Sep 30 07:57:16 crc kubenswrapper[4760]: I0930 07:57:16.772127 4760 scope.go:117] "RemoveContainer" containerID="b0f9009d4dd6082f146cda9e759fb2212d33456c903ccb73c40f6dbfd3b48cc2" Sep 30 07:57:16 crc kubenswrapper[4760]: I0930 07:57:16.795668 4760 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-wm5td"] Sep 30 07:57:16 crc kubenswrapper[4760]: I0930 07:57:16.796771 4760 scope.go:117] "RemoveContainer" containerID="7af22bef7214b153cd77c8ef8660265a441c8d8971e434b14c57495013fbc2e7" Sep 30 07:57:16 crc kubenswrapper[4760]: I0930 07:57:16.809240 4760 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-wm5td"] Sep 30 07:57:16 crc kubenswrapper[4760]: I0930 07:57:16.858523 4760 scope.go:117] "RemoveContainer" 
containerID="1fd3827d3f6c19ac19420d2533212ddd09e85819645b7ca87e2d3c29d23cde5d" Sep 30 07:57:16 crc kubenswrapper[4760]: E0930 07:57:16.858944 4760 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1fd3827d3f6c19ac19420d2533212ddd09e85819645b7ca87e2d3c29d23cde5d\": container with ID starting with 1fd3827d3f6c19ac19420d2533212ddd09e85819645b7ca87e2d3c29d23cde5d not found: ID does not exist" containerID="1fd3827d3f6c19ac19420d2533212ddd09e85819645b7ca87e2d3c29d23cde5d" Sep 30 07:57:16 crc kubenswrapper[4760]: I0930 07:57:16.858987 4760 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1fd3827d3f6c19ac19420d2533212ddd09e85819645b7ca87e2d3c29d23cde5d"} err="failed to get container status \"1fd3827d3f6c19ac19420d2533212ddd09e85819645b7ca87e2d3c29d23cde5d\": rpc error: code = NotFound desc = could not find container \"1fd3827d3f6c19ac19420d2533212ddd09e85819645b7ca87e2d3c29d23cde5d\": container with ID starting with 1fd3827d3f6c19ac19420d2533212ddd09e85819645b7ca87e2d3c29d23cde5d not found: ID does not exist" Sep 30 07:57:16 crc kubenswrapper[4760]: I0930 07:57:16.859014 4760 scope.go:117] "RemoveContainer" containerID="b0f9009d4dd6082f146cda9e759fb2212d33456c903ccb73c40f6dbfd3b48cc2" Sep 30 07:57:16 crc kubenswrapper[4760]: E0930 07:57:16.859316 4760 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b0f9009d4dd6082f146cda9e759fb2212d33456c903ccb73c40f6dbfd3b48cc2\": container with ID starting with b0f9009d4dd6082f146cda9e759fb2212d33456c903ccb73c40f6dbfd3b48cc2 not found: ID does not exist" containerID="b0f9009d4dd6082f146cda9e759fb2212d33456c903ccb73c40f6dbfd3b48cc2" Sep 30 07:57:16 crc kubenswrapper[4760]: I0930 07:57:16.859346 4760 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"b0f9009d4dd6082f146cda9e759fb2212d33456c903ccb73c40f6dbfd3b48cc2"} err="failed to get container status \"b0f9009d4dd6082f146cda9e759fb2212d33456c903ccb73c40f6dbfd3b48cc2\": rpc error: code = NotFound desc = could not find container \"b0f9009d4dd6082f146cda9e759fb2212d33456c903ccb73c40f6dbfd3b48cc2\": container with ID starting with b0f9009d4dd6082f146cda9e759fb2212d33456c903ccb73c40f6dbfd3b48cc2 not found: ID does not exist" Sep 30 07:57:16 crc kubenswrapper[4760]: I0930 07:57:16.859373 4760 scope.go:117] "RemoveContainer" containerID="7af22bef7214b153cd77c8ef8660265a441c8d8971e434b14c57495013fbc2e7" Sep 30 07:57:16 crc kubenswrapper[4760]: E0930 07:57:16.859598 4760 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7af22bef7214b153cd77c8ef8660265a441c8d8971e434b14c57495013fbc2e7\": container with ID starting with 7af22bef7214b153cd77c8ef8660265a441c8d8971e434b14c57495013fbc2e7 not found: ID does not exist" containerID="7af22bef7214b153cd77c8ef8660265a441c8d8971e434b14c57495013fbc2e7" Sep 30 07:57:16 crc kubenswrapper[4760]: I0930 07:57:16.859625 4760 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7af22bef7214b153cd77c8ef8660265a441c8d8971e434b14c57495013fbc2e7"} err="failed to get container status \"7af22bef7214b153cd77c8ef8660265a441c8d8971e434b14c57495013fbc2e7\": rpc error: code = NotFound desc = could not find container \"7af22bef7214b153cd77c8ef8660265a441c8d8971e434b14c57495013fbc2e7\": container with ID starting with 7af22bef7214b153cd77c8ef8660265a441c8d8971e434b14c57495013fbc2e7 not found: ID does not exist" Sep 30 07:57:17 crc kubenswrapper[4760]: I0930 07:57:17.076679 4760 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="41d1c3fd-9820-46b9-9dc1-45fc6d1a66f4" path="/var/lib/kubelet/pods/41d1c3fd-9820-46b9-9dc1-45fc6d1a66f4/volumes" Sep 30 07:57:19 crc kubenswrapper[4760]: I0930 
07:57:19.113269 4760 patch_prober.go:28] interesting pod/machine-config-daemon-f2lrk container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 07:57:19 crc kubenswrapper[4760]: I0930 07:57:19.113762 4760 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-f2lrk" podUID="7a9c8270-6964-4886-87d0-227b05b76da4" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 30 07:57:35 crc kubenswrapper[4760]: I0930 07:57:35.742592 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-vd2zk"] Sep 30 07:57:35 crc kubenswrapper[4760]: E0930 07:57:35.744261 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="41d1c3fd-9820-46b9-9dc1-45fc6d1a66f4" containerName="registry-server" Sep 30 07:57:35 crc kubenswrapper[4760]: I0930 07:57:35.744294 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="41d1c3fd-9820-46b9-9dc1-45fc6d1a66f4" containerName="registry-server" Sep 30 07:57:35 crc kubenswrapper[4760]: E0930 07:57:35.744413 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="41d1c3fd-9820-46b9-9dc1-45fc6d1a66f4" containerName="extract-utilities" Sep 30 07:57:35 crc kubenswrapper[4760]: I0930 07:57:35.744434 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="41d1c3fd-9820-46b9-9dc1-45fc6d1a66f4" containerName="extract-utilities" Sep 30 07:57:35 crc kubenswrapper[4760]: E0930 07:57:35.744463 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="41d1c3fd-9820-46b9-9dc1-45fc6d1a66f4" containerName="extract-content" Sep 30 07:57:35 crc kubenswrapper[4760]: I0930 07:57:35.744482 4760 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="41d1c3fd-9820-46b9-9dc1-45fc6d1a66f4" containerName="extract-content" Sep 30 07:57:35 crc kubenswrapper[4760]: I0930 07:57:35.745050 4760 memory_manager.go:354] "RemoveStaleState removing state" podUID="41d1c3fd-9820-46b9-9dc1-45fc6d1a66f4" containerName="registry-server" Sep 30 07:57:35 crc kubenswrapper[4760]: I0930 07:57:35.749092 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-vd2zk" Sep 30 07:57:35 crc kubenswrapper[4760]: I0930 07:57:35.757936 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-vd2zk"] Sep 30 07:57:35 crc kubenswrapper[4760]: I0930 07:57:35.881488 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a24deab6-2aa5-4b18-9b88-a3bae3493049-utilities\") pod \"certified-operators-vd2zk\" (UID: \"a24deab6-2aa5-4b18-9b88-a3bae3493049\") " pod="openshift-marketplace/certified-operators-vd2zk" Sep 30 07:57:35 crc kubenswrapper[4760]: I0930 07:57:35.881592 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w4jc6\" (UniqueName: \"kubernetes.io/projected/a24deab6-2aa5-4b18-9b88-a3bae3493049-kube-api-access-w4jc6\") pod \"certified-operators-vd2zk\" (UID: \"a24deab6-2aa5-4b18-9b88-a3bae3493049\") " pod="openshift-marketplace/certified-operators-vd2zk" Sep 30 07:57:35 crc kubenswrapper[4760]: I0930 07:57:35.881733 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a24deab6-2aa5-4b18-9b88-a3bae3493049-catalog-content\") pod \"certified-operators-vd2zk\" (UID: \"a24deab6-2aa5-4b18-9b88-a3bae3493049\") " pod="openshift-marketplace/certified-operators-vd2zk" Sep 30 07:57:35 crc kubenswrapper[4760]: I0930 07:57:35.983737 4760 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a24deab6-2aa5-4b18-9b88-a3bae3493049-utilities\") pod \"certified-operators-vd2zk\" (UID: \"a24deab6-2aa5-4b18-9b88-a3bae3493049\") " pod="openshift-marketplace/certified-operators-vd2zk" Sep 30 07:57:35 crc kubenswrapper[4760]: I0930 07:57:35.983814 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w4jc6\" (UniqueName: \"kubernetes.io/projected/a24deab6-2aa5-4b18-9b88-a3bae3493049-kube-api-access-w4jc6\") pod \"certified-operators-vd2zk\" (UID: \"a24deab6-2aa5-4b18-9b88-a3bae3493049\") " pod="openshift-marketplace/certified-operators-vd2zk" Sep 30 07:57:35 crc kubenswrapper[4760]: I0930 07:57:35.983883 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a24deab6-2aa5-4b18-9b88-a3bae3493049-catalog-content\") pod \"certified-operators-vd2zk\" (UID: \"a24deab6-2aa5-4b18-9b88-a3bae3493049\") " pod="openshift-marketplace/certified-operators-vd2zk" Sep 30 07:57:35 crc kubenswrapper[4760]: I0930 07:57:35.984779 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a24deab6-2aa5-4b18-9b88-a3bae3493049-utilities\") pod \"certified-operators-vd2zk\" (UID: \"a24deab6-2aa5-4b18-9b88-a3bae3493049\") " pod="openshift-marketplace/certified-operators-vd2zk" Sep 30 07:57:35 crc kubenswrapper[4760]: I0930 07:57:35.985744 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a24deab6-2aa5-4b18-9b88-a3bae3493049-catalog-content\") pod \"certified-operators-vd2zk\" (UID: \"a24deab6-2aa5-4b18-9b88-a3bae3493049\") " pod="openshift-marketplace/certified-operators-vd2zk" Sep 30 07:57:36 crc kubenswrapper[4760]: I0930 07:57:36.003250 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-w4jc6\" (UniqueName: \"kubernetes.io/projected/a24deab6-2aa5-4b18-9b88-a3bae3493049-kube-api-access-w4jc6\") pod \"certified-operators-vd2zk\" (UID: \"a24deab6-2aa5-4b18-9b88-a3bae3493049\") " pod="openshift-marketplace/certified-operators-vd2zk" Sep 30 07:57:36 crc kubenswrapper[4760]: I0930 07:57:36.085075 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-vd2zk" Sep 30 07:57:36 crc kubenswrapper[4760]: I0930 07:57:36.555542 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-vd2zk"] Sep 30 07:57:36 crc kubenswrapper[4760]: I0930 07:57:36.985500 4760 generic.go:334] "Generic (PLEG): container finished" podID="a24deab6-2aa5-4b18-9b88-a3bae3493049" containerID="4b22a77d496113b7fb4acd1010f0814849f2500ee77018624834c994b9cb5e92" exitCode=0 Sep 30 07:57:36 crc kubenswrapper[4760]: I0930 07:57:36.985548 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vd2zk" event={"ID":"a24deab6-2aa5-4b18-9b88-a3bae3493049","Type":"ContainerDied","Data":"4b22a77d496113b7fb4acd1010f0814849f2500ee77018624834c994b9cb5e92"} Sep 30 07:57:36 crc kubenswrapper[4760]: I0930 07:57:36.985614 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vd2zk" event={"ID":"a24deab6-2aa5-4b18-9b88-a3bae3493049","Type":"ContainerStarted","Data":"cbdc73d038827f7799c7c003dc6fb19928c248174d007f567a5b607f8129e09e"} Sep 30 07:57:37 crc kubenswrapper[4760]: I0930 07:57:37.996213 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vd2zk" event={"ID":"a24deab6-2aa5-4b18-9b88-a3bae3493049","Type":"ContainerStarted","Data":"9e399a86d5b8362501586a32f24ceb4e46fcc67382a64fa7e4cbeeeac1a54477"} Sep 30 07:57:38 crc kubenswrapper[4760]: I0930 07:57:38.518430 4760 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-marketplace/redhat-marketplace-pplp4"] Sep 30 07:57:38 crc kubenswrapper[4760]: I0930 07:57:38.521950 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-pplp4" Sep 30 07:57:38 crc kubenswrapper[4760]: I0930 07:57:38.558863 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-pplp4"] Sep 30 07:57:38 crc kubenswrapper[4760]: I0930 07:57:38.650805 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b53f9fb0-ea06-47bb-8f52-5ea569c4ec31-utilities\") pod \"redhat-marketplace-pplp4\" (UID: \"b53f9fb0-ea06-47bb-8f52-5ea569c4ec31\") " pod="openshift-marketplace/redhat-marketplace-pplp4" Sep 30 07:57:38 crc kubenswrapper[4760]: I0930 07:57:38.651034 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dgzgt\" (UniqueName: \"kubernetes.io/projected/b53f9fb0-ea06-47bb-8f52-5ea569c4ec31-kube-api-access-dgzgt\") pod \"redhat-marketplace-pplp4\" (UID: \"b53f9fb0-ea06-47bb-8f52-5ea569c4ec31\") " pod="openshift-marketplace/redhat-marketplace-pplp4" Sep 30 07:57:38 crc kubenswrapper[4760]: I0930 07:57:38.651112 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b53f9fb0-ea06-47bb-8f52-5ea569c4ec31-catalog-content\") pod \"redhat-marketplace-pplp4\" (UID: \"b53f9fb0-ea06-47bb-8f52-5ea569c4ec31\") " pod="openshift-marketplace/redhat-marketplace-pplp4" Sep 30 07:57:38 crc kubenswrapper[4760]: I0930 07:57:38.757754 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b53f9fb0-ea06-47bb-8f52-5ea569c4ec31-utilities\") pod \"redhat-marketplace-pplp4\" (UID: \"b53f9fb0-ea06-47bb-8f52-5ea569c4ec31\") " 
pod="openshift-marketplace/redhat-marketplace-pplp4" Sep 30 07:57:38 crc kubenswrapper[4760]: I0930 07:57:38.757878 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dgzgt\" (UniqueName: \"kubernetes.io/projected/b53f9fb0-ea06-47bb-8f52-5ea569c4ec31-kube-api-access-dgzgt\") pod \"redhat-marketplace-pplp4\" (UID: \"b53f9fb0-ea06-47bb-8f52-5ea569c4ec31\") " pod="openshift-marketplace/redhat-marketplace-pplp4" Sep 30 07:57:38 crc kubenswrapper[4760]: I0930 07:57:38.757935 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b53f9fb0-ea06-47bb-8f52-5ea569c4ec31-catalog-content\") pod \"redhat-marketplace-pplp4\" (UID: \"b53f9fb0-ea06-47bb-8f52-5ea569c4ec31\") " pod="openshift-marketplace/redhat-marketplace-pplp4" Sep 30 07:57:38 crc kubenswrapper[4760]: I0930 07:57:38.758409 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b53f9fb0-ea06-47bb-8f52-5ea569c4ec31-utilities\") pod \"redhat-marketplace-pplp4\" (UID: \"b53f9fb0-ea06-47bb-8f52-5ea569c4ec31\") " pod="openshift-marketplace/redhat-marketplace-pplp4" Sep 30 07:57:38 crc kubenswrapper[4760]: I0930 07:57:38.758812 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b53f9fb0-ea06-47bb-8f52-5ea569c4ec31-catalog-content\") pod \"redhat-marketplace-pplp4\" (UID: \"b53f9fb0-ea06-47bb-8f52-5ea569c4ec31\") " pod="openshift-marketplace/redhat-marketplace-pplp4" Sep 30 07:57:38 crc kubenswrapper[4760]: I0930 07:57:38.797678 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dgzgt\" (UniqueName: \"kubernetes.io/projected/b53f9fb0-ea06-47bb-8f52-5ea569c4ec31-kube-api-access-dgzgt\") pod \"redhat-marketplace-pplp4\" (UID: \"b53f9fb0-ea06-47bb-8f52-5ea569c4ec31\") " 
pod="openshift-marketplace/redhat-marketplace-pplp4" Sep 30 07:57:38 crc kubenswrapper[4760]: I0930 07:57:38.876647 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-pplp4" Sep 30 07:57:39 crc kubenswrapper[4760]: I0930 07:57:39.008975 4760 generic.go:334] "Generic (PLEG): container finished" podID="a24deab6-2aa5-4b18-9b88-a3bae3493049" containerID="9e399a86d5b8362501586a32f24ceb4e46fcc67382a64fa7e4cbeeeac1a54477" exitCode=0 Sep 30 07:57:39 crc kubenswrapper[4760]: I0930 07:57:39.009140 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vd2zk" event={"ID":"a24deab6-2aa5-4b18-9b88-a3bae3493049","Type":"ContainerDied","Data":"9e399a86d5b8362501586a32f24ceb4e46fcc67382a64fa7e4cbeeeac1a54477"} Sep 30 07:57:39 crc kubenswrapper[4760]: I0930 07:57:39.333178 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-pplp4"] Sep 30 07:57:39 crc kubenswrapper[4760]: W0930 07:57:39.336715 4760 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb53f9fb0_ea06_47bb_8f52_5ea569c4ec31.slice/crio-a7999e61a96ba749b1a995cfaa0680a9c5f4b54ec6fe48d428dcdf02dca5bf92 WatchSource:0}: Error finding container a7999e61a96ba749b1a995cfaa0680a9c5f4b54ec6fe48d428dcdf02dca5bf92: Status 404 returned error can't find the container with id a7999e61a96ba749b1a995cfaa0680a9c5f4b54ec6fe48d428dcdf02dca5bf92 Sep 30 07:57:40 crc kubenswrapper[4760]: I0930 07:57:40.024426 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vd2zk" event={"ID":"a24deab6-2aa5-4b18-9b88-a3bae3493049","Type":"ContainerStarted","Data":"f552753cc3df0d34f78f0b8dcdf1f57c2b85bc4fef36487da1fbaa146347c4f3"} Sep 30 07:57:40 crc kubenswrapper[4760]: I0930 07:57:40.029155 4760 generic.go:334] "Generic (PLEG): container finished" 
podID="b53f9fb0-ea06-47bb-8f52-5ea569c4ec31" containerID="811d5356622de1832f3ef9860a66548b08634c3ab98e8c2d905f472304476e0c" exitCode=0 Sep 30 07:57:40 crc kubenswrapper[4760]: I0930 07:57:40.029239 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-pplp4" event={"ID":"b53f9fb0-ea06-47bb-8f52-5ea569c4ec31","Type":"ContainerDied","Data":"811d5356622de1832f3ef9860a66548b08634c3ab98e8c2d905f472304476e0c"} Sep 30 07:57:40 crc kubenswrapper[4760]: I0930 07:57:40.029358 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-pplp4" event={"ID":"b53f9fb0-ea06-47bb-8f52-5ea569c4ec31","Type":"ContainerStarted","Data":"a7999e61a96ba749b1a995cfaa0680a9c5f4b54ec6fe48d428dcdf02dca5bf92"} Sep 30 07:57:40 crc kubenswrapper[4760]: I0930 07:57:40.069352 4760 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-vd2zk" podStartSLOduration=2.502221317 podStartE2EDuration="5.069324305s" podCreationTimestamp="2025-09-30 07:57:35 +0000 UTC" firstStartedPulling="2025-09-30 07:57:36.988052466 +0000 UTC m=+1442.630958908" lastFinishedPulling="2025-09-30 07:57:39.555155484 +0000 UTC m=+1445.198061896" observedRunningTime="2025-09-30 07:57:40.061790283 +0000 UTC m=+1445.704696695" watchObservedRunningTime="2025-09-30 07:57:40.069324305 +0000 UTC m=+1445.712230757" Sep 30 07:57:41 crc kubenswrapper[4760]: I0930 07:57:41.038829 4760 generic.go:334] "Generic (PLEG): container finished" podID="b53f9fb0-ea06-47bb-8f52-5ea569c4ec31" containerID="b26416b739b262f98f709b41c21d674b73385773be098ae02bf78bd132d341d1" exitCode=0 Sep 30 07:57:41 crc kubenswrapper[4760]: I0930 07:57:41.038921 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-pplp4" event={"ID":"b53f9fb0-ea06-47bb-8f52-5ea569c4ec31","Type":"ContainerDied","Data":"b26416b739b262f98f709b41c21d674b73385773be098ae02bf78bd132d341d1"} Sep 30 
07:57:42 crc kubenswrapper[4760]: I0930 07:57:42.048811 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-pplp4" event={"ID":"b53f9fb0-ea06-47bb-8f52-5ea569c4ec31","Type":"ContainerStarted","Data":"c3b386b819ebcfba9f43f925b91f8e32454caa5f285c0cb19e9263e59d4ec735"} Sep 30 07:57:42 crc kubenswrapper[4760]: I0930 07:57:42.075832 4760 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-pplp4" podStartSLOduration=2.665163155 podStartE2EDuration="4.075816315s" podCreationTimestamp="2025-09-30 07:57:38 +0000 UTC" firstStartedPulling="2025-09-30 07:57:40.031014785 +0000 UTC m=+1445.673921197" lastFinishedPulling="2025-09-30 07:57:41.441667945 +0000 UTC m=+1447.084574357" observedRunningTime="2025-09-30 07:57:42.066718042 +0000 UTC m=+1447.709624454" watchObservedRunningTime="2025-09-30 07:57:42.075816315 +0000 UTC m=+1447.718722727" Sep 30 07:57:46 crc kubenswrapper[4760]: I0930 07:57:46.085981 4760 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-vd2zk" Sep 30 07:57:46 crc kubenswrapper[4760]: I0930 07:57:46.087636 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-vd2zk" Sep 30 07:57:46 crc kubenswrapper[4760]: I0930 07:57:46.161034 4760 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-vd2zk" Sep 30 07:57:47 crc kubenswrapper[4760]: I0930 07:57:47.177552 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-vd2zk" Sep 30 07:57:47 crc kubenswrapper[4760]: I0930 07:57:47.245709 4760 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-vd2zk"] Sep 30 07:57:48 crc kubenswrapper[4760]: I0930 07:57:48.877772 4760 kubelet.go:2542] "SyncLoop (probe)" 
probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-pplp4" Sep 30 07:57:48 crc kubenswrapper[4760]: I0930 07:57:48.877849 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-pplp4" Sep 30 07:57:48 crc kubenswrapper[4760]: I0930 07:57:48.936474 4760 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-pplp4" Sep 30 07:57:49 crc kubenswrapper[4760]: I0930 07:57:49.115657 4760 patch_prober.go:28] interesting pod/machine-config-daemon-f2lrk container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 07:57:49 crc kubenswrapper[4760]: I0930 07:57:49.115963 4760 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-f2lrk" podUID="7a9c8270-6964-4886-87d0-227b05b76da4" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 30 07:57:49 crc kubenswrapper[4760]: I0930 07:57:49.130096 4760 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-vd2zk" podUID="a24deab6-2aa5-4b18-9b88-a3bae3493049" containerName="registry-server" containerID="cri-o://f552753cc3df0d34f78f0b8dcdf1f57c2b85bc4fef36487da1fbaa146347c4f3" gracePeriod=2 Sep 30 07:57:49 crc kubenswrapper[4760]: I0930 07:57:49.199123 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-pplp4" Sep 30 07:57:49 crc kubenswrapper[4760]: I0930 07:57:49.610957 4760 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-vd2zk" Sep 30 07:57:49 crc kubenswrapper[4760]: I0930 07:57:49.785969 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a24deab6-2aa5-4b18-9b88-a3bae3493049-catalog-content\") pod \"a24deab6-2aa5-4b18-9b88-a3bae3493049\" (UID: \"a24deab6-2aa5-4b18-9b88-a3bae3493049\") " Sep 30 07:57:49 crc kubenswrapper[4760]: I0930 07:57:49.786138 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w4jc6\" (UniqueName: \"kubernetes.io/projected/a24deab6-2aa5-4b18-9b88-a3bae3493049-kube-api-access-w4jc6\") pod \"a24deab6-2aa5-4b18-9b88-a3bae3493049\" (UID: \"a24deab6-2aa5-4b18-9b88-a3bae3493049\") " Sep 30 07:57:49 crc kubenswrapper[4760]: I0930 07:57:49.786276 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a24deab6-2aa5-4b18-9b88-a3bae3493049-utilities\") pod \"a24deab6-2aa5-4b18-9b88-a3bae3493049\" (UID: \"a24deab6-2aa5-4b18-9b88-a3bae3493049\") " Sep 30 07:57:49 crc kubenswrapper[4760]: I0930 07:57:49.788182 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a24deab6-2aa5-4b18-9b88-a3bae3493049-utilities" (OuterVolumeSpecName: "utilities") pod "a24deab6-2aa5-4b18-9b88-a3bae3493049" (UID: "a24deab6-2aa5-4b18-9b88-a3bae3493049"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 07:57:49 crc kubenswrapper[4760]: I0930 07:57:49.798231 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a24deab6-2aa5-4b18-9b88-a3bae3493049-kube-api-access-w4jc6" (OuterVolumeSpecName: "kube-api-access-w4jc6") pod "a24deab6-2aa5-4b18-9b88-a3bae3493049" (UID: "a24deab6-2aa5-4b18-9b88-a3bae3493049"). InnerVolumeSpecName "kube-api-access-w4jc6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 07:57:49 crc kubenswrapper[4760]: I0930 07:57:49.828712 4760 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-pplp4"] Sep 30 07:57:49 crc kubenswrapper[4760]: I0930 07:57:49.851068 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a24deab6-2aa5-4b18-9b88-a3bae3493049-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "a24deab6-2aa5-4b18-9b88-a3bae3493049" (UID: "a24deab6-2aa5-4b18-9b88-a3bae3493049"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 07:57:49 crc kubenswrapper[4760]: I0930 07:57:49.888859 4760 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a24deab6-2aa5-4b18-9b88-a3bae3493049-catalog-content\") on node \"crc\" DevicePath \"\"" Sep 30 07:57:49 crc kubenswrapper[4760]: I0930 07:57:49.888895 4760 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w4jc6\" (UniqueName: \"kubernetes.io/projected/a24deab6-2aa5-4b18-9b88-a3bae3493049-kube-api-access-w4jc6\") on node \"crc\" DevicePath \"\"" Sep 30 07:57:49 crc kubenswrapper[4760]: I0930 07:57:49.888910 4760 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a24deab6-2aa5-4b18-9b88-a3bae3493049-utilities\") on node \"crc\" DevicePath \"\"" Sep 30 07:57:50 crc kubenswrapper[4760]: I0930 07:57:50.156339 4760 generic.go:334] "Generic (PLEG): container finished" podID="a24deab6-2aa5-4b18-9b88-a3bae3493049" containerID="f552753cc3df0d34f78f0b8dcdf1f57c2b85bc4fef36487da1fbaa146347c4f3" exitCode=0 Sep 30 07:57:50 crc kubenswrapper[4760]: I0930 07:57:50.156407 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vd2zk" 
event={"ID":"a24deab6-2aa5-4b18-9b88-a3bae3493049","Type":"ContainerDied","Data":"f552753cc3df0d34f78f0b8dcdf1f57c2b85bc4fef36487da1fbaa146347c4f3"} Sep 30 07:57:50 crc kubenswrapper[4760]: I0930 07:57:50.156464 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vd2zk" event={"ID":"a24deab6-2aa5-4b18-9b88-a3bae3493049","Type":"ContainerDied","Data":"cbdc73d038827f7799c7c003dc6fb19928c248174d007f567a5b607f8129e09e"} Sep 30 07:57:50 crc kubenswrapper[4760]: I0930 07:57:50.156488 4760 scope.go:117] "RemoveContainer" containerID="f552753cc3df0d34f78f0b8dcdf1f57c2b85bc4fef36487da1fbaa146347c4f3" Sep 30 07:57:50 crc kubenswrapper[4760]: I0930 07:57:50.156413 4760 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-vd2zk" Sep 30 07:57:50 crc kubenswrapper[4760]: I0930 07:57:50.199456 4760 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-vd2zk"] Sep 30 07:57:50 crc kubenswrapper[4760]: I0930 07:57:50.210446 4760 scope.go:117] "RemoveContainer" containerID="9e399a86d5b8362501586a32f24ceb4e46fcc67382a64fa7e4cbeeeac1a54477" Sep 30 07:57:50 crc kubenswrapper[4760]: I0930 07:57:50.211326 4760 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-vd2zk"] Sep 30 07:57:50 crc kubenswrapper[4760]: I0930 07:57:50.246800 4760 scope.go:117] "RemoveContainer" containerID="4b22a77d496113b7fb4acd1010f0814849f2500ee77018624834c994b9cb5e92" Sep 30 07:57:50 crc kubenswrapper[4760]: I0930 07:57:50.302893 4760 scope.go:117] "RemoveContainer" containerID="f552753cc3df0d34f78f0b8dcdf1f57c2b85bc4fef36487da1fbaa146347c4f3" Sep 30 07:57:50 crc kubenswrapper[4760]: E0930 07:57:50.303775 4760 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f552753cc3df0d34f78f0b8dcdf1f57c2b85bc4fef36487da1fbaa146347c4f3\": container 
with ID starting with f552753cc3df0d34f78f0b8dcdf1f57c2b85bc4fef36487da1fbaa146347c4f3 not found: ID does not exist" containerID="f552753cc3df0d34f78f0b8dcdf1f57c2b85bc4fef36487da1fbaa146347c4f3" Sep 30 07:57:50 crc kubenswrapper[4760]: I0930 07:57:50.303838 4760 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f552753cc3df0d34f78f0b8dcdf1f57c2b85bc4fef36487da1fbaa146347c4f3"} err="failed to get container status \"f552753cc3df0d34f78f0b8dcdf1f57c2b85bc4fef36487da1fbaa146347c4f3\": rpc error: code = NotFound desc = could not find container \"f552753cc3df0d34f78f0b8dcdf1f57c2b85bc4fef36487da1fbaa146347c4f3\": container with ID starting with f552753cc3df0d34f78f0b8dcdf1f57c2b85bc4fef36487da1fbaa146347c4f3 not found: ID does not exist" Sep 30 07:57:50 crc kubenswrapper[4760]: I0930 07:57:50.303875 4760 scope.go:117] "RemoveContainer" containerID="9e399a86d5b8362501586a32f24ceb4e46fcc67382a64fa7e4cbeeeac1a54477" Sep 30 07:57:50 crc kubenswrapper[4760]: E0930 07:57:50.304380 4760 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9e399a86d5b8362501586a32f24ceb4e46fcc67382a64fa7e4cbeeeac1a54477\": container with ID starting with 9e399a86d5b8362501586a32f24ceb4e46fcc67382a64fa7e4cbeeeac1a54477 not found: ID does not exist" containerID="9e399a86d5b8362501586a32f24ceb4e46fcc67382a64fa7e4cbeeeac1a54477" Sep 30 07:57:50 crc kubenswrapper[4760]: I0930 07:57:50.304418 4760 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9e399a86d5b8362501586a32f24ceb4e46fcc67382a64fa7e4cbeeeac1a54477"} err="failed to get container status \"9e399a86d5b8362501586a32f24ceb4e46fcc67382a64fa7e4cbeeeac1a54477\": rpc error: code = NotFound desc = could not find container \"9e399a86d5b8362501586a32f24ceb4e46fcc67382a64fa7e4cbeeeac1a54477\": container with ID starting with 9e399a86d5b8362501586a32f24ceb4e46fcc67382a64fa7e4cbeeeac1a54477 not 
found: ID does not exist" Sep 30 07:57:50 crc kubenswrapper[4760]: I0930 07:57:50.304437 4760 scope.go:117] "RemoveContainer" containerID="4b22a77d496113b7fb4acd1010f0814849f2500ee77018624834c994b9cb5e92" Sep 30 07:57:50 crc kubenswrapper[4760]: E0930 07:57:50.304922 4760 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4b22a77d496113b7fb4acd1010f0814849f2500ee77018624834c994b9cb5e92\": container with ID starting with 4b22a77d496113b7fb4acd1010f0814849f2500ee77018624834c994b9cb5e92 not found: ID does not exist" containerID="4b22a77d496113b7fb4acd1010f0814849f2500ee77018624834c994b9cb5e92" Sep 30 07:57:50 crc kubenswrapper[4760]: I0930 07:57:50.305142 4760 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4b22a77d496113b7fb4acd1010f0814849f2500ee77018624834c994b9cb5e92"} err="failed to get container status \"4b22a77d496113b7fb4acd1010f0814849f2500ee77018624834c994b9cb5e92\": rpc error: code = NotFound desc = could not find container \"4b22a77d496113b7fb4acd1010f0814849f2500ee77018624834c994b9cb5e92\": container with ID starting with 4b22a77d496113b7fb4acd1010f0814849f2500ee77018624834c994b9cb5e92 not found: ID does not exist" Sep 30 07:57:51 crc kubenswrapper[4760]: I0930 07:57:51.077994 4760 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a24deab6-2aa5-4b18-9b88-a3bae3493049" path="/var/lib/kubelet/pods/a24deab6-2aa5-4b18-9b88-a3bae3493049/volumes" Sep 30 07:57:51 crc kubenswrapper[4760]: I0930 07:57:51.168366 4760 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-pplp4" podUID="b53f9fb0-ea06-47bb-8f52-5ea569c4ec31" containerName="registry-server" containerID="cri-o://c3b386b819ebcfba9f43f925b91f8e32454caa5f285c0cb19e9263e59d4ec735" gracePeriod=2 Sep 30 07:57:51 crc kubenswrapper[4760]: I0930 07:57:51.691006 4760 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-pplp4" Sep 30 07:57:51 crc kubenswrapper[4760]: I0930 07:57:51.733573 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dgzgt\" (UniqueName: \"kubernetes.io/projected/b53f9fb0-ea06-47bb-8f52-5ea569c4ec31-kube-api-access-dgzgt\") pod \"b53f9fb0-ea06-47bb-8f52-5ea569c4ec31\" (UID: \"b53f9fb0-ea06-47bb-8f52-5ea569c4ec31\") " Sep 30 07:57:51 crc kubenswrapper[4760]: I0930 07:57:51.741620 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b53f9fb0-ea06-47bb-8f52-5ea569c4ec31-kube-api-access-dgzgt" (OuterVolumeSpecName: "kube-api-access-dgzgt") pod "b53f9fb0-ea06-47bb-8f52-5ea569c4ec31" (UID: "b53f9fb0-ea06-47bb-8f52-5ea569c4ec31"). InnerVolumeSpecName "kube-api-access-dgzgt". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 07:57:51 crc kubenswrapper[4760]: I0930 07:57:51.834780 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b53f9fb0-ea06-47bb-8f52-5ea569c4ec31-catalog-content\") pod \"b53f9fb0-ea06-47bb-8f52-5ea569c4ec31\" (UID: \"b53f9fb0-ea06-47bb-8f52-5ea569c4ec31\") " Sep 30 07:57:51 crc kubenswrapper[4760]: I0930 07:57:51.835128 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b53f9fb0-ea06-47bb-8f52-5ea569c4ec31-utilities\") pod \"b53f9fb0-ea06-47bb-8f52-5ea569c4ec31\" (UID: \"b53f9fb0-ea06-47bb-8f52-5ea569c4ec31\") " Sep 30 07:57:51 crc kubenswrapper[4760]: I0930 07:57:51.835598 4760 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dgzgt\" (UniqueName: \"kubernetes.io/projected/b53f9fb0-ea06-47bb-8f52-5ea569c4ec31-kube-api-access-dgzgt\") on node \"crc\" DevicePath \"\"" Sep 30 07:57:51 crc kubenswrapper[4760]: I0930 07:57:51.836792 4760 operation_generator.go:803] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b53f9fb0-ea06-47bb-8f52-5ea569c4ec31-utilities" (OuterVolumeSpecName: "utilities") pod "b53f9fb0-ea06-47bb-8f52-5ea569c4ec31" (UID: "b53f9fb0-ea06-47bb-8f52-5ea569c4ec31"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 07:57:51 crc kubenswrapper[4760]: I0930 07:57:51.857327 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b53f9fb0-ea06-47bb-8f52-5ea569c4ec31-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b53f9fb0-ea06-47bb-8f52-5ea569c4ec31" (UID: "b53f9fb0-ea06-47bb-8f52-5ea569c4ec31"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 07:57:51 crc kubenswrapper[4760]: I0930 07:57:51.938811 4760 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b53f9fb0-ea06-47bb-8f52-5ea569c4ec31-utilities\") on node \"crc\" DevicePath \"\"" Sep 30 07:57:51 crc kubenswrapper[4760]: I0930 07:57:51.938866 4760 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b53f9fb0-ea06-47bb-8f52-5ea569c4ec31-catalog-content\") on node \"crc\" DevicePath \"\"" Sep 30 07:57:52 crc kubenswrapper[4760]: I0930 07:57:52.180157 4760 generic.go:334] "Generic (PLEG): container finished" podID="b53f9fb0-ea06-47bb-8f52-5ea569c4ec31" containerID="c3b386b819ebcfba9f43f925b91f8e32454caa5f285c0cb19e9263e59d4ec735" exitCode=0 Sep 30 07:57:52 crc kubenswrapper[4760]: I0930 07:57:52.180264 4760 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-pplp4" Sep 30 07:57:52 crc kubenswrapper[4760]: I0930 07:57:52.180980 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-pplp4" event={"ID":"b53f9fb0-ea06-47bb-8f52-5ea569c4ec31","Type":"ContainerDied","Data":"c3b386b819ebcfba9f43f925b91f8e32454caa5f285c0cb19e9263e59d4ec735"} Sep 30 07:57:52 crc kubenswrapper[4760]: I0930 07:57:52.181074 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-pplp4" event={"ID":"b53f9fb0-ea06-47bb-8f52-5ea569c4ec31","Type":"ContainerDied","Data":"a7999e61a96ba749b1a995cfaa0680a9c5f4b54ec6fe48d428dcdf02dca5bf92"} Sep 30 07:57:52 crc kubenswrapper[4760]: I0930 07:57:52.181150 4760 scope.go:117] "RemoveContainer" containerID="c3b386b819ebcfba9f43f925b91f8e32454caa5f285c0cb19e9263e59d4ec735" Sep 30 07:57:52 crc kubenswrapper[4760]: I0930 07:57:52.210574 4760 scope.go:117] "RemoveContainer" containerID="b26416b739b262f98f709b41c21d674b73385773be098ae02bf78bd132d341d1" Sep 30 07:57:52 crc kubenswrapper[4760]: I0930 07:57:52.223485 4760 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-pplp4"] Sep 30 07:57:52 crc kubenswrapper[4760]: I0930 07:57:52.244167 4760 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-pplp4"] Sep 30 07:57:52 crc kubenswrapper[4760]: I0930 07:57:52.265553 4760 scope.go:117] "RemoveContainer" containerID="811d5356622de1832f3ef9860a66548b08634c3ab98e8c2d905f472304476e0c" Sep 30 07:57:52 crc kubenswrapper[4760]: I0930 07:57:52.322249 4760 scope.go:117] "RemoveContainer" containerID="c3b386b819ebcfba9f43f925b91f8e32454caa5f285c0cb19e9263e59d4ec735" Sep 30 07:57:52 crc kubenswrapper[4760]: E0930 07:57:52.322986 4760 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"c3b386b819ebcfba9f43f925b91f8e32454caa5f285c0cb19e9263e59d4ec735\": container with ID starting with c3b386b819ebcfba9f43f925b91f8e32454caa5f285c0cb19e9263e59d4ec735 not found: ID does not exist" containerID="c3b386b819ebcfba9f43f925b91f8e32454caa5f285c0cb19e9263e59d4ec735" Sep 30 07:57:52 crc kubenswrapper[4760]: I0930 07:57:52.323035 4760 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c3b386b819ebcfba9f43f925b91f8e32454caa5f285c0cb19e9263e59d4ec735"} err="failed to get container status \"c3b386b819ebcfba9f43f925b91f8e32454caa5f285c0cb19e9263e59d4ec735\": rpc error: code = NotFound desc = could not find container \"c3b386b819ebcfba9f43f925b91f8e32454caa5f285c0cb19e9263e59d4ec735\": container with ID starting with c3b386b819ebcfba9f43f925b91f8e32454caa5f285c0cb19e9263e59d4ec735 not found: ID does not exist" Sep 30 07:57:52 crc kubenswrapper[4760]: I0930 07:57:52.323065 4760 scope.go:117] "RemoveContainer" containerID="b26416b739b262f98f709b41c21d674b73385773be098ae02bf78bd132d341d1" Sep 30 07:57:52 crc kubenswrapper[4760]: E0930 07:57:52.323546 4760 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b26416b739b262f98f709b41c21d674b73385773be098ae02bf78bd132d341d1\": container with ID starting with b26416b739b262f98f709b41c21d674b73385773be098ae02bf78bd132d341d1 not found: ID does not exist" containerID="b26416b739b262f98f709b41c21d674b73385773be098ae02bf78bd132d341d1" Sep 30 07:57:52 crc kubenswrapper[4760]: I0930 07:57:52.323566 4760 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b26416b739b262f98f709b41c21d674b73385773be098ae02bf78bd132d341d1"} err="failed to get container status \"b26416b739b262f98f709b41c21d674b73385773be098ae02bf78bd132d341d1\": rpc error: code = NotFound desc = could not find container \"b26416b739b262f98f709b41c21d674b73385773be098ae02bf78bd132d341d1\": container with ID 
starting with b26416b739b262f98f709b41c21d674b73385773be098ae02bf78bd132d341d1 not found: ID does not exist" Sep 30 07:57:52 crc kubenswrapper[4760]: I0930 07:57:52.323580 4760 scope.go:117] "RemoveContainer" containerID="811d5356622de1832f3ef9860a66548b08634c3ab98e8c2d905f472304476e0c" Sep 30 07:57:52 crc kubenswrapper[4760]: E0930 07:57:52.323946 4760 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"811d5356622de1832f3ef9860a66548b08634c3ab98e8c2d905f472304476e0c\": container with ID starting with 811d5356622de1832f3ef9860a66548b08634c3ab98e8c2d905f472304476e0c not found: ID does not exist" containerID="811d5356622de1832f3ef9860a66548b08634c3ab98e8c2d905f472304476e0c" Sep 30 07:57:52 crc kubenswrapper[4760]: I0930 07:57:52.323969 4760 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"811d5356622de1832f3ef9860a66548b08634c3ab98e8c2d905f472304476e0c"} err="failed to get container status \"811d5356622de1832f3ef9860a66548b08634c3ab98e8c2d905f472304476e0c\": rpc error: code = NotFound desc = could not find container \"811d5356622de1832f3ef9860a66548b08634c3ab98e8c2d905f472304476e0c\": container with ID starting with 811d5356622de1832f3ef9860a66548b08634c3ab98e8c2d905f472304476e0c not found: ID does not exist" Sep 30 07:57:53 crc kubenswrapper[4760]: I0930 07:57:53.082976 4760 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b53f9fb0-ea06-47bb-8f52-5ea569c4ec31" path="/var/lib/kubelet/pods/b53f9fb0-ea06-47bb-8f52-5ea569c4ec31/volumes" Sep 30 07:58:19 crc kubenswrapper[4760]: I0930 07:58:19.113375 4760 patch_prober.go:28] interesting pod/machine-config-daemon-f2lrk container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 07:58:19 crc kubenswrapper[4760]: I0930 
07:58:19.113810 4760 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-f2lrk" podUID="7a9c8270-6964-4886-87d0-227b05b76da4" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 30 07:58:19 crc kubenswrapper[4760]: I0930 07:58:19.113861 4760 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-f2lrk" Sep 30 07:58:19 crc kubenswrapper[4760]: I0930 07:58:19.114705 4760 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"6e360b7dc465c02adf99299c3bccec940fec7c45f12ed3788ad43373bef9d4f8"} pod="openshift-machine-config-operator/machine-config-daemon-f2lrk" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Sep 30 07:58:19 crc kubenswrapper[4760]: I0930 07:58:19.114779 4760 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-f2lrk" podUID="7a9c8270-6964-4886-87d0-227b05b76da4" containerName="machine-config-daemon" containerID="cri-o://6e360b7dc465c02adf99299c3bccec940fec7c45f12ed3788ad43373bef9d4f8" gracePeriod=600 Sep 30 07:58:19 crc kubenswrapper[4760]: E0930 07:58:19.238671 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f2lrk_openshift-machine-config-operator(7a9c8270-6964-4886-87d0-227b05b76da4)\"" pod="openshift-machine-config-operator/machine-config-daemon-f2lrk" podUID="7a9c8270-6964-4886-87d0-227b05b76da4" Sep 30 07:58:19 crc kubenswrapper[4760]: I0930 07:58:19.501808 4760 generic.go:334] "Generic (PLEG): container finished" 
podID="7a9c8270-6964-4886-87d0-227b05b76da4" containerID="6e360b7dc465c02adf99299c3bccec940fec7c45f12ed3788ad43373bef9d4f8" exitCode=0 Sep 30 07:58:19 crc kubenswrapper[4760]: I0930 07:58:19.501895 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-f2lrk" event={"ID":"7a9c8270-6964-4886-87d0-227b05b76da4","Type":"ContainerDied","Data":"6e360b7dc465c02adf99299c3bccec940fec7c45f12ed3788ad43373bef9d4f8"} Sep 30 07:58:19 crc kubenswrapper[4760]: I0930 07:58:19.502558 4760 scope.go:117] "RemoveContainer" containerID="dd30dd4d28eef306568f8541bb8d83a7c0af086ec623d77ff729c59fba19ae20" Sep 30 07:58:19 crc kubenswrapper[4760]: I0930 07:58:19.503469 4760 scope.go:117] "RemoveContainer" containerID="6e360b7dc465c02adf99299c3bccec940fec7c45f12ed3788ad43373bef9d4f8" Sep 30 07:58:19 crc kubenswrapper[4760]: E0930 07:58:19.503935 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f2lrk_openshift-machine-config-operator(7a9c8270-6964-4886-87d0-227b05b76da4)\"" pod="openshift-machine-config-operator/machine-config-daemon-f2lrk" podUID="7a9c8270-6964-4886-87d0-227b05b76da4" Sep 30 07:58:31 crc kubenswrapper[4760]: I0930 07:58:31.067422 4760 scope.go:117] "RemoveContainer" containerID="6e360b7dc465c02adf99299c3bccec940fec7c45f12ed3788ad43373bef9d4f8" Sep 30 07:58:31 crc kubenswrapper[4760]: E0930 07:58:31.068124 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f2lrk_openshift-machine-config-operator(7a9c8270-6964-4886-87d0-227b05b76da4)\"" pod="openshift-machine-config-operator/machine-config-daemon-f2lrk" podUID="7a9c8270-6964-4886-87d0-227b05b76da4" Sep 30 
07:58:40 crc kubenswrapper[4760]: I0930 07:58:40.747886 4760 generic.go:334] "Generic (PLEG): container finished" podID="64e019eb-1763-4e9e-8c00-c4312d782981" containerID="8a24a09c17ff3ec08bd0602228486459b7734168ba81f82d5aeccfa7c73016f9" exitCode=0 Sep 30 07:58:40 crc kubenswrapper[4760]: I0930 07:58:40.747986 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-fnxp6" event={"ID":"64e019eb-1763-4e9e-8c00-c4312d782981","Type":"ContainerDied","Data":"8a24a09c17ff3ec08bd0602228486459b7734168ba81f82d5aeccfa7c73016f9"} Sep 30 07:58:42 crc kubenswrapper[4760]: I0930 07:58:42.232455 4760 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-fnxp6" Sep 30 07:58:42 crc kubenswrapper[4760]: I0930 07:58:42.322643 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/64e019eb-1763-4e9e-8c00-c4312d782981-inventory\") pod \"64e019eb-1763-4e9e-8c00-c4312d782981\" (UID: \"64e019eb-1763-4e9e-8c00-c4312d782981\") " Sep 30 07:58:42 crc kubenswrapper[4760]: I0930 07:58:42.322789 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2gcbx\" (UniqueName: \"kubernetes.io/projected/64e019eb-1763-4e9e-8c00-c4312d782981-kube-api-access-2gcbx\") pod \"64e019eb-1763-4e9e-8c00-c4312d782981\" (UID: \"64e019eb-1763-4e9e-8c00-c4312d782981\") " Sep 30 07:58:42 crc kubenswrapper[4760]: I0930 07:58:42.322831 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/64e019eb-1763-4e9e-8c00-c4312d782981-ssh-key\") pod \"64e019eb-1763-4e9e-8c00-c4312d782981\" (UID: \"64e019eb-1763-4e9e-8c00-c4312d782981\") " Sep 30 07:58:42 crc kubenswrapper[4760]: I0930 07:58:42.322925 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/64e019eb-1763-4e9e-8c00-c4312d782981-bootstrap-combined-ca-bundle\") pod \"64e019eb-1763-4e9e-8c00-c4312d782981\" (UID: \"64e019eb-1763-4e9e-8c00-c4312d782981\") " Sep 30 07:58:42 crc kubenswrapper[4760]: I0930 07:58:42.329580 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/64e019eb-1763-4e9e-8c00-c4312d782981-kube-api-access-2gcbx" (OuterVolumeSpecName: "kube-api-access-2gcbx") pod "64e019eb-1763-4e9e-8c00-c4312d782981" (UID: "64e019eb-1763-4e9e-8c00-c4312d782981"). InnerVolumeSpecName "kube-api-access-2gcbx". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 07:58:42 crc kubenswrapper[4760]: I0930 07:58:42.336434 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/64e019eb-1763-4e9e-8c00-c4312d782981-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "64e019eb-1763-4e9e-8c00-c4312d782981" (UID: "64e019eb-1763-4e9e-8c00-c4312d782981"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 07:58:42 crc kubenswrapper[4760]: I0930 07:58:42.355564 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/64e019eb-1763-4e9e-8c00-c4312d782981-inventory" (OuterVolumeSpecName: "inventory") pod "64e019eb-1763-4e9e-8c00-c4312d782981" (UID: "64e019eb-1763-4e9e-8c00-c4312d782981"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 07:58:42 crc kubenswrapper[4760]: I0930 07:58:42.365660 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/64e019eb-1763-4e9e-8c00-c4312d782981-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "64e019eb-1763-4e9e-8c00-c4312d782981" (UID: "64e019eb-1763-4e9e-8c00-c4312d782981"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 07:58:42 crc kubenswrapper[4760]: I0930 07:58:42.427015 4760 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/64e019eb-1763-4e9e-8c00-c4312d782981-inventory\") on node \"crc\" DevicePath \"\"" Sep 30 07:58:42 crc kubenswrapper[4760]: I0930 07:58:42.427053 4760 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2gcbx\" (UniqueName: \"kubernetes.io/projected/64e019eb-1763-4e9e-8c00-c4312d782981-kube-api-access-2gcbx\") on node \"crc\" DevicePath \"\"" Sep 30 07:58:42 crc kubenswrapper[4760]: I0930 07:58:42.427071 4760 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/64e019eb-1763-4e9e-8c00-c4312d782981-ssh-key\") on node \"crc\" DevicePath \"\"" Sep 30 07:58:42 crc kubenswrapper[4760]: I0930 07:58:42.427086 4760 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/64e019eb-1763-4e9e-8c00-c4312d782981-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 07:58:42 crc kubenswrapper[4760]: I0930 07:58:42.777412 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-fnxp6" event={"ID":"64e019eb-1763-4e9e-8c00-c4312d782981","Type":"ContainerDied","Data":"b46223e1e52c22fcff461a7c5bac7cbaed4b8217a453bfb76065fb7063e10408"} Sep 30 07:58:42 crc kubenswrapper[4760]: I0930 07:58:42.777472 4760 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b46223e1e52c22fcff461a7c5bac7cbaed4b8217a453bfb76065fb7063e10408" Sep 30 07:58:42 crc kubenswrapper[4760]: I0930 07:58:42.777532 4760 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-fnxp6" Sep 30 07:58:42 crc kubenswrapper[4760]: I0930 07:58:42.888975 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/download-cache-edpm-deployment-openstack-edpm-ipam-2xvvd"] Sep 30 07:58:42 crc kubenswrapper[4760]: E0930 07:58:42.889798 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b53f9fb0-ea06-47bb-8f52-5ea569c4ec31" containerName="extract-content" Sep 30 07:58:42 crc kubenswrapper[4760]: I0930 07:58:42.889819 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="b53f9fb0-ea06-47bb-8f52-5ea569c4ec31" containerName="extract-content" Sep 30 07:58:42 crc kubenswrapper[4760]: E0930 07:58:42.889836 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b53f9fb0-ea06-47bb-8f52-5ea569c4ec31" containerName="extract-utilities" Sep 30 07:58:42 crc kubenswrapper[4760]: I0930 07:58:42.889845 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="b53f9fb0-ea06-47bb-8f52-5ea569c4ec31" containerName="extract-utilities" Sep 30 07:58:42 crc kubenswrapper[4760]: E0930 07:58:42.889861 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a24deab6-2aa5-4b18-9b88-a3bae3493049" containerName="extract-utilities" Sep 30 07:58:42 crc kubenswrapper[4760]: I0930 07:58:42.889870 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="a24deab6-2aa5-4b18-9b88-a3bae3493049" containerName="extract-utilities" Sep 30 07:58:42 crc kubenswrapper[4760]: E0930 07:58:42.889887 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="64e019eb-1763-4e9e-8c00-c4312d782981" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Sep 30 07:58:42 crc kubenswrapper[4760]: I0930 07:58:42.889895 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="64e019eb-1763-4e9e-8c00-c4312d782981" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Sep 30 07:58:42 crc kubenswrapper[4760]: E0930 
07:58:42.889916 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a24deab6-2aa5-4b18-9b88-a3bae3493049" containerName="extract-content" Sep 30 07:58:42 crc kubenswrapper[4760]: I0930 07:58:42.889924 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="a24deab6-2aa5-4b18-9b88-a3bae3493049" containerName="extract-content" Sep 30 07:58:42 crc kubenswrapper[4760]: E0930 07:58:42.889957 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b53f9fb0-ea06-47bb-8f52-5ea569c4ec31" containerName="registry-server" Sep 30 07:58:42 crc kubenswrapper[4760]: I0930 07:58:42.889965 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="b53f9fb0-ea06-47bb-8f52-5ea569c4ec31" containerName="registry-server" Sep 30 07:58:42 crc kubenswrapper[4760]: E0930 07:58:42.889975 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a24deab6-2aa5-4b18-9b88-a3bae3493049" containerName="registry-server" Sep 30 07:58:42 crc kubenswrapper[4760]: I0930 07:58:42.889982 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="a24deab6-2aa5-4b18-9b88-a3bae3493049" containerName="registry-server" Sep 30 07:58:42 crc kubenswrapper[4760]: I0930 07:58:42.890210 4760 memory_manager.go:354] "RemoveStaleState removing state" podUID="a24deab6-2aa5-4b18-9b88-a3bae3493049" containerName="registry-server" Sep 30 07:58:42 crc kubenswrapper[4760]: I0930 07:58:42.890229 4760 memory_manager.go:354] "RemoveStaleState removing state" podUID="64e019eb-1763-4e9e-8c00-c4312d782981" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Sep 30 07:58:42 crc kubenswrapper[4760]: I0930 07:58:42.890243 4760 memory_manager.go:354] "RemoveStaleState removing state" podUID="b53f9fb0-ea06-47bb-8f52-5ea569c4ec31" containerName="registry-server" Sep 30 07:58:42 crc kubenswrapper[4760]: I0930 07:58:42.891021 4760 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-2xvvd" Sep 30 07:58:42 crc kubenswrapper[4760]: I0930 07:58:42.893263 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Sep 30 07:58:42 crc kubenswrapper[4760]: I0930 07:58:42.893595 4760 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Sep 30 07:58:42 crc kubenswrapper[4760]: I0930 07:58:42.895587 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Sep 30 07:58:42 crc kubenswrapper[4760]: I0930 07:58:42.908380 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-8gxrf" Sep 30 07:58:42 crc kubenswrapper[4760]: I0930 07:58:42.938793 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wq2qg\" (UniqueName: \"kubernetes.io/projected/87fc7cca-6571-4e27-ab1e-14648064566e-kube-api-access-wq2qg\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-2xvvd\" (UID: \"87fc7cca-6571-4e27-ab1e-14648064566e\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-2xvvd" Sep 30 07:58:42 crc kubenswrapper[4760]: I0930 07:58:42.938897 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/87fc7cca-6571-4e27-ab1e-14648064566e-ssh-key\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-2xvvd\" (UID: \"87fc7cca-6571-4e27-ab1e-14648064566e\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-2xvvd" Sep 30 07:58:42 crc kubenswrapper[4760]: I0930 07:58:42.938939 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: 
\"kubernetes.io/secret/87fc7cca-6571-4e27-ab1e-14648064566e-inventory\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-2xvvd\" (UID: \"87fc7cca-6571-4e27-ab1e-14648064566e\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-2xvvd" Sep 30 07:58:42 crc kubenswrapper[4760]: I0930 07:58:42.946292 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/download-cache-edpm-deployment-openstack-edpm-ipam-2xvvd"] Sep 30 07:58:43 crc kubenswrapper[4760]: I0930 07:58:43.041217 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/87fc7cca-6571-4e27-ab1e-14648064566e-ssh-key\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-2xvvd\" (UID: \"87fc7cca-6571-4e27-ab1e-14648064566e\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-2xvvd" Sep 30 07:58:43 crc kubenswrapper[4760]: I0930 07:58:43.041340 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/87fc7cca-6571-4e27-ab1e-14648064566e-inventory\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-2xvvd\" (UID: \"87fc7cca-6571-4e27-ab1e-14648064566e\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-2xvvd" Sep 30 07:58:43 crc kubenswrapper[4760]: I0930 07:58:43.041553 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wq2qg\" (UniqueName: \"kubernetes.io/projected/87fc7cca-6571-4e27-ab1e-14648064566e-kube-api-access-wq2qg\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-2xvvd\" (UID: \"87fc7cca-6571-4e27-ab1e-14648064566e\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-2xvvd" Sep 30 07:58:43 crc kubenswrapper[4760]: I0930 07:58:43.045875 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: 
\"kubernetes.io/secret/87fc7cca-6571-4e27-ab1e-14648064566e-inventory\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-2xvvd\" (UID: \"87fc7cca-6571-4e27-ab1e-14648064566e\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-2xvvd" Sep 30 07:58:43 crc kubenswrapper[4760]: I0930 07:58:43.046602 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/87fc7cca-6571-4e27-ab1e-14648064566e-ssh-key\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-2xvvd\" (UID: \"87fc7cca-6571-4e27-ab1e-14648064566e\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-2xvvd" Sep 30 07:58:43 crc kubenswrapper[4760]: I0930 07:58:43.057366 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wq2qg\" (UniqueName: \"kubernetes.io/projected/87fc7cca-6571-4e27-ab1e-14648064566e-kube-api-access-wq2qg\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-2xvvd\" (UID: \"87fc7cca-6571-4e27-ab1e-14648064566e\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-2xvvd" Sep 30 07:58:43 crc kubenswrapper[4760]: I0930 07:58:43.066965 4760 scope.go:117] "RemoveContainer" containerID="6e360b7dc465c02adf99299c3bccec940fec7c45f12ed3788ad43373bef9d4f8" Sep 30 07:58:43 crc kubenswrapper[4760]: E0930 07:58:43.067204 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f2lrk_openshift-machine-config-operator(7a9c8270-6964-4886-87d0-227b05b76da4)\"" pod="openshift-machine-config-operator/machine-config-daemon-f2lrk" podUID="7a9c8270-6964-4886-87d0-227b05b76da4" Sep 30 07:58:43 crc kubenswrapper[4760]: I0930 07:58:43.220639 4760 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-2xvvd" Sep 30 07:58:43 crc kubenswrapper[4760]: I0930 07:58:43.840207 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/download-cache-edpm-deployment-openstack-edpm-ipam-2xvvd"] Sep 30 07:58:43 crc kubenswrapper[4760]: I0930 07:58:43.850285 4760 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Sep 30 07:58:44 crc kubenswrapper[4760]: I0930 07:58:44.817767 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-2xvvd" event={"ID":"87fc7cca-6571-4e27-ab1e-14648064566e","Type":"ContainerStarted","Data":"8553169d43699b122e7ac80d578f16f226819141285bc61337350327c1531725"} Sep 30 07:58:44 crc kubenswrapper[4760]: I0930 07:58:44.817869 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-2xvvd" event={"ID":"87fc7cca-6571-4e27-ab1e-14648064566e","Type":"ContainerStarted","Data":"9623cc9db93aafb065503c3e7033dcd499c230b2bb199cc1f1ab4cc0837b8954"} Sep 30 07:58:44 crc kubenswrapper[4760]: I0930 07:58:44.858657 4760 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-2xvvd" podStartSLOduration=2.307525346 podStartE2EDuration="2.858632373s" podCreationTimestamp="2025-09-30 07:58:42 +0000 UTC" firstStartedPulling="2025-09-30 07:58:43.849960963 +0000 UTC m=+1509.492867375" lastFinishedPulling="2025-09-30 07:58:44.40106795 +0000 UTC m=+1510.043974402" observedRunningTime="2025-09-30 07:58:44.847724873 +0000 UTC m=+1510.490631305" watchObservedRunningTime="2025-09-30 07:58:44.858632373 +0000 UTC m=+1510.501538825" Sep 30 07:58:55 crc kubenswrapper[4760]: I0930 07:58:55.075364 4760 scope.go:117] "RemoveContainer" containerID="6e360b7dc465c02adf99299c3bccec940fec7c45f12ed3788ad43373bef9d4f8" Sep 30 07:58:55 crc 
kubenswrapper[4760]: E0930 07:58:55.076209 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f2lrk_openshift-machine-config-operator(7a9c8270-6964-4886-87d0-227b05b76da4)\"" pod="openshift-machine-config-operator/machine-config-daemon-f2lrk" podUID="7a9c8270-6964-4886-87d0-227b05b76da4" Sep 30 07:59:09 crc kubenswrapper[4760]: I0930 07:59:09.067769 4760 scope.go:117] "RemoveContainer" containerID="6e360b7dc465c02adf99299c3bccec940fec7c45f12ed3788ad43373bef9d4f8" Sep 30 07:59:09 crc kubenswrapper[4760]: E0930 07:59:09.068701 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f2lrk_openshift-machine-config-operator(7a9c8270-6964-4886-87d0-227b05b76da4)\"" pod="openshift-machine-config-operator/machine-config-daemon-f2lrk" podUID="7a9c8270-6964-4886-87d0-227b05b76da4" Sep 30 07:59:24 crc kubenswrapper[4760]: I0930 07:59:24.067135 4760 scope.go:117] "RemoveContainer" containerID="6e360b7dc465c02adf99299c3bccec940fec7c45f12ed3788ad43373bef9d4f8" Sep 30 07:59:24 crc kubenswrapper[4760]: E0930 07:59:24.068218 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f2lrk_openshift-machine-config-operator(7a9c8270-6964-4886-87d0-227b05b76da4)\"" pod="openshift-machine-config-operator/machine-config-daemon-f2lrk" podUID="7a9c8270-6964-4886-87d0-227b05b76da4" Sep 30 07:59:32 crc kubenswrapper[4760]: I0930 07:59:32.037572 4760 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-create-zvwbw"] Sep 30 07:59:32 crc 
kubenswrapper[4760]: I0930 07:59:32.053524 4760 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-create-fkqrw"] Sep 30 07:59:32 crc kubenswrapper[4760]: I0930 07:59:32.065335 4760 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-create-xsqh2"] Sep 30 07:59:32 crc kubenswrapper[4760]: I0930 07:59:32.077079 4760 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-create-zvwbw"] Sep 30 07:59:32 crc kubenswrapper[4760]: I0930 07:59:32.094844 4760 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-create-fkqrw"] Sep 30 07:59:32 crc kubenswrapper[4760]: I0930 07:59:32.107437 4760 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-create-xsqh2"] Sep 30 07:59:33 crc kubenswrapper[4760]: I0930 07:59:33.083643 4760 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="60c66a34-554e-4655-8da8-e47e3e10a521" path="/var/lib/kubelet/pods/60c66a34-554e-4655-8da8-e47e3e10a521/volumes" Sep 30 07:59:33 crc kubenswrapper[4760]: I0930 07:59:33.085740 4760 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b3bc4bc8-f5fa-4aa1-b24a-e742458d48ab" path="/var/lib/kubelet/pods/b3bc4bc8-f5fa-4aa1-b24a-e742458d48ab/volumes" Sep 30 07:59:33 crc kubenswrapper[4760]: I0930 07:59:33.086806 4760 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c941e149-5425-4ef2-9920-a7c6797230be" path="/var/lib/kubelet/pods/c941e149-5425-4ef2-9920-a7c6797230be/volumes" Sep 30 07:59:35 crc kubenswrapper[4760]: I0930 07:59:35.034560 4760 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/watcher-db-create-l747q"] Sep 30 07:59:35 crc kubenswrapper[4760]: I0930 07:59:35.049444 4760 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/watcher-db-create-l747q"] Sep 30 07:59:35 crc kubenswrapper[4760]: I0930 07:59:35.091043 4760 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="636599f2-9be2-4380-b923-0f3b3b77e39b" path="/var/lib/kubelet/pods/636599f2-9be2-4380-b923-0f3b3b77e39b/volumes" Sep 30 07:59:37 crc kubenswrapper[4760]: I0930 07:59:37.067472 4760 scope.go:117] "RemoveContainer" containerID="6e360b7dc465c02adf99299c3bccec940fec7c45f12ed3788ad43373bef9d4f8" Sep 30 07:59:37 crc kubenswrapper[4760]: E0930 07:59:37.068977 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f2lrk_openshift-machine-config-operator(7a9c8270-6964-4886-87d0-227b05b76da4)\"" pod="openshift-machine-config-operator/machine-config-daemon-f2lrk" podUID="7a9c8270-6964-4886-87d0-227b05b76da4" Sep 30 07:59:41 crc kubenswrapper[4760]: I0930 07:59:41.033782 4760 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-d462-account-create-d7bk9"] Sep 30 07:59:41 crc kubenswrapper[4760]: I0930 07:59:41.050771 4760 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-d462-account-create-d7bk9"] Sep 30 07:59:41 crc kubenswrapper[4760]: I0930 07:59:41.079318 4760 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ba2f3da0-95aa-40f3-b3cf-1b3d3a1ee6b6" path="/var/lib/kubelet/pods/ba2f3da0-95aa-40f3-b3cf-1b3d3a1ee6b6/volumes" Sep 30 07:59:43 crc kubenswrapper[4760]: I0930 07:59:43.040479 4760 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-8f95-account-create-zwzdl"] Sep 30 07:59:43 crc kubenswrapper[4760]: I0930 07:59:43.053672 4760 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-3371-account-create-dhxdh"] Sep 30 07:59:43 crc kubenswrapper[4760]: I0930 07:59:43.085806 4760 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-8f95-account-create-zwzdl"] Sep 30 07:59:43 crc kubenswrapper[4760]: I0930 07:59:43.085871 4760 kubelet.go:2431] "SyncLoop REMOVE" 
source="api" pods=["openstack/placement-3371-account-create-dhxdh"] Sep 30 07:59:43 crc kubenswrapper[4760]: I0930 07:59:43.571490 4760 scope.go:117] "RemoveContainer" containerID="06a3614cc990478bd72bb96c0f784d252bbca1242a48e48f899d2babf860b744" Sep 30 07:59:43 crc kubenswrapper[4760]: I0930 07:59:43.614549 4760 scope.go:117] "RemoveContainer" containerID="163915f4279ad5b742fff9a6114f00bc851f93640cacd34cc00b2fb3c9589df6" Sep 30 07:59:43 crc kubenswrapper[4760]: I0930 07:59:43.650070 4760 scope.go:117] "RemoveContainer" containerID="8515edbb2c907995012d5ce121ed488b6c2903314ed7219188998e1e84ed4f54" Sep 30 07:59:43 crc kubenswrapper[4760]: I0930 07:59:43.698281 4760 scope.go:117] "RemoveContainer" containerID="1684d34342baae3b11898ef8dd238f02ca25d71d91750bcf9b03335d830dfc10" Sep 30 07:59:43 crc kubenswrapper[4760]: I0930 07:59:43.736430 4760 scope.go:117] "RemoveContainer" containerID="981e42cdb61366b1cc2bac9302f14db150bd5d0c216a3c2c91bc6070ebf0c7b5" Sep 30 07:59:45 crc kubenswrapper[4760]: I0930 07:59:45.095935 4760 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="225030d7-c116-4040-9b8f-69ad4d2e7a57" path="/var/lib/kubelet/pods/225030d7-c116-4040-9b8f-69ad4d2e7a57/volumes" Sep 30 07:59:45 crc kubenswrapper[4760]: I0930 07:59:45.097358 4760 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dc5900f2-47ff-45f3-870e-5aff13eeb14f" path="/var/lib/kubelet/pods/dc5900f2-47ff-45f3-870e-5aff13eeb14f/volumes" Sep 30 07:59:51 crc kubenswrapper[4760]: I0930 07:59:51.068016 4760 scope.go:117] "RemoveContainer" containerID="6e360b7dc465c02adf99299c3bccec940fec7c45f12ed3788ad43373bef9d4f8" Sep 30 07:59:51 crc kubenswrapper[4760]: E0930 07:59:51.069552 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-f2lrk_openshift-machine-config-operator(7a9c8270-6964-4886-87d0-227b05b76da4)\"" pod="openshift-machine-config-operator/machine-config-daemon-f2lrk" podUID="7a9c8270-6964-4886-87d0-227b05b76da4" Sep 30 08:00:00 crc kubenswrapper[4760]: I0930 08:00:00.063430 4760 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/watcher-439d-account-create-n6z9w"] Sep 30 08:00:00 crc kubenswrapper[4760]: I0930 08:00:00.073166 4760 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/watcher-439d-account-create-n6z9w"] Sep 30 08:00:00 crc kubenswrapper[4760]: I0930 08:00:00.150475 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29320320-z9zxw"] Sep 30 08:00:00 crc kubenswrapper[4760]: I0930 08:00:00.152057 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29320320-z9zxw" Sep 30 08:00:00 crc kubenswrapper[4760]: I0930 08:00:00.155037 4760 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Sep 30 08:00:00 crc kubenswrapper[4760]: I0930 08:00:00.155167 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Sep 30 08:00:00 crc kubenswrapper[4760]: I0930 08:00:00.175225 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29320320-z9zxw"] Sep 30 08:00:00 crc kubenswrapper[4760]: I0930 08:00:00.318340 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/2516ffe5-a86b-49ad-bb40-2481182ccdef-secret-volume\") pod \"collect-profiles-29320320-z9zxw\" (UID: \"2516ffe5-a86b-49ad-bb40-2481182ccdef\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320320-z9zxw" Sep 
30 08:00:00 crc kubenswrapper[4760]: I0930 08:00:00.318674 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/2516ffe5-a86b-49ad-bb40-2481182ccdef-config-volume\") pod \"collect-profiles-29320320-z9zxw\" (UID: \"2516ffe5-a86b-49ad-bb40-2481182ccdef\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320320-z9zxw" Sep 30 08:00:00 crc kubenswrapper[4760]: I0930 08:00:00.318694 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-29vjb\" (UniqueName: \"kubernetes.io/projected/2516ffe5-a86b-49ad-bb40-2481182ccdef-kube-api-access-29vjb\") pod \"collect-profiles-29320320-z9zxw\" (UID: \"2516ffe5-a86b-49ad-bb40-2481182ccdef\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320320-z9zxw" Sep 30 08:00:00 crc kubenswrapper[4760]: I0930 08:00:00.420313 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/2516ffe5-a86b-49ad-bb40-2481182ccdef-secret-volume\") pod \"collect-profiles-29320320-z9zxw\" (UID: \"2516ffe5-a86b-49ad-bb40-2481182ccdef\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320320-z9zxw" Sep 30 08:00:00 crc kubenswrapper[4760]: I0930 08:00:00.420404 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/2516ffe5-a86b-49ad-bb40-2481182ccdef-config-volume\") pod \"collect-profiles-29320320-z9zxw\" (UID: \"2516ffe5-a86b-49ad-bb40-2481182ccdef\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320320-z9zxw" Sep 30 08:00:00 crc kubenswrapper[4760]: I0930 08:00:00.420427 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-29vjb\" (UniqueName: \"kubernetes.io/projected/2516ffe5-a86b-49ad-bb40-2481182ccdef-kube-api-access-29vjb\") 
pod \"collect-profiles-29320320-z9zxw\" (UID: \"2516ffe5-a86b-49ad-bb40-2481182ccdef\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320320-z9zxw" Sep 30 08:00:00 crc kubenswrapper[4760]: I0930 08:00:00.421258 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/2516ffe5-a86b-49ad-bb40-2481182ccdef-config-volume\") pod \"collect-profiles-29320320-z9zxw\" (UID: \"2516ffe5-a86b-49ad-bb40-2481182ccdef\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320320-z9zxw" Sep 30 08:00:00 crc kubenswrapper[4760]: I0930 08:00:00.433333 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/2516ffe5-a86b-49ad-bb40-2481182ccdef-secret-volume\") pod \"collect-profiles-29320320-z9zxw\" (UID: \"2516ffe5-a86b-49ad-bb40-2481182ccdef\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320320-z9zxw" Sep 30 08:00:00 crc kubenswrapper[4760]: I0930 08:00:00.437399 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-29vjb\" (UniqueName: \"kubernetes.io/projected/2516ffe5-a86b-49ad-bb40-2481182ccdef-kube-api-access-29vjb\") pod \"collect-profiles-29320320-z9zxw\" (UID: \"2516ffe5-a86b-49ad-bb40-2481182ccdef\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320320-z9zxw" Sep 30 08:00:00 crc kubenswrapper[4760]: I0930 08:00:00.476504 4760 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29320320-z9zxw" Sep 30 08:00:00 crc kubenswrapper[4760]: I0930 08:00:00.913639 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29320320-z9zxw"] Sep 30 08:00:01 crc kubenswrapper[4760]: I0930 08:00:01.082743 4760 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b41f3bf7-e274-44ad-b745-c723e31a5167" path="/var/lib/kubelet/pods/b41f3bf7-e274-44ad-b745-c723e31a5167/volumes" Sep 30 08:00:01 crc kubenswrapper[4760]: I0930 08:00:01.650399 4760 generic.go:334] "Generic (PLEG): container finished" podID="2516ffe5-a86b-49ad-bb40-2481182ccdef" containerID="9871396eaf08403a099081a357f0f57fdb69af97b635a87c5fd9ba0b47d0a542" exitCode=0 Sep 30 08:00:01 crc kubenswrapper[4760]: I0930 08:00:01.650442 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29320320-z9zxw" event={"ID":"2516ffe5-a86b-49ad-bb40-2481182ccdef","Type":"ContainerDied","Data":"9871396eaf08403a099081a357f0f57fdb69af97b635a87c5fd9ba0b47d0a542"} Sep 30 08:00:01 crc kubenswrapper[4760]: I0930 08:00:01.650468 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29320320-z9zxw" event={"ID":"2516ffe5-a86b-49ad-bb40-2481182ccdef","Type":"ContainerStarted","Data":"041c468d5fa25eabcd5030bbc286191f5eaca73ee0cf27872ee132854fed1904"} Sep 30 08:00:03 crc kubenswrapper[4760]: I0930 08:00:03.048645 4760 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29320320-z9zxw" Sep 30 08:00:03 crc kubenswrapper[4760]: I0930 08:00:03.077805 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/2516ffe5-a86b-49ad-bb40-2481182ccdef-secret-volume\") pod \"2516ffe5-a86b-49ad-bb40-2481182ccdef\" (UID: \"2516ffe5-a86b-49ad-bb40-2481182ccdef\") " Sep 30 08:00:03 crc kubenswrapper[4760]: I0930 08:00:03.089537 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2516ffe5-a86b-49ad-bb40-2481182ccdef-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "2516ffe5-a86b-49ad-bb40-2481182ccdef" (UID: "2516ffe5-a86b-49ad-bb40-2481182ccdef"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 08:00:03 crc kubenswrapper[4760]: I0930 08:00:03.180198 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/2516ffe5-a86b-49ad-bb40-2481182ccdef-config-volume\") pod \"2516ffe5-a86b-49ad-bb40-2481182ccdef\" (UID: \"2516ffe5-a86b-49ad-bb40-2481182ccdef\") " Sep 30 08:00:03 crc kubenswrapper[4760]: I0930 08:00:03.180251 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-29vjb\" (UniqueName: \"kubernetes.io/projected/2516ffe5-a86b-49ad-bb40-2481182ccdef-kube-api-access-29vjb\") pod \"2516ffe5-a86b-49ad-bb40-2481182ccdef\" (UID: \"2516ffe5-a86b-49ad-bb40-2481182ccdef\") " Sep 30 08:00:03 crc kubenswrapper[4760]: I0930 08:00:03.180889 4760 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/2516ffe5-a86b-49ad-bb40-2481182ccdef-secret-volume\") on node \"crc\" DevicePath \"\"" Sep 30 08:00:03 crc kubenswrapper[4760]: I0930 08:00:03.180908 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for 
volume "kubernetes.io/configmap/2516ffe5-a86b-49ad-bb40-2481182ccdef-config-volume" (OuterVolumeSpecName: "config-volume") pod "2516ffe5-a86b-49ad-bb40-2481182ccdef" (UID: "2516ffe5-a86b-49ad-bb40-2481182ccdef"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 08:00:03 crc kubenswrapper[4760]: I0930 08:00:03.182990 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2516ffe5-a86b-49ad-bb40-2481182ccdef-kube-api-access-29vjb" (OuterVolumeSpecName: "kube-api-access-29vjb") pod "2516ffe5-a86b-49ad-bb40-2481182ccdef" (UID: "2516ffe5-a86b-49ad-bb40-2481182ccdef"). InnerVolumeSpecName "kube-api-access-29vjb". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 08:00:03 crc kubenswrapper[4760]: I0930 08:00:03.282314 4760 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/2516ffe5-a86b-49ad-bb40-2481182ccdef-config-volume\") on node \"crc\" DevicePath \"\"" Sep 30 08:00:03 crc kubenswrapper[4760]: I0930 08:00:03.282362 4760 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-29vjb\" (UniqueName: \"kubernetes.io/projected/2516ffe5-a86b-49ad-bb40-2481182ccdef-kube-api-access-29vjb\") on node \"crc\" DevicePath \"\"" Sep 30 08:00:03 crc kubenswrapper[4760]: I0930 08:00:03.672114 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29320320-z9zxw" event={"ID":"2516ffe5-a86b-49ad-bb40-2481182ccdef","Type":"ContainerDied","Data":"041c468d5fa25eabcd5030bbc286191f5eaca73ee0cf27872ee132854fed1904"} Sep 30 08:00:03 crc kubenswrapper[4760]: I0930 08:00:03.672163 4760 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="041c468d5fa25eabcd5030bbc286191f5eaca73ee0cf27872ee132854fed1904" Sep 30 08:00:03 crc kubenswrapper[4760]: I0930 08:00:03.672214 4760 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29320320-z9zxw" Sep 30 08:00:04 crc kubenswrapper[4760]: I0930 08:00:04.068170 4760 scope.go:117] "RemoveContainer" containerID="6e360b7dc465c02adf99299c3bccec940fec7c45f12ed3788ad43373bef9d4f8" Sep 30 08:00:04 crc kubenswrapper[4760]: E0930 08:00:04.069056 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f2lrk_openshift-machine-config-operator(7a9c8270-6964-4886-87d0-227b05b76da4)\"" pod="openshift-machine-config-operator/machine-config-daemon-f2lrk" podUID="7a9c8270-6964-4886-87d0-227b05b76da4" Sep 30 08:00:05 crc kubenswrapper[4760]: I0930 08:00:05.037674 4760 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-create-85j42"] Sep 30 08:00:05 crc kubenswrapper[4760]: I0930 08:00:05.046284 4760 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-create-g2fkc"] Sep 30 08:00:05 crc kubenswrapper[4760]: I0930 08:00:05.054894 4760 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-create-njd24"] Sep 30 08:00:05 crc kubenswrapper[4760]: I0930 08:00:05.063459 4760 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-create-g2fkc"] Sep 30 08:00:05 crc kubenswrapper[4760]: I0930 08:00:05.079586 4760 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="131df663-7ab5-42b0-8d39-9633a47f5d4c" path="/var/lib/kubelet/pods/131df663-7ab5-42b0-8d39-9633a47f5d4c/volumes" Sep 30 08:00:05 crc kubenswrapper[4760]: I0930 08:00:05.080269 4760 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-create-njd24"] Sep 30 08:00:05 crc kubenswrapper[4760]: I0930 08:00:05.080321 4760 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-create-85j42"] Sep 30 08:00:07 crc 
kubenswrapper[4760]: I0930 08:00:07.080719 4760 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7c7e9e65-e505-4c0e-bc41-53e420d499ef" path="/var/lib/kubelet/pods/7c7e9e65-e505-4c0e-bc41-53e420d499ef/volumes" Sep 30 08:00:07 crc kubenswrapper[4760]: I0930 08:00:07.082082 4760 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="88a5e2ea-3a8b-4b24-a152-10d0811414c8" path="/var/lib/kubelet/pods/88a5e2ea-3a8b-4b24-a152-10d0811414c8/volumes" Sep 30 08:00:14 crc kubenswrapper[4760]: I0930 08:00:14.038602 4760 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-sync-kxxsc"] Sep 30 08:00:14 crc kubenswrapper[4760]: I0930 08:00:14.056433 4760 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-sync-kxxsc"] Sep 30 08:00:15 crc kubenswrapper[4760]: I0930 08:00:15.077794 4760 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fc12fba5-2742-45fd-b63c-51b3201acc0a" path="/var/lib/kubelet/pods/fc12fba5-2742-45fd-b63c-51b3201acc0a/volumes" Sep 30 08:00:18 crc kubenswrapper[4760]: I0930 08:00:18.067468 4760 scope.go:117] "RemoveContainer" containerID="6e360b7dc465c02adf99299c3bccec940fec7c45f12ed3788ad43373bef9d4f8" Sep 30 08:00:18 crc kubenswrapper[4760]: E0930 08:00:18.068056 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f2lrk_openshift-machine-config-operator(7a9c8270-6964-4886-87d0-227b05b76da4)\"" pod="openshift-machine-config-operator/machine-config-daemon-f2lrk" podUID="7a9c8270-6964-4886-87d0-227b05b76da4" Sep 30 08:00:21 crc kubenswrapper[4760]: I0930 08:00:21.870595 4760 generic.go:334] "Generic (PLEG): container finished" podID="87fc7cca-6571-4e27-ab1e-14648064566e" containerID="8553169d43699b122e7ac80d578f16f226819141285bc61337350327c1531725" exitCode=0 Sep 30 08:00:21 crc 
kubenswrapper[4760]: I0930 08:00:21.871289 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-2xvvd" event={"ID":"87fc7cca-6571-4e27-ab1e-14648064566e","Type":"ContainerDied","Data":"8553169d43699b122e7ac80d578f16f226819141285bc61337350327c1531725"} Sep 30 08:00:23 crc kubenswrapper[4760]: I0930 08:00:23.046415 4760 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-2f94-account-create-v8rw4"] Sep 30 08:00:23 crc kubenswrapper[4760]: I0930 08:00:23.082649 4760 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-b248-account-create-rd7nr"] Sep 30 08:00:23 crc kubenswrapper[4760]: I0930 08:00:23.087222 4760 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-2f94-account-create-v8rw4"] Sep 30 08:00:23 crc kubenswrapper[4760]: I0930 08:00:23.097775 4760 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/watcher-db-sync-mpstv"] Sep 30 08:00:23 crc kubenswrapper[4760]: I0930 08:00:23.106589 4760 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-b248-account-create-rd7nr"] Sep 30 08:00:23 crc kubenswrapper[4760]: I0930 08:00:23.116643 4760 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-4a5c-account-create-74j84"] Sep 30 08:00:23 crc kubenswrapper[4760]: I0930 08:00:23.126131 4760 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/watcher-db-sync-mpstv"] Sep 30 08:00:23 crc kubenswrapper[4760]: I0930 08:00:23.137236 4760 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-4a5c-account-create-74j84"] Sep 30 08:00:23 crc kubenswrapper[4760]: I0930 08:00:23.268896 4760 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-2xvvd" Sep 30 08:00:23 crc kubenswrapper[4760]: I0930 08:00:23.317209 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wq2qg\" (UniqueName: \"kubernetes.io/projected/87fc7cca-6571-4e27-ab1e-14648064566e-kube-api-access-wq2qg\") pod \"87fc7cca-6571-4e27-ab1e-14648064566e\" (UID: \"87fc7cca-6571-4e27-ab1e-14648064566e\") " Sep 30 08:00:23 crc kubenswrapper[4760]: I0930 08:00:23.317404 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/87fc7cca-6571-4e27-ab1e-14648064566e-inventory\") pod \"87fc7cca-6571-4e27-ab1e-14648064566e\" (UID: \"87fc7cca-6571-4e27-ab1e-14648064566e\") " Sep 30 08:00:23 crc kubenswrapper[4760]: I0930 08:00:23.317522 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/87fc7cca-6571-4e27-ab1e-14648064566e-ssh-key\") pod \"87fc7cca-6571-4e27-ab1e-14648064566e\" (UID: \"87fc7cca-6571-4e27-ab1e-14648064566e\") " Sep 30 08:00:23 crc kubenswrapper[4760]: I0930 08:00:23.324499 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87fc7cca-6571-4e27-ab1e-14648064566e-kube-api-access-wq2qg" (OuterVolumeSpecName: "kube-api-access-wq2qg") pod "87fc7cca-6571-4e27-ab1e-14648064566e" (UID: "87fc7cca-6571-4e27-ab1e-14648064566e"). InnerVolumeSpecName "kube-api-access-wq2qg". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 08:00:23 crc kubenswrapper[4760]: I0930 08:00:23.347841 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87fc7cca-6571-4e27-ab1e-14648064566e-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "87fc7cca-6571-4e27-ab1e-14648064566e" (UID: "87fc7cca-6571-4e27-ab1e-14648064566e"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 08:00:23 crc kubenswrapper[4760]: I0930 08:00:23.349825 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87fc7cca-6571-4e27-ab1e-14648064566e-inventory" (OuterVolumeSpecName: "inventory") pod "87fc7cca-6571-4e27-ab1e-14648064566e" (UID: "87fc7cca-6571-4e27-ab1e-14648064566e"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 08:00:23 crc kubenswrapper[4760]: I0930 08:00:23.420123 4760 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/87fc7cca-6571-4e27-ab1e-14648064566e-inventory\") on node \"crc\" DevicePath \"\"" Sep 30 08:00:23 crc kubenswrapper[4760]: I0930 08:00:23.420208 4760 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/87fc7cca-6571-4e27-ab1e-14648064566e-ssh-key\") on node \"crc\" DevicePath \"\"" Sep 30 08:00:23 crc kubenswrapper[4760]: I0930 08:00:23.420247 4760 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wq2qg\" (UniqueName: \"kubernetes.io/projected/87fc7cca-6571-4e27-ab1e-14648064566e-kube-api-access-wq2qg\") on node \"crc\" DevicePath \"\"" Sep 30 08:00:23 crc kubenswrapper[4760]: I0930 08:00:23.891955 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-2xvvd" event={"ID":"87fc7cca-6571-4e27-ab1e-14648064566e","Type":"ContainerDied","Data":"9623cc9db93aafb065503c3e7033dcd499c230b2bb199cc1f1ab4cc0837b8954"} Sep 30 08:00:23 crc kubenswrapper[4760]: I0930 08:00:23.892340 4760 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9623cc9db93aafb065503c3e7033dcd499c230b2bb199cc1f1ab4cc0837b8954" Sep 30 08:00:23 crc kubenswrapper[4760]: I0930 08:00:23.892035 4760 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-2xvvd" Sep 30 08:00:23 crc kubenswrapper[4760]: I0930 08:00:23.975292 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-sj6c4"] Sep 30 08:00:23 crc kubenswrapper[4760]: E0930 08:00:23.975964 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="87fc7cca-6571-4e27-ab1e-14648064566e" containerName="download-cache-edpm-deployment-openstack-edpm-ipam" Sep 30 08:00:23 crc kubenswrapper[4760]: I0930 08:00:23.975994 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="87fc7cca-6571-4e27-ab1e-14648064566e" containerName="download-cache-edpm-deployment-openstack-edpm-ipam" Sep 30 08:00:23 crc kubenswrapper[4760]: E0930 08:00:23.976015 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2516ffe5-a86b-49ad-bb40-2481182ccdef" containerName="collect-profiles" Sep 30 08:00:23 crc kubenswrapper[4760]: I0930 08:00:23.976023 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="2516ffe5-a86b-49ad-bb40-2481182ccdef" containerName="collect-profiles" Sep 30 08:00:23 crc kubenswrapper[4760]: I0930 08:00:23.976320 4760 memory_manager.go:354] "RemoveStaleState removing state" podUID="2516ffe5-a86b-49ad-bb40-2481182ccdef" containerName="collect-profiles" Sep 30 08:00:23 crc kubenswrapper[4760]: I0930 08:00:23.976363 4760 memory_manager.go:354] "RemoveStaleState removing state" podUID="87fc7cca-6571-4e27-ab1e-14648064566e" containerName="download-cache-edpm-deployment-openstack-edpm-ipam" Sep 30 08:00:23 crc kubenswrapper[4760]: I0930 08:00:23.977246 4760 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-sj6c4" Sep 30 08:00:23 crc kubenswrapper[4760]: I0930 08:00:23.983361 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-sj6c4"] Sep 30 08:00:23 crc kubenswrapper[4760]: I0930 08:00:23.983867 4760 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Sep 30 08:00:23 crc kubenswrapper[4760]: I0930 08:00:23.984761 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-8gxrf" Sep 30 08:00:23 crc kubenswrapper[4760]: I0930 08:00:23.984948 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Sep 30 08:00:23 crc kubenswrapper[4760]: I0930 08:00:23.985020 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Sep 30 08:00:24 crc kubenswrapper[4760]: I0930 08:00:24.032333 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/e2439cde-d5f2-423a-9e6d-4af8d713c917-ssh-key\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-sj6c4\" (UID: \"e2439cde-d5f2-423a-9e6d-4af8d713c917\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-sj6c4" Sep 30 08:00:24 crc kubenswrapper[4760]: I0930 08:00:24.032502 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e2439cde-d5f2-423a-9e6d-4af8d713c917-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-sj6c4\" (UID: \"e2439cde-d5f2-423a-9e6d-4af8d713c917\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-sj6c4" Sep 30 08:00:24 crc kubenswrapper[4760]: I0930 08:00:24.032585 4760 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xl4rh\" (UniqueName: \"kubernetes.io/projected/e2439cde-d5f2-423a-9e6d-4af8d713c917-kube-api-access-xl4rh\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-sj6c4\" (UID: \"e2439cde-d5f2-423a-9e6d-4af8d713c917\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-sj6c4" Sep 30 08:00:24 crc kubenswrapper[4760]: I0930 08:00:24.135409 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/e2439cde-d5f2-423a-9e6d-4af8d713c917-ssh-key\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-sj6c4\" (UID: \"e2439cde-d5f2-423a-9e6d-4af8d713c917\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-sj6c4" Sep 30 08:00:24 crc kubenswrapper[4760]: I0930 08:00:24.135489 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e2439cde-d5f2-423a-9e6d-4af8d713c917-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-sj6c4\" (UID: \"e2439cde-d5f2-423a-9e6d-4af8d713c917\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-sj6c4" Sep 30 08:00:24 crc kubenswrapper[4760]: I0930 08:00:24.135532 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xl4rh\" (UniqueName: \"kubernetes.io/projected/e2439cde-d5f2-423a-9e6d-4af8d713c917-kube-api-access-xl4rh\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-sj6c4\" (UID: \"e2439cde-d5f2-423a-9e6d-4af8d713c917\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-sj6c4" Sep 30 08:00:24 crc kubenswrapper[4760]: I0930 08:00:24.141969 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/e2439cde-d5f2-423a-9e6d-4af8d713c917-ssh-key\") pod 
\"configure-network-edpm-deployment-openstack-edpm-ipam-sj6c4\" (UID: \"e2439cde-d5f2-423a-9e6d-4af8d713c917\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-sj6c4" Sep 30 08:00:24 crc kubenswrapper[4760]: I0930 08:00:24.142845 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e2439cde-d5f2-423a-9e6d-4af8d713c917-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-sj6c4\" (UID: \"e2439cde-d5f2-423a-9e6d-4af8d713c917\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-sj6c4" Sep 30 08:00:24 crc kubenswrapper[4760]: I0930 08:00:24.153367 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xl4rh\" (UniqueName: \"kubernetes.io/projected/e2439cde-d5f2-423a-9e6d-4af8d713c917-kube-api-access-xl4rh\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-sj6c4\" (UID: \"e2439cde-d5f2-423a-9e6d-4af8d713c917\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-sj6c4" Sep 30 08:00:24 crc kubenswrapper[4760]: I0930 08:00:24.304952 4760 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-sj6c4" Sep 30 08:00:24 crc kubenswrapper[4760]: I0930 08:00:24.858281 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-sj6c4"] Sep 30 08:00:24 crc kubenswrapper[4760]: I0930 08:00:24.903724 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-sj6c4" event={"ID":"e2439cde-d5f2-423a-9e6d-4af8d713c917","Type":"ContainerStarted","Data":"4aeb358827626f643a820bd69738dc1af465430bfbb873603c873fbd04bb35d1"} Sep 30 08:00:25 crc kubenswrapper[4760]: I0930 08:00:25.077999 4760 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="218f5d61-bfb1-4609-9296-4a5b6471ea56" path="/var/lib/kubelet/pods/218f5d61-bfb1-4609-9296-4a5b6471ea56/volumes" Sep 30 08:00:25 crc kubenswrapper[4760]: I0930 08:00:25.078596 4760 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7390084a-0022-4943-914f-fdb71e7ec326" path="/var/lib/kubelet/pods/7390084a-0022-4943-914f-fdb71e7ec326/volumes" Sep 30 08:00:25 crc kubenswrapper[4760]: I0930 08:00:25.079156 4760 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a9ce45c7-d994-44b1-9f88-297bac4ae9c2" path="/var/lib/kubelet/pods/a9ce45c7-d994-44b1-9f88-297bac4ae9c2/volumes" Sep 30 08:00:25 crc kubenswrapper[4760]: I0930 08:00:25.079698 4760 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cf25d474-c105-4c8b-87ad-0911e245056f" path="/var/lib/kubelet/pods/cf25d474-c105-4c8b-87ad-0911e245056f/volumes" Sep 30 08:00:25 crc kubenswrapper[4760]: I0930 08:00:25.919239 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-sj6c4" event={"ID":"e2439cde-d5f2-423a-9e6d-4af8d713c917","Type":"ContainerStarted","Data":"053b3da9332cd0a7db9877665f1249d8ae420645eb3fbaa9d4e51c6a822d3d1f"} Sep 30 
08:00:25 crc kubenswrapper[4760]: I0930 08:00:25.947726 4760 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-sj6c4" podStartSLOduration=2.259607415 podStartE2EDuration="2.947680943s" podCreationTimestamp="2025-09-30 08:00:23 +0000 UTC" firstStartedPulling="2025-09-30 08:00:24.869762119 +0000 UTC m=+1610.512668531" lastFinishedPulling="2025-09-30 08:00:25.557835637 +0000 UTC m=+1611.200742059" observedRunningTime="2025-09-30 08:00:25.939618957 +0000 UTC m=+1611.582525409" watchObservedRunningTime="2025-09-30 08:00:25.947680943 +0000 UTC m=+1611.590587355" Sep 30 08:00:30 crc kubenswrapper[4760]: I0930 08:00:30.068078 4760 scope.go:117] "RemoveContainer" containerID="6e360b7dc465c02adf99299c3bccec940fec7c45f12ed3788ad43373bef9d4f8" Sep 30 08:00:30 crc kubenswrapper[4760]: E0930 08:00:30.068984 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f2lrk_openshift-machine-config-operator(7a9c8270-6964-4886-87d0-227b05b76da4)\"" pod="openshift-machine-config-operator/machine-config-daemon-f2lrk" podUID="7a9c8270-6964-4886-87d0-227b05b76da4" Sep 30 08:00:32 crc kubenswrapper[4760]: I0930 08:00:32.030377 4760 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-sync-s45t8"] Sep 30 08:00:32 crc kubenswrapper[4760]: I0930 08:00:32.039113 4760 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-sync-s45t8"] Sep 30 08:00:33 crc kubenswrapper[4760]: I0930 08:00:33.080732 4760 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09c9e208-48cc-44b8-9810-fc0cf69cea8a" path="/var/lib/kubelet/pods/09c9e208-48cc-44b8-9810-fc0cf69cea8a/volumes" Sep 30 08:00:40 crc kubenswrapper[4760]: I0930 08:00:40.458319 4760 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-marketplace/redhat-operators-kg7zj"] Sep 30 08:00:40 crc kubenswrapper[4760]: I0930 08:00:40.461108 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-kg7zj" Sep 30 08:00:40 crc kubenswrapper[4760]: I0930 08:00:40.466568 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gd7gj\" (UniqueName: \"kubernetes.io/projected/f8187248-878a-494d-bfce-56d54c403561-kube-api-access-gd7gj\") pod \"redhat-operators-kg7zj\" (UID: \"f8187248-878a-494d-bfce-56d54c403561\") " pod="openshift-marketplace/redhat-operators-kg7zj" Sep 30 08:00:40 crc kubenswrapper[4760]: I0930 08:00:40.466680 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f8187248-878a-494d-bfce-56d54c403561-utilities\") pod \"redhat-operators-kg7zj\" (UID: \"f8187248-878a-494d-bfce-56d54c403561\") " pod="openshift-marketplace/redhat-operators-kg7zj" Sep 30 08:00:40 crc kubenswrapper[4760]: I0930 08:00:40.466825 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f8187248-878a-494d-bfce-56d54c403561-catalog-content\") pod \"redhat-operators-kg7zj\" (UID: \"f8187248-878a-494d-bfce-56d54c403561\") " pod="openshift-marketplace/redhat-operators-kg7zj" Sep 30 08:00:40 crc kubenswrapper[4760]: I0930 08:00:40.474614 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-kg7zj"] Sep 30 08:00:40 crc kubenswrapper[4760]: I0930 08:00:40.568759 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gd7gj\" (UniqueName: \"kubernetes.io/projected/f8187248-878a-494d-bfce-56d54c403561-kube-api-access-gd7gj\") pod \"redhat-operators-kg7zj\" (UID: \"f8187248-878a-494d-bfce-56d54c403561\") " 
pod="openshift-marketplace/redhat-operators-kg7zj" Sep 30 08:00:40 crc kubenswrapper[4760]: I0930 08:00:40.568887 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f8187248-878a-494d-bfce-56d54c403561-utilities\") pod \"redhat-operators-kg7zj\" (UID: \"f8187248-878a-494d-bfce-56d54c403561\") " pod="openshift-marketplace/redhat-operators-kg7zj" Sep 30 08:00:40 crc kubenswrapper[4760]: I0930 08:00:40.568969 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f8187248-878a-494d-bfce-56d54c403561-catalog-content\") pod \"redhat-operators-kg7zj\" (UID: \"f8187248-878a-494d-bfce-56d54c403561\") " pod="openshift-marketplace/redhat-operators-kg7zj" Sep 30 08:00:40 crc kubenswrapper[4760]: I0930 08:00:40.569545 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f8187248-878a-494d-bfce-56d54c403561-catalog-content\") pod \"redhat-operators-kg7zj\" (UID: \"f8187248-878a-494d-bfce-56d54c403561\") " pod="openshift-marketplace/redhat-operators-kg7zj" Sep 30 08:00:40 crc kubenswrapper[4760]: I0930 08:00:40.569700 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f8187248-878a-494d-bfce-56d54c403561-utilities\") pod \"redhat-operators-kg7zj\" (UID: \"f8187248-878a-494d-bfce-56d54c403561\") " pod="openshift-marketplace/redhat-operators-kg7zj" Sep 30 08:00:40 crc kubenswrapper[4760]: I0930 08:00:40.590597 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gd7gj\" (UniqueName: \"kubernetes.io/projected/f8187248-878a-494d-bfce-56d54c403561-kube-api-access-gd7gj\") pod \"redhat-operators-kg7zj\" (UID: \"f8187248-878a-494d-bfce-56d54c403561\") " pod="openshift-marketplace/redhat-operators-kg7zj" Sep 30 08:00:40 crc 
kubenswrapper[4760]: I0930 08:00:40.789919 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-kg7zj" Sep 30 08:00:41 crc kubenswrapper[4760]: I0930 08:00:41.043404 4760 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-6g2pm"] Sep 30 08:00:41 crc kubenswrapper[4760]: I0930 08:00:41.051738 4760 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-sync-2npfw"] Sep 30 08:00:41 crc kubenswrapper[4760]: I0930 08:00:41.059230 4760 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-sync-2npfw"] Sep 30 08:00:41 crc kubenswrapper[4760]: I0930 08:00:41.090570 4760 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4584a821-629e-4246-95d3-b84160a0f46c" path="/var/lib/kubelet/pods/4584a821-629e-4246-95d3-b84160a0f46c/volumes" Sep 30 08:00:41 crc kubenswrapper[4760]: I0930 08:00:41.091588 4760 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-6g2pm"] Sep 30 08:00:41 crc kubenswrapper[4760]: W0930 08:00:41.306031 4760 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf8187248_878a_494d_bfce_56d54c403561.slice/crio-0c236aa7ac853c433b90119d1acbc78f45c721bbd60406ec5932fcd029e1f175 WatchSource:0}: Error finding container 0c236aa7ac853c433b90119d1acbc78f45c721bbd60406ec5932fcd029e1f175: Status 404 returned error can't find the container with id 0c236aa7ac853c433b90119d1acbc78f45c721bbd60406ec5932fcd029e1f175 Sep 30 08:00:41 crc kubenswrapper[4760]: I0930 08:00:41.308199 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-kg7zj"] Sep 30 08:00:42 crc kubenswrapper[4760]: I0930 08:00:42.075897 4760 generic.go:334] "Generic (PLEG): container finished" podID="f8187248-878a-494d-bfce-56d54c403561" 
containerID="5129762066b843dcdfca0b71d08c258a4196036d9716454e89a3d73e9208e1ed" exitCode=0 Sep 30 08:00:42 crc kubenswrapper[4760]: I0930 08:00:42.076002 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kg7zj" event={"ID":"f8187248-878a-494d-bfce-56d54c403561","Type":"ContainerDied","Data":"5129762066b843dcdfca0b71d08c258a4196036d9716454e89a3d73e9208e1ed"} Sep 30 08:00:42 crc kubenswrapper[4760]: I0930 08:00:42.076208 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kg7zj" event={"ID":"f8187248-878a-494d-bfce-56d54c403561","Type":"ContainerStarted","Data":"0c236aa7ac853c433b90119d1acbc78f45c721bbd60406ec5932fcd029e1f175"} Sep 30 08:00:43 crc kubenswrapper[4760]: I0930 08:00:43.078207 4760 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e8bc0e9c-dde5-4aa9-b199-0ad1e4e3d40c" path="/var/lib/kubelet/pods/e8bc0e9c-dde5-4aa9-b199-0ad1e4e3d40c/volumes" Sep 30 08:00:43 crc kubenswrapper[4760]: I0930 08:00:43.901628 4760 scope.go:117] "RemoveContainer" containerID="1709f319e4cbbd5d960887fc4785bf4c61a3094232f12625c2012c901d224818" Sep 30 08:00:43 crc kubenswrapper[4760]: I0930 08:00:43.959141 4760 scope.go:117] "RemoveContainer" containerID="7ff8179eb22158ab5265a3f91f89d11c80bc5c5c69813ac725d688b5bbaf9c3c" Sep 30 08:00:43 crc kubenswrapper[4760]: I0930 08:00:43.986195 4760 scope.go:117] "RemoveContainer" containerID="78af48d95aa791006f27e6464dbb8f326f3071dc5a67d47fd213d189043f1128" Sep 30 08:00:44 crc kubenswrapper[4760]: I0930 08:00:44.023364 4760 scope.go:117] "RemoveContainer" containerID="bdd4ea835d0289cf353876c7bf6351106595165c513520cdcb0656f3058af201" Sep 30 08:00:44 crc kubenswrapper[4760]: I0930 08:00:44.062502 4760 scope.go:117] "RemoveContainer" containerID="be1a65183707097946e16a4218228c506f740928573e2b4f019b3143314b3217" Sep 30 08:00:44 crc kubenswrapper[4760]: I0930 08:00:44.118564 4760 scope.go:117] "RemoveContainer" 
containerID="239b861a506354fb750084154ec800cbd2d3fc062d4e82e7db73dd4f78aec132" Sep 30 08:00:44 crc kubenswrapper[4760]: I0930 08:00:44.154285 4760 scope.go:117] "RemoveContainer" containerID="8ff322d29972321c37183515cd622af66909c70fa1837cb7b3f96ab838b1554a" Sep 30 08:00:44 crc kubenswrapper[4760]: I0930 08:00:44.195282 4760 scope.go:117] "RemoveContainer" containerID="cd34ed4a3c768321b03d3c69ae8fc0b614c7141d5b4eb8f21020250d5c60803a" Sep 30 08:00:44 crc kubenswrapper[4760]: I0930 08:00:44.243579 4760 scope.go:117] "RemoveContainer" containerID="8579a4d68d22e9756e93b14f9362981dd05e3e81b78de136aeada899e94d83b4" Sep 30 08:00:44 crc kubenswrapper[4760]: I0930 08:00:44.290406 4760 scope.go:117] "RemoveContainer" containerID="17d3b399aae2293488f77d73df6ba2f7106ed3ea8b3dd4f3ad4cd312eff74859" Sep 30 08:00:44 crc kubenswrapper[4760]: I0930 08:00:44.317022 4760 scope.go:117] "RemoveContainer" containerID="721af8f3bf9fcd4398eff01bb2f0400eb09be862644cfd2d14af12687ba93474" Sep 30 08:00:44 crc kubenswrapper[4760]: I0930 08:00:44.337451 4760 scope.go:117] "RemoveContainer" containerID="7e853395a6f92a9d0cf4dcbc0d905e87c7273008f6710570a6fc90f693e13d50" Sep 30 08:00:44 crc kubenswrapper[4760]: I0930 08:00:44.359960 4760 scope.go:117] "RemoveContainer" containerID="07e370e1283fea479cd7b02061f78e0b1d8778265c1bd0c690a3650449e5f549" Sep 30 08:00:44 crc kubenswrapper[4760]: I0930 08:00:44.394067 4760 scope.go:117] "RemoveContainer" containerID="8fde0ea6afbe64eb5fbab5f1ac40c61b6f6051c1561f673a2a9c5a166086ce08" Sep 30 08:00:45 crc kubenswrapper[4760]: I0930 08:00:45.075553 4760 scope.go:117] "RemoveContainer" containerID="6e360b7dc465c02adf99299c3bccec940fec7c45f12ed3788ad43373bef9d4f8" Sep 30 08:00:45 crc kubenswrapper[4760]: E0930 08:00:45.076143 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-f2lrk_openshift-machine-config-operator(7a9c8270-6964-4886-87d0-227b05b76da4)\"" pod="openshift-machine-config-operator/machine-config-daemon-f2lrk" podUID="7a9c8270-6964-4886-87d0-227b05b76da4" Sep 30 08:00:49 crc kubenswrapper[4760]: I0930 08:00:49.156767 4760 generic.go:334] "Generic (PLEG): container finished" podID="f8187248-878a-494d-bfce-56d54c403561" containerID="98c55b347535fa5f29d172bfd33190398cd759ac6a448c0e2f3170761063ca3b" exitCode=0 Sep 30 08:00:49 crc kubenswrapper[4760]: I0930 08:00:49.156834 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kg7zj" event={"ID":"f8187248-878a-494d-bfce-56d54c403561","Type":"ContainerDied","Data":"98c55b347535fa5f29d172bfd33190398cd759ac6a448c0e2f3170761063ca3b"} Sep 30 08:00:50 crc kubenswrapper[4760]: I0930 08:00:50.167466 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kg7zj" event={"ID":"f8187248-878a-494d-bfce-56d54c403561","Type":"ContainerStarted","Data":"b19e6b3b568d4063aa403ade20d9863e4b5eab3541306779c904ce4508822fdb"} Sep 30 08:00:50 crc kubenswrapper[4760]: I0930 08:00:50.190808 4760 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-kg7zj" podStartSLOduration=2.5769658890000002 podStartE2EDuration="10.190786245s" podCreationTimestamp="2025-09-30 08:00:40 +0000 UTC" firstStartedPulling="2025-09-30 08:00:42.077397253 +0000 UTC m=+1627.720303685" lastFinishedPulling="2025-09-30 08:00:49.691217589 +0000 UTC m=+1635.334124041" observedRunningTime="2025-09-30 08:00:50.188399544 +0000 UTC m=+1635.831305946" watchObservedRunningTime="2025-09-30 08:00:50.190786245 +0000 UTC m=+1635.833692667" Sep 30 08:00:50 crc kubenswrapper[4760]: I0930 08:00:50.790254 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-kg7zj" Sep 30 08:00:50 crc kubenswrapper[4760]: I0930 
08:00:50.790436 4760 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-kg7zj" Sep 30 08:00:51 crc kubenswrapper[4760]: I0930 08:00:51.885116 4760 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-kg7zj" podUID="f8187248-878a-494d-bfce-56d54c403561" containerName="registry-server" probeResult="failure" output=< Sep 30 08:00:51 crc kubenswrapper[4760]: timeout: failed to connect service ":50051" within 1s Sep 30 08:00:51 crc kubenswrapper[4760]: > Sep 30 08:00:54 crc kubenswrapper[4760]: I0930 08:00:54.062689 4760 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-sync-f6j6q"] Sep 30 08:00:54 crc kubenswrapper[4760]: I0930 08:00:54.076228 4760 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-sync-f6j6q"] Sep 30 08:00:55 crc kubenswrapper[4760]: I0930 08:00:55.079679 4760 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1df67641-4598-4ba5-a59a-a195084e5446" path="/var/lib/kubelet/pods/1df67641-4598-4ba5-a59a-a195084e5446/volumes" Sep 30 08:00:59 crc kubenswrapper[4760]: I0930 08:00:59.068017 4760 scope.go:117] "RemoveContainer" containerID="6e360b7dc465c02adf99299c3bccec940fec7c45f12ed3788ad43373bef9d4f8" Sep 30 08:00:59 crc kubenswrapper[4760]: E0930 08:00:59.070464 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f2lrk_openshift-machine-config-operator(7a9c8270-6964-4886-87d0-227b05b76da4)\"" pod="openshift-machine-config-operator/machine-config-daemon-f2lrk" podUID="7a9c8270-6964-4886-87d0-227b05b76da4" Sep 30 08:01:00 crc kubenswrapper[4760]: I0930 08:01:00.163719 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-cron-29320321-w9cp7"] Sep 30 08:01:00 crc kubenswrapper[4760]: 
I0930 08:01:00.165287 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29320321-w9cp7" Sep 30 08:01:00 crc kubenswrapper[4760]: I0930 08:01:00.175427 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29320321-w9cp7"] Sep 30 08:01:00 crc kubenswrapper[4760]: I0930 08:01:00.194338 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5xm7m\" (UniqueName: \"kubernetes.io/projected/2a6d4144-48a7-412d-9288-a909f1fbd5f4-kube-api-access-5xm7m\") pod \"keystone-cron-29320321-w9cp7\" (UID: \"2a6d4144-48a7-412d-9288-a909f1fbd5f4\") " pod="openstack/keystone-cron-29320321-w9cp7" Sep 30 08:01:00 crc kubenswrapper[4760]: I0930 08:01:00.194398 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/2a6d4144-48a7-412d-9288-a909f1fbd5f4-fernet-keys\") pod \"keystone-cron-29320321-w9cp7\" (UID: \"2a6d4144-48a7-412d-9288-a909f1fbd5f4\") " pod="openstack/keystone-cron-29320321-w9cp7" Sep 30 08:01:00 crc kubenswrapper[4760]: I0930 08:01:00.194503 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2a6d4144-48a7-412d-9288-a909f1fbd5f4-combined-ca-bundle\") pod \"keystone-cron-29320321-w9cp7\" (UID: \"2a6d4144-48a7-412d-9288-a909f1fbd5f4\") " pod="openstack/keystone-cron-29320321-w9cp7" Sep 30 08:01:00 crc kubenswrapper[4760]: I0930 08:01:00.194522 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2a6d4144-48a7-412d-9288-a909f1fbd5f4-config-data\") pod \"keystone-cron-29320321-w9cp7\" (UID: \"2a6d4144-48a7-412d-9288-a909f1fbd5f4\") " pod="openstack/keystone-cron-29320321-w9cp7" Sep 30 08:01:00 crc kubenswrapper[4760]: I0930 
08:01:00.296901 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2a6d4144-48a7-412d-9288-a909f1fbd5f4-combined-ca-bundle\") pod \"keystone-cron-29320321-w9cp7\" (UID: \"2a6d4144-48a7-412d-9288-a909f1fbd5f4\") " pod="openstack/keystone-cron-29320321-w9cp7" Sep 30 08:01:00 crc kubenswrapper[4760]: I0930 08:01:00.297238 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2a6d4144-48a7-412d-9288-a909f1fbd5f4-config-data\") pod \"keystone-cron-29320321-w9cp7\" (UID: \"2a6d4144-48a7-412d-9288-a909f1fbd5f4\") " pod="openstack/keystone-cron-29320321-w9cp7" Sep 30 08:01:00 crc kubenswrapper[4760]: I0930 08:01:00.297505 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5xm7m\" (UniqueName: \"kubernetes.io/projected/2a6d4144-48a7-412d-9288-a909f1fbd5f4-kube-api-access-5xm7m\") pod \"keystone-cron-29320321-w9cp7\" (UID: \"2a6d4144-48a7-412d-9288-a909f1fbd5f4\") " pod="openstack/keystone-cron-29320321-w9cp7" Sep 30 08:01:00 crc kubenswrapper[4760]: I0930 08:01:00.297648 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/2a6d4144-48a7-412d-9288-a909f1fbd5f4-fernet-keys\") pod \"keystone-cron-29320321-w9cp7\" (UID: \"2a6d4144-48a7-412d-9288-a909f1fbd5f4\") " pod="openstack/keystone-cron-29320321-w9cp7" Sep 30 08:01:00 crc kubenswrapper[4760]: I0930 08:01:00.305401 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2a6d4144-48a7-412d-9288-a909f1fbd5f4-combined-ca-bundle\") pod \"keystone-cron-29320321-w9cp7\" (UID: \"2a6d4144-48a7-412d-9288-a909f1fbd5f4\") " pod="openstack/keystone-cron-29320321-w9cp7" Sep 30 08:01:00 crc kubenswrapper[4760]: I0930 08:01:00.305525 4760 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/2a6d4144-48a7-412d-9288-a909f1fbd5f4-fernet-keys\") pod \"keystone-cron-29320321-w9cp7\" (UID: \"2a6d4144-48a7-412d-9288-a909f1fbd5f4\") " pod="openstack/keystone-cron-29320321-w9cp7" Sep 30 08:01:00 crc kubenswrapper[4760]: I0930 08:01:00.306543 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2a6d4144-48a7-412d-9288-a909f1fbd5f4-config-data\") pod \"keystone-cron-29320321-w9cp7\" (UID: \"2a6d4144-48a7-412d-9288-a909f1fbd5f4\") " pod="openstack/keystone-cron-29320321-w9cp7" Sep 30 08:01:00 crc kubenswrapper[4760]: I0930 08:01:00.314438 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5xm7m\" (UniqueName: \"kubernetes.io/projected/2a6d4144-48a7-412d-9288-a909f1fbd5f4-kube-api-access-5xm7m\") pod \"keystone-cron-29320321-w9cp7\" (UID: \"2a6d4144-48a7-412d-9288-a909f1fbd5f4\") " pod="openstack/keystone-cron-29320321-w9cp7" Sep 30 08:01:00 crc kubenswrapper[4760]: I0930 08:01:00.492706 4760 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29320321-w9cp7" Sep 30 08:01:00 crc kubenswrapper[4760]: I0930 08:01:00.847900 4760 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-kg7zj" Sep 30 08:01:00 crc kubenswrapper[4760]: I0930 08:01:00.895396 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-kg7zj" Sep 30 08:01:00 crc kubenswrapper[4760]: I0930 08:01:00.955096 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29320321-w9cp7"] Sep 30 08:01:00 crc kubenswrapper[4760]: I0930 08:01:00.993712 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-kg7zj"] Sep 30 08:01:01 crc kubenswrapper[4760]: I0930 08:01:01.093614 4760 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-cgrmz"] Sep 30 08:01:01 crc kubenswrapper[4760]: I0930 08:01:01.094132 4760 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-cgrmz" podUID="2af6d8e2-56f6-47ec-9ee3-fe00eeb022df" containerName="registry-server" containerID="cri-o://9b222ed2e2b1d8cfd1367adce834c59a588acaccb02e0ea0ed88341371a25f26" gracePeriod=2 Sep 30 08:01:01 crc kubenswrapper[4760]: I0930 08:01:01.278884 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29320321-w9cp7" event={"ID":"2a6d4144-48a7-412d-9288-a909f1fbd5f4","Type":"ContainerStarted","Data":"d69442ba5d4c654fc36fafddbb6de41f9406a78295d31800bd2c208563273fbc"} Sep 30 08:01:01 crc kubenswrapper[4760]: I0930 08:01:01.278960 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29320321-w9cp7" event={"ID":"2a6d4144-48a7-412d-9288-a909f1fbd5f4","Type":"ContainerStarted","Data":"4aae63e788199c3429926b8edc19e716f7945cb7eee2a96118f06cf604c638db"} Sep 30 08:01:01 crc kubenswrapper[4760]: I0930 
08:01:01.298588 4760 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-cron-29320321-w9cp7" podStartSLOduration=1.298570299 podStartE2EDuration="1.298570299s" podCreationTimestamp="2025-09-30 08:01:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 08:01:01.293417757 +0000 UTC m=+1646.936324169" watchObservedRunningTime="2025-09-30 08:01:01.298570299 +0000 UTC m=+1646.941476711" Sep 30 08:01:03 crc kubenswrapper[4760]: I0930 08:01:03.026183 4760 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-sync-8vvnb"] Sep 30 08:01:03 crc kubenswrapper[4760]: I0930 08:01:03.035117 4760 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-sync-8vvnb"] Sep 30 08:01:03 crc kubenswrapper[4760]: I0930 08:01:03.078478 4760 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="108a4c03-5bd3-45d0-a13d-b67e01bd7654" path="/var/lib/kubelet/pods/108a4c03-5bd3-45d0-a13d-b67e01bd7654/volumes" Sep 30 08:01:04 crc kubenswrapper[4760]: I0930 08:01:04.317192 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-cgrmz_2af6d8e2-56f6-47ec-9ee3-fe00eeb022df/registry-server/0.log" Sep 30 08:01:04 crc kubenswrapper[4760]: I0930 08:01:04.320425 4760 generic.go:334] "Generic (PLEG): container finished" podID="2af6d8e2-56f6-47ec-9ee3-fe00eeb022df" containerID="9b222ed2e2b1d8cfd1367adce834c59a588acaccb02e0ea0ed88341371a25f26" exitCode=137 Sep 30 08:01:04 crc kubenswrapper[4760]: I0930 08:01:04.320512 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cgrmz" event={"ID":"2af6d8e2-56f6-47ec-9ee3-fe00eeb022df","Type":"ContainerDied","Data":"9b222ed2e2b1d8cfd1367adce834c59a588acaccb02e0ea0ed88341371a25f26"} Sep 30 08:01:04 crc kubenswrapper[4760]: I0930 08:01:04.320570 4760 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-marketplace/redhat-operators-cgrmz" event={"ID":"2af6d8e2-56f6-47ec-9ee3-fe00eeb022df","Type":"ContainerDied","Data":"8318aedf958a89dac028d109857752c2901b956dbf8e013922e0678652211a3c"} Sep 30 08:01:04 crc kubenswrapper[4760]: I0930 08:01:04.320591 4760 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8318aedf958a89dac028d109857752c2901b956dbf8e013922e0678652211a3c" Sep 30 08:01:04 crc kubenswrapper[4760]: I0930 08:01:04.322318 4760 generic.go:334] "Generic (PLEG): container finished" podID="2a6d4144-48a7-412d-9288-a909f1fbd5f4" containerID="d69442ba5d4c654fc36fafddbb6de41f9406a78295d31800bd2c208563273fbc" exitCode=0 Sep 30 08:01:04 crc kubenswrapper[4760]: I0930 08:01:04.322355 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29320321-w9cp7" event={"ID":"2a6d4144-48a7-412d-9288-a909f1fbd5f4","Type":"ContainerDied","Data":"d69442ba5d4c654fc36fafddbb6de41f9406a78295d31800bd2c208563273fbc"} Sep 30 08:01:04 crc kubenswrapper[4760]: I0930 08:01:04.351612 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-cgrmz_2af6d8e2-56f6-47ec-9ee3-fe00eeb022df/registry-server/0.log" Sep 30 08:01:04 crc kubenswrapper[4760]: I0930 08:01:04.352263 4760 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-cgrmz" Sep 30 08:01:04 crc kubenswrapper[4760]: I0930 08:01:04.490122 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2af6d8e2-56f6-47ec-9ee3-fe00eeb022df-utilities\") pod \"2af6d8e2-56f6-47ec-9ee3-fe00eeb022df\" (UID: \"2af6d8e2-56f6-47ec-9ee3-fe00eeb022df\") " Sep 30 08:01:04 crc kubenswrapper[4760]: I0930 08:01:04.490175 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4gr5q\" (UniqueName: \"kubernetes.io/projected/2af6d8e2-56f6-47ec-9ee3-fe00eeb022df-kube-api-access-4gr5q\") pod \"2af6d8e2-56f6-47ec-9ee3-fe00eeb022df\" (UID: \"2af6d8e2-56f6-47ec-9ee3-fe00eeb022df\") " Sep 30 08:01:04 crc kubenswrapper[4760]: I0930 08:01:04.490290 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2af6d8e2-56f6-47ec-9ee3-fe00eeb022df-catalog-content\") pod \"2af6d8e2-56f6-47ec-9ee3-fe00eeb022df\" (UID: \"2af6d8e2-56f6-47ec-9ee3-fe00eeb022df\") " Sep 30 08:01:04 crc kubenswrapper[4760]: I0930 08:01:04.491666 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2af6d8e2-56f6-47ec-9ee3-fe00eeb022df-utilities" (OuterVolumeSpecName: "utilities") pod "2af6d8e2-56f6-47ec-9ee3-fe00eeb022df" (UID: "2af6d8e2-56f6-47ec-9ee3-fe00eeb022df"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 08:01:04 crc kubenswrapper[4760]: I0930 08:01:04.499486 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2af6d8e2-56f6-47ec-9ee3-fe00eeb022df-kube-api-access-4gr5q" (OuterVolumeSpecName: "kube-api-access-4gr5q") pod "2af6d8e2-56f6-47ec-9ee3-fe00eeb022df" (UID: "2af6d8e2-56f6-47ec-9ee3-fe00eeb022df"). InnerVolumeSpecName "kube-api-access-4gr5q". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 08:01:04 crc kubenswrapper[4760]: I0930 08:01:04.573433 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2af6d8e2-56f6-47ec-9ee3-fe00eeb022df-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "2af6d8e2-56f6-47ec-9ee3-fe00eeb022df" (UID: "2af6d8e2-56f6-47ec-9ee3-fe00eeb022df"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 08:01:04 crc kubenswrapper[4760]: I0930 08:01:04.593049 4760 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2af6d8e2-56f6-47ec-9ee3-fe00eeb022df-utilities\") on node \"crc\" DevicePath \"\"" Sep 30 08:01:04 crc kubenswrapper[4760]: I0930 08:01:04.593086 4760 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4gr5q\" (UniqueName: \"kubernetes.io/projected/2af6d8e2-56f6-47ec-9ee3-fe00eeb022df-kube-api-access-4gr5q\") on node \"crc\" DevicePath \"\"" Sep 30 08:01:04 crc kubenswrapper[4760]: I0930 08:01:04.593098 4760 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2af6d8e2-56f6-47ec-9ee3-fe00eeb022df-catalog-content\") on node \"crc\" DevicePath \"\"" Sep 30 08:01:05 crc kubenswrapper[4760]: I0930 08:01:05.340089 4760 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-cgrmz" Sep 30 08:01:05 crc kubenswrapper[4760]: I0930 08:01:05.386725 4760 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-cgrmz"] Sep 30 08:01:05 crc kubenswrapper[4760]: I0930 08:01:05.402428 4760 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-cgrmz"] Sep 30 08:01:05 crc kubenswrapper[4760]: I0930 08:01:05.735170 4760 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29320321-w9cp7" Sep 30 08:01:05 crc kubenswrapper[4760]: I0930 08:01:05.920444 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/2a6d4144-48a7-412d-9288-a909f1fbd5f4-fernet-keys\") pod \"2a6d4144-48a7-412d-9288-a909f1fbd5f4\" (UID: \"2a6d4144-48a7-412d-9288-a909f1fbd5f4\") " Sep 30 08:01:05 crc kubenswrapper[4760]: I0930 08:01:05.920612 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5xm7m\" (UniqueName: \"kubernetes.io/projected/2a6d4144-48a7-412d-9288-a909f1fbd5f4-kube-api-access-5xm7m\") pod \"2a6d4144-48a7-412d-9288-a909f1fbd5f4\" (UID: \"2a6d4144-48a7-412d-9288-a909f1fbd5f4\") " Sep 30 08:01:05 crc kubenswrapper[4760]: I0930 08:01:05.920675 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2a6d4144-48a7-412d-9288-a909f1fbd5f4-config-data\") pod \"2a6d4144-48a7-412d-9288-a909f1fbd5f4\" (UID: \"2a6d4144-48a7-412d-9288-a909f1fbd5f4\") " Sep 30 08:01:05 crc kubenswrapper[4760]: I0930 08:01:05.920810 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2a6d4144-48a7-412d-9288-a909f1fbd5f4-combined-ca-bundle\") pod \"2a6d4144-48a7-412d-9288-a909f1fbd5f4\" (UID: \"2a6d4144-48a7-412d-9288-a909f1fbd5f4\") " Sep 30 08:01:05 crc kubenswrapper[4760]: I0930 08:01:05.928259 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2a6d4144-48a7-412d-9288-a909f1fbd5f4-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "2a6d4144-48a7-412d-9288-a909f1fbd5f4" (UID: "2a6d4144-48a7-412d-9288-a909f1fbd5f4"). InnerVolumeSpecName "fernet-keys". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 08:01:05 crc kubenswrapper[4760]: I0930 08:01:05.928399 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2a6d4144-48a7-412d-9288-a909f1fbd5f4-kube-api-access-5xm7m" (OuterVolumeSpecName: "kube-api-access-5xm7m") pod "2a6d4144-48a7-412d-9288-a909f1fbd5f4" (UID: "2a6d4144-48a7-412d-9288-a909f1fbd5f4"). InnerVolumeSpecName "kube-api-access-5xm7m". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 08:01:05 crc kubenswrapper[4760]: I0930 08:01:05.954722 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2a6d4144-48a7-412d-9288-a909f1fbd5f4-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2a6d4144-48a7-412d-9288-a909f1fbd5f4" (UID: "2a6d4144-48a7-412d-9288-a909f1fbd5f4"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 08:01:05 crc kubenswrapper[4760]: I0930 08:01:05.982721 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2a6d4144-48a7-412d-9288-a909f1fbd5f4-config-data" (OuterVolumeSpecName: "config-data") pod "2a6d4144-48a7-412d-9288-a909f1fbd5f4" (UID: "2a6d4144-48a7-412d-9288-a909f1fbd5f4"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 08:01:06 crc kubenswrapper[4760]: I0930 08:01:06.023807 4760 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2a6d4144-48a7-412d-9288-a909f1fbd5f4-config-data\") on node \"crc\" DevicePath \"\"" Sep 30 08:01:06 crc kubenswrapper[4760]: I0930 08:01:06.023871 4760 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2a6d4144-48a7-412d-9288-a909f1fbd5f4-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 08:01:06 crc kubenswrapper[4760]: I0930 08:01:06.023895 4760 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/2a6d4144-48a7-412d-9288-a909f1fbd5f4-fernet-keys\") on node \"crc\" DevicePath \"\"" Sep 30 08:01:06 crc kubenswrapper[4760]: I0930 08:01:06.023915 4760 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5xm7m\" (UniqueName: \"kubernetes.io/projected/2a6d4144-48a7-412d-9288-a909f1fbd5f4-kube-api-access-5xm7m\") on node \"crc\" DevicePath \"\"" Sep 30 08:01:06 crc kubenswrapper[4760]: I0930 08:01:06.352366 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29320321-w9cp7" event={"ID":"2a6d4144-48a7-412d-9288-a909f1fbd5f4","Type":"ContainerDied","Data":"4aae63e788199c3429926b8edc19e716f7945cb7eee2a96118f06cf604c638db"} Sep 30 08:01:06 crc kubenswrapper[4760]: I0930 08:01:06.352425 4760 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4aae63e788199c3429926b8edc19e716f7945cb7eee2a96118f06cf604c638db" Sep 30 08:01:06 crc kubenswrapper[4760]: I0930 08:01:06.352512 4760 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29320321-w9cp7" Sep 30 08:01:07 crc kubenswrapper[4760]: I0930 08:01:07.079376 4760 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2af6d8e2-56f6-47ec-9ee3-fe00eeb022df" path="/var/lib/kubelet/pods/2af6d8e2-56f6-47ec-9ee3-fe00eeb022df/volumes" Sep 30 08:01:11 crc kubenswrapper[4760]: I0930 08:01:11.067412 4760 scope.go:117] "RemoveContainer" containerID="6e360b7dc465c02adf99299c3bccec940fec7c45f12ed3788ad43373bef9d4f8" Sep 30 08:01:11 crc kubenswrapper[4760]: E0930 08:01:11.068165 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f2lrk_openshift-machine-config-operator(7a9c8270-6964-4886-87d0-227b05b76da4)\"" pod="openshift-machine-config-operator/machine-config-daemon-f2lrk" podUID="7a9c8270-6964-4886-87d0-227b05b76da4" Sep 30 08:01:24 crc kubenswrapper[4760]: I0930 08:01:24.068253 4760 scope.go:117] "RemoveContainer" containerID="6e360b7dc465c02adf99299c3bccec940fec7c45f12ed3788ad43373bef9d4f8" Sep 30 08:01:24 crc kubenswrapper[4760]: E0930 08:01:24.069056 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f2lrk_openshift-machine-config-operator(7a9c8270-6964-4886-87d0-227b05b76da4)\"" pod="openshift-machine-config-operator/machine-config-daemon-f2lrk" podUID="7a9c8270-6964-4886-87d0-227b05b76da4" Sep 30 08:01:33 crc kubenswrapper[4760]: I0930 08:01:33.039689 4760 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-sync-9ml2s"] Sep 30 08:01:33 crc kubenswrapper[4760]: I0930 08:01:33.047321 4760 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-sync-9ml2s"] Sep 30 08:01:33 crc 
kubenswrapper[4760]: I0930 08:01:33.078768 4760 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4a2c7f25-ec97-4cb0-8475-b7b8d4be47d0" path="/var/lib/kubelet/pods/4a2c7f25-ec97-4cb0-8475-b7b8d4be47d0/volumes" Sep 30 08:01:36 crc kubenswrapper[4760]: I0930 08:01:36.035125 4760 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-db-create-fsvjc"] Sep 30 08:01:36 crc kubenswrapper[4760]: I0930 08:01:36.052994 4760 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-db-create-6cjrj"] Sep 30 08:01:36 crc kubenswrapper[4760]: I0930 08:01:36.062234 4760 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-db-create-cs5kx"] Sep 30 08:01:36 crc kubenswrapper[4760]: I0930 08:01:36.070670 4760 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-db-create-fsvjc"] Sep 30 08:01:36 crc kubenswrapper[4760]: I0930 08:01:36.078694 4760 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-db-create-cs5kx"] Sep 30 08:01:36 crc kubenswrapper[4760]: I0930 08:01:36.090407 4760 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-db-create-6cjrj"] Sep 30 08:01:37 crc kubenswrapper[4760]: I0930 08:01:37.081559 4760 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="51b8af4a-eb0e-48a7-885d-917c60d526d3" path="/var/lib/kubelet/pods/51b8af4a-eb0e-48a7-885d-917c60d526d3/volumes" Sep 30 08:01:37 crc kubenswrapper[4760]: I0930 08:01:37.082262 4760 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="527f2de8-9fbb-4ecb-893d-ffe0db4d524b" path="/var/lib/kubelet/pods/527f2de8-9fbb-4ecb-893d-ffe0db4d524b/volumes" Sep 30 08:01:37 crc kubenswrapper[4760]: I0930 08:01:37.082941 4760 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c3eab693-7a8a-4ad2-a247-b8a79f178a87" path="/var/lib/kubelet/pods/c3eab693-7a8a-4ad2-a247-b8a79f178a87/volumes" Sep 30 08:01:38 crc kubenswrapper[4760]: I0930 
08:01:38.066544 4760 scope.go:117] "RemoveContainer" containerID="6e360b7dc465c02adf99299c3bccec940fec7c45f12ed3788ad43373bef9d4f8" Sep 30 08:01:38 crc kubenswrapper[4760]: E0930 08:01:38.067237 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f2lrk_openshift-machine-config-operator(7a9c8270-6964-4886-87d0-227b05b76da4)\"" pod="openshift-machine-config-operator/machine-config-daemon-f2lrk" podUID="7a9c8270-6964-4886-87d0-227b05b76da4" Sep 30 08:01:42 crc kubenswrapper[4760]: I0930 08:01:42.714875 4760 generic.go:334] "Generic (PLEG): container finished" podID="e2439cde-d5f2-423a-9e6d-4af8d713c917" containerID="053b3da9332cd0a7db9877665f1249d8ae420645eb3fbaa9d4e51c6a822d3d1f" exitCode=0 Sep 30 08:01:42 crc kubenswrapper[4760]: I0930 08:01:42.714953 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-sj6c4" event={"ID":"e2439cde-d5f2-423a-9e6d-4af8d713c917","Type":"ContainerDied","Data":"053b3da9332cd0a7db9877665f1249d8ae420645eb3fbaa9d4e51c6a822d3d1f"} Sep 30 08:01:44 crc kubenswrapper[4760]: I0930 08:01:44.152349 4760 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-sj6c4" Sep 30 08:01:44 crc kubenswrapper[4760]: I0930 08:01:44.298089 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/e2439cde-d5f2-423a-9e6d-4af8d713c917-ssh-key\") pod \"e2439cde-d5f2-423a-9e6d-4af8d713c917\" (UID: \"e2439cde-d5f2-423a-9e6d-4af8d713c917\") " Sep 30 08:01:44 crc kubenswrapper[4760]: I0930 08:01:44.298718 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e2439cde-d5f2-423a-9e6d-4af8d713c917-inventory\") pod \"e2439cde-d5f2-423a-9e6d-4af8d713c917\" (UID: \"e2439cde-d5f2-423a-9e6d-4af8d713c917\") " Sep 30 08:01:44 crc kubenswrapper[4760]: I0930 08:01:44.298785 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xl4rh\" (UniqueName: \"kubernetes.io/projected/e2439cde-d5f2-423a-9e6d-4af8d713c917-kube-api-access-xl4rh\") pod \"e2439cde-d5f2-423a-9e6d-4af8d713c917\" (UID: \"e2439cde-d5f2-423a-9e6d-4af8d713c917\") " Sep 30 08:01:44 crc kubenswrapper[4760]: I0930 08:01:44.307549 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e2439cde-d5f2-423a-9e6d-4af8d713c917-kube-api-access-xl4rh" (OuterVolumeSpecName: "kube-api-access-xl4rh") pod "e2439cde-d5f2-423a-9e6d-4af8d713c917" (UID: "e2439cde-d5f2-423a-9e6d-4af8d713c917"). InnerVolumeSpecName "kube-api-access-xl4rh". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 08:01:44 crc kubenswrapper[4760]: I0930 08:01:44.329857 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e2439cde-d5f2-423a-9e6d-4af8d713c917-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "e2439cde-d5f2-423a-9e6d-4af8d713c917" (UID: "e2439cde-d5f2-423a-9e6d-4af8d713c917"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 08:01:44 crc kubenswrapper[4760]: I0930 08:01:44.333190 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e2439cde-d5f2-423a-9e6d-4af8d713c917-inventory" (OuterVolumeSpecName: "inventory") pod "e2439cde-d5f2-423a-9e6d-4af8d713c917" (UID: "e2439cde-d5f2-423a-9e6d-4af8d713c917"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 08:01:44 crc kubenswrapper[4760]: I0930 08:01:44.400845 4760 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e2439cde-d5f2-423a-9e6d-4af8d713c917-inventory\") on node \"crc\" DevicePath \"\"" Sep 30 08:01:44 crc kubenswrapper[4760]: I0930 08:01:44.400877 4760 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xl4rh\" (UniqueName: \"kubernetes.io/projected/e2439cde-d5f2-423a-9e6d-4af8d713c917-kube-api-access-xl4rh\") on node \"crc\" DevicePath \"\"" Sep 30 08:01:44 crc kubenswrapper[4760]: I0930 08:01:44.400889 4760 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/e2439cde-d5f2-423a-9e6d-4af8d713c917-ssh-key\") on node \"crc\" DevicePath \"\"" Sep 30 08:01:44 crc kubenswrapper[4760]: I0930 08:01:44.713789 4760 scope.go:117] "RemoveContainer" containerID="53b92e2949ce33a9d4ae27e2e663dbf94e93e32a5e849263f5e67fb352e44bbc" Sep 30 08:01:44 crc kubenswrapper[4760]: I0930 08:01:44.738007 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-sj6c4" event={"ID":"e2439cde-d5f2-423a-9e6d-4af8d713c917","Type":"ContainerDied","Data":"4aeb358827626f643a820bd69738dc1af465430bfbb873603c873fbd04bb35d1"} Sep 30 08:01:44 crc kubenswrapper[4760]: I0930 08:01:44.738292 4760 pod_container_deletor.go:80] "Container not found in pod's containers" 
containerID="4aeb358827626f643a820bd69738dc1af465430bfbb873603c873fbd04bb35d1" Sep 30 08:01:44 crc kubenswrapper[4760]: I0930 08:01:44.738137 4760 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-sj6c4" Sep 30 08:01:44 crc kubenswrapper[4760]: I0930 08:01:44.749152 4760 scope.go:117] "RemoveContainer" containerID="e7d14e2491ad5b89419c6c96eded977f0df2ff4374b049abb04dfed938177ab6" Sep 30 08:01:44 crc kubenswrapper[4760]: I0930 08:01:44.837924 4760 scope.go:117] "RemoveContainer" containerID="c17a111d0562f8972f7218ff8da22c761885ce0d02658fc80f242203a4de9887" Sep 30 08:01:44 crc kubenswrapper[4760]: I0930 08:01:44.845985 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-ptr7r"] Sep 30 08:01:44 crc kubenswrapper[4760]: E0930 08:01:44.846435 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2a6d4144-48a7-412d-9288-a909f1fbd5f4" containerName="keystone-cron" Sep 30 08:01:44 crc kubenswrapper[4760]: I0930 08:01:44.846454 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="2a6d4144-48a7-412d-9288-a909f1fbd5f4" containerName="keystone-cron" Sep 30 08:01:44 crc kubenswrapper[4760]: E0930 08:01:44.846469 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2af6d8e2-56f6-47ec-9ee3-fe00eeb022df" containerName="registry-server" Sep 30 08:01:44 crc kubenswrapper[4760]: I0930 08:01:44.846475 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="2af6d8e2-56f6-47ec-9ee3-fe00eeb022df" containerName="registry-server" Sep 30 08:01:44 crc kubenswrapper[4760]: E0930 08:01:44.846503 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2af6d8e2-56f6-47ec-9ee3-fe00eeb022df" containerName="extract-utilities" Sep 30 08:01:44 crc kubenswrapper[4760]: I0930 08:01:44.846510 4760 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="2af6d8e2-56f6-47ec-9ee3-fe00eeb022df" containerName="extract-utilities" Sep 30 08:01:44 crc kubenswrapper[4760]: E0930 08:01:44.846528 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e2439cde-d5f2-423a-9e6d-4af8d713c917" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Sep 30 08:01:44 crc kubenswrapper[4760]: I0930 08:01:44.846534 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="e2439cde-d5f2-423a-9e6d-4af8d713c917" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Sep 30 08:01:44 crc kubenswrapper[4760]: E0930 08:01:44.846548 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2af6d8e2-56f6-47ec-9ee3-fe00eeb022df" containerName="extract-content" Sep 30 08:01:44 crc kubenswrapper[4760]: I0930 08:01:44.846554 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="2af6d8e2-56f6-47ec-9ee3-fe00eeb022df" containerName="extract-content" Sep 30 08:01:44 crc kubenswrapper[4760]: I0930 08:01:44.846733 4760 memory_manager.go:354] "RemoveStaleState removing state" podUID="2a6d4144-48a7-412d-9288-a909f1fbd5f4" containerName="keystone-cron" Sep 30 08:01:44 crc kubenswrapper[4760]: I0930 08:01:44.846759 4760 memory_manager.go:354] "RemoveStaleState removing state" podUID="2af6d8e2-56f6-47ec-9ee3-fe00eeb022df" containerName="registry-server" Sep 30 08:01:44 crc kubenswrapper[4760]: I0930 08:01:44.846769 4760 memory_manager.go:354] "RemoveStaleState removing state" podUID="e2439cde-d5f2-423a-9e6d-4af8d713c917" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Sep 30 08:01:44 crc kubenswrapper[4760]: I0930 08:01:44.847451 4760 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-ptr7r" Sep 30 08:01:44 crc kubenswrapper[4760]: I0930 08:01:44.851110 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Sep 30 08:01:44 crc kubenswrapper[4760]: I0930 08:01:44.851478 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Sep 30 08:01:44 crc kubenswrapper[4760]: I0930 08:01:44.851771 4760 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Sep 30 08:01:44 crc kubenswrapper[4760]: I0930 08:01:44.852347 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-8gxrf" Sep 30 08:01:44 crc kubenswrapper[4760]: I0930 08:01:44.856205 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-ptr7r"] Sep 30 08:01:44 crc kubenswrapper[4760]: I0930 08:01:44.864162 4760 scope.go:117] "RemoveContainer" containerID="29a0812669f375c2db99afff03027c85b88c05aa1042ba5f829e696f3f9475e1" Sep 30 08:01:44 crc kubenswrapper[4760]: I0930 08:01:44.904905 4760 scope.go:117] "RemoveContainer" containerID="b441e987ef7decf049664c413470ea11bd7c6b9be8c56175022120bb2f98ef6f" Sep 30 08:01:44 crc kubenswrapper[4760]: I0930 08:01:44.929004 4760 scope.go:117] "RemoveContainer" containerID="9b222ed2e2b1d8cfd1367adce834c59a588acaccb02e0ea0ed88341371a25f26" Sep 30 08:01:44 crc kubenswrapper[4760]: I0930 08:01:44.947875 4760 scope.go:117] "RemoveContainer" containerID="34df7a6fed2b13c4f77c2412b04d3044f9f81e01a69fa17b070335180992cc7c" Sep 30 08:01:44 crc kubenswrapper[4760]: I0930 08:01:44.983167 4760 scope.go:117] "RemoveContainer" containerID="da79e51b47f98778727a550fc3e1167b95d280a9cb76165601bb82b5c73cf3f9" Sep 30 08:01:45 crc kubenswrapper[4760]: I0930 08:01:45.010438 4760 scope.go:117] 
"RemoveContainer" containerID="591c2b67bc37bc99a75d672749e39aa0d92bab52d997143c9e18ccae07c5f4cc" Sep 30 08:01:45 crc kubenswrapper[4760]: I0930 08:01:45.013861 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-np87j\" (UniqueName: \"kubernetes.io/projected/daecc10f-5930-44cc-806b-95012b47df8a-kube-api-access-np87j\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-ptr7r\" (UID: \"daecc10f-5930-44cc-806b-95012b47df8a\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-ptr7r" Sep 30 08:01:45 crc kubenswrapper[4760]: I0930 08:01:45.013903 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/daecc10f-5930-44cc-806b-95012b47df8a-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-ptr7r\" (UID: \"daecc10f-5930-44cc-806b-95012b47df8a\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-ptr7r" Sep 30 08:01:45 crc kubenswrapper[4760]: I0930 08:01:45.013943 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/daecc10f-5930-44cc-806b-95012b47df8a-ssh-key\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-ptr7r\" (UID: \"daecc10f-5930-44cc-806b-95012b47df8a\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-ptr7r" Sep 30 08:01:45 crc kubenswrapper[4760]: I0930 08:01:45.116414 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-np87j\" (UniqueName: \"kubernetes.io/projected/daecc10f-5930-44cc-806b-95012b47df8a-kube-api-access-np87j\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-ptr7r\" (UID: \"daecc10f-5930-44cc-806b-95012b47df8a\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-ptr7r" Sep 30 08:01:45 crc kubenswrapper[4760]: I0930 
08:01:45.116897 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/daecc10f-5930-44cc-806b-95012b47df8a-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-ptr7r\" (UID: \"daecc10f-5930-44cc-806b-95012b47df8a\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-ptr7r" Sep 30 08:01:45 crc kubenswrapper[4760]: I0930 08:01:45.117001 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/daecc10f-5930-44cc-806b-95012b47df8a-ssh-key\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-ptr7r\" (UID: \"daecc10f-5930-44cc-806b-95012b47df8a\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-ptr7r" Sep 30 08:01:45 crc kubenswrapper[4760]: I0930 08:01:45.121728 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/daecc10f-5930-44cc-806b-95012b47df8a-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-ptr7r\" (UID: \"daecc10f-5930-44cc-806b-95012b47df8a\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-ptr7r" Sep 30 08:01:45 crc kubenswrapper[4760]: I0930 08:01:45.121734 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/daecc10f-5930-44cc-806b-95012b47df8a-ssh-key\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-ptr7r\" (UID: \"daecc10f-5930-44cc-806b-95012b47df8a\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-ptr7r" Sep 30 08:01:45 crc kubenswrapper[4760]: I0930 08:01:45.137834 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-np87j\" (UniqueName: \"kubernetes.io/projected/daecc10f-5930-44cc-806b-95012b47df8a-kube-api-access-np87j\") pod 
\"validate-network-edpm-deployment-openstack-edpm-ipam-ptr7r\" (UID: \"daecc10f-5930-44cc-806b-95012b47df8a\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-ptr7r" Sep 30 08:01:45 crc kubenswrapper[4760]: I0930 08:01:45.174570 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-ptr7r" Sep 30 08:01:45 crc kubenswrapper[4760]: I0930 08:01:45.703941 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-ptr7r"] Sep 30 08:01:45 crc kubenswrapper[4760]: W0930 08:01:45.718957 4760 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddaecc10f_5930_44cc_806b_95012b47df8a.slice/crio-ccc713785c25fb2416c44191a360ec9cf3ef8e24fb53ef78d8fb3d3896bade83 WatchSource:0}: Error finding container ccc713785c25fb2416c44191a360ec9cf3ef8e24fb53ef78d8fb3d3896bade83: Status 404 returned error can't find the container with id ccc713785c25fb2416c44191a360ec9cf3ef8e24fb53ef78d8fb3d3896bade83 Sep 30 08:01:45 crc kubenswrapper[4760]: I0930 08:01:45.750256 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-ptr7r" event={"ID":"daecc10f-5930-44cc-806b-95012b47df8a","Type":"ContainerStarted","Data":"ccc713785c25fb2416c44191a360ec9cf3ef8e24fb53ef78d8fb3d3896bade83"} Sep 30 08:01:46 crc kubenswrapper[4760]: I0930 08:01:46.767064 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-ptr7r" event={"ID":"daecc10f-5930-44cc-806b-95012b47df8a","Type":"ContainerStarted","Data":"a6b19f3617e1049dc6bd6bccc38a0c643dfd5f829596d079c8b2626844f48c40"} Sep 30 08:01:46 crc kubenswrapper[4760]: I0930 08:01:46.795258 4760 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-ptr7r" podStartSLOduration=2.10089829 podStartE2EDuration="2.795236872s" podCreationTimestamp="2025-09-30 08:01:44 +0000 UTC" firstStartedPulling="2025-09-30 08:01:45.722452224 +0000 UTC m=+1691.365358636" lastFinishedPulling="2025-09-30 08:01:46.416790766 +0000 UTC m=+1692.059697218" observedRunningTime="2025-09-30 08:01:46.784659411 +0000 UTC m=+1692.427565883" watchObservedRunningTime="2025-09-30 08:01:46.795236872 +0000 UTC m=+1692.438143304" Sep 30 08:01:50 crc kubenswrapper[4760]: I0930 08:01:50.066896 4760 scope.go:117] "RemoveContainer" containerID="6e360b7dc465c02adf99299c3bccec940fec7c45f12ed3788ad43373bef9d4f8" Sep 30 08:01:50 crc kubenswrapper[4760]: E0930 08:01:50.067800 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f2lrk_openshift-machine-config-operator(7a9c8270-6964-4886-87d0-227b05b76da4)\"" pod="openshift-machine-config-operator/machine-config-daemon-f2lrk" podUID="7a9c8270-6964-4886-87d0-227b05b76da4" Sep 30 08:01:52 crc kubenswrapper[4760]: I0930 08:01:52.821851 4760 generic.go:334] "Generic (PLEG): container finished" podID="daecc10f-5930-44cc-806b-95012b47df8a" containerID="a6b19f3617e1049dc6bd6bccc38a0c643dfd5f829596d079c8b2626844f48c40" exitCode=0 Sep 30 08:01:52 crc kubenswrapper[4760]: I0930 08:01:52.821961 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-ptr7r" event={"ID":"daecc10f-5930-44cc-806b-95012b47df8a","Type":"ContainerDied","Data":"a6b19f3617e1049dc6bd6bccc38a0c643dfd5f829596d079c8b2626844f48c40"} Sep 30 08:01:54 crc kubenswrapper[4760]: I0930 08:01:54.054189 4760 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-7b12-account-create-59swt"] Sep 30 08:01:54 crc kubenswrapper[4760]: 
I0930 08:01:54.078718 4760 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-7b12-account-create-59swt"] Sep 30 08:01:54 crc kubenswrapper[4760]: I0930 08:01:54.095015 4760 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-8b87-account-create-gpd5x"] Sep 30 08:01:54 crc kubenswrapper[4760]: I0930 08:01:54.103458 4760 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-8b87-account-create-gpd5x"] Sep 30 08:01:54 crc kubenswrapper[4760]: I0930 08:01:54.248404 4760 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-ptr7r" Sep 30 08:01:54 crc kubenswrapper[4760]: I0930 08:01:54.320031 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/daecc10f-5930-44cc-806b-95012b47df8a-ssh-key\") pod \"daecc10f-5930-44cc-806b-95012b47df8a\" (UID: \"daecc10f-5930-44cc-806b-95012b47df8a\") " Sep 30 08:01:54 crc kubenswrapper[4760]: I0930 08:01:54.320218 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-np87j\" (UniqueName: \"kubernetes.io/projected/daecc10f-5930-44cc-806b-95012b47df8a-kube-api-access-np87j\") pod \"daecc10f-5930-44cc-806b-95012b47df8a\" (UID: \"daecc10f-5930-44cc-806b-95012b47df8a\") " Sep 30 08:01:54 crc kubenswrapper[4760]: I0930 08:01:54.320906 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/daecc10f-5930-44cc-806b-95012b47df8a-inventory\") pod \"daecc10f-5930-44cc-806b-95012b47df8a\" (UID: \"daecc10f-5930-44cc-806b-95012b47df8a\") " Sep 30 08:01:54 crc kubenswrapper[4760]: I0930 08:01:54.330730 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/daecc10f-5930-44cc-806b-95012b47df8a-kube-api-access-np87j" (OuterVolumeSpecName: 
"kube-api-access-np87j") pod "daecc10f-5930-44cc-806b-95012b47df8a" (UID: "daecc10f-5930-44cc-806b-95012b47df8a"). InnerVolumeSpecName "kube-api-access-np87j". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 08:01:54 crc kubenswrapper[4760]: I0930 08:01:54.347832 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/daecc10f-5930-44cc-806b-95012b47df8a-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "daecc10f-5930-44cc-806b-95012b47df8a" (UID: "daecc10f-5930-44cc-806b-95012b47df8a"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 08:01:54 crc kubenswrapper[4760]: I0930 08:01:54.350043 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/daecc10f-5930-44cc-806b-95012b47df8a-inventory" (OuterVolumeSpecName: "inventory") pod "daecc10f-5930-44cc-806b-95012b47df8a" (UID: "daecc10f-5930-44cc-806b-95012b47df8a"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 08:01:54 crc kubenswrapper[4760]: I0930 08:01:54.423943 4760 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/daecc10f-5930-44cc-806b-95012b47df8a-inventory\") on node \"crc\" DevicePath \"\"" Sep 30 08:01:54 crc kubenswrapper[4760]: I0930 08:01:54.423989 4760 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/daecc10f-5930-44cc-806b-95012b47df8a-ssh-key\") on node \"crc\" DevicePath \"\"" Sep 30 08:01:54 crc kubenswrapper[4760]: I0930 08:01:54.424030 4760 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-np87j\" (UniqueName: \"kubernetes.io/projected/daecc10f-5930-44cc-806b-95012b47df8a-kube-api-access-np87j\") on node \"crc\" DevicePath \"\"" Sep 30 08:01:54 crc kubenswrapper[4760]: I0930 08:01:54.849097 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-ptr7r" event={"ID":"daecc10f-5930-44cc-806b-95012b47df8a","Type":"ContainerDied","Data":"ccc713785c25fb2416c44191a360ec9cf3ef8e24fb53ef78d8fb3d3896bade83"} Sep 30 08:01:54 crc kubenswrapper[4760]: I0930 08:01:54.849151 4760 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ccc713785c25fb2416c44191a360ec9cf3ef8e24fb53ef78d8fb3d3896bade83" Sep 30 08:01:54 crc kubenswrapper[4760]: I0930 08:01:54.849162 4760 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-ptr7r" Sep 30 08:01:54 crc kubenswrapper[4760]: I0930 08:01:54.941838 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-zxgfv"] Sep 30 08:01:54 crc kubenswrapper[4760]: E0930 08:01:54.942648 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="daecc10f-5930-44cc-806b-95012b47df8a" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Sep 30 08:01:54 crc kubenswrapper[4760]: I0930 08:01:54.942670 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="daecc10f-5930-44cc-806b-95012b47df8a" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Sep 30 08:01:54 crc kubenswrapper[4760]: I0930 08:01:54.942901 4760 memory_manager.go:354] "RemoveStaleState removing state" podUID="daecc10f-5930-44cc-806b-95012b47df8a" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Sep 30 08:01:54 crc kubenswrapper[4760]: I0930 08:01:54.943654 4760 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-zxgfv" Sep 30 08:01:54 crc kubenswrapper[4760]: I0930 08:01:54.947067 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Sep 30 08:01:54 crc kubenswrapper[4760]: I0930 08:01:54.947428 4760 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Sep 30 08:01:54 crc kubenswrapper[4760]: I0930 08:01:54.948804 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-8gxrf" Sep 30 08:01:54 crc kubenswrapper[4760]: I0930 08:01:54.952281 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-zxgfv"] Sep 30 08:01:54 crc kubenswrapper[4760]: I0930 08:01:54.953463 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Sep 30 08:01:55 crc kubenswrapper[4760]: I0930 08:01:55.028913 4760 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-7b10-account-create-nhsdp"] Sep 30 08:01:55 crc kubenswrapper[4760]: I0930 08:01:55.034369 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/23f502d4-3801-4388-b442-22f60146dcf2-ssh-key\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-zxgfv\" (UID: \"23f502d4-3801-4388-b442-22f60146dcf2\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-zxgfv" Sep 30 08:01:55 crc kubenswrapper[4760]: I0930 08:01:55.034476 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/23f502d4-3801-4388-b442-22f60146dcf2-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-zxgfv\" (UID: \"23f502d4-3801-4388-b442-22f60146dcf2\") " 
pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-zxgfv" Sep 30 08:01:55 crc kubenswrapper[4760]: I0930 08:01:55.034512 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dvx9c\" (UniqueName: \"kubernetes.io/projected/23f502d4-3801-4388-b442-22f60146dcf2-kube-api-access-dvx9c\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-zxgfv\" (UID: \"23f502d4-3801-4388-b442-22f60146dcf2\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-zxgfv" Sep 30 08:01:55 crc kubenswrapper[4760]: I0930 08:01:55.037134 4760 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-7b10-account-create-nhsdp"] Sep 30 08:01:55 crc kubenswrapper[4760]: I0930 08:01:55.077660 4760 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bb3cbe0b-da3a-48ff-8181-68a3fb8b9db5" path="/var/lib/kubelet/pods/bb3cbe0b-da3a-48ff-8181-68a3fb8b9db5/volumes" Sep 30 08:01:55 crc kubenswrapper[4760]: I0930 08:01:55.078257 4760 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d7cb900f-4fea-42ed-a186-c173a16463b6" path="/var/lib/kubelet/pods/d7cb900f-4fea-42ed-a186-c173a16463b6/volumes" Sep 30 08:01:55 crc kubenswrapper[4760]: I0930 08:01:55.078811 4760 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d94a6d3c-5790-45ac-b75e-02cf0defd846" path="/var/lib/kubelet/pods/d94a6d3c-5790-45ac-b75e-02cf0defd846/volumes" Sep 30 08:01:55 crc kubenswrapper[4760]: I0930 08:01:55.135152 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/23f502d4-3801-4388-b442-22f60146dcf2-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-zxgfv\" (UID: \"23f502d4-3801-4388-b442-22f60146dcf2\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-zxgfv" Sep 30 08:01:55 crc kubenswrapper[4760]: I0930 08:01:55.135201 4760 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-dvx9c\" (UniqueName: \"kubernetes.io/projected/23f502d4-3801-4388-b442-22f60146dcf2-kube-api-access-dvx9c\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-zxgfv\" (UID: \"23f502d4-3801-4388-b442-22f60146dcf2\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-zxgfv" Sep 30 08:01:55 crc kubenswrapper[4760]: I0930 08:01:55.135364 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/23f502d4-3801-4388-b442-22f60146dcf2-ssh-key\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-zxgfv\" (UID: \"23f502d4-3801-4388-b442-22f60146dcf2\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-zxgfv" Sep 30 08:01:55 crc kubenswrapper[4760]: I0930 08:01:55.139321 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/23f502d4-3801-4388-b442-22f60146dcf2-ssh-key\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-zxgfv\" (UID: \"23f502d4-3801-4388-b442-22f60146dcf2\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-zxgfv" Sep 30 08:01:55 crc kubenswrapper[4760]: I0930 08:01:55.140084 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/23f502d4-3801-4388-b442-22f60146dcf2-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-zxgfv\" (UID: \"23f502d4-3801-4388-b442-22f60146dcf2\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-zxgfv" Sep 30 08:01:55 crc kubenswrapper[4760]: I0930 08:01:55.154667 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dvx9c\" (UniqueName: \"kubernetes.io/projected/23f502d4-3801-4388-b442-22f60146dcf2-kube-api-access-dvx9c\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-zxgfv\" (UID: \"23f502d4-3801-4388-b442-22f60146dcf2\") " 
pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-zxgfv" Sep 30 08:01:55 crc kubenswrapper[4760]: I0930 08:01:55.272629 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-zxgfv" Sep 30 08:01:55 crc kubenswrapper[4760]: I0930 08:01:55.615281 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-zxgfv"] Sep 30 08:01:55 crc kubenswrapper[4760]: I0930 08:01:55.861271 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-zxgfv" event={"ID":"23f502d4-3801-4388-b442-22f60146dcf2","Type":"ContainerStarted","Data":"1460f5d7a4e50761de516e01eff25c6c5326a87b71ec8fd2dec984e4de693c22"} Sep 30 08:01:56 crc kubenswrapper[4760]: I0930 08:01:56.879677 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-zxgfv" event={"ID":"23f502d4-3801-4388-b442-22f60146dcf2","Type":"ContainerStarted","Data":"5d7cc07c79c2e9279f59aaa41b033d81c60fad66f27b0f001e6a21ae9771a518"} Sep 30 08:01:56 crc kubenswrapper[4760]: I0930 08:01:56.897530 4760 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-zxgfv" podStartSLOduration=2.447556321 podStartE2EDuration="2.897515038s" podCreationTimestamp="2025-09-30 08:01:54 +0000 UTC" firstStartedPulling="2025-09-30 08:01:55.625885921 +0000 UTC m=+1701.268792333" lastFinishedPulling="2025-09-30 08:01:56.075844628 +0000 UTC m=+1701.718751050" observedRunningTime="2025-09-30 08:01:56.897032866 +0000 UTC m=+1702.539939278" watchObservedRunningTime="2025-09-30 08:01:56.897515038 +0000 UTC m=+1702.540421440" Sep 30 08:02:01 crc kubenswrapper[4760]: I0930 08:02:01.066808 4760 scope.go:117] "RemoveContainer" containerID="6e360b7dc465c02adf99299c3bccec940fec7c45f12ed3788ad43373bef9d4f8" Sep 30 08:02:01 crc 
kubenswrapper[4760]: E0930 08:02:01.067752 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f2lrk_openshift-machine-config-operator(7a9c8270-6964-4886-87d0-227b05b76da4)\"" pod="openshift-machine-config-operator/machine-config-daemon-f2lrk" podUID="7a9c8270-6964-4886-87d0-227b05b76da4" Sep 30 08:02:14 crc kubenswrapper[4760]: I0930 08:02:14.067557 4760 scope.go:117] "RemoveContainer" containerID="6e360b7dc465c02adf99299c3bccec940fec7c45f12ed3788ad43373bef9d4f8" Sep 30 08:02:14 crc kubenswrapper[4760]: E0930 08:02:14.068605 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f2lrk_openshift-machine-config-operator(7a9c8270-6964-4886-87d0-227b05b76da4)\"" pod="openshift-machine-config-operator/machine-config-daemon-f2lrk" podUID="7a9c8270-6964-4886-87d0-227b05b76da4" Sep 30 08:02:17 crc kubenswrapper[4760]: I0930 08:02:17.085620 4760 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-wsmzq"] Sep 30 08:02:17 crc kubenswrapper[4760]: I0930 08:02:17.088823 4760 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-wsmzq"] Sep 30 08:02:19 crc kubenswrapper[4760]: I0930 08:02:19.077290 4760 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="152b47cf-da92-44c1-9b68-90cc849f4b74" path="/var/lib/kubelet/pods/152b47cf-da92-44c1-9b68-90cc849f4b74/volumes" Sep 30 08:02:27 crc kubenswrapper[4760]: I0930 08:02:27.067257 4760 scope.go:117] "RemoveContainer" containerID="6e360b7dc465c02adf99299c3bccec940fec7c45f12ed3788ad43373bef9d4f8" Sep 30 08:02:27 crc kubenswrapper[4760]: E0930 08:02:27.067858 4760 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f2lrk_openshift-machine-config-operator(7a9c8270-6964-4886-87d0-227b05b76da4)\"" pod="openshift-machine-config-operator/machine-config-daemon-f2lrk" podUID="7a9c8270-6964-4886-87d0-227b05b76da4" Sep 30 08:02:39 crc kubenswrapper[4760]: I0930 08:02:39.057256 4760 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-cell-mapping-9t7c8"] Sep 30 08:02:39 crc kubenswrapper[4760]: I0930 08:02:39.086595 4760 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-cell-mapping-9t7c8"] Sep 30 08:02:39 crc kubenswrapper[4760]: I0930 08:02:39.301963 4760 generic.go:334] "Generic (PLEG): container finished" podID="23f502d4-3801-4388-b442-22f60146dcf2" containerID="5d7cc07c79c2e9279f59aaa41b033d81c60fad66f27b0f001e6a21ae9771a518" exitCode=0 Sep 30 08:02:39 crc kubenswrapper[4760]: I0930 08:02:39.302008 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-zxgfv" event={"ID":"23f502d4-3801-4388-b442-22f60146dcf2","Type":"ContainerDied","Data":"5d7cc07c79c2e9279f59aaa41b033d81c60fad66f27b0f001e6a21ae9771a518"} Sep 30 08:02:40 crc kubenswrapper[4760]: I0930 08:02:40.037258 4760 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-7jv2q"] Sep 30 08:02:40 crc kubenswrapper[4760]: I0930 08:02:40.048690 4760 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-7jv2q"] Sep 30 08:02:40 crc kubenswrapper[4760]: I0930 08:02:40.732326 4760 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-zxgfv" Sep 30 08:02:40 crc kubenswrapper[4760]: I0930 08:02:40.861824 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/23f502d4-3801-4388-b442-22f60146dcf2-ssh-key\") pod \"23f502d4-3801-4388-b442-22f60146dcf2\" (UID: \"23f502d4-3801-4388-b442-22f60146dcf2\") " Sep 30 08:02:40 crc kubenswrapper[4760]: I0930 08:02:40.861998 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dvx9c\" (UniqueName: \"kubernetes.io/projected/23f502d4-3801-4388-b442-22f60146dcf2-kube-api-access-dvx9c\") pod \"23f502d4-3801-4388-b442-22f60146dcf2\" (UID: \"23f502d4-3801-4388-b442-22f60146dcf2\") " Sep 30 08:02:40 crc kubenswrapper[4760]: I0930 08:02:40.862110 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/23f502d4-3801-4388-b442-22f60146dcf2-inventory\") pod \"23f502d4-3801-4388-b442-22f60146dcf2\" (UID: \"23f502d4-3801-4388-b442-22f60146dcf2\") " Sep 30 08:02:40 crc kubenswrapper[4760]: I0930 08:02:40.869649 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/23f502d4-3801-4388-b442-22f60146dcf2-kube-api-access-dvx9c" (OuterVolumeSpecName: "kube-api-access-dvx9c") pod "23f502d4-3801-4388-b442-22f60146dcf2" (UID: "23f502d4-3801-4388-b442-22f60146dcf2"). InnerVolumeSpecName "kube-api-access-dvx9c". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 08:02:40 crc kubenswrapper[4760]: I0930 08:02:40.894790 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/23f502d4-3801-4388-b442-22f60146dcf2-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "23f502d4-3801-4388-b442-22f60146dcf2" (UID: "23f502d4-3801-4388-b442-22f60146dcf2"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 08:02:40 crc kubenswrapper[4760]: I0930 08:02:40.900827 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/23f502d4-3801-4388-b442-22f60146dcf2-inventory" (OuterVolumeSpecName: "inventory") pod "23f502d4-3801-4388-b442-22f60146dcf2" (UID: "23f502d4-3801-4388-b442-22f60146dcf2"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 08:02:40 crc kubenswrapper[4760]: I0930 08:02:40.964676 4760 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dvx9c\" (UniqueName: \"kubernetes.io/projected/23f502d4-3801-4388-b442-22f60146dcf2-kube-api-access-dvx9c\") on node \"crc\" DevicePath \"\"" Sep 30 08:02:40 crc kubenswrapper[4760]: I0930 08:02:40.964726 4760 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/23f502d4-3801-4388-b442-22f60146dcf2-inventory\") on node \"crc\" DevicePath \"\"" Sep 30 08:02:40 crc kubenswrapper[4760]: I0930 08:02:40.964739 4760 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/23f502d4-3801-4388-b442-22f60146dcf2-ssh-key\") on node \"crc\" DevicePath \"\"" Sep 30 08:02:41 crc kubenswrapper[4760]: I0930 08:02:41.068243 4760 scope.go:117] "RemoveContainer" containerID="6e360b7dc465c02adf99299c3bccec940fec7c45f12ed3788ad43373bef9d4f8" Sep 30 08:02:41 crc kubenswrapper[4760]: E0930 08:02:41.068910 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f2lrk_openshift-machine-config-operator(7a9c8270-6964-4886-87d0-227b05b76da4)\"" pod="openshift-machine-config-operator/machine-config-daemon-f2lrk" podUID="7a9c8270-6964-4886-87d0-227b05b76da4" Sep 30 08:02:41 crc kubenswrapper[4760]: I0930 08:02:41.082210 
4760 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="462eb909-9878-48ca-a8fc-93b0e6b6d7f7" path="/var/lib/kubelet/pods/462eb909-9878-48ca-a8fc-93b0e6b6d7f7/volumes" Sep 30 08:02:41 crc kubenswrapper[4760]: I0930 08:02:41.083249 4760 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c34c9e7c-e366-4179-ab98-c01d5b2cfc3d" path="/var/lib/kubelet/pods/c34c9e7c-e366-4179-ab98-c01d5b2cfc3d/volumes" Sep 30 08:02:41 crc kubenswrapper[4760]: I0930 08:02:41.326467 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-zxgfv" event={"ID":"23f502d4-3801-4388-b442-22f60146dcf2","Type":"ContainerDied","Data":"1460f5d7a4e50761de516e01eff25c6c5326a87b71ec8fd2dec984e4de693c22"} Sep 30 08:02:41 crc kubenswrapper[4760]: I0930 08:02:41.326899 4760 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1460f5d7a4e50761de516e01eff25c6c5326a87b71ec8fd2dec984e4de693c22" Sep 30 08:02:41 crc kubenswrapper[4760]: I0930 08:02:41.326536 4760 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-zxgfv" Sep 30 08:02:41 crc kubenswrapper[4760]: I0930 08:02:41.442729 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-mpw2v"] Sep 30 08:02:41 crc kubenswrapper[4760]: E0930 08:02:41.443660 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="23f502d4-3801-4388-b442-22f60146dcf2" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Sep 30 08:02:41 crc kubenswrapper[4760]: I0930 08:02:41.443708 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="23f502d4-3801-4388-b442-22f60146dcf2" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Sep 30 08:02:41 crc kubenswrapper[4760]: I0930 08:02:41.444201 4760 memory_manager.go:354] "RemoveStaleState removing state" podUID="23f502d4-3801-4388-b442-22f60146dcf2" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Sep 30 08:02:41 crc kubenswrapper[4760]: I0930 08:02:41.445787 4760 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-mpw2v" Sep 30 08:02:41 crc kubenswrapper[4760]: I0930 08:02:41.455835 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-mpw2v"] Sep 30 08:02:41 crc kubenswrapper[4760]: I0930 08:02:41.461264 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-8gxrf" Sep 30 08:02:41 crc kubenswrapper[4760]: I0930 08:02:41.461821 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Sep 30 08:02:41 crc kubenswrapper[4760]: I0930 08:02:41.462139 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Sep 30 08:02:41 crc kubenswrapper[4760]: I0930 08:02:41.462197 4760 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Sep 30 08:02:41 crc kubenswrapper[4760]: I0930 08:02:41.577565 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/fc1682c5-7e4d-43a1-89f4-b40761683742-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-mpw2v\" (UID: \"fc1682c5-7e4d-43a1-89f4-b40761683742\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-mpw2v" Sep 30 08:02:41 crc kubenswrapper[4760]: I0930 08:02:41.577631 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xb6xs\" (UniqueName: \"kubernetes.io/projected/fc1682c5-7e4d-43a1-89f4-b40761683742-kube-api-access-xb6xs\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-mpw2v\" (UID: \"fc1682c5-7e4d-43a1-89f4-b40761683742\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-mpw2v" Sep 30 08:02:41 crc kubenswrapper[4760]: I0930 08:02:41.577822 4760 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/fc1682c5-7e4d-43a1-89f4-b40761683742-ssh-key\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-mpw2v\" (UID: \"fc1682c5-7e4d-43a1-89f4-b40761683742\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-mpw2v" Sep 30 08:02:41 crc kubenswrapper[4760]: I0930 08:02:41.679458 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/fc1682c5-7e4d-43a1-89f4-b40761683742-ssh-key\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-mpw2v\" (UID: \"fc1682c5-7e4d-43a1-89f4-b40761683742\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-mpw2v" Sep 30 08:02:41 crc kubenswrapper[4760]: I0930 08:02:41.679566 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/fc1682c5-7e4d-43a1-89f4-b40761683742-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-mpw2v\" (UID: \"fc1682c5-7e4d-43a1-89f4-b40761683742\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-mpw2v" Sep 30 08:02:41 crc kubenswrapper[4760]: I0930 08:02:41.679617 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xb6xs\" (UniqueName: \"kubernetes.io/projected/fc1682c5-7e4d-43a1-89f4-b40761683742-kube-api-access-xb6xs\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-mpw2v\" (UID: \"fc1682c5-7e4d-43a1-89f4-b40761683742\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-mpw2v" Sep 30 08:02:41 crc kubenswrapper[4760]: I0930 08:02:41.684014 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/fc1682c5-7e4d-43a1-89f4-b40761683742-ssh-key\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-mpw2v\" (UID: 
\"fc1682c5-7e4d-43a1-89f4-b40761683742\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-mpw2v" Sep 30 08:02:41 crc kubenswrapper[4760]: I0930 08:02:41.685113 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/fc1682c5-7e4d-43a1-89f4-b40761683742-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-mpw2v\" (UID: \"fc1682c5-7e4d-43a1-89f4-b40761683742\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-mpw2v" Sep 30 08:02:41 crc kubenswrapper[4760]: I0930 08:02:41.707633 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xb6xs\" (UniqueName: \"kubernetes.io/projected/fc1682c5-7e4d-43a1-89f4-b40761683742-kube-api-access-xb6xs\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-mpw2v\" (UID: \"fc1682c5-7e4d-43a1-89f4-b40761683742\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-mpw2v" Sep 30 08:02:41 crc kubenswrapper[4760]: I0930 08:02:41.774118 4760 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-mpw2v" Sep 30 08:02:42 crc kubenswrapper[4760]: I0930 08:02:42.318345 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-mpw2v"] Sep 30 08:02:42 crc kubenswrapper[4760]: W0930 08:02:42.322708 4760 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfc1682c5_7e4d_43a1_89f4_b40761683742.slice/crio-daa54067443f1cf342a1f3bda8e509f378c6b942cf40bab32d6a0ab5d3bea044 WatchSource:0}: Error finding container daa54067443f1cf342a1f3bda8e509f378c6b942cf40bab32d6a0ab5d3bea044: Status 404 returned error can't find the container with id daa54067443f1cf342a1f3bda8e509f378c6b942cf40bab32d6a0ab5d3bea044 Sep 30 08:02:42 crc kubenswrapper[4760]: I0930 08:02:42.336208 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-mpw2v" event={"ID":"fc1682c5-7e4d-43a1-89f4-b40761683742","Type":"ContainerStarted","Data":"daa54067443f1cf342a1f3bda8e509f378c6b942cf40bab32d6a0ab5d3bea044"} Sep 30 08:02:43 crc kubenswrapper[4760]: I0930 08:02:43.348573 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-mpw2v" event={"ID":"fc1682c5-7e4d-43a1-89f4-b40761683742","Type":"ContainerStarted","Data":"7752182d277fab7961e822a4f8cb1b74026340c56344fce35bb86fb236ec7f83"} Sep 30 08:02:43 crc kubenswrapper[4760]: I0930 08:02:43.378911 4760 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-mpw2v" podStartSLOduration=1.825120311 podStartE2EDuration="2.378895195s" podCreationTimestamp="2025-09-30 08:02:41 +0000 UTC" firstStartedPulling="2025-09-30 08:02:42.325252277 +0000 UTC m=+1747.968158709" lastFinishedPulling="2025-09-30 08:02:42.879027131 +0000 UTC m=+1748.521933593" 
observedRunningTime="2025-09-30 08:02:43.370836849 +0000 UTC m=+1749.013743261" watchObservedRunningTime="2025-09-30 08:02:43.378895195 +0000 UTC m=+1749.021801607" Sep 30 08:02:45 crc kubenswrapper[4760]: I0930 08:02:45.165496 4760 scope.go:117] "RemoveContainer" containerID="478b02568ea9ae9e32dccd85f91feb1777b134982af0cc2b37f6ebb0703477c8" Sep 30 08:02:45 crc kubenswrapper[4760]: I0930 08:02:45.207961 4760 scope.go:117] "RemoveContainer" containerID="b97542524e5f69ec62652adadc07d21dc16b82f150191c1c3d5136520b73aba7" Sep 30 08:02:45 crc kubenswrapper[4760]: I0930 08:02:45.247425 4760 scope.go:117] "RemoveContainer" containerID="ec711e8c3a7507a28e3a6d934552ff9831df2b9170f1ca7e50df57208ddf6955" Sep 30 08:02:45 crc kubenswrapper[4760]: I0930 08:02:45.280484 4760 scope.go:117] "RemoveContainer" containerID="2f93dcb2a5671b6b0fc5400122ea9d2c23b57f214cd9ec9ae187a377b8d40206" Sep 30 08:02:45 crc kubenswrapper[4760]: I0930 08:02:45.326387 4760 scope.go:117] "RemoveContainer" containerID="1d1c275a662b02cb610bdf134f74cd28cfc1debbed525b5ec2ee60f0662b2604" Sep 30 08:02:45 crc kubenswrapper[4760]: I0930 08:02:45.386275 4760 scope.go:117] "RemoveContainer" containerID="4e75feedf602ccc119e27c65423fb9dd78adee09529c1a10d43048a36aa5ea6f" Sep 30 08:02:52 crc kubenswrapper[4760]: I0930 08:02:52.067807 4760 scope.go:117] "RemoveContainer" containerID="6e360b7dc465c02adf99299c3bccec940fec7c45f12ed3788ad43373bef9d4f8" Sep 30 08:02:52 crc kubenswrapper[4760]: E0930 08:02:52.068699 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f2lrk_openshift-machine-config-operator(7a9c8270-6964-4886-87d0-227b05b76da4)\"" pod="openshift-machine-config-operator/machine-config-daemon-f2lrk" podUID="7a9c8270-6964-4886-87d0-227b05b76da4" Sep 30 08:03:06 crc kubenswrapper[4760]: I0930 08:03:06.066862 4760 scope.go:117] 
"RemoveContainer" containerID="6e360b7dc465c02adf99299c3bccec940fec7c45f12ed3788ad43373bef9d4f8" Sep 30 08:03:06 crc kubenswrapper[4760]: E0930 08:03:06.069017 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f2lrk_openshift-machine-config-operator(7a9c8270-6964-4886-87d0-227b05b76da4)\"" pod="openshift-machine-config-operator/machine-config-daemon-f2lrk" podUID="7a9c8270-6964-4886-87d0-227b05b76da4" Sep 30 08:03:18 crc kubenswrapper[4760]: I0930 08:03:18.067960 4760 scope.go:117] "RemoveContainer" containerID="6e360b7dc465c02adf99299c3bccec940fec7c45f12ed3788ad43373bef9d4f8" Sep 30 08:03:18 crc kubenswrapper[4760]: E0930 08:03:18.068795 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f2lrk_openshift-machine-config-operator(7a9c8270-6964-4886-87d0-227b05b76da4)\"" pod="openshift-machine-config-operator/machine-config-daemon-f2lrk" podUID="7a9c8270-6964-4886-87d0-227b05b76da4" Sep 30 08:03:25 crc kubenswrapper[4760]: I0930 08:03:25.039464 4760 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-cell-mapping-blsm9"] Sep 30 08:03:25 crc kubenswrapper[4760]: I0930 08:03:25.049476 4760 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-cell-mapping-blsm9"] Sep 30 08:03:25 crc kubenswrapper[4760]: I0930 08:03:25.079362 4760 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cb3756b4-ea8a-41f4-a8db-351088780965" path="/var/lib/kubelet/pods/cb3756b4-ea8a-41f4-a8db-351088780965/volumes" Sep 30 08:03:33 crc kubenswrapper[4760]: I0930 08:03:33.067268 4760 scope.go:117] "RemoveContainer" 
containerID="6e360b7dc465c02adf99299c3bccec940fec7c45f12ed3788ad43373bef9d4f8" Sep 30 08:03:33 crc kubenswrapper[4760]: I0930 08:03:33.898272 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-f2lrk" event={"ID":"7a9c8270-6964-4886-87d0-227b05b76da4","Type":"ContainerStarted","Data":"96fa7f3155734522f8d82f257a6244d3c4526c33face1ad5feeb1f8274d4d3e8"} Sep 30 08:03:38 crc kubenswrapper[4760]: I0930 08:03:38.948086 4760 generic.go:334] "Generic (PLEG): container finished" podID="fc1682c5-7e4d-43a1-89f4-b40761683742" containerID="7752182d277fab7961e822a4f8cb1b74026340c56344fce35bb86fb236ec7f83" exitCode=0 Sep 30 08:03:38 crc kubenswrapper[4760]: I0930 08:03:38.948497 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-mpw2v" event={"ID":"fc1682c5-7e4d-43a1-89f4-b40761683742","Type":"ContainerDied","Data":"7752182d277fab7961e822a4f8cb1b74026340c56344fce35bb86fb236ec7f83"} Sep 30 08:03:40 crc kubenswrapper[4760]: I0930 08:03:40.350552 4760 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-mpw2v" Sep 30 08:03:40 crc kubenswrapper[4760]: I0930 08:03:40.498009 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xb6xs\" (UniqueName: \"kubernetes.io/projected/fc1682c5-7e4d-43a1-89f4-b40761683742-kube-api-access-xb6xs\") pod \"fc1682c5-7e4d-43a1-89f4-b40761683742\" (UID: \"fc1682c5-7e4d-43a1-89f4-b40761683742\") " Sep 30 08:03:40 crc kubenswrapper[4760]: I0930 08:03:40.498688 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/fc1682c5-7e4d-43a1-89f4-b40761683742-inventory\") pod \"fc1682c5-7e4d-43a1-89f4-b40761683742\" (UID: \"fc1682c5-7e4d-43a1-89f4-b40761683742\") " Sep 30 08:03:40 crc kubenswrapper[4760]: I0930 08:03:40.498768 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/fc1682c5-7e4d-43a1-89f4-b40761683742-ssh-key\") pod \"fc1682c5-7e4d-43a1-89f4-b40761683742\" (UID: \"fc1682c5-7e4d-43a1-89f4-b40761683742\") " Sep 30 08:03:40 crc kubenswrapper[4760]: I0930 08:03:40.523514 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fc1682c5-7e4d-43a1-89f4-b40761683742-kube-api-access-xb6xs" (OuterVolumeSpecName: "kube-api-access-xb6xs") pod "fc1682c5-7e4d-43a1-89f4-b40761683742" (UID: "fc1682c5-7e4d-43a1-89f4-b40761683742"). InnerVolumeSpecName "kube-api-access-xb6xs". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 08:03:40 crc kubenswrapper[4760]: I0930 08:03:40.556460 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fc1682c5-7e4d-43a1-89f4-b40761683742-inventory" (OuterVolumeSpecName: "inventory") pod "fc1682c5-7e4d-43a1-89f4-b40761683742" (UID: "fc1682c5-7e4d-43a1-89f4-b40761683742"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 08:03:40 crc kubenswrapper[4760]: I0930 08:03:40.601607 4760 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xb6xs\" (UniqueName: \"kubernetes.io/projected/fc1682c5-7e4d-43a1-89f4-b40761683742-kube-api-access-xb6xs\") on node \"crc\" DevicePath \"\"" Sep 30 08:03:40 crc kubenswrapper[4760]: I0930 08:03:40.601646 4760 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/fc1682c5-7e4d-43a1-89f4-b40761683742-inventory\") on node \"crc\" DevicePath \"\"" Sep 30 08:03:40 crc kubenswrapper[4760]: I0930 08:03:40.606577 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fc1682c5-7e4d-43a1-89f4-b40761683742-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "fc1682c5-7e4d-43a1-89f4-b40761683742" (UID: "fc1682c5-7e4d-43a1-89f4-b40761683742"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 08:03:40 crc kubenswrapper[4760]: I0930 08:03:40.703934 4760 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/fc1682c5-7e4d-43a1-89f4-b40761683742-ssh-key\") on node \"crc\" DevicePath \"\"" Sep 30 08:03:40 crc kubenswrapper[4760]: I0930 08:03:40.965608 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-mpw2v" event={"ID":"fc1682c5-7e4d-43a1-89f4-b40761683742","Type":"ContainerDied","Data":"daa54067443f1cf342a1f3bda8e509f378c6b942cf40bab32d6a0ab5d3bea044"} Sep 30 08:03:40 crc kubenswrapper[4760]: I0930 08:03:40.965646 4760 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="daa54067443f1cf342a1f3bda8e509f378c6b942cf40bab32d6a0ab5d3bea044" Sep 30 08:03:40 crc kubenswrapper[4760]: I0930 08:03:40.965667 4760 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-mpw2v" Sep 30 08:03:41 crc kubenswrapper[4760]: I0930 08:03:41.057663 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-m7lks"] Sep 30 08:03:41 crc kubenswrapper[4760]: E0930 08:03:41.058092 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fc1682c5-7e4d-43a1-89f4-b40761683742" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Sep 30 08:03:41 crc kubenswrapper[4760]: I0930 08:03:41.058109 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="fc1682c5-7e4d-43a1-89f4-b40761683742" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Sep 30 08:03:41 crc kubenswrapper[4760]: I0930 08:03:41.058385 4760 memory_manager.go:354] "RemoveStaleState removing state" podUID="fc1682c5-7e4d-43a1-89f4-b40761683742" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Sep 30 08:03:41 crc kubenswrapper[4760]: I0930 08:03:41.059237 4760 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-m7lks" Sep 30 08:03:41 crc kubenswrapper[4760]: I0930 08:03:41.063168 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Sep 30 08:03:41 crc kubenswrapper[4760]: I0930 08:03:41.063497 4760 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Sep 30 08:03:41 crc kubenswrapper[4760]: I0930 08:03:41.063611 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-8gxrf" Sep 30 08:03:41 crc kubenswrapper[4760]: I0930 08:03:41.063714 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Sep 30 08:03:41 crc kubenswrapper[4760]: I0930 08:03:41.065043 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-m7lks"] Sep 30 08:03:41 crc kubenswrapper[4760]: I0930 08:03:41.214169 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/00487f96-583f-4ae8-bd0d-7fb932d86feb-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-m7lks\" (UID: \"00487f96-583f-4ae8-bd0d-7fb932d86feb\") " pod="openstack/ssh-known-hosts-edpm-deployment-m7lks" Sep 30 08:03:41 crc kubenswrapper[4760]: I0930 08:03:41.214370 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/00487f96-583f-4ae8-bd0d-7fb932d86feb-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-m7lks\" (UID: \"00487f96-583f-4ae8-bd0d-7fb932d86feb\") " pod="openstack/ssh-known-hosts-edpm-deployment-m7lks" Sep 30 08:03:41 crc kubenswrapper[4760]: I0930 08:03:41.214505 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-hmhbw\" (UniqueName: \"kubernetes.io/projected/00487f96-583f-4ae8-bd0d-7fb932d86feb-kube-api-access-hmhbw\") pod \"ssh-known-hosts-edpm-deployment-m7lks\" (UID: \"00487f96-583f-4ae8-bd0d-7fb932d86feb\") " pod="openstack/ssh-known-hosts-edpm-deployment-m7lks" Sep 30 08:03:41 crc kubenswrapper[4760]: I0930 08:03:41.315894 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/00487f96-583f-4ae8-bd0d-7fb932d86feb-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-m7lks\" (UID: \"00487f96-583f-4ae8-bd0d-7fb932d86feb\") " pod="openstack/ssh-known-hosts-edpm-deployment-m7lks" Sep 30 08:03:41 crc kubenswrapper[4760]: I0930 08:03:41.315990 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hmhbw\" (UniqueName: \"kubernetes.io/projected/00487f96-583f-4ae8-bd0d-7fb932d86feb-kube-api-access-hmhbw\") pod \"ssh-known-hosts-edpm-deployment-m7lks\" (UID: \"00487f96-583f-4ae8-bd0d-7fb932d86feb\") " pod="openstack/ssh-known-hosts-edpm-deployment-m7lks" Sep 30 08:03:41 crc kubenswrapper[4760]: I0930 08:03:41.316142 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/00487f96-583f-4ae8-bd0d-7fb932d86feb-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-m7lks\" (UID: \"00487f96-583f-4ae8-bd0d-7fb932d86feb\") " pod="openstack/ssh-known-hosts-edpm-deployment-m7lks" Sep 30 08:03:41 crc kubenswrapper[4760]: I0930 08:03:41.320009 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/00487f96-583f-4ae8-bd0d-7fb932d86feb-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-m7lks\" (UID: \"00487f96-583f-4ae8-bd0d-7fb932d86feb\") " pod="openstack/ssh-known-hosts-edpm-deployment-m7lks" Sep 30 08:03:41 crc kubenswrapper[4760]: I0930 
08:03:41.320417 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/00487f96-583f-4ae8-bd0d-7fb932d86feb-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-m7lks\" (UID: \"00487f96-583f-4ae8-bd0d-7fb932d86feb\") " pod="openstack/ssh-known-hosts-edpm-deployment-m7lks" Sep 30 08:03:41 crc kubenswrapper[4760]: I0930 08:03:41.337072 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hmhbw\" (UniqueName: \"kubernetes.io/projected/00487f96-583f-4ae8-bd0d-7fb932d86feb-kube-api-access-hmhbw\") pod \"ssh-known-hosts-edpm-deployment-m7lks\" (UID: \"00487f96-583f-4ae8-bd0d-7fb932d86feb\") " pod="openstack/ssh-known-hosts-edpm-deployment-m7lks" Sep 30 08:03:41 crc kubenswrapper[4760]: I0930 08:03:41.385971 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-m7lks" Sep 30 08:03:41 crc kubenswrapper[4760]: I0930 08:03:41.985265 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-m7lks"] Sep 30 08:03:41 crc kubenswrapper[4760]: W0930 08:03:41.986464 4760 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod00487f96_583f_4ae8_bd0d_7fb932d86feb.slice/crio-50e8d7f8f785630a27e9261177b3e52e8651659edcd89a6dd7f66ee4a5dcf283 WatchSource:0}: Error finding container 50e8d7f8f785630a27e9261177b3e52e8651659edcd89a6dd7f66ee4a5dcf283: Status 404 returned error can't find the container with id 50e8d7f8f785630a27e9261177b3e52e8651659edcd89a6dd7f66ee4a5dcf283 Sep 30 08:03:43 crc kubenswrapper[4760]: I0930 08:03:43.000774 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-m7lks" event={"ID":"00487f96-583f-4ae8-bd0d-7fb932d86feb","Type":"ContainerStarted","Data":"940444880d30238cf72ee65ca98c81c39201de55ded0446cebf324e49e999aa8"} Sep 30 
08:03:43 crc kubenswrapper[4760]: I0930 08:03:43.001552 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-m7lks" event={"ID":"00487f96-583f-4ae8-bd0d-7fb932d86feb","Type":"ContainerStarted","Data":"50e8d7f8f785630a27e9261177b3e52e8651659edcd89a6dd7f66ee4a5dcf283"} Sep 30 08:03:43 crc kubenswrapper[4760]: I0930 08:03:43.030893 4760 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ssh-known-hosts-edpm-deployment-m7lks" podStartSLOduration=1.605647008 podStartE2EDuration="2.030871309s" podCreationTimestamp="2025-09-30 08:03:41 +0000 UTC" firstStartedPulling="2025-09-30 08:03:41.98846503 +0000 UTC m=+1807.631371432" lastFinishedPulling="2025-09-30 08:03:42.413689321 +0000 UTC m=+1808.056595733" observedRunningTime="2025-09-30 08:03:43.023787528 +0000 UTC m=+1808.666693940" watchObservedRunningTime="2025-09-30 08:03:43.030871309 +0000 UTC m=+1808.673777721" Sep 30 08:03:45 crc kubenswrapper[4760]: I0930 08:03:45.534489 4760 scope.go:117] "RemoveContainer" containerID="2cb9abb39970e150113471fc805983549a28fa77d5535df31719c6613410f635" Sep 30 08:03:51 crc kubenswrapper[4760]: I0930 08:03:51.088444 4760 generic.go:334] "Generic (PLEG): container finished" podID="00487f96-583f-4ae8-bd0d-7fb932d86feb" containerID="940444880d30238cf72ee65ca98c81c39201de55ded0446cebf324e49e999aa8" exitCode=0 Sep 30 08:03:51 crc kubenswrapper[4760]: I0930 08:03:51.088524 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-m7lks" event={"ID":"00487f96-583f-4ae8-bd0d-7fb932d86feb","Type":"ContainerDied","Data":"940444880d30238cf72ee65ca98c81c39201de55ded0446cebf324e49e999aa8"} Sep 30 08:03:52 crc kubenswrapper[4760]: I0930 08:03:52.553613 4760 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-m7lks" Sep 30 08:03:52 crc kubenswrapper[4760]: I0930 08:03:52.686685 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/00487f96-583f-4ae8-bd0d-7fb932d86feb-ssh-key-openstack-edpm-ipam\") pod \"00487f96-583f-4ae8-bd0d-7fb932d86feb\" (UID: \"00487f96-583f-4ae8-bd0d-7fb932d86feb\") " Sep 30 08:03:52 crc kubenswrapper[4760]: I0930 08:03:52.686724 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hmhbw\" (UniqueName: \"kubernetes.io/projected/00487f96-583f-4ae8-bd0d-7fb932d86feb-kube-api-access-hmhbw\") pod \"00487f96-583f-4ae8-bd0d-7fb932d86feb\" (UID: \"00487f96-583f-4ae8-bd0d-7fb932d86feb\") " Sep 30 08:03:52 crc kubenswrapper[4760]: I0930 08:03:52.687033 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/00487f96-583f-4ae8-bd0d-7fb932d86feb-inventory-0\") pod \"00487f96-583f-4ae8-bd0d-7fb932d86feb\" (UID: \"00487f96-583f-4ae8-bd0d-7fb932d86feb\") " Sep 30 08:03:52 crc kubenswrapper[4760]: I0930 08:03:52.705556 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/00487f96-583f-4ae8-bd0d-7fb932d86feb-kube-api-access-hmhbw" (OuterVolumeSpecName: "kube-api-access-hmhbw") pod "00487f96-583f-4ae8-bd0d-7fb932d86feb" (UID: "00487f96-583f-4ae8-bd0d-7fb932d86feb"). InnerVolumeSpecName "kube-api-access-hmhbw". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 08:03:52 crc kubenswrapper[4760]: I0930 08:03:52.736300 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/00487f96-583f-4ae8-bd0d-7fb932d86feb-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "00487f96-583f-4ae8-bd0d-7fb932d86feb" (UID: "00487f96-583f-4ae8-bd0d-7fb932d86feb"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 08:03:52 crc kubenswrapper[4760]: I0930 08:03:52.757429 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/00487f96-583f-4ae8-bd0d-7fb932d86feb-inventory-0" (OuterVolumeSpecName: "inventory-0") pod "00487f96-583f-4ae8-bd0d-7fb932d86feb" (UID: "00487f96-583f-4ae8-bd0d-7fb932d86feb"). InnerVolumeSpecName "inventory-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 08:03:52 crc kubenswrapper[4760]: I0930 08:03:52.789427 4760 reconciler_common.go:293] "Volume detached for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/00487f96-583f-4ae8-bd0d-7fb932d86feb-inventory-0\") on node \"crc\" DevicePath \"\"" Sep 30 08:03:52 crc kubenswrapper[4760]: I0930 08:03:52.789475 4760 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/00487f96-583f-4ae8-bd0d-7fb932d86feb-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Sep 30 08:03:52 crc kubenswrapper[4760]: I0930 08:03:52.789489 4760 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hmhbw\" (UniqueName: \"kubernetes.io/projected/00487f96-583f-4ae8-bd0d-7fb932d86feb-kube-api-access-hmhbw\") on node \"crc\" DevicePath \"\"" Sep 30 08:03:53 crc kubenswrapper[4760]: I0930 08:03:53.108967 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-m7lks" 
event={"ID":"00487f96-583f-4ae8-bd0d-7fb932d86feb","Type":"ContainerDied","Data":"50e8d7f8f785630a27e9261177b3e52e8651659edcd89a6dd7f66ee4a5dcf283"} Sep 30 08:03:53 crc kubenswrapper[4760]: I0930 08:03:53.109016 4760 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="50e8d7f8f785630a27e9261177b3e52e8651659edcd89a6dd7f66ee4a5dcf283" Sep 30 08:03:53 crc kubenswrapper[4760]: I0930 08:03:53.109078 4760 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-m7lks" Sep 30 08:03:53 crc kubenswrapper[4760]: I0930 08:03:53.230651 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-vznk5"] Sep 30 08:03:53 crc kubenswrapper[4760]: E0930 08:03:53.231104 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="00487f96-583f-4ae8-bd0d-7fb932d86feb" containerName="ssh-known-hosts-edpm-deployment" Sep 30 08:03:53 crc kubenswrapper[4760]: I0930 08:03:53.231120 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="00487f96-583f-4ae8-bd0d-7fb932d86feb" containerName="ssh-known-hosts-edpm-deployment" Sep 30 08:03:53 crc kubenswrapper[4760]: I0930 08:03:53.231363 4760 memory_manager.go:354] "RemoveStaleState removing state" podUID="00487f96-583f-4ae8-bd0d-7fb932d86feb" containerName="ssh-known-hosts-edpm-deployment" Sep 30 08:03:53 crc kubenswrapper[4760]: I0930 08:03:53.232044 4760 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-vznk5" Sep 30 08:03:53 crc kubenswrapper[4760]: I0930 08:03:53.234392 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Sep 30 08:03:53 crc kubenswrapper[4760]: I0930 08:03:53.234447 4760 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Sep 30 08:03:53 crc kubenswrapper[4760]: I0930 08:03:53.234687 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Sep 30 08:03:53 crc kubenswrapper[4760]: I0930 08:03:53.238202 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-8gxrf" Sep 30 08:03:53 crc kubenswrapper[4760]: I0930 08:03:53.239954 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-vznk5"] Sep 30 08:03:53 crc kubenswrapper[4760]: I0930 08:03:53.300985 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/820e5332-bfcf-4cca-8079-e3d26cc62517-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-vznk5\" (UID: \"820e5332-bfcf-4cca-8079-e3d26cc62517\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-vznk5" Sep 30 08:03:53 crc kubenswrapper[4760]: I0930 08:03:53.301084 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gzks6\" (UniqueName: \"kubernetes.io/projected/820e5332-bfcf-4cca-8079-e3d26cc62517-kube-api-access-gzks6\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-vznk5\" (UID: \"820e5332-bfcf-4cca-8079-e3d26cc62517\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-vznk5" Sep 30 08:03:53 crc kubenswrapper[4760]: I0930 08:03:53.301152 4760 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/820e5332-bfcf-4cca-8079-e3d26cc62517-ssh-key\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-vznk5\" (UID: \"820e5332-bfcf-4cca-8079-e3d26cc62517\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-vznk5" Sep 30 08:03:53 crc kubenswrapper[4760]: I0930 08:03:53.403182 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/820e5332-bfcf-4cca-8079-e3d26cc62517-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-vznk5\" (UID: \"820e5332-bfcf-4cca-8079-e3d26cc62517\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-vznk5" Sep 30 08:03:53 crc kubenswrapper[4760]: I0930 08:03:53.403352 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gzks6\" (UniqueName: \"kubernetes.io/projected/820e5332-bfcf-4cca-8079-e3d26cc62517-kube-api-access-gzks6\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-vznk5\" (UID: \"820e5332-bfcf-4cca-8079-e3d26cc62517\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-vznk5" Sep 30 08:03:53 crc kubenswrapper[4760]: I0930 08:03:53.403448 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/820e5332-bfcf-4cca-8079-e3d26cc62517-ssh-key\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-vznk5\" (UID: \"820e5332-bfcf-4cca-8079-e3d26cc62517\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-vznk5" Sep 30 08:03:53 crc kubenswrapper[4760]: I0930 08:03:53.408163 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/820e5332-bfcf-4cca-8079-e3d26cc62517-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-vznk5\" (UID: \"820e5332-bfcf-4cca-8079-e3d26cc62517\") " 
pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-vznk5" Sep 30 08:03:53 crc kubenswrapper[4760]: I0930 08:03:53.408679 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/820e5332-bfcf-4cca-8079-e3d26cc62517-ssh-key\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-vznk5\" (UID: \"820e5332-bfcf-4cca-8079-e3d26cc62517\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-vznk5" Sep 30 08:03:53 crc kubenswrapper[4760]: I0930 08:03:53.431658 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gzks6\" (UniqueName: \"kubernetes.io/projected/820e5332-bfcf-4cca-8079-e3d26cc62517-kube-api-access-gzks6\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-vznk5\" (UID: \"820e5332-bfcf-4cca-8079-e3d26cc62517\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-vznk5" Sep 30 08:03:53 crc kubenswrapper[4760]: I0930 08:03:53.548781 4760 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-vznk5" Sep 30 08:03:54 crc kubenswrapper[4760]: I0930 08:03:54.070991 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-vznk5"] Sep 30 08:03:54 crc kubenswrapper[4760]: I0930 08:03:54.076202 4760 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Sep 30 08:03:54 crc kubenswrapper[4760]: I0930 08:03:54.117263 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-vznk5" event={"ID":"820e5332-bfcf-4cca-8079-e3d26cc62517","Type":"ContainerStarted","Data":"6511735dd9f5e4f2627ee5b36183a1f9fb7d49dabcd102e05ec428d89a09f8b1"} Sep 30 08:03:55 crc kubenswrapper[4760]: I0930 08:03:55.133934 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-vznk5" event={"ID":"820e5332-bfcf-4cca-8079-e3d26cc62517","Type":"ContainerStarted","Data":"b038da9d6b06b28e48022e1f9dfd316cdca788d8e582fd3e3527ad63111229cb"} Sep 30 08:03:55 crc kubenswrapper[4760]: I0930 08:03:55.151566 4760 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-vznk5" podStartSLOduration=1.763899514 podStartE2EDuration="2.151544083s" podCreationTimestamp="2025-09-30 08:03:53 +0000 UTC" firstStartedPulling="2025-09-30 08:03:54.075978434 +0000 UTC m=+1819.718884836" lastFinishedPulling="2025-09-30 08:03:54.463622983 +0000 UTC m=+1820.106529405" observedRunningTime="2025-09-30 08:03:55.150189538 +0000 UTC m=+1820.793095960" watchObservedRunningTime="2025-09-30 08:03:55.151544083 +0000 UTC m=+1820.794450495" Sep 30 08:04:04 crc kubenswrapper[4760]: I0930 08:04:04.234787 4760 generic.go:334] "Generic (PLEG): container finished" podID="820e5332-bfcf-4cca-8079-e3d26cc62517" containerID="b038da9d6b06b28e48022e1f9dfd316cdca788d8e582fd3e3527ad63111229cb" 
exitCode=0 Sep 30 08:04:04 crc kubenswrapper[4760]: I0930 08:04:04.234893 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-vznk5" event={"ID":"820e5332-bfcf-4cca-8079-e3d26cc62517","Type":"ContainerDied","Data":"b038da9d6b06b28e48022e1f9dfd316cdca788d8e582fd3e3527ad63111229cb"} Sep 30 08:04:05 crc kubenswrapper[4760]: I0930 08:04:05.665638 4760 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-vznk5" Sep 30 08:04:05 crc kubenswrapper[4760]: I0930 08:04:05.770327 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/820e5332-bfcf-4cca-8079-e3d26cc62517-ssh-key\") pod \"820e5332-bfcf-4cca-8079-e3d26cc62517\" (UID: \"820e5332-bfcf-4cca-8079-e3d26cc62517\") " Sep 30 08:04:05 crc kubenswrapper[4760]: I0930 08:04:05.770446 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gzks6\" (UniqueName: \"kubernetes.io/projected/820e5332-bfcf-4cca-8079-e3d26cc62517-kube-api-access-gzks6\") pod \"820e5332-bfcf-4cca-8079-e3d26cc62517\" (UID: \"820e5332-bfcf-4cca-8079-e3d26cc62517\") " Sep 30 08:04:05 crc kubenswrapper[4760]: I0930 08:04:05.770751 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/820e5332-bfcf-4cca-8079-e3d26cc62517-inventory\") pod \"820e5332-bfcf-4cca-8079-e3d26cc62517\" (UID: \"820e5332-bfcf-4cca-8079-e3d26cc62517\") " Sep 30 08:04:05 crc kubenswrapper[4760]: I0930 08:04:05.779451 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/820e5332-bfcf-4cca-8079-e3d26cc62517-kube-api-access-gzks6" (OuterVolumeSpecName: "kube-api-access-gzks6") pod "820e5332-bfcf-4cca-8079-e3d26cc62517" (UID: "820e5332-bfcf-4cca-8079-e3d26cc62517"). 
InnerVolumeSpecName "kube-api-access-gzks6". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 08:04:05 crc kubenswrapper[4760]: I0930 08:04:05.801756 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/820e5332-bfcf-4cca-8079-e3d26cc62517-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "820e5332-bfcf-4cca-8079-e3d26cc62517" (UID: "820e5332-bfcf-4cca-8079-e3d26cc62517"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 08:04:05 crc kubenswrapper[4760]: I0930 08:04:05.814732 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/820e5332-bfcf-4cca-8079-e3d26cc62517-inventory" (OuterVolumeSpecName: "inventory") pod "820e5332-bfcf-4cca-8079-e3d26cc62517" (UID: "820e5332-bfcf-4cca-8079-e3d26cc62517"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 08:04:05 crc kubenswrapper[4760]: I0930 08:04:05.873408 4760 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/820e5332-bfcf-4cca-8079-e3d26cc62517-inventory\") on node \"crc\" DevicePath \"\"" Sep 30 08:04:05 crc kubenswrapper[4760]: I0930 08:04:05.873446 4760 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/820e5332-bfcf-4cca-8079-e3d26cc62517-ssh-key\") on node \"crc\" DevicePath \"\"" Sep 30 08:04:05 crc kubenswrapper[4760]: I0930 08:04:05.873458 4760 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gzks6\" (UniqueName: \"kubernetes.io/projected/820e5332-bfcf-4cca-8079-e3d26cc62517-kube-api-access-gzks6\") on node \"crc\" DevicePath \"\"" Sep 30 08:04:06 crc kubenswrapper[4760]: I0930 08:04:06.256414 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-vznk5" 
event={"ID":"820e5332-bfcf-4cca-8079-e3d26cc62517","Type":"ContainerDied","Data":"6511735dd9f5e4f2627ee5b36183a1f9fb7d49dabcd102e05ec428d89a09f8b1"} Sep 30 08:04:06 crc kubenswrapper[4760]: I0930 08:04:06.257095 4760 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6511735dd9f5e4f2627ee5b36183a1f9fb7d49dabcd102e05ec428d89a09f8b1" Sep 30 08:04:06 crc kubenswrapper[4760]: I0930 08:04:06.256510 4760 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-vznk5" Sep 30 08:04:06 crc kubenswrapper[4760]: I0930 08:04:06.345247 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-6xmjs"] Sep 30 08:04:06 crc kubenswrapper[4760]: E0930 08:04:06.345660 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="820e5332-bfcf-4cca-8079-e3d26cc62517" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Sep 30 08:04:06 crc kubenswrapper[4760]: I0930 08:04:06.345673 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="820e5332-bfcf-4cca-8079-e3d26cc62517" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Sep 30 08:04:06 crc kubenswrapper[4760]: I0930 08:04:06.345855 4760 memory_manager.go:354] "RemoveStaleState removing state" podUID="820e5332-bfcf-4cca-8079-e3d26cc62517" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Sep 30 08:04:06 crc kubenswrapper[4760]: I0930 08:04:06.346746 4760 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-6xmjs" Sep 30 08:04:06 crc kubenswrapper[4760]: I0930 08:04:06.349697 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Sep 30 08:04:06 crc kubenswrapper[4760]: I0930 08:04:06.349775 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-8gxrf" Sep 30 08:04:06 crc kubenswrapper[4760]: I0930 08:04:06.351863 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Sep 30 08:04:06 crc kubenswrapper[4760]: I0930 08:04:06.353334 4760 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Sep 30 08:04:06 crc kubenswrapper[4760]: I0930 08:04:06.355981 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-6xmjs"] Sep 30 08:04:06 crc kubenswrapper[4760]: I0930 08:04:06.384490 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/9efadf79-7f8c-4a83-9788-6f4f0a5ecd77-ssh-key\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-6xmjs\" (UID: \"9efadf79-7f8c-4a83-9788-6f4f0a5ecd77\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-6xmjs" Sep 30 08:04:06 crc kubenswrapper[4760]: I0930 08:04:06.384655 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9efadf79-7f8c-4a83-9788-6f4f0a5ecd77-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-6xmjs\" (UID: \"9efadf79-7f8c-4a83-9788-6f4f0a5ecd77\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-6xmjs" Sep 30 08:04:06 crc kubenswrapper[4760]: I0930 08:04:06.384687 4760 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bm2mn\" (UniqueName: \"kubernetes.io/projected/9efadf79-7f8c-4a83-9788-6f4f0a5ecd77-kube-api-access-bm2mn\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-6xmjs\" (UID: \"9efadf79-7f8c-4a83-9788-6f4f0a5ecd77\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-6xmjs" Sep 30 08:04:06 crc kubenswrapper[4760]: I0930 08:04:06.486412 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/9efadf79-7f8c-4a83-9788-6f4f0a5ecd77-ssh-key\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-6xmjs\" (UID: \"9efadf79-7f8c-4a83-9788-6f4f0a5ecd77\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-6xmjs" Sep 30 08:04:06 crc kubenswrapper[4760]: I0930 08:04:06.486533 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9efadf79-7f8c-4a83-9788-6f4f0a5ecd77-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-6xmjs\" (UID: \"9efadf79-7f8c-4a83-9788-6f4f0a5ecd77\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-6xmjs" Sep 30 08:04:06 crc kubenswrapper[4760]: I0930 08:04:06.486560 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bm2mn\" (UniqueName: \"kubernetes.io/projected/9efadf79-7f8c-4a83-9788-6f4f0a5ecd77-kube-api-access-bm2mn\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-6xmjs\" (UID: \"9efadf79-7f8c-4a83-9788-6f4f0a5ecd77\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-6xmjs" Sep 30 08:04:06 crc kubenswrapper[4760]: I0930 08:04:06.492569 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/9efadf79-7f8c-4a83-9788-6f4f0a5ecd77-ssh-key\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-6xmjs\" (UID: 
\"9efadf79-7f8c-4a83-9788-6f4f0a5ecd77\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-6xmjs" Sep 30 08:04:06 crc kubenswrapper[4760]: I0930 08:04:06.492916 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9efadf79-7f8c-4a83-9788-6f4f0a5ecd77-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-6xmjs\" (UID: \"9efadf79-7f8c-4a83-9788-6f4f0a5ecd77\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-6xmjs" Sep 30 08:04:06 crc kubenswrapper[4760]: I0930 08:04:06.503218 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bm2mn\" (UniqueName: \"kubernetes.io/projected/9efadf79-7f8c-4a83-9788-6f4f0a5ecd77-kube-api-access-bm2mn\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-6xmjs\" (UID: \"9efadf79-7f8c-4a83-9788-6f4f0a5ecd77\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-6xmjs" Sep 30 08:04:06 crc kubenswrapper[4760]: I0930 08:04:06.665255 4760 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-6xmjs" Sep 30 08:04:07 crc kubenswrapper[4760]: I0930 08:04:07.219984 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-6xmjs"] Sep 30 08:04:07 crc kubenswrapper[4760]: W0930 08:04:07.222669 4760 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9efadf79_7f8c_4a83_9788_6f4f0a5ecd77.slice/crio-0c3d8c58ca3a823479f8fe0f8009d25cade72e48fe98db4af713b4c20b6e53b0 WatchSource:0}: Error finding container 0c3d8c58ca3a823479f8fe0f8009d25cade72e48fe98db4af713b4c20b6e53b0: Status 404 returned error can't find the container with id 0c3d8c58ca3a823479f8fe0f8009d25cade72e48fe98db4af713b4c20b6e53b0 Sep 30 08:04:07 crc kubenswrapper[4760]: I0930 08:04:07.266081 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-6xmjs" event={"ID":"9efadf79-7f8c-4a83-9788-6f4f0a5ecd77","Type":"ContainerStarted","Data":"0c3d8c58ca3a823479f8fe0f8009d25cade72e48fe98db4af713b4c20b6e53b0"} Sep 30 08:04:08 crc kubenswrapper[4760]: I0930 08:04:08.280683 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-6xmjs" event={"ID":"9efadf79-7f8c-4a83-9788-6f4f0a5ecd77","Type":"ContainerStarted","Data":"881ebdc93f76e403830b574cecdb7ab0c5594f1590dd7a44e78f94c56974911b"} Sep 30 08:04:08 crc kubenswrapper[4760]: I0930 08:04:08.309621 4760 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-6xmjs" podStartSLOduration=1.839113228 podStartE2EDuration="2.309596138s" podCreationTimestamp="2025-09-30 08:04:06 +0000 UTC" firstStartedPulling="2025-09-30 08:04:07.226078516 +0000 UTC m=+1832.868984928" lastFinishedPulling="2025-09-30 08:04:07.696561426 +0000 UTC m=+1833.339467838" 
observedRunningTime="2025-09-30 08:04:08.298528964 +0000 UTC m=+1833.941435376" watchObservedRunningTime="2025-09-30 08:04:08.309596138 +0000 UTC m=+1833.952502550" Sep 30 08:04:18 crc kubenswrapper[4760]: I0930 08:04:18.375846 4760 generic.go:334] "Generic (PLEG): container finished" podID="9efadf79-7f8c-4a83-9788-6f4f0a5ecd77" containerID="881ebdc93f76e403830b574cecdb7ab0c5594f1590dd7a44e78f94c56974911b" exitCode=0 Sep 30 08:04:18 crc kubenswrapper[4760]: I0930 08:04:18.375921 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-6xmjs" event={"ID":"9efadf79-7f8c-4a83-9788-6f4f0a5ecd77","Type":"ContainerDied","Data":"881ebdc93f76e403830b574cecdb7ab0c5594f1590dd7a44e78f94c56974911b"} Sep 30 08:04:19 crc kubenswrapper[4760]: I0930 08:04:19.863128 4760 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-6xmjs" Sep 30 08:04:19 crc kubenswrapper[4760]: I0930 08:04:19.971361 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/9efadf79-7f8c-4a83-9788-6f4f0a5ecd77-ssh-key\") pod \"9efadf79-7f8c-4a83-9788-6f4f0a5ecd77\" (UID: \"9efadf79-7f8c-4a83-9788-6f4f0a5ecd77\") " Sep 30 08:04:19 crc kubenswrapper[4760]: I0930 08:04:19.972828 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bm2mn\" (UniqueName: \"kubernetes.io/projected/9efadf79-7f8c-4a83-9788-6f4f0a5ecd77-kube-api-access-bm2mn\") pod \"9efadf79-7f8c-4a83-9788-6f4f0a5ecd77\" (UID: \"9efadf79-7f8c-4a83-9788-6f4f0a5ecd77\") " Sep 30 08:04:19 crc kubenswrapper[4760]: I0930 08:04:19.973034 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9efadf79-7f8c-4a83-9788-6f4f0a5ecd77-inventory\") pod \"9efadf79-7f8c-4a83-9788-6f4f0a5ecd77\" (UID: 
\"9efadf79-7f8c-4a83-9788-6f4f0a5ecd77\") " Sep 30 08:04:19 crc kubenswrapper[4760]: I0930 08:04:19.986545 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9efadf79-7f8c-4a83-9788-6f4f0a5ecd77-kube-api-access-bm2mn" (OuterVolumeSpecName: "kube-api-access-bm2mn") pod "9efadf79-7f8c-4a83-9788-6f4f0a5ecd77" (UID: "9efadf79-7f8c-4a83-9788-6f4f0a5ecd77"). InnerVolumeSpecName "kube-api-access-bm2mn". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 08:04:20 crc kubenswrapper[4760]: I0930 08:04:20.034483 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9efadf79-7f8c-4a83-9788-6f4f0a5ecd77-inventory" (OuterVolumeSpecName: "inventory") pod "9efadf79-7f8c-4a83-9788-6f4f0a5ecd77" (UID: "9efadf79-7f8c-4a83-9788-6f4f0a5ecd77"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 08:04:20 crc kubenswrapper[4760]: I0930 08:04:20.040607 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9efadf79-7f8c-4a83-9788-6f4f0a5ecd77-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "9efadf79-7f8c-4a83-9788-6f4f0a5ecd77" (UID: "9efadf79-7f8c-4a83-9788-6f4f0a5ecd77"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 08:04:20 crc kubenswrapper[4760]: I0930 08:04:20.077698 4760 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/9efadf79-7f8c-4a83-9788-6f4f0a5ecd77-ssh-key\") on node \"crc\" DevicePath \"\"" Sep 30 08:04:20 crc kubenswrapper[4760]: I0930 08:04:20.077993 4760 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bm2mn\" (UniqueName: \"kubernetes.io/projected/9efadf79-7f8c-4a83-9788-6f4f0a5ecd77-kube-api-access-bm2mn\") on node \"crc\" DevicePath \"\"" Sep 30 08:04:20 crc kubenswrapper[4760]: I0930 08:04:20.078007 4760 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9efadf79-7f8c-4a83-9788-6f4f0a5ecd77-inventory\") on node \"crc\" DevicePath \"\"" Sep 30 08:04:20 crc kubenswrapper[4760]: I0930 08:04:20.419237 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-6xmjs" event={"ID":"9efadf79-7f8c-4a83-9788-6f4f0a5ecd77","Type":"ContainerDied","Data":"0c3d8c58ca3a823479f8fe0f8009d25cade72e48fe98db4af713b4c20b6e53b0"} Sep 30 08:04:20 crc kubenswrapper[4760]: I0930 08:04:20.419344 4760 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0c3d8c58ca3a823479f8fe0f8009d25cade72e48fe98db4af713b4c20b6e53b0" Sep 30 08:04:20 crc kubenswrapper[4760]: I0930 08:04:20.419272 4760 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-6xmjs" Sep 30 08:04:20 crc kubenswrapper[4760]: I0930 08:04:20.517462 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-vkv7v"] Sep 30 08:04:20 crc kubenswrapper[4760]: E0930 08:04:20.517912 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9efadf79-7f8c-4a83-9788-6f4f0a5ecd77" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Sep 30 08:04:20 crc kubenswrapper[4760]: I0930 08:04:20.517935 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="9efadf79-7f8c-4a83-9788-6f4f0a5ecd77" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Sep 30 08:04:20 crc kubenswrapper[4760]: I0930 08:04:20.518196 4760 memory_manager.go:354] "RemoveStaleState removing state" podUID="9efadf79-7f8c-4a83-9788-6f4f0a5ecd77" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Sep 30 08:04:20 crc kubenswrapper[4760]: I0930 08:04:20.519050 4760 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-vkv7v" Sep 30 08:04:20 crc kubenswrapper[4760]: I0930 08:04:20.522006 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-telemetry-default-certs-0" Sep 30 08:04:20 crc kubenswrapper[4760]: I0930 08:04:20.522760 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-neutron-metadata-default-certs-0" Sep 30 08:04:20 crc kubenswrapper[4760]: I0930 08:04:20.523914 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-8gxrf" Sep 30 08:04:20 crc kubenswrapper[4760]: I0930 08:04:20.525043 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Sep 30 08:04:20 crc kubenswrapper[4760]: I0930 08:04:20.525315 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Sep 30 08:04:20 crc kubenswrapper[4760]: I0930 08:04:20.525533 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-ovn-default-certs-0" Sep 30 08:04:20 crc kubenswrapper[4760]: I0930 08:04:20.525659 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-libvirt-default-certs-0" Sep 30 08:04:20 crc kubenswrapper[4760]: I0930 08:04:20.525756 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-vkv7v"] Sep 30 08:04:20 crc kubenswrapper[4760]: I0930 08:04:20.525781 4760 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Sep 30 08:04:20 crc kubenswrapper[4760]: I0930 08:04:20.587366 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/e0989b93-a567-4aa1-886e-43b6fa827891-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-vkv7v\" (UID: \"e0989b93-a567-4aa1-886e-43b6fa827891\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-vkv7v" Sep 30 08:04:20 crc kubenswrapper[4760]: I0930 08:04:20.587421 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e0989b93-a567-4aa1-886e-43b6fa827891-telemetry-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-vkv7v\" (UID: \"e0989b93-a567-4aa1-886e-43b6fa827891\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-vkv7v" Sep 30 08:04:20 crc kubenswrapper[4760]: I0930 08:04:20.587465 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/e0989b93-a567-4aa1-886e-43b6fa827891-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-vkv7v\" (UID: \"e0989b93-a567-4aa1-886e-43b6fa827891\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-vkv7v" Sep 30 08:04:20 crc kubenswrapper[4760]: I0930 08:04:20.587494 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e0989b93-a567-4aa1-886e-43b6fa827891-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-vkv7v\" (UID: \"e0989b93-a567-4aa1-886e-43b6fa827891\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-vkv7v" Sep 30 08:04:20 crc kubenswrapper[4760]: I0930 08:04:20.587518 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m7p9w\" (UniqueName: 
\"kubernetes.io/projected/e0989b93-a567-4aa1-886e-43b6fa827891-kube-api-access-m7p9w\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-vkv7v\" (UID: \"e0989b93-a567-4aa1-886e-43b6fa827891\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-vkv7v" Sep 30 08:04:20 crc kubenswrapper[4760]: I0930 08:04:20.587628 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/e0989b93-a567-4aa1-886e-43b6fa827891-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-vkv7v\" (UID: \"e0989b93-a567-4aa1-886e-43b6fa827891\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-vkv7v" Sep 30 08:04:20 crc kubenswrapper[4760]: I0930 08:04:20.587800 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e0989b93-a567-4aa1-886e-43b6fa827891-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-vkv7v\" (UID: \"e0989b93-a567-4aa1-886e-43b6fa827891\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-vkv7v" Sep 30 08:04:20 crc kubenswrapper[4760]: I0930 08:04:20.587974 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e0989b93-a567-4aa1-886e-43b6fa827891-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-vkv7v\" (UID: \"e0989b93-a567-4aa1-886e-43b6fa827891\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-vkv7v" Sep 30 08:04:20 crc kubenswrapper[4760]: I0930 08:04:20.588044 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: 
\"kubernetes.io/projected/e0989b93-a567-4aa1-886e-43b6fa827891-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-vkv7v\" (UID: \"e0989b93-a567-4aa1-886e-43b6fa827891\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-vkv7v" Sep 30 08:04:20 crc kubenswrapper[4760]: I0930 08:04:20.588086 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/e0989b93-a567-4aa1-886e-43b6fa827891-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-vkv7v\" (UID: \"e0989b93-a567-4aa1-886e-43b6fa827891\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-vkv7v" Sep 30 08:04:20 crc kubenswrapper[4760]: I0930 08:04:20.588116 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/e0989b93-a567-4aa1-886e-43b6fa827891-ssh-key\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-vkv7v\" (UID: \"e0989b93-a567-4aa1-886e-43b6fa827891\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-vkv7v" Sep 30 08:04:20 crc kubenswrapper[4760]: I0930 08:04:20.588266 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e0989b93-a567-4aa1-886e-43b6fa827891-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-vkv7v\" (UID: \"e0989b93-a567-4aa1-886e-43b6fa827891\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-vkv7v" Sep 30 08:04:20 crc kubenswrapper[4760]: I0930 08:04:20.588339 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/e0989b93-a567-4aa1-886e-43b6fa827891-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-vkv7v\" (UID: \"e0989b93-a567-4aa1-886e-43b6fa827891\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-vkv7v" Sep 30 08:04:20 crc kubenswrapper[4760]: I0930 08:04:20.588461 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e0989b93-a567-4aa1-886e-43b6fa827891-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-vkv7v\" (UID: \"e0989b93-a567-4aa1-886e-43b6fa827891\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-vkv7v" Sep 30 08:04:20 crc kubenswrapper[4760]: I0930 08:04:20.690788 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/e0989b93-a567-4aa1-886e-43b6fa827891-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-vkv7v\" (UID: \"e0989b93-a567-4aa1-886e-43b6fa827891\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-vkv7v" Sep 30 08:04:20 crc kubenswrapper[4760]: I0930 08:04:20.690848 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/e0989b93-a567-4aa1-886e-43b6fa827891-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-vkv7v\" (UID: \"e0989b93-a567-4aa1-886e-43b6fa827891\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-vkv7v" Sep 30 08:04:20 crc kubenswrapper[4760]: I0930 08:04:20.690882 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/e0989b93-a567-4aa1-886e-43b6fa827891-ssh-key\") pod 
\"install-certs-edpm-deployment-openstack-edpm-ipam-vkv7v\" (UID: \"e0989b93-a567-4aa1-886e-43b6fa827891\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-vkv7v" Sep 30 08:04:20 crc kubenswrapper[4760]: I0930 08:04:20.690938 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e0989b93-a567-4aa1-886e-43b6fa827891-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-vkv7v\" (UID: \"e0989b93-a567-4aa1-886e-43b6fa827891\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-vkv7v" Sep 30 08:04:20 crc kubenswrapper[4760]: I0930 08:04:20.690967 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e0989b93-a567-4aa1-886e-43b6fa827891-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-vkv7v\" (UID: \"e0989b93-a567-4aa1-886e-43b6fa827891\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-vkv7v" Sep 30 08:04:20 crc kubenswrapper[4760]: I0930 08:04:20.691044 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e0989b93-a567-4aa1-886e-43b6fa827891-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-vkv7v\" (UID: \"e0989b93-a567-4aa1-886e-43b6fa827891\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-vkv7v" Sep 30 08:04:20 crc kubenswrapper[4760]: I0930 08:04:20.691083 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e0989b93-a567-4aa1-886e-43b6fa827891-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-vkv7v\" (UID: \"e0989b93-a567-4aa1-886e-43b6fa827891\") " 
pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-vkv7v" Sep 30 08:04:20 crc kubenswrapper[4760]: I0930 08:04:20.691113 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e0989b93-a567-4aa1-886e-43b6fa827891-telemetry-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-vkv7v\" (UID: \"e0989b93-a567-4aa1-886e-43b6fa827891\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-vkv7v" Sep 30 08:04:20 crc kubenswrapper[4760]: I0930 08:04:20.691150 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/e0989b93-a567-4aa1-886e-43b6fa827891-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-vkv7v\" (UID: \"e0989b93-a567-4aa1-886e-43b6fa827891\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-vkv7v" Sep 30 08:04:20 crc kubenswrapper[4760]: I0930 08:04:20.691188 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e0989b93-a567-4aa1-886e-43b6fa827891-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-vkv7v\" (UID: \"e0989b93-a567-4aa1-886e-43b6fa827891\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-vkv7v" Sep 30 08:04:20 crc kubenswrapper[4760]: I0930 08:04:20.691212 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m7p9w\" (UniqueName: \"kubernetes.io/projected/e0989b93-a567-4aa1-886e-43b6fa827891-kube-api-access-m7p9w\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-vkv7v\" (UID: \"e0989b93-a567-4aa1-886e-43b6fa827891\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-vkv7v" Sep 30 08:04:20 
crc kubenswrapper[4760]: I0930 08:04:20.691242 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/e0989b93-a567-4aa1-886e-43b6fa827891-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-vkv7v\" (UID: \"e0989b93-a567-4aa1-886e-43b6fa827891\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-vkv7v" Sep 30 08:04:20 crc kubenswrapper[4760]: I0930 08:04:20.691331 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e0989b93-a567-4aa1-886e-43b6fa827891-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-vkv7v\" (UID: \"e0989b93-a567-4aa1-886e-43b6fa827891\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-vkv7v" Sep 30 08:04:20 crc kubenswrapper[4760]: I0930 08:04:20.691429 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e0989b93-a567-4aa1-886e-43b6fa827891-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-vkv7v\" (UID: \"e0989b93-a567-4aa1-886e-43b6fa827891\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-vkv7v" Sep 30 08:04:20 crc kubenswrapper[4760]: I0930 08:04:20.695238 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/e0989b93-a567-4aa1-886e-43b6fa827891-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-vkv7v\" (UID: \"e0989b93-a567-4aa1-886e-43b6fa827891\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-vkv7v" Sep 30 08:04:20 crc kubenswrapper[4760]: I0930 08:04:20.695925 4760 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e0989b93-a567-4aa1-886e-43b6fa827891-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-vkv7v\" (UID: \"e0989b93-a567-4aa1-886e-43b6fa827891\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-vkv7v" Sep 30 08:04:20 crc kubenswrapper[4760]: I0930 08:04:20.696046 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e0989b93-a567-4aa1-886e-43b6fa827891-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-vkv7v\" (UID: \"e0989b93-a567-4aa1-886e-43b6fa827891\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-vkv7v" Sep 30 08:04:20 crc kubenswrapper[4760]: I0930 08:04:20.698160 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/e0989b93-a567-4aa1-886e-43b6fa827891-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-vkv7v\" (UID: \"e0989b93-a567-4aa1-886e-43b6fa827891\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-vkv7v" Sep 30 08:04:20 crc kubenswrapper[4760]: I0930 08:04:20.698368 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/e0989b93-a567-4aa1-886e-43b6fa827891-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-vkv7v\" (UID: \"e0989b93-a567-4aa1-886e-43b6fa827891\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-vkv7v" Sep 30 08:04:20 crc kubenswrapper[4760]: I0930 08:04:20.698888 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: 
\"kubernetes.io/secret/e0989b93-a567-4aa1-886e-43b6fa827891-ssh-key\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-vkv7v\" (UID: \"e0989b93-a567-4aa1-886e-43b6fa827891\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-vkv7v" Sep 30 08:04:20 crc kubenswrapper[4760]: I0930 08:04:20.699739 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e0989b93-a567-4aa1-886e-43b6fa827891-telemetry-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-vkv7v\" (UID: \"e0989b93-a567-4aa1-886e-43b6fa827891\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-vkv7v" Sep 30 08:04:20 crc kubenswrapper[4760]: I0930 08:04:20.699769 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e0989b93-a567-4aa1-886e-43b6fa827891-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-vkv7v\" (UID: \"e0989b93-a567-4aa1-886e-43b6fa827891\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-vkv7v" Sep 30 08:04:20 crc kubenswrapper[4760]: I0930 08:04:20.700713 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/e0989b93-a567-4aa1-886e-43b6fa827891-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-vkv7v\" (UID: \"e0989b93-a567-4aa1-886e-43b6fa827891\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-vkv7v" Sep 30 08:04:20 crc kubenswrapper[4760]: I0930 08:04:20.701542 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e0989b93-a567-4aa1-886e-43b6fa827891-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-vkv7v\" (UID: \"e0989b93-a567-4aa1-886e-43b6fa827891\") " 
pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-vkv7v" Sep 30 08:04:20 crc kubenswrapper[4760]: I0930 08:04:20.701706 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e0989b93-a567-4aa1-886e-43b6fa827891-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-vkv7v\" (UID: \"e0989b93-a567-4aa1-886e-43b6fa827891\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-vkv7v" Sep 30 08:04:20 crc kubenswrapper[4760]: I0930 08:04:20.702483 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e0989b93-a567-4aa1-886e-43b6fa827891-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-vkv7v\" (UID: \"e0989b93-a567-4aa1-886e-43b6fa827891\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-vkv7v" Sep 30 08:04:20 crc kubenswrapper[4760]: I0930 08:04:20.703605 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e0989b93-a567-4aa1-886e-43b6fa827891-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-vkv7v\" (UID: \"e0989b93-a567-4aa1-886e-43b6fa827891\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-vkv7v" Sep 30 08:04:20 crc kubenswrapper[4760]: I0930 08:04:20.714206 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m7p9w\" (UniqueName: \"kubernetes.io/projected/e0989b93-a567-4aa1-886e-43b6fa827891-kube-api-access-m7p9w\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-vkv7v\" (UID: \"e0989b93-a567-4aa1-886e-43b6fa827891\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-vkv7v" Sep 30 08:04:20 crc kubenswrapper[4760]: I0930 08:04:20.837476 4760 util.go:30] 
"No sandbox for pod can be found. Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-vkv7v" Sep 30 08:04:21 crc kubenswrapper[4760]: I0930 08:04:21.429007 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-vkv7v"] Sep 30 08:04:22 crc kubenswrapper[4760]: I0930 08:04:22.439832 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-vkv7v" event={"ID":"e0989b93-a567-4aa1-886e-43b6fa827891","Type":"ContainerStarted","Data":"99a5df57857306d6709a148dd66248bc4748ba3f1d86204b513c0a2018d2bb6d"} Sep 30 08:04:22 crc kubenswrapper[4760]: I0930 08:04:22.440212 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-vkv7v" event={"ID":"e0989b93-a567-4aa1-886e-43b6fa827891","Type":"ContainerStarted","Data":"977f640710ac60dbabcbe02c96ff3a9c5e8910bfc4cf6c1ac4546c212a0e6ede"} Sep 30 08:04:22 crc kubenswrapper[4760]: I0930 08:04:22.464539 4760 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-vkv7v" podStartSLOduration=1.88994728 podStartE2EDuration="2.464522226s" podCreationTimestamp="2025-09-30 08:04:20 +0000 UTC" firstStartedPulling="2025-09-30 08:04:21.437716267 +0000 UTC m=+1847.080622689" lastFinishedPulling="2025-09-30 08:04:22.012291213 +0000 UTC m=+1847.655197635" observedRunningTime="2025-09-30 08:04:22.460318819 +0000 UTC m=+1848.103225251" watchObservedRunningTime="2025-09-30 08:04:22.464522226 +0000 UTC m=+1848.107428638" Sep 30 08:05:06 crc kubenswrapper[4760]: I0930 08:05:06.891727 4760 generic.go:334] "Generic (PLEG): container finished" podID="e0989b93-a567-4aa1-886e-43b6fa827891" containerID="99a5df57857306d6709a148dd66248bc4748ba3f1d86204b513c0a2018d2bb6d" exitCode=0 Sep 30 08:05:06 crc kubenswrapper[4760]: I0930 08:05:06.891805 4760 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-vkv7v" event={"ID":"e0989b93-a567-4aa1-886e-43b6fa827891","Type":"ContainerDied","Data":"99a5df57857306d6709a148dd66248bc4748ba3f1d86204b513c0a2018d2bb6d"} Sep 30 08:05:08 crc kubenswrapper[4760]: I0930 08:05:08.341520 4760 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-vkv7v" Sep 30 08:05:08 crc kubenswrapper[4760]: I0930 08:05:08.418619 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/e0989b93-a567-4aa1-886e-43b6fa827891-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"e0989b93-a567-4aa1-886e-43b6fa827891\" (UID: \"e0989b93-a567-4aa1-886e-43b6fa827891\") " Sep 30 08:05:08 crc kubenswrapper[4760]: I0930 08:05:08.418675 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/e0989b93-a567-4aa1-886e-43b6fa827891-openstack-edpm-ipam-ovn-default-certs-0\") pod \"e0989b93-a567-4aa1-886e-43b6fa827891\" (UID: \"e0989b93-a567-4aa1-886e-43b6fa827891\") " Sep 30 08:05:08 crc kubenswrapper[4760]: I0930 08:05:08.418724 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e0989b93-a567-4aa1-886e-43b6fa827891-inventory\") pod \"e0989b93-a567-4aa1-886e-43b6fa827891\" (UID: \"e0989b93-a567-4aa1-886e-43b6fa827891\") " Sep 30 08:05:08 crc kubenswrapper[4760]: I0930 08:05:08.418747 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e0989b93-a567-4aa1-886e-43b6fa827891-libvirt-combined-ca-bundle\") pod \"e0989b93-a567-4aa1-886e-43b6fa827891\" (UID: 
\"e0989b93-a567-4aa1-886e-43b6fa827891\") " Sep 30 08:05:08 crc kubenswrapper[4760]: I0930 08:05:08.418872 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e0989b93-a567-4aa1-886e-43b6fa827891-bootstrap-combined-ca-bundle\") pod \"e0989b93-a567-4aa1-886e-43b6fa827891\" (UID: \"e0989b93-a567-4aa1-886e-43b6fa827891\") " Sep 30 08:05:08 crc kubenswrapper[4760]: I0930 08:05:08.419745 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m7p9w\" (UniqueName: \"kubernetes.io/projected/e0989b93-a567-4aa1-886e-43b6fa827891-kube-api-access-m7p9w\") pod \"e0989b93-a567-4aa1-886e-43b6fa827891\" (UID: \"e0989b93-a567-4aa1-886e-43b6fa827891\") " Sep 30 08:05:08 crc kubenswrapper[4760]: I0930 08:05:08.419776 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e0989b93-a567-4aa1-886e-43b6fa827891-repo-setup-combined-ca-bundle\") pod \"e0989b93-a567-4aa1-886e-43b6fa827891\" (UID: \"e0989b93-a567-4aa1-886e-43b6fa827891\") " Sep 30 08:05:08 crc kubenswrapper[4760]: I0930 08:05:08.419810 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/e0989b93-a567-4aa1-886e-43b6fa827891-ssh-key\") pod \"e0989b93-a567-4aa1-886e-43b6fa827891\" (UID: \"e0989b93-a567-4aa1-886e-43b6fa827891\") " Sep 30 08:05:08 crc kubenswrapper[4760]: I0930 08:05:08.419875 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e0989b93-a567-4aa1-886e-43b6fa827891-telemetry-combined-ca-bundle\") pod \"e0989b93-a567-4aa1-886e-43b6fa827891\" (UID: \"e0989b93-a567-4aa1-886e-43b6fa827891\") " Sep 30 08:05:08 crc kubenswrapper[4760]: I0930 08:05:08.420400 4760 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e0989b93-a567-4aa1-886e-43b6fa827891-nova-combined-ca-bundle\") pod \"e0989b93-a567-4aa1-886e-43b6fa827891\" (UID: \"e0989b93-a567-4aa1-886e-43b6fa827891\") " Sep 30 08:05:08 crc kubenswrapper[4760]: I0930 08:05:08.420433 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/e0989b93-a567-4aa1-886e-43b6fa827891-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"e0989b93-a567-4aa1-886e-43b6fa827891\" (UID: \"e0989b93-a567-4aa1-886e-43b6fa827891\") " Sep 30 08:05:08 crc kubenswrapper[4760]: I0930 08:05:08.420463 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e0989b93-a567-4aa1-886e-43b6fa827891-ovn-combined-ca-bundle\") pod \"e0989b93-a567-4aa1-886e-43b6fa827891\" (UID: \"e0989b93-a567-4aa1-886e-43b6fa827891\") " Sep 30 08:05:08 crc kubenswrapper[4760]: I0930 08:05:08.420488 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/e0989b93-a567-4aa1-886e-43b6fa827891-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"e0989b93-a567-4aa1-886e-43b6fa827891\" (UID: \"e0989b93-a567-4aa1-886e-43b6fa827891\") " Sep 30 08:05:08 crc kubenswrapper[4760]: I0930 08:05:08.420546 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e0989b93-a567-4aa1-886e-43b6fa827891-neutron-metadata-combined-ca-bundle\") pod \"e0989b93-a567-4aa1-886e-43b6fa827891\" (UID: \"e0989b93-a567-4aa1-886e-43b6fa827891\") " Sep 30 08:05:08 crc kubenswrapper[4760]: I0930 08:05:08.428179 4760 operation_generator.go:803] UnmountVolume.TearDown 
succeeded for volume "kubernetes.io/projected/e0989b93-a567-4aa1-886e-43b6fa827891-openstack-edpm-ipam-telemetry-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-telemetry-default-certs-0") pod "e0989b93-a567-4aa1-886e-43b6fa827891" (UID: "e0989b93-a567-4aa1-886e-43b6fa827891"). InnerVolumeSpecName "openstack-edpm-ipam-telemetry-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 08:05:08 crc kubenswrapper[4760]: I0930 08:05:08.429222 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e0989b93-a567-4aa1-886e-43b6fa827891-libvirt-combined-ca-bundle" (OuterVolumeSpecName: "libvirt-combined-ca-bundle") pod "e0989b93-a567-4aa1-886e-43b6fa827891" (UID: "e0989b93-a567-4aa1-886e-43b6fa827891"). InnerVolumeSpecName "libvirt-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 08:05:08 crc kubenswrapper[4760]: I0930 08:05:08.429409 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e0989b93-a567-4aa1-886e-43b6fa827891-openstack-edpm-ipam-libvirt-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-libvirt-default-certs-0") pod "e0989b93-a567-4aa1-886e-43b6fa827891" (UID: "e0989b93-a567-4aa1-886e-43b6fa827891"). InnerVolumeSpecName "openstack-edpm-ipam-libvirt-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 08:05:08 crc kubenswrapper[4760]: I0930 08:05:08.429464 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e0989b93-a567-4aa1-886e-43b6fa827891-repo-setup-combined-ca-bundle" (OuterVolumeSpecName: "repo-setup-combined-ca-bundle") pod "e0989b93-a567-4aa1-886e-43b6fa827891" (UID: "e0989b93-a567-4aa1-886e-43b6fa827891"). InnerVolumeSpecName "repo-setup-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 08:05:08 crc kubenswrapper[4760]: I0930 08:05:08.429921 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e0989b93-a567-4aa1-886e-43b6fa827891-openstack-edpm-ipam-ovn-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-ovn-default-certs-0") pod "e0989b93-a567-4aa1-886e-43b6fa827891" (UID: "e0989b93-a567-4aa1-886e-43b6fa827891"). InnerVolumeSpecName "openstack-edpm-ipam-ovn-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 08:05:08 crc kubenswrapper[4760]: I0930 08:05:08.431421 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e0989b93-a567-4aa1-886e-43b6fa827891-kube-api-access-m7p9w" (OuterVolumeSpecName: "kube-api-access-m7p9w") pod "e0989b93-a567-4aa1-886e-43b6fa827891" (UID: "e0989b93-a567-4aa1-886e-43b6fa827891"). InnerVolumeSpecName "kube-api-access-m7p9w". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 08:05:08 crc kubenswrapper[4760]: I0930 08:05:08.431495 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e0989b93-a567-4aa1-886e-43b6fa827891-neutron-metadata-combined-ca-bundle" (OuterVolumeSpecName: "neutron-metadata-combined-ca-bundle") pod "e0989b93-a567-4aa1-886e-43b6fa827891" (UID: "e0989b93-a567-4aa1-886e-43b6fa827891"). InnerVolumeSpecName "neutron-metadata-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 08:05:08 crc kubenswrapper[4760]: I0930 08:05:08.431968 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e0989b93-a567-4aa1-886e-43b6fa827891-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "e0989b93-a567-4aa1-886e-43b6fa827891" (UID: "e0989b93-a567-4aa1-886e-43b6fa827891"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 08:05:08 crc kubenswrapper[4760]: I0930 08:05:08.432024 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e0989b93-a567-4aa1-886e-43b6fa827891-ovn-combined-ca-bundle" (OuterVolumeSpecName: "ovn-combined-ca-bundle") pod "e0989b93-a567-4aa1-886e-43b6fa827891" (UID: "e0989b93-a567-4aa1-886e-43b6fa827891"). InnerVolumeSpecName "ovn-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 08:05:08 crc kubenswrapper[4760]: I0930 08:05:08.432045 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e0989b93-a567-4aa1-886e-43b6fa827891-nova-combined-ca-bundle" (OuterVolumeSpecName: "nova-combined-ca-bundle") pod "e0989b93-a567-4aa1-886e-43b6fa827891" (UID: "e0989b93-a567-4aa1-886e-43b6fa827891"). InnerVolumeSpecName "nova-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 08:05:08 crc kubenswrapper[4760]: I0930 08:05:08.432575 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e0989b93-a567-4aa1-886e-43b6fa827891-openstack-edpm-ipam-neutron-metadata-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-neutron-metadata-default-certs-0") pod "e0989b93-a567-4aa1-886e-43b6fa827891" (UID: "e0989b93-a567-4aa1-886e-43b6fa827891"). InnerVolumeSpecName "openstack-edpm-ipam-neutron-metadata-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 08:05:08 crc kubenswrapper[4760]: I0930 08:05:08.446548 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e0989b93-a567-4aa1-886e-43b6fa827891-telemetry-combined-ca-bundle" (OuterVolumeSpecName: "telemetry-combined-ca-bundle") pod "e0989b93-a567-4aa1-886e-43b6fa827891" (UID: "e0989b93-a567-4aa1-886e-43b6fa827891"). InnerVolumeSpecName "telemetry-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 08:05:08 crc kubenswrapper[4760]: I0930 08:05:08.455421 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e0989b93-a567-4aa1-886e-43b6fa827891-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "e0989b93-a567-4aa1-886e-43b6fa827891" (UID: "e0989b93-a567-4aa1-886e-43b6fa827891"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 08:05:08 crc kubenswrapper[4760]: I0930 08:05:08.463421 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e0989b93-a567-4aa1-886e-43b6fa827891-inventory" (OuterVolumeSpecName: "inventory") pod "e0989b93-a567-4aa1-886e-43b6fa827891" (UID: "e0989b93-a567-4aa1-886e-43b6fa827891"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 08:05:08 crc kubenswrapper[4760]: I0930 08:05:08.523720 4760 reconciler_common.go:293] "Volume detached for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e0989b93-a567-4aa1-886e-43b6fa827891-telemetry-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 08:05:08 crc kubenswrapper[4760]: I0930 08:05:08.523771 4760 reconciler_common.go:293] "Volume detached for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e0989b93-a567-4aa1-886e-43b6fa827891-nova-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 08:05:08 crc kubenswrapper[4760]: I0930 08:05:08.523789 4760 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/e0989b93-a567-4aa1-886e-43b6fa827891-openstack-edpm-ipam-telemetry-default-certs-0\") on node \"crc\" DevicePath \"\"" Sep 30 08:05:08 crc kubenswrapper[4760]: I0930 08:05:08.523808 4760 reconciler_common.go:293] "Volume detached for volume \"ovn-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/e0989b93-a567-4aa1-886e-43b6fa827891-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 08:05:08 crc kubenswrapper[4760]: I0930 08:05:08.523831 4760 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/e0989b93-a567-4aa1-886e-43b6fa827891-openstack-edpm-ipam-neutron-metadata-default-certs-0\") on node \"crc\" DevicePath \"\"" Sep 30 08:05:08 crc kubenswrapper[4760]: I0930 08:05:08.523848 4760 reconciler_common.go:293] "Volume detached for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e0989b93-a567-4aa1-886e-43b6fa827891-neutron-metadata-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 08:05:08 crc kubenswrapper[4760]: I0930 08:05:08.523868 4760 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/e0989b93-a567-4aa1-886e-43b6fa827891-openstack-edpm-ipam-libvirt-default-certs-0\") on node \"crc\" DevicePath \"\"" Sep 30 08:05:08 crc kubenswrapper[4760]: I0930 08:05:08.523885 4760 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/e0989b93-a567-4aa1-886e-43b6fa827891-openstack-edpm-ipam-ovn-default-certs-0\") on node \"crc\" DevicePath \"\"" Sep 30 08:05:08 crc kubenswrapper[4760]: I0930 08:05:08.523902 4760 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e0989b93-a567-4aa1-886e-43b6fa827891-inventory\") on node \"crc\" DevicePath \"\"" Sep 30 08:05:08 crc kubenswrapper[4760]: I0930 08:05:08.523918 4760 reconciler_common.go:293] "Volume detached for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e0989b93-a567-4aa1-886e-43b6fa827891-libvirt-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 08:05:08 crc kubenswrapper[4760]: 
I0930 08:05:08.523935 4760 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e0989b93-a567-4aa1-886e-43b6fa827891-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 08:05:08 crc kubenswrapper[4760]: I0930 08:05:08.523950 4760 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m7p9w\" (UniqueName: \"kubernetes.io/projected/e0989b93-a567-4aa1-886e-43b6fa827891-kube-api-access-m7p9w\") on node \"crc\" DevicePath \"\"" Sep 30 08:05:08 crc kubenswrapper[4760]: I0930 08:05:08.523966 4760 reconciler_common.go:293] "Volume detached for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e0989b93-a567-4aa1-886e-43b6fa827891-repo-setup-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 08:05:08 crc kubenswrapper[4760]: I0930 08:05:08.523981 4760 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/e0989b93-a567-4aa1-886e-43b6fa827891-ssh-key\") on node \"crc\" DevicePath \"\"" Sep 30 08:05:08 crc kubenswrapper[4760]: I0930 08:05:08.916849 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-vkv7v" event={"ID":"e0989b93-a567-4aa1-886e-43b6fa827891","Type":"ContainerDied","Data":"977f640710ac60dbabcbe02c96ff3a9c5e8910bfc4cf6c1ac4546c212a0e6ede"} Sep 30 08:05:08 crc kubenswrapper[4760]: I0930 08:05:08.917498 4760 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="977f640710ac60dbabcbe02c96ff3a9c5e8910bfc4cf6c1ac4546c212a0e6ede" Sep 30 08:05:08 crc kubenswrapper[4760]: I0930 08:05:08.916925 4760 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-vkv7v" Sep 30 08:05:09 crc kubenswrapper[4760]: I0930 08:05:09.131412 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-69ht5"] Sep 30 08:05:09 crc kubenswrapper[4760]: E0930 08:05:09.132031 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e0989b93-a567-4aa1-886e-43b6fa827891" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Sep 30 08:05:09 crc kubenswrapper[4760]: I0930 08:05:09.132100 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="e0989b93-a567-4aa1-886e-43b6fa827891" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Sep 30 08:05:09 crc kubenswrapper[4760]: I0930 08:05:09.132346 4760 memory_manager.go:354] "RemoveStaleState removing state" podUID="e0989b93-a567-4aa1-886e-43b6fa827891" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Sep 30 08:05:09 crc kubenswrapper[4760]: I0930 08:05:09.133045 4760 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-69ht5" Sep 30 08:05:09 crc kubenswrapper[4760]: I0930 08:05:09.135571 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Sep 30 08:05:09 crc kubenswrapper[4760]: I0930 08:05:09.135951 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Sep 30 08:05:09 crc kubenswrapper[4760]: I0930 08:05:09.136423 4760 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Sep 30 08:05:09 crc kubenswrapper[4760]: I0930 08:05:09.136788 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-8gxrf" Sep 30 08:05:09 crc kubenswrapper[4760]: I0930 08:05:09.137051 4760 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-config" Sep 30 08:05:09 crc kubenswrapper[4760]: I0930 08:05:09.158848 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-69ht5"] Sep 30 08:05:09 crc kubenswrapper[4760]: I0930 08:05:09.238182 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/0f077fda-e7af-42a5-9d0b-f007910f6948-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-69ht5\" (UID: \"0f077fda-e7af-42a5-9d0b-f007910f6948\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-69ht5" Sep 30 08:05:09 crc kubenswrapper[4760]: I0930 08:05:09.238555 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-psz2l\" (UniqueName: \"kubernetes.io/projected/0f077fda-e7af-42a5-9d0b-f007910f6948-kube-api-access-psz2l\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-69ht5\" (UID: 
\"0f077fda-e7af-42a5-9d0b-f007910f6948\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-69ht5" Sep 30 08:05:09 crc kubenswrapper[4760]: I0930 08:05:09.238686 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/0f077fda-e7af-42a5-9d0b-f007910f6948-ssh-key\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-69ht5\" (UID: \"0f077fda-e7af-42a5-9d0b-f007910f6948\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-69ht5" Sep 30 08:05:09 crc kubenswrapper[4760]: I0930 08:05:09.238912 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0f077fda-e7af-42a5-9d0b-f007910f6948-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-69ht5\" (UID: \"0f077fda-e7af-42a5-9d0b-f007910f6948\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-69ht5" Sep 30 08:05:09 crc kubenswrapper[4760]: I0930 08:05:09.239084 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0f077fda-e7af-42a5-9d0b-f007910f6948-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-69ht5\" (UID: \"0f077fda-e7af-42a5-9d0b-f007910f6948\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-69ht5" Sep 30 08:05:09 crc kubenswrapper[4760]: I0930 08:05:09.341259 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0f077fda-e7af-42a5-9d0b-f007910f6948-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-69ht5\" (UID: \"0f077fda-e7af-42a5-9d0b-f007910f6948\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-69ht5" Sep 30 08:05:09 crc kubenswrapper[4760]: I0930 08:05:09.341379 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/0f077fda-e7af-42a5-9d0b-f007910f6948-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-69ht5\" (UID: \"0f077fda-e7af-42a5-9d0b-f007910f6948\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-69ht5" Sep 30 08:05:09 crc kubenswrapper[4760]: I0930 08:05:09.341440 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/0f077fda-e7af-42a5-9d0b-f007910f6948-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-69ht5\" (UID: \"0f077fda-e7af-42a5-9d0b-f007910f6948\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-69ht5" Sep 30 08:05:09 crc kubenswrapper[4760]: I0930 08:05:09.341466 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-psz2l\" (UniqueName: \"kubernetes.io/projected/0f077fda-e7af-42a5-9d0b-f007910f6948-kube-api-access-psz2l\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-69ht5\" (UID: \"0f077fda-e7af-42a5-9d0b-f007910f6948\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-69ht5" Sep 30 08:05:09 crc kubenswrapper[4760]: I0930 08:05:09.341491 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/0f077fda-e7af-42a5-9d0b-f007910f6948-ssh-key\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-69ht5\" (UID: \"0f077fda-e7af-42a5-9d0b-f007910f6948\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-69ht5" Sep 30 08:05:09 crc kubenswrapper[4760]: I0930 08:05:09.342333 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/0f077fda-e7af-42a5-9d0b-f007910f6948-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-69ht5\" (UID: \"0f077fda-e7af-42a5-9d0b-f007910f6948\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-69ht5" Sep 30 08:05:09 crc 
kubenswrapper[4760]: I0930 08:05:09.346695 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0f077fda-e7af-42a5-9d0b-f007910f6948-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-69ht5\" (UID: \"0f077fda-e7af-42a5-9d0b-f007910f6948\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-69ht5" Sep 30 08:05:09 crc kubenswrapper[4760]: I0930 08:05:09.347877 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0f077fda-e7af-42a5-9d0b-f007910f6948-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-69ht5\" (UID: \"0f077fda-e7af-42a5-9d0b-f007910f6948\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-69ht5" Sep 30 08:05:09 crc kubenswrapper[4760]: I0930 08:05:09.348680 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/0f077fda-e7af-42a5-9d0b-f007910f6948-ssh-key\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-69ht5\" (UID: \"0f077fda-e7af-42a5-9d0b-f007910f6948\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-69ht5" Sep 30 08:05:09 crc kubenswrapper[4760]: I0930 08:05:09.363762 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-psz2l\" (UniqueName: \"kubernetes.io/projected/0f077fda-e7af-42a5-9d0b-f007910f6948-kube-api-access-psz2l\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-69ht5\" (UID: \"0f077fda-e7af-42a5-9d0b-f007910f6948\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-69ht5" Sep 30 08:05:09 crc kubenswrapper[4760]: I0930 08:05:09.454085 4760 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-69ht5"
Sep 30 08:05:09 crc kubenswrapper[4760]: I0930 08:05:09.989960 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-69ht5"]
Sep 30 08:05:10 crc kubenswrapper[4760]: I0930 08:05:10.941131 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-69ht5" event={"ID":"0f077fda-e7af-42a5-9d0b-f007910f6948","Type":"ContainerStarted","Data":"ba2ae036254c5932a526c2a3e831a67ba294523c9681e668e436f6863c2f3c89"}
Sep 30 08:05:10 crc kubenswrapper[4760]: I0930 08:05:10.941671 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-69ht5" event={"ID":"0f077fda-e7af-42a5-9d0b-f007910f6948","Type":"ContainerStarted","Data":"115479297cc6746b8403c964f746b70d005d12393ae136a633eebbec681344c7"}
Sep 30 08:05:10 crc kubenswrapper[4760]: I0930 08:05:10.963910 4760 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-69ht5" podStartSLOduration=1.5129805699999999 podStartE2EDuration="1.963891899s" podCreationTimestamp="2025-09-30 08:05:09 +0000 UTC" firstStartedPulling="2025-09-30 08:05:09.996422299 +0000 UTC m=+1895.639328721" lastFinishedPulling="2025-09-30 08:05:10.447333638 +0000 UTC m=+1896.090240050" observedRunningTime="2025-09-30 08:05:10.957568217 +0000 UTC m=+1896.600474649" watchObservedRunningTime="2025-09-30 08:05:10.963891899 +0000 UTC m=+1896.606798311"
Sep 30 08:05:49 crc kubenswrapper[4760]: I0930 08:05:49.112895 4760 patch_prober.go:28] interesting pod/machine-config-daemon-f2lrk container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Sep 30 08:05:49 crc kubenswrapper[4760]: I0930 08:05:49.113763 4760 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-f2lrk" podUID="7a9c8270-6964-4886-87d0-227b05b76da4" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Sep 30 08:06:19 crc kubenswrapper[4760]: I0930 08:06:19.113155 4760 patch_prober.go:28] interesting pod/machine-config-daemon-f2lrk container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Sep 30 08:06:19 crc kubenswrapper[4760]: I0930 08:06:19.113799 4760 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-f2lrk" podUID="7a9c8270-6964-4886-87d0-227b05b76da4" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Sep 30 08:06:23 crc kubenswrapper[4760]: I0930 08:06:23.645825 4760 generic.go:334] "Generic (PLEG): container finished" podID="0f077fda-e7af-42a5-9d0b-f007910f6948" containerID="ba2ae036254c5932a526c2a3e831a67ba294523c9681e668e436f6863c2f3c89" exitCode=0
Sep 30 08:06:23 crc kubenswrapper[4760]: I0930 08:06:23.645940 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-69ht5" event={"ID":"0f077fda-e7af-42a5-9d0b-f007910f6948","Type":"ContainerDied","Data":"ba2ae036254c5932a526c2a3e831a67ba294523c9681e668e436f6863c2f3c89"}
Sep 30 08:06:25 crc kubenswrapper[4760]: I0930 08:06:25.178923 4760 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-69ht5"
Sep 30 08:06:25 crc kubenswrapper[4760]: I0930 08:06:25.212111 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0f077fda-e7af-42a5-9d0b-f007910f6948-ovn-combined-ca-bundle\") pod \"0f077fda-e7af-42a5-9d0b-f007910f6948\" (UID: \"0f077fda-e7af-42a5-9d0b-f007910f6948\") "
Sep 30 08:06:25 crc kubenswrapper[4760]: I0930 08:06:25.212379 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0f077fda-e7af-42a5-9d0b-f007910f6948-inventory\") pod \"0f077fda-e7af-42a5-9d0b-f007910f6948\" (UID: \"0f077fda-e7af-42a5-9d0b-f007910f6948\") "
Sep 30 08:06:25 crc kubenswrapper[4760]: I0930 08:06:25.219254 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0f077fda-e7af-42a5-9d0b-f007910f6948-ovn-combined-ca-bundle" (OuterVolumeSpecName: "ovn-combined-ca-bundle") pod "0f077fda-e7af-42a5-9d0b-f007910f6948" (UID: "0f077fda-e7af-42a5-9d0b-f007910f6948"). InnerVolumeSpecName "ovn-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Sep 30 08:06:25 crc kubenswrapper[4760]: I0930 08:06:25.245523 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0f077fda-e7af-42a5-9d0b-f007910f6948-inventory" (OuterVolumeSpecName: "inventory") pod "0f077fda-e7af-42a5-9d0b-f007910f6948" (UID: "0f077fda-e7af-42a5-9d0b-f007910f6948"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue ""
Sep 30 08:06:25 crc kubenswrapper[4760]: I0930 08:06:25.314703 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/0f077fda-e7af-42a5-9d0b-f007910f6948-ssh-key\") pod \"0f077fda-e7af-42a5-9d0b-f007910f6948\" (UID: \"0f077fda-e7af-42a5-9d0b-f007910f6948\") "
Sep 30 08:06:25 crc kubenswrapper[4760]: I0930 08:06:25.314775 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-psz2l\" (UniqueName: \"kubernetes.io/projected/0f077fda-e7af-42a5-9d0b-f007910f6948-kube-api-access-psz2l\") pod \"0f077fda-e7af-42a5-9d0b-f007910f6948\" (UID: \"0f077fda-e7af-42a5-9d0b-f007910f6948\") "
Sep 30 08:06:25 crc kubenswrapper[4760]: I0930 08:06:25.314869 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/0f077fda-e7af-42a5-9d0b-f007910f6948-ovncontroller-config-0\") pod \"0f077fda-e7af-42a5-9d0b-f007910f6948\" (UID: \"0f077fda-e7af-42a5-9d0b-f007910f6948\") "
Sep 30 08:06:25 crc kubenswrapper[4760]: I0930 08:06:25.315380 4760 reconciler_common.go:293] "Volume detached for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0f077fda-e7af-42a5-9d0b-f007910f6948-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Sep 30 08:06:25 crc kubenswrapper[4760]: I0930 08:06:25.315406 4760 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0f077fda-e7af-42a5-9d0b-f007910f6948-inventory\") on node \"crc\" DevicePath \"\""
Sep 30 08:06:25 crc kubenswrapper[4760]: I0930 08:06:25.317599 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0f077fda-e7af-42a5-9d0b-f007910f6948-kube-api-access-psz2l" (OuterVolumeSpecName: "kube-api-access-psz2l") pod "0f077fda-e7af-42a5-9d0b-f007910f6948" (UID: "0f077fda-e7af-42a5-9d0b-f007910f6948"). InnerVolumeSpecName "kube-api-access-psz2l". PluginName "kubernetes.io/projected", VolumeGidValue ""
Sep 30 08:06:25 crc kubenswrapper[4760]: I0930 08:06:25.341841 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0f077fda-e7af-42a5-9d0b-f007910f6948-ovncontroller-config-0" (OuterVolumeSpecName: "ovncontroller-config-0") pod "0f077fda-e7af-42a5-9d0b-f007910f6948" (UID: "0f077fda-e7af-42a5-9d0b-f007910f6948"). InnerVolumeSpecName "ovncontroller-config-0". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Sep 30 08:06:25 crc kubenswrapper[4760]: I0930 08:06:25.366854 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0f077fda-e7af-42a5-9d0b-f007910f6948-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "0f077fda-e7af-42a5-9d0b-f007910f6948" (UID: "0f077fda-e7af-42a5-9d0b-f007910f6948"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue ""
Sep 30 08:06:25 crc kubenswrapper[4760]: I0930 08:06:25.416927 4760 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/0f077fda-e7af-42a5-9d0b-f007910f6948-ssh-key\") on node \"crc\" DevicePath \"\""
Sep 30 08:06:25 crc kubenswrapper[4760]: I0930 08:06:25.416985 4760 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-psz2l\" (UniqueName: \"kubernetes.io/projected/0f077fda-e7af-42a5-9d0b-f007910f6948-kube-api-access-psz2l\") on node \"crc\" DevicePath \"\""
Sep 30 08:06:25 crc kubenswrapper[4760]: I0930 08:06:25.416998 4760 reconciler_common.go:293] "Volume detached for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/0f077fda-e7af-42a5-9d0b-f007910f6948-ovncontroller-config-0\") on node \"crc\" DevicePath \"\""
Sep 30 08:06:25 crc kubenswrapper[4760]: I0930 08:06:25.672797 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-69ht5" event={"ID":"0f077fda-e7af-42a5-9d0b-f007910f6948","Type":"ContainerDied","Data":"115479297cc6746b8403c964f746b70d005d12393ae136a633eebbec681344c7"}
Sep 30 08:06:25 crc kubenswrapper[4760]: I0930 08:06:25.672857 4760 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-69ht5"
Sep 30 08:06:25 crc kubenswrapper[4760]: I0930 08:06:25.672871 4760 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="115479297cc6746b8403c964f746b70d005d12393ae136a633eebbec681344c7"
Sep 30 08:06:25 crc kubenswrapper[4760]: I0930 08:06:25.813771 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-sz5l4"]
Sep 30 08:06:25 crc kubenswrapper[4760]: E0930 08:06:25.814194 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0f077fda-e7af-42a5-9d0b-f007910f6948" containerName="ovn-edpm-deployment-openstack-edpm-ipam"
Sep 30 08:06:25 crc kubenswrapper[4760]: I0930 08:06:25.814212 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="0f077fda-e7af-42a5-9d0b-f007910f6948" containerName="ovn-edpm-deployment-openstack-edpm-ipam"
Sep 30 08:06:25 crc kubenswrapper[4760]: I0930 08:06:25.814432 4760 memory_manager.go:354] "RemoveStaleState removing state" podUID="0f077fda-e7af-42a5-9d0b-f007910f6948" containerName="ovn-edpm-deployment-openstack-edpm-ipam"
Sep 30 08:06:25 crc kubenswrapper[4760]: I0930 08:06:25.815089 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-sz5l4"
Sep 30 08:06:25 crc kubenswrapper[4760]: I0930 08:06:25.818249 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-neutron-config"
Sep 30 08:06:25 crc kubenswrapper[4760]: I0930 08:06:25.818503 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam"
Sep 30 08:06:25 crc kubenswrapper[4760]: I0930 08:06:25.818748 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-8gxrf"
Sep 30 08:06:25 crc kubenswrapper[4760]: I0930 08:06:25.818914 4760 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Sep 30 08:06:25 crc kubenswrapper[4760]: I0930 08:06:25.819097 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-ovn-metadata-agent-neutron-config"
Sep 30 08:06:25 crc kubenswrapper[4760]: I0930 08:06:25.819237 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret"
Sep 30 08:06:25 crc kubenswrapper[4760]: I0930 08:06:25.823715 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/5259c092-63b5-4574-b14a-725c45523773-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-sz5l4\" (UID: \"5259c092-63b5-4574-b14a-725c45523773\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-sz5l4"
Sep 30 08:06:25 crc kubenswrapper[4760]: I0930 08:06:25.823779 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/5259c092-63b5-4574-b14a-725c45523773-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-sz5l4\" (UID: \"5259c092-63b5-4574-b14a-725c45523773\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-sz5l4"
Sep 30 08:06:25 crc kubenswrapper[4760]: I0930 08:06:25.823870 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5gkfx\" (UniqueName: \"kubernetes.io/projected/5259c092-63b5-4574-b14a-725c45523773-kube-api-access-5gkfx\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-sz5l4\" (UID: \"5259c092-63b5-4574-b14a-725c45523773\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-sz5l4"
Sep 30 08:06:25 crc kubenswrapper[4760]: I0930 08:06:25.823928 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5259c092-63b5-4574-b14a-725c45523773-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-sz5l4\" (UID: \"5259c092-63b5-4574-b14a-725c45523773\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-sz5l4"
Sep 30 08:06:25 crc kubenswrapper[4760]: I0930 08:06:25.824207 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5259c092-63b5-4574-b14a-725c45523773-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-sz5l4\" (UID: \"5259c092-63b5-4574-b14a-725c45523773\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-sz5l4"
Sep 30 08:06:25 crc kubenswrapper[4760]: I0930 08:06:25.824431 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/5259c092-63b5-4574-b14a-725c45523773-ssh-key\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-sz5l4\" (UID: \"5259c092-63b5-4574-b14a-725c45523773\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-sz5l4"
Sep 30 08:06:25 crc kubenswrapper[4760]: I0930 08:06:25.829271 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-sz5l4"]
Sep 30 08:06:25 crc kubenswrapper[4760]: I0930 08:06:25.926270 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5gkfx\" (UniqueName: \"kubernetes.io/projected/5259c092-63b5-4574-b14a-725c45523773-kube-api-access-5gkfx\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-sz5l4\" (UID: \"5259c092-63b5-4574-b14a-725c45523773\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-sz5l4"
Sep 30 08:06:25 crc kubenswrapper[4760]: I0930 08:06:25.926728 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5259c092-63b5-4574-b14a-725c45523773-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-sz5l4\" (UID: \"5259c092-63b5-4574-b14a-725c45523773\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-sz5l4"
Sep 30 08:06:25 crc kubenswrapper[4760]: I0930 08:06:25.926898 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5259c092-63b5-4574-b14a-725c45523773-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-sz5l4\" (UID: \"5259c092-63b5-4574-b14a-725c45523773\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-sz5l4"
Sep 30 08:06:25 crc kubenswrapper[4760]: I0930 08:06:25.927115 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/5259c092-63b5-4574-b14a-725c45523773-ssh-key\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-sz5l4\" (UID: \"5259c092-63b5-4574-b14a-725c45523773\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-sz5l4"
Sep 30 08:06:25 crc kubenswrapper[4760]: I0930 08:06:25.927583 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/5259c092-63b5-4574-b14a-725c45523773-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-sz5l4\" (UID: \"5259c092-63b5-4574-b14a-725c45523773\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-sz5l4"
Sep 30 08:06:25 crc kubenswrapper[4760]: I0930 08:06:25.927716 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/5259c092-63b5-4574-b14a-725c45523773-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-sz5l4\" (UID: \"5259c092-63b5-4574-b14a-725c45523773\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-sz5l4"
Sep 30 08:06:25 crc kubenswrapper[4760]: I0930 08:06:25.932639 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5259c092-63b5-4574-b14a-725c45523773-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-sz5l4\" (UID: \"5259c092-63b5-4574-b14a-725c45523773\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-sz5l4"
Sep 30 08:06:25 crc kubenswrapper[4760]: I0930 08:06:25.933225 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/5259c092-63b5-4574-b14a-725c45523773-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-sz5l4\" (UID: \"5259c092-63b5-4574-b14a-725c45523773\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-sz5l4"
Sep 30 08:06:25 crc kubenswrapper[4760]: I0930 08:06:25.934441 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/5259c092-63b5-4574-b14a-725c45523773-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-sz5l4\" (UID: \"5259c092-63b5-4574-b14a-725c45523773\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-sz5l4"
Sep 30 08:06:25 crc kubenswrapper[4760]: I0930 08:06:25.935120 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/5259c092-63b5-4574-b14a-725c45523773-ssh-key\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-sz5l4\" (UID: \"5259c092-63b5-4574-b14a-725c45523773\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-sz5l4"
Sep 30 08:06:25 crc kubenswrapper[4760]: I0930 08:06:25.935177 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5259c092-63b5-4574-b14a-725c45523773-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-sz5l4\" (UID: \"5259c092-63b5-4574-b14a-725c45523773\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-sz5l4"
Sep 30 08:06:25 crc kubenswrapper[4760]: I0930 08:06:25.945239 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5gkfx\" (UniqueName: \"kubernetes.io/projected/5259c092-63b5-4574-b14a-725c45523773-kube-api-access-5gkfx\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-sz5l4\" (UID: \"5259c092-63b5-4574-b14a-725c45523773\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-sz5l4"
Sep 30 08:06:26 crc kubenswrapper[4760]: I0930 08:06:26.135156 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-sz5l4"
Sep 30 08:06:26 crc kubenswrapper[4760]: I0930 08:06:26.635857 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-sz5l4"]
Sep 30 08:06:26 crc kubenswrapper[4760]: I0930 08:06:26.680936 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-sz5l4" event={"ID":"5259c092-63b5-4574-b14a-725c45523773","Type":"ContainerStarted","Data":"9544840171ae54d2ef445a401883676771a3d5a91a4a4edcfa549f6cc6efd676"}
Sep 30 08:06:27 crc kubenswrapper[4760]: I0930 08:06:27.691533 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-sz5l4" event={"ID":"5259c092-63b5-4574-b14a-725c45523773","Type":"ContainerStarted","Data":"f07303596f8dc98b339075f3a5ab8038201953718f90031d62f795fbe8caf388"}
Sep 30 08:06:27 crc kubenswrapper[4760]: I0930 08:06:27.715473 4760 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-sz5l4" podStartSLOduration=2.181670989 podStartE2EDuration="2.715449091s" podCreationTimestamp="2025-09-30 08:06:25 +0000 UTC" firstStartedPulling="2025-09-30 08:06:26.644267674 +0000 UTC m=+1972.287174086" lastFinishedPulling="2025-09-30 08:06:27.178045776 +0000 UTC m=+1972.820952188" observedRunningTime="2025-09-30 08:06:27.708229546 +0000 UTC m=+1973.351135988" watchObservedRunningTime="2025-09-30 08:06:27.715449091 +0000 UTC m=+1973.358355503"
Sep 30 08:06:49 crc kubenswrapper[4760]: I0930 08:06:49.113181 4760 patch_prober.go:28] interesting pod/machine-config-daemon-f2lrk container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Sep 30 08:06:49 crc kubenswrapper[4760]: I0930 08:06:49.113822 4760 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-f2lrk" podUID="7a9c8270-6964-4886-87d0-227b05b76da4" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Sep 30 08:06:49 crc kubenswrapper[4760]: I0930 08:06:49.113897 4760 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-f2lrk"
Sep 30 08:06:49 crc kubenswrapper[4760]: I0930 08:06:49.114717 4760 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"96fa7f3155734522f8d82f257a6244d3c4526c33face1ad5feeb1f8274d4d3e8"} pod="openshift-machine-config-operator/machine-config-daemon-f2lrk" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Sep 30 08:06:49 crc kubenswrapper[4760]: I0930 08:06:49.114781 4760 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-f2lrk" podUID="7a9c8270-6964-4886-87d0-227b05b76da4" containerName="machine-config-daemon" containerID="cri-o://96fa7f3155734522f8d82f257a6244d3c4526c33face1ad5feeb1f8274d4d3e8" gracePeriod=600
Sep 30 08:06:49 crc kubenswrapper[4760]: I0930 08:06:49.915912 4760 generic.go:334] "Generic (PLEG): container finished" podID="7a9c8270-6964-4886-87d0-227b05b76da4" containerID="96fa7f3155734522f8d82f257a6244d3c4526c33face1ad5feeb1f8274d4d3e8" exitCode=0
Sep 30 08:06:49 crc kubenswrapper[4760]: I0930 08:06:49.915987 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-f2lrk" event={"ID":"7a9c8270-6964-4886-87d0-227b05b76da4","Type":"ContainerDied","Data":"96fa7f3155734522f8d82f257a6244d3c4526c33face1ad5feeb1f8274d4d3e8"}
Sep 30 08:06:49 crc kubenswrapper[4760]: I0930 08:06:49.916532 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-f2lrk" event={"ID":"7a9c8270-6964-4886-87d0-227b05b76da4","Type":"ContainerStarted","Data":"78abf2ac3aa0bb6cb3e98a1200a88befbc3a37ea54b03c8ea3af358b0119d01b"}
Sep 30 08:06:49 crc kubenswrapper[4760]: I0930 08:06:49.916578 4760 scope.go:117] "RemoveContainer" containerID="6e360b7dc465c02adf99299c3bccec940fec7c45f12ed3788ad43373bef9d4f8"
Sep 30 08:07:22 crc kubenswrapper[4760]: I0930 08:07:22.242705 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-sz5l4" event={"ID":"5259c092-63b5-4574-b14a-725c45523773","Type":"ContainerDied","Data":"f07303596f8dc98b339075f3a5ab8038201953718f90031d62f795fbe8caf388"}
Sep 30 08:07:22 crc kubenswrapper[4760]: I0930 08:07:22.242653 4760 generic.go:334] "Generic (PLEG): container finished" podID="5259c092-63b5-4574-b14a-725c45523773" containerID="f07303596f8dc98b339075f3a5ab8038201953718f90031d62f795fbe8caf388" exitCode=0
Sep 30 08:07:23 crc kubenswrapper[4760]: I0930 08:07:23.750890 4760 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-sz5l4"
Sep 30 08:07:23 crc kubenswrapper[4760]: I0930 08:07:23.870993 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/5259c092-63b5-4574-b14a-725c45523773-ssh-key\") pod \"5259c092-63b5-4574-b14a-725c45523773\" (UID: \"5259c092-63b5-4574-b14a-725c45523773\") "
Sep 30 08:07:23 crc kubenswrapper[4760]: I0930 08:07:23.871346 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/5259c092-63b5-4574-b14a-725c45523773-neutron-ovn-metadata-agent-neutron-config-0\") pod \"5259c092-63b5-4574-b14a-725c45523773\" (UID: \"5259c092-63b5-4574-b14a-725c45523773\") "
Sep 30 08:07:23 crc kubenswrapper[4760]: I0930 08:07:23.871411 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5259c092-63b5-4574-b14a-725c45523773-inventory\") pod \"5259c092-63b5-4574-b14a-725c45523773\" (UID: \"5259c092-63b5-4574-b14a-725c45523773\") "
Sep 30 08:07:23 crc kubenswrapper[4760]: I0930 08:07:23.871517 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5259c092-63b5-4574-b14a-725c45523773-neutron-metadata-combined-ca-bundle\") pod \"5259c092-63b5-4574-b14a-725c45523773\" (UID: \"5259c092-63b5-4574-b14a-725c45523773\") "
Sep 30 08:07:23 crc kubenswrapper[4760]: I0930 08:07:23.871604 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5gkfx\" (UniqueName: \"kubernetes.io/projected/5259c092-63b5-4574-b14a-725c45523773-kube-api-access-5gkfx\") pod \"5259c092-63b5-4574-b14a-725c45523773\" (UID: \"5259c092-63b5-4574-b14a-725c45523773\") "
Sep 30 08:07:23 crc kubenswrapper[4760]: I0930 08:07:23.871653 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/5259c092-63b5-4574-b14a-725c45523773-nova-metadata-neutron-config-0\") pod \"5259c092-63b5-4574-b14a-725c45523773\" (UID: \"5259c092-63b5-4574-b14a-725c45523773\") "
Sep 30 08:07:23 crc kubenswrapper[4760]: I0930 08:07:23.878519 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5259c092-63b5-4574-b14a-725c45523773-neutron-metadata-combined-ca-bundle" (OuterVolumeSpecName: "neutron-metadata-combined-ca-bundle") pod "5259c092-63b5-4574-b14a-725c45523773" (UID: "5259c092-63b5-4574-b14a-725c45523773"). InnerVolumeSpecName "neutron-metadata-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Sep 30 08:07:23 crc kubenswrapper[4760]: I0930 08:07:23.882134 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5259c092-63b5-4574-b14a-725c45523773-kube-api-access-5gkfx" (OuterVolumeSpecName: "kube-api-access-5gkfx") pod "5259c092-63b5-4574-b14a-725c45523773" (UID: "5259c092-63b5-4574-b14a-725c45523773"). InnerVolumeSpecName "kube-api-access-5gkfx". PluginName "kubernetes.io/projected", VolumeGidValue ""
Sep 30 08:07:23 crc kubenswrapper[4760]: I0930 08:07:23.913454 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5259c092-63b5-4574-b14a-725c45523773-nova-metadata-neutron-config-0" (OuterVolumeSpecName: "nova-metadata-neutron-config-0") pod "5259c092-63b5-4574-b14a-725c45523773" (UID: "5259c092-63b5-4574-b14a-725c45523773"). InnerVolumeSpecName "nova-metadata-neutron-config-0". PluginName "kubernetes.io/secret", VolumeGidValue ""
Sep 30 08:07:23 crc kubenswrapper[4760]: I0930 08:07:23.924954 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5259c092-63b5-4574-b14a-725c45523773-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "5259c092-63b5-4574-b14a-725c45523773" (UID: "5259c092-63b5-4574-b14a-725c45523773"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue ""
Sep 30 08:07:23 crc kubenswrapper[4760]: I0930 08:07:23.929466 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5259c092-63b5-4574-b14a-725c45523773-inventory" (OuterVolumeSpecName: "inventory") pod "5259c092-63b5-4574-b14a-725c45523773" (UID: "5259c092-63b5-4574-b14a-725c45523773"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue ""
Sep 30 08:07:23 crc kubenswrapper[4760]: I0930 08:07:23.956598 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5259c092-63b5-4574-b14a-725c45523773-neutron-ovn-metadata-agent-neutron-config-0" (OuterVolumeSpecName: "neutron-ovn-metadata-agent-neutron-config-0") pod "5259c092-63b5-4574-b14a-725c45523773" (UID: "5259c092-63b5-4574-b14a-725c45523773"). InnerVolumeSpecName "neutron-ovn-metadata-agent-neutron-config-0". PluginName "kubernetes.io/secret", VolumeGidValue ""
Sep 30 08:07:23 crc kubenswrapper[4760]: I0930 08:07:23.974108 4760 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/5259c092-63b5-4574-b14a-725c45523773-ssh-key\") on node \"crc\" DevicePath \"\""
Sep 30 08:07:23 crc kubenswrapper[4760]: I0930 08:07:23.974148 4760 reconciler_common.go:293] "Volume detached for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/5259c092-63b5-4574-b14a-725c45523773-neutron-ovn-metadata-agent-neutron-config-0\") on node \"crc\" DevicePath \"\""
Sep 30 08:07:23 crc kubenswrapper[4760]: I0930 08:07:23.974164 4760 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5259c092-63b5-4574-b14a-725c45523773-inventory\") on node \"crc\" DevicePath \"\""
Sep 30 08:07:23 crc kubenswrapper[4760]: I0930 08:07:23.974179 4760 reconciler_common.go:293] "Volume detached for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5259c092-63b5-4574-b14a-725c45523773-neutron-metadata-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Sep 30 08:07:23 crc kubenswrapper[4760]: I0930 08:07:23.974498 4760 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5gkfx\" (UniqueName: \"kubernetes.io/projected/5259c092-63b5-4574-b14a-725c45523773-kube-api-access-5gkfx\") on node \"crc\" DevicePath \"\""
Sep 30 08:07:23 crc kubenswrapper[4760]: I0930 08:07:23.974543 4760 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/5259c092-63b5-4574-b14a-725c45523773-nova-metadata-neutron-config-0\") on node \"crc\" DevicePath \"\""
Sep 30 08:07:24 crc kubenswrapper[4760]: I0930 08:07:24.271080 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-sz5l4" event={"ID":"5259c092-63b5-4574-b14a-725c45523773","Type":"ContainerDied","Data":"9544840171ae54d2ef445a401883676771a3d5a91a4a4edcfa549f6cc6efd676"}
Sep 30 08:07:24 crc kubenswrapper[4760]: I0930 08:07:24.271124 4760 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9544840171ae54d2ef445a401883676771a3d5a91a4a4edcfa549f6cc6efd676"
Sep 30 08:07:24 crc kubenswrapper[4760]: I0930 08:07:24.271253 4760 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-sz5l4"
Sep 30 08:07:24 crc kubenswrapper[4760]: I0930 08:07:24.422332 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-s99vh"]
Sep 30 08:07:24 crc kubenswrapper[4760]: E0930 08:07:24.423088 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5259c092-63b5-4574-b14a-725c45523773" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam"
Sep 30 08:07:24 crc kubenswrapper[4760]: I0930 08:07:24.423110 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="5259c092-63b5-4574-b14a-725c45523773" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam"
Sep 30 08:07:24 crc kubenswrapper[4760]: I0930 08:07:24.423451 4760 memory_manager.go:354] "RemoveStaleState removing state" podUID="5259c092-63b5-4574-b14a-725c45523773" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam"
Sep 30 08:07:24 crc kubenswrapper[4760]: I0930 08:07:24.424268 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-s99vh"
Sep 30 08:07:24 crc kubenswrapper[4760]: I0930 08:07:24.428751 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam"
Sep 30 08:07:24 crc kubenswrapper[4760]: I0930 08:07:24.429029 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-8gxrf"
Sep 30 08:07:24 crc kubenswrapper[4760]: I0930 08:07:24.429222 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"libvirt-secret"
Sep 30 08:07:24 crc kubenswrapper[4760]: I0930 08:07:24.429381 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret"
Sep 30 08:07:24 crc kubenswrapper[4760]: I0930 08:07:24.430003 4760 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Sep 30 08:07:24 crc kubenswrapper[4760]: I0930 08:07:24.434559 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-s99vh"]
Sep 30 08:07:24 crc kubenswrapper[4760]: I0930 08:07:24.585973 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ftx4r\" (UniqueName: \"kubernetes.io/projected/d904db1f-5f11-47d3-8823-ff59f4bed296-kube-api-access-ftx4r\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-s99vh\" (UID: \"d904db1f-5f11-47d3-8823-ff59f4bed296\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-s99vh"
Sep 30 08:07:24 crc kubenswrapper[4760]: I0930 08:07:24.586027 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d904db1f-5f11-47d3-8823-ff59f4bed296-ssh-key\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-s99vh\" (UID: \"d904db1f-5f11-47d3-8823-ff59f4bed296\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-s99vh"
Sep 30 08:07:24 crc kubenswrapper[4760]: I0930 08:07:24.586285 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d904db1f-5f11-47d3-8823-ff59f4bed296-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-s99vh\" (UID: \"d904db1f-5f11-47d3-8823-ff59f4bed296\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-s99vh"
Sep 30 08:07:24 crc kubenswrapper[4760]: I0930 08:07:24.586605 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/d904db1f-5f11-47d3-8823-ff59f4bed296-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-s99vh\" (UID: \"d904db1f-5f11-47d3-8823-ff59f4bed296\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-s99vh"
Sep 30 08:07:24 crc kubenswrapper[4760]: I0930 08:07:24.586643 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d904db1f-5f11-47d3-8823-ff59f4bed296-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-s99vh\" (UID: \"d904db1f-5f11-47d3-8823-ff59f4bed296\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-s99vh"
Sep 30 08:07:24 crc kubenswrapper[4760]: I0930 08:07:24.688381 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d904db1f-5f11-47d3-8823-ff59f4bed296-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-s99vh\" (UID: \"d904db1f-5f11-47d3-8823-ff59f4bed296\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-s99vh"
Sep 30 08:07:24 crc kubenswrapper[4760]: I0930 08:07:24.688491 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume
\"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/d904db1f-5f11-47d3-8823-ff59f4bed296-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-s99vh\" (UID: \"d904db1f-5f11-47d3-8823-ff59f4bed296\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-s99vh" Sep 30 08:07:24 crc kubenswrapper[4760]: I0930 08:07:24.688518 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d904db1f-5f11-47d3-8823-ff59f4bed296-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-s99vh\" (UID: \"d904db1f-5f11-47d3-8823-ff59f4bed296\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-s99vh" Sep 30 08:07:24 crc kubenswrapper[4760]: I0930 08:07:24.688606 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ftx4r\" (UniqueName: \"kubernetes.io/projected/d904db1f-5f11-47d3-8823-ff59f4bed296-kube-api-access-ftx4r\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-s99vh\" (UID: \"d904db1f-5f11-47d3-8823-ff59f4bed296\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-s99vh" Sep 30 08:07:24 crc kubenswrapper[4760]: I0930 08:07:24.688631 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d904db1f-5f11-47d3-8823-ff59f4bed296-ssh-key\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-s99vh\" (UID: \"d904db1f-5f11-47d3-8823-ff59f4bed296\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-s99vh" Sep 30 08:07:24 crc kubenswrapper[4760]: I0930 08:07:24.695425 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d904db1f-5f11-47d3-8823-ff59f4bed296-ssh-key\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-s99vh\" (UID: \"d904db1f-5f11-47d3-8823-ff59f4bed296\") " 
pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-s99vh" Sep 30 08:07:24 crc kubenswrapper[4760]: I0930 08:07:24.695718 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/d904db1f-5f11-47d3-8823-ff59f4bed296-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-s99vh\" (UID: \"d904db1f-5f11-47d3-8823-ff59f4bed296\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-s99vh" Sep 30 08:07:24 crc kubenswrapper[4760]: I0930 08:07:24.696714 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d904db1f-5f11-47d3-8823-ff59f4bed296-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-s99vh\" (UID: \"d904db1f-5f11-47d3-8823-ff59f4bed296\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-s99vh" Sep 30 08:07:24 crc kubenswrapper[4760]: I0930 08:07:24.701093 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d904db1f-5f11-47d3-8823-ff59f4bed296-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-s99vh\" (UID: \"d904db1f-5f11-47d3-8823-ff59f4bed296\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-s99vh" Sep 30 08:07:24 crc kubenswrapper[4760]: I0930 08:07:24.719347 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ftx4r\" (UniqueName: \"kubernetes.io/projected/d904db1f-5f11-47d3-8823-ff59f4bed296-kube-api-access-ftx4r\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-s99vh\" (UID: \"d904db1f-5f11-47d3-8823-ff59f4bed296\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-s99vh" Sep 30 08:07:24 crc kubenswrapper[4760]: I0930 08:07:24.747541 4760 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-s99vh" Sep 30 08:07:25 crc kubenswrapper[4760]: I0930 08:07:25.310017 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-s99vh"] Sep 30 08:07:26 crc kubenswrapper[4760]: I0930 08:07:26.296195 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-s99vh" event={"ID":"d904db1f-5f11-47d3-8823-ff59f4bed296","Type":"ContainerStarted","Data":"df7be1b2580f215d4f56ad3717d815b44e24470cfe6d52b71587714e4fdc96c0"} Sep 30 08:07:26 crc kubenswrapper[4760]: I0930 08:07:26.296561 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-s99vh" event={"ID":"d904db1f-5f11-47d3-8823-ff59f4bed296","Type":"ContainerStarted","Data":"adde9fb37b3f74d8009d44118f6cebf8762dd5fb7a1400de007297bc0e60ed6c"} Sep 30 08:07:26 crc kubenswrapper[4760]: I0930 08:07:26.319893 4760 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-s99vh" podStartSLOduration=1.8365457809999999 podStartE2EDuration="2.319878691s" podCreationTimestamp="2025-09-30 08:07:24 +0000 UTC" firstStartedPulling="2025-09-30 08:07:25.309369638 +0000 UTC m=+2030.952276060" lastFinishedPulling="2025-09-30 08:07:25.792702558 +0000 UTC m=+2031.435608970" observedRunningTime="2025-09-30 08:07:26.311763873 +0000 UTC m=+2031.954670295" watchObservedRunningTime="2025-09-30 08:07:26.319878691 +0000 UTC m=+2031.962785103" Sep 30 08:08:49 crc kubenswrapper[4760]: I0930 08:08:49.113578 4760 patch_prober.go:28] interesting pod/machine-config-daemon-f2lrk container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 08:08:49 crc kubenswrapper[4760]: 
I0930 08:08:49.114375 4760 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-f2lrk" podUID="7a9c8270-6964-4886-87d0-227b05b76da4" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 30 08:09:04 crc kubenswrapper[4760]: I0930 08:09:04.267600 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-6sdhn"] Sep 30 08:09:04 crc kubenswrapper[4760]: I0930 08:09:04.270125 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-6sdhn" Sep 30 08:09:04 crc kubenswrapper[4760]: I0930 08:09:04.296022 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-6sdhn"] Sep 30 08:09:04 crc kubenswrapper[4760]: I0930 08:09:04.431950 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/310f917f-7a30-49c7-abc4-adee818ba58e-utilities\") pod \"redhat-marketplace-6sdhn\" (UID: \"310f917f-7a30-49c7-abc4-adee818ba58e\") " pod="openshift-marketplace/redhat-marketplace-6sdhn" Sep 30 08:09:04 crc kubenswrapper[4760]: I0930 08:09:04.432055 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bz2pk\" (UniqueName: \"kubernetes.io/projected/310f917f-7a30-49c7-abc4-adee818ba58e-kube-api-access-bz2pk\") pod \"redhat-marketplace-6sdhn\" (UID: \"310f917f-7a30-49c7-abc4-adee818ba58e\") " pod="openshift-marketplace/redhat-marketplace-6sdhn" Sep 30 08:09:04 crc kubenswrapper[4760]: I0930 08:09:04.432190 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/310f917f-7a30-49c7-abc4-adee818ba58e-catalog-content\") pod 
\"redhat-marketplace-6sdhn\" (UID: \"310f917f-7a30-49c7-abc4-adee818ba58e\") " pod="openshift-marketplace/redhat-marketplace-6sdhn" Sep 30 08:09:04 crc kubenswrapper[4760]: I0930 08:09:04.534366 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bz2pk\" (UniqueName: \"kubernetes.io/projected/310f917f-7a30-49c7-abc4-adee818ba58e-kube-api-access-bz2pk\") pod \"redhat-marketplace-6sdhn\" (UID: \"310f917f-7a30-49c7-abc4-adee818ba58e\") " pod="openshift-marketplace/redhat-marketplace-6sdhn" Sep 30 08:09:04 crc kubenswrapper[4760]: I0930 08:09:04.534495 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/310f917f-7a30-49c7-abc4-adee818ba58e-catalog-content\") pod \"redhat-marketplace-6sdhn\" (UID: \"310f917f-7a30-49c7-abc4-adee818ba58e\") " pod="openshift-marketplace/redhat-marketplace-6sdhn" Sep 30 08:09:04 crc kubenswrapper[4760]: I0930 08:09:04.534585 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/310f917f-7a30-49c7-abc4-adee818ba58e-utilities\") pod \"redhat-marketplace-6sdhn\" (UID: \"310f917f-7a30-49c7-abc4-adee818ba58e\") " pod="openshift-marketplace/redhat-marketplace-6sdhn" Sep 30 08:09:04 crc kubenswrapper[4760]: I0930 08:09:04.535060 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/310f917f-7a30-49c7-abc4-adee818ba58e-utilities\") pod \"redhat-marketplace-6sdhn\" (UID: \"310f917f-7a30-49c7-abc4-adee818ba58e\") " pod="openshift-marketplace/redhat-marketplace-6sdhn" Sep 30 08:09:04 crc kubenswrapper[4760]: I0930 08:09:04.535251 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/310f917f-7a30-49c7-abc4-adee818ba58e-catalog-content\") pod \"redhat-marketplace-6sdhn\" (UID: 
\"310f917f-7a30-49c7-abc4-adee818ba58e\") " pod="openshift-marketplace/redhat-marketplace-6sdhn" Sep 30 08:09:04 crc kubenswrapper[4760]: I0930 08:09:04.555722 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bz2pk\" (UniqueName: \"kubernetes.io/projected/310f917f-7a30-49c7-abc4-adee818ba58e-kube-api-access-bz2pk\") pod \"redhat-marketplace-6sdhn\" (UID: \"310f917f-7a30-49c7-abc4-adee818ba58e\") " pod="openshift-marketplace/redhat-marketplace-6sdhn" Sep 30 08:09:04 crc kubenswrapper[4760]: I0930 08:09:04.599845 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-6sdhn" Sep 30 08:09:05 crc kubenswrapper[4760]: I0930 08:09:05.056276 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-6sdhn"] Sep 30 08:09:05 crc kubenswrapper[4760]: I0930 08:09:05.402232 4760 generic.go:334] "Generic (PLEG): container finished" podID="310f917f-7a30-49c7-abc4-adee818ba58e" containerID="b25764bfbf9ad98fa9105f490fa661bde7bea31642e2e9e715ac2b391392aa28" exitCode=0 Sep 30 08:09:05 crc kubenswrapper[4760]: I0930 08:09:05.402310 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6sdhn" event={"ID":"310f917f-7a30-49c7-abc4-adee818ba58e","Type":"ContainerDied","Data":"b25764bfbf9ad98fa9105f490fa661bde7bea31642e2e9e715ac2b391392aa28"} Sep 30 08:09:05 crc kubenswrapper[4760]: I0930 08:09:05.402370 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6sdhn" event={"ID":"310f917f-7a30-49c7-abc4-adee818ba58e","Type":"ContainerStarted","Data":"e87cfad90e57f70dbc7dd4dec5ab8393bb1b154fabdea2b3ac768ab715f3f5bd"} Sep 30 08:09:05 crc kubenswrapper[4760]: I0930 08:09:05.404479 4760 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Sep 30 08:09:06 crc kubenswrapper[4760]: I0930 08:09:06.413454 4760 
generic.go:334] "Generic (PLEG): container finished" podID="310f917f-7a30-49c7-abc4-adee818ba58e" containerID="474b670cd3e3f3ba4aa97d8715ee2d8451bc901d77860a5e909cd5e4b12b42a7" exitCode=0 Sep 30 08:09:06 crc kubenswrapper[4760]: I0930 08:09:06.413507 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6sdhn" event={"ID":"310f917f-7a30-49c7-abc4-adee818ba58e","Type":"ContainerDied","Data":"474b670cd3e3f3ba4aa97d8715ee2d8451bc901d77860a5e909cd5e4b12b42a7"} Sep 30 08:09:07 crc kubenswrapper[4760]: I0930 08:09:07.425168 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6sdhn" event={"ID":"310f917f-7a30-49c7-abc4-adee818ba58e","Type":"ContainerStarted","Data":"0823787d33fdef8f85f4b6127c397e3624fbeadcabe5f7b9563eee35c7471ace"} Sep 30 08:09:07 crc kubenswrapper[4760]: I0930 08:09:07.442043 4760 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-6sdhn" podStartSLOduration=1.956291995 podStartE2EDuration="3.442020051s" podCreationTimestamp="2025-09-30 08:09:04 +0000 UTC" firstStartedPulling="2025-09-30 08:09:05.404215254 +0000 UTC m=+2131.047121666" lastFinishedPulling="2025-09-30 08:09:06.88994331 +0000 UTC m=+2132.532849722" observedRunningTime="2025-09-30 08:09:07.440108992 +0000 UTC m=+2133.083015404" watchObservedRunningTime="2025-09-30 08:09:07.442020051 +0000 UTC m=+2133.084926463" Sep 30 08:09:14 crc kubenswrapper[4760]: I0930 08:09:14.600246 4760 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-6sdhn" Sep 30 08:09:14 crc kubenswrapper[4760]: I0930 08:09:14.600894 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-6sdhn" Sep 30 08:09:14 crc kubenswrapper[4760]: I0930 08:09:14.646554 4760 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openshift-marketplace/redhat-marketplace-6sdhn" Sep 30 08:09:15 crc kubenswrapper[4760]: I0930 08:09:15.572199 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-6sdhn" Sep 30 08:09:15 crc kubenswrapper[4760]: I0930 08:09:15.621556 4760 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-6sdhn"] Sep 30 08:09:17 crc kubenswrapper[4760]: I0930 08:09:17.532035 4760 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-6sdhn" podUID="310f917f-7a30-49c7-abc4-adee818ba58e" containerName="registry-server" containerID="cri-o://0823787d33fdef8f85f4b6127c397e3624fbeadcabe5f7b9563eee35c7471ace" gracePeriod=2 Sep 30 08:09:18 crc kubenswrapper[4760]: I0930 08:09:18.003519 4760 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-6sdhn" Sep 30 08:09:18 crc kubenswrapper[4760]: I0930 08:09:18.012100 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bz2pk\" (UniqueName: \"kubernetes.io/projected/310f917f-7a30-49c7-abc4-adee818ba58e-kube-api-access-bz2pk\") pod \"310f917f-7a30-49c7-abc4-adee818ba58e\" (UID: \"310f917f-7a30-49c7-abc4-adee818ba58e\") " Sep 30 08:09:18 crc kubenswrapper[4760]: I0930 08:09:18.012231 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/310f917f-7a30-49c7-abc4-adee818ba58e-utilities\") pod \"310f917f-7a30-49c7-abc4-adee818ba58e\" (UID: \"310f917f-7a30-49c7-abc4-adee818ba58e\") " Sep 30 08:09:18 crc kubenswrapper[4760]: I0930 08:09:18.012289 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/310f917f-7a30-49c7-abc4-adee818ba58e-catalog-content\") pod 
\"310f917f-7a30-49c7-abc4-adee818ba58e\" (UID: \"310f917f-7a30-49c7-abc4-adee818ba58e\") " Sep 30 08:09:18 crc kubenswrapper[4760]: I0930 08:09:18.013327 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/310f917f-7a30-49c7-abc4-adee818ba58e-utilities" (OuterVolumeSpecName: "utilities") pod "310f917f-7a30-49c7-abc4-adee818ba58e" (UID: "310f917f-7a30-49c7-abc4-adee818ba58e"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 08:09:18 crc kubenswrapper[4760]: I0930 08:09:18.018481 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/310f917f-7a30-49c7-abc4-adee818ba58e-kube-api-access-bz2pk" (OuterVolumeSpecName: "kube-api-access-bz2pk") pod "310f917f-7a30-49c7-abc4-adee818ba58e" (UID: "310f917f-7a30-49c7-abc4-adee818ba58e"). InnerVolumeSpecName "kube-api-access-bz2pk". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 08:09:18 crc kubenswrapper[4760]: I0930 08:09:18.041779 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/310f917f-7a30-49c7-abc4-adee818ba58e-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "310f917f-7a30-49c7-abc4-adee818ba58e" (UID: "310f917f-7a30-49c7-abc4-adee818ba58e"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 08:09:18 crc kubenswrapper[4760]: I0930 08:09:18.114244 4760 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bz2pk\" (UniqueName: \"kubernetes.io/projected/310f917f-7a30-49c7-abc4-adee818ba58e-kube-api-access-bz2pk\") on node \"crc\" DevicePath \"\"" Sep 30 08:09:18 crc kubenswrapper[4760]: I0930 08:09:18.114280 4760 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/310f917f-7a30-49c7-abc4-adee818ba58e-utilities\") on node \"crc\" DevicePath \"\"" Sep 30 08:09:18 crc kubenswrapper[4760]: I0930 08:09:18.114292 4760 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/310f917f-7a30-49c7-abc4-adee818ba58e-catalog-content\") on node \"crc\" DevicePath \"\"" Sep 30 08:09:18 crc kubenswrapper[4760]: I0930 08:09:18.542105 4760 generic.go:334] "Generic (PLEG): container finished" podID="310f917f-7a30-49c7-abc4-adee818ba58e" containerID="0823787d33fdef8f85f4b6127c397e3624fbeadcabe5f7b9563eee35c7471ace" exitCode=0 Sep 30 08:09:18 crc kubenswrapper[4760]: I0930 08:09:18.542147 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6sdhn" event={"ID":"310f917f-7a30-49c7-abc4-adee818ba58e","Type":"ContainerDied","Data":"0823787d33fdef8f85f4b6127c397e3624fbeadcabe5f7b9563eee35c7471ace"} Sep 30 08:09:18 crc kubenswrapper[4760]: I0930 08:09:18.542173 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6sdhn" event={"ID":"310f917f-7a30-49c7-abc4-adee818ba58e","Type":"ContainerDied","Data":"e87cfad90e57f70dbc7dd4dec5ab8393bb1b154fabdea2b3ac768ab715f3f5bd"} Sep 30 08:09:18 crc kubenswrapper[4760]: I0930 08:09:18.542188 4760 scope.go:117] "RemoveContainer" containerID="0823787d33fdef8f85f4b6127c397e3624fbeadcabe5f7b9563eee35c7471ace" Sep 30 08:09:18 crc kubenswrapper[4760]: I0930 
08:09:18.542197 4760 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-6sdhn" Sep 30 08:09:18 crc kubenswrapper[4760]: I0930 08:09:18.561846 4760 scope.go:117] "RemoveContainer" containerID="474b670cd3e3f3ba4aa97d8715ee2d8451bc901d77860a5e909cd5e4b12b42a7" Sep 30 08:09:18 crc kubenswrapper[4760]: I0930 08:09:18.582939 4760 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-6sdhn"] Sep 30 08:09:18 crc kubenswrapper[4760]: I0930 08:09:18.596576 4760 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-6sdhn"] Sep 30 08:09:18 crc kubenswrapper[4760]: I0930 08:09:18.597648 4760 scope.go:117] "RemoveContainer" containerID="b25764bfbf9ad98fa9105f490fa661bde7bea31642e2e9e715ac2b391392aa28" Sep 30 08:09:18 crc kubenswrapper[4760]: I0930 08:09:18.650646 4760 scope.go:117] "RemoveContainer" containerID="0823787d33fdef8f85f4b6127c397e3624fbeadcabe5f7b9563eee35c7471ace" Sep 30 08:09:18 crc kubenswrapper[4760]: E0930 08:09:18.651436 4760 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0823787d33fdef8f85f4b6127c397e3624fbeadcabe5f7b9563eee35c7471ace\": container with ID starting with 0823787d33fdef8f85f4b6127c397e3624fbeadcabe5f7b9563eee35c7471ace not found: ID does not exist" containerID="0823787d33fdef8f85f4b6127c397e3624fbeadcabe5f7b9563eee35c7471ace" Sep 30 08:09:18 crc kubenswrapper[4760]: I0930 08:09:18.651479 4760 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0823787d33fdef8f85f4b6127c397e3624fbeadcabe5f7b9563eee35c7471ace"} err="failed to get container status \"0823787d33fdef8f85f4b6127c397e3624fbeadcabe5f7b9563eee35c7471ace\": rpc error: code = NotFound desc = could not find container \"0823787d33fdef8f85f4b6127c397e3624fbeadcabe5f7b9563eee35c7471ace\": container with ID starting with 
0823787d33fdef8f85f4b6127c397e3624fbeadcabe5f7b9563eee35c7471ace not found: ID does not exist" Sep 30 08:09:18 crc kubenswrapper[4760]: I0930 08:09:18.651508 4760 scope.go:117] "RemoveContainer" containerID="474b670cd3e3f3ba4aa97d8715ee2d8451bc901d77860a5e909cd5e4b12b42a7" Sep 30 08:09:18 crc kubenswrapper[4760]: E0930 08:09:18.651777 4760 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"474b670cd3e3f3ba4aa97d8715ee2d8451bc901d77860a5e909cd5e4b12b42a7\": container with ID starting with 474b670cd3e3f3ba4aa97d8715ee2d8451bc901d77860a5e909cd5e4b12b42a7 not found: ID does not exist" containerID="474b670cd3e3f3ba4aa97d8715ee2d8451bc901d77860a5e909cd5e4b12b42a7" Sep 30 08:09:18 crc kubenswrapper[4760]: I0930 08:09:18.651800 4760 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"474b670cd3e3f3ba4aa97d8715ee2d8451bc901d77860a5e909cd5e4b12b42a7"} err="failed to get container status \"474b670cd3e3f3ba4aa97d8715ee2d8451bc901d77860a5e909cd5e4b12b42a7\": rpc error: code = NotFound desc = could not find container \"474b670cd3e3f3ba4aa97d8715ee2d8451bc901d77860a5e909cd5e4b12b42a7\": container with ID starting with 474b670cd3e3f3ba4aa97d8715ee2d8451bc901d77860a5e909cd5e4b12b42a7 not found: ID does not exist" Sep 30 08:09:18 crc kubenswrapper[4760]: I0930 08:09:18.651814 4760 scope.go:117] "RemoveContainer" containerID="b25764bfbf9ad98fa9105f490fa661bde7bea31642e2e9e715ac2b391392aa28" Sep 30 08:09:18 crc kubenswrapper[4760]: E0930 08:09:18.651997 4760 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b25764bfbf9ad98fa9105f490fa661bde7bea31642e2e9e715ac2b391392aa28\": container with ID starting with b25764bfbf9ad98fa9105f490fa661bde7bea31642e2e9e715ac2b391392aa28 not found: ID does not exist" containerID="b25764bfbf9ad98fa9105f490fa661bde7bea31642e2e9e715ac2b391392aa28" Sep 30 08:09:18 crc 
kubenswrapper[4760]: I0930 08:09:18.652017 4760 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b25764bfbf9ad98fa9105f490fa661bde7bea31642e2e9e715ac2b391392aa28"} err="failed to get container status \"b25764bfbf9ad98fa9105f490fa661bde7bea31642e2e9e715ac2b391392aa28\": rpc error: code = NotFound desc = could not find container \"b25764bfbf9ad98fa9105f490fa661bde7bea31642e2e9e715ac2b391392aa28\": container with ID starting with b25764bfbf9ad98fa9105f490fa661bde7bea31642e2e9e715ac2b391392aa28 not found: ID does not exist" Sep 30 08:09:19 crc kubenswrapper[4760]: I0930 08:09:19.095403 4760 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="310f917f-7a30-49c7-abc4-adee818ba58e" path="/var/lib/kubelet/pods/310f917f-7a30-49c7-abc4-adee818ba58e/volumes" Sep 30 08:09:19 crc kubenswrapper[4760]: I0930 08:09:19.112908 4760 patch_prober.go:28] interesting pod/machine-config-daemon-f2lrk container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 08:09:19 crc kubenswrapper[4760]: I0930 08:09:19.113121 4760 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-f2lrk" podUID="7a9c8270-6964-4886-87d0-227b05b76da4" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 30 08:09:49 crc kubenswrapper[4760]: I0930 08:09:49.112915 4760 patch_prober.go:28] interesting pod/machine-config-daemon-f2lrk container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 08:09:49 crc kubenswrapper[4760]: I0930 08:09:49.113536 4760 
prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-f2lrk" podUID="7a9c8270-6964-4886-87d0-227b05b76da4" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 30 08:09:49 crc kubenswrapper[4760]: I0930 08:09:49.113599 4760 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-f2lrk" Sep 30 08:09:49 crc kubenswrapper[4760]: I0930 08:09:49.114647 4760 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"78abf2ac3aa0bb6cb3e98a1200a88befbc3a37ea54b03c8ea3af358b0119d01b"} pod="openshift-machine-config-operator/machine-config-daemon-f2lrk" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Sep 30 08:09:49 crc kubenswrapper[4760]: I0930 08:09:49.114730 4760 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-f2lrk" podUID="7a9c8270-6964-4886-87d0-227b05b76da4" containerName="machine-config-daemon" containerID="cri-o://78abf2ac3aa0bb6cb3e98a1200a88befbc3a37ea54b03c8ea3af358b0119d01b" gracePeriod=600 Sep 30 08:09:49 crc kubenswrapper[4760]: E0930 08:09:49.248183 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f2lrk_openshift-machine-config-operator(7a9c8270-6964-4886-87d0-227b05b76da4)\"" pod="openshift-machine-config-operator/machine-config-daemon-f2lrk" podUID="7a9c8270-6964-4886-87d0-227b05b76da4" Sep 30 08:09:49 crc kubenswrapper[4760]: I0930 08:09:49.854962 4760 generic.go:334] "Generic (PLEG): container finished" 
podID="7a9c8270-6964-4886-87d0-227b05b76da4" containerID="78abf2ac3aa0bb6cb3e98a1200a88befbc3a37ea54b03c8ea3af358b0119d01b" exitCode=0 Sep 30 08:09:49 crc kubenswrapper[4760]: I0930 08:09:49.855059 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-f2lrk" event={"ID":"7a9c8270-6964-4886-87d0-227b05b76da4","Type":"ContainerDied","Data":"78abf2ac3aa0bb6cb3e98a1200a88befbc3a37ea54b03c8ea3af358b0119d01b"} Sep 30 08:09:49 crc kubenswrapper[4760]: I0930 08:09:49.855295 4760 scope.go:117] "RemoveContainer" containerID="96fa7f3155734522f8d82f257a6244d3c4526c33face1ad5feeb1f8274d4d3e8" Sep 30 08:09:49 crc kubenswrapper[4760]: I0930 08:09:49.856397 4760 scope.go:117] "RemoveContainer" containerID="78abf2ac3aa0bb6cb3e98a1200a88befbc3a37ea54b03c8ea3af358b0119d01b" Sep 30 08:09:49 crc kubenswrapper[4760]: E0930 08:09:49.857088 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f2lrk_openshift-machine-config-operator(7a9c8270-6964-4886-87d0-227b05b76da4)\"" pod="openshift-machine-config-operator/machine-config-daemon-f2lrk" podUID="7a9c8270-6964-4886-87d0-227b05b76da4" Sep 30 08:10:00 crc kubenswrapper[4760]: I0930 08:10:00.067083 4760 scope.go:117] "RemoveContainer" containerID="78abf2ac3aa0bb6cb3e98a1200a88befbc3a37ea54b03c8ea3af358b0119d01b" Sep 30 08:10:00 crc kubenswrapper[4760]: E0930 08:10:00.068229 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f2lrk_openshift-machine-config-operator(7a9c8270-6964-4886-87d0-227b05b76da4)\"" pod="openshift-machine-config-operator/machine-config-daemon-f2lrk" podUID="7a9c8270-6964-4886-87d0-227b05b76da4" Sep 30 
08:10:14 crc kubenswrapper[4760]: I0930 08:10:14.068085 4760 scope.go:117] "RemoveContainer" containerID="78abf2ac3aa0bb6cb3e98a1200a88befbc3a37ea54b03c8ea3af358b0119d01b" Sep 30 08:10:14 crc kubenswrapper[4760]: E0930 08:10:14.069476 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f2lrk_openshift-machine-config-operator(7a9c8270-6964-4886-87d0-227b05b76da4)\"" pod="openshift-machine-config-operator/machine-config-daemon-f2lrk" podUID="7a9c8270-6964-4886-87d0-227b05b76da4" Sep 30 08:10:25 crc kubenswrapper[4760]: I0930 08:10:25.076413 4760 scope.go:117] "RemoveContainer" containerID="78abf2ac3aa0bb6cb3e98a1200a88befbc3a37ea54b03c8ea3af358b0119d01b" Sep 30 08:10:25 crc kubenswrapper[4760]: E0930 08:10:25.077244 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f2lrk_openshift-machine-config-operator(7a9c8270-6964-4886-87d0-227b05b76da4)\"" pod="openshift-machine-config-operator/machine-config-daemon-f2lrk" podUID="7a9c8270-6964-4886-87d0-227b05b76da4" Sep 30 08:10:39 crc kubenswrapper[4760]: I0930 08:10:39.067833 4760 scope.go:117] "RemoveContainer" containerID="78abf2ac3aa0bb6cb3e98a1200a88befbc3a37ea54b03c8ea3af358b0119d01b" Sep 30 08:10:39 crc kubenswrapper[4760]: E0930 08:10:39.068536 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f2lrk_openshift-machine-config-operator(7a9c8270-6964-4886-87d0-227b05b76da4)\"" pod="openshift-machine-config-operator/machine-config-daemon-f2lrk" 
podUID="7a9c8270-6964-4886-87d0-227b05b76da4" Sep 30 08:10:45 crc kubenswrapper[4760]: I0930 08:10:45.938046 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-kfrjf"] Sep 30 08:10:45 crc kubenswrapper[4760]: E0930 08:10:45.939353 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="310f917f-7a30-49c7-abc4-adee818ba58e" containerName="extract-content" Sep 30 08:10:45 crc kubenswrapper[4760]: I0930 08:10:45.939376 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="310f917f-7a30-49c7-abc4-adee818ba58e" containerName="extract-content" Sep 30 08:10:45 crc kubenswrapper[4760]: E0930 08:10:45.939421 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="310f917f-7a30-49c7-abc4-adee818ba58e" containerName="registry-server" Sep 30 08:10:45 crc kubenswrapper[4760]: I0930 08:10:45.939433 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="310f917f-7a30-49c7-abc4-adee818ba58e" containerName="registry-server" Sep 30 08:10:45 crc kubenswrapper[4760]: E0930 08:10:45.939476 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="310f917f-7a30-49c7-abc4-adee818ba58e" containerName="extract-utilities" Sep 30 08:10:45 crc kubenswrapper[4760]: I0930 08:10:45.939511 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="310f917f-7a30-49c7-abc4-adee818ba58e" containerName="extract-utilities" Sep 30 08:10:45 crc kubenswrapper[4760]: I0930 08:10:45.939891 4760 memory_manager.go:354] "RemoveStaleState removing state" podUID="310f917f-7a30-49c7-abc4-adee818ba58e" containerName="registry-server" Sep 30 08:10:45 crc kubenswrapper[4760]: I0930 08:10:45.943070 4760 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-kfrjf" Sep 30 08:10:45 crc kubenswrapper[4760]: I0930 08:10:45.961369 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-kfrjf"] Sep 30 08:10:46 crc kubenswrapper[4760]: I0930 08:10:46.001880 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jkbsd\" (UniqueName: \"kubernetes.io/projected/d8ca5c99-bef5-40ff-9aea-f98c39652c4c-kube-api-access-jkbsd\") pod \"redhat-operators-kfrjf\" (UID: \"d8ca5c99-bef5-40ff-9aea-f98c39652c4c\") " pod="openshift-marketplace/redhat-operators-kfrjf" Sep 30 08:10:46 crc kubenswrapper[4760]: I0930 08:10:46.002043 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d8ca5c99-bef5-40ff-9aea-f98c39652c4c-utilities\") pod \"redhat-operators-kfrjf\" (UID: \"d8ca5c99-bef5-40ff-9aea-f98c39652c4c\") " pod="openshift-marketplace/redhat-operators-kfrjf" Sep 30 08:10:46 crc kubenswrapper[4760]: I0930 08:10:46.002137 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d8ca5c99-bef5-40ff-9aea-f98c39652c4c-catalog-content\") pod \"redhat-operators-kfrjf\" (UID: \"d8ca5c99-bef5-40ff-9aea-f98c39652c4c\") " pod="openshift-marketplace/redhat-operators-kfrjf" Sep 30 08:10:46 crc kubenswrapper[4760]: I0930 08:10:46.104126 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d8ca5c99-bef5-40ff-9aea-f98c39652c4c-utilities\") pod \"redhat-operators-kfrjf\" (UID: \"d8ca5c99-bef5-40ff-9aea-f98c39652c4c\") " pod="openshift-marketplace/redhat-operators-kfrjf" Sep 30 08:10:46 crc kubenswrapper[4760]: I0930 08:10:46.104295 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d8ca5c99-bef5-40ff-9aea-f98c39652c4c-catalog-content\") pod \"redhat-operators-kfrjf\" (UID: \"d8ca5c99-bef5-40ff-9aea-f98c39652c4c\") " pod="openshift-marketplace/redhat-operators-kfrjf" Sep 30 08:10:46 crc kubenswrapper[4760]: I0930 08:10:46.104379 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jkbsd\" (UniqueName: \"kubernetes.io/projected/d8ca5c99-bef5-40ff-9aea-f98c39652c4c-kube-api-access-jkbsd\") pod \"redhat-operators-kfrjf\" (UID: \"d8ca5c99-bef5-40ff-9aea-f98c39652c4c\") " pod="openshift-marketplace/redhat-operators-kfrjf" Sep 30 08:10:46 crc kubenswrapper[4760]: I0930 08:10:46.105347 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d8ca5c99-bef5-40ff-9aea-f98c39652c4c-utilities\") pod \"redhat-operators-kfrjf\" (UID: \"d8ca5c99-bef5-40ff-9aea-f98c39652c4c\") " pod="openshift-marketplace/redhat-operators-kfrjf" Sep 30 08:10:46 crc kubenswrapper[4760]: I0930 08:10:46.105460 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d8ca5c99-bef5-40ff-9aea-f98c39652c4c-catalog-content\") pod \"redhat-operators-kfrjf\" (UID: \"d8ca5c99-bef5-40ff-9aea-f98c39652c4c\") " pod="openshift-marketplace/redhat-operators-kfrjf" Sep 30 08:10:46 crc kubenswrapper[4760]: I0930 08:10:46.124786 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jkbsd\" (UniqueName: \"kubernetes.io/projected/d8ca5c99-bef5-40ff-9aea-f98c39652c4c-kube-api-access-jkbsd\") pod \"redhat-operators-kfrjf\" (UID: \"d8ca5c99-bef5-40ff-9aea-f98c39652c4c\") " pod="openshift-marketplace/redhat-operators-kfrjf" Sep 30 08:10:46 crc kubenswrapper[4760]: I0930 08:10:46.282006 4760 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-kfrjf" Sep 30 08:10:46 crc kubenswrapper[4760]: I0930 08:10:46.750584 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-kfrjf"] Sep 30 08:10:47 crc kubenswrapper[4760]: I0930 08:10:47.515072 4760 generic.go:334] "Generic (PLEG): container finished" podID="d8ca5c99-bef5-40ff-9aea-f98c39652c4c" containerID="78dff69f2c45c95f34570ef4002931594c76046320cf1931bb781547bd4670e2" exitCode=0 Sep 30 08:10:47 crc kubenswrapper[4760]: I0930 08:10:47.515184 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kfrjf" event={"ID":"d8ca5c99-bef5-40ff-9aea-f98c39652c4c","Type":"ContainerDied","Data":"78dff69f2c45c95f34570ef4002931594c76046320cf1931bb781547bd4670e2"} Sep 30 08:10:47 crc kubenswrapper[4760]: I0930 08:10:47.515571 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kfrjf" event={"ID":"d8ca5c99-bef5-40ff-9aea-f98c39652c4c","Type":"ContainerStarted","Data":"997c17858c1d7e17352a37d656b00e5bd0e270517eaf3a60f2cdf61ba7e3b677"} Sep 30 08:10:48 crc kubenswrapper[4760]: I0930 08:10:48.524188 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kfrjf" event={"ID":"d8ca5c99-bef5-40ff-9aea-f98c39652c4c","Type":"ContainerStarted","Data":"a01e93001267aece2dd6c6b96730e61477b12b71f02ab77ae1f6652b1ec8ab13"} Sep 30 08:10:49 crc kubenswrapper[4760]: I0930 08:10:49.537369 4760 generic.go:334] "Generic (PLEG): container finished" podID="d8ca5c99-bef5-40ff-9aea-f98c39652c4c" containerID="a01e93001267aece2dd6c6b96730e61477b12b71f02ab77ae1f6652b1ec8ab13" exitCode=0 Sep 30 08:10:49 crc kubenswrapper[4760]: I0930 08:10:49.537413 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kfrjf" 
event={"ID":"d8ca5c99-bef5-40ff-9aea-f98c39652c4c","Type":"ContainerDied","Data":"a01e93001267aece2dd6c6b96730e61477b12b71f02ab77ae1f6652b1ec8ab13"} Sep 30 08:10:51 crc kubenswrapper[4760]: I0930 08:10:51.555374 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kfrjf" event={"ID":"d8ca5c99-bef5-40ff-9aea-f98c39652c4c","Type":"ContainerStarted","Data":"0a669549c740db62a66fe8214f77dac5a193560cbb53051a5b4af974889f7bba"} Sep 30 08:10:51 crc kubenswrapper[4760]: I0930 08:10:51.578208 4760 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-kfrjf" podStartSLOduration=3.598798391 podStartE2EDuration="6.578188855s" podCreationTimestamp="2025-09-30 08:10:45 +0000 UTC" firstStartedPulling="2025-09-30 08:10:47.51771696 +0000 UTC m=+2233.160623372" lastFinishedPulling="2025-09-30 08:10:50.497107424 +0000 UTC m=+2236.140013836" observedRunningTime="2025-09-30 08:10:51.577173599 +0000 UTC m=+2237.220080031" watchObservedRunningTime="2025-09-30 08:10:51.578188855 +0000 UTC m=+2237.221095267" Sep 30 08:10:52 crc kubenswrapper[4760]: I0930 08:10:52.067924 4760 scope.go:117] "RemoveContainer" containerID="78abf2ac3aa0bb6cb3e98a1200a88befbc3a37ea54b03c8ea3af358b0119d01b" Sep 30 08:10:52 crc kubenswrapper[4760]: E0930 08:10:52.068465 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f2lrk_openshift-machine-config-operator(7a9c8270-6964-4886-87d0-227b05b76da4)\"" pod="openshift-machine-config-operator/machine-config-daemon-f2lrk" podUID="7a9c8270-6964-4886-87d0-227b05b76da4" Sep 30 08:10:56 crc kubenswrapper[4760]: I0930 08:10:56.283603 4760 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-kfrjf" Sep 30 08:10:56 crc 
kubenswrapper[4760]: I0930 08:10:56.284206 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-kfrjf" Sep 30 08:10:56 crc kubenswrapper[4760]: I0930 08:10:56.342829 4760 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-kfrjf" Sep 30 08:10:56 crc kubenswrapper[4760]: I0930 08:10:56.647936 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-kfrjf" Sep 30 08:10:56 crc kubenswrapper[4760]: I0930 08:10:56.698166 4760 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-kfrjf"] Sep 30 08:10:58 crc kubenswrapper[4760]: I0930 08:10:58.613342 4760 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-kfrjf" podUID="d8ca5c99-bef5-40ff-9aea-f98c39652c4c" containerName="registry-server" containerID="cri-o://0a669549c740db62a66fe8214f77dac5a193560cbb53051a5b4af974889f7bba" gracePeriod=2 Sep 30 08:10:59 crc kubenswrapper[4760]: I0930 08:10:59.086219 4760 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-kfrjf" Sep 30 08:10:59 crc kubenswrapper[4760]: I0930 08:10:59.162606 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d8ca5c99-bef5-40ff-9aea-f98c39652c4c-catalog-content\") pod \"d8ca5c99-bef5-40ff-9aea-f98c39652c4c\" (UID: \"d8ca5c99-bef5-40ff-9aea-f98c39652c4c\") " Sep 30 08:10:59 crc kubenswrapper[4760]: I0930 08:10:59.162986 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d8ca5c99-bef5-40ff-9aea-f98c39652c4c-utilities\") pod \"d8ca5c99-bef5-40ff-9aea-f98c39652c4c\" (UID: \"d8ca5c99-bef5-40ff-9aea-f98c39652c4c\") " Sep 30 08:10:59 crc kubenswrapper[4760]: I0930 08:10:59.163059 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jkbsd\" (UniqueName: \"kubernetes.io/projected/d8ca5c99-bef5-40ff-9aea-f98c39652c4c-kube-api-access-jkbsd\") pod \"d8ca5c99-bef5-40ff-9aea-f98c39652c4c\" (UID: \"d8ca5c99-bef5-40ff-9aea-f98c39652c4c\") " Sep 30 08:10:59 crc kubenswrapper[4760]: I0930 08:10:59.163874 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d8ca5c99-bef5-40ff-9aea-f98c39652c4c-utilities" (OuterVolumeSpecName: "utilities") pod "d8ca5c99-bef5-40ff-9aea-f98c39652c4c" (UID: "d8ca5c99-bef5-40ff-9aea-f98c39652c4c"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 08:10:59 crc kubenswrapper[4760]: I0930 08:10:59.168697 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d8ca5c99-bef5-40ff-9aea-f98c39652c4c-kube-api-access-jkbsd" (OuterVolumeSpecName: "kube-api-access-jkbsd") pod "d8ca5c99-bef5-40ff-9aea-f98c39652c4c" (UID: "d8ca5c99-bef5-40ff-9aea-f98c39652c4c"). InnerVolumeSpecName "kube-api-access-jkbsd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 08:10:59 crc kubenswrapper[4760]: I0930 08:10:59.251208 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d8ca5c99-bef5-40ff-9aea-f98c39652c4c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "d8ca5c99-bef5-40ff-9aea-f98c39652c4c" (UID: "d8ca5c99-bef5-40ff-9aea-f98c39652c4c"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 08:10:59 crc kubenswrapper[4760]: I0930 08:10:59.265518 4760 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jkbsd\" (UniqueName: \"kubernetes.io/projected/d8ca5c99-bef5-40ff-9aea-f98c39652c4c-kube-api-access-jkbsd\") on node \"crc\" DevicePath \"\"" Sep 30 08:10:59 crc kubenswrapper[4760]: I0930 08:10:59.265555 4760 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d8ca5c99-bef5-40ff-9aea-f98c39652c4c-catalog-content\") on node \"crc\" DevicePath \"\"" Sep 30 08:10:59 crc kubenswrapper[4760]: I0930 08:10:59.265569 4760 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d8ca5c99-bef5-40ff-9aea-f98c39652c4c-utilities\") on node \"crc\" DevicePath \"\"" Sep 30 08:10:59 crc kubenswrapper[4760]: I0930 08:10:59.629715 4760 generic.go:334] "Generic (PLEG): container finished" podID="d8ca5c99-bef5-40ff-9aea-f98c39652c4c" containerID="0a669549c740db62a66fe8214f77dac5a193560cbb53051a5b4af974889f7bba" exitCode=0 Sep 30 08:10:59 crc kubenswrapper[4760]: I0930 08:10:59.629765 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kfrjf" event={"ID":"d8ca5c99-bef5-40ff-9aea-f98c39652c4c","Type":"ContainerDied","Data":"0a669549c740db62a66fe8214f77dac5a193560cbb53051a5b4af974889f7bba"} Sep 30 08:10:59 crc kubenswrapper[4760]: I0930 08:10:59.629797 4760 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-marketplace/redhat-operators-kfrjf" event={"ID":"d8ca5c99-bef5-40ff-9aea-f98c39652c4c","Type":"ContainerDied","Data":"997c17858c1d7e17352a37d656b00e5bd0e270517eaf3a60f2cdf61ba7e3b677"} Sep 30 08:10:59 crc kubenswrapper[4760]: I0930 08:10:59.629811 4760 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-kfrjf" Sep 30 08:10:59 crc kubenswrapper[4760]: I0930 08:10:59.629820 4760 scope.go:117] "RemoveContainer" containerID="0a669549c740db62a66fe8214f77dac5a193560cbb53051a5b4af974889f7bba" Sep 30 08:10:59 crc kubenswrapper[4760]: I0930 08:10:59.666152 4760 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-kfrjf"] Sep 30 08:10:59 crc kubenswrapper[4760]: I0930 08:10:59.676043 4760 scope.go:117] "RemoveContainer" containerID="a01e93001267aece2dd6c6b96730e61477b12b71f02ab77ae1f6652b1ec8ab13" Sep 30 08:10:59 crc kubenswrapper[4760]: I0930 08:10:59.680886 4760 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-kfrjf"] Sep 30 08:10:59 crc kubenswrapper[4760]: I0930 08:10:59.696245 4760 scope.go:117] "RemoveContainer" containerID="78dff69f2c45c95f34570ef4002931594c76046320cf1931bb781547bd4670e2" Sep 30 08:10:59 crc kubenswrapper[4760]: I0930 08:10:59.743354 4760 scope.go:117] "RemoveContainer" containerID="0a669549c740db62a66fe8214f77dac5a193560cbb53051a5b4af974889f7bba" Sep 30 08:10:59 crc kubenswrapper[4760]: E0930 08:10:59.745879 4760 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0a669549c740db62a66fe8214f77dac5a193560cbb53051a5b4af974889f7bba\": container with ID starting with 0a669549c740db62a66fe8214f77dac5a193560cbb53051a5b4af974889f7bba not found: ID does not exist" containerID="0a669549c740db62a66fe8214f77dac5a193560cbb53051a5b4af974889f7bba" Sep 30 08:10:59 crc kubenswrapper[4760]: I0930 08:10:59.745941 4760 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0a669549c740db62a66fe8214f77dac5a193560cbb53051a5b4af974889f7bba"} err="failed to get container status \"0a669549c740db62a66fe8214f77dac5a193560cbb53051a5b4af974889f7bba\": rpc error: code = NotFound desc = could not find container \"0a669549c740db62a66fe8214f77dac5a193560cbb53051a5b4af974889f7bba\": container with ID starting with 0a669549c740db62a66fe8214f77dac5a193560cbb53051a5b4af974889f7bba not found: ID does not exist" Sep 30 08:10:59 crc kubenswrapper[4760]: I0930 08:10:59.745970 4760 scope.go:117] "RemoveContainer" containerID="a01e93001267aece2dd6c6b96730e61477b12b71f02ab77ae1f6652b1ec8ab13" Sep 30 08:10:59 crc kubenswrapper[4760]: E0930 08:10:59.746715 4760 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a01e93001267aece2dd6c6b96730e61477b12b71f02ab77ae1f6652b1ec8ab13\": container with ID starting with a01e93001267aece2dd6c6b96730e61477b12b71f02ab77ae1f6652b1ec8ab13 not found: ID does not exist" containerID="a01e93001267aece2dd6c6b96730e61477b12b71f02ab77ae1f6652b1ec8ab13" Sep 30 08:10:59 crc kubenswrapper[4760]: I0930 08:10:59.746774 4760 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a01e93001267aece2dd6c6b96730e61477b12b71f02ab77ae1f6652b1ec8ab13"} err="failed to get container status \"a01e93001267aece2dd6c6b96730e61477b12b71f02ab77ae1f6652b1ec8ab13\": rpc error: code = NotFound desc = could not find container \"a01e93001267aece2dd6c6b96730e61477b12b71f02ab77ae1f6652b1ec8ab13\": container with ID starting with a01e93001267aece2dd6c6b96730e61477b12b71f02ab77ae1f6652b1ec8ab13 not found: ID does not exist" Sep 30 08:10:59 crc kubenswrapper[4760]: I0930 08:10:59.746811 4760 scope.go:117] "RemoveContainer" containerID="78dff69f2c45c95f34570ef4002931594c76046320cf1931bb781547bd4670e2" Sep 30 08:10:59 crc kubenswrapper[4760]: E0930 
08:10:59.747278 4760 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"78dff69f2c45c95f34570ef4002931594c76046320cf1931bb781547bd4670e2\": container with ID starting with 78dff69f2c45c95f34570ef4002931594c76046320cf1931bb781547bd4670e2 not found: ID does not exist" containerID="78dff69f2c45c95f34570ef4002931594c76046320cf1931bb781547bd4670e2" Sep 30 08:10:59 crc kubenswrapper[4760]: I0930 08:10:59.747394 4760 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"78dff69f2c45c95f34570ef4002931594c76046320cf1931bb781547bd4670e2"} err="failed to get container status \"78dff69f2c45c95f34570ef4002931594c76046320cf1931bb781547bd4670e2\": rpc error: code = NotFound desc = could not find container \"78dff69f2c45c95f34570ef4002931594c76046320cf1931bb781547bd4670e2\": container with ID starting with 78dff69f2c45c95f34570ef4002931594c76046320cf1931bb781547bd4670e2 not found: ID does not exist" Sep 30 08:11:01 crc kubenswrapper[4760]: I0930 08:11:01.079394 4760 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d8ca5c99-bef5-40ff-9aea-f98c39652c4c" path="/var/lib/kubelet/pods/d8ca5c99-bef5-40ff-9aea-f98c39652c4c/volumes" Sep 30 08:11:07 crc kubenswrapper[4760]: I0930 08:11:07.068314 4760 scope.go:117] "RemoveContainer" containerID="78abf2ac3aa0bb6cb3e98a1200a88befbc3a37ea54b03c8ea3af358b0119d01b" Sep 30 08:11:07 crc kubenswrapper[4760]: E0930 08:11:07.069367 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f2lrk_openshift-machine-config-operator(7a9c8270-6964-4886-87d0-227b05b76da4)\"" pod="openshift-machine-config-operator/machine-config-daemon-f2lrk" podUID="7a9c8270-6964-4886-87d0-227b05b76da4" Sep 30 08:11:19 crc kubenswrapper[4760]: I0930 08:11:19.069297 
4760 scope.go:117] "RemoveContainer" containerID="78abf2ac3aa0bb6cb3e98a1200a88befbc3a37ea54b03c8ea3af358b0119d01b" Sep 30 08:11:19 crc kubenswrapper[4760]: E0930 08:11:19.070613 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f2lrk_openshift-machine-config-operator(7a9c8270-6964-4886-87d0-227b05b76da4)\"" pod="openshift-machine-config-operator/machine-config-daemon-f2lrk" podUID="7a9c8270-6964-4886-87d0-227b05b76da4" Sep 30 08:11:34 crc kubenswrapper[4760]: I0930 08:11:34.067467 4760 scope.go:117] "RemoveContainer" containerID="78abf2ac3aa0bb6cb3e98a1200a88befbc3a37ea54b03c8ea3af358b0119d01b" Sep 30 08:11:34 crc kubenswrapper[4760]: E0930 08:11:34.068133 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f2lrk_openshift-machine-config-operator(7a9c8270-6964-4886-87d0-227b05b76da4)\"" pod="openshift-machine-config-operator/machine-config-daemon-f2lrk" podUID="7a9c8270-6964-4886-87d0-227b05b76da4" Sep 30 08:11:46 crc kubenswrapper[4760]: I0930 08:11:46.068031 4760 scope.go:117] "RemoveContainer" containerID="78abf2ac3aa0bb6cb3e98a1200a88befbc3a37ea54b03c8ea3af358b0119d01b" Sep 30 08:11:46 crc kubenswrapper[4760]: E0930 08:11:46.069682 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f2lrk_openshift-machine-config-operator(7a9c8270-6964-4886-87d0-227b05b76da4)\"" pod="openshift-machine-config-operator/machine-config-daemon-f2lrk" podUID="7a9c8270-6964-4886-87d0-227b05b76da4" Sep 30 08:12:01 crc kubenswrapper[4760]: I0930 
08:12:01.069002 4760 scope.go:117] "RemoveContainer" containerID="78abf2ac3aa0bb6cb3e98a1200a88befbc3a37ea54b03c8ea3af358b0119d01b" Sep 30 08:12:01 crc kubenswrapper[4760]: E0930 08:12:01.070232 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f2lrk_openshift-machine-config-operator(7a9c8270-6964-4886-87d0-227b05b76da4)\"" pod="openshift-machine-config-operator/machine-config-daemon-f2lrk" podUID="7a9c8270-6964-4886-87d0-227b05b76da4" Sep 30 08:12:08 crc kubenswrapper[4760]: I0930 08:12:08.392290 4760 generic.go:334] "Generic (PLEG): container finished" podID="d904db1f-5f11-47d3-8823-ff59f4bed296" containerID="df7be1b2580f215d4f56ad3717d815b44e24470cfe6d52b71587714e4fdc96c0" exitCode=0 Sep 30 08:12:08 crc kubenswrapper[4760]: I0930 08:12:08.392362 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-s99vh" event={"ID":"d904db1f-5f11-47d3-8823-ff59f4bed296","Type":"ContainerDied","Data":"df7be1b2580f215d4f56ad3717d815b44e24470cfe6d52b71587714e4fdc96c0"} Sep 30 08:12:09 crc kubenswrapper[4760]: I0930 08:12:09.867237 4760 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-s99vh" Sep 30 08:12:10 crc kubenswrapper[4760]: I0930 08:12:10.012285 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d904db1f-5f11-47d3-8823-ff59f4bed296-inventory\") pod \"d904db1f-5f11-47d3-8823-ff59f4bed296\" (UID: \"d904db1f-5f11-47d3-8823-ff59f4bed296\") " Sep 30 08:12:10 crc kubenswrapper[4760]: I0930 08:12:10.012724 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/d904db1f-5f11-47d3-8823-ff59f4bed296-libvirt-secret-0\") pod \"d904db1f-5f11-47d3-8823-ff59f4bed296\" (UID: \"d904db1f-5f11-47d3-8823-ff59f4bed296\") " Sep 30 08:12:10 crc kubenswrapper[4760]: I0930 08:12:10.012762 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d904db1f-5f11-47d3-8823-ff59f4bed296-ssh-key\") pod \"d904db1f-5f11-47d3-8823-ff59f4bed296\" (UID: \"d904db1f-5f11-47d3-8823-ff59f4bed296\") " Sep 30 08:12:10 crc kubenswrapper[4760]: I0930 08:12:10.012935 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d904db1f-5f11-47d3-8823-ff59f4bed296-libvirt-combined-ca-bundle\") pod \"d904db1f-5f11-47d3-8823-ff59f4bed296\" (UID: \"d904db1f-5f11-47d3-8823-ff59f4bed296\") " Sep 30 08:12:10 crc kubenswrapper[4760]: I0930 08:12:10.014092 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ftx4r\" (UniqueName: \"kubernetes.io/projected/d904db1f-5f11-47d3-8823-ff59f4bed296-kube-api-access-ftx4r\") pod \"d904db1f-5f11-47d3-8823-ff59f4bed296\" (UID: \"d904db1f-5f11-47d3-8823-ff59f4bed296\") " Sep 30 08:12:10 crc kubenswrapper[4760]: I0930 08:12:10.025667 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for 
volume "kubernetes.io/projected/d904db1f-5f11-47d3-8823-ff59f4bed296-kube-api-access-ftx4r" (OuterVolumeSpecName: "kube-api-access-ftx4r") pod "d904db1f-5f11-47d3-8823-ff59f4bed296" (UID: "d904db1f-5f11-47d3-8823-ff59f4bed296"). InnerVolumeSpecName "kube-api-access-ftx4r". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 08:12:10 crc kubenswrapper[4760]: I0930 08:12:10.035433 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d904db1f-5f11-47d3-8823-ff59f4bed296-libvirt-combined-ca-bundle" (OuterVolumeSpecName: "libvirt-combined-ca-bundle") pod "d904db1f-5f11-47d3-8823-ff59f4bed296" (UID: "d904db1f-5f11-47d3-8823-ff59f4bed296"). InnerVolumeSpecName "libvirt-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 08:12:10 crc kubenswrapper[4760]: I0930 08:12:10.055502 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d904db1f-5f11-47d3-8823-ff59f4bed296-inventory" (OuterVolumeSpecName: "inventory") pod "d904db1f-5f11-47d3-8823-ff59f4bed296" (UID: "d904db1f-5f11-47d3-8823-ff59f4bed296"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 08:12:10 crc kubenswrapper[4760]: I0930 08:12:10.062499 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d904db1f-5f11-47d3-8823-ff59f4bed296-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "d904db1f-5f11-47d3-8823-ff59f4bed296" (UID: "d904db1f-5f11-47d3-8823-ff59f4bed296"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 08:12:10 crc kubenswrapper[4760]: I0930 08:12:10.065174 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d904db1f-5f11-47d3-8823-ff59f4bed296-libvirt-secret-0" (OuterVolumeSpecName: "libvirt-secret-0") pod "d904db1f-5f11-47d3-8823-ff59f4bed296" (UID: "d904db1f-5f11-47d3-8823-ff59f4bed296"). 
InnerVolumeSpecName "libvirt-secret-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 08:12:10 crc kubenswrapper[4760]: I0930 08:12:10.116826 4760 reconciler_common.go:293] "Volume detached for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d904db1f-5f11-47d3-8823-ff59f4bed296-libvirt-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 08:12:10 crc kubenswrapper[4760]: I0930 08:12:10.116865 4760 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ftx4r\" (UniqueName: \"kubernetes.io/projected/d904db1f-5f11-47d3-8823-ff59f4bed296-kube-api-access-ftx4r\") on node \"crc\" DevicePath \"\"" Sep 30 08:12:10 crc kubenswrapper[4760]: I0930 08:12:10.116878 4760 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d904db1f-5f11-47d3-8823-ff59f4bed296-inventory\") on node \"crc\" DevicePath \"\"" Sep 30 08:12:10 crc kubenswrapper[4760]: I0930 08:12:10.116891 4760 reconciler_common.go:293] "Volume detached for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/d904db1f-5f11-47d3-8823-ff59f4bed296-libvirt-secret-0\") on node \"crc\" DevicePath \"\"" Sep 30 08:12:10 crc kubenswrapper[4760]: I0930 08:12:10.116902 4760 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d904db1f-5f11-47d3-8823-ff59f4bed296-ssh-key\") on node \"crc\" DevicePath \"\"" Sep 30 08:12:10 crc kubenswrapper[4760]: I0930 08:12:10.417911 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-s99vh" event={"ID":"d904db1f-5f11-47d3-8823-ff59f4bed296","Type":"ContainerDied","Data":"adde9fb37b3f74d8009d44118f6cebf8762dd5fb7a1400de007297bc0e60ed6c"} Sep 30 08:12:10 crc kubenswrapper[4760]: I0930 08:12:10.417982 4760 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-s99vh" Sep 30 08:12:10 crc kubenswrapper[4760]: I0930 08:12:10.417989 4760 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="adde9fb37b3f74d8009d44118f6cebf8762dd5fb7a1400de007297bc0e60ed6c" Sep 30 08:12:10 crc kubenswrapper[4760]: I0930 08:12:10.543586 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-edpm-deployment-openstack-edpm-ipam-5ncgd"] Sep 30 08:12:10 crc kubenswrapper[4760]: E0930 08:12:10.544326 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d8ca5c99-bef5-40ff-9aea-f98c39652c4c" containerName="extract-content" Sep 30 08:12:10 crc kubenswrapper[4760]: I0930 08:12:10.544356 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="d8ca5c99-bef5-40ff-9aea-f98c39652c4c" containerName="extract-content" Sep 30 08:12:10 crc kubenswrapper[4760]: E0930 08:12:10.544391 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d8ca5c99-bef5-40ff-9aea-f98c39652c4c" containerName="registry-server" Sep 30 08:12:10 crc kubenswrapper[4760]: I0930 08:12:10.544405 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="d8ca5c99-bef5-40ff-9aea-f98c39652c4c" containerName="registry-server" Sep 30 08:12:10 crc kubenswrapper[4760]: E0930 08:12:10.544443 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d904db1f-5f11-47d3-8823-ff59f4bed296" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Sep 30 08:12:10 crc kubenswrapper[4760]: I0930 08:12:10.544458 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="d904db1f-5f11-47d3-8823-ff59f4bed296" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Sep 30 08:12:10 crc kubenswrapper[4760]: E0930 08:12:10.544484 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d8ca5c99-bef5-40ff-9aea-f98c39652c4c" containerName="extract-utilities" Sep 30 08:12:10 crc kubenswrapper[4760]: I0930 
08:12:10.544497 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="d8ca5c99-bef5-40ff-9aea-f98c39652c4c" containerName="extract-utilities" Sep 30 08:12:10 crc kubenswrapper[4760]: I0930 08:12:10.544839 4760 memory_manager.go:354] "RemoveStaleState removing state" podUID="d904db1f-5f11-47d3-8823-ff59f4bed296" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Sep 30 08:12:10 crc kubenswrapper[4760]: I0930 08:12:10.544888 4760 memory_manager.go:354] "RemoveStaleState removing state" podUID="d8ca5c99-bef5-40ff-9aea-f98c39652c4c" containerName="registry-server" Sep 30 08:12:10 crc kubenswrapper[4760]: I0930 08:12:10.546049 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-5ncgd" Sep 30 08:12:10 crc kubenswrapper[4760]: I0930 08:12:10.548223 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-compute-config" Sep 30 08:12:10 crc kubenswrapper[4760]: I0930 08:12:10.548462 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-8gxrf" Sep 30 08:12:10 crc kubenswrapper[4760]: I0930 08:12:10.549492 4760 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"nova-extra-config" Sep 30 08:12:10 crc kubenswrapper[4760]: I0930 08:12:10.549706 4760 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Sep 30 08:12:10 crc kubenswrapper[4760]: I0930 08:12:10.550389 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-migration-ssh-key" Sep 30 08:12:10 crc kubenswrapper[4760]: I0930 08:12:10.550702 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Sep 30 08:12:10 crc kubenswrapper[4760]: I0930 08:12:10.550778 4760 reflector.go:368] Caches populated for *v1.Secret from 
object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Sep 30 08:12:10 crc kubenswrapper[4760]: I0930 08:12:10.589447 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-edpm-deployment-openstack-edpm-ipam-5ncgd"] Sep 30 08:12:10 crc kubenswrapper[4760]: I0930 08:12:10.730022 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/33cc4d6c-b086-410c-b38e-f6c918657a74-nova-extra-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-5ncgd\" (UID: \"33cc4d6c-b086-410c-b38e-f6c918657a74\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-5ncgd" Sep 30 08:12:10 crc kubenswrapper[4760]: I0930 08:12:10.730389 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/33cc4d6c-b086-410c-b38e-f6c918657a74-nova-migration-ssh-key-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-5ncgd\" (UID: \"33cc4d6c-b086-410c-b38e-f6c918657a74\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-5ncgd" Sep 30 08:12:10 crc kubenswrapper[4760]: I0930 08:12:10.730500 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/33cc4d6c-b086-410c-b38e-f6c918657a74-nova-cell1-compute-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-5ncgd\" (UID: \"33cc4d6c-b086-410c-b38e-f6c918657a74\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-5ncgd" Sep 30 08:12:10 crc kubenswrapper[4760]: I0930 08:12:10.730624 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/33cc4d6c-b086-410c-b38e-f6c918657a74-ssh-key\") pod \"nova-edpm-deployment-openstack-edpm-ipam-5ncgd\" (UID: \"33cc4d6c-b086-410c-b38e-f6c918657a74\") " 
pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-5ncgd" Sep 30 08:12:10 crc kubenswrapper[4760]: I0930 08:12:10.730733 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/33cc4d6c-b086-410c-b38e-f6c918657a74-nova-migration-ssh-key-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-5ncgd\" (UID: \"33cc4d6c-b086-410c-b38e-f6c918657a74\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-5ncgd" Sep 30 08:12:10 crc kubenswrapper[4760]: I0930 08:12:10.730809 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/33cc4d6c-b086-410c-b38e-f6c918657a74-nova-combined-ca-bundle\") pod \"nova-edpm-deployment-openstack-edpm-ipam-5ncgd\" (UID: \"33cc4d6c-b086-410c-b38e-f6c918657a74\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-5ncgd" Sep 30 08:12:10 crc kubenswrapper[4760]: I0930 08:12:10.730930 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5cts9\" (UniqueName: \"kubernetes.io/projected/33cc4d6c-b086-410c-b38e-f6c918657a74-kube-api-access-5cts9\") pod \"nova-edpm-deployment-openstack-edpm-ipam-5ncgd\" (UID: \"33cc4d6c-b086-410c-b38e-f6c918657a74\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-5ncgd" Sep 30 08:12:10 crc kubenswrapper[4760]: I0930 08:12:10.731023 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/33cc4d6c-b086-410c-b38e-f6c918657a74-nova-cell1-compute-config-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-5ncgd\" (UID: \"33cc4d6c-b086-410c-b38e-f6c918657a74\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-5ncgd" Sep 30 08:12:10 crc kubenswrapper[4760]: I0930 08:12:10.731111 4760 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/33cc4d6c-b086-410c-b38e-f6c918657a74-inventory\") pod \"nova-edpm-deployment-openstack-edpm-ipam-5ncgd\" (UID: \"33cc4d6c-b086-410c-b38e-f6c918657a74\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-5ncgd" Sep 30 08:12:10 crc kubenswrapper[4760]: I0930 08:12:10.833364 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/33cc4d6c-b086-410c-b38e-f6c918657a74-nova-extra-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-5ncgd\" (UID: \"33cc4d6c-b086-410c-b38e-f6c918657a74\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-5ncgd" Sep 30 08:12:10 crc kubenswrapper[4760]: I0930 08:12:10.833473 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/33cc4d6c-b086-410c-b38e-f6c918657a74-nova-migration-ssh-key-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-5ncgd\" (UID: \"33cc4d6c-b086-410c-b38e-f6c918657a74\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-5ncgd" Sep 30 08:12:10 crc kubenswrapper[4760]: I0930 08:12:10.833538 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/33cc4d6c-b086-410c-b38e-f6c918657a74-nova-cell1-compute-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-5ncgd\" (UID: \"33cc4d6c-b086-410c-b38e-f6c918657a74\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-5ncgd" Sep 30 08:12:10 crc kubenswrapper[4760]: I0930 08:12:10.833663 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/33cc4d6c-b086-410c-b38e-f6c918657a74-ssh-key\") pod \"nova-edpm-deployment-openstack-edpm-ipam-5ncgd\" (UID: 
\"33cc4d6c-b086-410c-b38e-f6c918657a74\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-5ncgd" Sep 30 08:12:10 crc kubenswrapper[4760]: I0930 08:12:10.833787 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/33cc4d6c-b086-410c-b38e-f6c918657a74-nova-migration-ssh-key-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-5ncgd\" (UID: \"33cc4d6c-b086-410c-b38e-f6c918657a74\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-5ncgd" Sep 30 08:12:10 crc kubenswrapper[4760]: I0930 08:12:10.833849 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/33cc4d6c-b086-410c-b38e-f6c918657a74-nova-combined-ca-bundle\") pod \"nova-edpm-deployment-openstack-edpm-ipam-5ncgd\" (UID: \"33cc4d6c-b086-410c-b38e-f6c918657a74\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-5ncgd" Sep 30 08:12:10 crc kubenswrapper[4760]: I0930 08:12:10.834009 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5cts9\" (UniqueName: \"kubernetes.io/projected/33cc4d6c-b086-410c-b38e-f6c918657a74-kube-api-access-5cts9\") pod \"nova-edpm-deployment-openstack-edpm-ipam-5ncgd\" (UID: \"33cc4d6c-b086-410c-b38e-f6c918657a74\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-5ncgd" Sep 30 08:12:10 crc kubenswrapper[4760]: I0930 08:12:10.834100 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/33cc4d6c-b086-410c-b38e-f6c918657a74-nova-cell1-compute-config-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-5ncgd\" (UID: \"33cc4d6c-b086-410c-b38e-f6c918657a74\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-5ncgd" Sep 30 08:12:10 crc kubenswrapper[4760]: I0930 08:12:10.834166 4760 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/33cc4d6c-b086-410c-b38e-f6c918657a74-inventory\") pod \"nova-edpm-deployment-openstack-edpm-ipam-5ncgd\" (UID: \"33cc4d6c-b086-410c-b38e-f6c918657a74\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-5ncgd" Sep 30 08:12:10 crc kubenswrapper[4760]: I0930 08:12:10.835287 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/33cc4d6c-b086-410c-b38e-f6c918657a74-nova-extra-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-5ncgd\" (UID: \"33cc4d6c-b086-410c-b38e-f6c918657a74\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-5ncgd" Sep 30 08:12:10 crc kubenswrapper[4760]: I0930 08:12:10.842760 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/33cc4d6c-b086-410c-b38e-f6c918657a74-nova-cell1-compute-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-5ncgd\" (UID: \"33cc4d6c-b086-410c-b38e-f6c918657a74\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-5ncgd" Sep 30 08:12:10 crc kubenswrapper[4760]: I0930 08:12:10.842956 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/33cc4d6c-b086-410c-b38e-f6c918657a74-nova-migration-ssh-key-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-5ncgd\" (UID: \"33cc4d6c-b086-410c-b38e-f6c918657a74\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-5ncgd" Sep 30 08:12:10 crc kubenswrapper[4760]: I0930 08:12:10.843497 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/33cc4d6c-b086-410c-b38e-f6c918657a74-ssh-key\") pod \"nova-edpm-deployment-openstack-edpm-ipam-5ncgd\" (UID: \"33cc4d6c-b086-410c-b38e-f6c918657a74\") " 
pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-5ncgd" Sep 30 08:12:10 crc kubenswrapper[4760]: I0930 08:12:10.848492 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/33cc4d6c-b086-410c-b38e-f6c918657a74-nova-migration-ssh-key-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-5ncgd\" (UID: \"33cc4d6c-b086-410c-b38e-f6c918657a74\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-5ncgd" Sep 30 08:12:10 crc kubenswrapper[4760]: I0930 08:12:10.849214 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/33cc4d6c-b086-410c-b38e-f6c918657a74-nova-cell1-compute-config-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-5ncgd\" (UID: \"33cc4d6c-b086-410c-b38e-f6c918657a74\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-5ncgd" Sep 30 08:12:10 crc kubenswrapper[4760]: I0930 08:12:10.849797 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/33cc4d6c-b086-410c-b38e-f6c918657a74-inventory\") pod \"nova-edpm-deployment-openstack-edpm-ipam-5ncgd\" (UID: \"33cc4d6c-b086-410c-b38e-f6c918657a74\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-5ncgd" Sep 30 08:12:10 crc kubenswrapper[4760]: I0930 08:12:10.850241 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/33cc4d6c-b086-410c-b38e-f6c918657a74-nova-combined-ca-bundle\") pod \"nova-edpm-deployment-openstack-edpm-ipam-5ncgd\" (UID: \"33cc4d6c-b086-410c-b38e-f6c918657a74\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-5ncgd" Sep 30 08:12:10 crc kubenswrapper[4760]: I0930 08:12:10.860856 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5cts9\" (UniqueName: 
\"kubernetes.io/projected/33cc4d6c-b086-410c-b38e-f6c918657a74-kube-api-access-5cts9\") pod \"nova-edpm-deployment-openstack-edpm-ipam-5ncgd\" (UID: \"33cc4d6c-b086-410c-b38e-f6c918657a74\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-5ncgd" Sep 30 08:12:10 crc kubenswrapper[4760]: I0930 08:12:10.872491 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-5ncgd" Sep 30 08:12:11 crc kubenswrapper[4760]: I0930 08:12:11.509771 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-edpm-deployment-openstack-edpm-ipam-5ncgd"] Sep 30 08:12:12 crc kubenswrapper[4760]: I0930 08:12:12.440785 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-5ncgd" event={"ID":"33cc4d6c-b086-410c-b38e-f6c918657a74","Type":"ContainerStarted","Data":"c9bf51b5a8f437438e4c9920e49205135a53165a7dfdd2343f200c36ba338b7c"} Sep 30 08:12:12 crc kubenswrapper[4760]: I0930 08:12:12.441154 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-5ncgd" event={"ID":"33cc4d6c-b086-410c-b38e-f6c918657a74","Type":"ContainerStarted","Data":"b3293324ff185099a460f9a45019748f079796dec06393b8682a5c6b935c870e"} Sep 30 08:12:12 crc kubenswrapper[4760]: I0930 08:12:12.475253 4760 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-5ncgd" podStartSLOduration=1.981742857 podStartE2EDuration="2.475227544s" podCreationTimestamp="2025-09-30 08:12:10 +0000 UTC" firstStartedPulling="2025-09-30 08:12:11.517668832 +0000 UTC m=+2317.160575234" lastFinishedPulling="2025-09-30 08:12:12.011153509 +0000 UTC m=+2317.654059921" observedRunningTime="2025-09-30 08:12:12.462434317 +0000 UTC m=+2318.105340759" watchObservedRunningTime="2025-09-30 08:12:12.475227544 +0000 UTC m=+2318.118133986" Sep 30 08:12:13 crc kubenswrapper[4760]: I0930 
08:12:13.067416 4760 scope.go:117] "RemoveContainer" containerID="78abf2ac3aa0bb6cb3e98a1200a88befbc3a37ea54b03c8ea3af358b0119d01b" Sep 30 08:12:13 crc kubenswrapper[4760]: E0930 08:12:13.067990 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f2lrk_openshift-machine-config-operator(7a9c8270-6964-4886-87d0-227b05b76da4)\"" pod="openshift-machine-config-operator/machine-config-daemon-f2lrk" podUID="7a9c8270-6964-4886-87d0-227b05b76da4" Sep 30 08:12:27 crc kubenswrapper[4760]: I0930 08:12:27.068746 4760 scope.go:117] "RemoveContainer" containerID="78abf2ac3aa0bb6cb3e98a1200a88befbc3a37ea54b03c8ea3af358b0119d01b" Sep 30 08:12:27 crc kubenswrapper[4760]: E0930 08:12:27.070765 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f2lrk_openshift-machine-config-operator(7a9c8270-6964-4886-87d0-227b05b76da4)\"" pod="openshift-machine-config-operator/machine-config-daemon-f2lrk" podUID="7a9c8270-6964-4886-87d0-227b05b76da4" Sep 30 08:12:39 crc kubenswrapper[4760]: I0930 08:12:39.067356 4760 scope.go:117] "RemoveContainer" containerID="78abf2ac3aa0bb6cb3e98a1200a88befbc3a37ea54b03c8ea3af358b0119d01b" Sep 30 08:12:39 crc kubenswrapper[4760]: E0930 08:12:39.068110 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f2lrk_openshift-machine-config-operator(7a9c8270-6964-4886-87d0-227b05b76da4)\"" pod="openshift-machine-config-operator/machine-config-daemon-f2lrk" podUID="7a9c8270-6964-4886-87d0-227b05b76da4" Sep 30 08:12:50 crc 
kubenswrapper[4760]: I0930 08:12:50.068886 4760 scope.go:117] "RemoveContainer" containerID="78abf2ac3aa0bb6cb3e98a1200a88befbc3a37ea54b03c8ea3af358b0119d01b" Sep 30 08:12:50 crc kubenswrapper[4760]: E0930 08:12:50.069795 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f2lrk_openshift-machine-config-operator(7a9c8270-6964-4886-87d0-227b05b76da4)\"" pod="openshift-machine-config-operator/machine-config-daemon-f2lrk" podUID="7a9c8270-6964-4886-87d0-227b05b76da4" Sep 30 08:12:59 crc kubenswrapper[4760]: I0930 08:12:59.212710 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-2kgrs"] Sep 30 08:12:59 crc kubenswrapper[4760]: I0930 08:12:59.215181 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-2kgrs" Sep 30 08:12:59 crc kubenswrapper[4760]: I0930 08:12:59.226402 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-2kgrs"] Sep 30 08:12:59 crc kubenswrapper[4760]: I0930 08:12:59.369906 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/05166598-7d6f-485c-8700-7a84a5261e1b-utilities\") pod \"certified-operators-2kgrs\" (UID: \"05166598-7d6f-485c-8700-7a84a5261e1b\") " pod="openshift-marketplace/certified-operators-2kgrs" Sep 30 08:12:59 crc kubenswrapper[4760]: I0930 08:12:59.370035 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/05166598-7d6f-485c-8700-7a84a5261e1b-catalog-content\") pod \"certified-operators-2kgrs\" (UID: \"05166598-7d6f-485c-8700-7a84a5261e1b\") " 
pod="openshift-marketplace/certified-operators-2kgrs" Sep 30 08:12:59 crc kubenswrapper[4760]: I0930 08:12:59.370116 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jtb4k\" (UniqueName: \"kubernetes.io/projected/05166598-7d6f-485c-8700-7a84a5261e1b-kube-api-access-jtb4k\") pod \"certified-operators-2kgrs\" (UID: \"05166598-7d6f-485c-8700-7a84a5261e1b\") " pod="openshift-marketplace/certified-operators-2kgrs" Sep 30 08:12:59 crc kubenswrapper[4760]: I0930 08:12:59.410665 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-4phwd"] Sep 30 08:12:59 crc kubenswrapper[4760]: I0930 08:12:59.412649 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-4phwd" Sep 30 08:12:59 crc kubenswrapper[4760]: I0930 08:12:59.424392 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-4phwd"] Sep 30 08:12:59 crc kubenswrapper[4760]: I0930 08:12:59.471886 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jtb4k\" (UniqueName: \"kubernetes.io/projected/05166598-7d6f-485c-8700-7a84a5261e1b-kube-api-access-jtb4k\") pod \"certified-operators-2kgrs\" (UID: \"05166598-7d6f-485c-8700-7a84a5261e1b\") " pod="openshift-marketplace/certified-operators-2kgrs" Sep 30 08:12:59 crc kubenswrapper[4760]: I0930 08:12:59.471998 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/05166598-7d6f-485c-8700-7a84a5261e1b-utilities\") pod \"certified-operators-2kgrs\" (UID: \"05166598-7d6f-485c-8700-7a84a5261e1b\") " pod="openshift-marketplace/certified-operators-2kgrs" Sep 30 08:12:59 crc kubenswrapper[4760]: I0930 08:12:59.472076 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/05166598-7d6f-485c-8700-7a84a5261e1b-catalog-content\") pod \"certified-operators-2kgrs\" (UID: \"05166598-7d6f-485c-8700-7a84a5261e1b\") " pod="openshift-marketplace/certified-operators-2kgrs" Sep 30 08:12:59 crc kubenswrapper[4760]: I0930 08:12:59.472599 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/05166598-7d6f-485c-8700-7a84a5261e1b-utilities\") pod \"certified-operators-2kgrs\" (UID: \"05166598-7d6f-485c-8700-7a84a5261e1b\") " pod="openshift-marketplace/certified-operators-2kgrs" Sep 30 08:12:59 crc kubenswrapper[4760]: I0930 08:12:59.472657 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/05166598-7d6f-485c-8700-7a84a5261e1b-catalog-content\") pod \"certified-operators-2kgrs\" (UID: \"05166598-7d6f-485c-8700-7a84a5261e1b\") " pod="openshift-marketplace/certified-operators-2kgrs" Sep 30 08:12:59 crc kubenswrapper[4760]: I0930 08:12:59.503012 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jtb4k\" (UniqueName: \"kubernetes.io/projected/05166598-7d6f-485c-8700-7a84a5261e1b-kube-api-access-jtb4k\") pod \"certified-operators-2kgrs\" (UID: \"05166598-7d6f-485c-8700-7a84a5261e1b\") " pod="openshift-marketplace/certified-operators-2kgrs" Sep 30 08:12:59 crc kubenswrapper[4760]: I0930 08:12:59.535946 4760 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-2kgrs" Sep 30 08:12:59 crc kubenswrapper[4760]: I0930 08:12:59.573461 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cfc1fa61-9e5f-4fdc-8e83-cebb7344919b-catalog-content\") pod \"community-operators-4phwd\" (UID: \"cfc1fa61-9e5f-4fdc-8e83-cebb7344919b\") " pod="openshift-marketplace/community-operators-4phwd" Sep 30 08:12:59 crc kubenswrapper[4760]: I0930 08:12:59.573845 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q67ks\" (UniqueName: \"kubernetes.io/projected/cfc1fa61-9e5f-4fdc-8e83-cebb7344919b-kube-api-access-q67ks\") pod \"community-operators-4phwd\" (UID: \"cfc1fa61-9e5f-4fdc-8e83-cebb7344919b\") " pod="openshift-marketplace/community-operators-4phwd" Sep 30 08:12:59 crc kubenswrapper[4760]: I0930 08:12:59.573923 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cfc1fa61-9e5f-4fdc-8e83-cebb7344919b-utilities\") pod \"community-operators-4phwd\" (UID: \"cfc1fa61-9e5f-4fdc-8e83-cebb7344919b\") " pod="openshift-marketplace/community-operators-4phwd" Sep 30 08:12:59 crc kubenswrapper[4760]: I0930 08:12:59.676644 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q67ks\" (UniqueName: \"kubernetes.io/projected/cfc1fa61-9e5f-4fdc-8e83-cebb7344919b-kube-api-access-q67ks\") pod \"community-operators-4phwd\" (UID: \"cfc1fa61-9e5f-4fdc-8e83-cebb7344919b\") " pod="openshift-marketplace/community-operators-4phwd" Sep 30 08:12:59 crc kubenswrapper[4760]: I0930 08:12:59.676753 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cfc1fa61-9e5f-4fdc-8e83-cebb7344919b-utilities\") pod 
\"community-operators-4phwd\" (UID: \"cfc1fa61-9e5f-4fdc-8e83-cebb7344919b\") " pod="openshift-marketplace/community-operators-4phwd" Sep 30 08:12:59 crc kubenswrapper[4760]: I0930 08:12:59.676836 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cfc1fa61-9e5f-4fdc-8e83-cebb7344919b-catalog-content\") pod \"community-operators-4phwd\" (UID: \"cfc1fa61-9e5f-4fdc-8e83-cebb7344919b\") " pod="openshift-marketplace/community-operators-4phwd" Sep 30 08:12:59 crc kubenswrapper[4760]: I0930 08:12:59.677600 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cfc1fa61-9e5f-4fdc-8e83-cebb7344919b-catalog-content\") pod \"community-operators-4phwd\" (UID: \"cfc1fa61-9e5f-4fdc-8e83-cebb7344919b\") " pod="openshift-marketplace/community-operators-4phwd" Sep 30 08:12:59 crc kubenswrapper[4760]: I0930 08:12:59.677683 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cfc1fa61-9e5f-4fdc-8e83-cebb7344919b-utilities\") pod \"community-operators-4phwd\" (UID: \"cfc1fa61-9e5f-4fdc-8e83-cebb7344919b\") " pod="openshift-marketplace/community-operators-4phwd" Sep 30 08:12:59 crc kubenswrapper[4760]: I0930 08:12:59.705145 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q67ks\" (UniqueName: \"kubernetes.io/projected/cfc1fa61-9e5f-4fdc-8e83-cebb7344919b-kube-api-access-q67ks\") pod \"community-operators-4phwd\" (UID: \"cfc1fa61-9e5f-4fdc-8e83-cebb7344919b\") " pod="openshift-marketplace/community-operators-4phwd" Sep 30 08:12:59 crc kubenswrapper[4760]: I0930 08:12:59.734215 4760 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-4phwd" Sep 30 08:13:00 crc kubenswrapper[4760]: I0930 08:13:00.071101 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-2kgrs"] Sep 30 08:13:00 crc kubenswrapper[4760]: I0930 08:13:00.272070 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-4phwd"] Sep 30 08:13:00 crc kubenswrapper[4760]: I0930 08:13:00.985154 4760 generic.go:334] "Generic (PLEG): container finished" podID="cfc1fa61-9e5f-4fdc-8e83-cebb7344919b" containerID="0570bf5a1f61fb546f688391a613937ab48da49b5432050de9a40e4b5a3f888e" exitCode=0 Sep 30 08:13:00 crc kubenswrapper[4760]: I0930 08:13:00.985230 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4phwd" event={"ID":"cfc1fa61-9e5f-4fdc-8e83-cebb7344919b","Type":"ContainerDied","Data":"0570bf5a1f61fb546f688391a613937ab48da49b5432050de9a40e4b5a3f888e"} Sep 30 08:13:00 crc kubenswrapper[4760]: I0930 08:13:00.985662 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4phwd" event={"ID":"cfc1fa61-9e5f-4fdc-8e83-cebb7344919b","Type":"ContainerStarted","Data":"2b54f38e7ec07e855cda15b5869f07c572643a508b7346ae5181318a92e52ed1"} Sep 30 08:13:00 crc kubenswrapper[4760]: I0930 08:13:00.987663 4760 generic.go:334] "Generic (PLEG): container finished" podID="05166598-7d6f-485c-8700-7a84a5261e1b" containerID="fed1a350592b51a7b4484ab857bc028c5b88dc62d9f32ce12bd1361c6a4bfc87" exitCode=0 Sep 30 08:13:00 crc kubenswrapper[4760]: I0930 08:13:00.987706 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2kgrs" event={"ID":"05166598-7d6f-485c-8700-7a84a5261e1b","Type":"ContainerDied","Data":"fed1a350592b51a7b4484ab857bc028c5b88dc62d9f32ce12bd1361c6a4bfc87"} Sep 30 08:13:00 crc kubenswrapper[4760]: I0930 08:13:00.987743 4760 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-marketplace/certified-operators-2kgrs" event={"ID":"05166598-7d6f-485c-8700-7a84a5261e1b","Type":"ContainerStarted","Data":"db8862c806b41d442670329ba70c8b6a0f85d2e7bdcf98f86cc8075db24014a5"} Sep 30 08:13:01 crc kubenswrapper[4760]: I0930 08:13:01.998576 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4phwd" event={"ID":"cfc1fa61-9e5f-4fdc-8e83-cebb7344919b","Type":"ContainerStarted","Data":"8e9fbdf3546b80c496acad0592dfb715547b468fbdb8902d15dd623a07cd537e"} Sep 30 08:13:02 crc kubenswrapper[4760]: I0930 08:13:02.001011 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2kgrs" event={"ID":"05166598-7d6f-485c-8700-7a84a5261e1b","Type":"ContainerStarted","Data":"84a991fb4c854c061295d6f14bf7e1a4aff85caa05d013cdd068ec903350716a"} Sep 30 08:13:03 crc kubenswrapper[4760]: I0930 08:13:03.014442 4760 generic.go:334] "Generic (PLEG): container finished" podID="cfc1fa61-9e5f-4fdc-8e83-cebb7344919b" containerID="8e9fbdf3546b80c496acad0592dfb715547b468fbdb8902d15dd623a07cd537e" exitCode=0 Sep 30 08:13:03 crc kubenswrapper[4760]: I0930 08:13:03.014608 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4phwd" event={"ID":"cfc1fa61-9e5f-4fdc-8e83-cebb7344919b","Type":"ContainerDied","Data":"8e9fbdf3546b80c496acad0592dfb715547b468fbdb8902d15dd623a07cd537e"} Sep 30 08:13:03 crc kubenswrapper[4760]: I0930 08:13:03.019007 4760 generic.go:334] "Generic (PLEG): container finished" podID="05166598-7d6f-485c-8700-7a84a5261e1b" containerID="84a991fb4c854c061295d6f14bf7e1a4aff85caa05d013cdd068ec903350716a" exitCode=0 Sep 30 08:13:03 crc kubenswrapper[4760]: I0930 08:13:03.019069 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2kgrs" 
event={"ID":"05166598-7d6f-485c-8700-7a84a5261e1b","Type":"ContainerDied","Data":"84a991fb4c854c061295d6f14bf7e1a4aff85caa05d013cdd068ec903350716a"} Sep 30 08:13:03 crc kubenswrapper[4760]: I0930 08:13:03.067566 4760 scope.go:117] "RemoveContainer" containerID="78abf2ac3aa0bb6cb3e98a1200a88befbc3a37ea54b03c8ea3af358b0119d01b" Sep 30 08:13:03 crc kubenswrapper[4760]: E0930 08:13:03.067879 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f2lrk_openshift-machine-config-operator(7a9c8270-6964-4886-87d0-227b05b76da4)\"" pod="openshift-machine-config-operator/machine-config-daemon-f2lrk" podUID="7a9c8270-6964-4886-87d0-227b05b76da4" Sep 30 08:13:04 crc kubenswrapper[4760]: I0930 08:13:04.030233 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2kgrs" event={"ID":"05166598-7d6f-485c-8700-7a84a5261e1b","Type":"ContainerStarted","Data":"020cad18324e312292efd29a0c12b5a6eb7472217af1bd842d619bc3d63e3214"} Sep 30 08:13:04 crc kubenswrapper[4760]: I0930 08:13:04.032878 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4phwd" event={"ID":"cfc1fa61-9e5f-4fdc-8e83-cebb7344919b","Type":"ContainerStarted","Data":"61f3a9d29d1293e544bdf1ceabc56e35fbba91814339c1d874904c2536c93c6e"} Sep 30 08:13:04 crc kubenswrapper[4760]: I0930 08:13:04.065967 4760 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-2kgrs" podStartSLOduration=2.61868495 podStartE2EDuration="5.06594294s" podCreationTimestamp="2025-09-30 08:12:59 +0000 UTC" firstStartedPulling="2025-09-30 08:13:00.990069278 +0000 UTC m=+2366.632975730" lastFinishedPulling="2025-09-30 08:13:03.437327298 +0000 UTC m=+2369.080233720" observedRunningTime="2025-09-30 08:13:04.054786025 +0000 UTC 
m=+2369.697692447" watchObservedRunningTime="2025-09-30 08:13:04.06594294 +0000 UTC m=+2369.708849342" Sep 30 08:13:04 crc kubenswrapper[4760]: I0930 08:13:04.077758 4760 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-4phwd" podStartSLOduration=2.557586128 podStartE2EDuration="5.077733721s" podCreationTimestamp="2025-09-30 08:12:59 +0000 UTC" firstStartedPulling="2025-09-30 08:13:00.988333624 +0000 UTC m=+2366.631240036" lastFinishedPulling="2025-09-30 08:13:03.508481207 +0000 UTC m=+2369.151387629" observedRunningTime="2025-09-30 08:13:04.074062587 +0000 UTC m=+2369.716969019" watchObservedRunningTime="2025-09-30 08:13:04.077733721 +0000 UTC m=+2369.720640133" Sep 30 08:13:09 crc kubenswrapper[4760]: I0930 08:13:09.536238 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-2kgrs" Sep 30 08:13:09 crc kubenswrapper[4760]: I0930 08:13:09.536903 4760 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-2kgrs" Sep 30 08:13:09 crc kubenswrapper[4760]: I0930 08:13:09.626661 4760 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-2kgrs" Sep 30 08:13:09 crc kubenswrapper[4760]: I0930 08:13:09.736147 4760 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-4phwd" Sep 30 08:13:09 crc kubenswrapper[4760]: I0930 08:13:09.736198 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-4phwd" Sep 30 08:13:09 crc kubenswrapper[4760]: I0930 08:13:09.823193 4760 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-4phwd" Sep 30 08:13:10 crc kubenswrapper[4760]: I0930 08:13:10.180209 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="ready" pod="openshift-marketplace/certified-operators-2kgrs" Sep 30 08:13:10 crc kubenswrapper[4760]: I0930 08:13:10.191293 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-4phwd" Sep 30 08:13:11 crc kubenswrapper[4760]: I0930 08:13:11.599782 4760 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-2kgrs"] Sep 30 08:13:12 crc kubenswrapper[4760]: I0930 08:13:12.125222 4760 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-2kgrs" podUID="05166598-7d6f-485c-8700-7a84a5261e1b" containerName="registry-server" containerID="cri-o://020cad18324e312292efd29a0c12b5a6eb7472217af1bd842d619bc3d63e3214" gracePeriod=2 Sep 30 08:13:12 crc kubenswrapper[4760]: I0930 08:13:12.601802 4760 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-4phwd"] Sep 30 08:13:12 crc kubenswrapper[4760]: I0930 08:13:12.602667 4760 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-4phwd" podUID="cfc1fa61-9e5f-4fdc-8e83-cebb7344919b" containerName="registry-server" containerID="cri-o://61f3a9d29d1293e544bdf1ceabc56e35fbba91814339c1d874904c2536c93c6e" gracePeriod=2 Sep 30 08:13:12 crc kubenswrapper[4760]: I0930 08:13:12.717083 4760 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-2kgrs" Sep 30 08:13:12 crc kubenswrapper[4760]: I0930 08:13:12.881598 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/05166598-7d6f-485c-8700-7a84a5261e1b-catalog-content\") pod \"05166598-7d6f-485c-8700-7a84a5261e1b\" (UID: \"05166598-7d6f-485c-8700-7a84a5261e1b\") " Sep 30 08:13:12 crc kubenswrapper[4760]: I0930 08:13:12.881683 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/05166598-7d6f-485c-8700-7a84a5261e1b-utilities\") pod \"05166598-7d6f-485c-8700-7a84a5261e1b\" (UID: \"05166598-7d6f-485c-8700-7a84a5261e1b\") " Sep 30 08:13:12 crc kubenswrapper[4760]: I0930 08:13:12.881935 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jtb4k\" (UniqueName: \"kubernetes.io/projected/05166598-7d6f-485c-8700-7a84a5261e1b-kube-api-access-jtb4k\") pod \"05166598-7d6f-485c-8700-7a84a5261e1b\" (UID: \"05166598-7d6f-485c-8700-7a84a5261e1b\") " Sep 30 08:13:12 crc kubenswrapper[4760]: I0930 08:13:12.882939 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/05166598-7d6f-485c-8700-7a84a5261e1b-utilities" (OuterVolumeSpecName: "utilities") pod "05166598-7d6f-485c-8700-7a84a5261e1b" (UID: "05166598-7d6f-485c-8700-7a84a5261e1b"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 08:13:12 crc kubenswrapper[4760]: I0930 08:13:12.897782 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/05166598-7d6f-485c-8700-7a84a5261e1b-kube-api-access-jtb4k" (OuterVolumeSpecName: "kube-api-access-jtb4k") pod "05166598-7d6f-485c-8700-7a84a5261e1b" (UID: "05166598-7d6f-485c-8700-7a84a5261e1b"). InnerVolumeSpecName "kube-api-access-jtb4k". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 08:13:12 crc kubenswrapper[4760]: I0930 08:13:12.985906 4760 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jtb4k\" (UniqueName: \"kubernetes.io/projected/05166598-7d6f-485c-8700-7a84a5261e1b-kube-api-access-jtb4k\") on node \"crc\" DevicePath \"\"" Sep 30 08:13:12 crc kubenswrapper[4760]: I0930 08:13:12.986013 4760 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/05166598-7d6f-485c-8700-7a84a5261e1b-utilities\") on node \"crc\" DevicePath \"\"" Sep 30 08:13:13 crc kubenswrapper[4760]: I0930 08:13:13.140837 4760 generic.go:334] "Generic (PLEG): container finished" podID="05166598-7d6f-485c-8700-7a84a5261e1b" containerID="020cad18324e312292efd29a0c12b5a6eb7472217af1bd842d619bc3d63e3214" exitCode=0 Sep 30 08:13:13 crc kubenswrapper[4760]: I0930 08:13:13.140941 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2kgrs" event={"ID":"05166598-7d6f-485c-8700-7a84a5261e1b","Type":"ContainerDied","Data":"020cad18324e312292efd29a0c12b5a6eb7472217af1bd842d619bc3d63e3214"} Sep 30 08:13:13 crc kubenswrapper[4760]: I0930 08:13:13.140949 4760 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-2kgrs" Sep 30 08:13:13 crc kubenswrapper[4760]: I0930 08:13:13.140992 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2kgrs" event={"ID":"05166598-7d6f-485c-8700-7a84a5261e1b","Type":"ContainerDied","Data":"db8862c806b41d442670329ba70c8b6a0f85d2e7bdcf98f86cc8075db24014a5"} Sep 30 08:13:13 crc kubenswrapper[4760]: I0930 08:13:13.141022 4760 scope.go:117] "RemoveContainer" containerID="020cad18324e312292efd29a0c12b5a6eb7472217af1bd842d619bc3d63e3214" Sep 30 08:13:13 crc kubenswrapper[4760]: I0930 08:13:13.179217 4760 scope.go:117] "RemoveContainer" containerID="84a991fb4c854c061295d6f14bf7e1a4aff85caa05d013cdd068ec903350716a" Sep 30 08:13:13 crc kubenswrapper[4760]: I0930 08:13:13.224959 4760 scope.go:117] "RemoveContainer" containerID="fed1a350592b51a7b4484ab857bc028c5b88dc62d9f32ce12bd1361c6a4bfc87" Sep 30 08:13:13 crc kubenswrapper[4760]: I0930 08:13:13.248092 4760 scope.go:117] "RemoveContainer" containerID="020cad18324e312292efd29a0c12b5a6eb7472217af1bd842d619bc3d63e3214" Sep 30 08:13:13 crc kubenswrapper[4760]: E0930 08:13:13.249740 4760 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"020cad18324e312292efd29a0c12b5a6eb7472217af1bd842d619bc3d63e3214\": container with ID starting with 020cad18324e312292efd29a0c12b5a6eb7472217af1bd842d619bc3d63e3214 not found: ID does not exist" containerID="020cad18324e312292efd29a0c12b5a6eb7472217af1bd842d619bc3d63e3214" Sep 30 08:13:13 crc kubenswrapper[4760]: I0930 08:13:13.249790 4760 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"020cad18324e312292efd29a0c12b5a6eb7472217af1bd842d619bc3d63e3214"} err="failed to get container status \"020cad18324e312292efd29a0c12b5a6eb7472217af1bd842d619bc3d63e3214\": rpc error: code = NotFound desc = could not find container 
\"020cad18324e312292efd29a0c12b5a6eb7472217af1bd842d619bc3d63e3214\": container with ID starting with 020cad18324e312292efd29a0c12b5a6eb7472217af1bd842d619bc3d63e3214 not found: ID does not exist" Sep 30 08:13:13 crc kubenswrapper[4760]: I0930 08:13:13.249828 4760 scope.go:117] "RemoveContainer" containerID="84a991fb4c854c061295d6f14bf7e1a4aff85caa05d013cdd068ec903350716a" Sep 30 08:13:13 crc kubenswrapper[4760]: E0930 08:13:13.250296 4760 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"84a991fb4c854c061295d6f14bf7e1a4aff85caa05d013cdd068ec903350716a\": container with ID starting with 84a991fb4c854c061295d6f14bf7e1a4aff85caa05d013cdd068ec903350716a not found: ID does not exist" containerID="84a991fb4c854c061295d6f14bf7e1a4aff85caa05d013cdd068ec903350716a" Sep 30 08:13:13 crc kubenswrapper[4760]: I0930 08:13:13.250364 4760 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"84a991fb4c854c061295d6f14bf7e1a4aff85caa05d013cdd068ec903350716a"} err="failed to get container status \"84a991fb4c854c061295d6f14bf7e1a4aff85caa05d013cdd068ec903350716a\": rpc error: code = NotFound desc = could not find container \"84a991fb4c854c061295d6f14bf7e1a4aff85caa05d013cdd068ec903350716a\": container with ID starting with 84a991fb4c854c061295d6f14bf7e1a4aff85caa05d013cdd068ec903350716a not found: ID does not exist" Sep 30 08:13:13 crc kubenswrapper[4760]: I0930 08:13:13.250393 4760 scope.go:117] "RemoveContainer" containerID="fed1a350592b51a7b4484ab857bc028c5b88dc62d9f32ce12bd1361c6a4bfc87" Sep 30 08:13:13 crc kubenswrapper[4760]: E0930 08:13:13.252571 4760 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fed1a350592b51a7b4484ab857bc028c5b88dc62d9f32ce12bd1361c6a4bfc87\": container with ID starting with fed1a350592b51a7b4484ab857bc028c5b88dc62d9f32ce12bd1361c6a4bfc87 not found: ID does not exist" 
containerID="fed1a350592b51a7b4484ab857bc028c5b88dc62d9f32ce12bd1361c6a4bfc87" Sep 30 08:13:13 crc kubenswrapper[4760]: I0930 08:13:13.252625 4760 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fed1a350592b51a7b4484ab857bc028c5b88dc62d9f32ce12bd1361c6a4bfc87"} err="failed to get container status \"fed1a350592b51a7b4484ab857bc028c5b88dc62d9f32ce12bd1361c6a4bfc87\": rpc error: code = NotFound desc = could not find container \"fed1a350592b51a7b4484ab857bc028c5b88dc62d9f32ce12bd1361c6a4bfc87\": container with ID starting with fed1a350592b51a7b4484ab857bc028c5b88dc62d9f32ce12bd1361c6a4bfc87 not found: ID does not exist" Sep 30 08:13:13 crc kubenswrapper[4760]: I0930 08:13:13.628635 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/05166598-7d6f-485c-8700-7a84a5261e1b-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "05166598-7d6f-485c-8700-7a84a5261e1b" (UID: "05166598-7d6f-485c-8700-7a84a5261e1b"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 08:13:13 crc kubenswrapper[4760]: I0930 08:13:13.704548 4760 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/05166598-7d6f-485c-8700-7a84a5261e1b-catalog-content\") on node \"crc\" DevicePath \"\"" Sep 30 08:13:13 crc kubenswrapper[4760]: I0930 08:13:13.799105 4760 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-2kgrs"] Sep 30 08:13:13 crc kubenswrapper[4760]: I0930 08:13:13.810177 4760 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-2kgrs"] Sep 30 08:13:14 crc kubenswrapper[4760]: I0930 08:13:14.177419 4760 generic.go:334] "Generic (PLEG): container finished" podID="cfc1fa61-9e5f-4fdc-8e83-cebb7344919b" containerID="61f3a9d29d1293e544bdf1ceabc56e35fbba91814339c1d874904c2536c93c6e" exitCode=0 Sep 30 08:13:14 crc kubenswrapper[4760]: I0930 08:13:14.177512 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4phwd" event={"ID":"cfc1fa61-9e5f-4fdc-8e83-cebb7344919b","Type":"ContainerDied","Data":"61f3a9d29d1293e544bdf1ceabc56e35fbba91814339c1d874904c2536c93c6e"} Sep 30 08:13:14 crc kubenswrapper[4760]: I0930 08:13:14.611676 4760 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-4phwd" Sep 30 08:13:14 crc kubenswrapper[4760]: I0930 08:13:14.628887 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cfc1fa61-9e5f-4fdc-8e83-cebb7344919b-catalog-content\") pod \"cfc1fa61-9e5f-4fdc-8e83-cebb7344919b\" (UID: \"cfc1fa61-9e5f-4fdc-8e83-cebb7344919b\") " Sep 30 08:13:14 crc kubenswrapper[4760]: I0930 08:13:14.629017 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cfc1fa61-9e5f-4fdc-8e83-cebb7344919b-utilities\") pod \"cfc1fa61-9e5f-4fdc-8e83-cebb7344919b\" (UID: \"cfc1fa61-9e5f-4fdc-8e83-cebb7344919b\") " Sep 30 08:13:14 crc kubenswrapper[4760]: I0930 08:13:14.629191 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q67ks\" (UniqueName: \"kubernetes.io/projected/cfc1fa61-9e5f-4fdc-8e83-cebb7344919b-kube-api-access-q67ks\") pod \"cfc1fa61-9e5f-4fdc-8e83-cebb7344919b\" (UID: \"cfc1fa61-9e5f-4fdc-8e83-cebb7344919b\") " Sep 30 08:13:14 crc kubenswrapper[4760]: I0930 08:13:14.631099 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cfc1fa61-9e5f-4fdc-8e83-cebb7344919b-utilities" (OuterVolumeSpecName: "utilities") pod "cfc1fa61-9e5f-4fdc-8e83-cebb7344919b" (UID: "cfc1fa61-9e5f-4fdc-8e83-cebb7344919b"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 08:13:14 crc kubenswrapper[4760]: I0930 08:13:14.641592 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cfc1fa61-9e5f-4fdc-8e83-cebb7344919b-kube-api-access-q67ks" (OuterVolumeSpecName: "kube-api-access-q67ks") pod "cfc1fa61-9e5f-4fdc-8e83-cebb7344919b" (UID: "cfc1fa61-9e5f-4fdc-8e83-cebb7344919b"). InnerVolumeSpecName "kube-api-access-q67ks". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 08:13:14 crc kubenswrapper[4760]: I0930 08:13:14.699696 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cfc1fa61-9e5f-4fdc-8e83-cebb7344919b-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "cfc1fa61-9e5f-4fdc-8e83-cebb7344919b" (UID: "cfc1fa61-9e5f-4fdc-8e83-cebb7344919b"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 08:13:14 crc kubenswrapper[4760]: I0930 08:13:14.733729 4760 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q67ks\" (UniqueName: \"kubernetes.io/projected/cfc1fa61-9e5f-4fdc-8e83-cebb7344919b-kube-api-access-q67ks\") on node \"crc\" DevicePath \"\"" Sep 30 08:13:14 crc kubenswrapper[4760]: I0930 08:13:14.733779 4760 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cfc1fa61-9e5f-4fdc-8e83-cebb7344919b-catalog-content\") on node \"crc\" DevicePath \"\"" Sep 30 08:13:14 crc kubenswrapper[4760]: I0930 08:13:14.733792 4760 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cfc1fa61-9e5f-4fdc-8e83-cebb7344919b-utilities\") on node \"crc\" DevicePath \"\"" Sep 30 08:13:15 crc kubenswrapper[4760]: I0930 08:13:15.080026 4760 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="05166598-7d6f-485c-8700-7a84a5261e1b" path="/var/lib/kubelet/pods/05166598-7d6f-485c-8700-7a84a5261e1b/volumes" Sep 30 08:13:15 crc kubenswrapper[4760]: I0930 08:13:15.195327 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4phwd" event={"ID":"cfc1fa61-9e5f-4fdc-8e83-cebb7344919b","Type":"ContainerDied","Data":"2b54f38e7ec07e855cda15b5869f07c572643a508b7346ae5181318a92e52ed1"} Sep 30 08:13:15 crc kubenswrapper[4760]: I0930 08:13:15.195414 4760 scope.go:117] "RemoveContainer" 
containerID="61f3a9d29d1293e544bdf1ceabc56e35fbba91814339c1d874904c2536c93c6e" Sep 30 08:13:15 crc kubenswrapper[4760]: I0930 08:13:15.195413 4760 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-4phwd" Sep 30 08:13:15 crc kubenswrapper[4760]: I0930 08:13:15.229464 4760 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-4phwd"] Sep 30 08:13:15 crc kubenswrapper[4760]: I0930 08:13:15.237897 4760 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-4phwd"] Sep 30 08:13:15 crc kubenswrapper[4760]: I0930 08:13:15.241179 4760 scope.go:117] "RemoveContainer" containerID="8e9fbdf3546b80c496acad0592dfb715547b468fbdb8902d15dd623a07cd537e" Sep 30 08:13:15 crc kubenswrapper[4760]: I0930 08:13:15.275115 4760 scope.go:117] "RemoveContainer" containerID="0570bf5a1f61fb546f688391a613937ab48da49b5432050de9a40e4b5a3f888e" Sep 30 08:13:17 crc kubenswrapper[4760]: I0930 08:13:17.067518 4760 scope.go:117] "RemoveContainer" containerID="78abf2ac3aa0bb6cb3e98a1200a88befbc3a37ea54b03c8ea3af358b0119d01b" Sep 30 08:13:17 crc kubenswrapper[4760]: E0930 08:13:17.068210 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f2lrk_openshift-machine-config-operator(7a9c8270-6964-4886-87d0-227b05b76da4)\"" pod="openshift-machine-config-operator/machine-config-daemon-f2lrk" podUID="7a9c8270-6964-4886-87d0-227b05b76da4" Sep 30 08:13:17 crc kubenswrapper[4760]: I0930 08:13:17.082521 4760 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cfc1fa61-9e5f-4fdc-8e83-cebb7344919b" path="/var/lib/kubelet/pods/cfc1fa61-9e5f-4fdc-8e83-cebb7344919b/volumes" Sep 30 08:13:29 crc kubenswrapper[4760]: I0930 08:13:29.067640 4760 scope.go:117] 
"RemoveContainer" containerID="78abf2ac3aa0bb6cb3e98a1200a88befbc3a37ea54b03c8ea3af358b0119d01b" Sep 30 08:13:29 crc kubenswrapper[4760]: E0930 08:13:29.069114 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f2lrk_openshift-machine-config-operator(7a9c8270-6964-4886-87d0-227b05b76da4)\"" pod="openshift-machine-config-operator/machine-config-daemon-f2lrk" podUID="7a9c8270-6964-4886-87d0-227b05b76da4" Sep 30 08:13:40 crc kubenswrapper[4760]: I0930 08:13:40.068104 4760 scope.go:117] "RemoveContainer" containerID="78abf2ac3aa0bb6cb3e98a1200a88befbc3a37ea54b03c8ea3af358b0119d01b" Sep 30 08:13:40 crc kubenswrapper[4760]: E0930 08:13:40.068985 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f2lrk_openshift-machine-config-operator(7a9c8270-6964-4886-87d0-227b05b76da4)\"" pod="openshift-machine-config-operator/machine-config-daemon-f2lrk" podUID="7a9c8270-6964-4886-87d0-227b05b76da4" Sep 30 08:13:51 crc kubenswrapper[4760]: I0930 08:13:51.067379 4760 scope.go:117] "RemoveContainer" containerID="78abf2ac3aa0bb6cb3e98a1200a88befbc3a37ea54b03c8ea3af358b0119d01b" Sep 30 08:13:51 crc kubenswrapper[4760]: E0930 08:13:51.068287 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f2lrk_openshift-machine-config-operator(7a9c8270-6964-4886-87d0-227b05b76da4)\"" pod="openshift-machine-config-operator/machine-config-daemon-f2lrk" podUID="7a9c8270-6964-4886-87d0-227b05b76da4" Sep 30 08:14:03 crc kubenswrapper[4760]: I0930 08:14:03.067858 
4760 scope.go:117] "RemoveContainer" containerID="78abf2ac3aa0bb6cb3e98a1200a88befbc3a37ea54b03c8ea3af358b0119d01b" Sep 30 08:14:03 crc kubenswrapper[4760]: E0930 08:14:03.068885 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f2lrk_openshift-machine-config-operator(7a9c8270-6964-4886-87d0-227b05b76da4)\"" pod="openshift-machine-config-operator/machine-config-daemon-f2lrk" podUID="7a9c8270-6964-4886-87d0-227b05b76da4" Sep 30 08:14:14 crc kubenswrapper[4760]: I0930 08:14:14.067434 4760 scope.go:117] "RemoveContainer" containerID="78abf2ac3aa0bb6cb3e98a1200a88befbc3a37ea54b03c8ea3af358b0119d01b" Sep 30 08:14:14 crc kubenswrapper[4760]: E0930 08:14:14.069112 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f2lrk_openshift-machine-config-operator(7a9c8270-6964-4886-87d0-227b05b76da4)\"" pod="openshift-machine-config-operator/machine-config-daemon-f2lrk" podUID="7a9c8270-6964-4886-87d0-227b05b76da4" Sep 30 08:14:28 crc kubenswrapper[4760]: I0930 08:14:28.067974 4760 scope.go:117] "RemoveContainer" containerID="78abf2ac3aa0bb6cb3e98a1200a88befbc3a37ea54b03c8ea3af358b0119d01b" Sep 30 08:14:28 crc kubenswrapper[4760]: E0930 08:14:28.068975 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f2lrk_openshift-machine-config-operator(7a9c8270-6964-4886-87d0-227b05b76da4)\"" pod="openshift-machine-config-operator/machine-config-daemon-f2lrk" podUID="7a9c8270-6964-4886-87d0-227b05b76da4" Sep 30 08:14:42 crc kubenswrapper[4760]: I0930 
08:14:42.067140 4760 scope.go:117] "RemoveContainer" containerID="78abf2ac3aa0bb6cb3e98a1200a88befbc3a37ea54b03c8ea3af358b0119d01b" Sep 30 08:14:42 crc kubenswrapper[4760]: E0930 08:14:42.068498 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f2lrk_openshift-machine-config-operator(7a9c8270-6964-4886-87d0-227b05b76da4)\"" pod="openshift-machine-config-operator/machine-config-daemon-f2lrk" podUID="7a9c8270-6964-4886-87d0-227b05b76da4" Sep 30 08:14:54 crc kubenswrapper[4760]: I0930 08:14:54.069178 4760 scope.go:117] "RemoveContainer" containerID="78abf2ac3aa0bb6cb3e98a1200a88befbc3a37ea54b03c8ea3af358b0119d01b" Sep 30 08:14:54 crc kubenswrapper[4760]: I0930 08:14:54.367866 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-f2lrk" event={"ID":"7a9c8270-6964-4886-87d0-227b05b76da4","Type":"ContainerStarted","Data":"6995e454dc7acd7e872934378c7d2f24cfd58f25173eb39c76a5116f44789266"} Sep 30 08:15:00 crc kubenswrapper[4760]: I0930 08:15:00.153533 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29320335-zr8b5"] Sep 30 08:15:00 crc kubenswrapper[4760]: E0930 08:15:00.160393 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cfc1fa61-9e5f-4fdc-8e83-cebb7344919b" containerName="extract-content" Sep 30 08:15:00 crc kubenswrapper[4760]: I0930 08:15:00.160426 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="cfc1fa61-9e5f-4fdc-8e83-cebb7344919b" containerName="extract-content" Sep 30 08:15:00 crc kubenswrapper[4760]: E0930 08:15:00.160448 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="05166598-7d6f-485c-8700-7a84a5261e1b" containerName="extract-content" Sep 30 08:15:00 crc kubenswrapper[4760]: I0930 08:15:00.160456 
4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="05166598-7d6f-485c-8700-7a84a5261e1b" containerName="extract-content" Sep 30 08:15:00 crc kubenswrapper[4760]: E0930 08:15:00.160473 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="05166598-7d6f-485c-8700-7a84a5261e1b" containerName="registry-server" Sep 30 08:15:00 crc kubenswrapper[4760]: I0930 08:15:00.160480 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="05166598-7d6f-485c-8700-7a84a5261e1b" containerName="registry-server" Sep 30 08:15:00 crc kubenswrapper[4760]: E0930 08:15:00.160495 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="05166598-7d6f-485c-8700-7a84a5261e1b" containerName="extract-utilities" Sep 30 08:15:00 crc kubenswrapper[4760]: I0930 08:15:00.160503 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="05166598-7d6f-485c-8700-7a84a5261e1b" containerName="extract-utilities" Sep 30 08:15:00 crc kubenswrapper[4760]: E0930 08:15:00.160514 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cfc1fa61-9e5f-4fdc-8e83-cebb7344919b" containerName="extract-utilities" Sep 30 08:15:00 crc kubenswrapper[4760]: I0930 08:15:00.160522 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="cfc1fa61-9e5f-4fdc-8e83-cebb7344919b" containerName="extract-utilities" Sep 30 08:15:00 crc kubenswrapper[4760]: E0930 08:15:00.160589 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cfc1fa61-9e5f-4fdc-8e83-cebb7344919b" containerName="registry-server" Sep 30 08:15:00 crc kubenswrapper[4760]: I0930 08:15:00.160598 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="cfc1fa61-9e5f-4fdc-8e83-cebb7344919b" containerName="registry-server" Sep 30 08:15:00 crc kubenswrapper[4760]: I0930 08:15:00.160845 4760 memory_manager.go:354] "RemoveStaleState removing state" podUID="cfc1fa61-9e5f-4fdc-8e83-cebb7344919b" containerName="registry-server" Sep 30 08:15:00 crc kubenswrapper[4760]: I0930 08:15:00.160872 4760 
memory_manager.go:354] "RemoveStaleState removing state" podUID="05166598-7d6f-485c-8700-7a84a5261e1b" containerName="registry-server" Sep 30 08:15:00 crc kubenswrapper[4760]: I0930 08:15:00.164564 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29320335-zr8b5" Sep 30 08:15:00 crc kubenswrapper[4760]: I0930 08:15:00.170391 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29320335-zr8b5"] Sep 30 08:15:00 crc kubenswrapper[4760]: I0930 08:15:00.171244 4760 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Sep 30 08:15:00 crc kubenswrapper[4760]: I0930 08:15:00.171262 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Sep 30 08:15:00 crc kubenswrapper[4760]: I0930 08:15:00.372500 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/d6e89699-0bf0-4749-b268-967ef8499eae-secret-volume\") pod \"collect-profiles-29320335-zr8b5\" (UID: \"d6e89699-0bf0-4749-b268-967ef8499eae\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320335-zr8b5" Sep 30 08:15:00 crc kubenswrapper[4760]: I0930 08:15:00.373017 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vmbzt\" (UniqueName: \"kubernetes.io/projected/d6e89699-0bf0-4749-b268-967ef8499eae-kube-api-access-vmbzt\") pod \"collect-profiles-29320335-zr8b5\" (UID: \"d6e89699-0bf0-4749-b268-967ef8499eae\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320335-zr8b5" Sep 30 08:15:00 crc kubenswrapper[4760]: I0930 08:15:00.373230 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"config-volume\" (UniqueName: \"kubernetes.io/configmap/d6e89699-0bf0-4749-b268-967ef8499eae-config-volume\") pod \"collect-profiles-29320335-zr8b5\" (UID: \"d6e89699-0bf0-4749-b268-967ef8499eae\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320335-zr8b5" Sep 30 08:15:00 crc kubenswrapper[4760]: I0930 08:15:00.475016 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/d6e89699-0bf0-4749-b268-967ef8499eae-secret-volume\") pod \"collect-profiles-29320335-zr8b5\" (UID: \"d6e89699-0bf0-4749-b268-967ef8499eae\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320335-zr8b5" Sep 30 08:15:00 crc kubenswrapper[4760]: I0930 08:15:00.475098 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vmbzt\" (UniqueName: \"kubernetes.io/projected/d6e89699-0bf0-4749-b268-967ef8499eae-kube-api-access-vmbzt\") pod \"collect-profiles-29320335-zr8b5\" (UID: \"d6e89699-0bf0-4749-b268-967ef8499eae\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320335-zr8b5" Sep 30 08:15:00 crc kubenswrapper[4760]: I0930 08:15:00.475214 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d6e89699-0bf0-4749-b268-967ef8499eae-config-volume\") pod \"collect-profiles-29320335-zr8b5\" (UID: \"d6e89699-0bf0-4749-b268-967ef8499eae\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320335-zr8b5" Sep 30 08:15:00 crc kubenswrapper[4760]: I0930 08:15:00.476469 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d6e89699-0bf0-4749-b268-967ef8499eae-config-volume\") pod \"collect-profiles-29320335-zr8b5\" (UID: \"d6e89699-0bf0-4749-b268-967ef8499eae\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320335-zr8b5" Sep 30 08:15:00 crc 
kubenswrapper[4760]: I0930 08:15:00.482433 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/d6e89699-0bf0-4749-b268-967ef8499eae-secret-volume\") pod \"collect-profiles-29320335-zr8b5\" (UID: \"d6e89699-0bf0-4749-b268-967ef8499eae\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320335-zr8b5" Sep 30 08:15:00 crc kubenswrapper[4760]: I0930 08:15:00.496186 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vmbzt\" (UniqueName: \"kubernetes.io/projected/d6e89699-0bf0-4749-b268-967ef8499eae-kube-api-access-vmbzt\") pod \"collect-profiles-29320335-zr8b5\" (UID: \"d6e89699-0bf0-4749-b268-967ef8499eae\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320335-zr8b5" Sep 30 08:15:00 crc kubenswrapper[4760]: I0930 08:15:00.792122 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29320335-zr8b5" Sep 30 08:15:01 crc kubenswrapper[4760]: I0930 08:15:01.110317 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29320335-zr8b5"] Sep 30 08:15:01 crc kubenswrapper[4760]: I0930 08:15:01.445472 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29320335-zr8b5" event={"ID":"d6e89699-0bf0-4749-b268-967ef8499eae","Type":"ContainerStarted","Data":"5b05ae1dbf71a64d4e9167a735f528a75f6de5af266832ccf05df8077d8891ae"} Sep 30 08:15:01 crc kubenswrapper[4760]: I0930 08:15:01.445510 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29320335-zr8b5" event={"ID":"d6e89699-0bf0-4749-b268-967ef8499eae","Type":"ContainerStarted","Data":"f1a1abe02c9b5410fffb2d83a1405f46ac1ef2924cb572f539bc036585cd249e"} Sep 30 08:15:01 crc kubenswrapper[4760]: I0930 08:15:01.466172 4760 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29320335-zr8b5" podStartSLOduration=1.466156355 podStartE2EDuration="1.466156355s" podCreationTimestamp="2025-09-30 08:15:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 08:15:01.459850414 +0000 UTC m=+2487.102756826" watchObservedRunningTime="2025-09-30 08:15:01.466156355 +0000 UTC m=+2487.109062767" Sep 30 08:15:02 crc kubenswrapper[4760]: I0930 08:15:02.454763 4760 generic.go:334] "Generic (PLEG): container finished" podID="d6e89699-0bf0-4749-b268-967ef8499eae" containerID="5b05ae1dbf71a64d4e9167a735f528a75f6de5af266832ccf05df8077d8891ae" exitCode=0 Sep 30 08:15:02 crc kubenswrapper[4760]: I0930 08:15:02.454816 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29320335-zr8b5" event={"ID":"d6e89699-0bf0-4749-b268-967ef8499eae","Type":"ContainerDied","Data":"5b05ae1dbf71a64d4e9167a735f528a75f6de5af266832ccf05df8077d8891ae"} Sep 30 08:15:03 crc kubenswrapper[4760]: I0930 08:15:03.831811 4760 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29320335-zr8b5" Sep 30 08:15:03 crc kubenswrapper[4760]: I0930 08:15:03.947787 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d6e89699-0bf0-4749-b268-967ef8499eae-config-volume\") pod \"d6e89699-0bf0-4749-b268-967ef8499eae\" (UID: \"d6e89699-0bf0-4749-b268-967ef8499eae\") " Sep 30 08:15:03 crc kubenswrapper[4760]: I0930 08:15:03.947898 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vmbzt\" (UniqueName: \"kubernetes.io/projected/d6e89699-0bf0-4749-b268-967ef8499eae-kube-api-access-vmbzt\") pod \"d6e89699-0bf0-4749-b268-967ef8499eae\" (UID: \"d6e89699-0bf0-4749-b268-967ef8499eae\") " Sep 30 08:15:03 crc kubenswrapper[4760]: I0930 08:15:03.947999 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/d6e89699-0bf0-4749-b268-967ef8499eae-secret-volume\") pod \"d6e89699-0bf0-4749-b268-967ef8499eae\" (UID: \"d6e89699-0bf0-4749-b268-967ef8499eae\") " Sep 30 08:15:03 crc kubenswrapper[4760]: I0930 08:15:03.948936 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d6e89699-0bf0-4749-b268-967ef8499eae-config-volume" (OuterVolumeSpecName: "config-volume") pod "d6e89699-0bf0-4749-b268-967ef8499eae" (UID: "d6e89699-0bf0-4749-b268-967ef8499eae"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 08:15:03 crc kubenswrapper[4760]: I0930 08:15:03.955359 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d6e89699-0bf0-4749-b268-967ef8499eae-kube-api-access-vmbzt" (OuterVolumeSpecName: "kube-api-access-vmbzt") pod "d6e89699-0bf0-4749-b268-967ef8499eae" (UID: "d6e89699-0bf0-4749-b268-967ef8499eae"). 
InnerVolumeSpecName "kube-api-access-vmbzt". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 08:15:03 crc kubenswrapper[4760]: I0930 08:15:03.955376 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d6e89699-0bf0-4749-b268-967ef8499eae-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "d6e89699-0bf0-4749-b268-967ef8499eae" (UID: "d6e89699-0bf0-4749-b268-967ef8499eae"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 08:15:04 crc kubenswrapper[4760]: I0930 08:15:04.049793 4760 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d6e89699-0bf0-4749-b268-967ef8499eae-config-volume\") on node \"crc\" DevicePath \"\"" Sep 30 08:15:04 crc kubenswrapper[4760]: I0930 08:15:04.049832 4760 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vmbzt\" (UniqueName: \"kubernetes.io/projected/d6e89699-0bf0-4749-b268-967ef8499eae-kube-api-access-vmbzt\") on node \"crc\" DevicePath \"\"" Sep 30 08:15:04 crc kubenswrapper[4760]: I0930 08:15:04.049846 4760 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/d6e89699-0bf0-4749-b268-967ef8499eae-secret-volume\") on node \"crc\" DevicePath \"\"" Sep 30 08:15:04 crc kubenswrapper[4760]: I0930 08:15:04.512447 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29320335-zr8b5" event={"ID":"d6e89699-0bf0-4749-b268-967ef8499eae","Type":"ContainerDied","Data":"f1a1abe02c9b5410fffb2d83a1405f46ac1ef2924cb572f539bc036585cd249e"} Sep 30 08:15:04 crc kubenswrapper[4760]: I0930 08:15:04.512527 4760 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f1a1abe02c9b5410fffb2d83a1405f46ac1ef2924cb572f539bc036585cd249e" Sep 30 08:15:04 crc kubenswrapper[4760]: I0930 08:15:04.512555 4760 util.go:48] "No ready 
sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29320335-zr8b5" Sep 30 08:15:04 crc kubenswrapper[4760]: I0930 08:15:04.555803 4760 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29320290-w8n6p"] Sep 30 08:15:04 crc kubenswrapper[4760]: I0930 08:15:04.564533 4760 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29320290-w8n6p"] Sep 30 08:15:05 crc kubenswrapper[4760]: I0930 08:15:05.078747 4760 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="db316304-e02f-43df-b4e2-6cdd6ee3b7eb" path="/var/lib/kubelet/pods/db316304-e02f-43df-b4e2-6cdd6ee3b7eb/volumes" Sep 30 08:15:45 crc kubenswrapper[4760]: I0930 08:15:45.983217 4760 scope.go:117] "RemoveContainer" containerID="a6a4df1c779996b80c57ea79d6eadfb720c63d9facc8b5e93311cdc717996acc" Sep 30 08:16:01 crc kubenswrapper[4760]: E0930 08:16:01.373177 4760 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod33cc4d6c_b086_410c_b38e_f6c918657a74.slice/crio-c9bf51b5a8f437438e4c9920e49205135a53165a7dfdd2343f200c36ba338b7c.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod33cc4d6c_b086_410c_b38e_f6c918657a74.slice/crio-conmon-c9bf51b5a8f437438e4c9920e49205135a53165a7dfdd2343f200c36ba338b7c.scope\": RecentStats: unable to find data in memory cache]" Sep 30 08:16:02 crc kubenswrapper[4760]: I0930 08:16:02.206692 4760 generic.go:334] "Generic (PLEG): container finished" podID="33cc4d6c-b086-410c-b38e-f6c918657a74" containerID="c9bf51b5a8f437438e4c9920e49205135a53165a7dfdd2343f200c36ba338b7c" exitCode=0 Sep 30 08:16:02 crc kubenswrapper[4760]: I0930 08:16:02.206757 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-5ncgd" event={"ID":"33cc4d6c-b086-410c-b38e-f6c918657a74","Type":"ContainerDied","Data":"c9bf51b5a8f437438e4c9920e49205135a53165a7dfdd2343f200c36ba338b7c"} Sep 30 08:16:03 crc kubenswrapper[4760]: I0930 08:16:03.641660 4760 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-5ncgd" Sep 30 08:16:03 crc kubenswrapper[4760]: I0930 08:16:03.770160 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/33cc4d6c-b086-410c-b38e-f6c918657a74-ssh-key\") pod \"33cc4d6c-b086-410c-b38e-f6c918657a74\" (UID: \"33cc4d6c-b086-410c-b38e-f6c918657a74\") " Sep 30 08:16:03 crc kubenswrapper[4760]: I0930 08:16:03.770258 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/33cc4d6c-b086-410c-b38e-f6c918657a74-nova-migration-ssh-key-1\") pod \"33cc4d6c-b086-410c-b38e-f6c918657a74\" (UID: \"33cc4d6c-b086-410c-b38e-f6c918657a74\") " Sep 30 08:16:03 crc kubenswrapper[4760]: I0930 08:16:03.770348 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/33cc4d6c-b086-410c-b38e-f6c918657a74-nova-cell1-compute-config-0\") pod \"33cc4d6c-b086-410c-b38e-f6c918657a74\" (UID: \"33cc4d6c-b086-410c-b38e-f6c918657a74\") " Sep 30 08:16:03 crc kubenswrapper[4760]: I0930 08:16:03.770428 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/33cc4d6c-b086-410c-b38e-f6c918657a74-nova-migration-ssh-key-0\") pod \"33cc4d6c-b086-410c-b38e-f6c918657a74\" (UID: \"33cc4d6c-b086-410c-b38e-f6c918657a74\") " Sep 30 08:16:03 crc kubenswrapper[4760]: I0930 08:16:03.770620 4760 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/33cc4d6c-b086-410c-b38e-f6c918657a74-nova-combined-ca-bundle\") pod \"33cc4d6c-b086-410c-b38e-f6c918657a74\" (UID: \"33cc4d6c-b086-410c-b38e-f6c918657a74\") " Sep 30 08:16:03 crc kubenswrapper[4760]: I0930 08:16:03.770751 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/33cc4d6c-b086-410c-b38e-f6c918657a74-nova-cell1-compute-config-1\") pod \"33cc4d6c-b086-410c-b38e-f6c918657a74\" (UID: \"33cc4d6c-b086-410c-b38e-f6c918657a74\") " Sep 30 08:16:03 crc kubenswrapper[4760]: I0930 08:16:03.770840 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/33cc4d6c-b086-410c-b38e-f6c918657a74-nova-extra-config-0\") pod \"33cc4d6c-b086-410c-b38e-f6c918657a74\" (UID: \"33cc4d6c-b086-410c-b38e-f6c918657a74\") " Sep 30 08:16:03 crc kubenswrapper[4760]: I0930 08:16:03.770918 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5cts9\" (UniqueName: \"kubernetes.io/projected/33cc4d6c-b086-410c-b38e-f6c918657a74-kube-api-access-5cts9\") pod \"33cc4d6c-b086-410c-b38e-f6c918657a74\" (UID: \"33cc4d6c-b086-410c-b38e-f6c918657a74\") " Sep 30 08:16:03 crc kubenswrapper[4760]: I0930 08:16:03.770967 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/33cc4d6c-b086-410c-b38e-f6c918657a74-inventory\") pod \"33cc4d6c-b086-410c-b38e-f6c918657a74\" (UID: \"33cc4d6c-b086-410c-b38e-f6c918657a74\") " Sep 30 08:16:03 crc kubenswrapper[4760]: I0930 08:16:03.779161 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/33cc4d6c-b086-410c-b38e-f6c918657a74-nova-combined-ca-bundle" (OuterVolumeSpecName: "nova-combined-ca-bundle") 
pod "33cc4d6c-b086-410c-b38e-f6c918657a74" (UID: "33cc4d6c-b086-410c-b38e-f6c918657a74"). InnerVolumeSpecName "nova-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 08:16:03 crc kubenswrapper[4760]: I0930 08:16:03.784539 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/33cc4d6c-b086-410c-b38e-f6c918657a74-kube-api-access-5cts9" (OuterVolumeSpecName: "kube-api-access-5cts9") pod "33cc4d6c-b086-410c-b38e-f6c918657a74" (UID: "33cc4d6c-b086-410c-b38e-f6c918657a74"). InnerVolumeSpecName "kube-api-access-5cts9". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 08:16:03 crc kubenswrapper[4760]: I0930 08:16:03.800876 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/33cc4d6c-b086-410c-b38e-f6c918657a74-nova-migration-ssh-key-0" (OuterVolumeSpecName: "nova-migration-ssh-key-0") pod "33cc4d6c-b086-410c-b38e-f6c918657a74" (UID: "33cc4d6c-b086-410c-b38e-f6c918657a74"). InnerVolumeSpecName "nova-migration-ssh-key-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 08:16:03 crc kubenswrapper[4760]: I0930 08:16:03.806208 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/33cc4d6c-b086-410c-b38e-f6c918657a74-nova-extra-config-0" (OuterVolumeSpecName: "nova-extra-config-0") pod "33cc4d6c-b086-410c-b38e-f6c918657a74" (UID: "33cc4d6c-b086-410c-b38e-f6c918657a74"). InnerVolumeSpecName "nova-extra-config-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 08:16:03 crc kubenswrapper[4760]: I0930 08:16:03.807882 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/33cc4d6c-b086-410c-b38e-f6c918657a74-nova-cell1-compute-config-0" (OuterVolumeSpecName: "nova-cell1-compute-config-0") pod "33cc4d6c-b086-410c-b38e-f6c918657a74" (UID: "33cc4d6c-b086-410c-b38e-f6c918657a74"). InnerVolumeSpecName "nova-cell1-compute-config-0". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 08:16:03 crc kubenswrapper[4760]: I0930 08:16:03.808838 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/33cc4d6c-b086-410c-b38e-f6c918657a74-nova-cell1-compute-config-1" (OuterVolumeSpecName: "nova-cell1-compute-config-1") pod "33cc4d6c-b086-410c-b38e-f6c918657a74" (UID: "33cc4d6c-b086-410c-b38e-f6c918657a74"). InnerVolumeSpecName "nova-cell1-compute-config-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 08:16:03 crc kubenswrapper[4760]: I0930 08:16:03.818693 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/33cc4d6c-b086-410c-b38e-f6c918657a74-inventory" (OuterVolumeSpecName: "inventory") pod "33cc4d6c-b086-410c-b38e-f6c918657a74" (UID: "33cc4d6c-b086-410c-b38e-f6c918657a74"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 08:16:03 crc kubenswrapper[4760]: I0930 08:16:03.821876 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/33cc4d6c-b086-410c-b38e-f6c918657a74-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "33cc4d6c-b086-410c-b38e-f6c918657a74" (UID: "33cc4d6c-b086-410c-b38e-f6c918657a74"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 08:16:03 crc kubenswrapper[4760]: I0930 08:16:03.843268 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/33cc4d6c-b086-410c-b38e-f6c918657a74-nova-migration-ssh-key-1" (OuterVolumeSpecName: "nova-migration-ssh-key-1") pod "33cc4d6c-b086-410c-b38e-f6c918657a74" (UID: "33cc4d6c-b086-410c-b38e-f6c918657a74"). InnerVolumeSpecName "nova-migration-ssh-key-1". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 08:16:03 crc kubenswrapper[4760]: I0930 08:16:03.880035 4760 reconciler_common.go:293] "Volume detached for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/33cc4d6c-b086-410c-b38e-f6c918657a74-nova-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 08:16:03 crc kubenswrapper[4760]: I0930 08:16:03.880121 4760 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/33cc4d6c-b086-410c-b38e-f6c918657a74-nova-cell1-compute-config-1\") on node \"crc\" DevicePath \"\"" Sep 30 08:16:03 crc kubenswrapper[4760]: I0930 08:16:03.880142 4760 reconciler_common.go:293] "Volume detached for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/33cc4d6c-b086-410c-b38e-f6c918657a74-nova-extra-config-0\") on node \"crc\" DevicePath \"\"" Sep 30 08:16:03 crc kubenswrapper[4760]: I0930 08:16:03.880158 4760 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5cts9\" (UniqueName: \"kubernetes.io/projected/33cc4d6c-b086-410c-b38e-f6c918657a74-kube-api-access-5cts9\") on node \"crc\" DevicePath \"\"" Sep 30 08:16:03 crc kubenswrapper[4760]: I0930 08:16:03.880174 4760 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/33cc4d6c-b086-410c-b38e-f6c918657a74-inventory\") on node \"crc\" DevicePath \"\"" Sep 30 08:16:03 crc kubenswrapper[4760]: I0930 08:16:03.880188 4760 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/33cc4d6c-b086-410c-b38e-f6c918657a74-ssh-key\") on node \"crc\" DevicePath \"\"" Sep 30 08:16:03 crc kubenswrapper[4760]: I0930 08:16:03.880201 4760 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/33cc4d6c-b086-410c-b38e-f6c918657a74-nova-migration-ssh-key-1\") on node \"crc\" DevicePath \"\"" Sep 30 08:16:03 crc 
kubenswrapper[4760]: I0930 08:16:03.880216 4760 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/33cc4d6c-b086-410c-b38e-f6c918657a74-nova-cell1-compute-config-0\") on node \"crc\" DevicePath \"\"" Sep 30 08:16:03 crc kubenswrapper[4760]: I0930 08:16:03.880231 4760 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/33cc4d6c-b086-410c-b38e-f6c918657a74-nova-migration-ssh-key-0\") on node \"crc\" DevicePath \"\"" Sep 30 08:16:04 crc kubenswrapper[4760]: I0930 08:16:04.234212 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-5ncgd" event={"ID":"33cc4d6c-b086-410c-b38e-f6c918657a74","Type":"ContainerDied","Data":"b3293324ff185099a460f9a45019748f079796dec06393b8682a5c6b935c870e"} Sep 30 08:16:04 crc kubenswrapper[4760]: I0930 08:16:04.234259 4760 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b3293324ff185099a460f9a45019748f079796dec06393b8682a5c6b935c870e" Sep 30 08:16:04 crc kubenswrapper[4760]: I0930 08:16:04.234282 4760 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-5ncgd" Sep 30 08:16:04 crc kubenswrapper[4760]: I0930 08:16:04.345214 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/telemetry-edpm-deployment-openstack-edpm-ipam-qpf2g"] Sep 30 08:16:04 crc kubenswrapper[4760]: E0930 08:16:04.345704 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="33cc4d6c-b086-410c-b38e-f6c918657a74" containerName="nova-edpm-deployment-openstack-edpm-ipam" Sep 30 08:16:04 crc kubenswrapper[4760]: I0930 08:16:04.345731 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="33cc4d6c-b086-410c-b38e-f6c918657a74" containerName="nova-edpm-deployment-openstack-edpm-ipam" Sep 30 08:16:04 crc kubenswrapper[4760]: E0930 08:16:04.345799 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d6e89699-0bf0-4749-b268-967ef8499eae" containerName="collect-profiles" Sep 30 08:16:04 crc kubenswrapper[4760]: I0930 08:16:04.345812 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="d6e89699-0bf0-4749-b268-967ef8499eae" containerName="collect-profiles" Sep 30 08:16:04 crc kubenswrapper[4760]: I0930 08:16:04.346114 4760 memory_manager.go:354] "RemoveStaleState removing state" podUID="33cc4d6c-b086-410c-b38e-f6c918657a74" containerName="nova-edpm-deployment-openstack-edpm-ipam" Sep 30 08:16:04 crc kubenswrapper[4760]: I0930 08:16:04.346139 4760 memory_manager.go:354] "RemoveStaleState removing state" podUID="d6e89699-0bf0-4749-b268-967ef8499eae" containerName="collect-profiles" Sep 30 08:16:04 crc kubenswrapper[4760]: I0930 08:16:04.347046 4760 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-qpf2g" Sep 30 08:16:04 crc kubenswrapper[4760]: I0930 08:16:04.351256 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-compute-config-data" Sep 30 08:16:04 crc kubenswrapper[4760]: I0930 08:16:04.351284 4760 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Sep 30 08:16:04 crc kubenswrapper[4760]: I0930 08:16:04.351372 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Sep 30 08:16:04 crc kubenswrapper[4760]: I0930 08:16:04.351921 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-8gxrf" Sep 30 08:16:04 crc kubenswrapper[4760]: I0930 08:16:04.352489 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Sep 30 08:16:04 crc kubenswrapper[4760]: I0930 08:16:04.356913 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/telemetry-edpm-deployment-openstack-edpm-ipam-qpf2g"] Sep 30 08:16:04 crc kubenswrapper[4760]: I0930 08:16:04.493618 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/9d944c3c-b8ab-4a31-a8c6-aa086a0d02fd-ceilometer-compute-config-data-2\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-qpf2g\" (UID: \"9d944c3c-b8ab-4a31-a8c6-aa086a0d02fd\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-qpf2g" Sep 30 08:16:04 crc kubenswrapper[4760]: I0930 08:16:04.493912 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v59xn\" (UniqueName: \"kubernetes.io/projected/9d944c3c-b8ab-4a31-a8c6-aa086a0d02fd-kube-api-access-v59xn\") pod 
\"telemetry-edpm-deployment-openstack-edpm-ipam-qpf2g\" (UID: \"9d944c3c-b8ab-4a31-a8c6-aa086a0d02fd\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-qpf2g" Sep 30 08:16:04 crc kubenswrapper[4760]: I0930 08:16:04.493946 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/9d944c3c-b8ab-4a31-a8c6-aa086a0d02fd-ceilometer-compute-config-data-0\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-qpf2g\" (UID: \"9d944c3c-b8ab-4a31-a8c6-aa086a0d02fd\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-qpf2g" Sep 30 08:16:04 crc kubenswrapper[4760]: I0930 08:16:04.493986 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/9d944c3c-b8ab-4a31-a8c6-aa086a0d02fd-ceilometer-compute-config-data-1\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-qpf2g\" (UID: \"9d944c3c-b8ab-4a31-a8c6-aa086a0d02fd\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-qpf2g" Sep 30 08:16:04 crc kubenswrapper[4760]: I0930 08:16:04.494033 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9d944c3c-b8ab-4a31-a8c6-aa086a0d02fd-inventory\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-qpf2g\" (UID: \"9d944c3c-b8ab-4a31-a8c6-aa086a0d02fd\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-qpf2g" Sep 30 08:16:04 crc kubenswrapper[4760]: I0930 08:16:04.494063 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/9d944c3c-b8ab-4a31-a8c6-aa086a0d02fd-ssh-key\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-qpf2g\" (UID: \"9d944c3c-b8ab-4a31-a8c6-aa086a0d02fd\") " 
pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-qpf2g" Sep 30 08:16:04 crc kubenswrapper[4760]: I0930 08:16:04.494099 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9d944c3c-b8ab-4a31-a8c6-aa086a0d02fd-telemetry-combined-ca-bundle\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-qpf2g\" (UID: \"9d944c3c-b8ab-4a31-a8c6-aa086a0d02fd\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-qpf2g" Sep 30 08:16:04 crc kubenswrapper[4760]: I0930 08:16:04.595795 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9d944c3c-b8ab-4a31-a8c6-aa086a0d02fd-telemetry-combined-ca-bundle\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-qpf2g\" (UID: \"9d944c3c-b8ab-4a31-a8c6-aa086a0d02fd\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-qpf2g" Sep 30 08:16:04 crc kubenswrapper[4760]: I0930 08:16:04.595925 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/9d944c3c-b8ab-4a31-a8c6-aa086a0d02fd-ceilometer-compute-config-data-2\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-qpf2g\" (UID: \"9d944c3c-b8ab-4a31-a8c6-aa086a0d02fd\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-qpf2g" Sep 30 08:16:04 crc kubenswrapper[4760]: I0930 08:16:04.596009 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v59xn\" (UniqueName: \"kubernetes.io/projected/9d944c3c-b8ab-4a31-a8c6-aa086a0d02fd-kube-api-access-v59xn\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-qpf2g\" (UID: \"9d944c3c-b8ab-4a31-a8c6-aa086a0d02fd\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-qpf2g" Sep 30 08:16:04 crc kubenswrapper[4760]: I0930 08:16:04.596069 4760 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/9d944c3c-b8ab-4a31-a8c6-aa086a0d02fd-ceilometer-compute-config-data-0\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-qpf2g\" (UID: \"9d944c3c-b8ab-4a31-a8c6-aa086a0d02fd\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-qpf2g" Sep 30 08:16:04 crc kubenswrapper[4760]: I0930 08:16:04.596147 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/9d944c3c-b8ab-4a31-a8c6-aa086a0d02fd-ceilometer-compute-config-data-1\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-qpf2g\" (UID: \"9d944c3c-b8ab-4a31-a8c6-aa086a0d02fd\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-qpf2g" Sep 30 08:16:04 crc kubenswrapper[4760]: I0930 08:16:04.596238 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9d944c3c-b8ab-4a31-a8c6-aa086a0d02fd-inventory\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-qpf2g\" (UID: \"9d944c3c-b8ab-4a31-a8c6-aa086a0d02fd\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-qpf2g" Sep 30 08:16:04 crc kubenswrapper[4760]: I0930 08:16:04.596323 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/9d944c3c-b8ab-4a31-a8c6-aa086a0d02fd-ssh-key\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-qpf2g\" (UID: \"9d944c3c-b8ab-4a31-a8c6-aa086a0d02fd\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-qpf2g" Sep 30 08:16:04 crc kubenswrapper[4760]: I0930 08:16:04.602775 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/9d944c3c-b8ab-4a31-a8c6-aa086a0d02fd-ceilometer-compute-config-data-0\") pod 
\"telemetry-edpm-deployment-openstack-edpm-ipam-qpf2g\" (UID: \"9d944c3c-b8ab-4a31-a8c6-aa086a0d02fd\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-qpf2g" Sep 30 08:16:04 crc kubenswrapper[4760]: I0930 08:16:04.603208 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/9d944c3c-b8ab-4a31-a8c6-aa086a0d02fd-ceilometer-compute-config-data-1\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-qpf2g\" (UID: \"9d944c3c-b8ab-4a31-a8c6-aa086a0d02fd\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-qpf2g" Sep 30 08:16:04 crc kubenswrapper[4760]: I0930 08:16:04.604469 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/9d944c3c-b8ab-4a31-a8c6-aa086a0d02fd-ssh-key\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-qpf2g\" (UID: \"9d944c3c-b8ab-4a31-a8c6-aa086a0d02fd\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-qpf2g" Sep 30 08:16:04 crc kubenswrapper[4760]: I0930 08:16:04.605127 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9d944c3c-b8ab-4a31-a8c6-aa086a0d02fd-inventory\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-qpf2g\" (UID: \"9d944c3c-b8ab-4a31-a8c6-aa086a0d02fd\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-qpf2g" Sep 30 08:16:04 crc kubenswrapper[4760]: I0930 08:16:04.605428 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9d944c3c-b8ab-4a31-a8c6-aa086a0d02fd-telemetry-combined-ca-bundle\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-qpf2g\" (UID: \"9d944c3c-b8ab-4a31-a8c6-aa086a0d02fd\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-qpf2g" Sep 30 08:16:04 crc kubenswrapper[4760]: I0930 08:16:04.606747 4760 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/9d944c3c-b8ab-4a31-a8c6-aa086a0d02fd-ceilometer-compute-config-data-2\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-qpf2g\" (UID: \"9d944c3c-b8ab-4a31-a8c6-aa086a0d02fd\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-qpf2g" Sep 30 08:16:04 crc kubenswrapper[4760]: I0930 08:16:04.613729 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v59xn\" (UniqueName: \"kubernetes.io/projected/9d944c3c-b8ab-4a31-a8c6-aa086a0d02fd-kube-api-access-v59xn\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-qpf2g\" (UID: \"9d944c3c-b8ab-4a31-a8c6-aa086a0d02fd\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-qpf2g" Sep 30 08:16:04 crc kubenswrapper[4760]: I0930 08:16:04.669517 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-qpf2g" Sep 30 08:16:05 crc kubenswrapper[4760]: I0930 08:16:05.262365 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/telemetry-edpm-deployment-openstack-edpm-ipam-qpf2g"] Sep 30 08:16:05 crc kubenswrapper[4760]: W0930 08:16:05.264914 4760 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9d944c3c_b8ab_4a31_a8c6_aa086a0d02fd.slice/crio-2e4cca52797776889bad68862ee29a39175862d684da36fa73403108d99495f7 WatchSource:0}: Error finding container 2e4cca52797776889bad68862ee29a39175862d684da36fa73403108d99495f7: Status 404 returned error can't find the container with id 2e4cca52797776889bad68862ee29a39175862d684da36fa73403108d99495f7 Sep 30 08:16:05 crc kubenswrapper[4760]: I0930 08:16:05.268606 4760 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Sep 30 08:16:06 crc kubenswrapper[4760]: I0930 08:16:06.284771 4760 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-qpf2g" event={"ID":"9d944c3c-b8ab-4a31-a8c6-aa086a0d02fd","Type":"ContainerStarted","Data":"2317e6b72106cfd1d07f316f4abe06fa09d0eca298a59abc8faaa07dc37a2370"} Sep 30 08:16:06 crc kubenswrapper[4760]: I0930 08:16:06.288495 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-qpf2g" event={"ID":"9d944c3c-b8ab-4a31-a8c6-aa086a0d02fd","Type":"ContainerStarted","Data":"2e4cca52797776889bad68862ee29a39175862d684da36fa73403108d99495f7"} Sep 30 08:16:06 crc kubenswrapper[4760]: I0930 08:16:06.312715 4760 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-qpf2g" podStartSLOduration=1.917769195 podStartE2EDuration="2.31268735s" podCreationTimestamp="2025-09-30 08:16:04 +0000 UTC" firstStartedPulling="2025-09-30 08:16:05.268202079 +0000 UTC m=+2550.911108511" lastFinishedPulling="2025-09-30 08:16:05.663120254 +0000 UTC m=+2551.306026666" observedRunningTime="2025-09-30 08:16:06.297882741 +0000 UTC m=+2551.940789173" watchObservedRunningTime="2025-09-30 08:16:06.31268735 +0000 UTC m=+2551.955593782" Sep 30 08:17:19 crc kubenswrapper[4760]: I0930 08:17:19.112760 4760 patch_prober.go:28] interesting pod/machine-config-daemon-f2lrk container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 08:17:19 crc kubenswrapper[4760]: I0930 08:17:19.113597 4760 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-f2lrk" podUID="7a9c8270-6964-4886-87d0-227b05b76da4" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection 
refused" Sep 30 08:17:49 crc kubenswrapper[4760]: I0930 08:17:49.112830 4760 patch_prober.go:28] interesting pod/machine-config-daemon-f2lrk container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 08:17:49 crc kubenswrapper[4760]: I0930 08:17:49.113440 4760 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-f2lrk" podUID="7a9c8270-6964-4886-87d0-227b05b76da4" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 30 08:18:19 crc kubenswrapper[4760]: I0930 08:18:19.113152 4760 patch_prober.go:28] interesting pod/machine-config-daemon-f2lrk container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 08:18:19 crc kubenswrapper[4760]: I0930 08:18:19.113732 4760 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-f2lrk" podUID="7a9c8270-6964-4886-87d0-227b05b76da4" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 30 08:18:19 crc kubenswrapper[4760]: I0930 08:18:19.113772 4760 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-f2lrk" Sep 30 08:18:19 crc kubenswrapper[4760]: I0930 08:18:19.114488 4760 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"6995e454dc7acd7e872934378c7d2f24cfd58f25173eb39c76a5116f44789266"} 
pod="openshift-machine-config-operator/machine-config-daemon-f2lrk" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Sep 30 08:18:19 crc kubenswrapper[4760]: I0930 08:18:19.114562 4760 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-f2lrk" podUID="7a9c8270-6964-4886-87d0-227b05b76da4" containerName="machine-config-daemon" containerID="cri-o://6995e454dc7acd7e872934378c7d2f24cfd58f25173eb39c76a5116f44789266" gracePeriod=600 Sep 30 08:18:19 crc kubenswrapper[4760]: I0930 08:18:19.782048 4760 generic.go:334] "Generic (PLEG): container finished" podID="7a9c8270-6964-4886-87d0-227b05b76da4" containerID="6995e454dc7acd7e872934378c7d2f24cfd58f25173eb39c76a5116f44789266" exitCode=0 Sep 30 08:18:19 crc kubenswrapper[4760]: I0930 08:18:19.782416 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-f2lrk" event={"ID":"7a9c8270-6964-4886-87d0-227b05b76da4","Type":"ContainerDied","Data":"6995e454dc7acd7e872934378c7d2f24cfd58f25173eb39c76a5116f44789266"} Sep 30 08:18:19 crc kubenswrapper[4760]: I0930 08:18:19.782457 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-f2lrk" event={"ID":"7a9c8270-6964-4886-87d0-227b05b76da4","Type":"ContainerStarted","Data":"ba9d6e9a08a463506562fa972ab309bc744574a4b27aa40630a148b30074629b"} Sep 30 08:18:19 crc kubenswrapper[4760]: I0930 08:18:19.782485 4760 scope.go:117] "RemoveContainer" containerID="78abf2ac3aa0bb6cb3e98a1200a88befbc3a37ea54b03c8ea3af358b0119d01b" Sep 30 08:18:45 crc kubenswrapper[4760]: I0930 08:18:45.086165 4760 generic.go:334] "Generic (PLEG): container finished" podID="9d944c3c-b8ab-4a31-a8c6-aa086a0d02fd" containerID="2317e6b72106cfd1d07f316f4abe06fa09d0eca298a59abc8faaa07dc37a2370" exitCode=0 Sep 30 08:18:45 crc kubenswrapper[4760]: I0930 08:18:45.098778 4760 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-qpf2g" event={"ID":"9d944c3c-b8ab-4a31-a8c6-aa086a0d02fd","Type":"ContainerDied","Data":"2317e6b72106cfd1d07f316f4abe06fa09d0eca298a59abc8faaa07dc37a2370"} Sep 30 08:18:46 crc kubenswrapper[4760]: I0930 08:18:46.577729 4760 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-qpf2g" Sep 30 08:18:46 crc kubenswrapper[4760]: I0930 08:18:46.647117 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/9d944c3c-b8ab-4a31-a8c6-aa086a0d02fd-ceilometer-compute-config-data-1\") pod \"9d944c3c-b8ab-4a31-a8c6-aa086a0d02fd\" (UID: \"9d944c3c-b8ab-4a31-a8c6-aa086a0d02fd\") " Sep 30 08:18:46 crc kubenswrapper[4760]: I0930 08:18:46.647175 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9d944c3c-b8ab-4a31-a8c6-aa086a0d02fd-telemetry-combined-ca-bundle\") pod \"9d944c3c-b8ab-4a31-a8c6-aa086a0d02fd\" (UID: \"9d944c3c-b8ab-4a31-a8c6-aa086a0d02fd\") " Sep 30 08:18:46 crc kubenswrapper[4760]: I0930 08:18:46.647281 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/9d944c3c-b8ab-4a31-a8c6-aa086a0d02fd-ceilometer-compute-config-data-2\") pod \"9d944c3c-b8ab-4a31-a8c6-aa086a0d02fd\" (UID: \"9d944c3c-b8ab-4a31-a8c6-aa086a0d02fd\") " Sep 30 08:18:46 crc kubenswrapper[4760]: I0930 08:18:46.647344 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9d944c3c-b8ab-4a31-a8c6-aa086a0d02fd-inventory\") pod \"9d944c3c-b8ab-4a31-a8c6-aa086a0d02fd\" (UID: \"9d944c3c-b8ab-4a31-a8c6-aa086a0d02fd\") " Sep 30 08:18:46 crc 
kubenswrapper[4760]: I0930 08:18:46.647382 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/9d944c3c-b8ab-4a31-a8c6-aa086a0d02fd-ceilometer-compute-config-data-0\") pod \"9d944c3c-b8ab-4a31-a8c6-aa086a0d02fd\" (UID: \"9d944c3c-b8ab-4a31-a8c6-aa086a0d02fd\") " Sep 30 08:18:46 crc kubenswrapper[4760]: I0930 08:18:46.647417 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v59xn\" (UniqueName: \"kubernetes.io/projected/9d944c3c-b8ab-4a31-a8c6-aa086a0d02fd-kube-api-access-v59xn\") pod \"9d944c3c-b8ab-4a31-a8c6-aa086a0d02fd\" (UID: \"9d944c3c-b8ab-4a31-a8c6-aa086a0d02fd\") " Sep 30 08:18:46 crc kubenswrapper[4760]: I0930 08:18:46.647433 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/9d944c3c-b8ab-4a31-a8c6-aa086a0d02fd-ssh-key\") pod \"9d944c3c-b8ab-4a31-a8c6-aa086a0d02fd\" (UID: \"9d944c3c-b8ab-4a31-a8c6-aa086a0d02fd\") " Sep 30 08:18:46 crc kubenswrapper[4760]: I0930 08:18:46.654621 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d944c3c-b8ab-4a31-a8c6-aa086a0d02fd-telemetry-combined-ca-bundle" (OuterVolumeSpecName: "telemetry-combined-ca-bundle") pod "9d944c3c-b8ab-4a31-a8c6-aa086a0d02fd" (UID: "9d944c3c-b8ab-4a31-a8c6-aa086a0d02fd"). InnerVolumeSpecName "telemetry-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 08:18:46 crc kubenswrapper[4760]: I0930 08:18:46.654760 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d944c3c-b8ab-4a31-a8c6-aa086a0d02fd-kube-api-access-v59xn" (OuterVolumeSpecName: "kube-api-access-v59xn") pod "9d944c3c-b8ab-4a31-a8c6-aa086a0d02fd" (UID: "9d944c3c-b8ab-4a31-a8c6-aa086a0d02fd"). InnerVolumeSpecName "kube-api-access-v59xn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 08:18:46 crc kubenswrapper[4760]: I0930 08:18:46.675200 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d944c3c-b8ab-4a31-a8c6-aa086a0d02fd-inventory" (OuterVolumeSpecName: "inventory") pod "9d944c3c-b8ab-4a31-a8c6-aa086a0d02fd" (UID: "9d944c3c-b8ab-4a31-a8c6-aa086a0d02fd"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 08:18:46 crc kubenswrapper[4760]: I0930 08:18:46.682151 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d944c3c-b8ab-4a31-a8c6-aa086a0d02fd-ceilometer-compute-config-data-0" (OuterVolumeSpecName: "ceilometer-compute-config-data-0") pod "9d944c3c-b8ab-4a31-a8c6-aa086a0d02fd" (UID: "9d944c3c-b8ab-4a31-a8c6-aa086a0d02fd"). InnerVolumeSpecName "ceilometer-compute-config-data-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 08:18:46 crc kubenswrapper[4760]: I0930 08:18:46.684495 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d944c3c-b8ab-4a31-a8c6-aa086a0d02fd-ceilometer-compute-config-data-2" (OuterVolumeSpecName: "ceilometer-compute-config-data-2") pod "9d944c3c-b8ab-4a31-a8c6-aa086a0d02fd" (UID: "9d944c3c-b8ab-4a31-a8c6-aa086a0d02fd"). InnerVolumeSpecName "ceilometer-compute-config-data-2". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 08:18:46 crc kubenswrapper[4760]: I0930 08:18:46.705562 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d944c3c-b8ab-4a31-a8c6-aa086a0d02fd-ceilometer-compute-config-data-1" (OuterVolumeSpecName: "ceilometer-compute-config-data-1") pod "9d944c3c-b8ab-4a31-a8c6-aa086a0d02fd" (UID: "9d944c3c-b8ab-4a31-a8c6-aa086a0d02fd"). InnerVolumeSpecName "ceilometer-compute-config-data-1". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 08:18:46 crc kubenswrapper[4760]: I0930 08:18:46.715346 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d944c3c-b8ab-4a31-a8c6-aa086a0d02fd-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "9d944c3c-b8ab-4a31-a8c6-aa086a0d02fd" (UID: "9d944c3c-b8ab-4a31-a8c6-aa086a0d02fd"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 08:18:46 crc kubenswrapper[4760]: I0930 08:18:46.750600 4760 reconciler_common.go:293] "Volume detached for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9d944c3c-b8ab-4a31-a8c6-aa086a0d02fd-telemetry-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 08:18:46 crc kubenswrapper[4760]: I0930 08:18:46.750637 4760 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/9d944c3c-b8ab-4a31-a8c6-aa086a0d02fd-ceilometer-compute-config-data-2\") on node \"crc\" DevicePath \"\"" Sep 30 08:18:46 crc kubenswrapper[4760]: I0930 08:18:46.750647 4760 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9d944c3c-b8ab-4a31-a8c6-aa086a0d02fd-inventory\") on node \"crc\" DevicePath \"\"" Sep 30 08:18:46 crc kubenswrapper[4760]: I0930 08:18:46.750658 4760 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/9d944c3c-b8ab-4a31-a8c6-aa086a0d02fd-ceilometer-compute-config-data-0\") on node \"crc\" DevicePath \"\"" Sep 30 08:18:46 crc kubenswrapper[4760]: I0930 08:18:46.750668 4760 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v59xn\" (UniqueName: \"kubernetes.io/projected/9d944c3c-b8ab-4a31-a8c6-aa086a0d02fd-kube-api-access-v59xn\") on node \"crc\" DevicePath \"\"" Sep 30 08:18:46 crc kubenswrapper[4760]: I0930 08:18:46.750677 4760 reconciler_common.go:293] 
"Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/9d944c3c-b8ab-4a31-a8c6-aa086a0d02fd-ssh-key\") on node \"crc\" DevicePath \"\"" Sep 30 08:18:46 crc kubenswrapper[4760]: I0930 08:18:46.750685 4760 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/9d944c3c-b8ab-4a31-a8c6-aa086a0d02fd-ceilometer-compute-config-data-1\") on node \"crc\" DevicePath \"\"" Sep 30 08:18:47 crc kubenswrapper[4760]: I0930 08:18:47.109601 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-qpf2g" event={"ID":"9d944c3c-b8ab-4a31-a8c6-aa086a0d02fd","Type":"ContainerDied","Data":"2e4cca52797776889bad68862ee29a39175862d684da36fa73403108d99495f7"} Sep 30 08:18:47 crc kubenswrapper[4760]: I0930 08:18:47.110010 4760 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2e4cca52797776889bad68862ee29a39175862d684da36fa73403108d99495f7" Sep 30 08:18:47 crc kubenswrapper[4760]: I0930 08:18:47.110090 4760 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-qpf2g" Sep 30 08:19:21 crc kubenswrapper[4760]: I0930 08:19:21.657019 4760 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/prometheus-metric-storage-0"] Sep 30 08:19:21 crc kubenswrapper[4760]: I0930 08:19:21.661751 4760 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/prometheus-metric-storage-0" podUID="9784a9e1-42ad-4f0b-ae43-a3227158a763" containerName="prometheus" containerID="cri-o://63ec4e758f6a709110fe3d253fc67a8a6f598cba4026ccd8b9f1b87abaa502fe" gracePeriod=600 Sep 30 08:19:21 crc kubenswrapper[4760]: I0930 08:19:21.661854 4760 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/prometheus-metric-storage-0" podUID="9784a9e1-42ad-4f0b-ae43-a3227158a763" containerName="config-reloader" containerID="cri-o://ccd8af5e4644fe30831ecfe237486ca035bb69adba8cba7b97ba5bd89895aa0a" gracePeriod=600 Sep 30 08:19:21 crc kubenswrapper[4760]: I0930 08:19:21.661891 4760 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/prometheus-metric-storage-0" podUID="9784a9e1-42ad-4f0b-ae43-a3227158a763" containerName="thanos-sidecar" containerID="cri-o://a2b8b7aedb37924ff3f231ff7625453b8ca43df8219f8c517b32529fc171db6a" gracePeriod=600 Sep 30 08:19:21 crc kubenswrapper[4760]: I0930 08:19:21.853036 4760 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/prometheus-metric-storage-0" podUID="9784a9e1-42ad-4f0b-ae43-a3227158a763" containerName="prometheus" probeResult="failure" output="Get \"https://10.217.0.132:9090/-/ready\": dial tcp 10.217.0.132:9090: connect: connection refused" Sep 30 08:19:22 crc kubenswrapper[4760]: I0930 08:19:22.558495 4760 generic.go:334] "Generic (PLEG): container finished" podID="9784a9e1-42ad-4f0b-ae43-a3227158a763" containerID="a2b8b7aedb37924ff3f231ff7625453b8ca43df8219f8c517b32529fc171db6a" exitCode=0 Sep 30 08:19:22 crc 
kubenswrapper[4760]: I0930 08:19:22.560583 4760 generic.go:334] "Generic (PLEG): container finished" podID="9784a9e1-42ad-4f0b-ae43-a3227158a763" containerID="ccd8af5e4644fe30831ecfe237486ca035bb69adba8cba7b97ba5bd89895aa0a" exitCode=0 Sep 30 08:19:22 crc kubenswrapper[4760]: I0930 08:19:22.560659 4760 generic.go:334] "Generic (PLEG): container finished" podID="9784a9e1-42ad-4f0b-ae43-a3227158a763" containerID="63ec4e758f6a709110fe3d253fc67a8a6f598cba4026ccd8b9f1b87abaa502fe" exitCode=0 Sep 30 08:19:22 crc kubenswrapper[4760]: I0930 08:19:22.558596 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"9784a9e1-42ad-4f0b-ae43-a3227158a763","Type":"ContainerDied","Data":"a2b8b7aedb37924ff3f231ff7625453b8ca43df8219f8c517b32529fc171db6a"} Sep 30 08:19:22 crc kubenswrapper[4760]: I0930 08:19:22.560774 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"9784a9e1-42ad-4f0b-ae43-a3227158a763","Type":"ContainerDied","Data":"ccd8af5e4644fe30831ecfe237486ca035bb69adba8cba7b97ba5bd89895aa0a"} Sep 30 08:19:22 crc kubenswrapper[4760]: I0930 08:19:22.560830 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"9784a9e1-42ad-4f0b-ae43-a3227158a763","Type":"ContainerDied","Data":"63ec4e758f6a709110fe3d253fc67a8a6f598cba4026ccd8b9f1b87abaa502fe"} Sep 30 08:19:22 crc kubenswrapper[4760]: I0930 08:19:22.805491 4760 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/prometheus-metric-storage-0" Sep 30 08:19:22 crc kubenswrapper[4760]: I0930 08:19:22.915609 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/9784a9e1-42ad-4f0b-ae43-a3227158a763-config\") pod \"9784a9e1-42ad-4f0b-ae43-a3227158a763\" (UID: \"9784a9e1-42ad-4f0b-ae43-a3227158a763\") " Sep 30 08:19:22 crc kubenswrapper[4760]: I0930 08:19:22.915756 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/9784a9e1-42ad-4f0b-ae43-a3227158a763-config-out\") pod \"9784a9e1-42ad-4f0b-ae43-a3227158a763\" (UID: \"9784a9e1-42ad-4f0b-ae43-a3227158a763\") " Sep 30 08:19:22 crc kubenswrapper[4760]: I0930 08:19:22.915860 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9784a9e1-42ad-4f0b-ae43-a3227158a763-secret-combined-ca-bundle\") pod \"9784a9e1-42ad-4f0b-ae43-a3227158a763\" (UID: \"9784a9e1-42ad-4f0b-ae43-a3227158a763\") " Sep 30 08:19:22 crc kubenswrapper[4760]: I0930 08:19:22.916447 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"prometheus-metric-storage-db\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-09238572-8d9b-4684-8ed8-661e43a35d9a\") pod \"9784a9e1-42ad-4f0b-ae43-a3227158a763\" (UID: \"9784a9e1-42ad-4f0b-ae43-a3227158a763\") " Sep 30 08:19:22 crc kubenswrapper[4760]: I0930 08:19:22.916557 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/9784a9e1-42ad-4f0b-ae43-a3227158a763-tls-assets\") pod \"9784a9e1-42ad-4f0b-ae43-a3227158a763\" (UID: \"9784a9e1-42ad-4f0b-ae43-a3227158a763\") " Sep 30 08:19:22 crc kubenswrapper[4760]: I0930 08:19:22.916627 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/9784a9e1-42ad-4f0b-ae43-a3227158a763-prometheus-metric-storage-rulefiles-0\") pod \"9784a9e1-42ad-4f0b-ae43-a3227158a763\" (UID: \"9784a9e1-42ad-4f0b-ae43-a3227158a763\") " Sep 30 08:19:22 crc kubenswrapper[4760]: I0930 08:19:22.916704 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/9784a9e1-42ad-4f0b-ae43-a3227158a763-thanos-prometheus-http-client-file\") pod \"9784a9e1-42ad-4f0b-ae43-a3227158a763\" (UID: \"9784a9e1-42ad-4f0b-ae43-a3227158a763\") " Sep 30 08:19:22 crc kubenswrapper[4760]: I0930 08:19:22.916736 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x9pv4\" (UniqueName: \"kubernetes.io/projected/9784a9e1-42ad-4f0b-ae43-a3227158a763-kube-api-access-x9pv4\") pod \"9784a9e1-42ad-4f0b-ae43-a3227158a763\" (UID: \"9784a9e1-42ad-4f0b-ae43-a3227158a763\") " Sep 30 08:19:22 crc kubenswrapper[4760]: I0930 08:19:22.916765 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/9784a9e1-42ad-4f0b-ae43-a3227158a763-web-config\") pod \"9784a9e1-42ad-4f0b-ae43-a3227158a763\" (UID: \"9784a9e1-42ad-4f0b-ae43-a3227158a763\") " Sep 30 08:19:22 crc kubenswrapper[4760]: I0930 08:19:22.916838 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" (UniqueName: \"kubernetes.io/secret/9784a9e1-42ad-4f0b-ae43-a3227158a763-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") pod \"9784a9e1-42ad-4f0b-ae43-a3227158a763\" (UID: \"9784a9e1-42ad-4f0b-ae43-a3227158a763\") " Sep 30 08:19:22 crc kubenswrapper[4760]: I0930 08:19:22.916867 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: \"kubernetes.io/secret/9784a9e1-42ad-4f0b-ae43-a3227158a763-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") pod \"9784a9e1-42ad-4f0b-ae43-a3227158a763\" (UID: \"9784a9e1-42ad-4f0b-ae43-a3227158a763\") " Sep 30 08:19:22 crc kubenswrapper[4760]: I0930 08:19:22.920689 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9784a9e1-42ad-4f0b-ae43-a3227158a763-prometheus-metric-storage-rulefiles-0" (OuterVolumeSpecName: "prometheus-metric-storage-rulefiles-0") pod "9784a9e1-42ad-4f0b-ae43-a3227158a763" (UID: "9784a9e1-42ad-4f0b-ae43-a3227158a763"). InnerVolumeSpecName "prometheus-metric-storage-rulefiles-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 08:19:22 crc kubenswrapper[4760]: I0930 08:19:22.922778 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9784a9e1-42ad-4f0b-ae43-a3227158a763-config-out" (OuterVolumeSpecName: "config-out") pod "9784a9e1-42ad-4f0b-ae43-a3227158a763" (UID: "9784a9e1-42ad-4f0b-ae43-a3227158a763"). InnerVolumeSpecName "config-out". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 08:19:22 crc kubenswrapper[4760]: I0930 08:19:22.924681 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9784a9e1-42ad-4f0b-ae43-a3227158a763-secret-combined-ca-bundle" (OuterVolumeSpecName: "secret-combined-ca-bundle") pod "9784a9e1-42ad-4f0b-ae43-a3227158a763" (UID: "9784a9e1-42ad-4f0b-ae43-a3227158a763"). InnerVolumeSpecName "secret-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 08:19:22 crc kubenswrapper[4760]: I0930 08:19:22.926620 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9784a9e1-42ad-4f0b-ae43-a3227158a763-config" (OuterVolumeSpecName: "config") pod "9784a9e1-42ad-4f0b-ae43-a3227158a763" (UID: "9784a9e1-42ad-4f0b-ae43-a3227158a763"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 08:19:22 crc kubenswrapper[4760]: I0930 08:19:22.926792 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9784a9e1-42ad-4f0b-ae43-a3227158a763-kube-api-access-x9pv4" (OuterVolumeSpecName: "kube-api-access-x9pv4") pod "9784a9e1-42ad-4f0b-ae43-a3227158a763" (UID: "9784a9e1-42ad-4f0b-ae43-a3227158a763"). InnerVolumeSpecName "kube-api-access-x9pv4". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 08:19:22 crc kubenswrapper[4760]: I0930 08:19:22.927231 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9784a9e1-42ad-4f0b-ae43-a3227158a763-thanos-prometheus-http-client-file" (OuterVolumeSpecName: "thanos-prometheus-http-client-file") pod "9784a9e1-42ad-4f0b-ae43-a3227158a763" (UID: "9784a9e1-42ad-4f0b-ae43-a3227158a763"). InnerVolumeSpecName "thanos-prometheus-http-client-file". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 08:19:22 crc kubenswrapper[4760]: I0930 08:19:22.928733 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9784a9e1-42ad-4f0b-ae43-a3227158a763-tls-assets" (OuterVolumeSpecName: "tls-assets") pod "9784a9e1-42ad-4f0b-ae43-a3227158a763" (UID: "9784a9e1-42ad-4f0b-ae43-a3227158a763"). InnerVolumeSpecName "tls-assets". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 08:19:22 crc kubenswrapper[4760]: I0930 08:19:22.936578 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9784a9e1-42ad-4f0b-ae43-a3227158a763-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d" (OuterVolumeSpecName: "web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d") pod "9784a9e1-42ad-4f0b-ae43-a3227158a763" (UID: "9784a9e1-42ad-4f0b-ae43-a3227158a763"). InnerVolumeSpecName "web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 08:19:22 crc kubenswrapper[4760]: I0930 08:19:22.939599 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9784a9e1-42ad-4f0b-ae43-a3227158a763-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d" (OuterVolumeSpecName: "web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d") pod "9784a9e1-42ad-4f0b-ae43-a3227158a763" (UID: "9784a9e1-42ad-4f0b-ae43-a3227158a763"). InnerVolumeSpecName "web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 08:19:22 crc kubenswrapper[4760]: I0930 08:19:22.979612 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-09238572-8d9b-4684-8ed8-661e43a35d9a" (OuterVolumeSpecName: "prometheus-metric-storage-db") pod "9784a9e1-42ad-4f0b-ae43-a3227158a763" (UID: "9784a9e1-42ad-4f0b-ae43-a3227158a763"). InnerVolumeSpecName "pvc-09238572-8d9b-4684-8ed8-661e43a35d9a". 
PluginName "kubernetes.io/csi", VolumeGidValue "" Sep 30 08:19:23 crc kubenswrapper[4760]: I0930 08:19:23.004641 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9784a9e1-42ad-4f0b-ae43-a3227158a763-web-config" (OuterVolumeSpecName: "web-config") pod "9784a9e1-42ad-4f0b-ae43-a3227158a763" (UID: "9784a9e1-42ad-4f0b-ae43-a3227158a763"). InnerVolumeSpecName "web-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 08:19:23 crc kubenswrapper[4760]: I0930 08:19:23.021507 4760 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x9pv4\" (UniqueName: \"kubernetes.io/projected/9784a9e1-42ad-4f0b-ae43-a3227158a763-kube-api-access-x9pv4\") on node \"crc\" DevicePath \"\"" Sep 30 08:19:23 crc kubenswrapper[4760]: I0930 08:19:23.021573 4760 reconciler_common.go:293] "Volume detached for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/9784a9e1-42ad-4f0b-ae43-a3227158a763-web-config\") on node \"crc\" DevicePath \"\"" Sep 30 08:19:23 crc kubenswrapper[4760]: I0930 08:19:23.021588 4760 reconciler_common.go:293] "Volume detached for volume \"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" (UniqueName: \"kubernetes.io/secret/9784a9e1-42ad-4f0b-ae43-a3227158a763-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") on node \"crc\" DevicePath \"\"" Sep 30 08:19:23 crc kubenswrapper[4760]: I0930 08:19:23.021601 4760 reconciler_common.go:293] "Volume detached for volume \"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: \"kubernetes.io/secret/9784a9e1-42ad-4f0b-ae43-a3227158a763-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") on node \"crc\" DevicePath \"\"" Sep 30 08:19:23 crc kubenswrapper[4760]: I0930 08:19:23.021612 4760 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/9784a9e1-42ad-4f0b-ae43-a3227158a763-config\") on node \"crc\" DevicePath \"\"" Sep 30 
08:19:23 crc kubenswrapper[4760]: I0930 08:19:23.021622 4760 reconciler_common.go:293] "Volume detached for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/9784a9e1-42ad-4f0b-ae43-a3227158a763-config-out\") on node \"crc\" DevicePath \"\"" Sep 30 08:19:23 crc kubenswrapper[4760]: I0930 08:19:23.021650 4760 reconciler_common.go:293] "Volume detached for volume \"secret-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9784a9e1-42ad-4f0b-ae43-a3227158a763-secret-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 08:19:23 crc kubenswrapper[4760]: I0930 08:19:23.021677 4760 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-09238572-8d9b-4684-8ed8-661e43a35d9a\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-09238572-8d9b-4684-8ed8-661e43a35d9a\") on node \"crc\" " Sep 30 08:19:23 crc kubenswrapper[4760]: I0930 08:19:23.021687 4760 reconciler_common.go:293] "Volume detached for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/9784a9e1-42ad-4f0b-ae43-a3227158a763-tls-assets\") on node \"crc\" DevicePath \"\"" Sep 30 08:19:23 crc kubenswrapper[4760]: I0930 08:19:23.021697 4760 reconciler_common.go:293] "Volume detached for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/9784a9e1-42ad-4f0b-ae43-a3227158a763-prometheus-metric-storage-rulefiles-0\") on node \"crc\" DevicePath \"\"" Sep 30 08:19:23 crc kubenswrapper[4760]: I0930 08:19:23.021708 4760 reconciler_common.go:293] "Volume detached for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/9784a9e1-42ad-4f0b-ae43-a3227158a763-thanos-prometheus-http-client-file\") on node \"crc\" DevicePath \"\"" Sep 30 08:19:23 crc kubenswrapper[4760]: I0930 08:19:23.054486 4760 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice... 
Sep 30 08:19:23 crc kubenswrapper[4760]: I0930 08:19:23.055116 4760 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-09238572-8d9b-4684-8ed8-661e43a35d9a" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-09238572-8d9b-4684-8ed8-661e43a35d9a") on node "crc" Sep 30 08:19:23 crc kubenswrapper[4760]: I0930 08:19:23.123352 4760 reconciler_common.go:293] "Volume detached for volume \"pvc-09238572-8d9b-4684-8ed8-661e43a35d9a\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-09238572-8d9b-4684-8ed8-661e43a35d9a\") on node \"crc\" DevicePath \"\"" Sep 30 08:19:23 crc kubenswrapper[4760]: I0930 08:19:23.574653 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"9784a9e1-42ad-4f0b-ae43-a3227158a763","Type":"ContainerDied","Data":"0c2f5b551d5f84405c9171a006c64b79e11468c1368b3e3e1b0e909242737498"} Sep 30 08:19:23 crc kubenswrapper[4760]: I0930 08:19:23.574722 4760 scope.go:117] "RemoveContainer" containerID="a2b8b7aedb37924ff3f231ff7625453b8ca43df8219f8c517b32529fc171db6a" Sep 30 08:19:23 crc kubenswrapper[4760]: I0930 08:19:23.574762 4760 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/prometheus-metric-storage-0" Sep 30 08:19:23 crc kubenswrapper[4760]: I0930 08:19:23.605933 4760 scope.go:117] "RemoveContainer" containerID="ccd8af5e4644fe30831ecfe237486ca035bb69adba8cba7b97ba5bd89895aa0a" Sep 30 08:19:23 crc kubenswrapper[4760]: I0930 08:19:23.611741 4760 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/prometheus-metric-storage-0"] Sep 30 08:19:23 crc kubenswrapper[4760]: I0930 08:19:23.623957 4760 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/prometheus-metric-storage-0"] Sep 30 08:19:23 crc kubenswrapper[4760]: I0930 08:19:23.638072 4760 scope.go:117] "RemoveContainer" containerID="63ec4e758f6a709110fe3d253fc67a8a6f598cba4026ccd8b9f1b87abaa502fe" Sep 30 08:19:23 crc kubenswrapper[4760]: I0930 08:19:23.642962 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/prometheus-metric-storage-0"] Sep 30 08:19:23 crc kubenswrapper[4760]: E0930 08:19:23.643527 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9784a9e1-42ad-4f0b-ae43-a3227158a763" containerName="config-reloader" Sep 30 08:19:23 crc kubenswrapper[4760]: I0930 08:19:23.643572 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="9784a9e1-42ad-4f0b-ae43-a3227158a763" containerName="config-reloader" Sep 30 08:19:23 crc kubenswrapper[4760]: E0930 08:19:23.643583 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9784a9e1-42ad-4f0b-ae43-a3227158a763" containerName="init-config-reloader" Sep 30 08:19:23 crc kubenswrapper[4760]: I0930 08:19:23.643589 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="9784a9e1-42ad-4f0b-ae43-a3227158a763" containerName="init-config-reloader" Sep 30 08:19:23 crc kubenswrapper[4760]: E0930 08:19:23.643604 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9784a9e1-42ad-4f0b-ae43-a3227158a763" containerName="thanos-sidecar" Sep 30 08:19:23 crc kubenswrapper[4760]: I0930 08:19:23.643609 4760 
state_mem.go:107] "Deleted CPUSet assignment" podUID="9784a9e1-42ad-4f0b-ae43-a3227158a763" containerName="thanos-sidecar" Sep 30 08:19:23 crc kubenswrapper[4760]: E0930 08:19:23.643624 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9d944c3c-b8ab-4a31-a8c6-aa086a0d02fd" containerName="telemetry-edpm-deployment-openstack-edpm-ipam" Sep 30 08:19:23 crc kubenswrapper[4760]: I0930 08:19:23.643631 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="9d944c3c-b8ab-4a31-a8c6-aa086a0d02fd" containerName="telemetry-edpm-deployment-openstack-edpm-ipam" Sep 30 08:19:23 crc kubenswrapper[4760]: E0930 08:19:23.643652 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9784a9e1-42ad-4f0b-ae43-a3227158a763" containerName="prometheus" Sep 30 08:19:23 crc kubenswrapper[4760]: I0930 08:19:23.643658 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="9784a9e1-42ad-4f0b-ae43-a3227158a763" containerName="prometheus" Sep 30 08:19:23 crc kubenswrapper[4760]: I0930 08:19:23.643896 4760 memory_manager.go:354] "RemoveStaleState removing state" podUID="9784a9e1-42ad-4f0b-ae43-a3227158a763" containerName="config-reloader" Sep 30 08:19:23 crc kubenswrapper[4760]: I0930 08:19:23.643944 4760 memory_manager.go:354] "RemoveStaleState removing state" podUID="9784a9e1-42ad-4f0b-ae43-a3227158a763" containerName="thanos-sidecar" Sep 30 08:19:23 crc kubenswrapper[4760]: I0930 08:19:23.643967 4760 memory_manager.go:354] "RemoveStaleState removing state" podUID="9784a9e1-42ad-4f0b-ae43-a3227158a763" containerName="prometheus" Sep 30 08:19:23 crc kubenswrapper[4760]: I0930 08:19:23.643977 4760 memory_manager.go:354] "RemoveStaleState removing state" podUID="9d944c3c-b8ab-4a31-a8c6-aa086a0d02fd" containerName="telemetry-edpm-deployment-openstack-edpm-ipam" Sep 30 08:19:23 crc kubenswrapper[4760]: I0930 08:19:23.648031 4760 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/prometheus-metric-storage-0" Sep 30 08:19:23 crc kubenswrapper[4760]: I0930 08:19:23.650436 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage" Sep 30 08:19:23 crc kubenswrapper[4760]: I0930 08:19:23.650818 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-thanos-prometheus-http-client-file" Sep 30 08:19:23 crc kubenswrapper[4760]: I0930 08:19:23.652755 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-web-config" Sep 30 08:19:23 crc kubenswrapper[4760]: I0930 08:19:23.653420 4760 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-0" Sep 30 08:19:23 crc kubenswrapper[4760]: I0930 08:19:23.654281 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"metric-storage-prometheus-dockercfg-pr2wc" Sep 30 08:19:23 crc kubenswrapper[4760]: I0930 08:19:23.661463 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/prometheus-metric-storage-0"] Sep 30 08:19:23 crc kubenswrapper[4760]: I0930 08:19:23.663662 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-tls-assets-0" Sep 30 08:19:23 crc kubenswrapper[4760]: I0930 08:19:23.670238 4760 scope.go:117] "RemoveContainer" containerID="ac43d8db1c046801aea5d4bff64433379ff5d6e2825e451966ef62e72f5121df" Sep 30 08:19:23 crc kubenswrapper[4760]: I0930 08:19:23.736735 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: \"kubernetes.io/secret/33ff1fe6-1a25-49ab-8b11-97ab06ee2e43-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"33ff1fe6-1a25-49ab-8b11-97ab06ee2e43\") " 
pod="openstack/prometheus-metric-storage-0" Sep 30 08:19:23 crc kubenswrapper[4760]: I0930 08:19:23.736772 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/33ff1fe6-1a25-49ab-8b11-97ab06ee2e43-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"33ff1fe6-1a25-49ab-8b11-97ab06ee2e43\") " pod="openstack/prometheus-metric-storage-0" Sep 30 08:19:23 crc kubenswrapper[4760]: I0930 08:19:23.736809 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/33ff1fe6-1a25-49ab-8b11-97ab06ee2e43-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"33ff1fe6-1a25-49ab-8b11-97ab06ee2e43\") " pod="openstack/prometheus-metric-storage-0" Sep 30 08:19:23 crc kubenswrapper[4760]: I0930 08:19:23.736849 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-09238572-8d9b-4684-8ed8-661e43a35d9a\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-09238572-8d9b-4684-8ed8-661e43a35d9a\") pod \"prometheus-metric-storage-0\" (UID: \"33ff1fe6-1a25-49ab-8b11-97ab06ee2e43\") " pod="openstack/prometheus-metric-storage-0" Sep 30 08:19:23 crc kubenswrapper[4760]: I0930 08:19:23.736878 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/33ff1fe6-1a25-49ab-8b11-97ab06ee2e43-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"33ff1fe6-1a25-49ab-8b11-97ab06ee2e43\") " pod="openstack/prometheus-metric-storage-0" Sep 30 08:19:23 crc kubenswrapper[4760]: I0930 08:19:23.737010 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-klg6k\" (UniqueName: 
\"kubernetes.io/projected/33ff1fe6-1a25-49ab-8b11-97ab06ee2e43-kube-api-access-klg6k\") pod \"prometheus-metric-storage-0\" (UID: \"33ff1fe6-1a25-49ab-8b11-97ab06ee2e43\") " pod="openstack/prometheus-metric-storage-0" Sep 30 08:19:23 crc kubenswrapper[4760]: I0930 08:19:23.737100 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/33ff1fe6-1a25-49ab-8b11-97ab06ee2e43-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"33ff1fe6-1a25-49ab-8b11-97ab06ee2e43\") " pod="openstack/prometheus-metric-storage-0" Sep 30 08:19:23 crc kubenswrapper[4760]: I0930 08:19:23.737136 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/33ff1fe6-1a25-49ab-8b11-97ab06ee2e43-secret-combined-ca-bundle\") pod \"prometheus-metric-storage-0\" (UID: \"33ff1fe6-1a25-49ab-8b11-97ab06ee2e43\") " pod="openstack/prometheus-metric-storage-0" Sep 30 08:19:23 crc kubenswrapper[4760]: I0930 08:19:23.737206 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/33ff1fe6-1a25-49ab-8b11-97ab06ee2e43-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"33ff1fe6-1a25-49ab-8b11-97ab06ee2e43\") " pod="openstack/prometheus-metric-storage-0" Sep 30 08:19:23 crc kubenswrapper[4760]: I0930 08:19:23.737354 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" (UniqueName: \"kubernetes.io/secret/33ff1fe6-1a25-49ab-8b11-97ab06ee2e43-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"33ff1fe6-1a25-49ab-8b11-97ab06ee2e43\") " pod="openstack/prometheus-metric-storage-0" Sep 30 
08:19:23 crc kubenswrapper[4760]: I0930 08:19:23.737441 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/33ff1fe6-1a25-49ab-8b11-97ab06ee2e43-config\") pod \"prometheus-metric-storage-0\" (UID: \"33ff1fe6-1a25-49ab-8b11-97ab06ee2e43\") " pod="openstack/prometheus-metric-storage-0" Sep 30 08:19:23 crc kubenswrapper[4760]: I0930 08:19:23.838876 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" (UniqueName: \"kubernetes.io/secret/33ff1fe6-1a25-49ab-8b11-97ab06ee2e43-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"33ff1fe6-1a25-49ab-8b11-97ab06ee2e43\") " pod="openstack/prometheus-metric-storage-0" Sep 30 08:19:23 crc kubenswrapper[4760]: I0930 08:19:23.838940 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/33ff1fe6-1a25-49ab-8b11-97ab06ee2e43-config\") pod \"prometheus-metric-storage-0\" (UID: \"33ff1fe6-1a25-49ab-8b11-97ab06ee2e43\") " pod="openstack/prometheus-metric-storage-0" Sep 30 08:19:23 crc kubenswrapper[4760]: I0930 08:19:23.838981 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: \"kubernetes.io/secret/33ff1fe6-1a25-49ab-8b11-97ab06ee2e43-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"33ff1fe6-1a25-49ab-8b11-97ab06ee2e43\") " pod="openstack/prometheus-metric-storage-0" Sep 30 08:19:23 crc kubenswrapper[4760]: I0930 08:19:23.839004 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: 
\"kubernetes.io/configmap/33ff1fe6-1a25-49ab-8b11-97ab06ee2e43-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"33ff1fe6-1a25-49ab-8b11-97ab06ee2e43\") " pod="openstack/prometheus-metric-storage-0" Sep 30 08:19:23 crc kubenswrapper[4760]: I0930 08:19:23.839040 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/33ff1fe6-1a25-49ab-8b11-97ab06ee2e43-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"33ff1fe6-1a25-49ab-8b11-97ab06ee2e43\") " pod="openstack/prometheus-metric-storage-0" Sep 30 08:19:23 crc kubenswrapper[4760]: I0930 08:19:23.839063 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-09238572-8d9b-4684-8ed8-661e43a35d9a\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-09238572-8d9b-4684-8ed8-661e43a35d9a\") pod \"prometheus-metric-storage-0\" (UID: \"33ff1fe6-1a25-49ab-8b11-97ab06ee2e43\") " pod="openstack/prometheus-metric-storage-0" Sep 30 08:19:23 crc kubenswrapper[4760]: I0930 08:19:23.839093 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/33ff1fe6-1a25-49ab-8b11-97ab06ee2e43-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"33ff1fe6-1a25-49ab-8b11-97ab06ee2e43\") " pod="openstack/prometheus-metric-storage-0" Sep 30 08:19:23 crc kubenswrapper[4760]: I0930 08:19:23.839130 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-klg6k\" (UniqueName: \"kubernetes.io/projected/33ff1fe6-1a25-49ab-8b11-97ab06ee2e43-kube-api-access-klg6k\") pod \"prometheus-metric-storage-0\" (UID: \"33ff1fe6-1a25-49ab-8b11-97ab06ee2e43\") " pod="openstack/prometheus-metric-storage-0" Sep 30 08:19:23 crc kubenswrapper[4760]: I0930 08:19:23.839154 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/33ff1fe6-1a25-49ab-8b11-97ab06ee2e43-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"33ff1fe6-1a25-49ab-8b11-97ab06ee2e43\") " pod="openstack/prometheus-metric-storage-0" Sep 30 08:19:23 crc kubenswrapper[4760]: I0930 08:19:23.839182 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/33ff1fe6-1a25-49ab-8b11-97ab06ee2e43-secret-combined-ca-bundle\") pod \"prometheus-metric-storage-0\" (UID: \"33ff1fe6-1a25-49ab-8b11-97ab06ee2e43\") " pod="openstack/prometheus-metric-storage-0" Sep 30 08:19:23 crc kubenswrapper[4760]: I0930 08:19:23.839211 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/33ff1fe6-1a25-49ab-8b11-97ab06ee2e43-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"33ff1fe6-1a25-49ab-8b11-97ab06ee2e43\") " pod="openstack/prometheus-metric-storage-0" Sep 30 08:19:23 crc kubenswrapper[4760]: I0930 08:19:23.844076 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/33ff1fe6-1a25-49ab-8b11-97ab06ee2e43-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"33ff1fe6-1a25-49ab-8b11-97ab06ee2e43\") " pod="openstack/prometheus-metric-storage-0" Sep 30 08:19:23 crc kubenswrapper[4760]: I0930 08:19:23.844286 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/33ff1fe6-1a25-49ab-8b11-97ab06ee2e43-secret-combined-ca-bundle\") pod \"prometheus-metric-storage-0\" (UID: \"33ff1fe6-1a25-49ab-8b11-97ab06ee2e43\") " pod="openstack/prometheus-metric-storage-0" Sep 30 08:19:23 crc kubenswrapper[4760]: I0930 08:19:23.844109 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/33ff1fe6-1a25-49ab-8b11-97ab06ee2e43-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"33ff1fe6-1a25-49ab-8b11-97ab06ee2e43\") " pod="openstack/prometheus-metric-storage-0" Sep 30 08:19:23 crc kubenswrapper[4760]: I0930 08:19:23.852424 4760 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Sep 30 08:19:23 crc kubenswrapper[4760]: I0930 08:19:23.852474 4760 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-09238572-8d9b-4684-8ed8-661e43a35d9a\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-09238572-8d9b-4684-8ed8-661e43a35d9a\") pod \"prometheus-metric-storage-0\" (UID: \"33ff1fe6-1a25-49ab-8b11-97ab06ee2e43\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/15f1bcf6ef2a65343cb29c53094f20376472cc1b8d5a343d6a63d664da0c3f7a/globalmount\"" pod="openstack/prometheus-metric-storage-0" Sep 30 08:19:23 crc kubenswrapper[4760]: I0930 08:19:23.858623 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/33ff1fe6-1a25-49ab-8b11-97ab06ee2e43-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"33ff1fe6-1a25-49ab-8b11-97ab06ee2e43\") " pod="openstack/prometheus-metric-storage-0" Sep 30 08:19:23 crc kubenswrapper[4760]: I0930 08:19:23.858692 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: \"kubernetes.io/secret/33ff1fe6-1a25-49ab-8b11-97ab06ee2e43-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"33ff1fe6-1a25-49ab-8b11-97ab06ee2e43\") " pod="openstack/prometheus-metric-storage-0" Sep 30 08:19:23 crc 
kubenswrapper[4760]: I0930 08:19:23.859568 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" (UniqueName: \"kubernetes.io/secret/33ff1fe6-1a25-49ab-8b11-97ab06ee2e43-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"33ff1fe6-1a25-49ab-8b11-97ab06ee2e43\") " pod="openstack/prometheus-metric-storage-0" Sep 30 08:19:23 crc kubenswrapper[4760]: I0930 08:19:23.859063 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/33ff1fe6-1a25-49ab-8b11-97ab06ee2e43-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"33ff1fe6-1a25-49ab-8b11-97ab06ee2e43\") " pod="openstack/prometheus-metric-storage-0" Sep 30 08:19:23 crc kubenswrapper[4760]: I0930 08:19:23.858869 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/33ff1fe6-1a25-49ab-8b11-97ab06ee2e43-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"33ff1fe6-1a25-49ab-8b11-97ab06ee2e43\") " pod="openstack/prometheus-metric-storage-0" Sep 30 08:19:23 crc kubenswrapper[4760]: I0930 08:19:23.861781 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/33ff1fe6-1a25-49ab-8b11-97ab06ee2e43-config\") pod \"prometheus-metric-storage-0\" (UID: \"33ff1fe6-1a25-49ab-8b11-97ab06ee2e43\") " pod="openstack/prometheus-metric-storage-0" Sep 30 08:19:23 crc kubenswrapper[4760]: I0930 08:19:23.875049 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-klg6k\" (UniqueName: \"kubernetes.io/projected/33ff1fe6-1a25-49ab-8b11-97ab06ee2e43-kube-api-access-klg6k\") pod \"prometheus-metric-storage-0\" (UID: \"33ff1fe6-1a25-49ab-8b11-97ab06ee2e43\") " pod="openstack/prometheus-metric-storage-0" Sep 30 08:19:23 crc kubenswrapper[4760]: I0930 
08:19:23.901940 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-09238572-8d9b-4684-8ed8-661e43a35d9a\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-09238572-8d9b-4684-8ed8-661e43a35d9a\") pod \"prometheus-metric-storage-0\" (UID: \"33ff1fe6-1a25-49ab-8b11-97ab06ee2e43\") " pod="openstack/prometheus-metric-storage-0" Sep 30 08:19:23 crc kubenswrapper[4760]: I0930 08:19:23.976519 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/prometheus-metric-storage-0" Sep 30 08:19:24 crc kubenswrapper[4760]: I0930 08:19:24.481072 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/prometheus-metric-storage-0"] Sep 30 08:19:24 crc kubenswrapper[4760]: W0930 08:19:24.485441 4760 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod33ff1fe6_1a25_49ab_8b11_97ab06ee2e43.slice/crio-5488b160b3cd834a3b64209f5ec239f80d879c371820cc568bef9e476c891fb6 WatchSource:0}: Error finding container 5488b160b3cd834a3b64209f5ec239f80d879c371820cc568bef9e476c891fb6: Status 404 returned error can't find the container with id 5488b160b3cd834a3b64209f5ec239f80d879c371820cc568bef9e476c891fb6 Sep 30 08:19:24 crc kubenswrapper[4760]: I0930 08:19:24.591194 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"33ff1fe6-1a25-49ab-8b11-97ab06ee2e43","Type":"ContainerStarted","Data":"5488b160b3cd834a3b64209f5ec239f80d879c371820cc568bef9e476c891fb6"} Sep 30 08:19:25 crc kubenswrapper[4760]: I0930 08:19:25.101010 4760 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9784a9e1-42ad-4f0b-ae43-a3227158a763" path="/var/lib/kubelet/pods/9784a9e1-42ad-4f0b-ae43-a3227158a763/volumes" Sep 30 08:19:29 crc kubenswrapper[4760]: I0930 08:19:29.656932 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" 
event={"ID":"33ff1fe6-1a25-49ab-8b11-97ab06ee2e43","Type":"ContainerStarted","Data":"4eb398ed71f409d13ccbead6dd6866d2d45932aa8b4f1af7df7f439295f85c7e"} Sep 30 08:19:38 crc kubenswrapper[4760]: I0930 08:19:38.788071 4760 generic.go:334] "Generic (PLEG): container finished" podID="33ff1fe6-1a25-49ab-8b11-97ab06ee2e43" containerID="4eb398ed71f409d13ccbead6dd6866d2d45932aa8b4f1af7df7f439295f85c7e" exitCode=0 Sep 30 08:19:38 crc kubenswrapper[4760]: I0930 08:19:38.788181 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"33ff1fe6-1a25-49ab-8b11-97ab06ee2e43","Type":"ContainerDied","Data":"4eb398ed71f409d13ccbead6dd6866d2d45932aa8b4f1af7df7f439295f85c7e"} Sep 30 08:19:39 crc kubenswrapper[4760]: I0930 08:19:39.808861 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"33ff1fe6-1a25-49ab-8b11-97ab06ee2e43","Type":"ContainerStarted","Data":"f478c2963c4434a6bd5e66a5c8746f2154805f20530b07626ac445e443046397"} Sep 30 08:19:43 crc kubenswrapper[4760]: I0930 08:19:43.889862 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"33ff1fe6-1a25-49ab-8b11-97ab06ee2e43","Type":"ContainerStarted","Data":"b477cfc23efae796054c367a59f087114f874ddb55438ac635a3a11a562e6e7f"} Sep 30 08:19:43 crc kubenswrapper[4760]: I0930 08:19:43.890385 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"33ff1fe6-1a25-49ab-8b11-97ab06ee2e43","Type":"ContainerStarted","Data":"4a3f3ffa1f5f8f782acdf611d34a5702af2faa959b5a2342a9e4daa8b2dee432"} Sep 30 08:19:43 crc kubenswrapper[4760]: I0930 08:19:43.926963 4760 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/prometheus-metric-storage-0" podStartSLOduration=20.926942823 podStartE2EDuration="20.926942823s" podCreationTimestamp="2025-09-30 08:19:23 +0000 UTC" firstStartedPulling="0001-01-01 
00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 08:19:43.914225968 +0000 UTC m=+2769.557132420" watchObservedRunningTime="2025-09-30 08:19:43.926942823 +0000 UTC m=+2769.569849255" Sep 30 08:19:43 crc kubenswrapper[4760]: I0930 08:19:43.976760 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/prometheus-metric-storage-0" Sep 30 08:19:53 crc kubenswrapper[4760]: I0930 08:19:53.976989 4760 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/prometheus-metric-storage-0" Sep 30 08:19:53 crc kubenswrapper[4760]: I0930 08:19:53.988123 4760 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/prometheus-metric-storage-0" Sep 30 08:19:54 crc kubenswrapper[4760]: I0930 08:19:54.021348 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/prometheus-metric-storage-0" Sep 30 08:20:19 crc kubenswrapper[4760]: I0930 08:20:19.112811 4760 patch_prober.go:28] interesting pod/machine-config-daemon-f2lrk container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 08:20:19 crc kubenswrapper[4760]: I0930 08:20:19.113388 4760 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-f2lrk" podUID="7a9c8270-6964-4886-87d0-227b05b76da4" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 30 08:20:19 crc kubenswrapper[4760]: I0930 08:20:19.526621 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/tempest-tests-tempest"] Sep 30 08:20:19 crc kubenswrapper[4760]: I0930 08:20:19.529644 4760 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/tempest-tests-tempest" Sep 30 08:20:19 crc kubenswrapper[4760]: I0930 08:20:19.532662 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"test-operator-controller-priv-key" Sep 30 08:20:19 crc kubenswrapper[4760]: I0930 08:20:19.534245 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-shxsl" Sep 30 08:20:19 crc kubenswrapper[4760]: I0930 08:20:19.534577 4760 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-env-vars-s0" Sep 30 08:20:19 crc kubenswrapper[4760]: I0930 08:20:19.537877 4760 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-custom-data-s0" Sep 30 08:20:19 crc kubenswrapper[4760]: I0930 08:20:19.557719 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/tempest-tests-tempest"] Sep 30 08:20:19 crc kubenswrapper[4760]: I0930 08:20:19.668585 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/394b8542-fe18-475f-9374-ce5c7e3820e7-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"394b8542-fe18-475f-9374-ce5c7e3820e7\") " pod="openstack/tempest-tests-tempest" Sep 30 08:20:19 crc kubenswrapper[4760]: I0930 08:20:19.668703 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/394b8542-fe18-475f-9374-ce5c7e3820e7-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"394b8542-fe18-475f-9374-ce5c7e3820e7\") " pod="openstack/tempest-tests-tempest" Sep 30 08:20:19 crc kubenswrapper[4760]: I0930 08:20:19.668765 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/394b8542-fe18-475f-9374-ce5c7e3820e7-ca-certs\") 
pod \"tempest-tests-tempest\" (UID: \"394b8542-fe18-475f-9374-ce5c7e3820e7\") " pod="openstack/tempest-tests-tempest" Sep 30 08:20:19 crc kubenswrapper[4760]: I0930 08:20:19.669108 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/394b8542-fe18-475f-9374-ce5c7e3820e7-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"394b8542-fe18-475f-9374-ce5c7e3820e7\") " pod="openstack/tempest-tests-tempest" Sep 30 08:20:19 crc kubenswrapper[4760]: I0930 08:20:19.669369 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-whvpw\" (UniqueName: \"kubernetes.io/projected/394b8542-fe18-475f-9374-ce5c7e3820e7-kube-api-access-whvpw\") pod \"tempest-tests-tempest\" (UID: \"394b8542-fe18-475f-9374-ce5c7e3820e7\") " pod="openstack/tempest-tests-tempest" Sep 30 08:20:19 crc kubenswrapper[4760]: I0930 08:20:19.669562 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"tempest-tests-tempest\" (UID: \"394b8542-fe18-475f-9374-ce5c7e3820e7\") " pod="openstack/tempest-tests-tempest" Sep 30 08:20:19 crc kubenswrapper[4760]: I0930 08:20:19.669635 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/394b8542-fe18-475f-9374-ce5c7e3820e7-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"394b8542-fe18-475f-9374-ce5c7e3820e7\") " pod="openstack/tempest-tests-tempest" Sep 30 08:20:19 crc kubenswrapper[4760]: I0930 08:20:19.669814 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/394b8542-fe18-475f-9374-ce5c7e3820e7-config-data\") pod 
\"tempest-tests-tempest\" (UID: \"394b8542-fe18-475f-9374-ce5c7e3820e7\") " pod="openstack/tempest-tests-tempest" Sep 30 08:20:19 crc kubenswrapper[4760]: I0930 08:20:19.669889 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/394b8542-fe18-475f-9374-ce5c7e3820e7-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"394b8542-fe18-475f-9374-ce5c7e3820e7\") " pod="openstack/tempest-tests-tempest" Sep 30 08:20:19 crc kubenswrapper[4760]: I0930 08:20:19.772422 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/394b8542-fe18-475f-9374-ce5c7e3820e7-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"394b8542-fe18-475f-9374-ce5c7e3820e7\") " pod="openstack/tempest-tests-tempest" Sep 30 08:20:19 crc kubenswrapper[4760]: I0930 08:20:19.772516 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/394b8542-fe18-475f-9374-ce5c7e3820e7-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"394b8542-fe18-475f-9374-ce5c7e3820e7\") " pod="openstack/tempest-tests-tempest" Sep 30 08:20:19 crc kubenswrapper[4760]: I0930 08:20:19.772617 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/394b8542-fe18-475f-9374-ce5c7e3820e7-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"394b8542-fe18-475f-9374-ce5c7e3820e7\") " pod="openstack/tempest-tests-tempest" Sep 30 08:20:19 crc kubenswrapper[4760]: I0930 08:20:19.772720 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-whvpw\" (UniqueName: \"kubernetes.io/projected/394b8542-fe18-475f-9374-ce5c7e3820e7-kube-api-access-whvpw\") pod \"tempest-tests-tempest\" 
(UID: \"394b8542-fe18-475f-9374-ce5c7e3820e7\") " pod="openstack/tempest-tests-tempest" Sep 30 08:20:19 crc kubenswrapper[4760]: I0930 08:20:19.772836 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"tempest-tests-tempest\" (UID: \"394b8542-fe18-475f-9374-ce5c7e3820e7\") " pod="openstack/tempest-tests-tempest" Sep 30 08:20:19 crc kubenswrapper[4760]: I0930 08:20:19.772886 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/394b8542-fe18-475f-9374-ce5c7e3820e7-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"394b8542-fe18-475f-9374-ce5c7e3820e7\") " pod="openstack/tempest-tests-tempest" Sep 30 08:20:19 crc kubenswrapper[4760]: I0930 08:20:19.772981 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/394b8542-fe18-475f-9374-ce5c7e3820e7-config-data\") pod \"tempest-tests-tempest\" (UID: \"394b8542-fe18-475f-9374-ce5c7e3820e7\") " pod="openstack/tempest-tests-tempest" Sep 30 08:20:19 crc kubenswrapper[4760]: I0930 08:20:19.773030 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/394b8542-fe18-475f-9374-ce5c7e3820e7-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"394b8542-fe18-475f-9374-ce5c7e3820e7\") " pod="openstack/tempest-tests-tempest" Sep 30 08:20:19 crc kubenswrapper[4760]: I0930 08:20:19.773064 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/394b8542-fe18-475f-9374-ce5c7e3820e7-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"394b8542-fe18-475f-9374-ce5c7e3820e7\") " pod="openstack/tempest-tests-tempest" Sep 30 08:20:19 crc 
kubenswrapper[4760]: I0930 08:20:19.773365 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/394b8542-fe18-475f-9374-ce5c7e3820e7-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"394b8542-fe18-475f-9374-ce5c7e3820e7\") " pod="openstack/tempest-tests-tempest" Sep 30 08:20:19 crc kubenswrapper[4760]: I0930 08:20:19.773830 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/394b8542-fe18-475f-9374-ce5c7e3820e7-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"394b8542-fe18-475f-9374-ce5c7e3820e7\") " pod="openstack/tempest-tests-tempest" Sep 30 08:20:19 crc kubenswrapper[4760]: I0930 08:20:19.774011 4760 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"tempest-tests-tempest\" (UID: \"394b8542-fe18-475f-9374-ce5c7e3820e7\") device mount path \"/mnt/openstack/pv08\"" pod="openstack/tempest-tests-tempest" Sep 30 08:20:19 crc kubenswrapper[4760]: I0930 08:20:19.775009 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/394b8542-fe18-475f-9374-ce5c7e3820e7-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"394b8542-fe18-475f-9374-ce5c7e3820e7\") " pod="openstack/tempest-tests-tempest" Sep 30 08:20:19 crc kubenswrapper[4760]: I0930 08:20:19.775566 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/394b8542-fe18-475f-9374-ce5c7e3820e7-config-data\") pod \"tempest-tests-tempest\" (UID: \"394b8542-fe18-475f-9374-ce5c7e3820e7\") " pod="openstack/tempest-tests-tempest" Sep 30 08:20:19 crc kubenswrapper[4760]: I0930 08:20:19.782045 4760 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/394b8542-fe18-475f-9374-ce5c7e3820e7-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"394b8542-fe18-475f-9374-ce5c7e3820e7\") " pod="openstack/tempest-tests-tempest" Sep 30 08:20:19 crc kubenswrapper[4760]: I0930 08:20:19.787093 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/394b8542-fe18-475f-9374-ce5c7e3820e7-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"394b8542-fe18-475f-9374-ce5c7e3820e7\") " pod="openstack/tempest-tests-tempest" Sep 30 08:20:19 crc kubenswrapper[4760]: I0930 08:20:19.787358 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/394b8542-fe18-475f-9374-ce5c7e3820e7-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"394b8542-fe18-475f-9374-ce5c7e3820e7\") " pod="openstack/tempest-tests-tempest" Sep 30 08:20:19 crc kubenswrapper[4760]: I0930 08:20:19.810022 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-whvpw\" (UniqueName: \"kubernetes.io/projected/394b8542-fe18-475f-9374-ce5c7e3820e7-kube-api-access-whvpw\") pod \"tempest-tests-tempest\" (UID: \"394b8542-fe18-475f-9374-ce5c7e3820e7\") " pod="openstack/tempest-tests-tempest" Sep 30 08:20:19 crc kubenswrapper[4760]: I0930 08:20:19.827964 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"tempest-tests-tempest\" (UID: \"394b8542-fe18-475f-9374-ce5c7e3820e7\") " pod="openstack/tempest-tests-tempest" Sep 30 08:20:19 crc kubenswrapper[4760]: I0930 08:20:19.864972 4760 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/tempest-tests-tempest" Sep 30 08:20:20 crc kubenswrapper[4760]: I0930 08:20:20.392756 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/tempest-tests-tempest"] Sep 30 08:20:21 crc kubenswrapper[4760]: I0930 08:20:21.374286 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"394b8542-fe18-475f-9374-ce5c7e3820e7","Type":"ContainerStarted","Data":"ab6cc4e73a67cebb20574aff849f85f2641d5b4a1c900236cbf073b06206a87b"} Sep 30 08:20:38 crc kubenswrapper[4760]: I0930 08:20:38.401847 4760 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/swift-proxy-6cc97c56c5-7pkjn" podUID="fc788440-e748-4b41-bdb6-23a6764062fd" containerName="proxy-server" probeResult="failure" output="HTTP probe failed with statuscode: 502" Sep 30 08:20:39 crc kubenswrapper[4760]: E0930 08:20:39.626482 4760 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="38.102.83.110:5001/podified-epoxy-centos9/openstack-tempest-all:watcher_latest" Sep 30 08:20:39 crc kubenswrapper[4760]: E0930 08:20:39.626815 4760 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = Canceled desc = copying config: context canceled" image="38.102.83.110:5001/podified-epoxy-centos9/openstack-tempest-all:watcher_latest" Sep 30 08:20:39 crc kubenswrapper[4760]: E0930 08:20:39.626972 4760 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:tempest-tests-tempest-tests-runner,Image:38.102.83.110:5001/podified-epoxy-centos9/openstack-tempest-all:watcher_latest,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:test-operator-ephemeral-workdir,ReadOnly:false,MountPath:/var/lib/tempest,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:test-operator-ephemeral-temporary,ReadOnly:false,MountPath:/tmp,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:false,MountPath:/etc/test_operator,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:test-operator-logs,ReadOnly:false,MountPath:/var/lib/tempest/external_files,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config,ReadOnly:true,MountPath:/etc/openstack/clouds.yaml,SubPath:clouds.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config,ReadOnly:true,MountPath:/var/lib/tempest/.config/openstack/clouds.yaml,SubPath:clouds.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config-secret,ReadOnly:false,MountPath:/etc/openstack/secure.yaml,SubPath:secure.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ca-certs,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ssh-key,ReadOnly:false,MountPath:/var/lib/tempest/id_ecdsa,SubPath:ssh_key,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-whvpw,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessP
robe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42480,RunAsNonRoot:*false,ReadOnlyRootFilesystem:*false,AllowPrivilegeEscalation:*true,RunAsGroup:*42480,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{EnvFromSource{Prefix:,ConfigMapRef:&ConfigMapEnvSource{LocalObjectReference:LocalObjectReference{Name:tempest-tests-tempest-custom-data-s0,},Optional:nil,},SecretRef:nil,},EnvFromSource{Prefix:,ConfigMapRef:&ConfigMapEnvSource{LocalObjectReference:LocalObjectReference{Name:tempest-tests-tempest-env-vars-s0,},Optional:nil,},SecretRef:nil,},},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod tempest-tests-tempest_openstack(394b8542-fe18-475f-9374-ce5c7e3820e7): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Sep 30 08:20:39 crc kubenswrapper[4760]: E0930 08:20:39.628229 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"tempest-tests-tempest-tests-runner\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/tempest-tests-tempest" podUID="394b8542-fe18-475f-9374-ce5c7e3820e7" Sep 30 08:20:40 crc kubenswrapper[4760]: E0930 08:20:40.620703 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"tempest-tests-tempest-tests-runner\" with ImagePullBackOff: \"Back-off pulling image \\\"38.102.83.110:5001/podified-epoxy-centos9/openstack-tempest-all:watcher_latest\\\"\"" pod="openstack/tempest-tests-tempest" podUID="394b8542-fe18-475f-9374-ce5c7e3820e7" Sep 30 08:20:49 crc 
kubenswrapper[4760]: I0930 08:20:49.113062 4760 patch_prober.go:28] interesting pod/machine-config-daemon-f2lrk container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 08:20:49 crc kubenswrapper[4760]: I0930 08:20:49.113856 4760 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-f2lrk" podUID="7a9c8270-6964-4886-87d0-227b05b76da4" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 30 08:20:56 crc kubenswrapper[4760]: I0930 08:20:56.159354 4760 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-env-vars-s0" Sep 30 08:20:57 crc kubenswrapper[4760]: I0930 08:20:57.836117 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"394b8542-fe18-475f-9374-ce5c7e3820e7","Type":"ContainerStarted","Data":"a5489eda4b2de9c64278d0bb594e161c51f0e89f024771ff637c7cb465e09b5b"} Sep 30 08:20:57 crc kubenswrapper[4760]: I0930 08:20:57.868055 4760 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/tempest-tests-tempest" podStartSLOduration=4.102572058 podStartE2EDuration="39.868028899s" podCreationTimestamp="2025-09-30 08:20:18 +0000 UTC" firstStartedPulling="2025-09-30 08:20:20.389731171 +0000 UTC m=+2806.032637583" lastFinishedPulling="2025-09-30 08:20:56.155188012 +0000 UTC m=+2841.798094424" observedRunningTime="2025-09-30 08:20:57.857851739 +0000 UTC m=+2843.500758181" watchObservedRunningTime="2025-09-30 08:20:57.868028899 +0000 UTC m=+2843.510935311" Sep 30 08:21:12 crc kubenswrapper[4760]: I0930 08:21:12.603138 4760 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-marketplace/redhat-operators-zxmhb"] Sep 30 08:21:12 crc kubenswrapper[4760]: I0930 08:21:12.606392 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-zxmhb" Sep 30 08:21:12 crc kubenswrapper[4760]: I0930 08:21:12.664565 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-zxmhb"] Sep 30 08:21:12 crc kubenswrapper[4760]: I0930 08:21:12.700292 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/62eb1141-74f4-4c8f-ab84-c9ed65e4bc12-catalog-content\") pod \"redhat-operators-zxmhb\" (UID: \"62eb1141-74f4-4c8f-ab84-c9ed65e4bc12\") " pod="openshift-marketplace/redhat-operators-zxmhb" Sep 30 08:21:12 crc kubenswrapper[4760]: I0930 08:21:12.700424 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/62eb1141-74f4-4c8f-ab84-c9ed65e4bc12-utilities\") pod \"redhat-operators-zxmhb\" (UID: \"62eb1141-74f4-4c8f-ab84-c9ed65e4bc12\") " pod="openshift-marketplace/redhat-operators-zxmhb" Sep 30 08:21:12 crc kubenswrapper[4760]: I0930 08:21:12.700454 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-frwv4\" (UniqueName: \"kubernetes.io/projected/62eb1141-74f4-4c8f-ab84-c9ed65e4bc12-kube-api-access-frwv4\") pod \"redhat-operators-zxmhb\" (UID: \"62eb1141-74f4-4c8f-ab84-c9ed65e4bc12\") " pod="openshift-marketplace/redhat-operators-zxmhb" Sep 30 08:21:12 crc kubenswrapper[4760]: I0930 08:21:12.801623 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/62eb1141-74f4-4c8f-ab84-c9ed65e4bc12-catalog-content\") pod \"redhat-operators-zxmhb\" (UID: \"62eb1141-74f4-4c8f-ab84-c9ed65e4bc12\") " 
pod="openshift-marketplace/redhat-operators-zxmhb" Sep 30 08:21:12 crc kubenswrapper[4760]: I0930 08:21:12.802216 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/62eb1141-74f4-4c8f-ab84-c9ed65e4bc12-utilities\") pod \"redhat-operators-zxmhb\" (UID: \"62eb1141-74f4-4c8f-ab84-c9ed65e4bc12\") " pod="openshift-marketplace/redhat-operators-zxmhb" Sep 30 08:21:12 crc kubenswrapper[4760]: I0930 08:21:12.802320 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-frwv4\" (UniqueName: \"kubernetes.io/projected/62eb1141-74f4-4c8f-ab84-c9ed65e4bc12-kube-api-access-frwv4\") pod \"redhat-operators-zxmhb\" (UID: \"62eb1141-74f4-4c8f-ab84-c9ed65e4bc12\") " pod="openshift-marketplace/redhat-operators-zxmhb" Sep 30 08:21:12 crc kubenswrapper[4760]: I0930 08:21:12.802833 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/62eb1141-74f4-4c8f-ab84-c9ed65e4bc12-utilities\") pod \"redhat-operators-zxmhb\" (UID: \"62eb1141-74f4-4c8f-ab84-c9ed65e4bc12\") " pod="openshift-marketplace/redhat-operators-zxmhb" Sep 30 08:21:12 crc kubenswrapper[4760]: I0930 08:21:12.802782 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/62eb1141-74f4-4c8f-ab84-c9ed65e4bc12-catalog-content\") pod \"redhat-operators-zxmhb\" (UID: \"62eb1141-74f4-4c8f-ab84-c9ed65e4bc12\") " pod="openshift-marketplace/redhat-operators-zxmhb" Sep 30 08:21:12 crc kubenswrapper[4760]: I0930 08:21:12.831402 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-frwv4\" (UniqueName: \"kubernetes.io/projected/62eb1141-74f4-4c8f-ab84-c9ed65e4bc12-kube-api-access-frwv4\") pod \"redhat-operators-zxmhb\" (UID: \"62eb1141-74f4-4c8f-ab84-c9ed65e4bc12\") " pod="openshift-marketplace/redhat-operators-zxmhb" Sep 30 08:21:12 
crc kubenswrapper[4760]: I0930 08:21:12.949168 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-zxmhb" Sep 30 08:21:13 crc kubenswrapper[4760]: I0930 08:21:13.464871 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-zxmhb"] Sep 30 08:21:14 crc kubenswrapper[4760]: I0930 08:21:14.033671 4760 generic.go:334] "Generic (PLEG): container finished" podID="62eb1141-74f4-4c8f-ab84-c9ed65e4bc12" containerID="676156c96fea9c64670b3685dcd7dab54162daa5b3bea923c58d6507920318d0" exitCode=0 Sep 30 08:21:14 crc kubenswrapper[4760]: I0930 08:21:14.034071 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zxmhb" event={"ID":"62eb1141-74f4-4c8f-ab84-c9ed65e4bc12","Type":"ContainerDied","Data":"676156c96fea9c64670b3685dcd7dab54162daa5b3bea923c58d6507920318d0"} Sep 30 08:21:14 crc kubenswrapper[4760]: I0930 08:21:14.034121 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zxmhb" event={"ID":"62eb1141-74f4-4c8f-ab84-c9ed65e4bc12","Type":"ContainerStarted","Data":"4b7021b113345995c1e4720bfaaea5a65acaccc761e999c2decb0b10fd5fff04"} Sep 30 08:21:14 crc kubenswrapper[4760]: I0930 08:21:14.037879 4760 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Sep 30 08:21:15 crc kubenswrapper[4760]: I0930 08:21:15.051014 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zxmhb" event={"ID":"62eb1141-74f4-4c8f-ab84-c9ed65e4bc12","Type":"ContainerStarted","Data":"24c5028b7815a800b5d88d3d7c6f2049e51092952fdeee3ae3c5fb2203c7596f"} Sep 30 08:21:16 crc kubenswrapper[4760]: I0930 08:21:16.067251 4760 generic.go:334] "Generic (PLEG): container finished" podID="62eb1141-74f4-4c8f-ab84-c9ed65e4bc12" containerID="24c5028b7815a800b5d88d3d7c6f2049e51092952fdeee3ae3c5fb2203c7596f" exitCode=0 Sep 30 08:21:16 crc 
kubenswrapper[4760]: I0930 08:21:16.067473 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zxmhb" event={"ID":"62eb1141-74f4-4c8f-ab84-c9ed65e4bc12","Type":"ContainerDied","Data":"24c5028b7815a800b5d88d3d7c6f2049e51092952fdeee3ae3c5fb2203c7596f"} Sep 30 08:21:17 crc kubenswrapper[4760]: I0930 08:21:17.090372 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zxmhb" event={"ID":"62eb1141-74f4-4c8f-ab84-c9ed65e4bc12","Type":"ContainerStarted","Data":"786f12b55806787257aecc76efa37c2036f4d56e453a073b163b85c0b28dd168"} Sep 30 08:21:18 crc kubenswrapper[4760]: I0930 08:21:18.131966 4760 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-zxmhb" podStartSLOduration=3.608121803 podStartE2EDuration="6.131949136s" podCreationTimestamp="2025-09-30 08:21:12 +0000 UTC" firstStartedPulling="2025-09-30 08:21:14.037252398 +0000 UTC m=+2859.680158850" lastFinishedPulling="2025-09-30 08:21:16.561079731 +0000 UTC m=+2862.203986183" observedRunningTime="2025-09-30 08:21:18.117170128 +0000 UTC m=+2863.760076540" watchObservedRunningTime="2025-09-30 08:21:18.131949136 +0000 UTC m=+2863.774855548" Sep 30 08:21:19 crc kubenswrapper[4760]: I0930 08:21:19.113175 4760 patch_prober.go:28] interesting pod/machine-config-daemon-f2lrk container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 08:21:19 crc kubenswrapper[4760]: I0930 08:21:19.113264 4760 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-f2lrk" podUID="7a9c8270-6964-4886-87d0-227b05b76da4" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" 
Sep 30 08:21:19 crc kubenswrapper[4760]: I0930 08:21:19.113401 4760 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-f2lrk" Sep 30 08:21:19 crc kubenswrapper[4760]: I0930 08:21:19.114533 4760 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"ba9d6e9a08a463506562fa972ab309bc744574a4b27aa40630a148b30074629b"} pod="openshift-machine-config-operator/machine-config-daemon-f2lrk" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Sep 30 08:21:19 crc kubenswrapper[4760]: I0930 08:21:19.114646 4760 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-f2lrk" podUID="7a9c8270-6964-4886-87d0-227b05b76da4" containerName="machine-config-daemon" containerID="cri-o://ba9d6e9a08a463506562fa972ab309bc744574a4b27aa40630a148b30074629b" gracePeriod=600 Sep 30 08:21:19 crc kubenswrapper[4760]: E0930 08:21:19.263093 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f2lrk_openshift-machine-config-operator(7a9c8270-6964-4886-87d0-227b05b76da4)\"" pod="openshift-machine-config-operator/machine-config-daemon-f2lrk" podUID="7a9c8270-6964-4886-87d0-227b05b76da4" Sep 30 08:21:20 crc kubenswrapper[4760]: I0930 08:21:20.131527 4760 generic.go:334] "Generic (PLEG): container finished" podID="7a9c8270-6964-4886-87d0-227b05b76da4" containerID="ba9d6e9a08a463506562fa972ab309bc744574a4b27aa40630a148b30074629b" exitCode=0 Sep 30 08:21:20 crc kubenswrapper[4760]: I0930 08:21:20.131595 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-f2lrk" 
event={"ID":"7a9c8270-6964-4886-87d0-227b05b76da4","Type":"ContainerDied","Data":"ba9d6e9a08a463506562fa972ab309bc744574a4b27aa40630a148b30074629b"} Sep 30 08:21:20 crc kubenswrapper[4760]: I0930 08:21:20.131640 4760 scope.go:117] "RemoveContainer" containerID="6995e454dc7acd7e872934378c7d2f24cfd58f25173eb39c76a5116f44789266" Sep 30 08:21:20 crc kubenswrapper[4760]: I0930 08:21:20.132635 4760 scope.go:117] "RemoveContainer" containerID="ba9d6e9a08a463506562fa972ab309bc744574a4b27aa40630a148b30074629b" Sep 30 08:21:20 crc kubenswrapper[4760]: E0930 08:21:20.133281 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f2lrk_openshift-machine-config-operator(7a9c8270-6964-4886-87d0-227b05b76da4)\"" pod="openshift-machine-config-operator/machine-config-daemon-f2lrk" podUID="7a9c8270-6964-4886-87d0-227b05b76da4" Sep 30 08:21:22 crc kubenswrapper[4760]: I0930 08:21:22.949755 4760 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-zxmhb" Sep 30 08:21:22 crc kubenswrapper[4760]: I0930 08:21:22.949833 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-zxmhb" Sep 30 08:21:23 crc kubenswrapper[4760]: I0930 08:21:23.026510 4760 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-zxmhb" Sep 30 08:21:23 crc kubenswrapper[4760]: I0930 08:21:23.254893 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-zxmhb" Sep 30 08:21:23 crc kubenswrapper[4760]: I0930 08:21:23.327933 4760 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-zxmhb"] Sep 30 08:21:25 crc kubenswrapper[4760]: I0930 08:21:25.225811 4760 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-zxmhb" podUID="62eb1141-74f4-4c8f-ab84-c9ed65e4bc12" containerName="registry-server" containerID="cri-o://786f12b55806787257aecc76efa37c2036f4d56e453a073b163b85c0b28dd168" gracePeriod=2 Sep 30 08:21:25 crc kubenswrapper[4760]: I0930 08:21:25.765850 4760 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-zxmhb" Sep 30 08:21:25 crc kubenswrapper[4760]: I0930 08:21:25.888958 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/62eb1141-74f4-4c8f-ab84-c9ed65e4bc12-catalog-content\") pod \"62eb1141-74f4-4c8f-ab84-c9ed65e4bc12\" (UID: \"62eb1141-74f4-4c8f-ab84-c9ed65e4bc12\") " Sep 30 08:21:25 crc kubenswrapper[4760]: I0930 08:21:25.889031 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-frwv4\" (UniqueName: \"kubernetes.io/projected/62eb1141-74f4-4c8f-ab84-c9ed65e4bc12-kube-api-access-frwv4\") pod \"62eb1141-74f4-4c8f-ab84-c9ed65e4bc12\" (UID: \"62eb1141-74f4-4c8f-ab84-c9ed65e4bc12\") " Sep 30 08:21:25 crc kubenswrapper[4760]: I0930 08:21:25.889512 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/62eb1141-74f4-4c8f-ab84-c9ed65e4bc12-utilities\") pod \"62eb1141-74f4-4c8f-ab84-c9ed65e4bc12\" (UID: \"62eb1141-74f4-4c8f-ab84-c9ed65e4bc12\") " Sep 30 08:21:25 crc kubenswrapper[4760]: I0930 08:21:25.890545 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/62eb1141-74f4-4c8f-ab84-c9ed65e4bc12-utilities" (OuterVolumeSpecName: "utilities") pod "62eb1141-74f4-4c8f-ab84-c9ed65e4bc12" (UID: "62eb1141-74f4-4c8f-ab84-c9ed65e4bc12"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 08:21:25 crc kubenswrapper[4760]: I0930 08:21:25.899239 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/62eb1141-74f4-4c8f-ab84-c9ed65e4bc12-kube-api-access-frwv4" (OuterVolumeSpecName: "kube-api-access-frwv4") pod "62eb1141-74f4-4c8f-ab84-c9ed65e4bc12" (UID: "62eb1141-74f4-4c8f-ab84-c9ed65e4bc12"). InnerVolumeSpecName "kube-api-access-frwv4". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 08:21:25 crc kubenswrapper[4760]: I0930 08:21:25.990772 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/62eb1141-74f4-4c8f-ab84-c9ed65e4bc12-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "62eb1141-74f4-4c8f-ab84-c9ed65e4bc12" (UID: "62eb1141-74f4-4c8f-ab84-c9ed65e4bc12"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 08:21:25 crc kubenswrapper[4760]: I0930 08:21:25.992602 4760 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/62eb1141-74f4-4c8f-ab84-c9ed65e4bc12-utilities\") on node \"crc\" DevicePath \"\"" Sep 30 08:21:25 crc kubenswrapper[4760]: I0930 08:21:25.992627 4760 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/62eb1141-74f4-4c8f-ab84-c9ed65e4bc12-catalog-content\") on node \"crc\" DevicePath \"\"" Sep 30 08:21:25 crc kubenswrapper[4760]: I0930 08:21:25.992638 4760 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-frwv4\" (UniqueName: \"kubernetes.io/projected/62eb1141-74f4-4c8f-ab84-c9ed65e4bc12-kube-api-access-frwv4\") on node \"crc\" DevicePath \"\"" Sep 30 08:21:26 crc kubenswrapper[4760]: I0930 08:21:26.241681 4760 generic.go:334] "Generic (PLEG): container finished" podID="62eb1141-74f4-4c8f-ab84-c9ed65e4bc12" 
containerID="786f12b55806787257aecc76efa37c2036f4d56e453a073b163b85c0b28dd168" exitCode=0 Sep 30 08:21:26 crc kubenswrapper[4760]: I0930 08:21:26.241740 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zxmhb" event={"ID":"62eb1141-74f4-4c8f-ab84-c9ed65e4bc12","Type":"ContainerDied","Data":"786f12b55806787257aecc76efa37c2036f4d56e453a073b163b85c0b28dd168"} Sep 30 08:21:26 crc kubenswrapper[4760]: I0930 08:21:26.242042 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zxmhb" event={"ID":"62eb1141-74f4-4c8f-ab84-c9ed65e4bc12","Type":"ContainerDied","Data":"4b7021b113345995c1e4720bfaaea5a65acaccc761e999c2decb0b10fd5fff04"} Sep 30 08:21:26 crc kubenswrapper[4760]: I0930 08:21:26.242065 4760 scope.go:117] "RemoveContainer" containerID="786f12b55806787257aecc76efa37c2036f4d56e453a073b163b85c0b28dd168" Sep 30 08:21:26 crc kubenswrapper[4760]: I0930 08:21:26.241782 4760 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-zxmhb" Sep 30 08:21:26 crc kubenswrapper[4760]: I0930 08:21:26.279576 4760 scope.go:117] "RemoveContainer" containerID="24c5028b7815a800b5d88d3d7c6f2049e51092952fdeee3ae3c5fb2203c7596f" Sep 30 08:21:26 crc kubenswrapper[4760]: I0930 08:21:26.293971 4760 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-zxmhb"] Sep 30 08:21:26 crc kubenswrapper[4760]: I0930 08:21:26.307344 4760 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-zxmhb"] Sep 30 08:21:26 crc kubenswrapper[4760]: I0930 08:21:26.316841 4760 scope.go:117] "RemoveContainer" containerID="676156c96fea9c64670b3685dcd7dab54162daa5b3bea923c58d6507920318d0" Sep 30 08:21:26 crc kubenswrapper[4760]: I0930 08:21:26.383040 4760 scope.go:117] "RemoveContainer" containerID="786f12b55806787257aecc76efa37c2036f4d56e453a073b163b85c0b28dd168" Sep 30 08:21:26 crc kubenswrapper[4760]: E0930 08:21:26.383658 4760 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"786f12b55806787257aecc76efa37c2036f4d56e453a073b163b85c0b28dd168\": container with ID starting with 786f12b55806787257aecc76efa37c2036f4d56e453a073b163b85c0b28dd168 not found: ID does not exist" containerID="786f12b55806787257aecc76efa37c2036f4d56e453a073b163b85c0b28dd168" Sep 30 08:21:26 crc kubenswrapper[4760]: I0930 08:21:26.383716 4760 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"786f12b55806787257aecc76efa37c2036f4d56e453a073b163b85c0b28dd168"} err="failed to get container status \"786f12b55806787257aecc76efa37c2036f4d56e453a073b163b85c0b28dd168\": rpc error: code = NotFound desc = could not find container \"786f12b55806787257aecc76efa37c2036f4d56e453a073b163b85c0b28dd168\": container with ID starting with 786f12b55806787257aecc76efa37c2036f4d56e453a073b163b85c0b28dd168 not found: ID does 
not exist" Sep 30 08:21:26 crc kubenswrapper[4760]: I0930 08:21:26.383751 4760 scope.go:117] "RemoveContainer" containerID="24c5028b7815a800b5d88d3d7c6f2049e51092952fdeee3ae3c5fb2203c7596f" Sep 30 08:21:26 crc kubenswrapper[4760]: E0930 08:21:26.384182 4760 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"24c5028b7815a800b5d88d3d7c6f2049e51092952fdeee3ae3c5fb2203c7596f\": container with ID starting with 24c5028b7815a800b5d88d3d7c6f2049e51092952fdeee3ae3c5fb2203c7596f not found: ID does not exist" containerID="24c5028b7815a800b5d88d3d7c6f2049e51092952fdeee3ae3c5fb2203c7596f" Sep 30 08:21:26 crc kubenswrapper[4760]: I0930 08:21:26.384245 4760 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"24c5028b7815a800b5d88d3d7c6f2049e51092952fdeee3ae3c5fb2203c7596f"} err="failed to get container status \"24c5028b7815a800b5d88d3d7c6f2049e51092952fdeee3ae3c5fb2203c7596f\": rpc error: code = NotFound desc = could not find container \"24c5028b7815a800b5d88d3d7c6f2049e51092952fdeee3ae3c5fb2203c7596f\": container with ID starting with 24c5028b7815a800b5d88d3d7c6f2049e51092952fdeee3ae3c5fb2203c7596f not found: ID does not exist" Sep 30 08:21:26 crc kubenswrapper[4760]: I0930 08:21:26.384290 4760 scope.go:117] "RemoveContainer" containerID="676156c96fea9c64670b3685dcd7dab54162daa5b3bea923c58d6507920318d0" Sep 30 08:21:26 crc kubenswrapper[4760]: E0930 08:21:26.385126 4760 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"676156c96fea9c64670b3685dcd7dab54162daa5b3bea923c58d6507920318d0\": container with ID starting with 676156c96fea9c64670b3685dcd7dab54162daa5b3bea923c58d6507920318d0 not found: ID does not exist" containerID="676156c96fea9c64670b3685dcd7dab54162daa5b3bea923c58d6507920318d0" Sep 30 08:21:26 crc kubenswrapper[4760]: I0930 08:21:26.385162 4760 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"676156c96fea9c64670b3685dcd7dab54162daa5b3bea923c58d6507920318d0"} err="failed to get container status \"676156c96fea9c64670b3685dcd7dab54162daa5b3bea923c58d6507920318d0\": rpc error: code = NotFound desc = could not find container \"676156c96fea9c64670b3685dcd7dab54162daa5b3bea923c58d6507920318d0\": container with ID starting with 676156c96fea9c64670b3685dcd7dab54162daa5b3bea923c58d6507920318d0 not found: ID does not exist" Sep 30 08:21:27 crc kubenswrapper[4760]: I0930 08:21:27.083420 4760 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="62eb1141-74f4-4c8f-ab84-c9ed65e4bc12" path="/var/lib/kubelet/pods/62eb1141-74f4-4c8f-ab84-c9ed65e4bc12/volumes" Sep 30 08:21:31 crc kubenswrapper[4760]: I0930 08:21:31.068009 4760 scope.go:117] "RemoveContainer" containerID="ba9d6e9a08a463506562fa972ab309bc744574a4b27aa40630a148b30074629b" Sep 30 08:21:31 crc kubenswrapper[4760]: E0930 08:21:31.068647 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f2lrk_openshift-machine-config-operator(7a9c8270-6964-4886-87d0-227b05b76da4)\"" pod="openshift-machine-config-operator/machine-config-daemon-f2lrk" podUID="7a9c8270-6964-4886-87d0-227b05b76da4" Sep 30 08:21:42 crc kubenswrapper[4760]: I0930 08:21:42.067025 4760 scope.go:117] "RemoveContainer" containerID="ba9d6e9a08a463506562fa972ab309bc744574a4b27aa40630a148b30074629b" Sep 30 08:21:42 crc kubenswrapper[4760]: E0930 08:21:42.067799 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f2lrk_openshift-machine-config-operator(7a9c8270-6964-4886-87d0-227b05b76da4)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-f2lrk" podUID="7a9c8270-6964-4886-87d0-227b05b76da4" Sep 30 08:21:54 crc kubenswrapper[4760]: I0930 08:21:54.067464 4760 scope.go:117] "RemoveContainer" containerID="ba9d6e9a08a463506562fa972ab309bc744574a4b27aa40630a148b30074629b" Sep 30 08:21:54 crc kubenswrapper[4760]: E0930 08:21:54.068238 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f2lrk_openshift-machine-config-operator(7a9c8270-6964-4886-87d0-227b05b76da4)\"" pod="openshift-machine-config-operator/machine-config-daemon-f2lrk" podUID="7a9c8270-6964-4886-87d0-227b05b76da4" Sep 30 08:22:05 crc kubenswrapper[4760]: I0930 08:22:05.077936 4760 scope.go:117] "RemoveContainer" containerID="ba9d6e9a08a463506562fa972ab309bc744574a4b27aa40630a148b30074629b" Sep 30 08:22:05 crc kubenswrapper[4760]: E0930 08:22:05.078649 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f2lrk_openshift-machine-config-operator(7a9c8270-6964-4886-87d0-227b05b76da4)\"" pod="openshift-machine-config-operator/machine-config-daemon-f2lrk" podUID="7a9c8270-6964-4886-87d0-227b05b76da4" Sep 30 08:22:19 crc kubenswrapper[4760]: I0930 08:22:19.070005 4760 scope.go:117] "RemoveContainer" containerID="ba9d6e9a08a463506562fa972ab309bc744574a4b27aa40630a148b30074629b" Sep 30 08:22:19 crc kubenswrapper[4760]: E0930 08:22:19.070953 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-f2lrk_openshift-machine-config-operator(7a9c8270-6964-4886-87d0-227b05b76da4)\"" pod="openshift-machine-config-operator/machine-config-daemon-f2lrk" podUID="7a9c8270-6964-4886-87d0-227b05b76da4" Sep 30 08:22:31 crc kubenswrapper[4760]: I0930 08:22:31.073653 4760 scope.go:117] "RemoveContainer" containerID="ba9d6e9a08a463506562fa972ab309bc744574a4b27aa40630a148b30074629b" Sep 30 08:22:31 crc kubenswrapper[4760]: E0930 08:22:31.075703 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f2lrk_openshift-machine-config-operator(7a9c8270-6964-4886-87d0-227b05b76da4)\"" pod="openshift-machine-config-operator/machine-config-daemon-f2lrk" podUID="7a9c8270-6964-4886-87d0-227b05b76da4" Sep 30 08:22:44 crc kubenswrapper[4760]: I0930 08:22:44.067349 4760 scope.go:117] "RemoveContainer" containerID="ba9d6e9a08a463506562fa972ab309bc744574a4b27aa40630a148b30074629b" Sep 30 08:22:44 crc kubenswrapper[4760]: E0930 08:22:44.068756 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f2lrk_openshift-machine-config-operator(7a9c8270-6964-4886-87d0-227b05b76da4)\"" pod="openshift-machine-config-operator/machine-config-daemon-f2lrk" podUID="7a9c8270-6964-4886-87d0-227b05b76da4" Sep 30 08:22:58 crc kubenswrapper[4760]: I0930 08:22:58.068439 4760 scope.go:117] "RemoveContainer" containerID="ba9d6e9a08a463506562fa972ab309bc744574a4b27aa40630a148b30074629b" Sep 30 08:22:58 crc kubenswrapper[4760]: E0930 08:22:58.069843 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-f2lrk_openshift-machine-config-operator(7a9c8270-6964-4886-87d0-227b05b76da4)\"" pod="openshift-machine-config-operator/machine-config-daemon-f2lrk" podUID="7a9c8270-6964-4886-87d0-227b05b76da4" Sep 30 08:23:13 crc kubenswrapper[4760]: I0930 08:23:13.067377 4760 scope.go:117] "RemoveContainer" containerID="ba9d6e9a08a463506562fa972ab309bc744574a4b27aa40630a148b30074629b" Sep 30 08:23:13 crc kubenswrapper[4760]: E0930 08:23:13.068365 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f2lrk_openshift-machine-config-operator(7a9c8270-6964-4886-87d0-227b05b76da4)\"" pod="openshift-machine-config-operator/machine-config-daemon-f2lrk" podUID="7a9c8270-6964-4886-87d0-227b05b76da4" Sep 30 08:23:24 crc kubenswrapper[4760]: I0930 08:23:24.067050 4760 scope.go:117] "RemoveContainer" containerID="ba9d6e9a08a463506562fa972ab309bc744574a4b27aa40630a148b30074629b" Sep 30 08:23:24 crc kubenswrapper[4760]: E0930 08:23:24.068504 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f2lrk_openshift-machine-config-operator(7a9c8270-6964-4886-87d0-227b05b76da4)\"" pod="openshift-machine-config-operator/machine-config-daemon-f2lrk" podUID="7a9c8270-6964-4886-87d0-227b05b76da4" Sep 30 08:23:35 crc kubenswrapper[4760]: I0930 08:23:35.076913 4760 scope.go:117] "RemoveContainer" containerID="ba9d6e9a08a463506562fa972ab309bc744574a4b27aa40630a148b30074629b" Sep 30 08:23:35 crc kubenswrapper[4760]: E0930 08:23:35.077814 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s 
restarting failed container=machine-config-daemon pod=machine-config-daemon-f2lrk_openshift-machine-config-operator(7a9c8270-6964-4886-87d0-227b05b76da4)\"" pod="openshift-machine-config-operator/machine-config-daemon-f2lrk" podUID="7a9c8270-6964-4886-87d0-227b05b76da4" Sep 30 08:23:46 crc kubenswrapper[4760]: I0930 08:23:46.067928 4760 scope.go:117] "RemoveContainer" containerID="ba9d6e9a08a463506562fa972ab309bc744574a4b27aa40630a148b30074629b" Sep 30 08:23:46 crc kubenswrapper[4760]: E0930 08:23:46.069193 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f2lrk_openshift-machine-config-operator(7a9c8270-6964-4886-87d0-227b05b76da4)\"" pod="openshift-machine-config-operator/machine-config-daemon-f2lrk" podUID="7a9c8270-6964-4886-87d0-227b05b76da4" Sep 30 08:23:46 crc kubenswrapper[4760]: I0930 08:23:46.843396 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-q9hpc"] Sep 30 08:23:46 crc kubenswrapper[4760]: E0930 08:23:46.843872 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="62eb1141-74f4-4c8f-ab84-c9ed65e4bc12" containerName="extract-content" Sep 30 08:23:46 crc kubenswrapper[4760]: I0930 08:23:46.843887 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="62eb1141-74f4-4c8f-ab84-c9ed65e4bc12" containerName="extract-content" Sep 30 08:23:46 crc kubenswrapper[4760]: E0930 08:23:46.843915 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="62eb1141-74f4-4c8f-ab84-c9ed65e4bc12" containerName="extract-utilities" Sep 30 08:23:46 crc kubenswrapper[4760]: I0930 08:23:46.843924 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="62eb1141-74f4-4c8f-ab84-c9ed65e4bc12" containerName="extract-utilities" Sep 30 08:23:46 crc kubenswrapper[4760]: E0930 08:23:46.843955 4760 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="62eb1141-74f4-4c8f-ab84-c9ed65e4bc12" containerName="registry-server" Sep 30 08:23:46 crc kubenswrapper[4760]: I0930 08:23:46.843963 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="62eb1141-74f4-4c8f-ab84-c9ed65e4bc12" containerName="registry-server" Sep 30 08:23:46 crc kubenswrapper[4760]: I0930 08:23:46.844368 4760 memory_manager.go:354] "RemoveStaleState removing state" podUID="62eb1141-74f4-4c8f-ab84-c9ed65e4bc12" containerName="registry-server" Sep 30 08:23:46 crc kubenswrapper[4760]: I0930 08:23:46.846253 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-q9hpc" Sep 30 08:23:46 crc kubenswrapper[4760]: I0930 08:23:46.867569 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-q9hpc"] Sep 30 08:23:46 crc kubenswrapper[4760]: I0930 08:23:46.973358 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8a58739c-607f-45e6-981f-d0ca367562a8-utilities\") pod \"community-operators-q9hpc\" (UID: \"8a58739c-607f-45e6-981f-d0ca367562a8\") " pod="openshift-marketplace/community-operators-q9hpc" Sep 30 08:23:46 crc kubenswrapper[4760]: I0930 08:23:46.973754 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bktdb\" (UniqueName: \"kubernetes.io/projected/8a58739c-607f-45e6-981f-d0ca367562a8-kube-api-access-bktdb\") pod \"community-operators-q9hpc\" (UID: \"8a58739c-607f-45e6-981f-d0ca367562a8\") " pod="openshift-marketplace/community-operators-q9hpc" Sep 30 08:23:46 crc kubenswrapper[4760]: I0930 08:23:46.973982 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8a58739c-607f-45e6-981f-d0ca367562a8-catalog-content\") pod 
\"community-operators-q9hpc\" (UID: \"8a58739c-607f-45e6-981f-d0ca367562a8\") " pod="openshift-marketplace/community-operators-q9hpc" Sep 30 08:23:47 crc kubenswrapper[4760]: I0930 08:23:47.075414 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8a58739c-607f-45e6-981f-d0ca367562a8-utilities\") pod \"community-operators-q9hpc\" (UID: \"8a58739c-607f-45e6-981f-d0ca367562a8\") " pod="openshift-marketplace/community-operators-q9hpc" Sep 30 08:23:47 crc kubenswrapper[4760]: I0930 08:23:47.076151 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bktdb\" (UniqueName: \"kubernetes.io/projected/8a58739c-607f-45e6-981f-d0ca367562a8-kube-api-access-bktdb\") pod \"community-operators-q9hpc\" (UID: \"8a58739c-607f-45e6-981f-d0ca367562a8\") " pod="openshift-marketplace/community-operators-q9hpc" Sep 30 08:23:47 crc kubenswrapper[4760]: I0930 08:23:47.075907 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8a58739c-607f-45e6-981f-d0ca367562a8-utilities\") pod \"community-operators-q9hpc\" (UID: \"8a58739c-607f-45e6-981f-d0ca367562a8\") " pod="openshift-marketplace/community-operators-q9hpc" Sep 30 08:23:47 crc kubenswrapper[4760]: I0930 08:23:47.076320 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8a58739c-607f-45e6-981f-d0ca367562a8-catalog-content\") pod \"community-operators-q9hpc\" (UID: \"8a58739c-607f-45e6-981f-d0ca367562a8\") " pod="openshift-marketplace/community-operators-q9hpc" Sep 30 08:23:47 crc kubenswrapper[4760]: I0930 08:23:47.076800 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8a58739c-607f-45e6-981f-d0ca367562a8-catalog-content\") pod \"community-operators-q9hpc\" (UID: 
\"8a58739c-607f-45e6-981f-d0ca367562a8\") " pod="openshift-marketplace/community-operators-q9hpc" Sep 30 08:23:47 crc kubenswrapper[4760]: I0930 08:23:47.101382 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bktdb\" (UniqueName: \"kubernetes.io/projected/8a58739c-607f-45e6-981f-d0ca367562a8-kube-api-access-bktdb\") pod \"community-operators-q9hpc\" (UID: \"8a58739c-607f-45e6-981f-d0ca367562a8\") " pod="openshift-marketplace/community-operators-q9hpc" Sep 30 08:23:47 crc kubenswrapper[4760]: I0930 08:23:47.197220 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-q9hpc" Sep 30 08:23:47 crc kubenswrapper[4760]: I0930 08:23:47.678422 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-q9hpc"] Sep 30 08:23:47 crc kubenswrapper[4760]: W0930 08:23:47.685120 4760 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8a58739c_607f_45e6_981f_d0ca367562a8.slice/crio-8529afbf812d8aef1f8b31d7412ef25196ce6742ac1c5448816651329f7341bd WatchSource:0}: Error finding container 8529afbf812d8aef1f8b31d7412ef25196ce6742ac1c5448816651329f7341bd: Status 404 returned error can't find the container with id 8529afbf812d8aef1f8b31d7412ef25196ce6742ac1c5448816651329f7341bd Sep 30 08:23:47 crc kubenswrapper[4760]: I0930 08:23:47.948488 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-q9hpc" event={"ID":"8a58739c-607f-45e6-981f-d0ca367562a8","Type":"ContainerStarted","Data":"ea6b87112d6a52eb4686e16051024927d500946de48b9087f2022d071579b67b"} Sep 30 08:23:47 crc kubenswrapper[4760]: I0930 08:23:47.948552 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-q9hpc" 
event={"ID":"8a58739c-607f-45e6-981f-d0ca367562a8","Type":"ContainerStarted","Data":"8529afbf812d8aef1f8b31d7412ef25196ce6742ac1c5448816651329f7341bd"} Sep 30 08:23:48 crc kubenswrapper[4760]: I0930 08:23:48.965298 4760 generic.go:334] "Generic (PLEG): container finished" podID="8a58739c-607f-45e6-981f-d0ca367562a8" containerID="ea6b87112d6a52eb4686e16051024927d500946de48b9087f2022d071579b67b" exitCode=0 Sep 30 08:23:48 crc kubenswrapper[4760]: I0930 08:23:48.965510 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-q9hpc" event={"ID":"8a58739c-607f-45e6-981f-d0ca367562a8","Type":"ContainerDied","Data":"ea6b87112d6a52eb4686e16051024927d500946de48b9087f2022d071579b67b"} Sep 30 08:23:49 crc kubenswrapper[4760]: I0930 08:23:49.982142 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-q9hpc" event={"ID":"8a58739c-607f-45e6-981f-d0ca367562a8","Type":"ContainerStarted","Data":"2a9187d5693ceec7a2d7a812a5b4eb1e0d5f63a554a1151e52affa7ac0e44f22"} Sep 30 08:23:53 crc kubenswrapper[4760]: I0930 08:23:53.013637 4760 generic.go:334] "Generic (PLEG): container finished" podID="8a58739c-607f-45e6-981f-d0ca367562a8" containerID="2a9187d5693ceec7a2d7a812a5b4eb1e0d5f63a554a1151e52affa7ac0e44f22" exitCode=0 Sep 30 08:23:53 crc kubenswrapper[4760]: I0930 08:23:53.013732 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-q9hpc" event={"ID":"8a58739c-607f-45e6-981f-d0ca367562a8","Type":"ContainerDied","Data":"2a9187d5693ceec7a2d7a812a5b4eb1e0d5f63a554a1151e52affa7ac0e44f22"} Sep 30 08:23:54 crc kubenswrapper[4760]: I0930 08:23:54.025331 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-q9hpc" event={"ID":"8a58739c-607f-45e6-981f-d0ca367562a8","Type":"ContainerStarted","Data":"24e30e8f60be88969a621351a5d92e451973947e2e416fbe3ebbd2e16a89edfe"} Sep 30 08:23:55 crc kubenswrapper[4760]: 
I0930 08:23:55.128510 4760 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-q9hpc" podStartSLOduration=4.639270716 podStartE2EDuration="9.128493231s" podCreationTimestamp="2025-09-30 08:23:46 +0000 UTC" firstStartedPulling="2025-09-30 08:23:48.968736281 +0000 UTC m=+3014.611642713" lastFinishedPulling="2025-09-30 08:23:53.457958816 +0000 UTC m=+3019.100865228" observedRunningTime="2025-09-30 08:23:54.04769804 +0000 UTC m=+3019.690604452" watchObservedRunningTime="2025-09-30 08:23:55.128493231 +0000 UTC m=+3020.771399643" Sep 30 08:23:55 crc kubenswrapper[4760]: I0930 08:23:55.131729 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-hdm9c"] Sep 30 08:23:55 crc kubenswrapper[4760]: I0930 08:23:55.133590 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-hdm9c" Sep 30 08:23:55 crc kubenswrapper[4760]: I0930 08:23:55.147166 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-hdm9c"] Sep 30 08:23:55 crc kubenswrapper[4760]: I0930 08:23:55.243666 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xpfwd\" (UniqueName: \"kubernetes.io/projected/bd543cbe-b908-4e94-a7cb-6860bea4536e-kube-api-access-xpfwd\") pod \"certified-operators-hdm9c\" (UID: \"bd543cbe-b908-4e94-a7cb-6860bea4536e\") " pod="openshift-marketplace/certified-operators-hdm9c" Sep 30 08:23:55 crc kubenswrapper[4760]: I0930 08:23:55.243809 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bd543cbe-b908-4e94-a7cb-6860bea4536e-catalog-content\") pod \"certified-operators-hdm9c\" (UID: \"bd543cbe-b908-4e94-a7cb-6860bea4536e\") " pod="openshift-marketplace/certified-operators-hdm9c" Sep 30 08:23:55 crc 
kubenswrapper[4760]: I0930 08:23:55.243891 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bd543cbe-b908-4e94-a7cb-6860bea4536e-utilities\") pod \"certified-operators-hdm9c\" (UID: \"bd543cbe-b908-4e94-a7cb-6860bea4536e\") " pod="openshift-marketplace/certified-operators-hdm9c" Sep 30 08:23:55 crc kubenswrapper[4760]: I0930 08:23:55.346977 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bd543cbe-b908-4e94-a7cb-6860bea4536e-catalog-content\") pod \"certified-operators-hdm9c\" (UID: \"bd543cbe-b908-4e94-a7cb-6860bea4536e\") " pod="openshift-marketplace/certified-operators-hdm9c" Sep 30 08:23:55 crc kubenswrapper[4760]: I0930 08:23:55.347164 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bd543cbe-b908-4e94-a7cb-6860bea4536e-utilities\") pod \"certified-operators-hdm9c\" (UID: \"bd543cbe-b908-4e94-a7cb-6860bea4536e\") " pod="openshift-marketplace/certified-operators-hdm9c" Sep 30 08:23:55 crc kubenswrapper[4760]: I0930 08:23:55.347334 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xpfwd\" (UniqueName: \"kubernetes.io/projected/bd543cbe-b908-4e94-a7cb-6860bea4536e-kube-api-access-xpfwd\") pod \"certified-operators-hdm9c\" (UID: \"bd543cbe-b908-4e94-a7cb-6860bea4536e\") " pod="openshift-marketplace/certified-operators-hdm9c" Sep 30 08:23:55 crc kubenswrapper[4760]: I0930 08:23:55.347650 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bd543cbe-b908-4e94-a7cb-6860bea4536e-utilities\") pod \"certified-operators-hdm9c\" (UID: \"bd543cbe-b908-4e94-a7cb-6860bea4536e\") " pod="openshift-marketplace/certified-operators-hdm9c" Sep 30 08:23:55 crc kubenswrapper[4760]: I0930 
08:23:55.347641 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bd543cbe-b908-4e94-a7cb-6860bea4536e-catalog-content\") pod \"certified-operators-hdm9c\" (UID: \"bd543cbe-b908-4e94-a7cb-6860bea4536e\") " pod="openshift-marketplace/certified-operators-hdm9c" Sep 30 08:23:55 crc kubenswrapper[4760]: I0930 08:23:55.373369 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xpfwd\" (UniqueName: \"kubernetes.io/projected/bd543cbe-b908-4e94-a7cb-6860bea4536e-kube-api-access-xpfwd\") pod \"certified-operators-hdm9c\" (UID: \"bd543cbe-b908-4e94-a7cb-6860bea4536e\") " pod="openshift-marketplace/certified-operators-hdm9c" Sep 30 08:23:55 crc kubenswrapper[4760]: I0930 08:23:55.465878 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-hdm9c" Sep 30 08:23:55 crc kubenswrapper[4760]: W0930 08:23:55.983140 4760 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbd543cbe_b908_4e94_a7cb_6860bea4536e.slice/crio-cda1eaa220399d461552a04937e22199893d8ac6b9a84381a211cd6fd07c65f5 WatchSource:0}: Error finding container cda1eaa220399d461552a04937e22199893d8ac6b9a84381a211cd6fd07c65f5: Status 404 returned error can't find the container with id cda1eaa220399d461552a04937e22199893d8ac6b9a84381a211cd6fd07c65f5 Sep 30 08:23:55 crc kubenswrapper[4760]: I0930 08:23:55.988339 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-hdm9c"] Sep 30 08:23:56 crc kubenswrapper[4760]: I0930 08:23:56.043738 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hdm9c" event={"ID":"bd543cbe-b908-4e94-a7cb-6860bea4536e","Type":"ContainerStarted","Data":"cda1eaa220399d461552a04937e22199893d8ac6b9a84381a211cd6fd07c65f5"} Sep 30 08:23:57 crc 
kubenswrapper[4760]: I0930 08:23:57.062278 4760 generic.go:334] "Generic (PLEG): container finished" podID="bd543cbe-b908-4e94-a7cb-6860bea4536e" containerID="d54d747b710362b7fb30c0722f513d25ef208975580784de72dffe5ccaa0c1eb" exitCode=0 Sep 30 08:23:57 crc kubenswrapper[4760]: I0930 08:23:57.062338 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hdm9c" event={"ID":"bd543cbe-b908-4e94-a7cb-6860bea4536e","Type":"ContainerDied","Data":"d54d747b710362b7fb30c0722f513d25ef208975580784de72dffe5ccaa0c1eb"} Sep 30 08:23:57 crc kubenswrapper[4760]: I0930 08:23:57.198557 4760 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-q9hpc" Sep 30 08:23:57 crc kubenswrapper[4760]: I0930 08:23:57.198920 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-q9hpc" Sep 30 08:23:57 crc kubenswrapper[4760]: I0930 08:23:57.268712 4760 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-q9hpc" Sep 30 08:23:58 crc kubenswrapper[4760]: I0930 08:23:58.130806 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-q9hpc" Sep 30 08:23:59 crc kubenswrapper[4760]: I0930 08:23:59.085411 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hdm9c" event={"ID":"bd543cbe-b908-4e94-a7cb-6860bea4536e","Type":"ContainerStarted","Data":"c94f5faf97a5cdfa73321e0371bc33175495941503ddce57d1f847aa97f976b2"} Sep 30 08:23:59 crc kubenswrapper[4760]: I0930 08:23:59.505461 4760 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-q9hpc"] Sep 30 08:24:00 crc kubenswrapper[4760]: I0930 08:24:00.067267 4760 scope.go:117] "RemoveContainer" containerID="ba9d6e9a08a463506562fa972ab309bc744574a4b27aa40630a148b30074629b" Sep 
30 08:24:00 crc kubenswrapper[4760]: E0930 08:24:00.067886 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f2lrk_openshift-machine-config-operator(7a9c8270-6964-4886-87d0-227b05b76da4)\"" pod="openshift-machine-config-operator/machine-config-daemon-f2lrk" podUID="7a9c8270-6964-4886-87d0-227b05b76da4" Sep 30 08:24:00 crc kubenswrapper[4760]: I0930 08:24:00.098907 4760 generic.go:334] "Generic (PLEG): container finished" podID="bd543cbe-b908-4e94-a7cb-6860bea4536e" containerID="c94f5faf97a5cdfa73321e0371bc33175495941503ddce57d1f847aa97f976b2" exitCode=0 Sep 30 08:24:00 crc kubenswrapper[4760]: I0930 08:24:00.099181 4760 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-q9hpc" podUID="8a58739c-607f-45e6-981f-d0ca367562a8" containerName="registry-server" containerID="cri-o://24e30e8f60be88969a621351a5d92e451973947e2e416fbe3ebbd2e16a89edfe" gracePeriod=2 Sep 30 08:24:00 crc kubenswrapper[4760]: I0930 08:24:00.100173 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hdm9c" event={"ID":"bd543cbe-b908-4e94-a7cb-6860bea4536e","Type":"ContainerDied","Data":"c94f5faf97a5cdfa73321e0371bc33175495941503ddce57d1f847aa97f976b2"} Sep 30 08:24:01 crc kubenswrapper[4760]: I0930 08:24:01.093652 4760 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-q9hpc" Sep 30 08:24:01 crc kubenswrapper[4760]: I0930 08:24:01.109620 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hdm9c" event={"ID":"bd543cbe-b908-4e94-a7cb-6860bea4536e","Type":"ContainerStarted","Data":"7f46c066ad400ff1038ad6fac1f3dbb5ca8892466c2ecc2abac42bea78f9af28"} Sep 30 08:24:01 crc kubenswrapper[4760]: I0930 08:24:01.114033 4760 generic.go:334] "Generic (PLEG): container finished" podID="8a58739c-607f-45e6-981f-d0ca367562a8" containerID="24e30e8f60be88969a621351a5d92e451973947e2e416fbe3ebbd2e16a89edfe" exitCode=0 Sep 30 08:24:01 crc kubenswrapper[4760]: I0930 08:24:01.114074 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-q9hpc" event={"ID":"8a58739c-607f-45e6-981f-d0ca367562a8","Type":"ContainerDied","Data":"24e30e8f60be88969a621351a5d92e451973947e2e416fbe3ebbd2e16a89edfe"} Sep 30 08:24:01 crc kubenswrapper[4760]: I0930 08:24:01.114100 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-q9hpc" event={"ID":"8a58739c-607f-45e6-981f-d0ca367562a8","Type":"ContainerDied","Data":"8529afbf812d8aef1f8b31d7412ef25196ce6742ac1c5448816651329f7341bd"} Sep 30 08:24:01 crc kubenswrapper[4760]: I0930 08:24:01.114117 4760 scope.go:117] "RemoveContainer" containerID="24e30e8f60be88969a621351a5d92e451973947e2e416fbe3ebbd2e16a89edfe" Sep 30 08:24:01 crc kubenswrapper[4760]: I0930 08:24:01.114366 4760 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-q9hpc" Sep 30 08:24:01 crc kubenswrapper[4760]: I0930 08:24:01.143320 4760 scope.go:117] "RemoveContainer" containerID="2a9187d5693ceec7a2d7a812a5b4eb1e0d5f63a554a1151e52affa7ac0e44f22" Sep 30 08:24:01 crc kubenswrapper[4760]: I0930 08:24:01.173444 4760 scope.go:117] "RemoveContainer" containerID="ea6b87112d6a52eb4686e16051024927d500946de48b9087f2022d071579b67b" Sep 30 08:24:01 crc kubenswrapper[4760]: I0930 08:24:01.177693 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8a58739c-607f-45e6-981f-d0ca367562a8-utilities\") pod \"8a58739c-607f-45e6-981f-d0ca367562a8\" (UID: \"8a58739c-607f-45e6-981f-d0ca367562a8\") " Sep 30 08:24:01 crc kubenswrapper[4760]: I0930 08:24:01.178150 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bktdb\" (UniqueName: \"kubernetes.io/projected/8a58739c-607f-45e6-981f-d0ca367562a8-kube-api-access-bktdb\") pod \"8a58739c-607f-45e6-981f-d0ca367562a8\" (UID: \"8a58739c-607f-45e6-981f-d0ca367562a8\") " Sep 30 08:24:01 crc kubenswrapper[4760]: I0930 08:24:01.178222 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8a58739c-607f-45e6-981f-d0ca367562a8-catalog-content\") pod \"8a58739c-607f-45e6-981f-d0ca367562a8\" (UID: \"8a58739c-607f-45e6-981f-d0ca367562a8\") " Sep 30 08:24:01 crc kubenswrapper[4760]: I0930 08:24:01.178891 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8a58739c-607f-45e6-981f-d0ca367562a8-utilities" (OuterVolumeSpecName: "utilities") pod "8a58739c-607f-45e6-981f-d0ca367562a8" (UID: "8a58739c-607f-45e6-981f-d0ca367562a8"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 08:24:01 crc kubenswrapper[4760]: I0930 08:24:01.184245 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8a58739c-607f-45e6-981f-d0ca367562a8-kube-api-access-bktdb" (OuterVolumeSpecName: "kube-api-access-bktdb") pod "8a58739c-607f-45e6-981f-d0ca367562a8" (UID: "8a58739c-607f-45e6-981f-d0ca367562a8"). InnerVolumeSpecName "kube-api-access-bktdb". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 08:24:01 crc kubenswrapper[4760]: I0930 08:24:01.223030 4760 scope.go:117] "RemoveContainer" containerID="24e30e8f60be88969a621351a5d92e451973947e2e416fbe3ebbd2e16a89edfe" Sep 30 08:24:01 crc kubenswrapper[4760]: I0930 08:24:01.228772 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8a58739c-607f-45e6-981f-d0ca367562a8-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "8a58739c-607f-45e6-981f-d0ca367562a8" (UID: "8a58739c-607f-45e6-981f-d0ca367562a8"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 08:24:01 crc kubenswrapper[4760]: E0930 08:24:01.230206 4760 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"24e30e8f60be88969a621351a5d92e451973947e2e416fbe3ebbd2e16a89edfe\": container with ID starting with 24e30e8f60be88969a621351a5d92e451973947e2e416fbe3ebbd2e16a89edfe not found: ID does not exist" containerID="24e30e8f60be88969a621351a5d92e451973947e2e416fbe3ebbd2e16a89edfe" Sep 30 08:24:01 crc kubenswrapper[4760]: I0930 08:24:01.230245 4760 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"24e30e8f60be88969a621351a5d92e451973947e2e416fbe3ebbd2e16a89edfe"} err="failed to get container status \"24e30e8f60be88969a621351a5d92e451973947e2e416fbe3ebbd2e16a89edfe\": rpc error: code = NotFound desc = could not find container \"24e30e8f60be88969a621351a5d92e451973947e2e416fbe3ebbd2e16a89edfe\": container with ID starting with 24e30e8f60be88969a621351a5d92e451973947e2e416fbe3ebbd2e16a89edfe not found: ID does not exist" Sep 30 08:24:01 crc kubenswrapper[4760]: I0930 08:24:01.230266 4760 scope.go:117] "RemoveContainer" containerID="2a9187d5693ceec7a2d7a812a5b4eb1e0d5f63a554a1151e52affa7ac0e44f22" Sep 30 08:24:01 crc kubenswrapper[4760]: E0930 08:24:01.231073 4760 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2a9187d5693ceec7a2d7a812a5b4eb1e0d5f63a554a1151e52affa7ac0e44f22\": container with ID starting with 2a9187d5693ceec7a2d7a812a5b4eb1e0d5f63a554a1151e52affa7ac0e44f22 not found: ID does not exist" containerID="2a9187d5693ceec7a2d7a812a5b4eb1e0d5f63a554a1151e52affa7ac0e44f22" Sep 30 08:24:01 crc kubenswrapper[4760]: I0930 08:24:01.231123 4760 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2a9187d5693ceec7a2d7a812a5b4eb1e0d5f63a554a1151e52affa7ac0e44f22"} 
err="failed to get container status \"2a9187d5693ceec7a2d7a812a5b4eb1e0d5f63a554a1151e52affa7ac0e44f22\": rpc error: code = NotFound desc = could not find container \"2a9187d5693ceec7a2d7a812a5b4eb1e0d5f63a554a1151e52affa7ac0e44f22\": container with ID starting with 2a9187d5693ceec7a2d7a812a5b4eb1e0d5f63a554a1151e52affa7ac0e44f22 not found: ID does not exist" Sep 30 08:24:01 crc kubenswrapper[4760]: I0930 08:24:01.231158 4760 scope.go:117] "RemoveContainer" containerID="ea6b87112d6a52eb4686e16051024927d500946de48b9087f2022d071579b67b" Sep 30 08:24:01 crc kubenswrapper[4760]: E0930 08:24:01.231611 4760 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ea6b87112d6a52eb4686e16051024927d500946de48b9087f2022d071579b67b\": container with ID starting with ea6b87112d6a52eb4686e16051024927d500946de48b9087f2022d071579b67b not found: ID does not exist" containerID="ea6b87112d6a52eb4686e16051024927d500946de48b9087f2022d071579b67b" Sep 30 08:24:01 crc kubenswrapper[4760]: I0930 08:24:01.231635 4760 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ea6b87112d6a52eb4686e16051024927d500946de48b9087f2022d071579b67b"} err="failed to get container status \"ea6b87112d6a52eb4686e16051024927d500946de48b9087f2022d071579b67b\": rpc error: code = NotFound desc = could not find container \"ea6b87112d6a52eb4686e16051024927d500946de48b9087f2022d071579b67b\": container with ID starting with ea6b87112d6a52eb4686e16051024927d500946de48b9087f2022d071579b67b not found: ID does not exist" Sep 30 08:24:01 crc kubenswrapper[4760]: I0930 08:24:01.280231 4760 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bktdb\" (UniqueName: \"kubernetes.io/projected/8a58739c-607f-45e6-981f-d0ca367562a8-kube-api-access-bktdb\") on node \"crc\" DevicePath \"\"" Sep 30 08:24:01 crc kubenswrapper[4760]: I0930 08:24:01.280266 4760 reconciler_common.go:293] "Volume detached for 
volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8a58739c-607f-45e6-981f-d0ca367562a8-catalog-content\") on node \"crc\" DevicePath \"\"" Sep 30 08:24:01 crc kubenswrapper[4760]: I0930 08:24:01.280275 4760 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8a58739c-607f-45e6-981f-d0ca367562a8-utilities\") on node \"crc\" DevicePath \"\"" Sep 30 08:24:01 crc kubenswrapper[4760]: I0930 08:24:01.445910 4760 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-hdm9c" podStartSLOduration=3.023887995 podStartE2EDuration="6.445892306s" podCreationTimestamp="2025-09-30 08:23:55 +0000 UTC" firstStartedPulling="2025-09-30 08:23:57.064454679 +0000 UTC m=+3022.707361111" lastFinishedPulling="2025-09-30 08:24:00.48645901 +0000 UTC m=+3026.129365422" observedRunningTime="2025-09-30 08:24:01.135265556 +0000 UTC m=+3026.778171978" watchObservedRunningTime="2025-09-30 08:24:01.445892306 +0000 UTC m=+3027.088798718" Sep 30 08:24:01 crc kubenswrapper[4760]: I0930 08:24:01.448907 4760 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-q9hpc"] Sep 30 08:24:01 crc kubenswrapper[4760]: I0930 08:24:01.457079 4760 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-q9hpc"] Sep 30 08:24:03 crc kubenswrapper[4760]: I0930 08:24:03.087865 4760 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8a58739c-607f-45e6-981f-d0ca367562a8" path="/var/lib/kubelet/pods/8a58739c-607f-45e6-981f-d0ca367562a8/volumes" Sep 30 08:24:05 crc kubenswrapper[4760]: I0930 08:24:05.466232 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-hdm9c" Sep 30 08:24:05 crc kubenswrapper[4760]: I0930 08:24:05.466918 4760 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openshift-marketplace/certified-operators-hdm9c" Sep 30 08:24:05 crc kubenswrapper[4760]: I0930 08:24:05.520734 4760 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-hdm9c" Sep 30 08:24:06 crc kubenswrapper[4760]: I0930 08:24:06.268857 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-hdm9c" Sep 30 08:24:06 crc kubenswrapper[4760]: I0930 08:24:06.321604 4760 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-hdm9c"] Sep 30 08:24:08 crc kubenswrapper[4760]: I0930 08:24:08.221503 4760 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-hdm9c" podUID="bd543cbe-b908-4e94-a7cb-6860bea4536e" containerName="registry-server" containerID="cri-o://7f46c066ad400ff1038ad6fac1f3dbb5ca8892466c2ecc2abac42bea78f9af28" gracePeriod=2 Sep 30 08:24:09 crc kubenswrapper[4760]: I0930 08:24:09.238207 4760 generic.go:334] "Generic (PLEG): container finished" podID="bd543cbe-b908-4e94-a7cb-6860bea4536e" containerID="7f46c066ad400ff1038ad6fac1f3dbb5ca8892466c2ecc2abac42bea78f9af28" exitCode=0 Sep 30 08:24:09 crc kubenswrapper[4760]: I0930 08:24:09.238438 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hdm9c" event={"ID":"bd543cbe-b908-4e94-a7cb-6860bea4536e","Type":"ContainerDied","Data":"7f46c066ad400ff1038ad6fac1f3dbb5ca8892466c2ecc2abac42bea78f9af28"} Sep 30 08:24:09 crc kubenswrapper[4760]: I0930 08:24:09.396565 4760 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-hdm9c" Sep 30 08:24:09 crc kubenswrapper[4760]: I0930 08:24:09.565886 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xpfwd\" (UniqueName: \"kubernetes.io/projected/bd543cbe-b908-4e94-a7cb-6860bea4536e-kube-api-access-xpfwd\") pod \"bd543cbe-b908-4e94-a7cb-6860bea4536e\" (UID: \"bd543cbe-b908-4e94-a7cb-6860bea4536e\") " Sep 30 08:24:09 crc kubenswrapper[4760]: I0930 08:24:09.566102 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bd543cbe-b908-4e94-a7cb-6860bea4536e-utilities\") pod \"bd543cbe-b908-4e94-a7cb-6860bea4536e\" (UID: \"bd543cbe-b908-4e94-a7cb-6860bea4536e\") " Sep 30 08:24:09 crc kubenswrapper[4760]: I0930 08:24:09.566181 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bd543cbe-b908-4e94-a7cb-6860bea4536e-catalog-content\") pod \"bd543cbe-b908-4e94-a7cb-6860bea4536e\" (UID: \"bd543cbe-b908-4e94-a7cb-6860bea4536e\") " Sep 30 08:24:09 crc kubenswrapper[4760]: I0930 08:24:09.567080 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bd543cbe-b908-4e94-a7cb-6860bea4536e-utilities" (OuterVolumeSpecName: "utilities") pod "bd543cbe-b908-4e94-a7cb-6860bea4536e" (UID: "bd543cbe-b908-4e94-a7cb-6860bea4536e"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 08:24:09 crc kubenswrapper[4760]: I0930 08:24:09.572520 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd543cbe-b908-4e94-a7cb-6860bea4536e-kube-api-access-xpfwd" (OuterVolumeSpecName: "kube-api-access-xpfwd") pod "bd543cbe-b908-4e94-a7cb-6860bea4536e" (UID: "bd543cbe-b908-4e94-a7cb-6860bea4536e"). InnerVolumeSpecName "kube-api-access-xpfwd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 08:24:09 crc kubenswrapper[4760]: I0930 08:24:09.619615 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bd543cbe-b908-4e94-a7cb-6860bea4536e-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "bd543cbe-b908-4e94-a7cb-6860bea4536e" (UID: "bd543cbe-b908-4e94-a7cb-6860bea4536e"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 08:24:09 crc kubenswrapper[4760]: I0930 08:24:09.669052 4760 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bd543cbe-b908-4e94-a7cb-6860bea4536e-utilities\") on node \"crc\" DevicePath \"\"" Sep 30 08:24:09 crc kubenswrapper[4760]: I0930 08:24:09.669090 4760 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bd543cbe-b908-4e94-a7cb-6860bea4536e-catalog-content\") on node \"crc\" DevicePath \"\"" Sep 30 08:24:09 crc kubenswrapper[4760]: I0930 08:24:09.669127 4760 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xpfwd\" (UniqueName: \"kubernetes.io/projected/bd543cbe-b908-4e94-a7cb-6860bea4536e-kube-api-access-xpfwd\") on node \"crc\" DevicePath \"\"" Sep 30 08:24:10 crc kubenswrapper[4760]: I0930 08:24:10.247795 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hdm9c" event={"ID":"bd543cbe-b908-4e94-a7cb-6860bea4536e","Type":"ContainerDied","Data":"cda1eaa220399d461552a04937e22199893d8ac6b9a84381a211cd6fd07c65f5"} Sep 30 08:24:10 crc kubenswrapper[4760]: I0930 08:24:10.248113 4760 scope.go:117] "RemoveContainer" containerID="7f46c066ad400ff1038ad6fac1f3dbb5ca8892466c2ecc2abac42bea78f9af28" Sep 30 08:24:10 crc kubenswrapper[4760]: I0930 08:24:10.247855 4760 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-hdm9c" Sep 30 08:24:10 crc kubenswrapper[4760]: I0930 08:24:10.269765 4760 scope.go:117] "RemoveContainer" containerID="c94f5faf97a5cdfa73321e0371bc33175495941503ddce57d1f847aa97f976b2" Sep 30 08:24:10 crc kubenswrapper[4760]: I0930 08:24:10.277797 4760 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-hdm9c"] Sep 30 08:24:10 crc kubenswrapper[4760]: I0930 08:24:10.286372 4760 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-hdm9c"] Sep 30 08:24:10 crc kubenswrapper[4760]: I0930 08:24:10.309711 4760 scope.go:117] "RemoveContainer" containerID="d54d747b710362b7fb30c0722f513d25ef208975580784de72dffe5ccaa0c1eb" Sep 30 08:24:11 crc kubenswrapper[4760]: I0930 08:24:11.086867 4760 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bd543cbe-b908-4e94-a7cb-6860bea4536e" path="/var/lib/kubelet/pods/bd543cbe-b908-4e94-a7cb-6860bea4536e/volumes" Sep 30 08:24:13 crc kubenswrapper[4760]: I0930 08:24:13.073639 4760 scope.go:117] "RemoveContainer" containerID="ba9d6e9a08a463506562fa972ab309bc744574a4b27aa40630a148b30074629b" Sep 30 08:24:13 crc kubenswrapper[4760]: E0930 08:24:13.074482 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f2lrk_openshift-machine-config-operator(7a9c8270-6964-4886-87d0-227b05b76da4)\"" pod="openshift-machine-config-operator/machine-config-daemon-f2lrk" podUID="7a9c8270-6964-4886-87d0-227b05b76da4" Sep 30 08:24:24 crc kubenswrapper[4760]: I0930 08:24:24.067399 4760 scope.go:117] "RemoveContainer" containerID="ba9d6e9a08a463506562fa972ab309bc744574a4b27aa40630a148b30074629b" Sep 30 08:24:24 crc kubenswrapper[4760]: E0930 08:24:24.068449 4760 pod_workers.go:1301] "Error syncing pod, 
skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f2lrk_openshift-machine-config-operator(7a9c8270-6964-4886-87d0-227b05b76da4)\"" pod="openshift-machine-config-operator/machine-config-daemon-f2lrk" podUID="7a9c8270-6964-4886-87d0-227b05b76da4" Sep 30 08:24:37 crc kubenswrapper[4760]: I0930 08:24:37.067423 4760 scope.go:117] "RemoveContainer" containerID="ba9d6e9a08a463506562fa972ab309bc744574a4b27aa40630a148b30074629b" Sep 30 08:24:37 crc kubenswrapper[4760]: E0930 08:24:37.068150 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f2lrk_openshift-machine-config-operator(7a9c8270-6964-4886-87d0-227b05b76da4)\"" pod="openshift-machine-config-operator/machine-config-daemon-f2lrk" podUID="7a9c8270-6964-4886-87d0-227b05b76da4" Sep 30 08:24:51 crc kubenswrapper[4760]: I0930 08:24:51.067396 4760 scope.go:117] "RemoveContainer" containerID="ba9d6e9a08a463506562fa972ab309bc744574a4b27aa40630a148b30074629b" Sep 30 08:24:51 crc kubenswrapper[4760]: E0930 08:24:51.068400 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f2lrk_openshift-machine-config-operator(7a9c8270-6964-4886-87d0-227b05b76da4)\"" pod="openshift-machine-config-operator/machine-config-daemon-f2lrk" podUID="7a9c8270-6964-4886-87d0-227b05b76da4" Sep 30 08:25:05 crc kubenswrapper[4760]: I0930 08:25:05.074198 4760 scope.go:117] "RemoveContainer" containerID="ba9d6e9a08a463506562fa972ab309bc744574a4b27aa40630a148b30074629b" Sep 30 08:25:05 crc kubenswrapper[4760]: E0930 08:25:05.075257 4760 pod_workers.go:1301] 
"Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f2lrk_openshift-machine-config-operator(7a9c8270-6964-4886-87d0-227b05b76da4)\"" pod="openshift-machine-config-operator/machine-config-daemon-f2lrk" podUID="7a9c8270-6964-4886-87d0-227b05b76da4" Sep 30 08:25:18 crc kubenswrapper[4760]: I0930 08:25:18.067684 4760 scope.go:117] "RemoveContainer" containerID="ba9d6e9a08a463506562fa972ab309bc744574a4b27aa40630a148b30074629b" Sep 30 08:25:18 crc kubenswrapper[4760]: E0930 08:25:18.068880 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f2lrk_openshift-machine-config-operator(7a9c8270-6964-4886-87d0-227b05b76da4)\"" pod="openshift-machine-config-operator/machine-config-daemon-f2lrk" podUID="7a9c8270-6964-4886-87d0-227b05b76da4" Sep 30 08:25:20 crc kubenswrapper[4760]: I0930 08:25:20.561996 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-dmft9"] Sep 30 08:25:20 crc kubenswrapper[4760]: E0930 08:25:20.562878 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bd543cbe-b908-4e94-a7cb-6860bea4536e" containerName="extract-content" Sep 30 08:25:20 crc kubenswrapper[4760]: I0930 08:25:20.562903 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="bd543cbe-b908-4e94-a7cb-6860bea4536e" containerName="extract-content" Sep 30 08:25:20 crc kubenswrapper[4760]: E0930 08:25:20.562933 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bd543cbe-b908-4e94-a7cb-6860bea4536e" containerName="registry-server" Sep 30 08:25:20 crc kubenswrapper[4760]: I0930 08:25:20.562944 4760 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="bd543cbe-b908-4e94-a7cb-6860bea4536e" containerName="registry-server" Sep 30 08:25:20 crc kubenswrapper[4760]: E0930 08:25:20.562959 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bd543cbe-b908-4e94-a7cb-6860bea4536e" containerName="extract-utilities" Sep 30 08:25:20 crc kubenswrapper[4760]: I0930 08:25:20.562971 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="bd543cbe-b908-4e94-a7cb-6860bea4536e" containerName="extract-utilities" Sep 30 08:25:20 crc kubenswrapper[4760]: E0930 08:25:20.563015 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8a58739c-607f-45e6-981f-d0ca367562a8" containerName="extract-content" Sep 30 08:25:20 crc kubenswrapper[4760]: I0930 08:25:20.563029 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="8a58739c-607f-45e6-981f-d0ca367562a8" containerName="extract-content" Sep 30 08:25:20 crc kubenswrapper[4760]: E0930 08:25:20.563049 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8a58739c-607f-45e6-981f-d0ca367562a8" containerName="registry-server" Sep 30 08:25:20 crc kubenswrapper[4760]: I0930 08:25:20.563059 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="8a58739c-607f-45e6-981f-d0ca367562a8" containerName="registry-server" Sep 30 08:25:20 crc kubenswrapper[4760]: E0930 08:25:20.563086 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8a58739c-607f-45e6-981f-d0ca367562a8" containerName="extract-utilities" Sep 30 08:25:20 crc kubenswrapper[4760]: I0930 08:25:20.563097 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="8a58739c-607f-45e6-981f-d0ca367562a8" containerName="extract-utilities" Sep 30 08:25:20 crc kubenswrapper[4760]: I0930 08:25:20.563469 4760 memory_manager.go:354] "RemoveStaleState removing state" podUID="8a58739c-607f-45e6-981f-d0ca367562a8" containerName="registry-server" Sep 30 08:25:20 crc kubenswrapper[4760]: I0930 08:25:20.563507 4760 memory_manager.go:354] "RemoveStaleState removing 
state" podUID="bd543cbe-b908-4e94-a7cb-6860bea4536e" containerName="registry-server" Sep 30 08:25:20 crc kubenswrapper[4760]: I0930 08:25:20.565804 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-dmft9" Sep 30 08:25:20 crc kubenswrapper[4760]: I0930 08:25:20.579403 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-dmft9"] Sep 30 08:25:20 crc kubenswrapper[4760]: I0930 08:25:20.728355 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bqq4m\" (UniqueName: \"kubernetes.io/projected/923055e3-c184-49fd-a4c3-35f3e4f21e7f-kube-api-access-bqq4m\") pod \"redhat-marketplace-dmft9\" (UID: \"923055e3-c184-49fd-a4c3-35f3e4f21e7f\") " pod="openshift-marketplace/redhat-marketplace-dmft9" Sep 30 08:25:20 crc kubenswrapper[4760]: I0930 08:25:20.728751 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/923055e3-c184-49fd-a4c3-35f3e4f21e7f-utilities\") pod \"redhat-marketplace-dmft9\" (UID: \"923055e3-c184-49fd-a4c3-35f3e4f21e7f\") " pod="openshift-marketplace/redhat-marketplace-dmft9" Sep 30 08:25:20 crc kubenswrapper[4760]: I0930 08:25:20.728855 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/923055e3-c184-49fd-a4c3-35f3e4f21e7f-catalog-content\") pod \"redhat-marketplace-dmft9\" (UID: \"923055e3-c184-49fd-a4c3-35f3e4f21e7f\") " pod="openshift-marketplace/redhat-marketplace-dmft9" Sep 30 08:25:20 crc kubenswrapper[4760]: I0930 08:25:20.830376 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/923055e3-c184-49fd-a4c3-35f3e4f21e7f-catalog-content\") pod \"redhat-marketplace-dmft9\" (UID: 
\"923055e3-c184-49fd-a4c3-35f3e4f21e7f\") " pod="openshift-marketplace/redhat-marketplace-dmft9" Sep 30 08:25:20 crc kubenswrapper[4760]: I0930 08:25:20.830512 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bqq4m\" (UniqueName: \"kubernetes.io/projected/923055e3-c184-49fd-a4c3-35f3e4f21e7f-kube-api-access-bqq4m\") pod \"redhat-marketplace-dmft9\" (UID: \"923055e3-c184-49fd-a4c3-35f3e4f21e7f\") " pod="openshift-marketplace/redhat-marketplace-dmft9" Sep 30 08:25:20 crc kubenswrapper[4760]: I0930 08:25:20.830586 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/923055e3-c184-49fd-a4c3-35f3e4f21e7f-utilities\") pod \"redhat-marketplace-dmft9\" (UID: \"923055e3-c184-49fd-a4c3-35f3e4f21e7f\") " pod="openshift-marketplace/redhat-marketplace-dmft9" Sep 30 08:25:20 crc kubenswrapper[4760]: I0930 08:25:20.830912 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/923055e3-c184-49fd-a4c3-35f3e4f21e7f-catalog-content\") pod \"redhat-marketplace-dmft9\" (UID: \"923055e3-c184-49fd-a4c3-35f3e4f21e7f\") " pod="openshift-marketplace/redhat-marketplace-dmft9" Sep 30 08:25:20 crc kubenswrapper[4760]: I0930 08:25:20.831005 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/923055e3-c184-49fd-a4c3-35f3e4f21e7f-utilities\") pod \"redhat-marketplace-dmft9\" (UID: \"923055e3-c184-49fd-a4c3-35f3e4f21e7f\") " pod="openshift-marketplace/redhat-marketplace-dmft9" Sep 30 08:25:20 crc kubenswrapper[4760]: I0930 08:25:20.853417 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bqq4m\" (UniqueName: \"kubernetes.io/projected/923055e3-c184-49fd-a4c3-35f3e4f21e7f-kube-api-access-bqq4m\") pod \"redhat-marketplace-dmft9\" (UID: \"923055e3-c184-49fd-a4c3-35f3e4f21e7f\") " 
pod="openshift-marketplace/redhat-marketplace-dmft9" Sep 30 08:25:20 crc kubenswrapper[4760]: I0930 08:25:20.888421 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-dmft9" Sep 30 08:25:21 crc kubenswrapper[4760]: I0930 08:25:21.339643 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-dmft9"] Sep 30 08:25:22 crc kubenswrapper[4760]: I0930 08:25:22.091334 4760 generic.go:334] "Generic (PLEG): container finished" podID="923055e3-c184-49fd-a4c3-35f3e4f21e7f" containerID="cfd8ed88490937789ee4bb8809b236c14c7651b6f5842de938617828fa9140c7" exitCode=0 Sep 30 08:25:22 crc kubenswrapper[4760]: I0930 08:25:22.091478 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-dmft9" event={"ID":"923055e3-c184-49fd-a4c3-35f3e4f21e7f","Type":"ContainerDied","Data":"cfd8ed88490937789ee4bb8809b236c14c7651b6f5842de938617828fa9140c7"} Sep 30 08:25:22 crc kubenswrapper[4760]: I0930 08:25:22.092579 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-dmft9" event={"ID":"923055e3-c184-49fd-a4c3-35f3e4f21e7f","Type":"ContainerStarted","Data":"7422b298c85b435600b873ccbe7184be015578c0917f2eafafe7b95239cd6610"} Sep 30 08:25:23 crc kubenswrapper[4760]: I0930 08:25:23.107610 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-dmft9" event={"ID":"923055e3-c184-49fd-a4c3-35f3e4f21e7f","Type":"ContainerStarted","Data":"ef50217a4b81b05ed83faea9fd87b9eba5d203195342df862298887c142d140b"} Sep 30 08:25:24 crc kubenswrapper[4760]: I0930 08:25:24.123113 4760 generic.go:334] "Generic (PLEG): container finished" podID="923055e3-c184-49fd-a4c3-35f3e4f21e7f" containerID="ef50217a4b81b05ed83faea9fd87b9eba5d203195342df862298887c142d140b" exitCode=0 Sep 30 08:25:24 crc kubenswrapper[4760]: I0930 08:25:24.123377 4760 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-marketplace/redhat-marketplace-dmft9" event={"ID":"923055e3-c184-49fd-a4c3-35f3e4f21e7f","Type":"ContainerDied","Data":"ef50217a4b81b05ed83faea9fd87b9eba5d203195342df862298887c142d140b"} Sep 30 08:25:25 crc kubenswrapper[4760]: I0930 08:25:25.136075 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-dmft9" event={"ID":"923055e3-c184-49fd-a4c3-35f3e4f21e7f","Type":"ContainerStarted","Data":"5a24ab59b59c6e635e421ca043d145af340644edb86e6e91152d06e218911506"} Sep 30 08:25:25 crc kubenswrapper[4760]: I0930 08:25:25.154596 4760 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-dmft9" podStartSLOduration=2.63295506 podStartE2EDuration="5.154576562s" podCreationTimestamp="2025-09-30 08:25:20 +0000 UTC" firstStartedPulling="2025-09-30 08:25:22.093407331 +0000 UTC m=+3107.736313743" lastFinishedPulling="2025-09-30 08:25:24.615028813 +0000 UTC m=+3110.257935245" observedRunningTime="2025-09-30 08:25:25.150050956 +0000 UTC m=+3110.792957388" watchObservedRunningTime="2025-09-30 08:25:25.154576562 +0000 UTC m=+3110.797482994" Sep 30 08:25:30 crc kubenswrapper[4760]: I0930 08:25:30.889238 4760 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-dmft9" Sep 30 08:25:30 crc kubenswrapper[4760]: I0930 08:25:30.889917 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-dmft9" Sep 30 08:25:30 crc kubenswrapper[4760]: I0930 08:25:30.956022 4760 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-dmft9" Sep 30 08:25:31 crc kubenswrapper[4760]: I0930 08:25:31.272965 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-dmft9" Sep 30 08:25:31 crc kubenswrapper[4760]: I0930 08:25:31.343228 4760 
kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-dmft9"] Sep 30 08:25:32 crc kubenswrapper[4760]: I0930 08:25:32.067419 4760 scope.go:117] "RemoveContainer" containerID="ba9d6e9a08a463506562fa972ab309bc744574a4b27aa40630a148b30074629b" Sep 30 08:25:32 crc kubenswrapper[4760]: E0930 08:25:32.068429 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f2lrk_openshift-machine-config-operator(7a9c8270-6964-4886-87d0-227b05b76da4)\"" pod="openshift-machine-config-operator/machine-config-daemon-f2lrk" podUID="7a9c8270-6964-4886-87d0-227b05b76da4" Sep 30 08:25:33 crc kubenswrapper[4760]: I0930 08:25:33.250742 4760 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-dmft9" podUID="923055e3-c184-49fd-a4c3-35f3e4f21e7f" containerName="registry-server" containerID="cri-o://5a24ab59b59c6e635e421ca043d145af340644edb86e6e91152d06e218911506" gracePeriod=2 Sep 30 08:25:33 crc kubenswrapper[4760]: I0930 08:25:33.854054 4760 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-dmft9" Sep 30 08:25:34 crc kubenswrapper[4760]: I0930 08:25:34.008470 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bqq4m\" (UniqueName: \"kubernetes.io/projected/923055e3-c184-49fd-a4c3-35f3e4f21e7f-kube-api-access-bqq4m\") pod \"923055e3-c184-49fd-a4c3-35f3e4f21e7f\" (UID: \"923055e3-c184-49fd-a4c3-35f3e4f21e7f\") " Sep 30 08:25:34 crc kubenswrapper[4760]: I0930 08:25:34.008556 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/923055e3-c184-49fd-a4c3-35f3e4f21e7f-catalog-content\") pod \"923055e3-c184-49fd-a4c3-35f3e4f21e7f\" (UID: \"923055e3-c184-49fd-a4c3-35f3e4f21e7f\") " Sep 30 08:25:34 crc kubenswrapper[4760]: I0930 08:25:34.008655 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/923055e3-c184-49fd-a4c3-35f3e4f21e7f-utilities\") pod \"923055e3-c184-49fd-a4c3-35f3e4f21e7f\" (UID: \"923055e3-c184-49fd-a4c3-35f3e4f21e7f\") " Sep 30 08:25:34 crc kubenswrapper[4760]: I0930 08:25:34.009961 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/923055e3-c184-49fd-a4c3-35f3e4f21e7f-utilities" (OuterVolumeSpecName: "utilities") pod "923055e3-c184-49fd-a4c3-35f3e4f21e7f" (UID: "923055e3-c184-49fd-a4c3-35f3e4f21e7f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 08:25:34 crc kubenswrapper[4760]: I0930 08:25:34.015773 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/923055e3-c184-49fd-a4c3-35f3e4f21e7f-kube-api-access-bqq4m" (OuterVolumeSpecName: "kube-api-access-bqq4m") pod "923055e3-c184-49fd-a4c3-35f3e4f21e7f" (UID: "923055e3-c184-49fd-a4c3-35f3e4f21e7f"). InnerVolumeSpecName "kube-api-access-bqq4m". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 08:25:34 crc kubenswrapper[4760]: I0930 08:25:34.027438 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/923055e3-c184-49fd-a4c3-35f3e4f21e7f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "923055e3-c184-49fd-a4c3-35f3e4f21e7f" (UID: "923055e3-c184-49fd-a4c3-35f3e4f21e7f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 08:25:34 crc kubenswrapper[4760]: I0930 08:25:34.111702 4760 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bqq4m\" (UniqueName: \"kubernetes.io/projected/923055e3-c184-49fd-a4c3-35f3e4f21e7f-kube-api-access-bqq4m\") on node \"crc\" DevicePath \"\"" Sep 30 08:25:34 crc kubenswrapper[4760]: I0930 08:25:34.111759 4760 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/923055e3-c184-49fd-a4c3-35f3e4f21e7f-catalog-content\") on node \"crc\" DevicePath \"\"" Sep 30 08:25:34 crc kubenswrapper[4760]: I0930 08:25:34.111780 4760 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/923055e3-c184-49fd-a4c3-35f3e4f21e7f-utilities\") on node \"crc\" DevicePath \"\"" Sep 30 08:25:34 crc kubenswrapper[4760]: I0930 08:25:34.266952 4760 generic.go:334] "Generic (PLEG): container finished" podID="923055e3-c184-49fd-a4c3-35f3e4f21e7f" containerID="5a24ab59b59c6e635e421ca043d145af340644edb86e6e91152d06e218911506" exitCode=0 Sep 30 08:25:34 crc kubenswrapper[4760]: I0930 08:25:34.267053 4760 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-dmft9" Sep 30 08:25:34 crc kubenswrapper[4760]: I0930 08:25:34.267048 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-dmft9" event={"ID":"923055e3-c184-49fd-a4c3-35f3e4f21e7f","Type":"ContainerDied","Data":"5a24ab59b59c6e635e421ca043d145af340644edb86e6e91152d06e218911506"} Sep 30 08:25:34 crc kubenswrapper[4760]: I0930 08:25:34.268868 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-dmft9" event={"ID":"923055e3-c184-49fd-a4c3-35f3e4f21e7f","Type":"ContainerDied","Data":"7422b298c85b435600b873ccbe7184be015578c0917f2eafafe7b95239cd6610"} Sep 30 08:25:34 crc kubenswrapper[4760]: I0930 08:25:34.268902 4760 scope.go:117] "RemoveContainer" containerID="5a24ab59b59c6e635e421ca043d145af340644edb86e6e91152d06e218911506" Sep 30 08:25:34 crc kubenswrapper[4760]: I0930 08:25:34.304565 4760 scope.go:117] "RemoveContainer" containerID="ef50217a4b81b05ed83faea9fd87b9eba5d203195342df862298887c142d140b" Sep 30 08:25:34 crc kubenswrapper[4760]: I0930 08:25:34.314979 4760 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-dmft9"] Sep 30 08:25:34 crc kubenswrapper[4760]: I0930 08:25:34.334701 4760 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-dmft9"] Sep 30 08:25:34 crc kubenswrapper[4760]: I0930 08:25:34.343298 4760 scope.go:117] "RemoveContainer" containerID="cfd8ed88490937789ee4bb8809b236c14c7651b6f5842de938617828fa9140c7" Sep 30 08:25:34 crc kubenswrapper[4760]: I0930 08:25:34.387014 4760 scope.go:117] "RemoveContainer" containerID="5a24ab59b59c6e635e421ca043d145af340644edb86e6e91152d06e218911506" Sep 30 08:25:34 crc kubenswrapper[4760]: E0930 08:25:34.387684 4760 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"5a24ab59b59c6e635e421ca043d145af340644edb86e6e91152d06e218911506\": container with ID starting with 5a24ab59b59c6e635e421ca043d145af340644edb86e6e91152d06e218911506 not found: ID does not exist" containerID="5a24ab59b59c6e635e421ca043d145af340644edb86e6e91152d06e218911506" Sep 30 08:25:34 crc kubenswrapper[4760]: I0930 08:25:34.387796 4760 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5a24ab59b59c6e635e421ca043d145af340644edb86e6e91152d06e218911506"} err="failed to get container status \"5a24ab59b59c6e635e421ca043d145af340644edb86e6e91152d06e218911506\": rpc error: code = NotFound desc = could not find container \"5a24ab59b59c6e635e421ca043d145af340644edb86e6e91152d06e218911506\": container with ID starting with 5a24ab59b59c6e635e421ca043d145af340644edb86e6e91152d06e218911506 not found: ID does not exist" Sep 30 08:25:34 crc kubenswrapper[4760]: I0930 08:25:34.387843 4760 scope.go:117] "RemoveContainer" containerID="ef50217a4b81b05ed83faea9fd87b9eba5d203195342df862298887c142d140b" Sep 30 08:25:34 crc kubenswrapper[4760]: E0930 08:25:34.388409 4760 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ef50217a4b81b05ed83faea9fd87b9eba5d203195342df862298887c142d140b\": container with ID starting with ef50217a4b81b05ed83faea9fd87b9eba5d203195342df862298887c142d140b not found: ID does not exist" containerID="ef50217a4b81b05ed83faea9fd87b9eba5d203195342df862298887c142d140b" Sep 30 08:25:34 crc kubenswrapper[4760]: I0930 08:25:34.388461 4760 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ef50217a4b81b05ed83faea9fd87b9eba5d203195342df862298887c142d140b"} err="failed to get container status \"ef50217a4b81b05ed83faea9fd87b9eba5d203195342df862298887c142d140b\": rpc error: code = NotFound desc = could not find container \"ef50217a4b81b05ed83faea9fd87b9eba5d203195342df862298887c142d140b\": container with ID 
starting with ef50217a4b81b05ed83faea9fd87b9eba5d203195342df862298887c142d140b not found: ID does not exist" Sep 30 08:25:34 crc kubenswrapper[4760]: I0930 08:25:34.388487 4760 scope.go:117] "RemoveContainer" containerID="cfd8ed88490937789ee4bb8809b236c14c7651b6f5842de938617828fa9140c7" Sep 30 08:25:34 crc kubenswrapper[4760]: E0930 08:25:34.388896 4760 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cfd8ed88490937789ee4bb8809b236c14c7651b6f5842de938617828fa9140c7\": container with ID starting with cfd8ed88490937789ee4bb8809b236c14c7651b6f5842de938617828fa9140c7 not found: ID does not exist" containerID="cfd8ed88490937789ee4bb8809b236c14c7651b6f5842de938617828fa9140c7" Sep 30 08:25:34 crc kubenswrapper[4760]: I0930 08:25:34.388940 4760 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cfd8ed88490937789ee4bb8809b236c14c7651b6f5842de938617828fa9140c7"} err="failed to get container status \"cfd8ed88490937789ee4bb8809b236c14c7651b6f5842de938617828fa9140c7\": rpc error: code = NotFound desc = could not find container \"cfd8ed88490937789ee4bb8809b236c14c7651b6f5842de938617828fa9140c7\": container with ID starting with cfd8ed88490937789ee4bb8809b236c14c7651b6f5842de938617828fa9140c7 not found: ID does not exist" Sep 30 08:25:35 crc kubenswrapper[4760]: I0930 08:25:35.081917 4760 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="923055e3-c184-49fd-a4c3-35f3e4f21e7f" path="/var/lib/kubelet/pods/923055e3-c184-49fd-a4c3-35f3e4f21e7f/volumes" Sep 30 08:25:44 crc kubenswrapper[4760]: I0930 08:25:44.067579 4760 scope.go:117] "RemoveContainer" containerID="ba9d6e9a08a463506562fa972ab309bc744574a4b27aa40630a148b30074629b" Sep 30 08:25:44 crc kubenswrapper[4760]: E0930 08:25:44.068565 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s 
restarting failed container=machine-config-daemon pod=machine-config-daemon-f2lrk_openshift-machine-config-operator(7a9c8270-6964-4886-87d0-227b05b76da4)\"" pod="openshift-machine-config-operator/machine-config-daemon-f2lrk" podUID="7a9c8270-6964-4886-87d0-227b05b76da4" Sep 30 08:25:56 crc kubenswrapper[4760]: I0930 08:25:56.067789 4760 scope.go:117] "RemoveContainer" containerID="ba9d6e9a08a463506562fa972ab309bc744574a4b27aa40630a148b30074629b" Sep 30 08:25:56 crc kubenswrapper[4760]: E0930 08:25:56.068804 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f2lrk_openshift-machine-config-operator(7a9c8270-6964-4886-87d0-227b05b76da4)\"" pod="openshift-machine-config-operator/machine-config-daemon-f2lrk" podUID="7a9c8270-6964-4886-87d0-227b05b76da4" Sep 30 08:26:07 crc kubenswrapper[4760]: I0930 08:26:07.066765 4760 scope.go:117] "RemoveContainer" containerID="ba9d6e9a08a463506562fa972ab309bc744574a4b27aa40630a148b30074629b" Sep 30 08:26:07 crc kubenswrapper[4760]: E0930 08:26:07.067479 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f2lrk_openshift-machine-config-operator(7a9c8270-6964-4886-87d0-227b05b76da4)\"" pod="openshift-machine-config-operator/machine-config-daemon-f2lrk" podUID="7a9c8270-6964-4886-87d0-227b05b76da4" Sep 30 08:26:20 crc kubenswrapper[4760]: I0930 08:26:20.066989 4760 scope.go:117] "RemoveContainer" containerID="ba9d6e9a08a463506562fa972ab309bc744574a4b27aa40630a148b30074629b" Sep 30 08:26:20 crc kubenswrapper[4760]: I0930 08:26:20.818974 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-f2lrk" 
event={"ID":"7a9c8270-6964-4886-87d0-227b05b76da4","Type":"ContainerStarted","Data":"6c5721bea825ec29851d8ca03da59ebad0d536540d36ce6816d3f5ca179ef469"} Sep 30 08:26:50 crc kubenswrapper[4760]: E0930 08:26:50.501146 4760 upgradeaware.go:427] Error proxying data from client to backend: readfrom tcp 38.102.83.201:56932->38.102.83.201:37703: write tcp 38.102.83.201:56932->38.102.83.201:37703: write: broken pipe Sep 30 08:28:49 crc kubenswrapper[4760]: I0930 08:28:49.112993 4760 patch_prober.go:28] interesting pod/machine-config-daemon-f2lrk container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 08:28:49 crc kubenswrapper[4760]: I0930 08:28:49.113595 4760 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-f2lrk" podUID="7a9c8270-6964-4886-87d0-227b05b76da4" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 30 08:29:19 crc kubenswrapper[4760]: I0930 08:29:19.113665 4760 patch_prober.go:28] interesting pod/machine-config-daemon-f2lrk container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 08:29:19 crc kubenswrapper[4760]: I0930 08:29:19.114225 4760 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-f2lrk" podUID="7a9c8270-6964-4886-87d0-227b05b76da4" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 30 08:29:49 crc kubenswrapper[4760]: I0930 08:29:49.113480 4760 patch_prober.go:28] 
interesting pod/machine-config-daemon-f2lrk container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 08:29:49 crc kubenswrapper[4760]: I0930 08:29:49.114343 4760 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-f2lrk" podUID="7a9c8270-6964-4886-87d0-227b05b76da4" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 30 08:29:49 crc kubenswrapper[4760]: I0930 08:29:49.114417 4760 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-f2lrk" Sep 30 08:29:49 crc kubenswrapper[4760]: I0930 08:29:49.115616 4760 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"6c5721bea825ec29851d8ca03da59ebad0d536540d36ce6816d3f5ca179ef469"} pod="openshift-machine-config-operator/machine-config-daemon-f2lrk" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Sep 30 08:29:49 crc kubenswrapper[4760]: I0930 08:29:49.115719 4760 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-f2lrk" podUID="7a9c8270-6964-4886-87d0-227b05b76da4" containerName="machine-config-daemon" containerID="cri-o://6c5721bea825ec29851d8ca03da59ebad0d536540d36ce6816d3f5ca179ef469" gracePeriod=600 Sep 30 08:29:49 crc kubenswrapper[4760]: I0930 08:29:49.322666 4760 generic.go:334] "Generic (PLEG): container finished" podID="7a9c8270-6964-4886-87d0-227b05b76da4" containerID="6c5721bea825ec29851d8ca03da59ebad0d536540d36ce6816d3f5ca179ef469" exitCode=0 Sep 30 08:29:49 crc kubenswrapper[4760]: I0930 
08:29:49.322749 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-f2lrk" event={"ID":"7a9c8270-6964-4886-87d0-227b05b76da4","Type":"ContainerDied","Data":"6c5721bea825ec29851d8ca03da59ebad0d536540d36ce6816d3f5ca179ef469"} Sep 30 08:29:49 crc kubenswrapper[4760]: I0930 08:29:49.322811 4760 scope.go:117] "RemoveContainer" containerID="ba9d6e9a08a463506562fa972ab309bc744574a4b27aa40630a148b30074629b" Sep 30 08:29:50 crc kubenswrapper[4760]: I0930 08:29:50.337136 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-f2lrk" event={"ID":"7a9c8270-6964-4886-87d0-227b05b76da4","Type":"ContainerStarted","Data":"3b3ee338429c89d453b302bf27db16e924b60d460569221c449cab6e0c51d84a"} Sep 30 08:30:00 crc kubenswrapper[4760]: I0930 08:30:00.185070 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29320350-mwrrt"] Sep 30 08:30:00 crc kubenswrapper[4760]: E0930 08:30:00.187611 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="923055e3-c184-49fd-a4c3-35f3e4f21e7f" containerName="extract-utilities" Sep 30 08:30:00 crc kubenswrapper[4760]: I0930 08:30:00.187726 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="923055e3-c184-49fd-a4c3-35f3e4f21e7f" containerName="extract-utilities" Sep 30 08:30:00 crc kubenswrapper[4760]: E0930 08:30:00.187852 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="923055e3-c184-49fd-a4c3-35f3e4f21e7f" containerName="extract-content" Sep 30 08:30:00 crc kubenswrapper[4760]: I0930 08:30:00.187932 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="923055e3-c184-49fd-a4c3-35f3e4f21e7f" containerName="extract-content" Sep 30 08:30:00 crc kubenswrapper[4760]: E0930 08:30:00.188012 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="923055e3-c184-49fd-a4c3-35f3e4f21e7f" containerName="registry-server" 
Sep 30 08:30:00 crc kubenswrapper[4760]: I0930 08:30:00.188081 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="923055e3-c184-49fd-a4c3-35f3e4f21e7f" containerName="registry-server" Sep 30 08:30:00 crc kubenswrapper[4760]: I0930 08:30:00.188396 4760 memory_manager.go:354] "RemoveStaleState removing state" podUID="923055e3-c184-49fd-a4c3-35f3e4f21e7f" containerName="registry-server" Sep 30 08:30:00 crc kubenswrapper[4760]: I0930 08:30:00.189378 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29320350-mwrrt" Sep 30 08:30:00 crc kubenswrapper[4760]: I0930 08:30:00.192061 4760 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Sep 30 08:30:00 crc kubenswrapper[4760]: I0930 08:30:00.192574 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Sep 30 08:30:00 crc kubenswrapper[4760]: I0930 08:30:00.199044 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29320350-mwrrt"] Sep 30 08:30:00 crc kubenswrapper[4760]: I0930 08:30:00.322819 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/27bcbc4c-bacd-4e8f-a452-26ab86ade323-secret-volume\") pod \"collect-profiles-29320350-mwrrt\" (UID: \"27bcbc4c-bacd-4e8f-a452-26ab86ade323\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320350-mwrrt" Sep 30 08:30:00 crc kubenswrapper[4760]: I0930 08:30:00.323046 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wlcql\" (UniqueName: \"kubernetes.io/projected/27bcbc4c-bacd-4e8f-a452-26ab86ade323-kube-api-access-wlcql\") pod \"collect-profiles-29320350-mwrrt\" (UID: 
\"27bcbc4c-bacd-4e8f-a452-26ab86ade323\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320350-mwrrt" Sep 30 08:30:00 crc kubenswrapper[4760]: I0930 08:30:00.323491 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/27bcbc4c-bacd-4e8f-a452-26ab86ade323-config-volume\") pod \"collect-profiles-29320350-mwrrt\" (UID: \"27bcbc4c-bacd-4e8f-a452-26ab86ade323\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320350-mwrrt" Sep 30 08:30:00 crc kubenswrapper[4760]: I0930 08:30:00.446040 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wlcql\" (UniqueName: \"kubernetes.io/projected/27bcbc4c-bacd-4e8f-a452-26ab86ade323-kube-api-access-wlcql\") pod \"collect-profiles-29320350-mwrrt\" (UID: \"27bcbc4c-bacd-4e8f-a452-26ab86ade323\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320350-mwrrt" Sep 30 08:30:00 crc kubenswrapper[4760]: I0930 08:30:00.446200 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/27bcbc4c-bacd-4e8f-a452-26ab86ade323-config-volume\") pod \"collect-profiles-29320350-mwrrt\" (UID: \"27bcbc4c-bacd-4e8f-a452-26ab86ade323\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320350-mwrrt" Sep 30 08:30:00 crc kubenswrapper[4760]: I0930 08:30:00.446295 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/27bcbc4c-bacd-4e8f-a452-26ab86ade323-secret-volume\") pod \"collect-profiles-29320350-mwrrt\" (UID: \"27bcbc4c-bacd-4e8f-a452-26ab86ade323\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320350-mwrrt" Sep 30 08:30:00 crc kubenswrapper[4760]: I0930 08:30:00.447349 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: 
\"kubernetes.io/configmap/27bcbc4c-bacd-4e8f-a452-26ab86ade323-config-volume\") pod \"collect-profiles-29320350-mwrrt\" (UID: \"27bcbc4c-bacd-4e8f-a452-26ab86ade323\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320350-mwrrt" Sep 30 08:30:00 crc kubenswrapper[4760]: I0930 08:30:00.459873 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/27bcbc4c-bacd-4e8f-a452-26ab86ade323-secret-volume\") pod \"collect-profiles-29320350-mwrrt\" (UID: \"27bcbc4c-bacd-4e8f-a452-26ab86ade323\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320350-mwrrt" Sep 30 08:30:00 crc kubenswrapper[4760]: I0930 08:30:00.465433 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wlcql\" (UniqueName: \"kubernetes.io/projected/27bcbc4c-bacd-4e8f-a452-26ab86ade323-kube-api-access-wlcql\") pod \"collect-profiles-29320350-mwrrt\" (UID: \"27bcbc4c-bacd-4e8f-a452-26ab86ade323\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320350-mwrrt" Sep 30 08:30:00 crc kubenswrapper[4760]: I0930 08:30:00.528229 4760 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29320350-mwrrt" Sep 30 08:30:01 crc kubenswrapper[4760]: I0930 08:30:01.048653 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29320350-mwrrt"] Sep 30 08:30:01 crc kubenswrapper[4760]: I0930 08:30:01.500503 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29320350-mwrrt" event={"ID":"27bcbc4c-bacd-4e8f-a452-26ab86ade323","Type":"ContainerStarted","Data":"1bbf173643742246f3e6b7636e45bed0c56428287bfc93470061397958c914bb"} Sep 30 08:30:01 crc kubenswrapper[4760]: I0930 08:30:01.500837 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29320350-mwrrt" event={"ID":"27bcbc4c-bacd-4e8f-a452-26ab86ade323","Type":"ContainerStarted","Data":"40c66196c21dcb5ddbff26c8a7dd370a048cb3a569a9b7d8ee82242833f3f281"} Sep 30 08:30:01 crc kubenswrapper[4760]: I0930 08:30:01.519397 4760 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29320350-mwrrt" podStartSLOduration=1.519378847 podStartE2EDuration="1.519378847s" podCreationTimestamp="2025-09-30 08:30:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 08:30:01.514947173 +0000 UTC m=+3387.157853585" watchObservedRunningTime="2025-09-30 08:30:01.519378847 +0000 UTC m=+3387.162285259" Sep 30 08:30:02 crc kubenswrapper[4760]: I0930 08:30:02.516276 4760 generic.go:334] "Generic (PLEG): container finished" podID="27bcbc4c-bacd-4e8f-a452-26ab86ade323" containerID="1bbf173643742246f3e6b7636e45bed0c56428287bfc93470061397958c914bb" exitCode=0 Sep 30 08:30:02 crc kubenswrapper[4760]: I0930 08:30:02.516367 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-operator-lifecycle-manager/collect-profiles-29320350-mwrrt" event={"ID":"27bcbc4c-bacd-4e8f-a452-26ab86ade323","Type":"ContainerDied","Data":"1bbf173643742246f3e6b7636e45bed0c56428287bfc93470061397958c914bb"} Sep 30 08:30:03 crc kubenswrapper[4760]: I0930 08:30:03.959072 4760 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29320350-mwrrt" Sep 30 08:30:04 crc kubenswrapper[4760]: I0930 08:30:04.038117 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/27bcbc4c-bacd-4e8f-a452-26ab86ade323-secret-volume\") pod \"27bcbc4c-bacd-4e8f-a452-26ab86ade323\" (UID: \"27bcbc4c-bacd-4e8f-a452-26ab86ade323\") " Sep 30 08:30:04 crc kubenswrapper[4760]: I0930 08:30:04.038790 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/27bcbc4c-bacd-4e8f-a452-26ab86ade323-config-volume\") pod \"27bcbc4c-bacd-4e8f-a452-26ab86ade323\" (UID: \"27bcbc4c-bacd-4e8f-a452-26ab86ade323\") " Sep 30 08:30:04 crc kubenswrapper[4760]: I0930 08:30:04.039058 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wlcql\" (UniqueName: \"kubernetes.io/projected/27bcbc4c-bacd-4e8f-a452-26ab86ade323-kube-api-access-wlcql\") pod \"27bcbc4c-bacd-4e8f-a452-26ab86ade323\" (UID: \"27bcbc4c-bacd-4e8f-a452-26ab86ade323\") " Sep 30 08:30:04 crc kubenswrapper[4760]: I0930 08:30:04.039253 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/27bcbc4c-bacd-4e8f-a452-26ab86ade323-config-volume" (OuterVolumeSpecName: "config-volume") pod "27bcbc4c-bacd-4e8f-a452-26ab86ade323" (UID: "27bcbc4c-bacd-4e8f-a452-26ab86ade323"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 08:30:04 crc kubenswrapper[4760]: I0930 08:30:04.040809 4760 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/27bcbc4c-bacd-4e8f-a452-26ab86ade323-config-volume\") on node \"crc\" DevicePath \"\"" Sep 30 08:30:04 crc kubenswrapper[4760]: I0930 08:30:04.044686 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/27bcbc4c-bacd-4e8f-a452-26ab86ade323-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "27bcbc4c-bacd-4e8f-a452-26ab86ade323" (UID: "27bcbc4c-bacd-4e8f-a452-26ab86ade323"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 08:30:04 crc kubenswrapper[4760]: I0930 08:30:04.045233 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/27bcbc4c-bacd-4e8f-a452-26ab86ade323-kube-api-access-wlcql" (OuterVolumeSpecName: "kube-api-access-wlcql") pod "27bcbc4c-bacd-4e8f-a452-26ab86ade323" (UID: "27bcbc4c-bacd-4e8f-a452-26ab86ade323"). InnerVolumeSpecName "kube-api-access-wlcql". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 08:30:04 crc kubenswrapper[4760]: I0930 08:30:04.143154 4760 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/27bcbc4c-bacd-4e8f-a452-26ab86ade323-secret-volume\") on node \"crc\" DevicePath \"\"" Sep 30 08:30:04 crc kubenswrapper[4760]: I0930 08:30:04.143188 4760 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wlcql\" (UniqueName: \"kubernetes.io/projected/27bcbc4c-bacd-4e8f-a452-26ab86ade323-kube-api-access-wlcql\") on node \"crc\" DevicePath \"\"" Sep 30 08:30:04 crc kubenswrapper[4760]: I0930 08:30:04.554654 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29320350-mwrrt" event={"ID":"27bcbc4c-bacd-4e8f-a452-26ab86ade323","Type":"ContainerDied","Data":"40c66196c21dcb5ddbff26c8a7dd370a048cb3a569a9b7d8ee82242833f3f281"} Sep 30 08:30:04 crc kubenswrapper[4760]: I0930 08:30:04.555041 4760 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="40c66196c21dcb5ddbff26c8a7dd370a048cb3a569a9b7d8ee82242833f3f281" Sep 30 08:30:04 crc kubenswrapper[4760]: I0930 08:30:04.555264 4760 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29320350-mwrrt" Sep 30 08:30:04 crc kubenswrapper[4760]: I0930 08:30:04.615329 4760 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29320305-v49ds"] Sep 30 08:30:04 crc kubenswrapper[4760]: I0930 08:30:04.623140 4760 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29320305-v49ds"] Sep 30 08:30:04 crc kubenswrapper[4760]: E0930 08:30:04.687859 4760 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod27bcbc4c_bacd_4e8f_a452_26ab86ade323.slice\": RecentStats: unable to find data in memory cache]" Sep 30 08:30:05 crc kubenswrapper[4760]: I0930 08:30:05.101735 4760 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1b410270-7237-429f-bc0f-8d7986cef241" path="/var/lib/kubelet/pods/1b410270-7237-429f-bc0f-8d7986cef241/volumes" Sep 30 08:30:46 crc kubenswrapper[4760]: I0930 08:30:46.537834 4760 scope.go:117] "RemoveContainer" containerID="c3536e5adfdc0661a6265bb9e4fdb956c6f6a8bf2e616a07ec99bbdaa0f8e9af" Sep 30 08:31:44 crc kubenswrapper[4760]: I0930 08:31:44.634222 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-7tspm"] Sep 30 08:31:44 crc kubenswrapper[4760]: E0930 08:31:44.635654 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="27bcbc4c-bacd-4e8f-a452-26ab86ade323" containerName="collect-profiles" Sep 30 08:31:44 crc kubenswrapper[4760]: I0930 08:31:44.635682 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="27bcbc4c-bacd-4e8f-a452-26ab86ade323" containerName="collect-profiles" Sep 30 08:31:44 crc kubenswrapper[4760]: I0930 08:31:44.636028 4760 memory_manager.go:354] "RemoveStaleState removing state" podUID="27bcbc4c-bacd-4e8f-a452-26ab86ade323" 
containerName="collect-profiles" Sep 30 08:31:44 crc kubenswrapper[4760]: I0930 08:31:44.640835 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-7tspm" Sep 30 08:31:44 crc kubenswrapper[4760]: I0930 08:31:44.652002 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-7tspm"] Sep 30 08:31:44 crc kubenswrapper[4760]: I0930 08:31:44.720489 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/13402845-ba27-4d25-b28d-dda329010df4-catalog-content\") pod \"redhat-operators-7tspm\" (UID: \"13402845-ba27-4d25-b28d-dda329010df4\") " pod="openshift-marketplace/redhat-operators-7tspm" Sep 30 08:31:44 crc kubenswrapper[4760]: I0930 08:31:44.721105 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/13402845-ba27-4d25-b28d-dda329010df4-utilities\") pod \"redhat-operators-7tspm\" (UID: \"13402845-ba27-4d25-b28d-dda329010df4\") " pod="openshift-marketplace/redhat-operators-7tspm" Sep 30 08:31:44 crc kubenswrapper[4760]: I0930 08:31:44.721141 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jrpdn\" (UniqueName: \"kubernetes.io/projected/13402845-ba27-4d25-b28d-dda329010df4-kube-api-access-jrpdn\") pod \"redhat-operators-7tspm\" (UID: \"13402845-ba27-4d25-b28d-dda329010df4\") " pod="openshift-marketplace/redhat-operators-7tspm" Sep 30 08:31:44 crc kubenswrapper[4760]: I0930 08:31:44.822894 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/13402845-ba27-4d25-b28d-dda329010df4-utilities\") pod \"redhat-operators-7tspm\" (UID: \"13402845-ba27-4d25-b28d-dda329010df4\") " 
pod="openshift-marketplace/redhat-operators-7tspm" Sep 30 08:31:44 crc kubenswrapper[4760]: I0930 08:31:44.822957 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jrpdn\" (UniqueName: \"kubernetes.io/projected/13402845-ba27-4d25-b28d-dda329010df4-kube-api-access-jrpdn\") pod \"redhat-operators-7tspm\" (UID: \"13402845-ba27-4d25-b28d-dda329010df4\") " pod="openshift-marketplace/redhat-operators-7tspm" Sep 30 08:31:44 crc kubenswrapper[4760]: I0930 08:31:44.823073 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/13402845-ba27-4d25-b28d-dda329010df4-catalog-content\") pod \"redhat-operators-7tspm\" (UID: \"13402845-ba27-4d25-b28d-dda329010df4\") " pod="openshift-marketplace/redhat-operators-7tspm" Sep 30 08:31:44 crc kubenswrapper[4760]: I0930 08:31:44.823758 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/13402845-ba27-4d25-b28d-dda329010df4-utilities\") pod \"redhat-operators-7tspm\" (UID: \"13402845-ba27-4d25-b28d-dda329010df4\") " pod="openshift-marketplace/redhat-operators-7tspm" Sep 30 08:31:44 crc kubenswrapper[4760]: I0930 08:31:44.823803 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/13402845-ba27-4d25-b28d-dda329010df4-catalog-content\") pod \"redhat-operators-7tspm\" (UID: \"13402845-ba27-4d25-b28d-dda329010df4\") " pod="openshift-marketplace/redhat-operators-7tspm" Sep 30 08:31:44 crc kubenswrapper[4760]: I0930 08:31:44.846282 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jrpdn\" (UniqueName: \"kubernetes.io/projected/13402845-ba27-4d25-b28d-dda329010df4-kube-api-access-jrpdn\") pod \"redhat-operators-7tspm\" (UID: \"13402845-ba27-4d25-b28d-dda329010df4\") " pod="openshift-marketplace/redhat-operators-7tspm" Sep 
30 08:31:44 crc kubenswrapper[4760]: I0930 08:31:44.975636 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-7tspm" Sep 30 08:31:45 crc kubenswrapper[4760]: I0930 08:31:45.512223 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-7tspm"] Sep 30 08:31:45 crc kubenswrapper[4760]: I0930 08:31:45.708247 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7tspm" event={"ID":"13402845-ba27-4d25-b28d-dda329010df4","Type":"ContainerStarted","Data":"965289c8520e5ddb0ae2e4319371fcd923c017ee6f7344205fed7dc877af92ad"} Sep 30 08:31:45 crc kubenswrapper[4760]: I0930 08:31:45.708625 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7tspm" event={"ID":"13402845-ba27-4d25-b28d-dda329010df4","Type":"ContainerStarted","Data":"bc368e39c9fb63f9d491beef69cb167d626b2c7dc997087de70683ee25c9286b"} Sep 30 08:31:46 crc kubenswrapper[4760]: I0930 08:31:46.725294 4760 generic.go:334] "Generic (PLEG): container finished" podID="13402845-ba27-4d25-b28d-dda329010df4" containerID="965289c8520e5ddb0ae2e4319371fcd923c017ee6f7344205fed7dc877af92ad" exitCode=0 Sep 30 08:31:46 crc kubenswrapper[4760]: I0930 08:31:46.725403 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7tspm" event={"ID":"13402845-ba27-4d25-b28d-dda329010df4","Type":"ContainerDied","Data":"965289c8520e5ddb0ae2e4319371fcd923c017ee6f7344205fed7dc877af92ad"} Sep 30 08:31:46 crc kubenswrapper[4760]: I0930 08:31:46.728951 4760 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Sep 30 08:31:48 crc kubenswrapper[4760]: I0930 08:31:48.756501 4760 generic.go:334] "Generic (PLEG): container finished" podID="13402845-ba27-4d25-b28d-dda329010df4" containerID="780a087a026f0f0be28a4631ad39976421c069c09b36eacc2649a68c37dd16c4" exitCode=0 Sep 30 
08:31:48 crc kubenswrapper[4760]: I0930 08:31:48.756548 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7tspm" event={"ID":"13402845-ba27-4d25-b28d-dda329010df4","Type":"ContainerDied","Data":"780a087a026f0f0be28a4631ad39976421c069c09b36eacc2649a68c37dd16c4"} Sep 30 08:31:49 crc kubenswrapper[4760]: I0930 08:31:49.775972 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7tspm" event={"ID":"13402845-ba27-4d25-b28d-dda329010df4","Type":"ContainerStarted","Data":"76e368d001b8c080e5176ee0ba6d7fcbe4d24c7fe6105c4705c845c679b7a03a"} Sep 30 08:31:50 crc kubenswrapper[4760]: I0930 08:31:50.823827 4760 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-7tspm" podStartSLOduration=4.251437631 podStartE2EDuration="6.823809099s" podCreationTimestamp="2025-09-30 08:31:44 +0000 UTC" firstStartedPulling="2025-09-30 08:31:46.728526631 +0000 UTC m=+3492.371433083" lastFinishedPulling="2025-09-30 08:31:49.300898099 +0000 UTC m=+3494.943804551" observedRunningTime="2025-09-30 08:31:50.822508005 +0000 UTC m=+3496.465414447" watchObservedRunningTime="2025-09-30 08:31:50.823809099 +0000 UTC m=+3496.466715521" Sep 30 08:31:54 crc kubenswrapper[4760]: I0930 08:31:54.975934 4760 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-7tspm" Sep 30 08:31:54 crc kubenswrapper[4760]: I0930 08:31:54.976557 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-7tspm" Sep 30 08:31:56 crc kubenswrapper[4760]: I0930 08:31:56.041052 4760 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-7tspm" podUID="13402845-ba27-4d25-b28d-dda329010df4" containerName="registry-server" probeResult="failure" output=< Sep 30 08:31:56 crc kubenswrapper[4760]: timeout: failed to connect service ":50051" 
within 1s Sep 30 08:31:56 crc kubenswrapper[4760]: > Sep 30 08:32:05 crc kubenswrapper[4760]: I0930 08:32:05.062638 4760 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-7tspm" Sep 30 08:32:05 crc kubenswrapper[4760]: I0930 08:32:05.137240 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-7tspm" Sep 30 08:32:05 crc kubenswrapper[4760]: I0930 08:32:05.306821 4760 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-7tspm"] Sep 30 08:32:06 crc kubenswrapper[4760]: I0930 08:32:06.996525 4760 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-7tspm" podUID="13402845-ba27-4d25-b28d-dda329010df4" containerName="registry-server" containerID="cri-o://76e368d001b8c080e5176ee0ba6d7fcbe4d24c7fe6105c4705c845c679b7a03a" gracePeriod=2 Sep 30 08:32:07 crc kubenswrapper[4760]: I0930 08:32:07.525577 4760 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-7tspm" Sep 30 08:32:07 crc kubenswrapper[4760]: I0930 08:32:07.639237 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/13402845-ba27-4d25-b28d-dda329010df4-utilities\") pod \"13402845-ba27-4d25-b28d-dda329010df4\" (UID: \"13402845-ba27-4d25-b28d-dda329010df4\") " Sep 30 08:32:07 crc kubenswrapper[4760]: I0930 08:32:07.639331 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jrpdn\" (UniqueName: \"kubernetes.io/projected/13402845-ba27-4d25-b28d-dda329010df4-kube-api-access-jrpdn\") pod \"13402845-ba27-4d25-b28d-dda329010df4\" (UID: \"13402845-ba27-4d25-b28d-dda329010df4\") " Sep 30 08:32:07 crc kubenswrapper[4760]: I0930 08:32:07.639385 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/13402845-ba27-4d25-b28d-dda329010df4-catalog-content\") pod \"13402845-ba27-4d25-b28d-dda329010df4\" (UID: \"13402845-ba27-4d25-b28d-dda329010df4\") " Sep 30 08:32:07 crc kubenswrapper[4760]: I0930 08:32:07.640205 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/13402845-ba27-4d25-b28d-dda329010df4-utilities" (OuterVolumeSpecName: "utilities") pod "13402845-ba27-4d25-b28d-dda329010df4" (UID: "13402845-ba27-4d25-b28d-dda329010df4"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 08:32:07 crc kubenswrapper[4760]: I0930 08:32:07.640575 4760 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/13402845-ba27-4d25-b28d-dda329010df4-utilities\") on node \"crc\" DevicePath \"\"" Sep 30 08:32:07 crc kubenswrapper[4760]: I0930 08:32:07.647536 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/13402845-ba27-4d25-b28d-dda329010df4-kube-api-access-jrpdn" (OuterVolumeSpecName: "kube-api-access-jrpdn") pod "13402845-ba27-4d25-b28d-dda329010df4" (UID: "13402845-ba27-4d25-b28d-dda329010df4"). InnerVolumeSpecName "kube-api-access-jrpdn". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 08:32:07 crc kubenswrapper[4760]: I0930 08:32:07.739927 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/13402845-ba27-4d25-b28d-dda329010df4-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "13402845-ba27-4d25-b28d-dda329010df4" (UID: "13402845-ba27-4d25-b28d-dda329010df4"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 08:32:07 crc kubenswrapper[4760]: I0930 08:32:07.742045 4760 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/13402845-ba27-4d25-b28d-dda329010df4-catalog-content\") on node \"crc\" DevicePath \"\"" Sep 30 08:32:07 crc kubenswrapper[4760]: I0930 08:32:07.742074 4760 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jrpdn\" (UniqueName: \"kubernetes.io/projected/13402845-ba27-4d25-b28d-dda329010df4-kube-api-access-jrpdn\") on node \"crc\" DevicePath \"\"" Sep 30 08:32:08 crc kubenswrapper[4760]: I0930 08:32:08.035725 4760 generic.go:334] "Generic (PLEG): container finished" podID="13402845-ba27-4d25-b28d-dda329010df4" containerID="76e368d001b8c080e5176ee0ba6d7fcbe4d24c7fe6105c4705c845c679b7a03a" exitCode=0 Sep 30 08:32:08 crc kubenswrapper[4760]: I0930 08:32:08.035842 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7tspm" event={"ID":"13402845-ba27-4d25-b28d-dda329010df4","Type":"ContainerDied","Data":"76e368d001b8c080e5176ee0ba6d7fcbe4d24c7fe6105c4705c845c679b7a03a"} Sep 30 08:32:08 crc kubenswrapper[4760]: I0930 08:32:08.035912 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7tspm" event={"ID":"13402845-ba27-4d25-b28d-dda329010df4","Type":"ContainerDied","Data":"bc368e39c9fb63f9d491beef69cb167d626b2c7dc997087de70683ee25c9286b"} Sep 30 08:32:08 crc kubenswrapper[4760]: I0930 08:32:08.035981 4760 scope.go:117] "RemoveContainer" containerID="76e368d001b8c080e5176ee0ba6d7fcbe4d24c7fe6105c4705c845c679b7a03a" Sep 30 08:32:08 crc kubenswrapper[4760]: I0930 08:32:08.036512 4760 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-7tspm" Sep 30 08:32:08 crc kubenswrapper[4760]: I0930 08:32:08.079043 4760 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-7tspm"] Sep 30 08:32:08 crc kubenswrapper[4760]: I0930 08:32:08.088123 4760 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-7tspm"] Sep 30 08:32:08 crc kubenswrapper[4760]: I0930 08:32:08.088784 4760 scope.go:117] "RemoveContainer" containerID="780a087a026f0f0be28a4631ad39976421c069c09b36eacc2649a68c37dd16c4" Sep 30 08:32:08 crc kubenswrapper[4760]: I0930 08:32:08.117799 4760 scope.go:117] "RemoveContainer" containerID="965289c8520e5ddb0ae2e4319371fcd923c017ee6f7344205fed7dc877af92ad" Sep 30 08:32:08 crc kubenswrapper[4760]: I0930 08:32:08.180923 4760 scope.go:117] "RemoveContainer" containerID="76e368d001b8c080e5176ee0ba6d7fcbe4d24c7fe6105c4705c845c679b7a03a" Sep 30 08:32:08 crc kubenswrapper[4760]: E0930 08:32:08.181553 4760 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"76e368d001b8c080e5176ee0ba6d7fcbe4d24c7fe6105c4705c845c679b7a03a\": container with ID starting with 76e368d001b8c080e5176ee0ba6d7fcbe4d24c7fe6105c4705c845c679b7a03a not found: ID does not exist" containerID="76e368d001b8c080e5176ee0ba6d7fcbe4d24c7fe6105c4705c845c679b7a03a" Sep 30 08:32:08 crc kubenswrapper[4760]: I0930 08:32:08.181601 4760 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"76e368d001b8c080e5176ee0ba6d7fcbe4d24c7fe6105c4705c845c679b7a03a"} err="failed to get container status \"76e368d001b8c080e5176ee0ba6d7fcbe4d24c7fe6105c4705c845c679b7a03a\": rpc error: code = NotFound desc = could not find container \"76e368d001b8c080e5176ee0ba6d7fcbe4d24c7fe6105c4705c845c679b7a03a\": container with ID starting with 76e368d001b8c080e5176ee0ba6d7fcbe4d24c7fe6105c4705c845c679b7a03a not found: ID does 
not exist" Sep 30 08:32:08 crc kubenswrapper[4760]: I0930 08:32:08.181634 4760 scope.go:117] "RemoveContainer" containerID="780a087a026f0f0be28a4631ad39976421c069c09b36eacc2649a68c37dd16c4" Sep 30 08:32:08 crc kubenswrapper[4760]: E0930 08:32:08.182181 4760 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"780a087a026f0f0be28a4631ad39976421c069c09b36eacc2649a68c37dd16c4\": container with ID starting with 780a087a026f0f0be28a4631ad39976421c069c09b36eacc2649a68c37dd16c4 not found: ID does not exist" containerID="780a087a026f0f0be28a4631ad39976421c069c09b36eacc2649a68c37dd16c4" Sep 30 08:32:08 crc kubenswrapper[4760]: I0930 08:32:08.182215 4760 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"780a087a026f0f0be28a4631ad39976421c069c09b36eacc2649a68c37dd16c4"} err="failed to get container status \"780a087a026f0f0be28a4631ad39976421c069c09b36eacc2649a68c37dd16c4\": rpc error: code = NotFound desc = could not find container \"780a087a026f0f0be28a4631ad39976421c069c09b36eacc2649a68c37dd16c4\": container with ID starting with 780a087a026f0f0be28a4631ad39976421c069c09b36eacc2649a68c37dd16c4 not found: ID does not exist" Sep 30 08:32:08 crc kubenswrapper[4760]: I0930 08:32:08.182237 4760 scope.go:117] "RemoveContainer" containerID="965289c8520e5ddb0ae2e4319371fcd923c017ee6f7344205fed7dc877af92ad" Sep 30 08:32:08 crc kubenswrapper[4760]: E0930 08:32:08.182859 4760 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"965289c8520e5ddb0ae2e4319371fcd923c017ee6f7344205fed7dc877af92ad\": container with ID starting with 965289c8520e5ddb0ae2e4319371fcd923c017ee6f7344205fed7dc877af92ad not found: ID does not exist" containerID="965289c8520e5ddb0ae2e4319371fcd923c017ee6f7344205fed7dc877af92ad" Sep 30 08:32:08 crc kubenswrapper[4760]: I0930 08:32:08.182902 4760 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"965289c8520e5ddb0ae2e4319371fcd923c017ee6f7344205fed7dc877af92ad"} err="failed to get container status \"965289c8520e5ddb0ae2e4319371fcd923c017ee6f7344205fed7dc877af92ad\": rpc error: code = NotFound desc = could not find container \"965289c8520e5ddb0ae2e4319371fcd923c017ee6f7344205fed7dc877af92ad\": container with ID starting with 965289c8520e5ddb0ae2e4319371fcd923c017ee6f7344205fed7dc877af92ad not found: ID does not exist" Sep 30 08:32:08 crc kubenswrapper[4760]: E0930 08:32:08.276821 4760 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod13402845_ba27_4d25_b28d_dda329010df4.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod13402845_ba27_4d25_b28d_dda329010df4.slice/crio-bc368e39c9fb63f9d491beef69cb167d626b2c7dc997087de70683ee25c9286b\": RecentStats: unable to find data in memory cache]" Sep 30 08:32:09 crc kubenswrapper[4760]: I0930 08:32:09.082069 4760 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="13402845-ba27-4d25-b28d-dda329010df4" path="/var/lib/kubelet/pods/13402845-ba27-4d25-b28d-dda329010df4/volumes" Sep 30 08:32:19 crc kubenswrapper[4760]: I0930 08:32:19.113677 4760 patch_prober.go:28] interesting pod/machine-config-daemon-f2lrk container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 08:32:19 crc kubenswrapper[4760]: I0930 08:32:19.114508 4760 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-f2lrk" podUID="7a9c8270-6964-4886-87d0-227b05b76da4" containerName="machine-config-daemon" probeResult="failure" output="Get 
\"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 30 08:32:49 crc kubenswrapper[4760]: I0930 08:32:49.113062 4760 patch_prober.go:28] interesting pod/machine-config-daemon-f2lrk container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 08:32:49 crc kubenswrapper[4760]: I0930 08:32:49.113698 4760 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-f2lrk" podUID="7a9c8270-6964-4886-87d0-227b05b76da4" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 30 08:33:19 crc kubenswrapper[4760]: I0930 08:33:19.113391 4760 patch_prober.go:28] interesting pod/machine-config-daemon-f2lrk container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 08:33:19 crc kubenswrapper[4760]: I0930 08:33:19.113975 4760 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-f2lrk" podUID="7a9c8270-6964-4886-87d0-227b05b76da4" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 30 08:33:19 crc kubenswrapper[4760]: I0930 08:33:19.114017 4760 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-f2lrk" Sep 30 08:33:19 crc kubenswrapper[4760]: I0930 08:33:19.114734 4760 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" 
containerStatusID={"Type":"cri-o","ID":"3b3ee338429c89d453b302bf27db16e924b60d460569221c449cab6e0c51d84a"} pod="openshift-machine-config-operator/machine-config-daemon-f2lrk" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Sep 30 08:33:19 crc kubenswrapper[4760]: I0930 08:33:19.114780 4760 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-f2lrk" podUID="7a9c8270-6964-4886-87d0-227b05b76da4" containerName="machine-config-daemon" containerID="cri-o://3b3ee338429c89d453b302bf27db16e924b60d460569221c449cab6e0c51d84a" gracePeriod=600 Sep 30 08:33:19 crc kubenswrapper[4760]: E0930 08:33:19.242084 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f2lrk_openshift-machine-config-operator(7a9c8270-6964-4886-87d0-227b05b76da4)\"" pod="openshift-machine-config-operator/machine-config-daemon-f2lrk" podUID="7a9c8270-6964-4886-87d0-227b05b76da4" Sep 30 08:33:19 crc kubenswrapper[4760]: I0930 08:33:19.848629 4760 generic.go:334] "Generic (PLEG): container finished" podID="7a9c8270-6964-4886-87d0-227b05b76da4" containerID="3b3ee338429c89d453b302bf27db16e924b60d460569221c449cab6e0c51d84a" exitCode=0 Sep 30 08:33:19 crc kubenswrapper[4760]: I0930 08:33:19.848704 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-f2lrk" event={"ID":"7a9c8270-6964-4886-87d0-227b05b76da4","Type":"ContainerDied","Data":"3b3ee338429c89d453b302bf27db16e924b60d460569221c449cab6e0c51d84a"} Sep 30 08:33:19 crc kubenswrapper[4760]: I0930 08:33:19.849095 4760 scope.go:117] "RemoveContainer" containerID="6c5721bea825ec29851d8ca03da59ebad0d536540d36ce6816d3f5ca179ef469" Sep 30 08:33:19 crc kubenswrapper[4760]: I0930 08:33:19.850190 4760 
scope.go:117] "RemoveContainer" containerID="3b3ee338429c89d453b302bf27db16e924b60d460569221c449cab6e0c51d84a" Sep 30 08:33:19 crc kubenswrapper[4760]: E0930 08:33:19.850716 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f2lrk_openshift-machine-config-operator(7a9c8270-6964-4886-87d0-227b05b76da4)\"" pod="openshift-machine-config-operator/machine-config-daemon-f2lrk" podUID="7a9c8270-6964-4886-87d0-227b05b76da4" Sep 30 08:33:32 crc kubenswrapper[4760]: I0930 08:33:32.067420 4760 scope.go:117] "RemoveContainer" containerID="3b3ee338429c89d453b302bf27db16e924b60d460569221c449cab6e0c51d84a" Sep 30 08:33:32 crc kubenswrapper[4760]: E0930 08:33:32.068367 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f2lrk_openshift-machine-config-operator(7a9c8270-6964-4886-87d0-227b05b76da4)\"" pod="openshift-machine-config-operator/machine-config-daemon-f2lrk" podUID="7a9c8270-6964-4886-87d0-227b05b76da4" Sep 30 08:33:45 crc kubenswrapper[4760]: I0930 08:33:45.081626 4760 scope.go:117] "RemoveContainer" containerID="3b3ee338429c89d453b302bf27db16e924b60d460569221c449cab6e0c51d84a" Sep 30 08:33:45 crc kubenswrapper[4760]: E0930 08:33:45.084569 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f2lrk_openshift-machine-config-operator(7a9c8270-6964-4886-87d0-227b05b76da4)\"" pod="openshift-machine-config-operator/machine-config-daemon-f2lrk" podUID="7a9c8270-6964-4886-87d0-227b05b76da4" Sep 30 08:33:52 crc kubenswrapper[4760]: I0930 
08:33:52.582372 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-992gm"] Sep 30 08:33:52 crc kubenswrapper[4760]: E0930 08:33:52.583365 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="13402845-ba27-4d25-b28d-dda329010df4" containerName="extract-content" Sep 30 08:33:52 crc kubenswrapper[4760]: I0930 08:33:52.583380 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="13402845-ba27-4d25-b28d-dda329010df4" containerName="extract-content" Sep 30 08:33:52 crc kubenswrapper[4760]: E0930 08:33:52.583393 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="13402845-ba27-4d25-b28d-dda329010df4" containerName="extract-utilities" Sep 30 08:33:52 crc kubenswrapper[4760]: I0930 08:33:52.583402 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="13402845-ba27-4d25-b28d-dda329010df4" containerName="extract-utilities" Sep 30 08:33:52 crc kubenswrapper[4760]: E0930 08:33:52.583437 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="13402845-ba27-4d25-b28d-dda329010df4" containerName="registry-server" Sep 30 08:33:52 crc kubenswrapper[4760]: I0930 08:33:52.583445 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="13402845-ba27-4d25-b28d-dda329010df4" containerName="registry-server" Sep 30 08:33:52 crc kubenswrapper[4760]: I0930 08:33:52.583674 4760 memory_manager.go:354] "RemoveStaleState removing state" podUID="13402845-ba27-4d25-b28d-dda329010df4" containerName="registry-server" Sep 30 08:33:52 crc kubenswrapper[4760]: I0930 08:33:52.586017 4760 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-992gm" Sep 30 08:33:52 crc kubenswrapper[4760]: I0930 08:33:52.593827 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-992gm"] Sep 30 08:33:52 crc kubenswrapper[4760]: I0930 08:33:52.685119 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ef0b5e8c-2c02-4962-a789-70cec4f9d18b-utilities\") pod \"community-operators-992gm\" (UID: \"ef0b5e8c-2c02-4962-a789-70cec4f9d18b\") " pod="openshift-marketplace/community-operators-992gm" Sep 30 08:33:52 crc kubenswrapper[4760]: I0930 08:33:52.685213 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ef0b5e8c-2c02-4962-a789-70cec4f9d18b-catalog-content\") pod \"community-operators-992gm\" (UID: \"ef0b5e8c-2c02-4962-a789-70cec4f9d18b\") " pod="openshift-marketplace/community-operators-992gm" Sep 30 08:33:52 crc kubenswrapper[4760]: I0930 08:33:52.685314 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vfgwv\" (UniqueName: \"kubernetes.io/projected/ef0b5e8c-2c02-4962-a789-70cec4f9d18b-kube-api-access-vfgwv\") pod \"community-operators-992gm\" (UID: \"ef0b5e8c-2c02-4962-a789-70cec4f9d18b\") " pod="openshift-marketplace/community-operators-992gm" Sep 30 08:33:52 crc kubenswrapper[4760]: I0930 08:33:52.787531 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ef0b5e8c-2c02-4962-a789-70cec4f9d18b-catalog-content\") pod \"community-operators-992gm\" (UID: \"ef0b5e8c-2c02-4962-a789-70cec4f9d18b\") " pod="openshift-marketplace/community-operators-992gm" Sep 30 08:33:52 crc kubenswrapper[4760]: I0930 08:33:52.787628 4760 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-vfgwv\" (UniqueName: \"kubernetes.io/projected/ef0b5e8c-2c02-4962-a789-70cec4f9d18b-kube-api-access-vfgwv\") pod \"community-operators-992gm\" (UID: \"ef0b5e8c-2c02-4962-a789-70cec4f9d18b\") " pod="openshift-marketplace/community-operators-992gm" Sep 30 08:33:52 crc kubenswrapper[4760]: I0930 08:33:52.787800 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ef0b5e8c-2c02-4962-a789-70cec4f9d18b-utilities\") pod \"community-operators-992gm\" (UID: \"ef0b5e8c-2c02-4962-a789-70cec4f9d18b\") " pod="openshift-marketplace/community-operators-992gm" Sep 30 08:33:52 crc kubenswrapper[4760]: I0930 08:33:52.788429 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ef0b5e8c-2c02-4962-a789-70cec4f9d18b-catalog-content\") pod \"community-operators-992gm\" (UID: \"ef0b5e8c-2c02-4962-a789-70cec4f9d18b\") " pod="openshift-marketplace/community-operators-992gm" Sep 30 08:33:52 crc kubenswrapper[4760]: I0930 08:33:52.788777 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ef0b5e8c-2c02-4962-a789-70cec4f9d18b-utilities\") pod \"community-operators-992gm\" (UID: \"ef0b5e8c-2c02-4962-a789-70cec4f9d18b\") " pod="openshift-marketplace/community-operators-992gm" Sep 30 08:33:52 crc kubenswrapper[4760]: I0930 08:33:52.815225 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vfgwv\" (UniqueName: \"kubernetes.io/projected/ef0b5e8c-2c02-4962-a789-70cec4f9d18b-kube-api-access-vfgwv\") pod \"community-operators-992gm\" (UID: \"ef0b5e8c-2c02-4962-a789-70cec4f9d18b\") " pod="openshift-marketplace/community-operators-992gm" Sep 30 08:33:52 crc kubenswrapper[4760]: I0930 08:33:52.913816 4760 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-992gm" Sep 30 08:33:53 crc kubenswrapper[4760]: I0930 08:33:53.405655 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-992gm"] Sep 30 08:33:54 crc kubenswrapper[4760]: I0930 08:33:54.241748 4760 generic.go:334] "Generic (PLEG): container finished" podID="ef0b5e8c-2c02-4962-a789-70cec4f9d18b" containerID="1c4cddb431be8b3c62dccf16e9fc53e970858aee69e6e1c8c0971b7d503bfbd6" exitCode=0 Sep 30 08:33:54 crc kubenswrapper[4760]: I0930 08:33:54.241847 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-992gm" event={"ID":"ef0b5e8c-2c02-4962-a789-70cec4f9d18b","Type":"ContainerDied","Data":"1c4cddb431be8b3c62dccf16e9fc53e970858aee69e6e1c8c0971b7d503bfbd6"} Sep 30 08:33:54 crc kubenswrapper[4760]: I0930 08:33:54.242214 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-992gm" event={"ID":"ef0b5e8c-2c02-4962-a789-70cec4f9d18b","Type":"ContainerStarted","Data":"3a2088f5fbad661f14aeda0ef9bf4a0b53104b6553df2f8ca701afa991851997"} Sep 30 08:33:55 crc kubenswrapper[4760]: I0930 08:33:55.253514 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-992gm" event={"ID":"ef0b5e8c-2c02-4962-a789-70cec4f9d18b","Type":"ContainerStarted","Data":"dda59f9cefdf6580eabeb64461c031482927c39d1e754d0bf3cb10808a26a8eb"} Sep 30 08:33:57 crc kubenswrapper[4760]: I0930 08:33:57.283029 4760 generic.go:334] "Generic (PLEG): container finished" podID="ef0b5e8c-2c02-4962-a789-70cec4f9d18b" containerID="dda59f9cefdf6580eabeb64461c031482927c39d1e754d0bf3cb10808a26a8eb" exitCode=0 Sep 30 08:33:57 crc kubenswrapper[4760]: I0930 08:33:57.283132 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-992gm" 
event={"ID":"ef0b5e8c-2c02-4962-a789-70cec4f9d18b","Type":"ContainerDied","Data":"dda59f9cefdf6580eabeb64461c031482927c39d1e754d0bf3cb10808a26a8eb"} Sep 30 08:33:58 crc kubenswrapper[4760]: I0930 08:33:58.068086 4760 scope.go:117] "RemoveContainer" containerID="3b3ee338429c89d453b302bf27db16e924b60d460569221c449cab6e0c51d84a" Sep 30 08:33:58 crc kubenswrapper[4760]: E0930 08:33:58.068885 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f2lrk_openshift-machine-config-operator(7a9c8270-6964-4886-87d0-227b05b76da4)\"" pod="openshift-machine-config-operator/machine-config-daemon-f2lrk" podUID="7a9c8270-6964-4886-87d0-227b05b76da4" Sep 30 08:33:58 crc kubenswrapper[4760]: I0930 08:33:58.297084 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-992gm" event={"ID":"ef0b5e8c-2c02-4962-a789-70cec4f9d18b","Type":"ContainerStarted","Data":"966bc57b66f13dbe8961056b488e3d82452bb634451004b80e2b8cf2a98eef15"} Sep 30 08:33:58 crc kubenswrapper[4760]: I0930 08:33:58.331397 4760 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-992gm" podStartSLOduration=2.7936004150000002 podStartE2EDuration="6.331375246s" podCreationTimestamp="2025-09-30 08:33:52 +0000 UTC" firstStartedPulling="2025-09-30 08:33:54.244093496 +0000 UTC m=+3619.886999908" lastFinishedPulling="2025-09-30 08:33:57.781868307 +0000 UTC m=+3623.424774739" observedRunningTime="2025-09-30 08:33:58.323025093 +0000 UTC m=+3623.965931505" watchObservedRunningTime="2025-09-30 08:33:58.331375246 +0000 UTC m=+3623.974281678" Sep 30 08:34:02 crc kubenswrapper[4760]: I0930 08:34:02.914715 4760 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-992gm" Sep 30 08:34:02 crc 
kubenswrapper[4760]: I0930 08:34:02.915435 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-992gm" Sep 30 08:34:02 crc kubenswrapper[4760]: I0930 08:34:02.978457 4760 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-992gm" Sep 30 08:34:03 crc kubenswrapper[4760]: I0930 08:34:03.413508 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-992gm" Sep 30 08:34:03 crc kubenswrapper[4760]: I0930 08:34:03.493556 4760 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-992gm"] Sep 30 08:34:05 crc kubenswrapper[4760]: I0930 08:34:05.369664 4760 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-992gm" podUID="ef0b5e8c-2c02-4962-a789-70cec4f9d18b" containerName="registry-server" containerID="cri-o://966bc57b66f13dbe8961056b488e3d82452bb634451004b80e2b8cf2a98eef15" gracePeriod=2 Sep 30 08:34:05 crc kubenswrapper[4760]: I0930 08:34:05.929050 4760 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-992gm" Sep 30 08:34:05 crc kubenswrapper[4760]: I0930 08:34:05.963517 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ef0b5e8c-2c02-4962-a789-70cec4f9d18b-utilities\") pod \"ef0b5e8c-2c02-4962-a789-70cec4f9d18b\" (UID: \"ef0b5e8c-2c02-4962-a789-70cec4f9d18b\") " Sep 30 08:34:05 crc kubenswrapper[4760]: I0930 08:34:05.963634 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vfgwv\" (UniqueName: \"kubernetes.io/projected/ef0b5e8c-2c02-4962-a789-70cec4f9d18b-kube-api-access-vfgwv\") pod \"ef0b5e8c-2c02-4962-a789-70cec4f9d18b\" (UID: \"ef0b5e8c-2c02-4962-a789-70cec4f9d18b\") " Sep 30 08:34:05 crc kubenswrapper[4760]: I0930 08:34:05.963797 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ef0b5e8c-2c02-4962-a789-70cec4f9d18b-catalog-content\") pod \"ef0b5e8c-2c02-4962-a789-70cec4f9d18b\" (UID: \"ef0b5e8c-2c02-4962-a789-70cec4f9d18b\") " Sep 30 08:34:05 crc kubenswrapper[4760]: I0930 08:34:05.964626 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ef0b5e8c-2c02-4962-a789-70cec4f9d18b-utilities" (OuterVolumeSpecName: "utilities") pod "ef0b5e8c-2c02-4962-a789-70cec4f9d18b" (UID: "ef0b5e8c-2c02-4962-a789-70cec4f9d18b"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 08:34:05 crc kubenswrapper[4760]: I0930 08:34:05.975695 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ef0b5e8c-2c02-4962-a789-70cec4f9d18b-kube-api-access-vfgwv" (OuterVolumeSpecName: "kube-api-access-vfgwv") pod "ef0b5e8c-2c02-4962-a789-70cec4f9d18b" (UID: "ef0b5e8c-2c02-4962-a789-70cec4f9d18b"). InnerVolumeSpecName "kube-api-access-vfgwv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 08:34:06 crc kubenswrapper[4760]: I0930 08:34:06.066055 4760 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ef0b5e8c-2c02-4962-a789-70cec4f9d18b-utilities\") on node \"crc\" DevicePath \"\"" Sep 30 08:34:06 crc kubenswrapper[4760]: I0930 08:34:06.066092 4760 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vfgwv\" (UniqueName: \"kubernetes.io/projected/ef0b5e8c-2c02-4962-a789-70cec4f9d18b-kube-api-access-vfgwv\") on node \"crc\" DevicePath \"\"" Sep 30 08:34:06 crc kubenswrapper[4760]: I0930 08:34:06.383036 4760 generic.go:334] "Generic (PLEG): container finished" podID="ef0b5e8c-2c02-4962-a789-70cec4f9d18b" containerID="966bc57b66f13dbe8961056b488e3d82452bb634451004b80e2b8cf2a98eef15" exitCode=0 Sep 30 08:34:06 crc kubenswrapper[4760]: I0930 08:34:06.383083 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-992gm" event={"ID":"ef0b5e8c-2c02-4962-a789-70cec4f9d18b","Type":"ContainerDied","Data":"966bc57b66f13dbe8961056b488e3d82452bb634451004b80e2b8cf2a98eef15"} Sep 30 08:34:06 crc kubenswrapper[4760]: I0930 08:34:06.383113 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-992gm" event={"ID":"ef0b5e8c-2c02-4962-a789-70cec4f9d18b","Type":"ContainerDied","Data":"3a2088f5fbad661f14aeda0ef9bf4a0b53104b6553df2f8ca701afa991851997"} Sep 30 08:34:06 crc kubenswrapper[4760]: I0930 08:34:06.383132 4760 scope.go:117] "RemoveContainer" containerID="966bc57b66f13dbe8961056b488e3d82452bb634451004b80e2b8cf2a98eef15" Sep 30 08:34:06 crc kubenswrapper[4760]: I0930 08:34:06.383147 4760 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-992gm" Sep 30 08:34:06 crc kubenswrapper[4760]: I0930 08:34:06.415269 4760 scope.go:117] "RemoveContainer" containerID="dda59f9cefdf6580eabeb64461c031482927c39d1e754d0bf3cb10808a26a8eb" Sep 30 08:34:06 crc kubenswrapper[4760]: I0930 08:34:06.446981 4760 scope.go:117] "RemoveContainer" containerID="1c4cddb431be8b3c62dccf16e9fc53e970858aee69e6e1c8c0971b7d503bfbd6" Sep 30 08:34:06 crc kubenswrapper[4760]: I0930 08:34:06.451836 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ef0b5e8c-2c02-4962-a789-70cec4f9d18b-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "ef0b5e8c-2c02-4962-a789-70cec4f9d18b" (UID: "ef0b5e8c-2c02-4962-a789-70cec4f9d18b"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 08:34:06 crc kubenswrapper[4760]: I0930 08:34:06.475934 4760 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ef0b5e8c-2c02-4962-a789-70cec4f9d18b-catalog-content\") on node \"crc\" DevicePath \"\"" Sep 30 08:34:06 crc kubenswrapper[4760]: I0930 08:34:06.538369 4760 scope.go:117] "RemoveContainer" containerID="966bc57b66f13dbe8961056b488e3d82452bb634451004b80e2b8cf2a98eef15" Sep 30 08:34:06 crc kubenswrapper[4760]: E0930 08:34:06.539439 4760 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"966bc57b66f13dbe8961056b488e3d82452bb634451004b80e2b8cf2a98eef15\": container with ID starting with 966bc57b66f13dbe8961056b488e3d82452bb634451004b80e2b8cf2a98eef15 not found: ID does not exist" containerID="966bc57b66f13dbe8961056b488e3d82452bb634451004b80e2b8cf2a98eef15" Sep 30 08:34:06 crc kubenswrapper[4760]: I0930 08:34:06.539527 4760 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"966bc57b66f13dbe8961056b488e3d82452bb634451004b80e2b8cf2a98eef15"} err="failed to get container status \"966bc57b66f13dbe8961056b488e3d82452bb634451004b80e2b8cf2a98eef15\": rpc error: code = NotFound desc = could not find container \"966bc57b66f13dbe8961056b488e3d82452bb634451004b80e2b8cf2a98eef15\": container with ID starting with 966bc57b66f13dbe8961056b488e3d82452bb634451004b80e2b8cf2a98eef15 not found: ID does not exist" Sep 30 08:34:06 crc kubenswrapper[4760]: I0930 08:34:06.539576 4760 scope.go:117] "RemoveContainer" containerID="dda59f9cefdf6580eabeb64461c031482927c39d1e754d0bf3cb10808a26a8eb" Sep 30 08:34:06 crc kubenswrapper[4760]: E0930 08:34:06.539899 4760 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dda59f9cefdf6580eabeb64461c031482927c39d1e754d0bf3cb10808a26a8eb\": container with ID starting with dda59f9cefdf6580eabeb64461c031482927c39d1e754d0bf3cb10808a26a8eb not found: ID does not exist" containerID="dda59f9cefdf6580eabeb64461c031482927c39d1e754d0bf3cb10808a26a8eb" Sep 30 08:34:06 crc kubenswrapper[4760]: I0930 08:34:06.539959 4760 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dda59f9cefdf6580eabeb64461c031482927c39d1e754d0bf3cb10808a26a8eb"} err="failed to get container status \"dda59f9cefdf6580eabeb64461c031482927c39d1e754d0bf3cb10808a26a8eb\": rpc error: code = NotFound desc = could not find container \"dda59f9cefdf6580eabeb64461c031482927c39d1e754d0bf3cb10808a26a8eb\": container with ID starting with dda59f9cefdf6580eabeb64461c031482927c39d1e754d0bf3cb10808a26a8eb not found: ID does not exist" Sep 30 08:34:06 crc kubenswrapper[4760]: I0930 08:34:06.540007 4760 scope.go:117] "RemoveContainer" containerID="1c4cddb431be8b3c62dccf16e9fc53e970858aee69e6e1c8c0971b7d503bfbd6" Sep 30 08:34:06 crc kubenswrapper[4760]: E0930 08:34:06.540574 4760 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"1c4cddb431be8b3c62dccf16e9fc53e970858aee69e6e1c8c0971b7d503bfbd6\": container with ID starting with 1c4cddb431be8b3c62dccf16e9fc53e970858aee69e6e1c8c0971b7d503bfbd6 not found: ID does not exist" containerID="1c4cddb431be8b3c62dccf16e9fc53e970858aee69e6e1c8c0971b7d503bfbd6" Sep 30 08:34:06 crc kubenswrapper[4760]: I0930 08:34:06.540618 4760 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1c4cddb431be8b3c62dccf16e9fc53e970858aee69e6e1c8c0971b7d503bfbd6"} err="failed to get container status \"1c4cddb431be8b3c62dccf16e9fc53e970858aee69e6e1c8c0971b7d503bfbd6\": rpc error: code = NotFound desc = could not find container \"1c4cddb431be8b3c62dccf16e9fc53e970858aee69e6e1c8c0971b7d503bfbd6\": container with ID starting with 1c4cddb431be8b3c62dccf16e9fc53e970858aee69e6e1c8c0971b7d503bfbd6 not found: ID does not exist" Sep 30 08:34:06 crc kubenswrapper[4760]: I0930 08:34:06.715558 4760 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-992gm"] Sep 30 08:34:06 crc kubenswrapper[4760]: I0930 08:34:06.725280 4760 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-992gm"] Sep 30 08:34:07 crc kubenswrapper[4760]: I0930 08:34:07.084848 4760 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ef0b5e8c-2c02-4962-a789-70cec4f9d18b" path="/var/lib/kubelet/pods/ef0b5e8c-2c02-4962-a789-70cec4f9d18b/volumes" Sep 30 08:34:13 crc kubenswrapper[4760]: I0930 08:34:13.066865 4760 scope.go:117] "RemoveContainer" containerID="3b3ee338429c89d453b302bf27db16e924b60d460569221c449cab6e0c51d84a" Sep 30 08:34:13 crc kubenswrapper[4760]: E0930 08:34:13.068564 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-f2lrk_openshift-machine-config-operator(7a9c8270-6964-4886-87d0-227b05b76da4)\"" pod="openshift-machine-config-operator/machine-config-daemon-f2lrk" podUID="7a9c8270-6964-4886-87d0-227b05b76da4" Sep 30 08:34:25 crc kubenswrapper[4760]: I0930 08:34:25.073848 4760 scope.go:117] "RemoveContainer" containerID="3b3ee338429c89d453b302bf27db16e924b60d460569221c449cab6e0c51d84a" Sep 30 08:34:25 crc kubenswrapper[4760]: E0930 08:34:25.074854 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f2lrk_openshift-machine-config-operator(7a9c8270-6964-4886-87d0-227b05b76da4)\"" pod="openshift-machine-config-operator/machine-config-daemon-f2lrk" podUID="7a9c8270-6964-4886-87d0-227b05b76da4" Sep 30 08:34:37 crc kubenswrapper[4760]: I0930 08:34:37.068567 4760 scope.go:117] "RemoveContainer" containerID="3b3ee338429c89d453b302bf27db16e924b60d460569221c449cab6e0c51d84a" Sep 30 08:34:37 crc kubenswrapper[4760]: E0930 08:34:37.069609 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f2lrk_openshift-machine-config-operator(7a9c8270-6964-4886-87d0-227b05b76da4)\"" pod="openshift-machine-config-operator/machine-config-daemon-f2lrk" podUID="7a9c8270-6964-4886-87d0-227b05b76da4" Sep 30 08:34:50 crc kubenswrapper[4760]: I0930 08:34:50.067187 4760 scope.go:117] "RemoveContainer" containerID="3b3ee338429c89d453b302bf27db16e924b60d460569221c449cab6e0c51d84a" Sep 30 08:34:50 crc kubenswrapper[4760]: E0930 08:34:50.067993 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-f2lrk_openshift-machine-config-operator(7a9c8270-6964-4886-87d0-227b05b76da4)\"" pod="openshift-machine-config-operator/machine-config-daemon-f2lrk" podUID="7a9c8270-6964-4886-87d0-227b05b76da4" Sep 30 08:34:52 crc kubenswrapper[4760]: I0930 08:34:52.690831 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-97kq4"] Sep 30 08:34:52 crc kubenswrapper[4760]: E0930 08:34:52.692073 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ef0b5e8c-2c02-4962-a789-70cec4f9d18b" containerName="extract-utilities" Sep 30 08:34:52 crc kubenswrapper[4760]: I0930 08:34:52.692104 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="ef0b5e8c-2c02-4962-a789-70cec4f9d18b" containerName="extract-utilities" Sep 30 08:34:52 crc kubenswrapper[4760]: E0930 08:34:52.692181 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ef0b5e8c-2c02-4962-a789-70cec4f9d18b" containerName="registry-server" Sep 30 08:34:52 crc kubenswrapper[4760]: I0930 08:34:52.692200 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="ef0b5e8c-2c02-4962-a789-70cec4f9d18b" containerName="registry-server" Sep 30 08:34:52 crc kubenswrapper[4760]: E0930 08:34:52.692237 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ef0b5e8c-2c02-4962-a789-70cec4f9d18b" containerName="extract-content" Sep 30 08:34:52 crc kubenswrapper[4760]: I0930 08:34:52.692258 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="ef0b5e8c-2c02-4962-a789-70cec4f9d18b" containerName="extract-content" Sep 30 08:34:52 crc kubenswrapper[4760]: I0930 08:34:52.692785 4760 memory_manager.go:354] "RemoveStaleState removing state" podUID="ef0b5e8c-2c02-4962-a789-70cec4f9d18b" containerName="registry-server" Sep 30 08:34:52 crc kubenswrapper[4760]: I0930 08:34:52.696091 4760 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-97kq4" Sep 30 08:34:52 crc kubenswrapper[4760]: I0930 08:34:52.716549 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-97kq4"] Sep 30 08:34:52 crc kubenswrapper[4760]: I0930 08:34:52.798014 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5j4xn\" (UniqueName: \"kubernetes.io/projected/6fa7a6c6-3ce1-4b23-a647-6f89199cd038-kube-api-access-5j4xn\") pod \"certified-operators-97kq4\" (UID: \"6fa7a6c6-3ce1-4b23-a647-6f89199cd038\") " pod="openshift-marketplace/certified-operators-97kq4" Sep 30 08:34:52 crc kubenswrapper[4760]: I0930 08:34:52.798123 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6fa7a6c6-3ce1-4b23-a647-6f89199cd038-utilities\") pod \"certified-operators-97kq4\" (UID: \"6fa7a6c6-3ce1-4b23-a647-6f89199cd038\") " pod="openshift-marketplace/certified-operators-97kq4" Sep 30 08:34:52 crc kubenswrapper[4760]: I0930 08:34:52.798334 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6fa7a6c6-3ce1-4b23-a647-6f89199cd038-catalog-content\") pod \"certified-operators-97kq4\" (UID: \"6fa7a6c6-3ce1-4b23-a647-6f89199cd038\") " pod="openshift-marketplace/certified-operators-97kq4" Sep 30 08:34:52 crc kubenswrapper[4760]: I0930 08:34:52.899465 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6fa7a6c6-3ce1-4b23-a647-6f89199cd038-catalog-content\") pod \"certified-operators-97kq4\" (UID: \"6fa7a6c6-3ce1-4b23-a647-6f89199cd038\") " pod="openshift-marketplace/certified-operators-97kq4" Sep 30 08:34:52 crc kubenswrapper[4760]: I0930 08:34:52.899813 4760 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-5j4xn\" (UniqueName: \"kubernetes.io/projected/6fa7a6c6-3ce1-4b23-a647-6f89199cd038-kube-api-access-5j4xn\") pod \"certified-operators-97kq4\" (UID: \"6fa7a6c6-3ce1-4b23-a647-6f89199cd038\") " pod="openshift-marketplace/certified-operators-97kq4" Sep 30 08:34:52 crc kubenswrapper[4760]: I0930 08:34:52.899858 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6fa7a6c6-3ce1-4b23-a647-6f89199cd038-utilities\") pod \"certified-operators-97kq4\" (UID: \"6fa7a6c6-3ce1-4b23-a647-6f89199cd038\") " pod="openshift-marketplace/certified-operators-97kq4" Sep 30 08:34:52 crc kubenswrapper[4760]: I0930 08:34:52.899979 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6fa7a6c6-3ce1-4b23-a647-6f89199cd038-catalog-content\") pod \"certified-operators-97kq4\" (UID: \"6fa7a6c6-3ce1-4b23-a647-6f89199cd038\") " pod="openshift-marketplace/certified-operators-97kq4" Sep 30 08:34:52 crc kubenswrapper[4760]: I0930 08:34:52.900407 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6fa7a6c6-3ce1-4b23-a647-6f89199cd038-utilities\") pod \"certified-operators-97kq4\" (UID: \"6fa7a6c6-3ce1-4b23-a647-6f89199cd038\") " pod="openshift-marketplace/certified-operators-97kq4" Sep 30 08:34:52 crc kubenswrapper[4760]: I0930 08:34:52.925528 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5j4xn\" (UniqueName: \"kubernetes.io/projected/6fa7a6c6-3ce1-4b23-a647-6f89199cd038-kube-api-access-5j4xn\") pod \"certified-operators-97kq4\" (UID: \"6fa7a6c6-3ce1-4b23-a647-6f89199cd038\") " pod="openshift-marketplace/certified-operators-97kq4" Sep 30 08:34:53 crc kubenswrapper[4760]: I0930 08:34:53.044249 4760 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-97kq4" Sep 30 08:34:53 crc kubenswrapper[4760]: I0930 08:34:53.502443 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-97kq4"] Sep 30 08:34:53 crc kubenswrapper[4760]: W0930 08:34:53.511488 4760 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6fa7a6c6_3ce1_4b23_a647_6f89199cd038.slice/crio-94898720b93755b85a0a9739baf9739355964af8f9ef405d644289f3b12660eb WatchSource:0}: Error finding container 94898720b93755b85a0a9739baf9739355964af8f9ef405d644289f3b12660eb: Status 404 returned error can't find the container with id 94898720b93755b85a0a9739baf9739355964af8f9ef405d644289f3b12660eb Sep 30 08:34:53 crc kubenswrapper[4760]: I0930 08:34:53.916164 4760 generic.go:334] "Generic (PLEG): container finished" podID="6fa7a6c6-3ce1-4b23-a647-6f89199cd038" containerID="b2343b3ce23d8134376a900981cb0b327937273a426434443cdafe9465d7da63" exitCode=0 Sep 30 08:34:53 crc kubenswrapper[4760]: I0930 08:34:53.916234 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-97kq4" event={"ID":"6fa7a6c6-3ce1-4b23-a647-6f89199cd038","Type":"ContainerDied","Data":"b2343b3ce23d8134376a900981cb0b327937273a426434443cdafe9465d7da63"} Sep 30 08:34:53 crc kubenswrapper[4760]: I0930 08:34:53.916682 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-97kq4" event={"ID":"6fa7a6c6-3ce1-4b23-a647-6f89199cd038","Type":"ContainerStarted","Data":"94898720b93755b85a0a9739baf9739355964af8f9ef405d644289f3b12660eb"} Sep 30 08:34:54 crc kubenswrapper[4760]: I0930 08:34:54.931943 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-97kq4" 
event={"ID":"6fa7a6c6-3ce1-4b23-a647-6f89199cd038","Type":"ContainerStarted","Data":"566a55312180777bf28eff2634d47c7354ccc79a1b4b0cda3b3f11a8c1e2f177"} Sep 30 08:34:55 crc kubenswrapper[4760]: I0930 08:34:55.946139 4760 generic.go:334] "Generic (PLEG): container finished" podID="6fa7a6c6-3ce1-4b23-a647-6f89199cd038" containerID="566a55312180777bf28eff2634d47c7354ccc79a1b4b0cda3b3f11a8c1e2f177" exitCode=0 Sep 30 08:34:55 crc kubenswrapper[4760]: I0930 08:34:55.946274 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-97kq4" event={"ID":"6fa7a6c6-3ce1-4b23-a647-6f89199cd038","Type":"ContainerDied","Data":"566a55312180777bf28eff2634d47c7354ccc79a1b4b0cda3b3f11a8c1e2f177"} Sep 30 08:34:57 crc kubenswrapper[4760]: I0930 08:34:57.969823 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-97kq4" event={"ID":"6fa7a6c6-3ce1-4b23-a647-6f89199cd038","Type":"ContainerStarted","Data":"30dd438b66c27ba8064ff209f349bca10af2994ef057c57018a26ba78af4019f"} Sep 30 08:34:57 crc kubenswrapper[4760]: I0930 08:34:57.991674 4760 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-97kq4" podStartSLOduration=3.185198269 podStartE2EDuration="5.991652308s" podCreationTimestamp="2025-09-30 08:34:52 +0000 UTC" firstStartedPulling="2025-09-30 08:34:53.9184844 +0000 UTC m=+3679.561390822" lastFinishedPulling="2025-09-30 08:34:56.724938399 +0000 UTC m=+3682.367844861" observedRunningTime="2025-09-30 08:34:57.984814854 +0000 UTC m=+3683.627721286" watchObservedRunningTime="2025-09-30 08:34:57.991652308 +0000 UTC m=+3683.634558730" Sep 30 08:35:03 crc kubenswrapper[4760]: I0930 08:35:03.044873 4760 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-97kq4" Sep 30 08:35:03 crc kubenswrapper[4760]: I0930 08:35:03.045608 4760 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="" pod="openshift-marketplace/certified-operators-97kq4" Sep 30 08:35:03 crc kubenswrapper[4760]: I0930 08:35:03.135848 4760 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-97kq4" Sep 30 08:35:04 crc kubenswrapper[4760]: I0930 08:35:04.067427 4760 scope.go:117] "RemoveContainer" containerID="3b3ee338429c89d453b302bf27db16e924b60d460569221c449cab6e0c51d84a" Sep 30 08:35:04 crc kubenswrapper[4760]: E0930 08:35:04.068460 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f2lrk_openshift-machine-config-operator(7a9c8270-6964-4886-87d0-227b05b76da4)\"" pod="openshift-machine-config-operator/machine-config-daemon-f2lrk" podUID="7a9c8270-6964-4886-87d0-227b05b76da4" Sep 30 08:35:04 crc kubenswrapper[4760]: I0930 08:35:04.099917 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-97kq4" Sep 30 08:35:04 crc kubenswrapper[4760]: I0930 08:35:04.152476 4760 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-97kq4"] Sep 30 08:35:06 crc kubenswrapper[4760]: I0930 08:35:06.055226 4760 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-97kq4" podUID="6fa7a6c6-3ce1-4b23-a647-6f89199cd038" containerName="registry-server" containerID="cri-o://30dd438b66c27ba8064ff209f349bca10af2994ef057c57018a26ba78af4019f" gracePeriod=2 Sep 30 08:35:06 crc kubenswrapper[4760]: I0930 08:35:06.655106 4760 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-97kq4" Sep 30 08:35:06 crc kubenswrapper[4760]: I0930 08:35:06.805878 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6fa7a6c6-3ce1-4b23-a647-6f89199cd038-utilities\") pod \"6fa7a6c6-3ce1-4b23-a647-6f89199cd038\" (UID: \"6fa7a6c6-3ce1-4b23-a647-6f89199cd038\") " Sep 30 08:35:06 crc kubenswrapper[4760]: I0930 08:35:06.805949 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6fa7a6c6-3ce1-4b23-a647-6f89199cd038-catalog-content\") pod \"6fa7a6c6-3ce1-4b23-a647-6f89199cd038\" (UID: \"6fa7a6c6-3ce1-4b23-a647-6f89199cd038\") " Sep 30 08:35:06 crc kubenswrapper[4760]: I0930 08:35:06.806042 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5j4xn\" (UniqueName: \"kubernetes.io/projected/6fa7a6c6-3ce1-4b23-a647-6f89199cd038-kube-api-access-5j4xn\") pod \"6fa7a6c6-3ce1-4b23-a647-6f89199cd038\" (UID: \"6fa7a6c6-3ce1-4b23-a647-6f89199cd038\") " Sep 30 08:35:06 crc kubenswrapper[4760]: I0930 08:35:06.807498 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6fa7a6c6-3ce1-4b23-a647-6f89199cd038-utilities" (OuterVolumeSpecName: "utilities") pod "6fa7a6c6-3ce1-4b23-a647-6f89199cd038" (UID: "6fa7a6c6-3ce1-4b23-a647-6f89199cd038"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 08:35:06 crc kubenswrapper[4760]: I0930 08:35:06.809200 4760 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6fa7a6c6-3ce1-4b23-a647-6f89199cd038-utilities\") on node \"crc\" DevicePath \"\"" Sep 30 08:35:06 crc kubenswrapper[4760]: I0930 08:35:06.813058 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6fa7a6c6-3ce1-4b23-a647-6f89199cd038-kube-api-access-5j4xn" (OuterVolumeSpecName: "kube-api-access-5j4xn") pod "6fa7a6c6-3ce1-4b23-a647-6f89199cd038" (UID: "6fa7a6c6-3ce1-4b23-a647-6f89199cd038"). InnerVolumeSpecName "kube-api-access-5j4xn". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 08:35:06 crc kubenswrapper[4760]: I0930 08:35:06.871941 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6fa7a6c6-3ce1-4b23-a647-6f89199cd038-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "6fa7a6c6-3ce1-4b23-a647-6f89199cd038" (UID: "6fa7a6c6-3ce1-4b23-a647-6f89199cd038"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 08:35:06 crc kubenswrapper[4760]: I0930 08:35:06.912980 4760 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6fa7a6c6-3ce1-4b23-a647-6f89199cd038-catalog-content\") on node \"crc\" DevicePath \"\"" Sep 30 08:35:06 crc kubenswrapper[4760]: I0930 08:35:06.913036 4760 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5j4xn\" (UniqueName: \"kubernetes.io/projected/6fa7a6c6-3ce1-4b23-a647-6f89199cd038-kube-api-access-5j4xn\") on node \"crc\" DevicePath \"\"" Sep 30 08:35:07 crc kubenswrapper[4760]: I0930 08:35:07.073601 4760 generic.go:334] "Generic (PLEG): container finished" podID="6fa7a6c6-3ce1-4b23-a647-6f89199cd038" containerID="30dd438b66c27ba8064ff209f349bca10af2994ef057c57018a26ba78af4019f" exitCode=0 Sep 30 08:35:07 crc kubenswrapper[4760]: I0930 08:35:07.073745 4760 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-97kq4" Sep 30 08:35:07 crc kubenswrapper[4760]: I0930 08:35:07.088416 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-97kq4" event={"ID":"6fa7a6c6-3ce1-4b23-a647-6f89199cd038","Type":"ContainerDied","Data":"30dd438b66c27ba8064ff209f349bca10af2994ef057c57018a26ba78af4019f"} Sep 30 08:35:07 crc kubenswrapper[4760]: I0930 08:35:07.088497 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-97kq4" event={"ID":"6fa7a6c6-3ce1-4b23-a647-6f89199cd038","Type":"ContainerDied","Data":"94898720b93755b85a0a9739baf9739355964af8f9ef405d644289f3b12660eb"} Sep 30 08:35:07 crc kubenswrapper[4760]: I0930 08:35:07.088539 4760 scope.go:117] "RemoveContainer" containerID="30dd438b66c27ba8064ff209f349bca10af2994ef057c57018a26ba78af4019f" Sep 30 08:35:07 crc kubenswrapper[4760]: I0930 08:35:07.133836 4760 scope.go:117] "RemoveContainer" 
containerID="566a55312180777bf28eff2634d47c7354ccc79a1b4b0cda3b3f11a8c1e2f177" Sep 30 08:35:07 crc kubenswrapper[4760]: I0930 08:35:07.146990 4760 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-97kq4"] Sep 30 08:35:07 crc kubenswrapper[4760]: I0930 08:35:07.164505 4760 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-97kq4"] Sep 30 08:35:07 crc kubenswrapper[4760]: I0930 08:35:07.171112 4760 scope.go:117] "RemoveContainer" containerID="b2343b3ce23d8134376a900981cb0b327937273a426434443cdafe9465d7da63" Sep 30 08:35:07 crc kubenswrapper[4760]: I0930 08:35:07.269646 4760 scope.go:117] "RemoveContainer" containerID="30dd438b66c27ba8064ff209f349bca10af2994ef057c57018a26ba78af4019f" Sep 30 08:35:07 crc kubenswrapper[4760]: E0930 08:35:07.270706 4760 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"30dd438b66c27ba8064ff209f349bca10af2994ef057c57018a26ba78af4019f\": container with ID starting with 30dd438b66c27ba8064ff209f349bca10af2994ef057c57018a26ba78af4019f not found: ID does not exist" containerID="30dd438b66c27ba8064ff209f349bca10af2994ef057c57018a26ba78af4019f" Sep 30 08:35:07 crc kubenswrapper[4760]: I0930 08:35:07.270759 4760 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"30dd438b66c27ba8064ff209f349bca10af2994ef057c57018a26ba78af4019f"} err="failed to get container status \"30dd438b66c27ba8064ff209f349bca10af2994ef057c57018a26ba78af4019f\": rpc error: code = NotFound desc = could not find container \"30dd438b66c27ba8064ff209f349bca10af2994ef057c57018a26ba78af4019f\": container with ID starting with 30dd438b66c27ba8064ff209f349bca10af2994ef057c57018a26ba78af4019f not found: ID does not exist" Sep 30 08:35:07 crc kubenswrapper[4760]: I0930 08:35:07.270793 4760 scope.go:117] "RemoveContainer" 
containerID="566a55312180777bf28eff2634d47c7354ccc79a1b4b0cda3b3f11a8c1e2f177" Sep 30 08:35:07 crc kubenswrapper[4760]: E0930 08:35:07.271046 4760 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"566a55312180777bf28eff2634d47c7354ccc79a1b4b0cda3b3f11a8c1e2f177\": container with ID starting with 566a55312180777bf28eff2634d47c7354ccc79a1b4b0cda3b3f11a8c1e2f177 not found: ID does not exist" containerID="566a55312180777bf28eff2634d47c7354ccc79a1b4b0cda3b3f11a8c1e2f177" Sep 30 08:35:07 crc kubenswrapper[4760]: I0930 08:35:07.271090 4760 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"566a55312180777bf28eff2634d47c7354ccc79a1b4b0cda3b3f11a8c1e2f177"} err="failed to get container status \"566a55312180777bf28eff2634d47c7354ccc79a1b4b0cda3b3f11a8c1e2f177\": rpc error: code = NotFound desc = could not find container \"566a55312180777bf28eff2634d47c7354ccc79a1b4b0cda3b3f11a8c1e2f177\": container with ID starting with 566a55312180777bf28eff2634d47c7354ccc79a1b4b0cda3b3f11a8c1e2f177 not found: ID does not exist" Sep 30 08:35:07 crc kubenswrapper[4760]: I0930 08:35:07.271114 4760 scope.go:117] "RemoveContainer" containerID="b2343b3ce23d8134376a900981cb0b327937273a426434443cdafe9465d7da63" Sep 30 08:35:07 crc kubenswrapper[4760]: E0930 08:35:07.271569 4760 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b2343b3ce23d8134376a900981cb0b327937273a426434443cdafe9465d7da63\": container with ID starting with b2343b3ce23d8134376a900981cb0b327937273a426434443cdafe9465d7da63 not found: ID does not exist" containerID="b2343b3ce23d8134376a900981cb0b327937273a426434443cdafe9465d7da63" Sep 30 08:35:07 crc kubenswrapper[4760]: I0930 08:35:07.271615 4760 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"b2343b3ce23d8134376a900981cb0b327937273a426434443cdafe9465d7da63"} err="failed to get container status \"b2343b3ce23d8134376a900981cb0b327937273a426434443cdafe9465d7da63\": rpc error: code = NotFound desc = could not find container \"b2343b3ce23d8134376a900981cb0b327937273a426434443cdafe9465d7da63\": container with ID starting with b2343b3ce23d8134376a900981cb0b327937273a426434443cdafe9465d7da63 not found: ID does not exist" Sep 30 08:35:09 crc kubenswrapper[4760]: I0930 08:35:09.084905 4760 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6fa7a6c6-3ce1-4b23-a647-6f89199cd038" path="/var/lib/kubelet/pods/6fa7a6c6-3ce1-4b23-a647-6f89199cd038/volumes" Sep 30 08:35:19 crc kubenswrapper[4760]: I0930 08:35:19.069219 4760 scope.go:117] "RemoveContainer" containerID="3b3ee338429c89d453b302bf27db16e924b60d460569221c449cab6e0c51d84a" Sep 30 08:35:19 crc kubenswrapper[4760]: E0930 08:35:19.069913 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f2lrk_openshift-machine-config-operator(7a9c8270-6964-4886-87d0-227b05b76da4)\"" pod="openshift-machine-config-operator/machine-config-daemon-f2lrk" podUID="7a9c8270-6964-4886-87d0-227b05b76da4" Sep 30 08:35:32 crc kubenswrapper[4760]: I0930 08:35:32.067041 4760 scope.go:117] "RemoveContainer" containerID="3b3ee338429c89d453b302bf27db16e924b60d460569221c449cab6e0c51d84a" Sep 30 08:35:32 crc kubenswrapper[4760]: E0930 08:35:32.068159 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f2lrk_openshift-machine-config-operator(7a9c8270-6964-4886-87d0-227b05b76da4)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-f2lrk" podUID="7a9c8270-6964-4886-87d0-227b05b76da4" Sep 30 08:35:44 crc kubenswrapper[4760]: I0930 08:35:44.067661 4760 scope.go:117] "RemoveContainer" containerID="3b3ee338429c89d453b302bf27db16e924b60d460569221c449cab6e0c51d84a" Sep 30 08:35:44 crc kubenswrapper[4760]: E0930 08:35:44.068957 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f2lrk_openshift-machine-config-operator(7a9c8270-6964-4886-87d0-227b05b76da4)\"" pod="openshift-machine-config-operator/machine-config-daemon-f2lrk" podUID="7a9c8270-6964-4886-87d0-227b05b76da4" Sep 30 08:35:58 crc kubenswrapper[4760]: I0930 08:35:58.067037 4760 scope.go:117] "RemoveContainer" containerID="3b3ee338429c89d453b302bf27db16e924b60d460569221c449cab6e0c51d84a" Sep 30 08:35:58 crc kubenswrapper[4760]: E0930 08:35:58.067946 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f2lrk_openshift-machine-config-operator(7a9c8270-6964-4886-87d0-227b05b76da4)\"" pod="openshift-machine-config-operator/machine-config-daemon-f2lrk" podUID="7a9c8270-6964-4886-87d0-227b05b76da4" Sep 30 08:36:08 crc kubenswrapper[4760]: I0930 08:36:08.353061 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-n757x"] Sep 30 08:36:08 crc kubenswrapper[4760]: E0930 08:36:08.354270 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6fa7a6c6-3ce1-4b23-a647-6f89199cd038" containerName="extract-content" Sep 30 08:36:08 crc kubenswrapper[4760]: I0930 08:36:08.354293 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="6fa7a6c6-3ce1-4b23-a647-6f89199cd038" 
containerName="extract-content" Sep 30 08:36:08 crc kubenswrapper[4760]: E0930 08:36:08.354355 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6fa7a6c6-3ce1-4b23-a647-6f89199cd038" containerName="extract-utilities" Sep 30 08:36:08 crc kubenswrapper[4760]: I0930 08:36:08.354368 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="6fa7a6c6-3ce1-4b23-a647-6f89199cd038" containerName="extract-utilities" Sep 30 08:36:08 crc kubenswrapper[4760]: E0930 08:36:08.354392 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6fa7a6c6-3ce1-4b23-a647-6f89199cd038" containerName="registry-server" Sep 30 08:36:08 crc kubenswrapper[4760]: I0930 08:36:08.354404 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="6fa7a6c6-3ce1-4b23-a647-6f89199cd038" containerName="registry-server" Sep 30 08:36:08 crc kubenswrapper[4760]: I0930 08:36:08.354673 4760 memory_manager.go:354] "RemoveStaleState removing state" podUID="6fa7a6c6-3ce1-4b23-a647-6f89199cd038" containerName="registry-server" Sep 30 08:36:08 crc kubenswrapper[4760]: I0930 08:36:08.356739 4760 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-n757x" Sep 30 08:36:08 crc kubenswrapper[4760]: I0930 08:36:08.364541 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-n757x"] Sep 30 08:36:08 crc kubenswrapper[4760]: I0930 08:36:08.420000 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/53856d8f-8edd-46e4-a0c0-c04bdfec3c40-catalog-content\") pod \"redhat-marketplace-n757x\" (UID: \"53856d8f-8edd-46e4-a0c0-c04bdfec3c40\") " pod="openshift-marketplace/redhat-marketplace-n757x" Sep 30 08:36:08 crc kubenswrapper[4760]: I0930 08:36:08.420242 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/53856d8f-8edd-46e4-a0c0-c04bdfec3c40-utilities\") pod \"redhat-marketplace-n757x\" (UID: \"53856d8f-8edd-46e4-a0c0-c04bdfec3c40\") " pod="openshift-marketplace/redhat-marketplace-n757x" Sep 30 08:36:08 crc kubenswrapper[4760]: I0930 08:36:08.420521 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cfwvx\" (UniqueName: \"kubernetes.io/projected/53856d8f-8edd-46e4-a0c0-c04bdfec3c40-kube-api-access-cfwvx\") pod \"redhat-marketplace-n757x\" (UID: \"53856d8f-8edd-46e4-a0c0-c04bdfec3c40\") " pod="openshift-marketplace/redhat-marketplace-n757x" Sep 30 08:36:08 crc kubenswrapper[4760]: I0930 08:36:08.522070 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cfwvx\" (UniqueName: \"kubernetes.io/projected/53856d8f-8edd-46e4-a0c0-c04bdfec3c40-kube-api-access-cfwvx\") pod \"redhat-marketplace-n757x\" (UID: \"53856d8f-8edd-46e4-a0c0-c04bdfec3c40\") " pod="openshift-marketplace/redhat-marketplace-n757x" Sep 30 08:36:08 crc kubenswrapper[4760]: I0930 08:36:08.522169 4760 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/53856d8f-8edd-46e4-a0c0-c04bdfec3c40-catalog-content\") pod \"redhat-marketplace-n757x\" (UID: \"53856d8f-8edd-46e4-a0c0-c04bdfec3c40\") " pod="openshift-marketplace/redhat-marketplace-n757x" Sep 30 08:36:08 crc kubenswrapper[4760]: I0930 08:36:08.522692 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/53856d8f-8edd-46e4-a0c0-c04bdfec3c40-catalog-content\") pod \"redhat-marketplace-n757x\" (UID: \"53856d8f-8edd-46e4-a0c0-c04bdfec3c40\") " pod="openshift-marketplace/redhat-marketplace-n757x" Sep 30 08:36:08 crc kubenswrapper[4760]: I0930 08:36:08.523041 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/53856d8f-8edd-46e4-a0c0-c04bdfec3c40-utilities\") pod \"redhat-marketplace-n757x\" (UID: \"53856d8f-8edd-46e4-a0c0-c04bdfec3c40\") " pod="openshift-marketplace/redhat-marketplace-n757x" Sep 30 08:36:08 crc kubenswrapper[4760]: I0930 08:36:08.522786 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/53856d8f-8edd-46e4-a0c0-c04bdfec3c40-utilities\") pod \"redhat-marketplace-n757x\" (UID: \"53856d8f-8edd-46e4-a0c0-c04bdfec3c40\") " pod="openshift-marketplace/redhat-marketplace-n757x" Sep 30 08:36:08 crc kubenswrapper[4760]: I0930 08:36:08.545429 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cfwvx\" (UniqueName: \"kubernetes.io/projected/53856d8f-8edd-46e4-a0c0-c04bdfec3c40-kube-api-access-cfwvx\") pod \"redhat-marketplace-n757x\" (UID: \"53856d8f-8edd-46e4-a0c0-c04bdfec3c40\") " pod="openshift-marketplace/redhat-marketplace-n757x" Sep 30 08:36:08 crc kubenswrapper[4760]: I0930 08:36:08.687248 4760 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-n757x" Sep 30 08:36:09 crc kubenswrapper[4760]: W0930 08:36:09.137990 4760 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod53856d8f_8edd_46e4_a0c0_c04bdfec3c40.slice/crio-23ca2a189f8297e1f19da63641acbb75de3cbf8152955b25e75c37b3c9cc97a7 WatchSource:0}: Error finding container 23ca2a189f8297e1f19da63641acbb75de3cbf8152955b25e75c37b3c9cc97a7: Status 404 returned error can't find the container with id 23ca2a189f8297e1f19da63641acbb75de3cbf8152955b25e75c37b3c9cc97a7 Sep 30 08:36:09 crc kubenswrapper[4760]: I0930 08:36:09.158661 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-n757x"] Sep 30 08:36:09 crc kubenswrapper[4760]: I0930 08:36:09.811611 4760 generic.go:334] "Generic (PLEG): container finished" podID="53856d8f-8edd-46e4-a0c0-c04bdfec3c40" containerID="e4d37a85a3e8435d08d571c19ddbf88256b001a75d30b5b8d74103f9380e4bae" exitCode=0 Sep 30 08:36:09 crc kubenswrapper[4760]: I0930 08:36:09.811680 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-n757x" event={"ID":"53856d8f-8edd-46e4-a0c0-c04bdfec3c40","Type":"ContainerDied","Data":"e4d37a85a3e8435d08d571c19ddbf88256b001a75d30b5b8d74103f9380e4bae"} Sep 30 08:36:09 crc kubenswrapper[4760]: I0930 08:36:09.815099 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-n757x" event={"ID":"53856d8f-8edd-46e4-a0c0-c04bdfec3c40","Type":"ContainerStarted","Data":"23ca2a189f8297e1f19da63641acbb75de3cbf8152955b25e75c37b3c9cc97a7"} Sep 30 08:36:10 crc kubenswrapper[4760]: I0930 08:36:10.832316 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-n757x" 
event={"ID":"53856d8f-8edd-46e4-a0c0-c04bdfec3c40","Type":"ContainerStarted","Data":"ac92d3dd7ecf7988f28b557d672e0fd57df510d28da819140558a9e44213dd52"} Sep 30 08:36:11 crc kubenswrapper[4760]: I0930 08:36:11.842964 4760 generic.go:334] "Generic (PLEG): container finished" podID="53856d8f-8edd-46e4-a0c0-c04bdfec3c40" containerID="ac92d3dd7ecf7988f28b557d672e0fd57df510d28da819140558a9e44213dd52" exitCode=0 Sep 30 08:36:11 crc kubenswrapper[4760]: I0930 08:36:11.843004 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-n757x" event={"ID":"53856d8f-8edd-46e4-a0c0-c04bdfec3c40","Type":"ContainerDied","Data":"ac92d3dd7ecf7988f28b557d672e0fd57df510d28da819140558a9e44213dd52"} Sep 30 08:36:12 crc kubenswrapper[4760]: I0930 08:36:12.858716 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-n757x" event={"ID":"53856d8f-8edd-46e4-a0c0-c04bdfec3c40","Type":"ContainerStarted","Data":"f1f64c2502db18a99295b0ed4f9b52f04ffaf57a18cc8c44683371d50eeab6e9"} Sep 30 08:36:12 crc kubenswrapper[4760]: I0930 08:36:12.893553 4760 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-n757x" podStartSLOduration=2.381708708 podStartE2EDuration="4.893534887s" podCreationTimestamp="2025-09-30 08:36:08 +0000 UTC" firstStartedPulling="2025-09-30 08:36:09.813295976 +0000 UTC m=+3755.456202388" lastFinishedPulling="2025-09-30 08:36:12.325122145 +0000 UTC m=+3757.968028567" observedRunningTime="2025-09-30 08:36:12.882643368 +0000 UTC m=+3758.525549810" watchObservedRunningTime="2025-09-30 08:36:12.893534887 +0000 UTC m=+3758.536441299" Sep 30 08:36:13 crc kubenswrapper[4760]: I0930 08:36:13.067560 4760 scope.go:117] "RemoveContainer" containerID="3b3ee338429c89d453b302bf27db16e924b60d460569221c449cab6e0c51d84a" Sep 30 08:36:13 crc kubenswrapper[4760]: E0930 08:36:13.067820 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f2lrk_openshift-machine-config-operator(7a9c8270-6964-4886-87d0-227b05b76da4)\"" pod="openshift-machine-config-operator/machine-config-daemon-f2lrk" podUID="7a9c8270-6964-4886-87d0-227b05b76da4" Sep 30 08:36:18 crc kubenswrapper[4760]: I0930 08:36:18.687980 4760 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-n757x" Sep 30 08:36:18 crc kubenswrapper[4760]: I0930 08:36:18.689469 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-n757x" Sep 30 08:36:18 crc kubenswrapper[4760]: I0930 08:36:18.754985 4760 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-n757x" Sep 30 08:36:19 crc kubenswrapper[4760]: I0930 08:36:19.001804 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-n757x" Sep 30 08:36:19 crc kubenswrapper[4760]: I0930 08:36:19.060885 4760 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-n757x"] Sep 30 08:36:20 crc kubenswrapper[4760]: I0930 08:36:20.963454 4760 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-n757x" podUID="53856d8f-8edd-46e4-a0c0-c04bdfec3c40" containerName="registry-server" containerID="cri-o://f1f64c2502db18a99295b0ed4f9b52f04ffaf57a18cc8c44683371d50eeab6e9" gracePeriod=2 Sep 30 08:36:21 crc kubenswrapper[4760]: I0930 08:36:21.465348 4760 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-n757x" Sep 30 08:36:21 crc kubenswrapper[4760]: I0930 08:36:21.621001 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cfwvx\" (UniqueName: \"kubernetes.io/projected/53856d8f-8edd-46e4-a0c0-c04bdfec3c40-kube-api-access-cfwvx\") pod \"53856d8f-8edd-46e4-a0c0-c04bdfec3c40\" (UID: \"53856d8f-8edd-46e4-a0c0-c04bdfec3c40\") " Sep 30 08:36:21 crc kubenswrapper[4760]: I0930 08:36:21.621573 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/53856d8f-8edd-46e4-a0c0-c04bdfec3c40-catalog-content\") pod \"53856d8f-8edd-46e4-a0c0-c04bdfec3c40\" (UID: \"53856d8f-8edd-46e4-a0c0-c04bdfec3c40\") " Sep 30 08:36:21 crc kubenswrapper[4760]: I0930 08:36:21.622465 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/53856d8f-8edd-46e4-a0c0-c04bdfec3c40-utilities\") pod \"53856d8f-8edd-46e4-a0c0-c04bdfec3c40\" (UID: \"53856d8f-8edd-46e4-a0c0-c04bdfec3c40\") " Sep 30 08:36:21 crc kubenswrapper[4760]: I0930 08:36:21.623067 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/53856d8f-8edd-46e4-a0c0-c04bdfec3c40-utilities" (OuterVolumeSpecName: "utilities") pod "53856d8f-8edd-46e4-a0c0-c04bdfec3c40" (UID: "53856d8f-8edd-46e4-a0c0-c04bdfec3c40"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 08:36:21 crc kubenswrapper[4760]: I0930 08:36:21.623208 4760 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/53856d8f-8edd-46e4-a0c0-c04bdfec3c40-utilities\") on node \"crc\" DevicePath \"\"" Sep 30 08:36:21 crc kubenswrapper[4760]: I0930 08:36:21.625939 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/53856d8f-8edd-46e4-a0c0-c04bdfec3c40-kube-api-access-cfwvx" (OuterVolumeSpecName: "kube-api-access-cfwvx") pod "53856d8f-8edd-46e4-a0c0-c04bdfec3c40" (UID: "53856d8f-8edd-46e4-a0c0-c04bdfec3c40"). InnerVolumeSpecName "kube-api-access-cfwvx". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 08:36:21 crc kubenswrapper[4760]: I0930 08:36:21.633250 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/53856d8f-8edd-46e4-a0c0-c04bdfec3c40-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "53856d8f-8edd-46e4-a0c0-c04bdfec3c40" (UID: "53856d8f-8edd-46e4-a0c0-c04bdfec3c40"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 08:36:21 crc kubenswrapper[4760]: I0930 08:36:21.724952 4760 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/53856d8f-8edd-46e4-a0c0-c04bdfec3c40-catalog-content\") on node \"crc\" DevicePath \"\"" Sep 30 08:36:21 crc kubenswrapper[4760]: I0930 08:36:21.724987 4760 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cfwvx\" (UniqueName: \"kubernetes.io/projected/53856d8f-8edd-46e4-a0c0-c04bdfec3c40-kube-api-access-cfwvx\") on node \"crc\" DevicePath \"\"" Sep 30 08:36:21 crc kubenswrapper[4760]: I0930 08:36:21.979152 4760 generic.go:334] "Generic (PLEG): container finished" podID="53856d8f-8edd-46e4-a0c0-c04bdfec3c40" containerID="f1f64c2502db18a99295b0ed4f9b52f04ffaf57a18cc8c44683371d50eeab6e9" exitCode=0 Sep 30 08:36:21 crc kubenswrapper[4760]: I0930 08:36:21.979236 4760 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-n757x" Sep 30 08:36:21 crc kubenswrapper[4760]: I0930 08:36:21.979255 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-n757x" event={"ID":"53856d8f-8edd-46e4-a0c0-c04bdfec3c40","Type":"ContainerDied","Data":"f1f64c2502db18a99295b0ed4f9b52f04ffaf57a18cc8c44683371d50eeab6e9"} Sep 30 08:36:21 crc kubenswrapper[4760]: I0930 08:36:21.980576 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-n757x" event={"ID":"53856d8f-8edd-46e4-a0c0-c04bdfec3c40","Type":"ContainerDied","Data":"23ca2a189f8297e1f19da63641acbb75de3cbf8152955b25e75c37b3c9cc97a7"} Sep 30 08:36:21 crc kubenswrapper[4760]: I0930 08:36:21.980615 4760 scope.go:117] "RemoveContainer" containerID="f1f64c2502db18a99295b0ed4f9b52f04ffaf57a18cc8c44683371d50eeab6e9" Sep 30 08:36:22 crc kubenswrapper[4760]: I0930 08:36:22.011730 4760 scope.go:117] "RemoveContainer" 
containerID="ac92d3dd7ecf7988f28b557d672e0fd57df510d28da819140558a9e44213dd52" Sep 30 08:36:22 crc kubenswrapper[4760]: I0930 08:36:22.026043 4760 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-n757x"] Sep 30 08:36:22 crc kubenswrapper[4760]: I0930 08:36:22.047560 4760 scope.go:117] "RemoveContainer" containerID="e4d37a85a3e8435d08d571c19ddbf88256b001a75d30b5b8d74103f9380e4bae" Sep 30 08:36:22 crc kubenswrapper[4760]: I0930 08:36:22.054481 4760 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-n757x"] Sep 30 08:36:22 crc kubenswrapper[4760]: I0930 08:36:22.101079 4760 scope.go:117] "RemoveContainer" containerID="f1f64c2502db18a99295b0ed4f9b52f04ffaf57a18cc8c44683371d50eeab6e9" Sep 30 08:36:22 crc kubenswrapper[4760]: E0930 08:36:22.101682 4760 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f1f64c2502db18a99295b0ed4f9b52f04ffaf57a18cc8c44683371d50eeab6e9\": container with ID starting with f1f64c2502db18a99295b0ed4f9b52f04ffaf57a18cc8c44683371d50eeab6e9 not found: ID does not exist" containerID="f1f64c2502db18a99295b0ed4f9b52f04ffaf57a18cc8c44683371d50eeab6e9" Sep 30 08:36:22 crc kubenswrapper[4760]: I0930 08:36:22.101740 4760 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f1f64c2502db18a99295b0ed4f9b52f04ffaf57a18cc8c44683371d50eeab6e9"} err="failed to get container status \"f1f64c2502db18a99295b0ed4f9b52f04ffaf57a18cc8c44683371d50eeab6e9\": rpc error: code = NotFound desc = could not find container \"f1f64c2502db18a99295b0ed4f9b52f04ffaf57a18cc8c44683371d50eeab6e9\": container with ID starting with f1f64c2502db18a99295b0ed4f9b52f04ffaf57a18cc8c44683371d50eeab6e9 not found: ID does not exist" Sep 30 08:36:22 crc kubenswrapper[4760]: I0930 08:36:22.101777 4760 scope.go:117] "RemoveContainer" 
containerID="ac92d3dd7ecf7988f28b557d672e0fd57df510d28da819140558a9e44213dd52" Sep 30 08:36:22 crc kubenswrapper[4760]: E0930 08:36:22.102216 4760 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ac92d3dd7ecf7988f28b557d672e0fd57df510d28da819140558a9e44213dd52\": container with ID starting with ac92d3dd7ecf7988f28b557d672e0fd57df510d28da819140558a9e44213dd52 not found: ID does not exist" containerID="ac92d3dd7ecf7988f28b557d672e0fd57df510d28da819140558a9e44213dd52" Sep 30 08:36:22 crc kubenswrapper[4760]: I0930 08:36:22.102257 4760 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ac92d3dd7ecf7988f28b557d672e0fd57df510d28da819140558a9e44213dd52"} err="failed to get container status \"ac92d3dd7ecf7988f28b557d672e0fd57df510d28da819140558a9e44213dd52\": rpc error: code = NotFound desc = could not find container \"ac92d3dd7ecf7988f28b557d672e0fd57df510d28da819140558a9e44213dd52\": container with ID starting with ac92d3dd7ecf7988f28b557d672e0fd57df510d28da819140558a9e44213dd52 not found: ID does not exist" Sep 30 08:36:22 crc kubenswrapper[4760]: I0930 08:36:22.102292 4760 scope.go:117] "RemoveContainer" containerID="e4d37a85a3e8435d08d571c19ddbf88256b001a75d30b5b8d74103f9380e4bae" Sep 30 08:36:22 crc kubenswrapper[4760]: E0930 08:36:22.102626 4760 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e4d37a85a3e8435d08d571c19ddbf88256b001a75d30b5b8d74103f9380e4bae\": container with ID starting with e4d37a85a3e8435d08d571c19ddbf88256b001a75d30b5b8d74103f9380e4bae not found: ID does not exist" containerID="e4d37a85a3e8435d08d571c19ddbf88256b001a75d30b5b8d74103f9380e4bae" Sep 30 08:36:22 crc kubenswrapper[4760]: I0930 08:36:22.102665 4760 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"e4d37a85a3e8435d08d571c19ddbf88256b001a75d30b5b8d74103f9380e4bae"} err="failed to get container status \"e4d37a85a3e8435d08d571c19ddbf88256b001a75d30b5b8d74103f9380e4bae\": rpc error: code = NotFound desc = could not find container \"e4d37a85a3e8435d08d571c19ddbf88256b001a75d30b5b8d74103f9380e4bae\": container with ID starting with e4d37a85a3e8435d08d571c19ddbf88256b001a75d30b5b8d74103f9380e4bae not found: ID does not exist" Sep 30 08:36:23 crc kubenswrapper[4760]: I0930 08:36:23.082472 4760 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="53856d8f-8edd-46e4-a0c0-c04bdfec3c40" path="/var/lib/kubelet/pods/53856d8f-8edd-46e4-a0c0-c04bdfec3c40/volumes" Sep 30 08:36:28 crc kubenswrapper[4760]: I0930 08:36:28.067247 4760 scope.go:117] "RemoveContainer" containerID="3b3ee338429c89d453b302bf27db16e924b60d460569221c449cab6e0c51d84a" Sep 30 08:36:28 crc kubenswrapper[4760]: E0930 08:36:28.068166 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f2lrk_openshift-machine-config-operator(7a9c8270-6964-4886-87d0-227b05b76da4)\"" pod="openshift-machine-config-operator/machine-config-daemon-f2lrk" podUID="7a9c8270-6964-4886-87d0-227b05b76da4" Sep 30 08:36:40 crc kubenswrapper[4760]: I0930 08:36:40.066839 4760 scope.go:117] "RemoveContainer" containerID="3b3ee338429c89d453b302bf27db16e924b60d460569221c449cab6e0c51d84a" Sep 30 08:36:40 crc kubenswrapper[4760]: E0930 08:36:40.067951 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f2lrk_openshift-machine-config-operator(7a9c8270-6964-4886-87d0-227b05b76da4)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-f2lrk" podUID="7a9c8270-6964-4886-87d0-227b05b76da4" Sep 30 08:36:55 crc kubenswrapper[4760]: I0930 08:36:55.078898 4760 scope.go:117] "RemoveContainer" containerID="3b3ee338429c89d453b302bf27db16e924b60d460569221c449cab6e0c51d84a" Sep 30 08:36:55 crc kubenswrapper[4760]: E0930 08:36:55.080006 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f2lrk_openshift-machine-config-operator(7a9c8270-6964-4886-87d0-227b05b76da4)\"" pod="openshift-machine-config-operator/machine-config-daemon-f2lrk" podUID="7a9c8270-6964-4886-87d0-227b05b76da4" Sep 30 08:37:08 crc kubenswrapper[4760]: I0930 08:37:08.066804 4760 scope.go:117] "RemoveContainer" containerID="3b3ee338429c89d453b302bf27db16e924b60d460569221c449cab6e0c51d84a" Sep 30 08:37:08 crc kubenswrapper[4760]: E0930 08:37:08.067574 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f2lrk_openshift-machine-config-operator(7a9c8270-6964-4886-87d0-227b05b76da4)\"" pod="openshift-machine-config-operator/machine-config-daemon-f2lrk" podUID="7a9c8270-6964-4886-87d0-227b05b76da4" Sep 30 08:37:21 crc kubenswrapper[4760]: I0930 08:37:21.067082 4760 scope.go:117] "RemoveContainer" containerID="3b3ee338429c89d453b302bf27db16e924b60d460569221c449cab6e0c51d84a" Sep 30 08:37:21 crc kubenswrapper[4760]: E0930 08:37:21.070494 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-f2lrk_openshift-machine-config-operator(7a9c8270-6964-4886-87d0-227b05b76da4)\"" pod="openshift-machine-config-operator/machine-config-daemon-f2lrk" podUID="7a9c8270-6964-4886-87d0-227b05b76da4" Sep 30 08:37:32 crc kubenswrapper[4760]: I0930 08:37:32.067283 4760 scope.go:117] "RemoveContainer" containerID="3b3ee338429c89d453b302bf27db16e924b60d460569221c449cab6e0c51d84a" Sep 30 08:37:32 crc kubenswrapper[4760]: E0930 08:37:32.068724 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f2lrk_openshift-machine-config-operator(7a9c8270-6964-4886-87d0-227b05b76da4)\"" pod="openshift-machine-config-operator/machine-config-daemon-f2lrk" podUID="7a9c8270-6964-4886-87d0-227b05b76da4" Sep 30 08:37:46 crc kubenswrapper[4760]: I0930 08:37:46.067850 4760 scope.go:117] "RemoveContainer" containerID="3b3ee338429c89d453b302bf27db16e924b60d460569221c449cab6e0c51d84a" Sep 30 08:37:46 crc kubenswrapper[4760]: E0930 08:37:46.068886 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f2lrk_openshift-machine-config-operator(7a9c8270-6964-4886-87d0-227b05b76da4)\"" pod="openshift-machine-config-operator/machine-config-daemon-f2lrk" podUID="7a9c8270-6964-4886-87d0-227b05b76da4" Sep 30 08:37:58 crc kubenswrapper[4760]: I0930 08:37:58.066478 4760 scope.go:117] "RemoveContainer" containerID="3b3ee338429c89d453b302bf27db16e924b60d460569221c449cab6e0c51d84a" Sep 30 08:37:58 crc kubenswrapper[4760]: E0930 08:37:58.067101 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-f2lrk_openshift-machine-config-operator(7a9c8270-6964-4886-87d0-227b05b76da4)\"" pod="openshift-machine-config-operator/machine-config-daemon-f2lrk" podUID="7a9c8270-6964-4886-87d0-227b05b76da4" Sep 30 08:38:12 crc kubenswrapper[4760]: I0930 08:38:12.066680 4760 scope.go:117] "RemoveContainer" containerID="3b3ee338429c89d453b302bf27db16e924b60d460569221c449cab6e0c51d84a" Sep 30 08:38:12 crc kubenswrapper[4760]: E0930 08:38:12.067741 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f2lrk_openshift-machine-config-operator(7a9c8270-6964-4886-87d0-227b05b76da4)\"" pod="openshift-machine-config-operator/machine-config-daemon-f2lrk" podUID="7a9c8270-6964-4886-87d0-227b05b76da4" Sep 30 08:38:12 crc kubenswrapper[4760]: E0930 08:38:12.987405 4760 upgradeaware.go:427] Error proxying data from client to backend: readfrom tcp 38.102.83.201:36284->38.102.83.201:37703: write tcp 38.102.83.201:36284->38.102.83.201:37703: write: broken pipe Sep 30 08:38:24 crc kubenswrapper[4760]: I0930 08:38:24.067135 4760 scope.go:117] "RemoveContainer" containerID="3b3ee338429c89d453b302bf27db16e924b60d460569221c449cab6e0c51d84a" Sep 30 08:38:24 crc kubenswrapper[4760]: I0930 08:38:24.452235 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-f2lrk" event={"ID":"7a9c8270-6964-4886-87d0-227b05b76da4","Type":"ContainerStarted","Data":"a39521fe028036755f0c31b8a95a27b7712f3d80af52024671ce6ebf04d62afa"} Sep 30 08:40:49 crc kubenswrapper[4760]: I0930 08:40:49.113607 4760 patch_prober.go:28] interesting pod/machine-config-daemon-f2lrk container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 
127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 08:40:49 crc kubenswrapper[4760]: I0930 08:40:49.114087 4760 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-f2lrk" podUID="7a9c8270-6964-4886-87d0-227b05b76da4" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 30 08:41:19 crc kubenswrapper[4760]: I0930 08:41:19.113071 4760 patch_prober.go:28] interesting pod/machine-config-daemon-f2lrk container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 08:41:19 crc kubenswrapper[4760]: I0930 08:41:19.113833 4760 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-f2lrk" podUID="7a9c8270-6964-4886-87d0-227b05b76da4" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 30 08:41:45 crc kubenswrapper[4760]: I0930 08:41:45.583471 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-w2vs4"] Sep 30 08:41:45 crc kubenswrapper[4760]: E0930 08:41:45.584660 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="53856d8f-8edd-46e4-a0c0-c04bdfec3c40" containerName="extract-content" Sep 30 08:41:45 crc kubenswrapper[4760]: I0930 08:41:45.584676 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="53856d8f-8edd-46e4-a0c0-c04bdfec3c40" containerName="extract-content" Sep 30 08:41:45 crc kubenswrapper[4760]: E0930 08:41:45.584712 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="53856d8f-8edd-46e4-a0c0-c04bdfec3c40" containerName="registry-server" Sep 30 08:41:45 crc 
kubenswrapper[4760]: I0930 08:41:45.584721 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="53856d8f-8edd-46e4-a0c0-c04bdfec3c40" containerName="registry-server" Sep 30 08:41:45 crc kubenswrapper[4760]: E0930 08:41:45.584731 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="53856d8f-8edd-46e4-a0c0-c04bdfec3c40" containerName="extract-utilities" Sep 30 08:41:45 crc kubenswrapper[4760]: I0930 08:41:45.584743 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="53856d8f-8edd-46e4-a0c0-c04bdfec3c40" containerName="extract-utilities" Sep 30 08:41:45 crc kubenswrapper[4760]: I0930 08:41:45.585009 4760 memory_manager.go:354] "RemoveStaleState removing state" podUID="53856d8f-8edd-46e4-a0c0-c04bdfec3c40" containerName="registry-server" Sep 30 08:41:45 crc kubenswrapper[4760]: I0930 08:41:45.586894 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-w2vs4" Sep 30 08:41:45 crc kubenswrapper[4760]: I0930 08:41:45.591808 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-w2vs4"] Sep 30 08:41:45 crc kubenswrapper[4760]: I0930 08:41:45.664885 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e71b2696-2321-4293-871c-3fa81d4a1094-utilities\") pod \"redhat-operators-w2vs4\" (UID: \"e71b2696-2321-4293-871c-3fa81d4a1094\") " pod="openshift-marketplace/redhat-operators-w2vs4" Sep 30 08:41:45 crc kubenswrapper[4760]: I0930 08:41:45.664926 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zg9sx\" (UniqueName: \"kubernetes.io/projected/e71b2696-2321-4293-871c-3fa81d4a1094-kube-api-access-zg9sx\") pod \"redhat-operators-w2vs4\" (UID: \"e71b2696-2321-4293-871c-3fa81d4a1094\") " pod="openshift-marketplace/redhat-operators-w2vs4" Sep 30 08:41:45 crc kubenswrapper[4760]: 
I0930 08:41:45.664990 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e71b2696-2321-4293-871c-3fa81d4a1094-catalog-content\") pod \"redhat-operators-w2vs4\" (UID: \"e71b2696-2321-4293-871c-3fa81d4a1094\") " pod="openshift-marketplace/redhat-operators-w2vs4" Sep 30 08:41:45 crc kubenswrapper[4760]: I0930 08:41:45.766744 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e71b2696-2321-4293-871c-3fa81d4a1094-catalog-content\") pod \"redhat-operators-w2vs4\" (UID: \"e71b2696-2321-4293-871c-3fa81d4a1094\") " pod="openshift-marketplace/redhat-operators-w2vs4" Sep 30 08:41:45 crc kubenswrapper[4760]: I0930 08:41:45.766946 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e71b2696-2321-4293-871c-3fa81d4a1094-utilities\") pod \"redhat-operators-w2vs4\" (UID: \"e71b2696-2321-4293-871c-3fa81d4a1094\") " pod="openshift-marketplace/redhat-operators-w2vs4" Sep 30 08:41:45 crc kubenswrapper[4760]: I0930 08:41:45.766982 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zg9sx\" (UniqueName: \"kubernetes.io/projected/e71b2696-2321-4293-871c-3fa81d4a1094-kube-api-access-zg9sx\") pod \"redhat-operators-w2vs4\" (UID: \"e71b2696-2321-4293-871c-3fa81d4a1094\") " pod="openshift-marketplace/redhat-operators-w2vs4" Sep 30 08:41:45 crc kubenswrapper[4760]: I0930 08:41:45.767309 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e71b2696-2321-4293-871c-3fa81d4a1094-catalog-content\") pod \"redhat-operators-w2vs4\" (UID: \"e71b2696-2321-4293-871c-3fa81d4a1094\") " pod="openshift-marketplace/redhat-operators-w2vs4" Sep 30 08:41:45 crc kubenswrapper[4760]: I0930 08:41:45.767398 4760 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e71b2696-2321-4293-871c-3fa81d4a1094-utilities\") pod \"redhat-operators-w2vs4\" (UID: \"e71b2696-2321-4293-871c-3fa81d4a1094\") " pod="openshift-marketplace/redhat-operators-w2vs4" Sep 30 08:41:45 crc kubenswrapper[4760]: I0930 08:41:45.801710 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zg9sx\" (UniqueName: \"kubernetes.io/projected/e71b2696-2321-4293-871c-3fa81d4a1094-kube-api-access-zg9sx\") pod \"redhat-operators-w2vs4\" (UID: \"e71b2696-2321-4293-871c-3fa81d4a1094\") " pod="openshift-marketplace/redhat-operators-w2vs4" Sep 30 08:41:45 crc kubenswrapper[4760]: I0930 08:41:45.956404 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-w2vs4" Sep 30 08:41:46 crc kubenswrapper[4760]: I0930 08:41:46.482656 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-w2vs4"] Sep 30 08:41:46 crc kubenswrapper[4760]: I0930 08:41:46.672805 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-w2vs4" event={"ID":"e71b2696-2321-4293-871c-3fa81d4a1094","Type":"ContainerStarted","Data":"ba3affcb615eeaff4fc70c483d51f2372600577e866a405fe0b8853e2c8bb2d4"} Sep 30 08:41:46 crc kubenswrapper[4760]: I0930 08:41:46.672844 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-w2vs4" event={"ID":"e71b2696-2321-4293-871c-3fa81d4a1094","Type":"ContainerStarted","Data":"5836d75e9d5dc03077c0822c1917ca9bc75fe90b1ae5e42e56207e5fcda72100"} Sep 30 08:41:47 crc kubenswrapper[4760]: I0930 08:41:47.683341 4760 generic.go:334] "Generic (PLEG): container finished" podID="e71b2696-2321-4293-871c-3fa81d4a1094" containerID="ba3affcb615eeaff4fc70c483d51f2372600577e866a405fe0b8853e2c8bb2d4" exitCode=0 Sep 30 08:41:47 crc kubenswrapper[4760]: 
I0930 08:41:47.683423 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-w2vs4" event={"ID":"e71b2696-2321-4293-871c-3fa81d4a1094","Type":"ContainerDied","Data":"ba3affcb615eeaff4fc70c483d51f2372600577e866a405fe0b8853e2c8bb2d4"} Sep 30 08:41:47 crc kubenswrapper[4760]: I0930 08:41:47.685566 4760 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Sep 30 08:41:49 crc kubenswrapper[4760]: I0930 08:41:49.112639 4760 patch_prober.go:28] interesting pod/machine-config-daemon-f2lrk container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 08:41:49 crc kubenswrapper[4760]: I0930 08:41:49.113041 4760 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-f2lrk" podUID="7a9c8270-6964-4886-87d0-227b05b76da4" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 30 08:41:49 crc kubenswrapper[4760]: I0930 08:41:49.113095 4760 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-f2lrk" Sep 30 08:41:49 crc kubenswrapper[4760]: I0930 08:41:49.114141 4760 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"a39521fe028036755f0c31b8a95a27b7712f3d80af52024671ce6ebf04d62afa"} pod="openshift-machine-config-operator/machine-config-daemon-f2lrk" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Sep 30 08:41:49 crc kubenswrapper[4760]: I0930 08:41:49.114245 4760 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-machine-config-operator/machine-config-daemon-f2lrk" podUID="7a9c8270-6964-4886-87d0-227b05b76da4" containerName="machine-config-daemon" containerID="cri-o://a39521fe028036755f0c31b8a95a27b7712f3d80af52024671ce6ebf04d62afa" gracePeriod=600 Sep 30 08:41:49 crc kubenswrapper[4760]: I0930 08:41:49.710411 4760 generic.go:334] "Generic (PLEG): container finished" podID="7a9c8270-6964-4886-87d0-227b05b76da4" containerID="a39521fe028036755f0c31b8a95a27b7712f3d80af52024671ce6ebf04d62afa" exitCode=0 Sep 30 08:41:49 crc kubenswrapper[4760]: I0930 08:41:49.710511 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-f2lrk" event={"ID":"7a9c8270-6964-4886-87d0-227b05b76da4","Type":"ContainerDied","Data":"a39521fe028036755f0c31b8a95a27b7712f3d80af52024671ce6ebf04d62afa"} Sep 30 08:41:49 crc kubenswrapper[4760]: I0930 08:41:49.710884 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-f2lrk" event={"ID":"7a9c8270-6964-4886-87d0-227b05b76da4","Type":"ContainerStarted","Data":"131b74da1ab5f542facd8050213c6315cd36b2dcf16917068bb69bf076b6760b"} Sep 30 08:41:49 crc kubenswrapper[4760]: I0930 08:41:49.710902 4760 scope.go:117] "RemoveContainer" containerID="3b3ee338429c89d453b302bf27db16e924b60d460569221c449cab6e0c51d84a" Sep 30 08:41:49 crc kubenswrapper[4760]: I0930 08:41:49.716076 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-w2vs4" event={"ID":"e71b2696-2321-4293-871c-3fa81d4a1094","Type":"ContainerStarted","Data":"c6341a3689a0f45060680af6859fbfa301853a3f3167563914349f74cefd1b2f"} Sep 30 08:41:50 crc kubenswrapper[4760]: I0930 08:41:50.731958 4760 generic.go:334] "Generic (PLEG): container finished" podID="e71b2696-2321-4293-871c-3fa81d4a1094" containerID="c6341a3689a0f45060680af6859fbfa301853a3f3167563914349f74cefd1b2f" exitCode=0 Sep 30 08:41:50 crc kubenswrapper[4760]: I0930 
08:41:50.732757 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-w2vs4" event={"ID":"e71b2696-2321-4293-871c-3fa81d4a1094","Type":"ContainerDied","Data":"c6341a3689a0f45060680af6859fbfa301853a3f3167563914349f74cefd1b2f"} Sep 30 08:41:51 crc kubenswrapper[4760]: I0930 08:41:51.752756 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-w2vs4" event={"ID":"e71b2696-2321-4293-871c-3fa81d4a1094","Type":"ContainerStarted","Data":"0cfc37d628990a575f4b7a1e65b7964599732c89f3a1e3191b360504d22ff475"} Sep 30 08:41:51 crc kubenswrapper[4760]: I0930 08:41:51.805192 4760 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-w2vs4" podStartSLOduration=3.281469169 podStartE2EDuration="6.805172775s" podCreationTimestamp="2025-09-30 08:41:45 +0000 UTC" firstStartedPulling="2025-09-30 08:41:47.685271892 +0000 UTC m=+4093.328178304" lastFinishedPulling="2025-09-30 08:41:51.208975488 +0000 UTC m=+4096.851881910" observedRunningTime="2025-09-30 08:41:51.79243942 +0000 UTC m=+4097.435345842" watchObservedRunningTime="2025-09-30 08:41:51.805172775 +0000 UTC m=+4097.448079207" Sep 30 08:41:55 crc kubenswrapper[4760]: I0930 08:41:55.957340 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-w2vs4" Sep 30 08:41:55 crc kubenswrapper[4760]: I0930 08:41:55.957965 4760 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-w2vs4" Sep 30 08:41:57 crc kubenswrapper[4760]: I0930 08:41:57.016138 4760 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-w2vs4" podUID="e71b2696-2321-4293-871c-3fa81d4a1094" containerName="registry-server" probeResult="failure" output=< Sep 30 08:41:57 crc kubenswrapper[4760]: timeout: failed to connect service ":50051" within 1s Sep 30 08:41:57 crc 
kubenswrapper[4760]: > Sep 30 08:42:06 crc kubenswrapper[4760]: I0930 08:42:06.008920 4760 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-w2vs4" Sep 30 08:42:06 crc kubenswrapper[4760]: I0930 08:42:06.067237 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-w2vs4" Sep 30 08:42:06 crc kubenswrapper[4760]: I0930 08:42:06.252735 4760 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-w2vs4"] Sep 30 08:42:07 crc kubenswrapper[4760]: I0930 08:42:07.940596 4760 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-w2vs4" podUID="e71b2696-2321-4293-871c-3fa81d4a1094" containerName="registry-server" containerID="cri-o://0cfc37d628990a575f4b7a1e65b7964599732c89f3a1e3191b360504d22ff475" gracePeriod=2 Sep 30 08:42:08 crc kubenswrapper[4760]: I0930 08:42:08.484674 4760 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-w2vs4" Sep 30 08:42:08 crc kubenswrapper[4760]: I0930 08:42:08.548892 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e71b2696-2321-4293-871c-3fa81d4a1094-utilities\") pod \"e71b2696-2321-4293-871c-3fa81d4a1094\" (UID: \"e71b2696-2321-4293-871c-3fa81d4a1094\") " Sep 30 08:42:08 crc kubenswrapper[4760]: I0930 08:42:08.548994 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e71b2696-2321-4293-871c-3fa81d4a1094-catalog-content\") pod \"e71b2696-2321-4293-871c-3fa81d4a1094\" (UID: \"e71b2696-2321-4293-871c-3fa81d4a1094\") " Sep 30 08:42:08 crc kubenswrapper[4760]: I0930 08:42:08.549026 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zg9sx\" (UniqueName: \"kubernetes.io/projected/e71b2696-2321-4293-871c-3fa81d4a1094-kube-api-access-zg9sx\") pod \"e71b2696-2321-4293-871c-3fa81d4a1094\" (UID: \"e71b2696-2321-4293-871c-3fa81d4a1094\") " Sep 30 08:42:08 crc kubenswrapper[4760]: I0930 08:42:08.549699 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e71b2696-2321-4293-871c-3fa81d4a1094-utilities" (OuterVolumeSpecName: "utilities") pod "e71b2696-2321-4293-871c-3fa81d4a1094" (UID: "e71b2696-2321-4293-871c-3fa81d4a1094"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 08:42:08 crc kubenswrapper[4760]: I0930 08:42:08.561565 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e71b2696-2321-4293-871c-3fa81d4a1094-kube-api-access-zg9sx" (OuterVolumeSpecName: "kube-api-access-zg9sx") pod "e71b2696-2321-4293-871c-3fa81d4a1094" (UID: "e71b2696-2321-4293-871c-3fa81d4a1094"). InnerVolumeSpecName "kube-api-access-zg9sx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 08:42:08 crc kubenswrapper[4760]: I0930 08:42:08.651582 4760 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e71b2696-2321-4293-871c-3fa81d4a1094-utilities\") on node \"crc\" DevicePath \"\"" Sep 30 08:42:08 crc kubenswrapper[4760]: I0930 08:42:08.651815 4760 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zg9sx\" (UniqueName: \"kubernetes.io/projected/e71b2696-2321-4293-871c-3fa81d4a1094-kube-api-access-zg9sx\") on node \"crc\" DevicePath \"\"" Sep 30 08:42:08 crc kubenswrapper[4760]: I0930 08:42:08.660270 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e71b2696-2321-4293-871c-3fa81d4a1094-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "e71b2696-2321-4293-871c-3fa81d4a1094" (UID: "e71b2696-2321-4293-871c-3fa81d4a1094"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 08:42:08 crc kubenswrapper[4760]: I0930 08:42:08.753980 4760 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e71b2696-2321-4293-871c-3fa81d4a1094-catalog-content\") on node \"crc\" DevicePath \"\"" Sep 30 08:42:08 crc kubenswrapper[4760]: I0930 08:42:08.958011 4760 generic.go:334] "Generic (PLEG): container finished" podID="e71b2696-2321-4293-871c-3fa81d4a1094" containerID="0cfc37d628990a575f4b7a1e65b7964599732c89f3a1e3191b360504d22ff475" exitCode=0 Sep 30 08:42:08 crc kubenswrapper[4760]: I0930 08:42:08.958085 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-w2vs4" event={"ID":"e71b2696-2321-4293-871c-3fa81d4a1094","Type":"ContainerDied","Data":"0cfc37d628990a575f4b7a1e65b7964599732c89f3a1e3191b360504d22ff475"} Sep 30 08:42:08 crc kubenswrapper[4760]: I0930 08:42:08.958115 4760 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openshift-marketplace/redhat-operators-w2vs4" Sep 30 08:42:08 crc kubenswrapper[4760]: I0930 08:42:08.958160 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-w2vs4" event={"ID":"e71b2696-2321-4293-871c-3fa81d4a1094","Type":"ContainerDied","Data":"5836d75e9d5dc03077c0822c1917ca9bc75fe90b1ae5e42e56207e5fcda72100"} Sep 30 08:42:08 crc kubenswrapper[4760]: I0930 08:42:08.958197 4760 scope.go:117] "RemoveContainer" containerID="0cfc37d628990a575f4b7a1e65b7964599732c89f3a1e3191b360504d22ff475" Sep 30 08:42:09 crc kubenswrapper[4760]: I0930 08:42:09.001837 4760 scope.go:117] "RemoveContainer" containerID="c6341a3689a0f45060680af6859fbfa301853a3f3167563914349f74cefd1b2f" Sep 30 08:42:09 crc kubenswrapper[4760]: I0930 08:42:09.004147 4760 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-w2vs4"] Sep 30 08:42:09 crc kubenswrapper[4760]: I0930 08:42:09.012428 4760 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-w2vs4"] Sep 30 08:42:09 crc kubenswrapper[4760]: I0930 08:42:09.025288 4760 scope.go:117] "RemoveContainer" containerID="ba3affcb615eeaff4fc70c483d51f2372600577e866a405fe0b8853e2c8bb2d4" Sep 30 08:42:09 crc kubenswrapper[4760]: I0930 08:42:09.078921 4760 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e71b2696-2321-4293-871c-3fa81d4a1094" path="/var/lib/kubelet/pods/e71b2696-2321-4293-871c-3fa81d4a1094/volumes" Sep 30 08:42:09 crc kubenswrapper[4760]: I0930 08:42:09.111156 4760 scope.go:117] "RemoveContainer" containerID="0cfc37d628990a575f4b7a1e65b7964599732c89f3a1e3191b360504d22ff475" Sep 30 08:42:09 crc kubenswrapper[4760]: E0930 08:42:09.111734 4760 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0cfc37d628990a575f4b7a1e65b7964599732c89f3a1e3191b360504d22ff475\": container with ID starting with 
0cfc37d628990a575f4b7a1e65b7964599732c89f3a1e3191b360504d22ff475 not found: ID does not exist" containerID="0cfc37d628990a575f4b7a1e65b7964599732c89f3a1e3191b360504d22ff475" Sep 30 08:42:09 crc kubenswrapper[4760]: I0930 08:42:09.111769 4760 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0cfc37d628990a575f4b7a1e65b7964599732c89f3a1e3191b360504d22ff475"} err="failed to get container status \"0cfc37d628990a575f4b7a1e65b7964599732c89f3a1e3191b360504d22ff475\": rpc error: code = NotFound desc = could not find container \"0cfc37d628990a575f4b7a1e65b7964599732c89f3a1e3191b360504d22ff475\": container with ID starting with 0cfc37d628990a575f4b7a1e65b7964599732c89f3a1e3191b360504d22ff475 not found: ID does not exist" Sep 30 08:42:09 crc kubenswrapper[4760]: I0930 08:42:09.111794 4760 scope.go:117] "RemoveContainer" containerID="c6341a3689a0f45060680af6859fbfa301853a3f3167563914349f74cefd1b2f" Sep 30 08:42:09 crc kubenswrapper[4760]: E0930 08:42:09.112116 4760 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c6341a3689a0f45060680af6859fbfa301853a3f3167563914349f74cefd1b2f\": container with ID starting with c6341a3689a0f45060680af6859fbfa301853a3f3167563914349f74cefd1b2f not found: ID does not exist" containerID="c6341a3689a0f45060680af6859fbfa301853a3f3167563914349f74cefd1b2f" Sep 30 08:42:09 crc kubenswrapper[4760]: I0930 08:42:09.112136 4760 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c6341a3689a0f45060680af6859fbfa301853a3f3167563914349f74cefd1b2f"} err="failed to get container status \"c6341a3689a0f45060680af6859fbfa301853a3f3167563914349f74cefd1b2f\": rpc error: code = NotFound desc = could not find container \"c6341a3689a0f45060680af6859fbfa301853a3f3167563914349f74cefd1b2f\": container with ID starting with c6341a3689a0f45060680af6859fbfa301853a3f3167563914349f74cefd1b2f not found: ID does not 
exist" Sep 30 08:42:09 crc kubenswrapper[4760]: I0930 08:42:09.112148 4760 scope.go:117] "RemoveContainer" containerID="ba3affcb615eeaff4fc70c483d51f2372600577e866a405fe0b8853e2c8bb2d4" Sep 30 08:42:09 crc kubenswrapper[4760]: E0930 08:42:09.112548 4760 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ba3affcb615eeaff4fc70c483d51f2372600577e866a405fe0b8853e2c8bb2d4\": container with ID starting with ba3affcb615eeaff4fc70c483d51f2372600577e866a405fe0b8853e2c8bb2d4 not found: ID does not exist" containerID="ba3affcb615eeaff4fc70c483d51f2372600577e866a405fe0b8853e2c8bb2d4" Sep 30 08:42:09 crc kubenswrapper[4760]: I0930 08:42:09.112615 4760 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ba3affcb615eeaff4fc70c483d51f2372600577e866a405fe0b8853e2c8bb2d4"} err="failed to get container status \"ba3affcb615eeaff4fc70c483d51f2372600577e866a405fe0b8853e2c8bb2d4\": rpc error: code = NotFound desc = could not find container \"ba3affcb615eeaff4fc70c483d51f2372600577e866a405fe0b8853e2c8bb2d4\": container with ID starting with ba3affcb615eeaff4fc70c483d51f2372600577e866a405fe0b8853e2c8bb2d4 not found: ID does not exist" Sep 30 08:43:49 crc kubenswrapper[4760]: I0930 08:43:49.113089 4760 patch_prober.go:28] interesting pod/machine-config-daemon-f2lrk container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 08:43:49 crc kubenswrapper[4760]: I0930 08:43:49.113768 4760 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-f2lrk" podUID="7a9c8270-6964-4886-87d0-227b05b76da4" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 30 
08:44:01 crc kubenswrapper[4760]: E0930 08:44:01.179641 4760 upgradeaware.go:427] Error proxying data from client to backend: readfrom tcp 38.102.83.201:35940->38.102.83.201:37703: write tcp 38.102.83.201:35940->38.102.83.201:37703: write: broken pipe Sep 30 08:44:12 crc kubenswrapper[4760]: E0930 08:44:12.892595 4760 upgradeaware.go:427] Error proxying data from client to backend: readfrom tcp 38.102.83.201:55170->38.102.83.201:37703: write tcp 38.102.83.201:55170->38.102.83.201:37703: write: connection reset by peer Sep 30 08:44:19 crc kubenswrapper[4760]: I0930 08:44:19.112893 4760 patch_prober.go:28] interesting pod/machine-config-daemon-f2lrk container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 08:44:19 crc kubenswrapper[4760]: I0930 08:44:19.113683 4760 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-f2lrk" podUID="7a9c8270-6964-4886-87d0-227b05b76da4" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 30 08:44:49 crc kubenswrapper[4760]: I0930 08:44:49.113501 4760 patch_prober.go:28] interesting pod/machine-config-daemon-f2lrk container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 08:44:49 crc kubenswrapper[4760]: I0930 08:44:49.114200 4760 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-f2lrk" podUID="7a9c8270-6964-4886-87d0-227b05b76da4" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: 
connection refused" Sep 30 08:44:49 crc kubenswrapper[4760]: I0930 08:44:49.114266 4760 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-f2lrk" Sep 30 08:44:49 crc kubenswrapper[4760]: I0930 08:44:49.115339 4760 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"131b74da1ab5f542facd8050213c6315cd36b2dcf16917068bb69bf076b6760b"} pod="openshift-machine-config-operator/machine-config-daemon-f2lrk" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Sep 30 08:44:49 crc kubenswrapper[4760]: I0930 08:44:49.115438 4760 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-f2lrk" podUID="7a9c8270-6964-4886-87d0-227b05b76da4" containerName="machine-config-daemon" containerID="cri-o://131b74da1ab5f542facd8050213c6315cd36b2dcf16917068bb69bf076b6760b" gracePeriod=600 Sep 30 08:44:49 crc kubenswrapper[4760]: E0930 08:44:49.259343 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f2lrk_openshift-machine-config-operator(7a9c8270-6964-4886-87d0-227b05b76da4)\"" pod="openshift-machine-config-operator/machine-config-daemon-f2lrk" podUID="7a9c8270-6964-4886-87d0-227b05b76da4" Sep 30 08:44:49 crc kubenswrapper[4760]: I0930 08:44:49.722543 4760 generic.go:334] "Generic (PLEG): container finished" podID="7a9c8270-6964-4886-87d0-227b05b76da4" containerID="131b74da1ab5f542facd8050213c6315cd36b2dcf16917068bb69bf076b6760b" exitCode=0 Sep 30 08:44:49 crc kubenswrapper[4760]: I0930 08:44:49.722600 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-f2lrk" 
event={"ID":"7a9c8270-6964-4886-87d0-227b05b76da4","Type":"ContainerDied","Data":"131b74da1ab5f542facd8050213c6315cd36b2dcf16917068bb69bf076b6760b"} Sep 30 08:44:49 crc kubenswrapper[4760]: I0930 08:44:49.722759 4760 scope.go:117] "RemoveContainer" containerID="a39521fe028036755f0c31b8a95a27b7712f3d80af52024671ce6ebf04d62afa" Sep 30 08:44:49 crc kubenswrapper[4760]: I0930 08:44:49.723750 4760 scope.go:117] "RemoveContainer" containerID="131b74da1ab5f542facd8050213c6315cd36b2dcf16917068bb69bf076b6760b" Sep 30 08:44:49 crc kubenswrapper[4760]: E0930 08:44:49.724242 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f2lrk_openshift-machine-config-operator(7a9c8270-6964-4886-87d0-227b05b76da4)\"" pod="openshift-machine-config-operator/machine-config-daemon-f2lrk" podUID="7a9c8270-6964-4886-87d0-227b05b76da4" Sep 30 08:45:00 crc kubenswrapper[4760]: I0930 08:45:00.067851 4760 scope.go:117] "RemoveContainer" containerID="131b74da1ab5f542facd8050213c6315cd36b2dcf16917068bb69bf076b6760b" Sep 30 08:45:00 crc kubenswrapper[4760]: E0930 08:45:00.068957 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f2lrk_openshift-machine-config-operator(7a9c8270-6964-4886-87d0-227b05b76da4)\"" pod="openshift-machine-config-operator/machine-config-daemon-f2lrk" podUID="7a9c8270-6964-4886-87d0-227b05b76da4" Sep 30 08:45:00 crc kubenswrapper[4760]: I0930 08:45:00.171423 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29320365-54g72"] Sep 30 08:45:00 crc kubenswrapper[4760]: E0930 08:45:00.172068 4760 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="e71b2696-2321-4293-871c-3fa81d4a1094" containerName="extract-content" Sep 30 08:45:00 crc kubenswrapper[4760]: I0930 08:45:00.172089 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="e71b2696-2321-4293-871c-3fa81d4a1094" containerName="extract-content" Sep 30 08:45:00 crc kubenswrapper[4760]: E0930 08:45:00.172102 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e71b2696-2321-4293-871c-3fa81d4a1094" containerName="registry-server" Sep 30 08:45:00 crc kubenswrapper[4760]: I0930 08:45:00.172110 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="e71b2696-2321-4293-871c-3fa81d4a1094" containerName="registry-server" Sep 30 08:45:00 crc kubenswrapper[4760]: E0930 08:45:00.172125 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e71b2696-2321-4293-871c-3fa81d4a1094" containerName="extract-utilities" Sep 30 08:45:00 crc kubenswrapper[4760]: I0930 08:45:00.172133 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="e71b2696-2321-4293-871c-3fa81d4a1094" containerName="extract-utilities" Sep 30 08:45:00 crc kubenswrapper[4760]: I0930 08:45:00.172383 4760 memory_manager.go:354] "RemoveStaleState removing state" podUID="e71b2696-2321-4293-871c-3fa81d4a1094" containerName="registry-server" Sep 30 08:45:00 crc kubenswrapper[4760]: I0930 08:45:00.173339 4760 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29320365-54g72" Sep 30 08:45:00 crc kubenswrapper[4760]: I0930 08:45:00.176803 4760 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Sep 30 08:45:00 crc kubenswrapper[4760]: I0930 08:45:00.176877 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Sep 30 08:45:00 crc kubenswrapper[4760]: I0930 08:45:00.182445 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29320365-54g72"] Sep 30 08:45:00 crc kubenswrapper[4760]: I0930 08:45:00.314016 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d10c4016-c369-4f44-9dc4-070b8b6b6b28-config-volume\") pod \"collect-profiles-29320365-54g72\" (UID: \"d10c4016-c369-4f44-9dc4-070b8b6b6b28\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320365-54g72" Sep 30 08:45:00 crc kubenswrapper[4760]: I0930 08:45:00.314222 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zx94l\" (UniqueName: \"kubernetes.io/projected/d10c4016-c369-4f44-9dc4-070b8b6b6b28-kube-api-access-zx94l\") pod \"collect-profiles-29320365-54g72\" (UID: \"d10c4016-c369-4f44-9dc4-070b8b6b6b28\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320365-54g72" Sep 30 08:45:00 crc kubenswrapper[4760]: I0930 08:45:00.314486 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/d10c4016-c369-4f44-9dc4-070b8b6b6b28-secret-volume\") pod \"collect-profiles-29320365-54g72\" (UID: \"d10c4016-c369-4f44-9dc4-070b8b6b6b28\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29320365-54g72" Sep 30 08:45:00 crc kubenswrapper[4760]: I0930 08:45:00.415880 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d10c4016-c369-4f44-9dc4-070b8b6b6b28-config-volume\") pod \"collect-profiles-29320365-54g72\" (UID: \"d10c4016-c369-4f44-9dc4-070b8b6b6b28\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320365-54g72" Sep 30 08:45:00 crc kubenswrapper[4760]: I0930 08:45:00.416002 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zx94l\" (UniqueName: \"kubernetes.io/projected/d10c4016-c369-4f44-9dc4-070b8b6b6b28-kube-api-access-zx94l\") pod \"collect-profiles-29320365-54g72\" (UID: \"d10c4016-c369-4f44-9dc4-070b8b6b6b28\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320365-54g72" Sep 30 08:45:00 crc kubenswrapper[4760]: I0930 08:45:00.416046 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/d10c4016-c369-4f44-9dc4-070b8b6b6b28-secret-volume\") pod \"collect-profiles-29320365-54g72\" (UID: \"d10c4016-c369-4f44-9dc4-070b8b6b6b28\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320365-54g72" Sep 30 08:45:00 crc kubenswrapper[4760]: I0930 08:45:00.417237 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d10c4016-c369-4f44-9dc4-070b8b6b6b28-config-volume\") pod \"collect-profiles-29320365-54g72\" (UID: \"d10c4016-c369-4f44-9dc4-070b8b6b6b28\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320365-54g72" Sep 30 08:45:00 crc kubenswrapper[4760]: I0930 08:45:00.424157 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/d10c4016-c369-4f44-9dc4-070b8b6b6b28-secret-volume\") pod \"collect-profiles-29320365-54g72\" (UID: \"d10c4016-c369-4f44-9dc4-070b8b6b6b28\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320365-54g72" Sep 30 08:45:00 crc kubenswrapper[4760]: I0930 08:45:00.433953 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zx94l\" (UniqueName: \"kubernetes.io/projected/d10c4016-c369-4f44-9dc4-070b8b6b6b28-kube-api-access-zx94l\") pod \"collect-profiles-29320365-54g72\" (UID: \"d10c4016-c369-4f44-9dc4-070b8b6b6b28\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320365-54g72" Sep 30 08:45:00 crc kubenswrapper[4760]: I0930 08:45:00.505063 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29320365-54g72" Sep 30 08:45:00 crc kubenswrapper[4760]: I0930 08:45:00.992704 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29320365-54g72"] Sep 30 08:45:01 crc kubenswrapper[4760]: I0930 08:45:01.874548 4760 generic.go:334] "Generic (PLEG): container finished" podID="d10c4016-c369-4f44-9dc4-070b8b6b6b28" containerID="3b4670d4e53fad790aa1b656789e78af12cf871b42f59f2b2dd8bf7a9304ae24" exitCode=0 Sep 30 08:45:01 crc kubenswrapper[4760]: I0930 08:45:01.874774 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29320365-54g72" event={"ID":"d10c4016-c369-4f44-9dc4-070b8b6b6b28","Type":"ContainerDied","Data":"3b4670d4e53fad790aa1b656789e78af12cf871b42f59f2b2dd8bf7a9304ae24"} Sep 30 08:45:01 crc kubenswrapper[4760]: I0930 08:45:01.874990 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29320365-54g72" 
event={"ID":"d10c4016-c369-4f44-9dc4-070b8b6b6b28","Type":"ContainerStarted","Data":"aa82e1930b94e5ff6bd797cf19cd9f0b41be83fb7aa06f4029b4bf6b181c5769"} Sep 30 08:45:03 crc kubenswrapper[4760]: I0930 08:45:03.849671 4760 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29320365-54g72" Sep 30 08:45:03 crc kubenswrapper[4760]: I0930 08:45:03.899746 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29320365-54g72" event={"ID":"d10c4016-c369-4f44-9dc4-070b8b6b6b28","Type":"ContainerDied","Data":"aa82e1930b94e5ff6bd797cf19cd9f0b41be83fb7aa06f4029b4bf6b181c5769"} Sep 30 08:45:03 crc kubenswrapper[4760]: I0930 08:45:03.899788 4760 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="aa82e1930b94e5ff6bd797cf19cd9f0b41be83fb7aa06f4029b4bf6b181c5769" Sep 30 08:45:03 crc kubenswrapper[4760]: I0930 08:45:03.899846 4760 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29320365-54g72" Sep 30 08:45:03 crc kubenswrapper[4760]: I0930 08:45:03.998096 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d10c4016-c369-4f44-9dc4-070b8b6b6b28-config-volume\") pod \"d10c4016-c369-4f44-9dc4-070b8b6b6b28\" (UID: \"d10c4016-c369-4f44-9dc4-070b8b6b6b28\") " Sep 30 08:45:03 crc kubenswrapper[4760]: I0930 08:45:03.998577 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zx94l\" (UniqueName: \"kubernetes.io/projected/d10c4016-c369-4f44-9dc4-070b8b6b6b28-kube-api-access-zx94l\") pod \"d10c4016-c369-4f44-9dc4-070b8b6b6b28\" (UID: \"d10c4016-c369-4f44-9dc4-070b8b6b6b28\") " Sep 30 08:45:03 crc kubenswrapper[4760]: I0930 08:45:03.998703 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/d10c4016-c369-4f44-9dc4-070b8b6b6b28-secret-volume\") pod \"d10c4016-c369-4f44-9dc4-070b8b6b6b28\" (UID: \"d10c4016-c369-4f44-9dc4-070b8b6b6b28\") " Sep 30 08:45:03 crc kubenswrapper[4760]: I0930 08:45:03.999351 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d10c4016-c369-4f44-9dc4-070b8b6b6b28-config-volume" (OuterVolumeSpecName: "config-volume") pod "d10c4016-c369-4f44-9dc4-070b8b6b6b28" (UID: "d10c4016-c369-4f44-9dc4-070b8b6b6b28"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 08:45:04 crc kubenswrapper[4760]: I0930 08:45:04.000417 4760 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d10c4016-c369-4f44-9dc4-070b8b6b6b28-config-volume\") on node \"crc\" DevicePath \"\"" Sep 30 08:45:04 crc kubenswrapper[4760]: I0930 08:45:04.011617 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d10c4016-c369-4f44-9dc4-070b8b6b6b28-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "d10c4016-c369-4f44-9dc4-070b8b6b6b28" (UID: "d10c4016-c369-4f44-9dc4-070b8b6b6b28"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 08:45:04 crc kubenswrapper[4760]: I0930 08:45:04.012728 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d10c4016-c369-4f44-9dc4-070b8b6b6b28-kube-api-access-zx94l" (OuterVolumeSpecName: "kube-api-access-zx94l") pod "d10c4016-c369-4f44-9dc4-070b8b6b6b28" (UID: "d10c4016-c369-4f44-9dc4-070b8b6b6b28"). InnerVolumeSpecName "kube-api-access-zx94l". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 08:45:04 crc kubenswrapper[4760]: I0930 08:45:04.102827 4760 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zx94l\" (UniqueName: \"kubernetes.io/projected/d10c4016-c369-4f44-9dc4-070b8b6b6b28-kube-api-access-zx94l\") on node \"crc\" DevicePath \"\"" Sep 30 08:45:04 crc kubenswrapper[4760]: I0930 08:45:04.102892 4760 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/d10c4016-c369-4f44-9dc4-070b8b6b6b28-secret-volume\") on node \"crc\" DevicePath \"\"" Sep 30 08:45:04 crc kubenswrapper[4760]: I0930 08:45:04.971929 4760 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29320320-z9zxw"] Sep 30 08:45:04 crc kubenswrapper[4760]: I0930 08:45:04.979528 4760 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29320320-z9zxw"] Sep 30 08:45:05 crc kubenswrapper[4760]: I0930 08:45:05.094108 4760 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2516ffe5-a86b-49ad-bb40-2481182ccdef" path="/var/lib/kubelet/pods/2516ffe5-a86b-49ad-bb40-2481182ccdef/volumes" Sep 30 08:45:14 crc kubenswrapper[4760]: I0930 08:45:14.067663 4760 scope.go:117] "RemoveContainer" containerID="131b74da1ab5f542facd8050213c6315cd36b2dcf16917068bb69bf076b6760b" Sep 30 08:45:14 crc kubenswrapper[4760]: E0930 08:45:14.068714 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f2lrk_openshift-machine-config-operator(7a9c8270-6964-4886-87d0-227b05b76da4)\"" pod="openshift-machine-config-operator/machine-config-daemon-f2lrk" podUID="7a9c8270-6964-4886-87d0-227b05b76da4" Sep 30 08:45:19 crc kubenswrapper[4760]: I0930 08:45:19.924874 4760 kubelet.go:2421] 
"SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-zmsbv"] Sep 30 08:45:19 crc kubenswrapper[4760]: E0930 08:45:19.925872 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d10c4016-c369-4f44-9dc4-070b8b6b6b28" containerName="collect-profiles" Sep 30 08:45:19 crc kubenswrapper[4760]: I0930 08:45:19.925888 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="d10c4016-c369-4f44-9dc4-070b8b6b6b28" containerName="collect-profiles" Sep 30 08:45:19 crc kubenswrapper[4760]: I0930 08:45:19.926096 4760 memory_manager.go:354] "RemoveStaleState removing state" podUID="d10c4016-c369-4f44-9dc4-070b8b6b6b28" containerName="collect-profiles" Sep 30 08:45:19 crc kubenswrapper[4760]: I0930 08:45:19.927559 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-zmsbv" Sep 30 08:45:19 crc kubenswrapper[4760]: I0930 08:45:19.947179 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-zmsbv"] Sep 30 08:45:20 crc kubenswrapper[4760]: I0930 08:45:20.090179 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8065abae-3351-4daf-9aff-8bf97affce6a-catalog-content\") pod \"community-operators-zmsbv\" (UID: \"8065abae-3351-4daf-9aff-8bf97affce6a\") " pod="openshift-marketplace/community-operators-zmsbv" Sep 30 08:45:20 crc kubenswrapper[4760]: I0930 08:45:20.090422 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dvtrf\" (UniqueName: \"kubernetes.io/projected/8065abae-3351-4daf-9aff-8bf97affce6a-kube-api-access-dvtrf\") pod \"community-operators-zmsbv\" (UID: \"8065abae-3351-4daf-9aff-8bf97affce6a\") " pod="openshift-marketplace/community-operators-zmsbv" Sep 30 08:45:20 crc kubenswrapper[4760]: I0930 08:45:20.090574 4760 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8065abae-3351-4daf-9aff-8bf97affce6a-utilities\") pod \"community-operators-zmsbv\" (UID: \"8065abae-3351-4daf-9aff-8bf97affce6a\") " pod="openshift-marketplace/community-operators-zmsbv" Sep 30 08:45:20 crc kubenswrapper[4760]: I0930 08:45:20.191950 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8065abae-3351-4daf-9aff-8bf97affce6a-catalog-content\") pod \"community-operators-zmsbv\" (UID: \"8065abae-3351-4daf-9aff-8bf97affce6a\") " pod="openshift-marketplace/community-operators-zmsbv" Sep 30 08:45:20 crc kubenswrapper[4760]: I0930 08:45:20.192357 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dvtrf\" (UniqueName: \"kubernetes.io/projected/8065abae-3351-4daf-9aff-8bf97affce6a-kube-api-access-dvtrf\") pod \"community-operators-zmsbv\" (UID: \"8065abae-3351-4daf-9aff-8bf97affce6a\") " pod="openshift-marketplace/community-operators-zmsbv" Sep 30 08:45:20 crc kubenswrapper[4760]: I0930 08:45:20.192420 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8065abae-3351-4daf-9aff-8bf97affce6a-utilities\") pod \"community-operators-zmsbv\" (UID: \"8065abae-3351-4daf-9aff-8bf97affce6a\") " pod="openshift-marketplace/community-operators-zmsbv" Sep 30 08:45:20 crc kubenswrapper[4760]: I0930 08:45:20.192803 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8065abae-3351-4daf-9aff-8bf97affce6a-utilities\") pod \"community-operators-zmsbv\" (UID: \"8065abae-3351-4daf-9aff-8bf97affce6a\") " pod="openshift-marketplace/community-operators-zmsbv" Sep 30 08:45:20 crc kubenswrapper[4760]: I0930 08:45:20.193121 4760 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8065abae-3351-4daf-9aff-8bf97affce6a-catalog-content\") pod \"community-operators-zmsbv\" (UID: \"8065abae-3351-4daf-9aff-8bf97affce6a\") " pod="openshift-marketplace/community-operators-zmsbv" Sep 30 08:45:20 crc kubenswrapper[4760]: I0930 08:45:20.214833 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dvtrf\" (UniqueName: \"kubernetes.io/projected/8065abae-3351-4daf-9aff-8bf97affce6a-kube-api-access-dvtrf\") pod \"community-operators-zmsbv\" (UID: \"8065abae-3351-4daf-9aff-8bf97affce6a\") " pod="openshift-marketplace/community-operators-zmsbv" Sep 30 08:45:20 crc kubenswrapper[4760]: I0930 08:45:20.260517 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-zmsbv" Sep 30 08:45:20 crc kubenswrapper[4760]: I0930 08:45:20.810569 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-zmsbv"] Sep 30 08:45:21 crc kubenswrapper[4760]: I0930 08:45:21.099924 4760 generic.go:334] "Generic (PLEG): container finished" podID="8065abae-3351-4daf-9aff-8bf97affce6a" containerID="8b7391f02d620c4c3352df7847f11874528ca0cffcac4104cb932e636888d97b" exitCode=0 Sep 30 08:45:21 crc kubenswrapper[4760]: I0930 08:45:21.100050 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zmsbv" event={"ID":"8065abae-3351-4daf-9aff-8bf97affce6a","Type":"ContainerDied","Data":"8b7391f02d620c4c3352df7847f11874528ca0cffcac4104cb932e636888d97b"} Sep 30 08:45:21 crc kubenswrapper[4760]: I0930 08:45:21.101925 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zmsbv" event={"ID":"8065abae-3351-4daf-9aff-8bf97affce6a","Type":"ContainerStarted","Data":"2d20e2450a2b343f69366df5068e8635f209da2255f5957eb38061aa353e2346"} Sep 30 08:45:23 crc kubenswrapper[4760]: I0930 
08:45:23.297490 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-8dz44"] Sep 30 08:45:23 crc kubenswrapper[4760]: I0930 08:45:23.302008 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-8dz44" Sep 30 08:45:23 crc kubenswrapper[4760]: I0930 08:45:23.312883 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-8dz44"] Sep 30 08:45:23 crc kubenswrapper[4760]: I0930 08:45:23.462812 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/478ced0c-4e4b-41d9-811a-69ad798a6d7c-utilities\") pod \"certified-operators-8dz44\" (UID: \"478ced0c-4e4b-41d9-811a-69ad798a6d7c\") " pod="openshift-marketplace/certified-operators-8dz44" Sep 30 08:45:23 crc kubenswrapper[4760]: I0930 08:45:23.463017 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/478ced0c-4e4b-41d9-811a-69ad798a6d7c-catalog-content\") pod \"certified-operators-8dz44\" (UID: \"478ced0c-4e4b-41d9-811a-69ad798a6d7c\") " pod="openshift-marketplace/certified-operators-8dz44" Sep 30 08:45:23 crc kubenswrapper[4760]: I0930 08:45:23.463053 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qchkb\" (UniqueName: \"kubernetes.io/projected/478ced0c-4e4b-41d9-811a-69ad798a6d7c-kube-api-access-qchkb\") pod \"certified-operators-8dz44\" (UID: \"478ced0c-4e4b-41d9-811a-69ad798a6d7c\") " pod="openshift-marketplace/certified-operators-8dz44" Sep 30 08:45:23 crc kubenswrapper[4760]: I0930 08:45:23.565508 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qchkb\" (UniqueName: 
\"kubernetes.io/projected/478ced0c-4e4b-41d9-811a-69ad798a6d7c-kube-api-access-qchkb\") pod \"certified-operators-8dz44\" (UID: \"478ced0c-4e4b-41d9-811a-69ad798a6d7c\") " pod="openshift-marketplace/certified-operators-8dz44" Sep 30 08:45:23 crc kubenswrapper[4760]: I0930 08:45:23.565674 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/478ced0c-4e4b-41d9-811a-69ad798a6d7c-utilities\") pod \"certified-operators-8dz44\" (UID: \"478ced0c-4e4b-41d9-811a-69ad798a6d7c\") " pod="openshift-marketplace/certified-operators-8dz44" Sep 30 08:45:23 crc kubenswrapper[4760]: I0930 08:45:23.567864 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/478ced0c-4e4b-41d9-811a-69ad798a6d7c-catalog-content\") pod \"certified-operators-8dz44\" (UID: \"478ced0c-4e4b-41d9-811a-69ad798a6d7c\") " pod="openshift-marketplace/certified-operators-8dz44" Sep 30 08:45:23 crc kubenswrapper[4760]: I0930 08:45:23.568592 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/478ced0c-4e4b-41d9-811a-69ad798a6d7c-utilities\") pod \"certified-operators-8dz44\" (UID: \"478ced0c-4e4b-41d9-811a-69ad798a6d7c\") " pod="openshift-marketplace/certified-operators-8dz44" Sep 30 08:45:23 crc kubenswrapper[4760]: I0930 08:45:23.568670 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/478ced0c-4e4b-41d9-811a-69ad798a6d7c-catalog-content\") pod \"certified-operators-8dz44\" (UID: \"478ced0c-4e4b-41d9-811a-69ad798a6d7c\") " pod="openshift-marketplace/certified-operators-8dz44" Sep 30 08:45:23 crc kubenswrapper[4760]: I0930 08:45:23.592413 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qchkb\" (UniqueName: 
\"kubernetes.io/projected/478ced0c-4e4b-41d9-811a-69ad798a6d7c-kube-api-access-qchkb\") pod \"certified-operators-8dz44\" (UID: \"478ced0c-4e4b-41d9-811a-69ad798a6d7c\") " pod="openshift-marketplace/certified-operators-8dz44" Sep 30 08:45:23 crc kubenswrapper[4760]: I0930 08:45:23.624859 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-8dz44" Sep 30 08:45:25 crc kubenswrapper[4760]: I0930 08:45:25.476232 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-8dz44"] Sep 30 08:45:26 crc kubenswrapper[4760]: I0930 08:45:26.163670 4760 generic.go:334] "Generic (PLEG): container finished" podID="8065abae-3351-4daf-9aff-8bf97affce6a" containerID="0989cf3b4049074767c0b6f00d9516ce44e30b3b9895bfee58242b2e87b1cdce" exitCode=0 Sep 30 08:45:26 crc kubenswrapper[4760]: I0930 08:45:26.163751 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zmsbv" event={"ID":"8065abae-3351-4daf-9aff-8bf97affce6a","Type":"ContainerDied","Data":"0989cf3b4049074767c0b6f00d9516ce44e30b3b9895bfee58242b2e87b1cdce"} Sep 30 08:45:26 crc kubenswrapper[4760]: I0930 08:45:26.165406 4760 generic.go:334] "Generic (PLEG): container finished" podID="478ced0c-4e4b-41d9-811a-69ad798a6d7c" containerID="a6c6266b2adbd91b41fe493fa3a4f508a3505c8e389ff6f306fd7a0e21b4456c" exitCode=0 Sep 30 08:45:26 crc kubenswrapper[4760]: I0930 08:45:26.165447 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8dz44" event={"ID":"478ced0c-4e4b-41d9-811a-69ad798a6d7c","Type":"ContainerDied","Data":"a6c6266b2adbd91b41fe493fa3a4f508a3505c8e389ff6f306fd7a0e21b4456c"} Sep 30 08:45:26 crc kubenswrapper[4760]: I0930 08:45:26.165484 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8dz44" 
event={"ID":"478ced0c-4e4b-41d9-811a-69ad798a6d7c","Type":"ContainerStarted","Data":"58bc024a4875ed9d3db4f92589302e0457879af016e76c8690746f8ee74613a8"} Sep 30 08:45:27 crc kubenswrapper[4760]: I0930 08:45:27.187468 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zmsbv" event={"ID":"8065abae-3351-4daf-9aff-8bf97affce6a","Type":"ContainerStarted","Data":"a8fc4248a49e404ec617a313413fba95062ebaf46b5cc4b00517eb6ded4ce5bc"} Sep 30 08:45:27 crc kubenswrapper[4760]: I0930 08:45:27.206483 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8dz44" event={"ID":"478ced0c-4e4b-41d9-811a-69ad798a6d7c","Type":"ContainerStarted","Data":"396c8fea700f3c427de5bbd75960277398daec87ab38d9b491ae0847e840a676"} Sep 30 08:45:27 crc kubenswrapper[4760]: I0930 08:45:27.222716 4760 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-zmsbv" podStartSLOduration=2.69342149 podStartE2EDuration="8.222694283s" podCreationTimestamp="2025-09-30 08:45:19 +0000 UTC" firstStartedPulling="2025-09-30 08:45:21.101954134 +0000 UTC m=+4306.744860546" lastFinishedPulling="2025-09-30 08:45:26.631226917 +0000 UTC m=+4312.274133339" observedRunningTime="2025-09-30 08:45:27.216140086 +0000 UTC m=+4312.859046508" watchObservedRunningTime="2025-09-30 08:45:27.222694283 +0000 UTC m=+4312.865600695" Sep 30 08:45:29 crc kubenswrapper[4760]: I0930 08:45:29.067642 4760 scope.go:117] "RemoveContainer" containerID="131b74da1ab5f542facd8050213c6315cd36b2dcf16917068bb69bf076b6760b" Sep 30 08:45:29 crc kubenswrapper[4760]: E0930 08:45:29.068585 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f2lrk_openshift-machine-config-operator(7a9c8270-6964-4886-87d0-227b05b76da4)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-f2lrk" podUID="7a9c8270-6964-4886-87d0-227b05b76da4" Sep 30 08:45:30 crc kubenswrapper[4760]: I0930 08:45:30.243514 4760 generic.go:334] "Generic (PLEG): container finished" podID="478ced0c-4e4b-41d9-811a-69ad798a6d7c" containerID="396c8fea700f3c427de5bbd75960277398daec87ab38d9b491ae0847e840a676" exitCode=0 Sep 30 08:45:30 crc kubenswrapper[4760]: I0930 08:45:30.243622 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8dz44" event={"ID":"478ced0c-4e4b-41d9-811a-69ad798a6d7c","Type":"ContainerDied","Data":"396c8fea700f3c427de5bbd75960277398daec87ab38d9b491ae0847e840a676"} Sep 30 08:45:30 crc kubenswrapper[4760]: I0930 08:45:30.260910 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-zmsbv" Sep 30 08:45:30 crc kubenswrapper[4760]: I0930 08:45:30.260957 4760 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-zmsbv" Sep 30 08:45:30 crc kubenswrapper[4760]: I0930 08:45:30.327723 4760 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-zmsbv" Sep 30 08:45:31 crc kubenswrapper[4760]: I0930 08:45:31.255498 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8dz44" event={"ID":"478ced0c-4e4b-41d9-811a-69ad798a6d7c","Type":"ContainerStarted","Data":"f9167d70e472b9bb0a5da4006a8b48cc914a74ca4ad1d97aab0c3c42aa03b209"} Sep 30 08:45:31 crc kubenswrapper[4760]: I0930 08:45:31.287120 4760 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-8dz44" podStartSLOduration=3.646865996 podStartE2EDuration="8.287101711s" podCreationTimestamp="2025-09-30 08:45:23 +0000 UTC" firstStartedPulling="2025-09-30 08:45:26.167658269 +0000 UTC m=+4311.810564681" 
lastFinishedPulling="2025-09-30 08:45:30.807893974 +0000 UTC m=+4316.450800396" observedRunningTime="2025-09-30 08:45:31.273993677 +0000 UTC m=+4316.916900169" watchObservedRunningTime="2025-09-30 08:45:31.287101711 +0000 UTC m=+4316.930008123" Sep 30 08:45:31 crc kubenswrapper[4760]: I0930 08:45:31.330118 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-zmsbv" Sep 30 08:45:32 crc kubenswrapper[4760]: I0930 08:45:32.329107 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-zmsbv"] Sep 30 08:45:32 crc kubenswrapper[4760]: I0930 08:45:32.690244 4760 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-59dbp"] Sep 30 08:45:32 crc kubenswrapper[4760]: I0930 08:45:32.690537 4760 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-59dbp" podUID="313bc92e-49cd-492e-939a-af5547c47e72" containerName="registry-server" containerID="cri-o://ee46946f94dd863d11ab2ba0ff9e647a4b7c63ae2415f7da4650bce5840d72d5" gracePeriod=2 Sep 30 08:45:33 crc kubenswrapper[4760]: I0930 08:45:33.282646 4760 generic.go:334] "Generic (PLEG): container finished" podID="313bc92e-49cd-492e-939a-af5547c47e72" containerID="ee46946f94dd863d11ab2ba0ff9e647a4b7c63ae2415f7da4650bce5840d72d5" exitCode=0 Sep 30 08:45:33 crc kubenswrapper[4760]: I0930 08:45:33.282739 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-59dbp" event={"ID":"313bc92e-49cd-492e-939a-af5547c47e72","Type":"ContainerDied","Data":"ee46946f94dd863d11ab2ba0ff9e647a4b7c63ae2415f7da4650bce5840d72d5"} Sep 30 08:45:33 crc kubenswrapper[4760]: I0930 08:45:33.283286 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-59dbp" 
event={"ID":"313bc92e-49cd-492e-939a-af5547c47e72","Type":"ContainerDied","Data":"4af6057e7b37ce4966212a35993fab1fbb33388acbb3e536ed3623a0636c6354"} Sep 30 08:45:33 crc kubenswrapper[4760]: I0930 08:45:33.283325 4760 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4af6057e7b37ce4966212a35993fab1fbb33388acbb3e536ed3623a0636c6354" Sep 30 08:45:33 crc kubenswrapper[4760]: I0930 08:45:33.292793 4760 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-59dbp" Sep 30 08:45:33 crc kubenswrapper[4760]: I0930 08:45:33.398927 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/313bc92e-49cd-492e-939a-af5547c47e72-utilities\") pod \"313bc92e-49cd-492e-939a-af5547c47e72\" (UID: \"313bc92e-49cd-492e-939a-af5547c47e72\") " Sep 30 08:45:33 crc kubenswrapper[4760]: I0930 08:45:33.399096 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qh2gl\" (UniqueName: \"kubernetes.io/projected/313bc92e-49cd-492e-939a-af5547c47e72-kube-api-access-qh2gl\") pod \"313bc92e-49cd-492e-939a-af5547c47e72\" (UID: \"313bc92e-49cd-492e-939a-af5547c47e72\") " Sep 30 08:45:33 crc kubenswrapper[4760]: I0930 08:45:33.399202 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/313bc92e-49cd-492e-939a-af5547c47e72-catalog-content\") pod \"313bc92e-49cd-492e-939a-af5547c47e72\" (UID: \"313bc92e-49cd-492e-939a-af5547c47e72\") " Sep 30 08:45:33 crc kubenswrapper[4760]: I0930 08:45:33.400137 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/313bc92e-49cd-492e-939a-af5547c47e72-utilities" (OuterVolumeSpecName: "utilities") pod "313bc92e-49cd-492e-939a-af5547c47e72" (UID: "313bc92e-49cd-492e-939a-af5547c47e72"). 
InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 08:45:33 crc kubenswrapper[4760]: I0930 08:45:33.410969 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/313bc92e-49cd-492e-939a-af5547c47e72-kube-api-access-qh2gl" (OuterVolumeSpecName: "kube-api-access-qh2gl") pod "313bc92e-49cd-492e-939a-af5547c47e72" (UID: "313bc92e-49cd-492e-939a-af5547c47e72"). InnerVolumeSpecName "kube-api-access-qh2gl". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 08:45:33 crc kubenswrapper[4760]: I0930 08:45:33.449359 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/313bc92e-49cd-492e-939a-af5547c47e72-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "313bc92e-49cd-492e-939a-af5547c47e72" (UID: "313bc92e-49cd-492e-939a-af5547c47e72"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 08:45:33 crc kubenswrapper[4760]: I0930 08:45:33.501398 4760 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/313bc92e-49cd-492e-939a-af5547c47e72-utilities\") on node \"crc\" DevicePath \"\"" Sep 30 08:45:33 crc kubenswrapper[4760]: I0930 08:45:33.501598 4760 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qh2gl\" (UniqueName: \"kubernetes.io/projected/313bc92e-49cd-492e-939a-af5547c47e72-kube-api-access-qh2gl\") on node \"crc\" DevicePath \"\"" Sep 30 08:45:33 crc kubenswrapper[4760]: I0930 08:45:33.501693 4760 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/313bc92e-49cd-492e-939a-af5547c47e72-catalog-content\") on node \"crc\" DevicePath \"\"" Sep 30 08:45:33 crc kubenswrapper[4760]: I0930 08:45:33.626017 4760 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-8dz44" 
Sep 30 08:45:33 crc kubenswrapper[4760]: I0930 08:45:33.626092 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-8dz44" Sep 30 08:45:33 crc kubenswrapper[4760]: I0930 08:45:33.718553 4760 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-8dz44" Sep 30 08:45:34 crc kubenswrapper[4760]: I0930 08:45:34.297067 4760 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-59dbp" Sep 30 08:45:34 crc kubenswrapper[4760]: I0930 08:45:34.363932 4760 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-59dbp"] Sep 30 08:45:34 crc kubenswrapper[4760]: I0930 08:45:34.386945 4760 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-59dbp"] Sep 30 08:45:35 crc kubenswrapper[4760]: I0930 08:45:35.109724 4760 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="313bc92e-49cd-492e-939a-af5547c47e72" path="/var/lib/kubelet/pods/313bc92e-49cd-492e-939a-af5547c47e72/volumes" Sep 30 08:45:43 crc kubenswrapper[4760]: I0930 08:45:43.068358 4760 scope.go:117] "RemoveContainer" containerID="131b74da1ab5f542facd8050213c6315cd36b2dcf16917068bb69bf076b6760b" Sep 30 08:45:43 crc kubenswrapper[4760]: E0930 08:45:43.070614 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f2lrk_openshift-machine-config-operator(7a9c8270-6964-4886-87d0-227b05b76da4)\"" pod="openshift-machine-config-operator/machine-config-daemon-f2lrk" podUID="7a9c8270-6964-4886-87d0-227b05b76da4" Sep 30 08:45:43 crc kubenswrapper[4760]: I0930 08:45:43.688206 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-marketplace/certified-operators-8dz44" Sep 30 08:45:43 crc kubenswrapper[4760]: I0930 08:45:43.737409 4760 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-8dz44"] Sep 30 08:45:44 crc kubenswrapper[4760]: I0930 08:45:44.402456 4760 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-8dz44" podUID="478ced0c-4e4b-41d9-811a-69ad798a6d7c" containerName="registry-server" containerID="cri-o://f9167d70e472b9bb0a5da4006a8b48cc914a74ca4ad1d97aab0c3c42aa03b209" gracePeriod=2 Sep 30 08:45:45 crc kubenswrapper[4760]: I0930 08:45:45.348589 4760 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-8dz44" Sep 30 08:45:45 crc kubenswrapper[4760]: I0930 08:45:45.413251 4760 generic.go:334] "Generic (PLEG): container finished" podID="478ced0c-4e4b-41d9-811a-69ad798a6d7c" containerID="f9167d70e472b9bb0a5da4006a8b48cc914a74ca4ad1d97aab0c3c42aa03b209" exitCode=0 Sep 30 08:45:45 crc kubenswrapper[4760]: I0930 08:45:45.413317 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8dz44" event={"ID":"478ced0c-4e4b-41d9-811a-69ad798a6d7c","Type":"ContainerDied","Data":"f9167d70e472b9bb0a5da4006a8b48cc914a74ca4ad1d97aab0c3c42aa03b209"} Sep 30 08:45:45 crc kubenswrapper[4760]: I0930 08:45:45.413358 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8dz44" event={"ID":"478ced0c-4e4b-41d9-811a-69ad798a6d7c","Type":"ContainerDied","Data":"58bc024a4875ed9d3db4f92589302e0457879af016e76c8690746f8ee74613a8"} Sep 30 08:45:45 crc kubenswrapper[4760]: I0930 08:45:45.413379 4760 scope.go:117] "RemoveContainer" containerID="f9167d70e472b9bb0a5da4006a8b48cc914a74ca4ad1d97aab0c3c42aa03b209" Sep 30 08:45:45 crc kubenswrapper[4760]: I0930 08:45:45.413394 4760 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-8dz44" Sep 30 08:45:45 crc kubenswrapper[4760]: I0930 08:45:45.437077 4760 scope.go:117] "RemoveContainer" containerID="396c8fea700f3c427de5bbd75960277398daec87ab38d9b491ae0847e840a676" Sep 30 08:45:45 crc kubenswrapper[4760]: I0930 08:45:45.454379 4760 scope.go:117] "RemoveContainer" containerID="a6c6266b2adbd91b41fe493fa3a4f508a3505c8e389ff6f306fd7a0e21b4456c" Sep 30 08:45:45 crc kubenswrapper[4760]: I0930 08:45:45.460946 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/478ced0c-4e4b-41d9-811a-69ad798a6d7c-utilities\") pod \"478ced0c-4e4b-41d9-811a-69ad798a6d7c\" (UID: \"478ced0c-4e4b-41d9-811a-69ad798a6d7c\") " Sep 30 08:45:45 crc kubenswrapper[4760]: I0930 08:45:45.461075 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/478ced0c-4e4b-41d9-811a-69ad798a6d7c-catalog-content\") pod \"478ced0c-4e4b-41d9-811a-69ad798a6d7c\" (UID: \"478ced0c-4e4b-41d9-811a-69ad798a6d7c\") " Sep 30 08:45:45 crc kubenswrapper[4760]: I0930 08:45:45.461143 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qchkb\" (UniqueName: \"kubernetes.io/projected/478ced0c-4e4b-41d9-811a-69ad798a6d7c-kube-api-access-qchkb\") pod \"478ced0c-4e4b-41d9-811a-69ad798a6d7c\" (UID: \"478ced0c-4e4b-41d9-811a-69ad798a6d7c\") " Sep 30 08:45:45 crc kubenswrapper[4760]: I0930 08:45:45.461943 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/478ced0c-4e4b-41d9-811a-69ad798a6d7c-utilities" (OuterVolumeSpecName: "utilities") pod "478ced0c-4e4b-41d9-811a-69ad798a6d7c" (UID: "478ced0c-4e4b-41d9-811a-69ad798a6d7c"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 08:45:45 crc kubenswrapper[4760]: I0930 08:45:45.462447 4760 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/478ced0c-4e4b-41d9-811a-69ad798a6d7c-utilities\") on node \"crc\" DevicePath \"\"" Sep 30 08:45:45 crc kubenswrapper[4760]: I0930 08:45:45.467238 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/478ced0c-4e4b-41d9-811a-69ad798a6d7c-kube-api-access-qchkb" (OuterVolumeSpecName: "kube-api-access-qchkb") pod "478ced0c-4e4b-41d9-811a-69ad798a6d7c" (UID: "478ced0c-4e4b-41d9-811a-69ad798a6d7c"). InnerVolumeSpecName "kube-api-access-qchkb". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 08:45:45 crc kubenswrapper[4760]: I0930 08:45:45.536763 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/478ced0c-4e4b-41d9-811a-69ad798a6d7c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "478ced0c-4e4b-41d9-811a-69ad798a6d7c" (UID: "478ced0c-4e4b-41d9-811a-69ad798a6d7c"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 08:45:45 crc kubenswrapper[4760]: I0930 08:45:45.564051 4760 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/478ced0c-4e4b-41d9-811a-69ad798a6d7c-catalog-content\") on node \"crc\" DevicePath \"\"" Sep 30 08:45:45 crc kubenswrapper[4760]: I0930 08:45:45.564094 4760 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qchkb\" (UniqueName: \"kubernetes.io/projected/478ced0c-4e4b-41d9-811a-69ad798a6d7c-kube-api-access-qchkb\") on node \"crc\" DevicePath \"\"" Sep 30 08:45:45 crc kubenswrapper[4760]: I0930 08:45:45.566952 4760 scope.go:117] "RemoveContainer" containerID="f9167d70e472b9bb0a5da4006a8b48cc914a74ca4ad1d97aab0c3c42aa03b209" Sep 30 08:45:45 crc kubenswrapper[4760]: E0930 08:45:45.567372 4760 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f9167d70e472b9bb0a5da4006a8b48cc914a74ca4ad1d97aab0c3c42aa03b209\": container with ID starting with f9167d70e472b9bb0a5da4006a8b48cc914a74ca4ad1d97aab0c3c42aa03b209 not found: ID does not exist" containerID="f9167d70e472b9bb0a5da4006a8b48cc914a74ca4ad1d97aab0c3c42aa03b209" Sep 30 08:45:45 crc kubenswrapper[4760]: I0930 08:45:45.567416 4760 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f9167d70e472b9bb0a5da4006a8b48cc914a74ca4ad1d97aab0c3c42aa03b209"} err="failed to get container status \"f9167d70e472b9bb0a5da4006a8b48cc914a74ca4ad1d97aab0c3c42aa03b209\": rpc error: code = NotFound desc = could not find container \"f9167d70e472b9bb0a5da4006a8b48cc914a74ca4ad1d97aab0c3c42aa03b209\": container with ID starting with f9167d70e472b9bb0a5da4006a8b48cc914a74ca4ad1d97aab0c3c42aa03b209 not found: ID does not exist" Sep 30 08:45:45 crc kubenswrapper[4760]: I0930 08:45:45.567442 4760 scope.go:117] "RemoveContainer" 
containerID="396c8fea700f3c427de5bbd75960277398daec87ab38d9b491ae0847e840a676" Sep 30 08:45:45 crc kubenswrapper[4760]: E0930 08:45:45.567643 4760 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"396c8fea700f3c427de5bbd75960277398daec87ab38d9b491ae0847e840a676\": container with ID starting with 396c8fea700f3c427de5bbd75960277398daec87ab38d9b491ae0847e840a676 not found: ID does not exist" containerID="396c8fea700f3c427de5bbd75960277398daec87ab38d9b491ae0847e840a676" Sep 30 08:45:45 crc kubenswrapper[4760]: I0930 08:45:45.567663 4760 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"396c8fea700f3c427de5bbd75960277398daec87ab38d9b491ae0847e840a676"} err="failed to get container status \"396c8fea700f3c427de5bbd75960277398daec87ab38d9b491ae0847e840a676\": rpc error: code = NotFound desc = could not find container \"396c8fea700f3c427de5bbd75960277398daec87ab38d9b491ae0847e840a676\": container with ID starting with 396c8fea700f3c427de5bbd75960277398daec87ab38d9b491ae0847e840a676 not found: ID does not exist" Sep 30 08:45:45 crc kubenswrapper[4760]: I0930 08:45:45.567676 4760 scope.go:117] "RemoveContainer" containerID="a6c6266b2adbd91b41fe493fa3a4f508a3505c8e389ff6f306fd7a0e21b4456c" Sep 30 08:45:45 crc kubenswrapper[4760]: E0930 08:45:45.567820 4760 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a6c6266b2adbd91b41fe493fa3a4f508a3505c8e389ff6f306fd7a0e21b4456c\": container with ID starting with a6c6266b2adbd91b41fe493fa3a4f508a3505c8e389ff6f306fd7a0e21b4456c not found: ID does not exist" containerID="a6c6266b2adbd91b41fe493fa3a4f508a3505c8e389ff6f306fd7a0e21b4456c" Sep 30 08:45:45 crc kubenswrapper[4760]: I0930 08:45:45.567837 4760 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"a6c6266b2adbd91b41fe493fa3a4f508a3505c8e389ff6f306fd7a0e21b4456c"} err="failed to get container status \"a6c6266b2adbd91b41fe493fa3a4f508a3505c8e389ff6f306fd7a0e21b4456c\": rpc error: code = NotFound desc = could not find container \"a6c6266b2adbd91b41fe493fa3a4f508a3505c8e389ff6f306fd7a0e21b4456c\": container with ID starting with a6c6266b2adbd91b41fe493fa3a4f508a3505c8e389ff6f306fd7a0e21b4456c not found: ID does not exist" Sep 30 08:45:45 crc kubenswrapper[4760]: I0930 08:45:45.751328 4760 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-8dz44"] Sep 30 08:45:45 crc kubenswrapper[4760]: I0930 08:45:45.762146 4760 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-8dz44"] Sep 30 08:45:47 crc kubenswrapper[4760]: I0930 08:45:47.079222 4760 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="478ced0c-4e4b-41d9-811a-69ad798a6d7c" path="/var/lib/kubelet/pods/478ced0c-4e4b-41d9-811a-69ad798a6d7c/volumes" Sep 30 08:45:47 crc kubenswrapper[4760]: I0930 08:45:47.105070 4760 scope.go:117] "RemoveContainer" containerID="ee46946f94dd863d11ab2ba0ff9e647a4b7c63ae2415f7da4650bce5840d72d5" Sep 30 08:45:47 crc kubenswrapper[4760]: I0930 08:45:47.138574 4760 scope.go:117] "RemoveContainer" containerID="a4a0a5a865e904ced8f2ccdaed35358509fb83096ba4b2d529fe9267361ca84c" Sep 30 08:45:47 crc kubenswrapper[4760]: I0930 08:45:47.173569 4760 scope.go:117] "RemoveContainer" containerID="9871396eaf08403a099081a357f0f57fdb69af97b635a87c5fd9ba0b47d0a542" Sep 30 08:45:47 crc kubenswrapper[4760]: I0930 08:45:47.225354 4760 scope.go:117] "RemoveContainer" containerID="8b41c3bb402c4574beb6208209dd8060dc00323c8216eaf9f9a62f90d834b9f0" Sep 30 08:45:53 crc kubenswrapper[4760]: I0930 08:45:53.354790 4760 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/swift-proxy-6cc97c56c5-7pkjn" podUID="fc788440-e748-4b41-bdb6-23a6764062fd" 
containerName="proxy-httpd" probeResult="failure" output="HTTP probe failed with statuscode: 502" Sep 30 08:45:58 crc kubenswrapper[4760]: I0930 08:45:58.067046 4760 scope.go:117] "RemoveContainer" containerID="131b74da1ab5f542facd8050213c6315cd36b2dcf16917068bb69bf076b6760b" Sep 30 08:45:58 crc kubenswrapper[4760]: E0930 08:45:58.067928 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f2lrk_openshift-machine-config-operator(7a9c8270-6964-4886-87d0-227b05b76da4)\"" pod="openshift-machine-config-operator/machine-config-daemon-f2lrk" podUID="7a9c8270-6964-4886-87d0-227b05b76da4" Sep 30 08:46:11 crc kubenswrapper[4760]: I0930 08:46:11.070974 4760 scope.go:117] "RemoveContainer" containerID="131b74da1ab5f542facd8050213c6315cd36b2dcf16917068bb69bf076b6760b" Sep 30 08:46:11 crc kubenswrapper[4760]: E0930 08:46:11.072039 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f2lrk_openshift-machine-config-operator(7a9c8270-6964-4886-87d0-227b05b76da4)\"" pod="openshift-machine-config-operator/machine-config-daemon-f2lrk" podUID="7a9c8270-6964-4886-87d0-227b05b76da4" Sep 30 08:46:24 crc kubenswrapper[4760]: I0930 08:46:24.067441 4760 scope.go:117] "RemoveContainer" containerID="131b74da1ab5f542facd8050213c6315cd36b2dcf16917068bb69bf076b6760b" Sep 30 08:46:24 crc kubenswrapper[4760]: E0930 08:46:24.068585 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f2lrk_openshift-machine-config-operator(7a9c8270-6964-4886-87d0-227b05b76da4)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-f2lrk" podUID="7a9c8270-6964-4886-87d0-227b05b76da4" Sep 30 08:46:35 crc kubenswrapper[4760]: I0930 08:46:35.079495 4760 scope.go:117] "RemoveContainer" containerID="131b74da1ab5f542facd8050213c6315cd36b2dcf16917068bb69bf076b6760b" Sep 30 08:46:35 crc kubenswrapper[4760]: E0930 08:46:35.081436 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f2lrk_openshift-machine-config-operator(7a9c8270-6964-4886-87d0-227b05b76da4)\"" pod="openshift-machine-config-operator/machine-config-daemon-f2lrk" podUID="7a9c8270-6964-4886-87d0-227b05b76da4" Sep 30 08:46:48 crc kubenswrapper[4760]: I0930 08:46:48.066762 4760 scope.go:117] "RemoveContainer" containerID="131b74da1ab5f542facd8050213c6315cd36b2dcf16917068bb69bf076b6760b" Sep 30 08:46:48 crc kubenswrapper[4760]: E0930 08:46:48.067568 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f2lrk_openshift-machine-config-operator(7a9c8270-6964-4886-87d0-227b05b76da4)\"" pod="openshift-machine-config-operator/machine-config-daemon-f2lrk" podUID="7a9c8270-6964-4886-87d0-227b05b76da4" Sep 30 08:46:59 crc kubenswrapper[4760]: I0930 08:46:59.067576 4760 scope.go:117] "RemoveContainer" containerID="131b74da1ab5f542facd8050213c6315cd36b2dcf16917068bb69bf076b6760b" Sep 30 08:46:59 crc kubenswrapper[4760]: E0930 08:46:59.068477 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-f2lrk_openshift-machine-config-operator(7a9c8270-6964-4886-87d0-227b05b76da4)\"" pod="openshift-machine-config-operator/machine-config-daemon-f2lrk" podUID="7a9c8270-6964-4886-87d0-227b05b76da4" Sep 30 08:47:11 crc kubenswrapper[4760]: I0930 08:47:11.067407 4760 scope.go:117] "RemoveContainer" containerID="131b74da1ab5f542facd8050213c6315cd36b2dcf16917068bb69bf076b6760b" Sep 30 08:47:11 crc kubenswrapper[4760]: E0930 08:47:11.068531 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f2lrk_openshift-machine-config-operator(7a9c8270-6964-4886-87d0-227b05b76da4)\"" pod="openshift-machine-config-operator/machine-config-daemon-f2lrk" podUID="7a9c8270-6964-4886-87d0-227b05b76da4" Sep 30 08:47:24 crc kubenswrapper[4760]: I0930 08:47:24.067571 4760 scope.go:117] "RemoveContainer" containerID="131b74da1ab5f542facd8050213c6315cd36b2dcf16917068bb69bf076b6760b" Sep 30 08:47:24 crc kubenswrapper[4760]: E0930 08:47:24.071694 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f2lrk_openshift-machine-config-operator(7a9c8270-6964-4886-87d0-227b05b76da4)\"" pod="openshift-machine-config-operator/machine-config-daemon-f2lrk" podUID="7a9c8270-6964-4886-87d0-227b05b76da4" Sep 30 08:47:35 crc kubenswrapper[4760]: I0930 08:47:35.874806 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-tjgq8"] Sep 30 08:47:35 crc kubenswrapper[4760]: E0930 08:47:35.875960 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="478ced0c-4e4b-41d9-811a-69ad798a6d7c" containerName="extract-content" Sep 30 08:47:35 crc kubenswrapper[4760]: I0930 
08:47:35.875979 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="478ced0c-4e4b-41d9-811a-69ad798a6d7c" containerName="extract-content" Sep 30 08:47:35 crc kubenswrapper[4760]: E0930 08:47:35.875998 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="478ced0c-4e4b-41d9-811a-69ad798a6d7c" containerName="extract-utilities" Sep 30 08:47:35 crc kubenswrapper[4760]: I0930 08:47:35.876007 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="478ced0c-4e4b-41d9-811a-69ad798a6d7c" containerName="extract-utilities" Sep 30 08:47:35 crc kubenswrapper[4760]: E0930 08:47:35.876025 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="313bc92e-49cd-492e-939a-af5547c47e72" containerName="extract-content" Sep 30 08:47:35 crc kubenswrapper[4760]: I0930 08:47:35.876033 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="313bc92e-49cd-492e-939a-af5547c47e72" containerName="extract-content" Sep 30 08:47:35 crc kubenswrapper[4760]: E0930 08:47:35.876062 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="313bc92e-49cd-492e-939a-af5547c47e72" containerName="registry-server" Sep 30 08:47:35 crc kubenswrapper[4760]: I0930 08:47:35.876070 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="313bc92e-49cd-492e-939a-af5547c47e72" containerName="registry-server" Sep 30 08:47:35 crc kubenswrapper[4760]: E0930 08:47:35.876085 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="478ced0c-4e4b-41d9-811a-69ad798a6d7c" containerName="registry-server" Sep 30 08:47:35 crc kubenswrapper[4760]: I0930 08:47:35.876101 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="478ced0c-4e4b-41d9-811a-69ad798a6d7c" containerName="registry-server" Sep 30 08:47:35 crc kubenswrapper[4760]: E0930 08:47:35.876135 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="313bc92e-49cd-492e-939a-af5547c47e72" containerName="extract-utilities" Sep 30 08:47:35 crc kubenswrapper[4760]: I0930 
08:47:35.876146 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="313bc92e-49cd-492e-939a-af5547c47e72" containerName="extract-utilities" Sep 30 08:47:35 crc kubenswrapper[4760]: I0930 08:47:35.876470 4760 memory_manager.go:354] "RemoveStaleState removing state" podUID="478ced0c-4e4b-41d9-811a-69ad798a6d7c" containerName="registry-server" Sep 30 08:47:35 crc kubenswrapper[4760]: I0930 08:47:35.876517 4760 memory_manager.go:354] "RemoveStaleState removing state" podUID="313bc92e-49cd-492e-939a-af5547c47e72" containerName="registry-server" Sep 30 08:47:35 crc kubenswrapper[4760]: I0930 08:47:35.879355 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-tjgq8" Sep 30 08:47:35 crc kubenswrapper[4760]: I0930 08:47:35.892011 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-tjgq8"] Sep 30 08:47:36 crc kubenswrapper[4760]: I0930 08:47:36.031798 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dfc17a09-a2e4-415d-800f-b1ccbae0b470-catalog-content\") pod \"redhat-marketplace-tjgq8\" (UID: \"dfc17a09-a2e4-415d-800f-b1ccbae0b470\") " pod="openshift-marketplace/redhat-marketplace-tjgq8" Sep 30 08:47:36 crc kubenswrapper[4760]: I0930 08:47:36.031921 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dfc17a09-a2e4-415d-800f-b1ccbae0b470-utilities\") pod \"redhat-marketplace-tjgq8\" (UID: \"dfc17a09-a2e4-415d-800f-b1ccbae0b470\") " pod="openshift-marketplace/redhat-marketplace-tjgq8" Sep 30 08:47:36 crc kubenswrapper[4760]: I0930 08:47:36.032022 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sjvjr\" (UniqueName: 
\"kubernetes.io/projected/dfc17a09-a2e4-415d-800f-b1ccbae0b470-kube-api-access-sjvjr\") pod \"redhat-marketplace-tjgq8\" (UID: \"dfc17a09-a2e4-415d-800f-b1ccbae0b470\") " pod="openshift-marketplace/redhat-marketplace-tjgq8" Sep 30 08:47:36 crc kubenswrapper[4760]: I0930 08:47:36.067353 4760 scope.go:117] "RemoveContainer" containerID="131b74da1ab5f542facd8050213c6315cd36b2dcf16917068bb69bf076b6760b" Sep 30 08:47:36 crc kubenswrapper[4760]: E0930 08:47:36.067688 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f2lrk_openshift-machine-config-operator(7a9c8270-6964-4886-87d0-227b05b76da4)\"" pod="openshift-machine-config-operator/machine-config-daemon-f2lrk" podUID="7a9c8270-6964-4886-87d0-227b05b76da4" Sep 30 08:47:36 crc kubenswrapper[4760]: I0930 08:47:36.135997 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dfc17a09-a2e4-415d-800f-b1ccbae0b470-catalog-content\") pod \"redhat-marketplace-tjgq8\" (UID: \"dfc17a09-a2e4-415d-800f-b1ccbae0b470\") " pod="openshift-marketplace/redhat-marketplace-tjgq8" Sep 30 08:47:36 crc kubenswrapper[4760]: I0930 08:47:36.136113 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dfc17a09-a2e4-415d-800f-b1ccbae0b470-utilities\") pod \"redhat-marketplace-tjgq8\" (UID: \"dfc17a09-a2e4-415d-800f-b1ccbae0b470\") " pod="openshift-marketplace/redhat-marketplace-tjgq8" Sep 30 08:47:36 crc kubenswrapper[4760]: I0930 08:47:36.136236 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sjvjr\" (UniqueName: \"kubernetes.io/projected/dfc17a09-a2e4-415d-800f-b1ccbae0b470-kube-api-access-sjvjr\") pod \"redhat-marketplace-tjgq8\" (UID: 
\"dfc17a09-a2e4-415d-800f-b1ccbae0b470\") " pod="openshift-marketplace/redhat-marketplace-tjgq8" Sep 30 08:47:36 crc kubenswrapper[4760]: I0930 08:47:36.136661 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dfc17a09-a2e4-415d-800f-b1ccbae0b470-catalog-content\") pod \"redhat-marketplace-tjgq8\" (UID: \"dfc17a09-a2e4-415d-800f-b1ccbae0b470\") " pod="openshift-marketplace/redhat-marketplace-tjgq8" Sep 30 08:47:36 crc kubenswrapper[4760]: I0930 08:47:36.136888 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dfc17a09-a2e4-415d-800f-b1ccbae0b470-utilities\") pod \"redhat-marketplace-tjgq8\" (UID: \"dfc17a09-a2e4-415d-800f-b1ccbae0b470\") " pod="openshift-marketplace/redhat-marketplace-tjgq8" Sep 30 08:47:36 crc kubenswrapper[4760]: I0930 08:47:36.173167 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sjvjr\" (UniqueName: \"kubernetes.io/projected/dfc17a09-a2e4-415d-800f-b1ccbae0b470-kube-api-access-sjvjr\") pod \"redhat-marketplace-tjgq8\" (UID: \"dfc17a09-a2e4-415d-800f-b1ccbae0b470\") " pod="openshift-marketplace/redhat-marketplace-tjgq8" Sep 30 08:47:36 crc kubenswrapper[4760]: I0930 08:47:36.252177 4760 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-tjgq8" Sep 30 08:47:36 crc kubenswrapper[4760]: I0930 08:47:36.742192 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-tjgq8"] Sep 30 08:47:37 crc kubenswrapper[4760]: I0930 08:47:37.689874 4760 generic.go:334] "Generic (PLEG): container finished" podID="dfc17a09-a2e4-415d-800f-b1ccbae0b470" containerID="f96472b5c52703b69fcf6a1ac184dabad6eaf042e7e275886fff27826915474d" exitCode=0 Sep 30 08:47:37 crc kubenswrapper[4760]: I0930 08:47:37.689976 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-tjgq8" event={"ID":"dfc17a09-a2e4-415d-800f-b1ccbae0b470","Type":"ContainerDied","Data":"f96472b5c52703b69fcf6a1ac184dabad6eaf042e7e275886fff27826915474d"} Sep 30 08:47:37 crc kubenswrapper[4760]: I0930 08:47:37.690133 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-tjgq8" event={"ID":"dfc17a09-a2e4-415d-800f-b1ccbae0b470","Type":"ContainerStarted","Data":"b3d030ecd20d55d34f2e52598346c490666f9cc021d92e233eaed0bb972e0e3e"} Sep 30 08:47:37 crc kubenswrapper[4760]: I0930 08:47:37.692861 4760 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Sep 30 08:47:38 crc kubenswrapper[4760]: I0930 08:47:38.704561 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-tjgq8" event={"ID":"dfc17a09-a2e4-415d-800f-b1ccbae0b470","Type":"ContainerStarted","Data":"510afbc758e4e928d36508956678624e9356333096860ea466a589dfefdd09e0"} Sep 30 08:47:39 crc kubenswrapper[4760]: I0930 08:47:39.721481 4760 generic.go:334] "Generic (PLEG): container finished" podID="dfc17a09-a2e4-415d-800f-b1ccbae0b470" containerID="510afbc758e4e928d36508956678624e9356333096860ea466a589dfefdd09e0" exitCode=0 Sep 30 08:47:39 crc kubenswrapper[4760]: I0930 08:47:39.721556 4760 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-marketplace/redhat-marketplace-tjgq8" event={"ID":"dfc17a09-a2e4-415d-800f-b1ccbae0b470","Type":"ContainerDied","Data":"510afbc758e4e928d36508956678624e9356333096860ea466a589dfefdd09e0"} Sep 30 08:47:40 crc kubenswrapper[4760]: I0930 08:47:40.736276 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-tjgq8" event={"ID":"dfc17a09-a2e4-415d-800f-b1ccbae0b470","Type":"ContainerStarted","Data":"840b29d5fea8a75d4658d19a74698d5990d1cd8a0b62d3a08a3b8bf83195cb78"} Sep 30 08:47:40 crc kubenswrapper[4760]: I0930 08:47:40.765051 4760 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-tjgq8" podStartSLOduration=3.2517291950000002 podStartE2EDuration="5.765029231s" podCreationTimestamp="2025-09-30 08:47:35 +0000 UTC" firstStartedPulling="2025-09-30 08:47:37.69266992 +0000 UTC m=+4443.335576332" lastFinishedPulling="2025-09-30 08:47:40.205969916 +0000 UTC m=+4445.848876368" observedRunningTime="2025-09-30 08:47:40.757868248 +0000 UTC m=+4446.400774670" watchObservedRunningTime="2025-09-30 08:47:40.765029231 +0000 UTC m=+4446.407935663" Sep 30 08:47:46 crc kubenswrapper[4760]: I0930 08:47:46.253126 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-tjgq8" Sep 30 08:47:46 crc kubenswrapper[4760]: I0930 08:47:46.254036 4760 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-tjgq8" Sep 30 08:47:46 crc kubenswrapper[4760]: I0930 08:47:46.344176 4760 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-tjgq8" Sep 30 08:47:46 crc kubenswrapper[4760]: I0930 08:47:46.876481 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-tjgq8" Sep 30 08:47:46 crc kubenswrapper[4760]: I0930 08:47:46.954479 
4760 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-tjgq8"] Sep 30 08:47:48 crc kubenswrapper[4760]: I0930 08:47:48.827988 4760 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-tjgq8" podUID="dfc17a09-a2e4-415d-800f-b1ccbae0b470" containerName="registry-server" containerID="cri-o://840b29d5fea8a75d4658d19a74698d5990d1cd8a0b62d3a08a3b8bf83195cb78" gracePeriod=2 Sep 30 08:47:49 crc kubenswrapper[4760]: I0930 08:47:49.452819 4760 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-tjgq8" Sep 30 08:47:49 crc kubenswrapper[4760]: I0930 08:47:49.573080 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dfc17a09-a2e4-415d-800f-b1ccbae0b470-utilities\") pod \"dfc17a09-a2e4-415d-800f-b1ccbae0b470\" (UID: \"dfc17a09-a2e4-415d-800f-b1ccbae0b470\") " Sep 30 08:47:49 crc kubenswrapper[4760]: I0930 08:47:49.573128 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sjvjr\" (UniqueName: \"kubernetes.io/projected/dfc17a09-a2e4-415d-800f-b1ccbae0b470-kube-api-access-sjvjr\") pod \"dfc17a09-a2e4-415d-800f-b1ccbae0b470\" (UID: \"dfc17a09-a2e4-415d-800f-b1ccbae0b470\") " Sep 30 08:47:49 crc kubenswrapper[4760]: I0930 08:47:49.573381 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dfc17a09-a2e4-415d-800f-b1ccbae0b470-catalog-content\") pod \"dfc17a09-a2e4-415d-800f-b1ccbae0b470\" (UID: \"dfc17a09-a2e4-415d-800f-b1ccbae0b470\") " Sep 30 08:47:49 crc kubenswrapper[4760]: I0930 08:47:49.575517 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dfc17a09-a2e4-415d-800f-b1ccbae0b470-utilities" (OuterVolumeSpecName: "utilities") pod 
"dfc17a09-a2e4-415d-800f-b1ccbae0b470" (UID: "dfc17a09-a2e4-415d-800f-b1ccbae0b470"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 08:47:49 crc kubenswrapper[4760]: I0930 08:47:49.581584 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dfc17a09-a2e4-415d-800f-b1ccbae0b470-kube-api-access-sjvjr" (OuterVolumeSpecName: "kube-api-access-sjvjr") pod "dfc17a09-a2e4-415d-800f-b1ccbae0b470" (UID: "dfc17a09-a2e4-415d-800f-b1ccbae0b470"). InnerVolumeSpecName "kube-api-access-sjvjr". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 08:47:49 crc kubenswrapper[4760]: I0930 08:47:49.592972 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dfc17a09-a2e4-415d-800f-b1ccbae0b470-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "dfc17a09-a2e4-415d-800f-b1ccbae0b470" (UID: "dfc17a09-a2e4-415d-800f-b1ccbae0b470"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 08:47:49 crc kubenswrapper[4760]: I0930 08:47:49.675887 4760 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dfc17a09-a2e4-415d-800f-b1ccbae0b470-utilities\") on node \"crc\" DevicePath \"\"" Sep 30 08:47:49 crc kubenswrapper[4760]: I0930 08:47:49.675941 4760 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sjvjr\" (UniqueName: \"kubernetes.io/projected/dfc17a09-a2e4-415d-800f-b1ccbae0b470-kube-api-access-sjvjr\") on node \"crc\" DevicePath \"\"" Sep 30 08:47:49 crc kubenswrapper[4760]: I0930 08:47:49.675955 4760 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dfc17a09-a2e4-415d-800f-b1ccbae0b470-catalog-content\") on node \"crc\" DevicePath \"\"" Sep 30 08:47:49 crc kubenswrapper[4760]: I0930 08:47:49.844870 4760 generic.go:334] "Generic (PLEG): container finished" podID="dfc17a09-a2e4-415d-800f-b1ccbae0b470" containerID="840b29d5fea8a75d4658d19a74698d5990d1cd8a0b62d3a08a3b8bf83195cb78" exitCode=0 Sep 30 08:47:49 crc kubenswrapper[4760]: I0930 08:47:49.844986 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-tjgq8" event={"ID":"dfc17a09-a2e4-415d-800f-b1ccbae0b470","Type":"ContainerDied","Data":"840b29d5fea8a75d4658d19a74698d5990d1cd8a0b62d3a08a3b8bf83195cb78"} Sep 30 08:47:49 crc kubenswrapper[4760]: I0930 08:47:49.845042 4760 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-tjgq8" Sep 30 08:47:49 crc kubenswrapper[4760]: I0930 08:47:49.845071 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-tjgq8" event={"ID":"dfc17a09-a2e4-415d-800f-b1ccbae0b470","Type":"ContainerDied","Data":"b3d030ecd20d55d34f2e52598346c490666f9cc021d92e233eaed0bb972e0e3e"} Sep 30 08:47:49 crc kubenswrapper[4760]: I0930 08:47:49.845113 4760 scope.go:117] "RemoveContainer" containerID="840b29d5fea8a75d4658d19a74698d5990d1cd8a0b62d3a08a3b8bf83195cb78" Sep 30 08:47:49 crc kubenswrapper[4760]: I0930 08:47:49.884495 4760 scope.go:117] "RemoveContainer" containerID="510afbc758e4e928d36508956678624e9356333096860ea466a589dfefdd09e0" Sep 30 08:47:49 crc kubenswrapper[4760]: I0930 08:47:49.909443 4760 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-tjgq8"] Sep 30 08:47:49 crc kubenswrapper[4760]: I0930 08:47:49.923464 4760 scope.go:117] "RemoveContainer" containerID="f96472b5c52703b69fcf6a1ac184dabad6eaf042e7e275886fff27826915474d" Sep 30 08:47:49 crc kubenswrapper[4760]: I0930 08:47:49.926273 4760 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-tjgq8"] Sep 30 08:47:50 crc kubenswrapper[4760]: I0930 08:47:50.000007 4760 scope.go:117] "RemoveContainer" containerID="840b29d5fea8a75d4658d19a74698d5990d1cd8a0b62d3a08a3b8bf83195cb78" Sep 30 08:47:50 crc kubenswrapper[4760]: E0930 08:47:50.001154 4760 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"840b29d5fea8a75d4658d19a74698d5990d1cd8a0b62d3a08a3b8bf83195cb78\": container with ID starting with 840b29d5fea8a75d4658d19a74698d5990d1cd8a0b62d3a08a3b8bf83195cb78 not found: ID does not exist" containerID="840b29d5fea8a75d4658d19a74698d5990d1cd8a0b62d3a08a3b8bf83195cb78" Sep 30 08:47:50 crc kubenswrapper[4760]: I0930 08:47:50.001276 4760 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"840b29d5fea8a75d4658d19a74698d5990d1cd8a0b62d3a08a3b8bf83195cb78"} err="failed to get container status \"840b29d5fea8a75d4658d19a74698d5990d1cd8a0b62d3a08a3b8bf83195cb78\": rpc error: code = NotFound desc = could not find container \"840b29d5fea8a75d4658d19a74698d5990d1cd8a0b62d3a08a3b8bf83195cb78\": container with ID starting with 840b29d5fea8a75d4658d19a74698d5990d1cd8a0b62d3a08a3b8bf83195cb78 not found: ID does not exist" Sep 30 08:47:50 crc kubenswrapper[4760]: I0930 08:47:50.001345 4760 scope.go:117] "RemoveContainer" containerID="510afbc758e4e928d36508956678624e9356333096860ea466a589dfefdd09e0" Sep 30 08:47:50 crc kubenswrapper[4760]: E0930 08:47:50.002178 4760 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"510afbc758e4e928d36508956678624e9356333096860ea466a589dfefdd09e0\": container with ID starting with 510afbc758e4e928d36508956678624e9356333096860ea466a589dfefdd09e0 not found: ID does not exist" containerID="510afbc758e4e928d36508956678624e9356333096860ea466a589dfefdd09e0" Sep 30 08:47:50 crc kubenswrapper[4760]: I0930 08:47:50.002247 4760 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"510afbc758e4e928d36508956678624e9356333096860ea466a589dfefdd09e0"} err="failed to get container status \"510afbc758e4e928d36508956678624e9356333096860ea466a589dfefdd09e0\": rpc error: code = NotFound desc = could not find container \"510afbc758e4e928d36508956678624e9356333096860ea466a589dfefdd09e0\": container with ID starting with 510afbc758e4e928d36508956678624e9356333096860ea466a589dfefdd09e0 not found: ID does not exist" Sep 30 08:47:50 crc kubenswrapper[4760]: I0930 08:47:50.002290 4760 scope.go:117] "RemoveContainer" containerID="f96472b5c52703b69fcf6a1ac184dabad6eaf042e7e275886fff27826915474d" Sep 30 08:47:50 crc kubenswrapper[4760]: E0930 
08:47:50.005596 4760 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f96472b5c52703b69fcf6a1ac184dabad6eaf042e7e275886fff27826915474d\": container with ID starting with f96472b5c52703b69fcf6a1ac184dabad6eaf042e7e275886fff27826915474d not found: ID does not exist" containerID="f96472b5c52703b69fcf6a1ac184dabad6eaf042e7e275886fff27826915474d" Sep 30 08:47:50 crc kubenswrapper[4760]: I0930 08:47:50.005650 4760 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f96472b5c52703b69fcf6a1ac184dabad6eaf042e7e275886fff27826915474d"} err="failed to get container status \"f96472b5c52703b69fcf6a1ac184dabad6eaf042e7e275886fff27826915474d\": rpc error: code = NotFound desc = could not find container \"f96472b5c52703b69fcf6a1ac184dabad6eaf042e7e275886fff27826915474d\": container with ID starting with f96472b5c52703b69fcf6a1ac184dabad6eaf042e7e275886fff27826915474d not found: ID does not exist" Sep 30 08:47:50 crc kubenswrapper[4760]: I0930 08:47:50.067131 4760 scope.go:117] "RemoveContainer" containerID="131b74da1ab5f542facd8050213c6315cd36b2dcf16917068bb69bf076b6760b" Sep 30 08:47:50 crc kubenswrapper[4760]: E0930 08:47:50.067464 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f2lrk_openshift-machine-config-operator(7a9c8270-6964-4886-87d0-227b05b76da4)\"" pod="openshift-machine-config-operator/machine-config-daemon-f2lrk" podUID="7a9c8270-6964-4886-87d0-227b05b76da4" Sep 30 08:47:51 crc kubenswrapper[4760]: I0930 08:47:51.087935 4760 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dfc17a09-a2e4-415d-800f-b1ccbae0b470" path="/var/lib/kubelet/pods/dfc17a09-a2e4-415d-800f-b1ccbae0b470/volumes" Sep 30 08:48:01 crc kubenswrapper[4760]: I0930 08:48:01.071862 
4760 scope.go:117] "RemoveContainer" containerID="131b74da1ab5f542facd8050213c6315cd36b2dcf16917068bb69bf076b6760b" Sep 30 08:48:01 crc kubenswrapper[4760]: E0930 08:48:01.073198 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f2lrk_openshift-machine-config-operator(7a9c8270-6964-4886-87d0-227b05b76da4)\"" pod="openshift-machine-config-operator/machine-config-daemon-f2lrk" podUID="7a9c8270-6964-4886-87d0-227b05b76da4" Sep 30 08:48:15 crc kubenswrapper[4760]: I0930 08:48:15.075541 4760 scope.go:117] "RemoveContainer" containerID="131b74da1ab5f542facd8050213c6315cd36b2dcf16917068bb69bf076b6760b" Sep 30 08:48:15 crc kubenswrapper[4760]: E0930 08:48:15.076490 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f2lrk_openshift-machine-config-operator(7a9c8270-6964-4886-87d0-227b05b76da4)\"" pod="openshift-machine-config-operator/machine-config-daemon-f2lrk" podUID="7a9c8270-6964-4886-87d0-227b05b76da4" Sep 30 08:48:27 crc kubenswrapper[4760]: I0930 08:48:27.067129 4760 scope.go:117] "RemoveContainer" containerID="131b74da1ab5f542facd8050213c6315cd36b2dcf16917068bb69bf076b6760b" Sep 30 08:48:27 crc kubenswrapper[4760]: E0930 08:48:27.068606 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f2lrk_openshift-machine-config-operator(7a9c8270-6964-4886-87d0-227b05b76da4)\"" pod="openshift-machine-config-operator/machine-config-daemon-f2lrk" podUID="7a9c8270-6964-4886-87d0-227b05b76da4" Sep 30 08:48:39 crc kubenswrapper[4760]: I0930 
08:48:39.067141 4760 scope.go:117] "RemoveContainer" containerID="131b74da1ab5f542facd8050213c6315cd36b2dcf16917068bb69bf076b6760b" Sep 30 08:48:39 crc kubenswrapper[4760]: E0930 08:48:39.068202 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f2lrk_openshift-machine-config-operator(7a9c8270-6964-4886-87d0-227b05b76da4)\"" pod="openshift-machine-config-operator/machine-config-daemon-f2lrk" podUID="7a9c8270-6964-4886-87d0-227b05b76da4" Sep 30 08:48:50 crc kubenswrapper[4760]: I0930 08:48:50.067600 4760 scope.go:117] "RemoveContainer" containerID="131b74da1ab5f542facd8050213c6315cd36b2dcf16917068bb69bf076b6760b" Sep 30 08:48:50 crc kubenswrapper[4760]: E0930 08:48:50.068922 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f2lrk_openshift-machine-config-operator(7a9c8270-6964-4886-87d0-227b05b76da4)\"" pod="openshift-machine-config-operator/machine-config-daemon-f2lrk" podUID="7a9c8270-6964-4886-87d0-227b05b76da4" Sep 30 08:49:01 crc kubenswrapper[4760]: I0930 08:49:01.067092 4760 scope.go:117] "RemoveContainer" containerID="131b74da1ab5f542facd8050213c6315cd36b2dcf16917068bb69bf076b6760b" Sep 30 08:49:01 crc kubenswrapper[4760]: E0930 08:49:01.068450 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f2lrk_openshift-machine-config-operator(7a9c8270-6964-4886-87d0-227b05b76da4)\"" pod="openshift-machine-config-operator/machine-config-daemon-f2lrk" podUID="7a9c8270-6964-4886-87d0-227b05b76da4" Sep 30 08:49:13 crc 
kubenswrapper[4760]: E0930 08:49:13.099713 4760 upgradeaware.go:427] Error proxying data from client to backend: readfrom tcp 38.102.83.201:55904->38.102.83.201:37703: write tcp 38.102.83.201:55904->38.102.83.201:37703: write: broken pipe Sep 30 08:49:15 crc kubenswrapper[4760]: I0930 08:49:15.073335 4760 scope.go:117] "RemoveContainer" containerID="131b74da1ab5f542facd8050213c6315cd36b2dcf16917068bb69bf076b6760b" Sep 30 08:49:15 crc kubenswrapper[4760]: E0930 08:49:15.073919 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f2lrk_openshift-machine-config-operator(7a9c8270-6964-4886-87d0-227b05b76da4)\"" pod="openshift-machine-config-operator/machine-config-daemon-f2lrk" podUID="7a9c8270-6964-4886-87d0-227b05b76da4" Sep 30 08:49:26 crc kubenswrapper[4760]: I0930 08:49:26.067509 4760 scope.go:117] "RemoveContainer" containerID="131b74da1ab5f542facd8050213c6315cd36b2dcf16917068bb69bf076b6760b" Sep 30 08:49:26 crc kubenswrapper[4760]: E0930 08:49:26.068787 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f2lrk_openshift-machine-config-operator(7a9c8270-6964-4886-87d0-227b05b76da4)\"" pod="openshift-machine-config-operator/machine-config-daemon-f2lrk" podUID="7a9c8270-6964-4886-87d0-227b05b76da4" Sep 30 08:49:41 crc kubenswrapper[4760]: I0930 08:49:41.068249 4760 scope.go:117] "RemoveContainer" containerID="131b74da1ab5f542facd8050213c6315cd36b2dcf16917068bb69bf076b6760b" Sep 30 08:49:41 crc kubenswrapper[4760]: E0930 08:49:41.069256 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting 
failed container=machine-config-daemon pod=machine-config-daemon-f2lrk_openshift-machine-config-operator(7a9c8270-6964-4886-87d0-227b05b76da4)\"" pod="openshift-machine-config-operator/machine-config-daemon-f2lrk" podUID="7a9c8270-6964-4886-87d0-227b05b76da4" Sep 30 08:49:55 crc kubenswrapper[4760]: I0930 08:49:55.067926 4760 scope.go:117] "RemoveContainer" containerID="131b74da1ab5f542facd8050213c6315cd36b2dcf16917068bb69bf076b6760b" Sep 30 08:49:56 crc kubenswrapper[4760]: I0930 08:49:56.313887 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-f2lrk" event={"ID":"7a9c8270-6964-4886-87d0-227b05b76da4","Type":"ContainerStarted","Data":"5cc7325ef3984c6f6c10181f46205a3d7fa4663e610bd26eb793a2c4e08bbdcb"} Sep 30 08:52:19 crc kubenswrapper[4760]: I0930 08:52:19.113009 4760 patch_prober.go:28] interesting pod/machine-config-daemon-f2lrk container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 08:52:19 crc kubenswrapper[4760]: I0930 08:52:19.113813 4760 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-f2lrk" podUID="7a9c8270-6964-4886-87d0-227b05b76da4" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 30 08:52:42 crc kubenswrapper[4760]: I0930 08:52:42.193692 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-wh9l2"] Sep 30 08:52:42 crc kubenswrapper[4760]: E0930 08:52:42.194992 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dfc17a09-a2e4-415d-800f-b1ccbae0b470" containerName="extract-utilities" Sep 30 08:52:42 crc kubenswrapper[4760]: I0930 08:52:42.195016 4760 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="dfc17a09-a2e4-415d-800f-b1ccbae0b470" containerName="extract-utilities" Sep 30 08:52:42 crc kubenswrapper[4760]: E0930 08:52:42.195057 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dfc17a09-a2e4-415d-800f-b1ccbae0b470" containerName="registry-server" Sep 30 08:52:42 crc kubenswrapper[4760]: I0930 08:52:42.195069 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="dfc17a09-a2e4-415d-800f-b1ccbae0b470" containerName="registry-server" Sep 30 08:52:42 crc kubenswrapper[4760]: E0930 08:52:42.195097 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dfc17a09-a2e4-415d-800f-b1ccbae0b470" containerName="extract-content" Sep 30 08:52:42 crc kubenswrapper[4760]: I0930 08:52:42.195109 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="dfc17a09-a2e4-415d-800f-b1ccbae0b470" containerName="extract-content" Sep 30 08:52:42 crc kubenswrapper[4760]: I0930 08:52:42.195493 4760 memory_manager.go:354] "RemoveStaleState removing state" podUID="dfc17a09-a2e4-415d-800f-b1ccbae0b470" containerName="registry-server" Sep 30 08:52:42 crc kubenswrapper[4760]: I0930 08:52:42.198487 4760 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-wh9l2" Sep 30 08:52:42 crc kubenswrapper[4760]: I0930 08:52:42.209361 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-wh9l2"] Sep 30 08:52:42 crc kubenswrapper[4760]: I0930 08:52:42.303189 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9bxzl\" (UniqueName: \"kubernetes.io/projected/aaececc4-56fa-4aba-959e-0595d2cb7270-kube-api-access-9bxzl\") pod \"redhat-operators-wh9l2\" (UID: \"aaececc4-56fa-4aba-959e-0595d2cb7270\") " pod="openshift-marketplace/redhat-operators-wh9l2" Sep 30 08:52:42 crc kubenswrapper[4760]: I0930 08:52:42.303321 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/aaececc4-56fa-4aba-959e-0595d2cb7270-catalog-content\") pod \"redhat-operators-wh9l2\" (UID: \"aaececc4-56fa-4aba-959e-0595d2cb7270\") " pod="openshift-marketplace/redhat-operators-wh9l2" Sep 30 08:52:42 crc kubenswrapper[4760]: I0930 08:52:42.303496 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/aaececc4-56fa-4aba-959e-0595d2cb7270-utilities\") pod \"redhat-operators-wh9l2\" (UID: \"aaececc4-56fa-4aba-959e-0595d2cb7270\") " pod="openshift-marketplace/redhat-operators-wh9l2" Sep 30 08:52:42 crc kubenswrapper[4760]: I0930 08:52:42.405927 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/aaececc4-56fa-4aba-959e-0595d2cb7270-utilities\") pod \"redhat-operators-wh9l2\" (UID: \"aaececc4-56fa-4aba-959e-0595d2cb7270\") " pod="openshift-marketplace/redhat-operators-wh9l2" Sep 30 08:52:42 crc kubenswrapper[4760]: I0930 08:52:42.406123 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-9bxzl\" (UniqueName: \"kubernetes.io/projected/aaececc4-56fa-4aba-959e-0595d2cb7270-kube-api-access-9bxzl\") pod \"redhat-operators-wh9l2\" (UID: \"aaececc4-56fa-4aba-959e-0595d2cb7270\") " pod="openshift-marketplace/redhat-operators-wh9l2" Sep 30 08:52:42 crc kubenswrapper[4760]: I0930 08:52:42.406180 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/aaececc4-56fa-4aba-959e-0595d2cb7270-catalog-content\") pod \"redhat-operators-wh9l2\" (UID: \"aaececc4-56fa-4aba-959e-0595d2cb7270\") " pod="openshift-marketplace/redhat-operators-wh9l2" Sep 30 08:52:42 crc kubenswrapper[4760]: I0930 08:52:42.406824 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/aaececc4-56fa-4aba-959e-0595d2cb7270-utilities\") pod \"redhat-operators-wh9l2\" (UID: \"aaececc4-56fa-4aba-959e-0595d2cb7270\") " pod="openshift-marketplace/redhat-operators-wh9l2" Sep 30 08:52:42 crc kubenswrapper[4760]: I0930 08:52:42.406861 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/aaececc4-56fa-4aba-959e-0595d2cb7270-catalog-content\") pod \"redhat-operators-wh9l2\" (UID: \"aaececc4-56fa-4aba-959e-0595d2cb7270\") " pod="openshift-marketplace/redhat-operators-wh9l2" Sep 30 08:52:42 crc kubenswrapper[4760]: I0930 08:52:42.436935 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9bxzl\" (UniqueName: \"kubernetes.io/projected/aaececc4-56fa-4aba-959e-0595d2cb7270-kube-api-access-9bxzl\") pod \"redhat-operators-wh9l2\" (UID: \"aaececc4-56fa-4aba-959e-0595d2cb7270\") " pod="openshift-marketplace/redhat-operators-wh9l2" Sep 30 08:52:42 crc kubenswrapper[4760]: I0930 08:52:42.530039 4760 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-wh9l2" Sep 30 08:52:43 crc kubenswrapper[4760]: I0930 08:52:43.093446 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-wh9l2"] Sep 30 08:52:43 crc kubenswrapper[4760]: I0930 08:52:43.239244 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wh9l2" event={"ID":"aaececc4-56fa-4aba-959e-0595d2cb7270","Type":"ContainerStarted","Data":"c16406fd288fbf31cca89feb395a972de9924f987a661fef060980b58b6fe71e"} Sep 30 08:52:44 crc kubenswrapper[4760]: I0930 08:52:44.255813 4760 generic.go:334] "Generic (PLEG): container finished" podID="aaececc4-56fa-4aba-959e-0595d2cb7270" containerID="94e9e0b61fd21dad4f7e6f2b493f40297f889c9ea95ff5733c62d1a2f9ae5cb2" exitCode=0 Sep 30 08:52:44 crc kubenswrapper[4760]: I0930 08:52:44.255942 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wh9l2" event={"ID":"aaececc4-56fa-4aba-959e-0595d2cb7270","Type":"ContainerDied","Data":"94e9e0b61fd21dad4f7e6f2b493f40297f889c9ea95ff5733c62d1a2f9ae5cb2"} Sep 30 08:52:44 crc kubenswrapper[4760]: I0930 08:52:44.259887 4760 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Sep 30 08:52:49 crc kubenswrapper[4760]: I0930 08:52:49.113112 4760 patch_prober.go:28] interesting pod/machine-config-daemon-f2lrk container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 08:52:49 crc kubenswrapper[4760]: I0930 08:52:49.114209 4760 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-f2lrk" podUID="7a9c8270-6964-4886-87d0-227b05b76da4" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": 
dial tcp 127.0.0.1:8798: connect: connection refused" Sep 30 08:52:52 crc kubenswrapper[4760]: I0930 08:52:52.361862 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wh9l2" event={"ID":"aaececc4-56fa-4aba-959e-0595d2cb7270","Type":"ContainerStarted","Data":"46bcceec9021bb33c154681a581332ad26ef79e6307822a13616c86cc2bb441e"} Sep 30 08:52:54 crc kubenswrapper[4760]: I0930 08:52:54.406396 4760 generic.go:334] "Generic (PLEG): container finished" podID="aaececc4-56fa-4aba-959e-0595d2cb7270" containerID="46bcceec9021bb33c154681a581332ad26ef79e6307822a13616c86cc2bb441e" exitCode=0 Sep 30 08:52:54 crc kubenswrapper[4760]: I0930 08:52:54.406477 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wh9l2" event={"ID":"aaececc4-56fa-4aba-959e-0595d2cb7270","Type":"ContainerDied","Data":"46bcceec9021bb33c154681a581332ad26ef79e6307822a13616c86cc2bb441e"} Sep 30 08:52:55 crc kubenswrapper[4760]: I0930 08:52:55.425606 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wh9l2" event={"ID":"aaececc4-56fa-4aba-959e-0595d2cb7270","Type":"ContainerStarted","Data":"49465304b03ffe3db45cfbbdfd604486ed18b74e7291ac2289b743f624096a3c"} Sep 30 08:52:55 crc kubenswrapper[4760]: I0930 08:52:55.461129 4760 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-wh9l2" podStartSLOduration=2.876390443 podStartE2EDuration="13.461103863s" podCreationTimestamp="2025-09-30 08:52:42 +0000 UTC" firstStartedPulling="2025-09-30 08:52:44.25936222 +0000 UTC m=+4749.902268662" lastFinishedPulling="2025-09-30 08:52:54.84407564 +0000 UTC m=+4760.486982082" observedRunningTime="2025-09-30 08:52:55.452166415 +0000 UTC m=+4761.095072867" watchObservedRunningTime="2025-09-30 08:52:55.461103863 +0000 UTC m=+4761.104010285" Sep 30 08:53:02 crc kubenswrapper[4760]: I0930 08:53:02.532545 4760 kubelet.go:2542] "SyncLoop 
(probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-wh9l2" Sep 30 08:53:02 crc kubenswrapper[4760]: I0930 08:53:02.533189 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-wh9l2" Sep 30 08:53:02 crc kubenswrapper[4760]: I0930 08:53:02.591393 4760 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-wh9l2" Sep 30 08:53:03 crc kubenswrapper[4760]: I0930 08:53:03.593159 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-wh9l2" Sep 30 08:53:03 crc kubenswrapper[4760]: I0930 08:53:03.688166 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-wh9l2"] Sep 30 08:53:03 crc kubenswrapper[4760]: I0930 08:53:03.749394 4760 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-kg7zj"] Sep 30 08:53:03 crc kubenswrapper[4760]: I0930 08:53:03.749773 4760 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-kg7zj" podUID="f8187248-878a-494d-bfce-56d54c403561" containerName="registry-server" containerID="cri-o://b19e6b3b568d4063aa403ade20d9863e4b5eab3541306779c904ce4508822fdb" gracePeriod=2 Sep 30 08:53:03 crc kubenswrapper[4760]: E0930 08:53:03.896077 4760 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf8187248_878a_494d_bfce_56d54c403561.slice/crio-b19e6b3b568d4063aa403ade20d9863e4b5eab3541306779c904ce4508822fdb.scope\": RecentStats: unable to find data in memory cache]" Sep 30 08:53:04 crc kubenswrapper[4760]: I0930 08:53:04.245571 4760 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-kg7zj" Sep 30 08:53:04 crc kubenswrapper[4760]: I0930 08:53:04.309484 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gd7gj\" (UniqueName: \"kubernetes.io/projected/f8187248-878a-494d-bfce-56d54c403561-kube-api-access-gd7gj\") pod \"f8187248-878a-494d-bfce-56d54c403561\" (UID: \"f8187248-878a-494d-bfce-56d54c403561\") " Sep 30 08:53:04 crc kubenswrapper[4760]: I0930 08:53:04.310147 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f8187248-878a-494d-bfce-56d54c403561-catalog-content\") pod \"f8187248-878a-494d-bfce-56d54c403561\" (UID: \"f8187248-878a-494d-bfce-56d54c403561\") " Sep 30 08:53:04 crc kubenswrapper[4760]: I0930 08:53:04.317088 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f8187248-878a-494d-bfce-56d54c403561-utilities\") pod \"f8187248-878a-494d-bfce-56d54c403561\" (UID: \"f8187248-878a-494d-bfce-56d54c403561\") " Sep 30 08:53:04 crc kubenswrapper[4760]: I0930 08:53:04.316478 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f8187248-878a-494d-bfce-56d54c403561-kube-api-access-gd7gj" (OuterVolumeSpecName: "kube-api-access-gd7gj") pod "f8187248-878a-494d-bfce-56d54c403561" (UID: "f8187248-878a-494d-bfce-56d54c403561"). InnerVolumeSpecName "kube-api-access-gd7gj". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 08:53:04 crc kubenswrapper[4760]: I0930 08:53:04.319292 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f8187248-878a-494d-bfce-56d54c403561-utilities" (OuterVolumeSpecName: "utilities") pod "f8187248-878a-494d-bfce-56d54c403561" (UID: "f8187248-878a-494d-bfce-56d54c403561"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 08:53:04 crc kubenswrapper[4760]: I0930 08:53:04.321003 4760 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f8187248-878a-494d-bfce-56d54c403561-utilities\") on node \"crc\" DevicePath \"\"" Sep 30 08:53:04 crc kubenswrapper[4760]: I0930 08:53:04.322469 4760 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gd7gj\" (UniqueName: \"kubernetes.io/projected/f8187248-878a-494d-bfce-56d54c403561-kube-api-access-gd7gj\") on node \"crc\" DevicePath \"\"" Sep 30 08:53:04 crc kubenswrapper[4760]: I0930 08:53:04.392066 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f8187248-878a-494d-bfce-56d54c403561-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "f8187248-878a-494d-bfce-56d54c403561" (UID: "f8187248-878a-494d-bfce-56d54c403561"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 08:53:04 crc kubenswrapper[4760]: I0930 08:53:04.423836 4760 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f8187248-878a-494d-bfce-56d54c403561-catalog-content\") on node \"crc\" DevicePath \"\"" Sep 30 08:53:04 crc kubenswrapper[4760]: I0930 08:53:04.539657 4760 generic.go:334] "Generic (PLEG): container finished" podID="f8187248-878a-494d-bfce-56d54c403561" containerID="b19e6b3b568d4063aa403ade20d9863e4b5eab3541306779c904ce4508822fdb" exitCode=0 Sep 30 08:53:04 crc kubenswrapper[4760]: I0930 08:53:04.539731 4760 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-kg7zj" Sep 30 08:53:04 crc kubenswrapper[4760]: I0930 08:53:04.539756 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kg7zj" event={"ID":"f8187248-878a-494d-bfce-56d54c403561","Type":"ContainerDied","Data":"b19e6b3b568d4063aa403ade20d9863e4b5eab3541306779c904ce4508822fdb"} Sep 30 08:53:04 crc kubenswrapper[4760]: I0930 08:53:04.539817 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kg7zj" event={"ID":"f8187248-878a-494d-bfce-56d54c403561","Type":"ContainerDied","Data":"0c236aa7ac853c433b90119d1acbc78f45c721bbd60406ec5932fcd029e1f175"} Sep 30 08:53:04 crc kubenswrapper[4760]: I0930 08:53:04.539841 4760 scope.go:117] "RemoveContainer" containerID="b19e6b3b568d4063aa403ade20d9863e4b5eab3541306779c904ce4508822fdb" Sep 30 08:53:04 crc kubenswrapper[4760]: I0930 08:53:04.570644 4760 scope.go:117] "RemoveContainer" containerID="98c55b347535fa5f29d172bfd33190398cd759ac6a448c0e2f3170761063ca3b" Sep 30 08:53:04 crc kubenswrapper[4760]: I0930 08:53:04.581942 4760 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-kg7zj"] Sep 30 08:53:04 crc kubenswrapper[4760]: I0930 08:53:04.592377 4760 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-kg7zj"] Sep 30 08:53:04 crc kubenswrapper[4760]: I0930 08:53:04.600424 4760 scope.go:117] "RemoveContainer" containerID="5129762066b843dcdfca0b71d08c258a4196036d9716454e89a3d73e9208e1ed" Sep 30 08:53:04 crc kubenswrapper[4760]: I0930 08:53:04.644777 4760 scope.go:117] "RemoveContainer" containerID="b19e6b3b568d4063aa403ade20d9863e4b5eab3541306779c904ce4508822fdb" Sep 30 08:53:04 crc kubenswrapper[4760]: E0930 08:53:04.645147 4760 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"b19e6b3b568d4063aa403ade20d9863e4b5eab3541306779c904ce4508822fdb\": container with ID starting with b19e6b3b568d4063aa403ade20d9863e4b5eab3541306779c904ce4508822fdb not found: ID does not exist" containerID="b19e6b3b568d4063aa403ade20d9863e4b5eab3541306779c904ce4508822fdb" Sep 30 08:53:04 crc kubenswrapper[4760]: I0930 08:53:04.645179 4760 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b19e6b3b568d4063aa403ade20d9863e4b5eab3541306779c904ce4508822fdb"} err="failed to get container status \"b19e6b3b568d4063aa403ade20d9863e4b5eab3541306779c904ce4508822fdb\": rpc error: code = NotFound desc = could not find container \"b19e6b3b568d4063aa403ade20d9863e4b5eab3541306779c904ce4508822fdb\": container with ID starting with b19e6b3b568d4063aa403ade20d9863e4b5eab3541306779c904ce4508822fdb not found: ID does not exist" Sep 30 08:53:04 crc kubenswrapper[4760]: I0930 08:53:04.645198 4760 scope.go:117] "RemoveContainer" containerID="98c55b347535fa5f29d172bfd33190398cd759ac6a448c0e2f3170761063ca3b" Sep 30 08:53:04 crc kubenswrapper[4760]: E0930 08:53:04.645481 4760 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"98c55b347535fa5f29d172bfd33190398cd759ac6a448c0e2f3170761063ca3b\": container with ID starting with 98c55b347535fa5f29d172bfd33190398cd759ac6a448c0e2f3170761063ca3b not found: ID does not exist" containerID="98c55b347535fa5f29d172bfd33190398cd759ac6a448c0e2f3170761063ca3b" Sep 30 08:53:04 crc kubenswrapper[4760]: I0930 08:53:04.645502 4760 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"98c55b347535fa5f29d172bfd33190398cd759ac6a448c0e2f3170761063ca3b"} err="failed to get container status \"98c55b347535fa5f29d172bfd33190398cd759ac6a448c0e2f3170761063ca3b\": rpc error: code = NotFound desc = could not find container \"98c55b347535fa5f29d172bfd33190398cd759ac6a448c0e2f3170761063ca3b\": container with ID 
starting with 98c55b347535fa5f29d172bfd33190398cd759ac6a448c0e2f3170761063ca3b not found: ID does not exist" Sep 30 08:53:04 crc kubenswrapper[4760]: I0930 08:53:04.645515 4760 scope.go:117] "RemoveContainer" containerID="5129762066b843dcdfca0b71d08c258a4196036d9716454e89a3d73e9208e1ed" Sep 30 08:53:04 crc kubenswrapper[4760]: E0930 08:53:04.645849 4760 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5129762066b843dcdfca0b71d08c258a4196036d9716454e89a3d73e9208e1ed\": container with ID starting with 5129762066b843dcdfca0b71d08c258a4196036d9716454e89a3d73e9208e1ed not found: ID does not exist" containerID="5129762066b843dcdfca0b71d08c258a4196036d9716454e89a3d73e9208e1ed" Sep 30 08:53:04 crc kubenswrapper[4760]: I0930 08:53:04.645871 4760 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5129762066b843dcdfca0b71d08c258a4196036d9716454e89a3d73e9208e1ed"} err="failed to get container status \"5129762066b843dcdfca0b71d08c258a4196036d9716454e89a3d73e9208e1ed\": rpc error: code = NotFound desc = could not find container \"5129762066b843dcdfca0b71d08c258a4196036d9716454e89a3d73e9208e1ed\": container with ID starting with 5129762066b843dcdfca0b71d08c258a4196036d9716454e89a3d73e9208e1ed not found: ID does not exist" Sep 30 08:53:05 crc kubenswrapper[4760]: I0930 08:53:05.081333 4760 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f8187248-878a-494d-bfce-56d54c403561" path="/var/lib/kubelet/pods/f8187248-878a-494d-bfce-56d54c403561/volumes" Sep 30 08:53:19 crc kubenswrapper[4760]: I0930 08:53:19.113450 4760 patch_prober.go:28] interesting pod/machine-config-daemon-f2lrk container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 08:53:19 crc kubenswrapper[4760]: I0930 
08:53:19.113952 4760 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-f2lrk" podUID="7a9c8270-6964-4886-87d0-227b05b76da4" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 30 08:53:19 crc kubenswrapper[4760]: I0930 08:53:19.113997 4760 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-f2lrk" Sep 30 08:53:19 crc kubenswrapper[4760]: I0930 08:53:19.114691 4760 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"5cc7325ef3984c6f6c10181f46205a3d7fa4663e610bd26eb793a2c4e08bbdcb"} pod="openshift-machine-config-operator/machine-config-daemon-f2lrk" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Sep 30 08:53:19 crc kubenswrapper[4760]: I0930 08:53:19.114745 4760 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-f2lrk" podUID="7a9c8270-6964-4886-87d0-227b05b76da4" containerName="machine-config-daemon" containerID="cri-o://5cc7325ef3984c6f6c10181f46205a3d7fa4663e610bd26eb793a2c4e08bbdcb" gracePeriod=600 Sep 30 08:53:19 crc kubenswrapper[4760]: I0930 08:53:19.730821 4760 generic.go:334] "Generic (PLEG): container finished" podID="7a9c8270-6964-4886-87d0-227b05b76da4" containerID="5cc7325ef3984c6f6c10181f46205a3d7fa4663e610bd26eb793a2c4e08bbdcb" exitCode=0 Sep 30 08:53:19 crc kubenswrapper[4760]: I0930 08:53:19.730925 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-f2lrk" event={"ID":"7a9c8270-6964-4886-87d0-227b05b76da4","Type":"ContainerDied","Data":"5cc7325ef3984c6f6c10181f46205a3d7fa4663e610bd26eb793a2c4e08bbdcb"} Sep 30 08:53:19 crc 
kubenswrapper[4760]: I0930 08:53:19.731394 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-f2lrk" event={"ID":"7a9c8270-6964-4886-87d0-227b05b76da4","Type":"ContainerStarted","Data":"bb3fab9e5573bca746e5903757fe3de80f4c152222a8ba46a5a998084bf0d4bc"} Sep 30 08:53:19 crc kubenswrapper[4760]: I0930 08:53:19.731416 4760 scope.go:117] "RemoveContainer" containerID="131b74da1ab5f542facd8050213c6315cd36b2dcf16917068bb69bf076b6760b" Sep 30 08:55:19 crc kubenswrapper[4760]: I0930 08:55:19.112482 4760 patch_prober.go:28] interesting pod/machine-config-daemon-f2lrk container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 08:55:19 crc kubenswrapper[4760]: I0930 08:55:19.112989 4760 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-f2lrk" podUID="7a9c8270-6964-4886-87d0-227b05b76da4" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 30 08:55:49 crc kubenswrapper[4760]: I0930 08:55:49.112953 4760 patch_prober.go:28] interesting pod/machine-config-daemon-f2lrk container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 08:55:49 crc kubenswrapper[4760]: I0930 08:55:49.113720 4760 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-f2lrk" podUID="7a9c8270-6964-4886-87d0-227b05b76da4" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 30 
08:56:19 crc kubenswrapper[4760]: I0930 08:56:19.114055 4760 patch_prober.go:28] interesting pod/machine-config-daemon-f2lrk container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Sep 30 08:56:19 crc kubenswrapper[4760]: I0930 08:56:19.115046 4760 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-f2lrk" podUID="7a9c8270-6964-4886-87d0-227b05b76da4" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Sep 30 08:56:19 crc kubenswrapper[4760]: I0930 08:56:19.115137 4760 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-f2lrk"
Sep 30 08:56:19 crc kubenswrapper[4760]: I0930 08:56:19.117026 4760 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"bb3fab9e5573bca746e5903757fe3de80f4c152222a8ba46a5a998084bf0d4bc"} pod="openshift-machine-config-operator/machine-config-daemon-f2lrk" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Sep 30 08:56:19 crc kubenswrapper[4760]: I0930 08:56:19.117112 4760 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-f2lrk" podUID="7a9c8270-6964-4886-87d0-227b05b76da4" containerName="machine-config-daemon" containerID="cri-o://bb3fab9e5573bca746e5903757fe3de80f4c152222a8ba46a5a998084bf0d4bc" gracePeriod=600
Sep 30 08:56:19 crc kubenswrapper[4760]: E0930 08:56:19.265459 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f2lrk_openshift-machine-config-operator(7a9c8270-6964-4886-87d0-227b05b76da4)\"" pod="openshift-machine-config-operator/machine-config-daemon-f2lrk" podUID="7a9c8270-6964-4886-87d0-227b05b76da4"
Sep 30 08:56:19 crc kubenswrapper[4760]: I0930 08:56:19.841875 4760 generic.go:334] "Generic (PLEG): container finished" podID="7a9c8270-6964-4886-87d0-227b05b76da4" containerID="bb3fab9e5573bca746e5903757fe3de80f4c152222a8ba46a5a998084bf0d4bc" exitCode=0
Sep 30 08:56:19 crc kubenswrapper[4760]: I0930 08:56:19.842118 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-f2lrk" event={"ID":"7a9c8270-6964-4886-87d0-227b05b76da4","Type":"ContainerDied","Data":"bb3fab9e5573bca746e5903757fe3de80f4c152222a8ba46a5a998084bf0d4bc"}
Sep 30 08:56:19 crc kubenswrapper[4760]: I0930 08:56:19.842378 4760 scope.go:117] "RemoveContainer" containerID="5cc7325ef3984c6f6c10181f46205a3d7fa4663e610bd26eb793a2c4e08bbdcb"
Sep 30 08:56:19 crc kubenswrapper[4760]: I0930 08:56:19.843085 4760 scope.go:117] "RemoveContainer" containerID="bb3fab9e5573bca746e5903757fe3de80f4c152222a8ba46a5a998084bf0d4bc"
Sep 30 08:56:19 crc kubenswrapper[4760]: E0930 08:56:19.843705 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f2lrk_openshift-machine-config-operator(7a9c8270-6964-4886-87d0-227b05b76da4)\"" pod="openshift-machine-config-operator/machine-config-daemon-f2lrk" podUID="7a9c8270-6964-4886-87d0-227b05b76da4"
Sep 30 08:56:35 crc kubenswrapper[4760]: I0930 08:56:35.078074 4760 scope.go:117] "RemoveContainer" containerID="bb3fab9e5573bca746e5903757fe3de80f4c152222a8ba46a5a998084bf0d4bc"
Sep 30 08:56:35 crc kubenswrapper[4760]: E0930 08:56:35.080486 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f2lrk_openshift-machine-config-operator(7a9c8270-6964-4886-87d0-227b05b76da4)\"" pod="openshift-machine-config-operator/machine-config-daemon-f2lrk" podUID="7a9c8270-6964-4886-87d0-227b05b76da4"
Sep 30 08:56:37 crc kubenswrapper[4760]: I0930 08:56:37.052340 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-r2ddq"]
Sep 30 08:56:37 crc kubenswrapper[4760]: E0930 08:56:37.053637 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f8187248-878a-494d-bfce-56d54c403561" containerName="extract-content"
Sep 30 08:56:37 crc kubenswrapper[4760]: I0930 08:56:37.053655 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="f8187248-878a-494d-bfce-56d54c403561" containerName="extract-content"
Sep 30 08:56:37 crc kubenswrapper[4760]: E0930 08:56:37.053675 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f8187248-878a-494d-bfce-56d54c403561" containerName="extract-utilities"
Sep 30 08:56:37 crc kubenswrapper[4760]: I0930 08:56:37.053684 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="f8187248-878a-494d-bfce-56d54c403561" containerName="extract-utilities"
Sep 30 08:56:37 crc kubenswrapper[4760]: E0930 08:56:37.053704 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f8187248-878a-494d-bfce-56d54c403561" containerName="registry-server"
Sep 30 08:56:37 crc kubenswrapper[4760]: I0930 08:56:37.053712 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="f8187248-878a-494d-bfce-56d54c403561" containerName="registry-server"
Sep 30 08:56:37 crc kubenswrapper[4760]: I0930 08:56:37.053973 4760 memory_manager.go:354] "RemoveStaleState removing state" podUID="f8187248-878a-494d-bfce-56d54c403561" containerName="registry-server"
Sep 30 08:56:37 crc kubenswrapper[4760]: I0930 08:56:37.056283 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-r2ddq"
Sep 30 08:56:37 crc kubenswrapper[4760]: I0930 08:56:37.063914 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-r2ddq"]
Sep 30 08:56:37 crc kubenswrapper[4760]: I0930 08:56:37.149659 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0f8c0b7d-249f-4e49-b785-80fa2c617fe1-catalog-content\") pod \"certified-operators-r2ddq\" (UID: \"0f8c0b7d-249f-4e49-b785-80fa2c617fe1\") " pod="openshift-marketplace/certified-operators-r2ddq"
Sep 30 08:56:37 crc kubenswrapper[4760]: I0930 08:56:37.149754 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-px8j6\" (UniqueName: \"kubernetes.io/projected/0f8c0b7d-249f-4e49-b785-80fa2c617fe1-kube-api-access-px8j6\") pod \"certified-operators-r2ddq\" (UID: \"0f8c0b7d-249f-4e49-b785-80fa2c617fe1\") " pod="openshift-marketplace/certified-operators-r2ddq"
Sep 30 08:56:37 crc kubenswrapper[4760]: I0930 08:56:37.150261 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0f8c0b7d-249f-4e49-b785-80fa2c617fe1-utilities\") pod \"certified-operators-r2ddq\" (UID: \"0f8c0b7d-249f-4e49-b785-80fa2c617fe1\") " pod="openshift-marketplace/certified-operators-r2ddq"
Sep 30 08:56:37 crc kubenswrapper[4760]: I0930 08:56:37.269667 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0f8c0b7d-249f-4e49-b785-80fa2c617fe1-utilities\") pod \"certified-operators-r2ddq\" (UID: \"0f8c0b7d-249f-4e49-b785-80fa2c617fe1\") " pod="openshift-marketplace/certified-operators-r2ddq"
Sep 30 08:56:37 crc kubenswrapper[4760]: I0930 08:56:37.269825 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0f8c0b7d-249f-4e49-b785-80fa2c617fe1-catalog-content\") pod \"certified-operators-r2ddq\" (UID: \"0f8c0b7d-249f-4e49-b785-80fa2c617fe1\") " pod="openshift-marketplace/certified-operators-r2ddq"
Sep 30 08:56:37 crc kubenswrapper[4760]: I0930 08:56:37.269900 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-px8j6\" (UniqueName: \"kubernetes.io/projected/0f8c0b7d-249f-4e49-b785-80fa2c617fe1-kube-api-access-px8j6\") pod \"certified-operators-r2ddq\" (UID: \"0f8c0b7d-249f-4e49-b785-80fa2c617fe1\") " pod="openshift-marketplace/certified-operators-r2ddq"
Sep 30 08:56:37 crc kubenswrapper[4760]: I0930 08:56:37.271219 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0f8c0b7d-249f-4e49-b785-80fa2c617fe1-utilities\") pod \"certified-operators-r2ddq\" (UID: \"0f8c0b7d-249f-4e49-b785-80fa2c617fe1\") " pod="openshift-marketplace/certified-operators-r2ddq"
Sep 30 08:56:37 crc kubenswrapper[4760]: I0930 08:56:37.274013 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0f8c0b7d-249f-4e49-b785-80fa2c617fe1-catalog-content\") pod \"certified-operators-r2ddq\" (UID: \"0f8c0b7d-249f-4e49-b785-80fa2c617fe1\") " pod="openshift-marketplace/certified-operators-r2ddq"
Sep 30 08:56:37 crc kubenswrapper[4760]: I0930 08:56:37.301997 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-px8j6\" (UniqueName: \"kubernetes.io/projected/0f8c0b7d-249f-4e49-b785-80fa2c617fe1-kube-api-access-px8j6\") pod \"certified-operators-r2ddq\" (UID: \"0f8c0b7d-249f-4e49-b785-80fa2c617fe1\") " pod="openshift-marketplace/certified-operators-r2ddq"
Sep 30 08:56:37 crc kubenswrapper[4760]: I0930 08:56:37.376134 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-r2ddq"
Sep 30 08:56:37 crc kubenswrapper[4760]: I0930 08:56:37.891730 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-r2ddq"]
Sep 30 08:56:39 crc kubenswrapper[4760]: I0930 08:56:39.121425 4760 generic.go:334] "Generic (PLEG): container finished" podID="0f8c0b7d-249f-4e49-b785-80fa2c617fe1" containerID="269b5ca3f12d0c209077ed0636324b2de692a6051be5b5a68b7cde21f30e485a" exitCode=0
Sep 30 08:56:39 crc kubenswrapper[4760]: I0930 08:56:39.121481 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-r2ddq" event={"ID":"0f8c0b7d-249f-4e49-b785-80fa2c617fe1","Type":"ContainerDied","Data":"269b5ca3f12d0c209077ed0636324b2de692a6051be5b5a68b7cde21f30e485a"}
Sep 30 08:56:39 crc kubenswrapper[4760]: I0930 08:56:39.122095 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-r2ddq" event={"ID":"0f8c0b7d-249f-4e49-b785-80fa2c617fe1","Type":"ContainerStarted","Data":"4331b554d4a5c72082c972a2576f649cdc6f0ca9f27d7feb3f42d919a47e0fbf"}
Sep 30 08:56:41 crc kubenswrapper[4760]: I0930 08:56:41.147087 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-r2ddq" event={"ID":"0f8c0b7d-249f-4e49-b785-80fa2c617fe1","Type":"ContainerStarted","Data":"8973593b68ed4b6d72558ebfa83fa264e7fcaa1671f01cfc3373ea3b4747244f"}
Sep 30 08:56:42 crc kubenswrapper[4760]: I0930 08:56:42.160386 4760 generic.go:334] "Generic (PLEG): container finished" podID="0f8c0b7d-249f-4e49-b785-80fa2c617fe1" containerID="8973593b68ed4b6d72558ebfa83fa264e7fcaa1671f01cfc3373ea3b4747244f" exitCode=0
Sep 30 08:56:42 crc kubenswrapper[4760]: I0930 08:56:42.160449 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-r2ddq" event={"ID":"0f8c0b7d-249f-4e49-b785-80fa2c617fe1","Type":"ContainerDied","Data":"8973593b68ed4b6d72558ebfa83fa264e7fcaa1671f01cfc3373ea3b4747244f"}
Sep 30 08:56:43 crc kubenswrapper[4760]: I0930 08:56:43.176709 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-r2ddq" event={"ID":"0f8c0b7d-249f-4e49-b785-80fa2c617fe1","Type":"ContainerStarted","Data":"6babcb70422f50bb99b3027ce2f1b925295816efdf5b956bc9b383b6838dd1e4"}
Sep 30 08:56:43 crc kubenswrapper[4760]: I0930 08:56:43.205209 4760 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-r2ddq" podStartSLOduration=2.755283965 podStartE2EDuration="6.20517305s" podCreationTimestamp="2025-09-30 08:56:37 +0000 UTC" firstStartedPulling="2025-09-30 08:56:39.125048262 +0000 UTC m=+4984.767954684" lastFinishedPulling="2025-09-30 08:56:42.574937337 +0000 UTC m=+4988.217843769" observedRunningTime="2025-09-30 08:56:43.195254848 +0000 UTC m=+4988.838161300" watchObservedRunningTime="2025-09-30 08:56:43.20517305 +0000 UTC m=+4988.848079502"
Sep 30 08:56:47 crc kubenswrapper[4760]: I0930 08:56:47.067764 4760 scope.go:117] "RemoveContainer" containerID="bb3fab9e5573bca746e5903757fe3de80f4c152222a8ba46a5a998084bf0d4bc"
Sep 30 08:56:47 crc kubenswrapper[4760]: E0930 08:56:47.069242 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f2lrk_openshift-machine-config-operator(7a9c8270-6964-4886-87d0-227b05b76da4)\"" pod="openshift-machine-config-operator/machine-config-daemon-f2lrk" podUID="7a9c8270-6964-4886-87d0-227b05b76da4"
Sep 30 08:56:47 crc kubenswrapper[4760]: I0930 08:56:47.377101 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-r2ddq"
Sep 30 08:56:47 crc kubenswrapper[4760]: I0930 08:56:47.377507 4760 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-r2ddq"
Sep 30 08:56:47 crc kubenswrapper[4760]: I0930 08:56:47.463974 4760 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-r2ddq"
Sep 30 08:56:48 crc kubenswrapper[4760]: I0930 08:56:48.287699 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-r2ddq"
Sep 30 08:56:48 crc kubenswrapper[4760]: I0930 08:56:48.362463 4760 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-r2ddq"]
Sep 30 08:56:50 crc kubenswrapper[4760]: I0930 08:56:50.260560 4760 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-r2ddq" podUID="0f8c0b7d-249f-4e49-b785-80fa2c617fe1" containerName="registry-server" containerID="cri-o://6babcb70422f50bb99b3027ce2f1b925295816efdf5b956bc9b383b6838dd1e4" gracePeriod=2
Sep 30 08:56:50 crc kubenswrapper[4760]: E0930 08:56:50.351819 4760 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0f8c0b7d_249f_4e49_b785_80fa2c617fe1.slice/crio-conmon-6babcb70422f50bb99b3027ce2f1b925295816efdf5b956bc9b383b6838dd1e4.scope\": RecentStats: unable to find data in memory cache]"
Sep 30 08:56:50 crc kubenswrapper[4760]: I0930 08:56:50.741378 4760 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-r2ddq"
Sep 30 08:56:50 crc kubenswrapper[4760]: I0930 08:56:50.899412 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0f8c0b7d-249f-4e49-b785-80fa2c617fe1-catalog-content\") pod \"0f8c0b7d-249f-4e49-b785-80fa2c617fe1\" (UID: \"0f8c0b7d-249f-4e49-b785-80fa2c617fe1\") "
Sep 30 08:56:50 crc kubenswrapper[4760]: I0930 08:56:50.899470 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0f8c0b7d-249f-4e49-b785-80fa2c617fe1-utilities\") pod \"0f8c0b7d-249f-4e49-b785-80fa2c617fe1\" (UID: \"0f8c0b7d-249f-4e49-b785-80fa2c617fe1\") "
Sep 30 08:56:50 crc kubenswrapper[4760]: I0930 08:56:50.899738 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-px8j6\" (UniqueName: \"kubernetes.io/projected/0f8c0b7d-249f-4e49-b785-80fa2c617fe1-kube-api-access-px8j6\") pod \"0f8c0b7d-249f-4e49-b785-80fa2c617fe1\" (UID: \"0f8c0b7d-249f-4e49-b785-80fa2c617fe1\") "
Sep 30 08:56:50 crc kubenswrapper[4760]: I0930 08:56:50.902141 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0f8c0b7d-249f-4e49-b785-80fa2c617fe1-utilities" (OuterVolumeSpecName: "utilities") pod "0f8c0b7d-249f-4e49-b785-80fa2c617fe1" (UID: "0f8c0b7d-249f-4e49-b785-80fa2c617fe1"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Sep 30 08:56:50 crc kubenswrapper[4760]: I0930 08:56:50.909667 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0f8c0b7d-249f-4e49-b785-80fa2c617fe1-kube-api-access-px8j6" (OuterVolumeSpecName: "kube-api-access-px8j6") pod "0f8c0b7d-249f-4e49-b785-80fa2c617fe1" (UID: "0f8c0b7d-249f-4e49-b785-80fa2c617fe1"). InnerVolumeSpecName "kube-api-access-px8j6". PluginName "kubernetes.io/projected", VolumeGidValue ""
Sep 30 08:56:50 crc kubenswrapper[4760]: I0930 08:56:50.975537 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0f8c0b7d-249f-4e49-b785-80fa2c617fe1-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "0f8c0b7d-249f-4e49-b785-80fa2c617fe1" (UID: "0f8c0b7d-249f-4e49-b785-80fa2c617fe1"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Sep 30 08:56:51 crc kubenswrapper[4760]: I0930 08:56:51.002393 4760 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-px8j6\" (UniqueName: \"kubernetes.io/projected/0f8c0b7d-249f-4e49-b785-80fa2c617fe1-kube-api-access-px8j6\") on node \"crc\" DevicePath \"\""
Sep 30 08:56:51 crc kubenswrapper[4760]: I0930 08:56:51.002434 4760 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0f8c0b7d-249f-4e49-b785-80fa2c617fe1-catalog-content\") on node \"crc\" DevicePath \"\""
Sep 30 08:56:51 crc kubenswrapper[4760]: I0930 08:56:51.002451 4760 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0f8c0b7d-249f-4e49-b785-80fa2c617fe1-utilities\") on node \"crc\" DevicePath \"\""
Sep 30 08:56:51 crc kubenswrapper[4760]: I0930 08:56:51.278228 4760 generic.go:334] "Generic (PLEG): container finished" podID="0f8c0b7d-249f-4e49-b785-80fa2c617fe1" containerID="6babcb70422f50bb99b3027ce2f1b925295816efdf5b956bc9b383b6838dd1e4" exitCode=0
Sep 30 08:56:51 crc kubenswrapper[4760]: I0930 08:56:51.278383 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-r2ddq" event={"ID":"0f8c0b7d-249f-4e49-b785-80fa2c617fe1","Type":"ContainerDied","Data":"6babcb70422f50bb99b3027ce2f1b925295816efdf5b956bc9b383b6838dd1e4"}
Sep 30 08:56:51 crc kubenswrapper[4760]: I0930 08:56:51.279412 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-r2ddq" event={"ID":"0f8c0b7d-249f-4e49-b785-80fa2c617fe1","Type":"ContainerDied","Data":"4331b554d4a5c72082c972a2576f649cdc6f0ca9f27d7feb3f42d919a47e0fbf"}
Sep 30 08:56:51 crc kubenswrapper[4760]: I0930 08:56:51.279453 4760 scope.go:117] "RemoveContainer" containerID="6babcb70422f50bb99b3027ce2f1b925295816efdf5b956bc9b383b6838dd1e4"
Sep 30 08:56:51 crc kubenswrapper[4760]: I0930 08:56:51.278459 4760 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-r2ddq"
Sep 30 08:56:51 crc kubenswrapper[4760]: I0930 08:56:51.314227 4760 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-r2ddq"]
Sep 30 08:56:51 crc kubenswrapper[4760]: I0930 08:56:51.327673 4760 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-r2ddq"]
Sep 30 08:56:51 crc kubenswrapper[4760]: I0930 08:56:51.337015 4760 scope.go:117] "RemoveContainer" containerID="8973593b68ed4b6d72558ebfa83fa264e7fcaa1671f01cfc3373ea3b4747244f"
Sep 30 08:56:51 crc kubenswrapper[4760]: I0930 08:56:51.374938 4760 scope.go:117] "RemoveContainer" containerID="269b5ca3f12d0c209077ed0636324b2de692a6051be5b5a68b7cde21f30e485a"
Sep 30 08:56:51 crc kubenswrapper[4760]: I0930 08:56:51.406854 4760 scope.go:117] "RemoveContainer" containerID="6babcb70422f50bb99b3027ce2f1b925295816efdf5b956bc9b383b6838dd1e4"
Sep 30 08:56:51 crc kubenswrapper[4760]: E0930 08:56:51.407882 4760 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6babcb70422f50bb99b3027ce2f1b925295816efdf5b956bc9b383b6838dd1e4\": container with ID starting with 6babcb70422f50bb99b3027ce2f1b925295816efdf5b956bc9b383b6838dd1e4 not found: ID does not exist" containerID="6babcb70422f50bb99b3027ce2f1b925295816efdf5b956bc9b383b6838dd1e4"
Sep 30 08:56:51 crc kubenswrapper[4760]: I0930 08:56:51.407924 4760 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6babcb70422f50bb99b3027ce2f1b925295816efdf5b956bc9b383b6838dd1e4"} err="failed to get container status \"6babcb70422f50bb99b3027ce2f1b925295816efdf5b956bc9b383b6838dd1e4\": rpc error: code = NotFound desc = could not find container \"6babcb70422f50bb99b3027ce2f1b925295816efdf5b956bc9b383b6838dd1e4\": container with ID starting with 6babcb70422f50bb99b3027ce2f1b925295816efdf5b956bc9b383b6838dd1e4 not found: ID does not exist"
Sep 30 08:56:51 crc kubenswrapper[4760]: I0930 08:56:51.407977 4760 scope.go:117] "RemoveContainer" containerID="8973593b68ed4b6d72558ebfa83fa264e7fcaa1671f01cfc3373ea3b4747244f"
Sep 30 08:56:51 crc kubenswrapper[4760]: E0930 08:56:51.408508 4760 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8973593b68ed4b6d72558ebfa83fa264e7fcaa1671f01cfc3373ea3b4747244f\": container with ID starting with 8973593b68ed4b6d72558ebfa83fa264e7fcaa1671f01cfc3373ea3b4747244f not found: ID does not exist" containerID="8973593b68ed4b6d72558ebfa83fa264e7fcaa1671f01cfc3373ea3b4747244f"
Sep 30 08:56:51 crc kubenswrapper[4760]: I0930 08:56:51.408568 4760 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8973593b68ed4b6d72558ebfa83fa264e7fcaa1671f01cfc3373ea3b4747244f"} err="failed to get container status \"8973593b68ed4b6d72558ebfa83fa264e7fcaa1671f01cfc3373ea3b4747244f\": rpc error: code = NotFound desc = could not find container \"8973593b68ed4b6d72558ebfa83fa264e7fcaa1671f01cfc3373ea3b4747244f\": container with ID starting with 8973593b68ed4b6d72558ebfa83fa264e7fcaa1671f01cfc3373ea3b4747244f not found: ID does not exist"
Sep 30 08:56:51 crc kubenswrapper[4760]: I0930 08:56:51.408611 4760 scope.go:117] "RemoveContainer" containerID="269b5ca3f12d0c209077ed0636324b2de692a6051be5b5a68b7cde21f30e485a"
Sep 30 08:56:51 crc kubenswrapper[4760]: E0930 08:56:51.409112 4760 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"269b5ca3f12d0c209077ed0636324b2de692a6051be5b5a68b7cde21f30e485a\": container with ID starting with 269b5ca3f12d0c209077ed0636324b2de692a6051be5b5a68b7cde21f30e485a not found: ID does not exist" containerID="269b5ca3f12d0c209077ed0636324b2de692a6051be5b5a68b7cde21f30e485a"
Sep 30 08:56:51 crc kubenswrapper[4760]: I0930 08:56:51.409144 4760 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"269b5ca3f12d0c209077ed0636324b2de692a6051be5b5a68b7cde21f30e485a"} err="failed to get container status \"269b5ca3f12d0c209077ed0636324b2de692a6051be5b5a68b7cde21f30e485a\": rpc error: code = NotFound desc = could not find container \"269b5ca3f12d0c209077ed0636324b2de692a6051be5b5a68b7cde21f30e485a\": container with ID starting with 269b5ca3f12d0c209077ed0636324b2de692a6051be5b5a68b7cde21f30e485a not found: ID does not exist"
Sep 30 08:56:53 crc kubenswrapper[4760]: I0930 08:56:53.084684 4760 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0f8c0b7d-249f-4e49-b785-80fa2c617fe1" path="/var/lib/kubelet/pods/0f8c0b7d-249f-4e49-b785-80fa2c617fe1/volumes"
Sep 30 08:56:59 crc kubenswrapper[4760]: I0930 08:56:59.066619 4760 scope.go:117] "RemoveContainer" containerID="bb3fab9e5573bca746e5903757fe3de80f4c152222a8ba46a5a998084bf0d4bc"
Sep 30 08:56:59 crc kubenswrapper[4760]: E0930 08:56:59.067388 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f2lrk_openshift-machine-config-operator(7a9c8270-6964-4886-87d0-227b05b76da4)\"" pod="openshift-machine-config-operator/machine-config-daemon-f2lrk" podUID="7a9c8270-6964-4886-87d0-227b05b76da4"
Sep 30 08:57:12 crc kubenswrapper[4760]: I0930 08:57:12.067143 4760 scope.go:117] "RemoveContainer" containerID="bb3fab9e5573bca746e5903757fe3de80f4c152222a8ba46a5a998084bf0d4bc"
Sep 30 08:57:12 crc kubenswrapper[4760]: E0930 08:57:12.068489 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f2lrk_openshift-machine-config-operator(7a9c8270-6964-4886-87d0-227b05b76da4)\"" pod="openshift-machine-config-operator/machine-config-daemon-f2lrk" podUID="7a9c8270-6964-4886-87d0-227b05b76da4"
Sep 30 08:57:27 crc kubenswrapper[4760]: I0930 08:57:27.067155 4760 scope.go:117] "RemoveContainer" containerID="bb3fab9e5573bca746e5903757fe3de80f4c152222a8ba46a5a998084bf0d4bc"
Sep 30 08:57:27 crc kubenswrapper[4760]: E0930 08:57:27.068414 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f2lrk_openshift-machine-config-operator(7a9c8270-6964-4886-87d0-227b05b76da4)\"" pod="openshift-machine-config-operator/machine-config-daemon-f2lrk" podUID="7a9c8270-6964-4886-87d0-227b05b76da4"
Sep 30 08:57:40 crc kubenswrapper[4760]: I0930 08:57:40.067644 4760 scope.go:117] "RemoveContainer" containerID="bb3fab9e5573bca746e5903757fe3de80f4c152222a8ba46a5a998084bf0d4bc"
Sep 30 08:57:40 crc kubenswrapper[4760]: E0930 08:57:40.070196 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f2lrk_openshift-machine-config-operator(7a9c8270-6964-4886-87d0-227b05b76da4)\"" pod="openshift-machine-config-operator/machine-config-daemon-f2lrk" podUID="7a9c8270-6964-4886-87d0-227b05b76da4"
Sep 30 08:57:54 crc kubenswrapper[4760]: I0930 08:57:54.899493 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-d27d6"]
Sep 30 08:57:54 crc kubenswrapper[4760]: E0930 08:57:54.900817 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0f8c0b7d-249f-4e49-b785-80fa2c617fe1" containerName="extract-utilities"
Sep 30 08:57:54 crc kubenswrapper[4760]: I0930 08:57:54.900842 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="0f8c0b7d-249f-4e49-b785-80fa2c617fe1" containerName="extract-utilities"
Sep 30 08:57:54 crc kubenswrapper[4760]: E0930 08:57:54.900894 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0f8c0b7d-249f-4e49-b785-80fa2c617fe1" containerName="registry-server"
Sep 30 08:57:54 crc kubenswrapper[4760]: I0930 08:57:54.900908 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="0f8c0b7d-249f-4e49-b785-80fa2c617fe1" containerName="registry-server"
Sep 30 08:57:54 crc kubenswrapper[4760]: E0930 08:57:54.900946 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0f8c0b7d-249f-4e49-b785-80fa2c617fe1" containerName="extract-content"
Sep 30 08:57:54 crc kubenswrapper[4760]: I0930 08:57:54.900958 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="0f8c0b7d-249f-4e49-b785-80fa2c617fe1" containerName="extract-content"
Sep 30 08:57:54 crc kubenswrapper[4760]: I0930 08:57:54.901341 4760 memory_manager.go:354] "RemoveStaleState removing state" podUID="0f8c0b7d-249f-4e49-b785-80fa2c617fe1" containerName="registry-server"
Sep 30 08:57:54 crc kubenswrapper[4760]: I0930 08:57:54.904480 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-d27d6"
Sep 30 08:57:54 crc kubenswrapper[4760]: I0930 08:57:54.922431 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-d27d6"]
Sep 30 08:57:55 crc kubenswrapper[4760]: I0930 08:57:55.062353 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vgvzx\" (UniqueName: \"kubernetes.io/projected/e1556a38-e689-4683-a116-f548b33b4082-kube-api-access-vgvzx\") pod \"redhat-marketplace-d27d6\" (UID: \"e1556a38-e689-4683-a116-f548b33b4082\") " pod="openshift-marketplace/redhat-marketplace-d27d6"
Sep 30 08:57:55 crc kubenswrapper[4760]: I0930 08:57:55.062441 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e1556a38-e689-4683-a116-f548b33b4082-utilities\") pod \"redhat-marketplace-d27d6\" (UID: \"e1556a38-e689-4683-a116-f548b33b4082\") " pod="openshift-marketplace/redhat-marketplace-d27d6"
Sep 30 08:57:55 crc kubenswrapper[4760]: I0930 08:57:55.062505 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e1556a38-e689-4683-a116-f548b33b4082-catalog-content\") pod \"redhat-marketplace-d27d6\" (UID: \"e1556a38-e689-4683-a116-f548b33b4082\") " pod="openshift-marketplace/redhat-marketplace-d27d6"
Sep 30 08:57:55 crc kubenswrapper[4760]: I0930 08:57:55.077362 4760 scope.go:117] "RemoveContainer" containerID="bb3fab9e5573bca746e5903757fe3de80f4c152222a8ba46a5a998084bf0d4bc"
Sep 30 08:57:55 crc kubenswrapper[4760]: E0930 08:57:55.077707 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f2lrk_openshift-machine-config-operator(7a9c8270-6964-4886-87d0-227b05b76da4)\"" pod="openshift-machine-config-operator/machine-config-daemon-f2lrk" podUID="7a9c8270-6964-4886-87d0-227b05b76da4"
Sep 30 08:57:55 crc kubenswrapper[4760]: I0930 08:57:55.163733 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vgvzx\" (UniqueName: \"kubernetes.io/projected/e1556a38-e689-4683-a116-f548b33b4082-kube-api-access-vgvzx\") pod \"redhat-marketplace-d27d6\" (UID: \"e1556a38-e689-4683-a116-f548b33b4082\") " pod="openshift-marketplace/redhat-marketplace-d27d6"
Sep 30 08:57:55 crc kubenswrapper[4760]: I0930 08:57:55.163839 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e1556a38-e689-4683-a116-f548b33b4082-utilities\") pod \"redhat-marketplace-d27d6\" (UID: \"e1556a38-e689-4683-a116-f548b33b4082\") " pod="openshift-marketplace/redhat-marketplace-d27d6"
Sep 30 08:57:55 crc kubenswrapper[4760]: I0930 08:57:55.163929 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e1556a38-e689-4683-a116-f548b33b4082-catalog-content\") pod \"redhat-marketplace-d27d6\" (UID: \"e1556a38-e689-4683-a116-f548b33b4082\") " pod="openshift-marketplace/redhat-marketplace-d27d6"
Sep 30 08:57:55 crc kubenswrapper[4760]: I0930 08:57:55.164448 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e1556a38-e689-4683-a116-f548b33b4082-utilities\") pod \"redhat-marketplace-d27d6\" (UID: \"e1556a38-e689-4683-a116-f548b33b4082\") " pod="openshift-marketplace/redhat-marketplace-d27d6"
Sep 30 08:57:55 crc kubenswrapper[4760]: I0930 08:57:55.164703 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e1556a38-e689-4683-a116-f548b33b4082-catalog-content\") pod \"redhat-marketplace-d27d6\" (UID: \"e1556a38-e689-4683-a116-f548b33b4082\") " pod="openshift-marketplace/redhat-marketplace-d27d6"
Sep 30 08:57:55 crc kubenswrapper[4760]: I0930 08:57:55.976190 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vgvzx\" (UniqueName: \"kubernetes.io/projected/e1556a38-e689-4683-a116-f548b33b4082-kube-api-access-vgvzx\") pod \"redhat-marketplace-d27d6\" (UID: \"e1556a38-e689-4683-a116-f548b33b4082\") " pod="openshift-marketplace/redhat-marketplace-d27d6"
Sep 30 08:57:56 crc kubenswrapper[4760]: I0930 08:57:56.143452 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-d27d6"
Sep 30 08:57:56 crc kubenswrapper[4760]: I0930 08:57:56.686902 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-d27d6"]
Sep 30 08:57:57 crc kubenswrapper[4760]: I0930 08:57:57.114820 4760 generic.go:334] "Generic (PLEG): container finished" podID="e1556a38-e689-4683-a116-f548b33b4082" containerID="64ad34c40b2f00cad29d209f4123c735e37d43c535ef5a4718395776c0da7408" exitCode=0
Sep 30 08:57:57 crc kubenswrapper[4760]: I0930 08:57:57.115182 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-d27d6" event={"ID":"e1556a38-e689-4683-a116-f548b33b4082","Type":"ContainerDied","Data":"64ad34c40b2f00cad29d209f4123c735e37d43c535ef5a4718395776c0da7408"}
Sep 30 08:57:57 crc kubenswrapper[4760]: I0930 08:57:57.115614 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-d27d6" event={"ID":"e1556a38-e689-4683-a116-f548b33b4082","Type":"ContainerStarted","Data":"3f33e16aee970ec3a9a32198d5f6cadbd72babee51cc78df7ed26023ab47924c"}
Sep 30 08:57:57 crc kubenswrapper[4760]: I0930 08:57:57.120748 4760 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Sep 30 08:57:58 crc kubenswrapper[4760]: I0930 08:57:58.127535 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-d27d6" event={"ID":"e1556a38-e689-4683-a116-f548b33b4082","Type":"ContainerStarted","Data":"aa81d998a1bab8a537e855bcb83378e9f90a01ba6e3957e107793c5667cc68d4"}
Sep 30 08:57:59 crc kubenswrapper[4760]: I0930 08:57:59.143658 4760 generic.go:334] "Generic (PLEG): container finished" podID="e1556a38-e689-4683-a116-f548b33b4082" containerID="aa81d998a1bab8a537e855bcb83378e9f90a01ba6e3957e107793c5667cc68d4" exitCode=0
Sep 30 08:57:59 crc kubenswrapper[4760]: I0930 08:57:59.143900 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-d27d6" event={"ID":"e1556a38-e689-4683-a116-f548b33b4082","Type":"ContainerDied","Data":"aa81d998a1bab8a537e855bcb83378e9f90a01ba6e3957e107793c5667cc68d4"}
Sep 30 08:58:00 crc kubenswrapper[4760]: I0930 08:58:00.156597 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-d27d6" event={"ID":"e1556a38-e689-4683-a116-f548b33b4082","Type":"ContainerStarted","Data":"6d18c20865a285a3f83707c3e4d3b0620ae0bf0dab7a786c938025f51b4aab9d"}
Sep 30 08:58:00 crc kubenswrapper[4760]: I0930 08:58:00.200390 4760 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-d27d6" podStartSLOduration=3.748362017 podStartE2EDuration="6.200366701s" podCreationTimestamp="2025-09-30 08:57:54 +0000 UTC" firstStartedPulling="2025-09-30 08:57:57.120461113 +0000 UTC m=+5062.763367535" lastFinishedPulling="2025-09-30 08:57:59.572465797 +0000 UTC m=+5065.215372219" observedRunningTime="2025-09-30 08:58:00.188283363 +0000 UTC m=+5065.831189805" watchObservedRunningTime="2025-09-30 08:58:00.200366701 +0000 UTC m=+5065.843273123"
Sep 30 08:58:06 crc kubenswrapper[4760]: I0930 08:58:06.144012 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-d27d6"
Sep 30 08:58:06 crc kubenswrapper[4760]: I0930 08:58:06.145035 4760 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-d27d6"
Sep 30 08:58:06 crc kubenswrapper[4760]: I0930 08:58:06.268041 4760 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-d27d6"
Sep 30 08:58:07 crc kubenswrapper[4760]: I0930 08:58:07.067046 4760 scope.go:117] "RemoveContainer" containerID="bb3fab9e5573bca746e5903757fe3de80f4c152222a8ba46a5a998084bf0d4bc"
Sep 30 08:58:07 crc kubenswrapper[4760]: E0930 08:58:07.067656 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f2lrk_openshift-machine-config-operator(7a9c8270-6964-4886-87d0-227b05b76da4)\"" pod="openshift-machine-config-operator/machine-config-daemon-f2lrk" podUID="7a9c8270-6964-4886-87d0-227b05b76da4"
Sep 30 08:58:07 crc kubenswrapper[4760]: I0930 08:58:07.340243 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-d27d6"
Sep 30 08:58:07 crc kubenswrapper[4760]: I0930 08:58:07.391248 4760 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-d27d6"]
Sep 30 08:58:09 crc kubenswrapper[4760]: I0930 08:58:09.296339 4760 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-d27d6" podUID="e1556a38-e689-4683-a116-f548b33b4082" containerName="registry-server" containerID="cri-o://6d18c20865a285a3f83707c3e4d3b0620ae0bf0dab7a786c938025f51b4aab9d" gracePeriod=2
Sep 30 08:58:10 crc kubenswrapper[4760]: I0930 08:58:10.202013 4760 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-d27d6"
Sep 30 08:58:10 crc kubenswrapper[4760]: I0930 08:58:10.301243 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vgvzx\" (UniqueName: \"kubernetes.io/projected/e1556a38-e689-4683-a116-f548b33b4082-kube-api-access-vgvzx\") pod \"e1556a38-e689-4683-a116-f548b33b4082\" (UID: \"e1556a38-e689-4683-a116-f548b33b4082\") "
Sep 30 08:58:10 crc kubenswrapper[4760]: I0930 08:58:10.302448 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e1556a38-e689-4683-a116-f548b33b4082-utilities\") pod \"e1556a38-e689-4683-a116-f548b33b4082\" (UID: \"e1556a38-e689-4683-a116-f548b33b4082\") "
Sep 30 08:58:10 crc kubenswrapper[4760]: I0930 08:58:10.302552 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e1556a38-e689-4683-a116-f548b33b4082-catalog-content\") pod \"e1556a38-e689-4683-a116-f548b33b4082\" (UID: \"e1556a38-e689-4683-a116-f548b33b4082\") "
Sep 30 08:58:10 crc kubenswrapper[4760]: I0930 08:58:10.303829 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e1556a38-e689-4683-a116-f548b33b4082-utilities" (OuterVolumeSpecName: "utilities") pod "e1556a38-e689-4683-a116-f548b33b4082" (UID: "e1556a38-e689-4683-a116-f548b33b4082"). InnerVolumeSpecName "utilities".
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 08:58:10 crc kubenswrapper[4760]: I0930 08:58:10.309904 4760 generic.go:334] "Generic (PLEG): container finished" podID="e1556a38-e689-4683-a116-f548b33b4082" containerID="6d18c20865a285a3f83707c3e4d3b0620ae0bf0dab7a786c938025f51b4aab9d" exitCode=0 Sep 30 08:58:10 crc kubenswrapper[4760]: I0930 08:58:10.309945 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-d27d6" event={"ID":"e1556a38-e689-4683-a116-f548b33b4082","Type":"ContainerDied","Data":"6d18c20865a285a3f83707c3e4d3b0620ae0bf0dab7a786c938025f51b4aab9d"} Sep 30 08:58:10 crc kubenswrapper[4760]: I0930 08:58:10.309981 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-d27d6" event={"ID":"e1556a38-e689-4683-a116-f548b33b4082","Type":"ContainerDied","Data":"3f33e16aee970ec3a9a32198d5f6cadbd72babee51cc78df7ed26023ab47924c"} Sep 30 08:58:10 crc kubenswrapper[4760]: I0930 08:58:10.310003 4760 scope.go:117] "RemoveContainer" containerID="6d18c20865a285a3f83707c3e4d3b0620ae0bf0dab7a786c938025f51b4aab9d" Sep 30 08:58:10 crc kubenswrapper[4760]: I0930 08:58:10.310128 4760 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-d27d6" Sep 30 08:58:10 crc kubenswrapper[4760]: I0930 08:58:10.310485 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e1556a38-e689-4683-a116-f548b33b4082-kube-api-access-vgvzx" (OuterVolumeSpecName: "kube-api-access-vgvzx") pod "e1556a38-e689-4683-a116-f548b33b4082" (UID: "e1556a38-e689-4683-a116-f548b33b4082"). InnerVolumeSpecName "kube-api-access-vgvzx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 08:58:10 crc kubenswrapper[4760]: I0930 08:58:10.322140 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e1556a38-e689-4683-a116-f548b33b4082-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "e1556a38-e689-4683-a116-f548b33b4082" (UID: "e1556a38-e689-4683-a116-f548b33b4082"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 08:58:10 crc kubenswrapper[4760]: I0930 08:58:10.358669 4760 scope.go:117] "RemoveContainer" containerID="aa81d998a1bab8a537e855bcb83378e9f90a01ba6e3957e107793c5667cc68d4" Sep 30 08:58:10 crc kubenswrapper[4760]: I0930 08:58:10.375035 4760 scope.go:117] "RemoveContainer" containerID="64ad34c40b2f00cad29d209f4123c735e37d43c535ef5a4718395776c0da7408" Sep 30 08:58:10 crc kubenswrapper[4760]: I0930 08:58:10.405537 4760 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e1556a38-e689-4683-a116-f548b33b4082-catalog-content\") on node \"crc\" DevicePath \"\"" Sep 30 08:58:10 crc kubenswrapper[4760]: I0930 08:58:10.405579 4760 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vgvzx\" (UniqueName: \"kubernetes.io/projected/e1556a38-e689-4683-a116-f548b33b4082-kube-api-access-vgvzx\") on node \"crc\" DevicePath \"\"" Sep 30 08:58:10 crc kubenswrapper[4760]: I0930 08:58:10.405591 4760 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e1556a38-e689-4683-a116-f548b33b4082-utilities\") on node \"crc\" DevicePath \"\"" Sep 30 08:58:10 crc kubenswrapper[4760]: I0930 08:58:10.422459 4760 scope.go:117] "RemoveContainer" containerID="6d18c20865a285a3f83707c3e4d3b0620ae0bf0dab7a786c938025f51b4aab9d" Sep 30 08:58:10 crc kubenswrapper[4760]: E0930 08:58:10.422970 4760 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = 
NotFound desc = could not find container \"6d18c20865a285a3f83707c3e4d3b0620ae0bf0dab7a786c938025f51b4aab9d\": container with ID starting with 6d18c20865a285a3f83707c3e4d3b0620ae0bf0dab7a786c938025f51b4aab9d not found: ID does not exist" containerID="6d18c20865a285a3f83707c3e4d3b0620ae0bf0dab7a786c938025f51b4aab9d" Sep 30 08:58:10 crc kubenswrapper[4760]: I0930 08:58:10.423058 4760 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6d18c20865a285a3f83707c3e4d3b0620ae0bf0dab7a786c938025f51b4aab9d"} err="failed to get container status \"6d18c20865a285a3f83707c3e4d3b0620ae0bf0dab7a786c938025f51b4aab9d\": rpc error: code = NotFound desc = could not find container \"6d18c20865a285a3f83707c3e4d3b0620ae0bf0dab7a786c938025f51b4aab9d\": container with ID starting with 6d18c20865a285a3f83707c3e4d3b0620ae0bf0dab7a786c938025f51b4aab9d not found: ID does not exist" Sep 30 08:58:10 crc kubenswrapper[4760]: I0930 08:58:10.423127 4760 scope.go:117] "RemoveContainer" containerID="aa81d998a1bab8a537e855bcb83378e9f90a01ba6e3957e107793c5667cc68d4" Sep 30 08:58:10 crc kubenswrapper[4760]: E0930 08:58:10.423625 4760 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"aa81d998a1bab8a537e855bcb83378e9f90a01ba6e3957e107793c5667cc68d4\": container with ID starting with aa81d998a1bab8a537e855bcb83378e9f90a01ba6e3957e107793c5667cc68d4 not found: ID does not exist" containerID="aa81d998a1bab8a537e855bcb83378e9f90a01ba6e3957e107793c5667cc68d4" Sep 30 08:58:10 crc kubenswrapper[4760]: I0930 08:58:10.423671 4760 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"aa81d998a1bab8a537e855bcb83378e9f90a01ba6e3957e107793c5667cc68d4"} err="failed to get container status \"aa81d998a1bab8a537e855bcb83378e9f90a01ba6e3957e107793c5667cc68d4\": rpc error: code = NotFound desc = could not find container 
\"aa81d998a1bab8a537e855bcb83378e9f90a01ba6e3957e107793c5667cc68d4\": container with ID starting with aa81d998a1bab8a537e855bcb83378e9f90a01ba6e3957e107793c5667cc68d4 not found: ID does not exist" Sep 30 08:58:10 crc kubenswrapper[4760]: I0930 08:58:10.423698 4760 scope.go:117] "RemoveContainer" containerID="64ad34c40b2f00cad29d209f4123c735e37d43c535ef5a4718395776c0da7408" Sep 30 08:58:10 crc kubenswrapper[4760]: E0930 08:58:10.423940 4760 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"64ad34c40b2f00cad29d209f4123c735e37d43c535ef5a4718395776c0da7408\": container with ID starting with 64ad34c40b2f00cad29d209f4123c735e37d43c535ef5a4718395776c0da7408 not found: ID does not exist" containerID="64ad34c40b2f00cad29d209f4123c735e37d43c535ef5a4718395776c0da7408" Sep 30 08:58:10 crc kubenswrapper[4760]: I0930 08:58:10.423964 4760 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"64ad34c40b2f00cad29d209f4123c735e37d43c535ef5a4718395776c0da7408"} err="failed to get container status \"64ad34c40b2f00cad29d209f4123c735e37d43c535ef5a4718395776c0da7408\": rpc error: code = NotFound desc = could not find container \"64ad34c40b2f00cad29d209f4123c735e37d43c535ef5a4718395776c0da7408\": container with ID starting with 64ad34c40b2f00cad29d209f4123c735e37d43c535ef5a4718395776c0da7408 not found: ID does not exist" Sep 30 08:58:10 crc kubenswrapper[4760]: I0930 08:58:10.641784 4760 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-d27d6"] Sep 30 08:58:10 crc kubenswrapper[4760]: I0930 08:58:10.671409 4760 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-d27d6"] Sep 30 08:58:11 crc kubenswrapper[4760]: I0930 08:58:11.086820 4760 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e1556a38-e689-4683-a116-f548b33b4082" 
path="/var/lib/kubelet/pods/e1556a38-e689-4683-a116-f548b33b4082/volumes" Sep 30 08:58:22 crc kubenswrapper[4760]: I0930 08:58:22.068111 4760 scope.go:117] "RemoveContainer" containerID="bb3fab9e5573bca746e5903757fe3de80f4c152222a8ba46a5a998084bf0d4bc" Sep 30 08:58:22 crc kubenswrapper[4760]: E0930 08:58:22.069236 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f2lrk_openshift-machine-config-operator(7a9c8270-6964-4886-87d0-227b05b76da4)\"" pod="openshift-machine-config-operator/machine-config-daemon-f2lrk" podUID="7a9c8270-6964-4886-87d0-227b05b76da4" Sep 30 08:58:36 crc kubenswrapper[4760]: I0930 08:58:36.067125 4760 scope.go:117] "RemoveContainer" containerID="bb3fab9e5573bca746e5903757fe3de80f4c152222a8ba46a5a998084bf0d4bc" Sep 30 08:58:36 crc kubenswrapper[4760]: E0930 08:58:36.068011 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f2lrk_openshift-machine-config-operator(7a9c8270-6964-4886-87d0-227b05b76da4)\"" pod="openshift-machine-config-operator/machine-config-daemon-f2lrk" podUID="7a9c8270-6964-4886-87d0-227b05b76da4" Sep 30 08:58:47 crc kubenswrapper[4760]: I0930 08:58:47.068228 4760 scope.go:117] "RemoveContainer" containerID="bb3fab9e5573bca746e5903757fe3de80f4c152222a8ba46a5a998084bf0d4bc" Sep 30 08:58:47 crc kubenswrapper[4760]: E0930 08:58:47.069212 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f2lrk_openshift-machine-config-operator(7a9c8270-6964-4886-87d0-227b05b76da4)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-f2lrk" podUID="7a9c8270-6964-4886-87d0-227b05b76da4" Sep 30 08:58:58 crc kubenswrapper[4760]: I0930 08:58:58.066954 4760 scope.go:117] "RemoveContainer" containerID="bb3fab9e5573bca746e5903757fe3de80f4c152222a8ba46a5a998084bf0d4bc" Sep 30 08:58:58 crc kubenswrapper[4760]: E0930 08:58:58.067710 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f2lrk_openshift-machine-config-operator(7a9c8270-6964-4886-87d0-227b05b76da4)\"" pod="openshift-machine-config-operator/machine-config-daemon-f2lrk" podUID="7a9c8270-6964-4886-87d0-227b05b76da4" Sep 30 08:59:10 crc kubenswrapper[4760]: I0930 08:59:10.066517 4760 scope.go:117] "RemoveContainer" containerID="bb3fab9e5573bca746e5903757fe3de80f4c152222a8ba46a5a998084bf0d4bc" Sep 30 08:59:10 crc kubenswrapper[4760]: E0930 08:59:10.067279 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f2lrk_openshift-machine-config-operator(7a9c8270-6964-4886-87d0-227b05b76da4)\"" pod="openshift-machine-config-operator/machine-config-daemon-f2lrk" podUID="7a9c8270-6964-4886-87d0-227b05b76da4" Sep 30 08:59:22 crc kubenswrapper[4760]: I0930 08:59:22.067704 4760 scope.go:117] "RemoveContainer" containerID="bb3fab9e5573bca746e5903757fe3de80f4c152222a8ba46a5a998084bf0d4bc" Sep 30 08:59:22 crc kubenswrapper[4760]: E0930 08:59:22.068516 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-f2lrk_openshift-machine-config-operator(7a9c8270-6964-4886-87d0-227b05b76da4)\"" pod="openshift-machine-config-operator/machine-config-daemon-f2lrk" podUID="7a9c8270-6964-4886-87d0-227b05b76da4" Sep 30 08:59:34 crc kubenswrapper[4760]: I0930 08:59:34.068691 4760 scope.go:117] "RemoveContainer" containerID="bb3fab9e5573bca746e5903757fe3de80f4c152222a8ba46a5a998084bf0d4bc" Sep 30 08:59:34 crc kubenswrapper[4760]: E0930 08:59:34.069845 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f2lrk_openshift-machine-config-operator(7a9c8270-6964-4886-87d0-227b05b76da4)\"" pod="openshift-machine-config-operator/machine-config-daemon-f2lrk" podUID="7a9c8270-6964-4886-87d0-227b05b76da4" Sep 30 08:59:47 crc kubenswrapper[4760]: I0930 08:59:47.067479 4760 scope.go:117] "RemoveContainer" containerID="bb3fab9e5573bca746e5903757fe3de80f4c152222a8ba46a5a998084bf0d4bc" Sep 30 08:59:47 crc kubenswrapper[4760]: E0930 08:59:47.068781 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f2lrk_openshift-machine-config-operator(7a9c8270-6964-4886-87d0-227b05b76da4)\"" pod="openshift-machine-config-operator/machine-config-daemon-f2lrk" podUID="7a9c8270-6964-4886-87d0-227b05b76da4" Sep 30 09:00:00 crc kubenswrapper[4760]: I0930 09:00:00.155145 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29320380-wqzfk"] Sep 30 09:00:00 crc kubenswrapper[4760]: E0930 09:00:00.156131 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e1556a38-e689-4683-a116-f548b33b4082" containerName="registry-server" Sep 30 09:00:00 crc 
kubenswrapper[4760]: I0930 09:00:00.156163 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="e1556a38-e689-4683-a116-f548b33b4082" containerName="registry-server" Sep 30 09:00:00 crc kubenswrapper[4760]: E0930 09:00:00.156176 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e1556a38-e689-4683-a116-f548b33b4082" containerName="extract-content" Sep 30 09:00:00 crc kubenswrapper[4760]: I0930 09:00:00.156182 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="e1556a38-e689-4683-a116-f548b33b4082" containerName="extract-content" Sep 30 09:00:00 crc kubenswrapper[4760]: E0930 09:00:00.156199 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e1556a38-e689-4683-a116-f548b33b4082" containerName="extract-utilities" Sep 30 09:00:00 crc kubenswrapper[4760]: I0930 09:00:00.156207 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="e1556a38-e689-4683-a116-f548b33b4082" containerName="extract-utilities" Sep 30 09:00:00 crc kubenswrapper[4760]: I0930 09:00:00.158078 4760 memory_manager.go:354] "RemoveStaleState removing state" podUID="e1556a38-e689-4683-a116-f548b33b4082" containerName="registry-server" Sep 30 09:00:00 crc kubenswrapper[4760]: I0930 09:00:00.159091 4760 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29320380-wqzfk" Sep 30 09:00:00 crc kubenswrapper[4760]: I0930 09:00:00.161179 4760 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Sep 30 09:00:00 crc kubenswrapper[4760]: I0930 09:00:00.161338 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Sep 30 09:00:00 crc kubenswrapper[4760]: I0930 09:00:00.168530 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29320380-wqzfk"] Sep 30 09:00:00 crc kubenswrapper[4760]: I0930 09:00:00.355434 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/53b3c31d-bfe6-47f2-a7b0-f1ef23b548d4-secret-volume\") pod \"collect-profiles-29320380-wqzfk\" (UID: \"53b3c31d-bfe6-47f2-a7b0-f1ef23b548d4\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320380-wqzfk" Sep 30 09:00:00 crc kubenswrapper[4760]: I0930 09:00:00.355637 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-68tgw\" (UniqueName: \"kubernetes.io/projected/53b3c31d-bfe6-47f2-a7b0-f1ef23b548d4-kube-api-access-68tgw\") pod \"collect-profiles-29320380-wqzfk\" (UID: \"53b3c31d-bfe6-47f2-a7b0-f1ef23b548d4\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320380-wqzfk" Sep 30 09:00:00 crc kubenswrapper[4760]: I0930 09:00:00.355687 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/53b3c31d-bfe6-47f2-a7b0-f1ef23b548d4-config-volume\") pod \"collect-profiles-29320380-wqzfk\" (UID: \"53b3c31d-bfe6-47f2-a7b0-f1ef23b548d4\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29320380-wqzfk" Sep 30 09:00:00 crc kubenswrapper[4760]: I0930 09:00:00.457419 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/53b3c31d-bfe6-47f2-a7b0-f1ef23b548d4-secret-volume\") pod \"collect-profiles-29320380-wqzfk\" (UID: \"53b3c31d-bfe6-47f2-a7b0-f1ef23b548d4\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320380-wqzfk" Sep 30 09:00:00 crc kubenswrapper[4760]: I0930 09:00:00.457557 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-68tgw\" (UniqueName: \"kubernetes.io/projected/53b3c31d-bfe6-47f2-a7b0-f1ef23b548d4-kube-api-access-68tgw\") pod \"collect-profiles-29320380-wqzfk\" (UID: \"53b3c31d-bfe6-47f2-a7b0-f1ef23b548d4\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320380-wqzfk" Sep 30 09:00:00 crc kubenswrapper[4760]: I0930 09:00:00.457616 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/53b3c31d-bfe6-47f2-a7b0-f1ef23b548d4-config-volume\") pod \"collect-profiles-29320380-wqzfk\" (UID: \"53b3c31d-bfe6-47f2-a7b0-f1ef23b548d4\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320380-wqzfk" Sep 30 09:00:00 crc kubenswrapper[4760]: I0930 09:00:00.458657 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/53b3c31d-bfe6-47f2-a7b0-f1ef23b548d4-config-volume\") pod \"collect-profiles-29320380-wqzfk\" (UID: \"53b3c31d-bfe6-47f2-a7b0-f1ef23b548d4\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320380-wqzfk" Sep 30 09:00:00 crc kubenswrapper[4760]: I0930 09:00:00.466699 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/53b3c31d-bfe6-47f2-a7b0-f1ef23b548d4-secret-volume\") pod \"collect-profiles-29320380-wqzfk\" (UID: \"53b3c31d-bfe6-47f2-a7b0-f1ef23b548d4\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320380-wqzfk" Sep 30 09:00:00 crc kubenswrapper[4760]: I0930 09:00:00.483186 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-68tgw\" (UniqueName: \"kubernetes.io/projected/53b3c31d-bfe6-47f2-a7b0-f1ef23b548d4-kube-api-access-68tgw\") pod \"collect-profiles-29320380-wqzfk\" (UID: \"53b3c31d-bfe6-47f2-a7b0-f1ef23b548d4\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320380-wqzfk" Sep 30 09:00:00 crc kubenswrapper[4760]: I0930 09:00:00.492152 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29320380-wqzfk" Sep 30 09:00:01 crc kubenswrapper[4760]: I0930 09:00:01.011153 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29320380-wqzfk"] Sep 30 09:00:01 crc kubenswrapper[4760]: I0930 09:00:01.582138 4760 generic.go:334] "Generic (PLEG): container finished" podID="53b3c31d-bfe6-47f2-a7b0-f1ef23b548d4" containerID="a6a8537ff4390e90f72b670cbf69ef7c2c98a9fa84249bb1f46b3a132e0b1b96" exitCode=0 Sep 30 09:00:01 crc kubenswrapper[4760]: I0930 09:00:01.582243 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29320380-wqzfk" event={"ID":"53b3c31d-bfe6-47f2-a7b0-f1ef23b548d4","Type":"ContainerDied","Data":"a6a8537ff4390e90f72b670cbf69ef7c2c98a9fa84249bb1f46b3a132e0b1b96"} Sep 30 09:00:01 crc kubenswrapper[4760]: I0930 09:00:01.582529 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29320380-wqzfk" 
event={"ID":"53b3c31d-bfe6-47f2-a7b0-f1ef23b548d4","Type":"ContainerStarted","Data":"f7f91a374c970cadaf26fad2203f7d63d7493c6d6818dd46fe15afa16b3cc059"} Sep 30 09:00:02 crc kubenswrapper[4760]: I0930 09:00:02.067830 4760 scope.go:117] "RemoveContainer" containerID="bb3fab9e5573bca746e5903757fe3de80f4c152222a8ba46a5a998084bf0d4bc" Sep 30 09:00:02 crc kubenswrapper[4760]: E0930 09:00:02.068398 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f2lrk_openshift-machine-config-operator(7a9c8270-6964-4886-87d0-227b05b76da4)\"" pod="openshift-machine-config-operator/machine-config-daemon-f2lrk" podUID="7a9c8270-6964-4886-87d0-227b05b76da4" Sep 30 09:00:02 crc kubenswrapper[4760]: I0930 09:00:02.976554 4760 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29320380-wqzfk" Sep 30 09:00:03 crc kubenswrapper[4760]: I0930 09:00:03.023004 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-68tgw\" (UniqueName: \"kubernetes.io/projected/53b3c31d-bfe6-47f2-a7b0-f1ef23b548d4-kube-api-access-68tgw\") pod \"53b3c31d-bfe6-47f2-a7b0-f1ef23b548d4\" (UID: \"53b3c31d-bfe6-47f2-a7b0-f1ef23b548d4\") " Sep 30 09:00:03 crc kubenswrapper[4760]: I0930 09:00:03.023088 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/53b3c31d-bfe6-47f2-a7b0-f1ef23b548d4-config-volume\") pod \"53b3c31d-bfe6-47f2-a7b0-f1ef23b548d4\" (UID: \"53b3c31d-bfe6-47f2-a7b0-f1ef23b548d4\") " Sep 30 09:00:03 crc kubenswrapper[4760]: I0930 09:00:03.023189 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/53b3c31d-bfe6-47f2-a7b0-f1ef23b548d4-secret-volume\") pod \"53b3c31d-bfe6-47f2-a7b0-f1ef23b548d4\" (UID: \"53b3c31d-bfe6-47f2-a7b0-f1ef23b548d4\") " Sep 30 09:00:03 crc kubenswrapper[4760]: I0930 09:00:03.023828 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/53b3c31d-bfe6-47f2-a7b0-f1ef23b548d4-config-volume" (OuterVolumeSpecName: "config-volume") pod "53b3c31d-bfe6-47f2-a7b0-f1ef23b548d4" (UID: "53b3c31d-bfe6-47f2-a7b0-f1ef23b548d4"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 09:00:03 crc kubenswrapper[4760]: I0930 09:00:03.032065 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/53b3c31d-bfe6-47f2-a7b0-f1ef23b548d4-kube-api-access-68tgw" (OuterVolumeSpecName: "kube-api-access-68tgw") pod "53b3c31d-bfe6-47f2-a7b0-f1ef23b548d4" (UID: "53b3c31d-bfe6-47f2-a7b0-f1ef23b548d4"). InnerVolumeSpecName "kube-api-access-68tgw". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 09:00:03 crc kubenswrapper[4760]: I0930 09:00:03.032548 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/53b3c31d-bfe6-47f2-a7b0-f1ef23b548d4-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "53b3c31d-bfe6-47f2-a7b0-f1ef23b548d4" (UID: "53b3c31d-bfe6-47f2-a7b0-f1ef23b548d4"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 09:00:03 crc kubenswrapper[4760]: I0930 09:00:03.125640 4760 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-68tgw\" (UniqueName: \"kubernetes.io/projected/53b3c31d-bfe6-47f2-a7b0-f1ef23b548d4-kube-api-access-68tgw\") on node \"crc\" DevicePath \"\"" Sep 30 09:00:03 crc kubenswrapper[4760]: I0930 09:00:03.125673 4760 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/53b3c31d-bfe6-47f2-a7b0-f1ef23b548d4-config-volume\") on node \"crc\" DevicePath \"\"" Sep 30 09:00:03 crc kubenswrapper[4760]: I0930 09:00:03.125685 4760 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/53b3c31d-bfe6-47f2-a7b0-f1ef23b548d4-secret-volume\") on node \"crc\" DevicePath \"\"" Sep 30 09:00:03 crc kubenswrapper[4760]: I0930 09:00:03.602664 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29320380-wqzfk" event={"ID":"53b3c31d-bfe6-47f2-a7b0-f1ef23b548d4","Type":"ContainerDied","Data":"f7f91a374c970cadaf26fad2203f7d63d7493c6d6818dd46fe15afa16b3cc059"} Sep 30 09:00:03 crc kubenswrapper[4760]: I0930 09:00:03.602922 4760 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f7f91a374c970cadaf26fad2203f7d63d7493c6d6818dd46fe15afa16b3cc059" Sep 30 09:00:03 crc kubenswrapper[4760]: I0930 09:00:03.602733 4760 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29320380-wqzfk" Sep 30 09:00:04 crc kubenswrapper[4760]: I0930 09:00:04.081585 4760 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29320335-zr8b5"] Sep 30 09:00:04 crc kubenswrapper[4760]: I0930 09:00:04.099549 4760 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29320335-zr8b5"] Sep 30 09:00:05 crc kubenswrapper[4760]: I0930 09:00:05.095840 4760 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d6e89699-0bf0-4749-b268-967ef8499eae" path="/var/lib/kubelet/pods/d6e89699-0bf0-4749-b268-967ef8499eae/volumes" Sep 30 09:00:14 crc kubenswrapper[4760]: I0930 09:00:14.067115 4760 scope.go:117] "RemoveContainer" containerID="bb3fab9e5573bca746e5903757fe3de80f4c152222a8ba46a5a998084bf0d4bc" Sep 30 09:00:14 crc kubenswrapper[4760]: E0930 09:00:14.068091 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f2lrk_openshift-machine-config-operator(7a9c8270-6964-4886-87d0-227b05b76da4)\"" pod="openshift-machine-config-operator/machine-config-daemon-f2lrk" podUID="7a9c8270-6964-4886-87d0-227b05b76da4" Sep 30 09:00:27 crc kubenswrapper[4760]: I0930 09:00:27.067029 4760 scope.go:117] "RemoveContainer" containerID="bb3fab9e5573bca746e5903757fe3de80f4c152222a8ba46a5a998084bf0d4bc" Sep 30 09:00:27 crc kubenswrapper[4760]: E0930 09:00:27.067919 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f2lrk_openshift-machine-config-operator(7a9c8270-6964-4886-87d0-227b05b76da4)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-f2lrk" podUID="7a9c8270-6964-4886-87d0-227b05b76da4" Sep 30 09:00:41 crc kubenswrapper[4760]: I0930 09:00:41.067917 4760 scope.go:117] "RemoveContainer" containerID="bb3fab9e5573bca746e5903757fe3de80f4c152222a8ba46a5a998084bf0d4bc" Sep 30 09:00:41 crc kubenswrapper[4760]: E0930 09:00:41.069001 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f2lrk_openshift-machine-config-operator(7a9c8270-6964-4886-87d0-227b05b76da4)\"" pod="openshift-machine-config-operator/machine-config-daemon-f2lrk" podUID="7a9c8270-6964-4886-87d0-227b05b76da4" Sep 30 09:00:47 crc kubenswrapper[4760]: I0930 09:00:47.715776 4760 scope.go:117] "RemoveContainer" containerID="5b05ae1dbf71a64d4e9167a735f528a75f6de5af266832ccf05df8077d8891ae" Sep 30 09:00:54 crc kubenswrapper[4760]: I0930 09:00:54.067998 4760 scope.go:117] "RemoveContainer" containerID="bb3fab9e5573bca746e5903757fe3de80f4c152222a8ba46a5a998084bf0d4bc" Sep 30 09:00:54 crc kubenswrapper[4760]: E0930 09:00:54.069527 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f2lrk_openshift-machine-config-operator(7a9c8270-6964-4886-87d0-227b05b76da4)\"" pod="openshift-machine-config-operator/machine-config-daemon-f2lrk" podUID="7a9c8270-6964-4886-87d0-227b05b76da4" Sep 30 09:01:00 crc kubenswrapper[4760]: I0930 09:01:00.163013 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-cron-29320381-pcwcm"] Sep 30 09:01:00 crc kubenswrapper[4760]: E0930 09:01:00.163965 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="53b3c31d-bfe6-47f2-a7b0-f1ef23b548d4" 
containerName="collect-profiles" Sep 30 09:01:00 crc kubenswrapper[4760]: I0930 09:01:00.163982 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="53b3c31d-bfe6-47f2-a7b0-f1ef23b548d4" containerName="collect-profiles" Sep 30 09:01:00 crc kubenswrapper[4760]: I0930 09:01:00.164268 4760 memory_manager.go:354] "RemoveStaleState removing state" podUID="53b3c31d-bfe6-47f2-a7b0-f1ef23b548d4" containerName="collect-profiles" Sep 30 09:01:00 crc kubenswrapper[4760]: I0930 09:01:00.165123 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29320381-pcwcm" Sep 30 09:01:00 crc kubenswrapper[4760]: I0930 09:01:00.192707 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29320381-pcwcm"] Sep 30 09:01:00 crc kubenswrapper[4760]: I0930 09:01:00.270610 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aa755a39-f3ae-49a4-80ce-a0efcfe2566e-combined-ca-bundle\") pod \"keystone-cron-29320381-pcwcm\" (UID: \"aa755a39-f3ae-49a4-80ce-a0efcfe2566e\") " pod="openstack/keystone-cron-29320381-pcwcm" Sep 30 09:01:00 crc kubenswrapper[4760]: I0930 09:01:00.270717 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aa755a39-f3ae-49a4-80ce-a0efcfe2566e-config-data\") pod \"keystone-cron-29320381-pcwcm\" (UID: \"aa755a39-f3ae-49a4-80ce-a0efcfe2566e\") " pod="openstack/keystone-cron-29320381-pcwcm" Sep 30 09:01:00 crc kubenswrapper[4760]: I0930 09:01:00.270781 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/aa755a39-f3ae-49a4-80ce-a0efcfe2566e-fernet-keys\") pod \"keystone-cron-29320381-pcwcm\" (UID: \"aa755a39-f3ae-49a4-80ce-a0efcfe2566e\") " pod="openstack/keystone-cron-29320381-pcwcm" Sep 30 
09:01:00 crc kubenswrapper[4760]: I0930 09:01:00.270807 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7v24p\" (UniqueName: \"kubernetes.io/projected/aa755a39-f3ae-49a4-80ce-a0efcfe2566e-kube-api-access-7v24p\") pod \"keystone-cron-29320381-pcwcm\" (UID: \"aa755a39-f3ae-49a4-80ce-a0efcfe2566e\") " pod="openstack/keystone-cron-29320381-pcwcm" Sep 30 09:01:00 crc kubenswrapper[4760]: I0930 09:01:00.372539 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aa755a39-f3ae-49a4-80ce-a0efcfe2566e-config-data\") pod \"keystone-cron-29320381-pcwcm\" (UID: \"aa755a39-f3ae-49a4-80ce-a0efcfe2566e\") " pod="openstack/keystone-cron-29320381-pcwcm" Sep 30 09:01:00 crc kubenswrapper[4760]: I0930 09:01:00.372651 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/aa755a39-f3ae-49a4-80ce-a0efcfe2566e-fernet-keys\") pod \"keystone-cron-29320381-pcwcm\" (UID: \"aa755a39-f3ae-49a4-80ce-a0efcfe2566e\") " pod="openstack/keystone-cron-29320381-pcwcm" Sep 30 09:01:00 crc kubenswrapper[4760]: I0930 09:01:00.372682 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7v24p\" (UniqueName: \"kubernetes.io/projected/aa755a39-f3ae-49a4-80ce-a0efcfe2566e-kube-api-access-7v24p\") pod \"keystone-cron-29320381-pcwcm\" (UID: \"aa755a39-f3ae-49a4-80ce-a0efcfe2566e\") " pod="openstack/keystone-cron-29320381-pcwcm" Sep 30 09:01:00 crc kubenswrapper[4760]: I0930 09:01:00.372725 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aa755a39-f3ae-49a4-80ce-a0efcfe2566e-combined-ca-bundle\") pod \"keystone-cron-29320381-pcwcm\" (UID: \"aa755a39-f3ae-49a4-80ce-a0efcfe2566e\") " pod="openstack/keystone-cron-29320381-pcwcm" Sep 30 09:01:00 crc 
kubenswrapper[4760]: I0930 09:01:00.378679 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/aa755a39-f3ae-49a4-80ce-a0efcfe2566e-fernet-keys\") pod \"keystone-cron-29320381-pcwcm\" (UID: \"aa755a39-f3ae-49a4-80ce-a0efcfe2566e\") " pod="openstack/keystone-cron-29320381-pcwcm" Sep 30 09:01:00 crc kubenswrapper[4760]: I0930 09:01:00.379483 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aa755a39-f3ae-49a4-80ce-a0efcfe2566e-combined-ca-bundle\") pod \"keystone-cron-29320381-pcwcm\" (UID: \"aa755a39-f3ae-49a4-80ce-a0efcfe2566e\") " pod="openstack/keystone-cron-29320381-pcwcm" Sep 30 09:01:00 crc kubenswrapper[4760]: I0930 09:01:00.381141 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aa755a39-f3ae-49a4-80ce-a0efcfe2566e-config-data\") pod \"keystone-cron-29320381-pcwcm\" (UID: \"aa755a39-f3ae-49a4-80ce-a0efcfe2566e\") " pod="openstack/keystone-cron-29320381-pcwcm" Sep 30 09:01:00 crc kubenswrapper[4760]: I0930 09:01:00.397321 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7v24p\" (UniqueName: \"kubernetes.io/projected/aa755a39-f3ae-49a4-80ce-a0efcfe2566e-kube-api-access-7v24p\") pod \"keystone-cron-29320381-pcwcm\" (UID: \"aa755a39-f3ae-49a4-80ce-a0efcfe2566e\") " pod="openstack/keystone-cron-29320381-pcwcm" Sep 30 09:01:00 crc kubenswrapper[4760]: I0930 09:01:00.503732 4760 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29320381-pcwcm" Sep 30 09:01:00 crc kubenswrapper[4760]: I0930 09:01:00.980165 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29320381-pcwcm"] Sep 30 09:01:01 crc kubenswrapper[4760]: I0930 09:01:01.272262 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29320381-pcwcm" event={"ID":"aa755a39-f3ae-49a4-80ce-a0efcfe2566e","Type":"ContainerStarted","Data":"da3a829483acae1aa7ea11391ce8e6c0a40582ec79faad8d6e77d3cab0fd5863"} Sep 30 09:01:01 crc kubenswrapper[4760]: I0930 09:01:01.272353 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29320381-pcwcm" event={"ID":"aa755a39-f3ae-49a4-80ce-a0efcfe2566e","Type":"ContainerStarted","Data":"a0dc18179b87e2a984944b3a6a2fd69d39b110a2abff1bb30cc6302c989a9359"} Sep 30 09:01:01 crc kubenswrapper[4760]: I0930 09:01:01.304420 4760 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-cron-29320381-pcwcm" podStartSLOduration=1.304384253 podStartE2EDuration="1.304384253s" podCreationTimestamp="2025-09-30 09:01:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 09:01:01.284986309 +0000 UTC m=+5246.927892761" watchObservedRunningTime="2025-09-30 09:01:01.304384253 +0000 UTC m=+5246.947290715" Sep 30 09:01:04 crc kubenswrapper[4760]: I0930 09:01:04.304630 4760 generic.go:334] "Generic (PLEG): container finished" podID="aa755a39-f3ae-49a4-80ce-a0efcfe2566e" containerID="da3a829483acae1aa7ea11391ce8e6c0a40582ec79faad8d6e77d3cab0fd5863" exitCode=0 Sep 30 09:01:04 crc kubenswrapper[4760]: I0930 09:01:04.304718 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29320381-pcwcm" 
event={"ID":"aa755a39-f3ae-49a4-80ce-a0efcfe2566e","Type":"ContainerDied","Data":"da3a829483acae1aa7ea11391ce8e6c0a40582ec79faad8d6e77d3cab0fd5863"} Sep 30 09:01:05 crc kubenswrapper[4760]: I0930 09:01:05.704982 4760 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29320381-pcwcm" Sep 30 09:01:05 crc kubenswrapper[4760]: I0930 09:01:05.793033 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7v24p\" (UniqueName: \"kubernetes.io/projected/aa755a39-f3ae-49a4-80ce-a0efcfe2566e-kube-api-access-7v24p\") pod \"aa755a39-f3ae-49a4-80ce-a0efcfe2566e\" (UID: \"aa755a39-f3ae-49a4-80ce-a0efcfe2566e\") " Sep 30 09:01:05 crc kubenswrapper[4760]: I0930 09:01:05.793332 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aa755a39-f3ae-49a4-80ce-a0efcfe2566e-config-data\") pod \"aa755a39-f3ae-49a4-80ce-a0efcfe2566e\" (UID: \"aa755a39-f3ae-49a4-80ce-a0efcfe2566e\") " Sep 30 09:01:05 crc kubenswrapper[4760]: I0930 09:01:05.793411 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/aa755a39-f3ae-49a4-80ce-a0efcfe2566e-fernet-keys\") pod \"aa755a39-f3ae-49a4-80ce-a0efcfe2566e\" (UID: \"aa755a39-f3ae-49a4-80ce-a0efcfe2566e\") " Sep 30 09:01:05 crc kubenswrapper[4760]: I0930 09:01:05.793564 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aa755a39-f3ae-49a4-80ce-a0efcfe2566e-combined-ca-bundle\") pod \"aa755a39-f3ae-49a4-80ce-a0efcfe2566e\" (UID: \"aa755a39-f3ae-49a4-80ce-a0efcfe2566e\") " Sep 30 09:01:05 crc kubenswrapper[4760]: I0930 09:01:05.801999 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aa755a39-f3ae-49a4-80ce-a0efcfe2566e-fernet-keys" (OuterVolumeSpecName: 
"fernet-keys") pod "aa755a39-f3ae-49a4-80ce-a0efcfe2566e" (UID: "aa755a39-f3ae-49a4-80ce-a0efcfe2566e"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 09:01:05 crc kubenswrapper[4760]: I0930 09:01:05.803177 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/aa755a39-f3ae-49a4-80ce-a0efcfe2566e-kube-api-access-7v24p" (OuterVolumeSpecName: "kube-api-access-7v24p") pod "aa755a39-f3ae-49a4-80ce-a0efcfe2566e" (UID: "aa755a39-f3ae-49a4-80ce-a0efcfe2566e"). InnerVolumeSpecName "kube-api-access-7v24p". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 09:01:05 crc kubenswrapper[4760]: I0930 09:01:05.829478 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aa755a39-f3ae-49a4-80ce-a0efcfe2566e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "aa755a39-f3ae-49a4-80ce-a0efcfe2566e" (UID: "aa755a39-f3ae-49a4-80ce-a0efcfe2566e"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 09:01:05 crc kubenswrapper[4760]: I0930 09:01:05.848434 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aa755a39-f3ae-49a4-80ce-a0efcfe2566e-config-data" (OuterVolumeSpecName: "config-data") pod "aa755a39-f3ae-49a4-80ce-a0efcfe2566e" (UID: "aa755a39-f3ae-49a4-80ce-a0efcfe2566e"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 09:01:05 crc kubenswrapper[4760]: I0930 09:01:05.895759 4760 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aa755a39-f3ae-49a4-80ce-a0efcfe2566e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 09:01:05 crc kubenswrapper[4760]: I0930 09:01:05.895798 4760 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7v24p\" (UniqueName: \"kubernetes.io/projected/aa755a39-f3ae-49a4-80ce-a0efcfe2566e-kube-api-access-7v24p\") on node \"crc\" DevicePath \"\"" Sep 30 09:01:05 crc kubenswrapper[4760]: I0930 09:01:05.895811 4760 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aa755a39-f3ae-49a4-80ce-a0efcfe2566e-config-data\") on node \"crc\" DevicePath \"\"" Sep 30 09:01:05 crc kubenswrapper[4760]: I0930 09:01:05.895822 4760 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/aa755a39-f3ae-49a4-80ce-a0efcfe2566e-fernet-keys\") on node \"crc\" DevicePath \"\"" Sep 30 09:01:06 crc kubenswrapper[4760]: I0930 09:01:06.327591 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29320381-pcwcm" event={"ID":"aa755a39-f3ae-49a4-80ce-a0efcfe2566e","Type":"ContainerDied","Data":"a0dc18179b87e2a984944b3a6a2fd69d39b110a2abff1bb30cc6302c989a9359"} Sep 30 09:01:06 crc kubenswrapper[4760]: I0930 09:01:06.327630 4760 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a0dc18179b87e2a984944b3a6a2fd69d39b110a2abff1bb30cc6302c989a9359" Sep 30 09:01:06 crc kubenswrapper[4760]: I0930 09:01:06.327684 4760 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29320381-pcwcm" Sep 30 09:01:09 crc kubenswrapper[4760]: I0930 09:01:09.068617 4760 scope.go:117] "RemoveContainer" containerID="bb3fab9e5573bca746e5903757fe3de80f4c152222a8ba46a5a998084bf0d4bc" Sep 30 09:01:09 crc kubenswrapper[4760]: E0930 09:01:09.079011 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f2lrk_openshift-machine-config-operator(7a9c8270-6964-4886-87d0-227b05b76da4)\"" pod="openshift-machine-config-operator/machine-config-daemon-f2lrk" podUID="7a9c8270-6964-4886-87d0-227b05b76da4" Sep 30 09:01:24 crc kubenswrapper[4760]: I0930 09:01:24.067650 4760 scope.go:117] "RemoveContainer" containerID="bb3fab9e5573bca746e5903757fe3de80f4c152222a8ba46a5a998084bf0d4bc" Sep 30 09:01:24 crc kubenswrapper[4760]: I0930 09:01:24.512244 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-f2lrk" event={"ID":"7a9c8270-6964-4886-87d0-227b05b76da4","Type":"ContainerStarted","Data":"f0ae7ba7d74c28a6e89df3e826469af812a29ed5c702a83eb2ce6b2043289c4e"} Sep 30 09:02:15 crc kubenswrapper[4760]: I0930 09:02:15.822045 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-r2cw8"] Sep 30 09:02:15 crc kubenswrapper[4760]: E0930 09:02:15.826492 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aa755a39-f3ae-49a4-80ce-a0efcfe2566e" containerName="keystone-cron" Sep 30 09:02:15 crc kubenswrapper[4760]: I0930 09:02:15.826630 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="aa755a39-f3ae-49a4-80ce-a0efcfe2566e" containerName="keystone-cron" Sep 30 09:02:15 crc kubenswrapper[4760]: I0930 09:02:15.827012 4760 memory_manager.go:354] "RemoveStaleState removing state" podUID="aa755a39-f3ae-49a4-80ce-a0efcfe2566e" 
containerName="keystone-cron" Sep 30 09:02:15 crc kubenswrapper[4760]: I0930 09:02:15.829114 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-r2cw8" Sep 30 09:02:15 crc kubenswrapper[4760]: I0930 09:02:15.842356 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-r2cw8"] Sep 30 09:02:15 crc kubenswrapper[4760]: I0930 09:02:15.890611 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8424j\" (UniqueName: \"kubernetes.io/projected/0a932665-8fe2-4db0-b09e-13e7209b8f14-kube-api-access-8424j\") pod \"community-operators-r2cw8\" (UID: \"0a932665-8fe2-4db0-b09e-13e7209b8f14\") " pod="openshift-marketplace/community-operators-r2cw8" Sep 30 09:02:15 crc kubenswrapper[4760]: I0930 09:02:15.890686 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0a932665-8fe2-4db0-b09e-13e7209b8f14-utilities\") pod \"community-operators-r2cw8\" (UID: \"0a932665-8fe2-4db0-b09e-13e7209b8f14\") " pod="openshift-marketplace/community-operators-r2cw8" Sep 30 09:02:15 crc kubenswrapper[4760]: I0930 09:02:15.890825 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0a932665-8fe2-4db0-b09e-13e7209b8f14-catalog-content\") pod \"community-operators-r2cw8\" (UID: \"0a932665-8fe2-4db0-b09e-13e7209b8f14\") " pod="openshift-marketplace/community-operators-r2cw8" Sep 30 09:02:15 crc kubenswrapper[4760]: I0930 09:02:15.993202 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0a932665-8fe2-4db0-b09e-13e7209b8f14-catalog-content\") pod \"community-operators-r2cw8\" (UID: \"0a932665-8fe2-4db0-b09e-13e7209b8f14\") " 
pod="openshift-marketplace/community-operators-r2cw8" Sep 30 09:02:15 crc kubenswrapper[4760]: I0930 09:02:15.993473 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8424j\" (UniqueName: \"kubernetes.io/projected/0a932665-8fe2-4db0-b09e-13e7209b8f14-kube-api-access-8424j\") pod \"community-operators-r2cw8\" (UID: \"0a932665-8fe2-4db0-b09e-13e7209b8f14\") " pod="openshift-marketplace/community-operators-r2cw8" Sep 30 09:02:15 crc kubenswrapper[4760]: I0930 09:02:15.993523 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0a932665-8fe2-4db0-b09e-13e7209b8f14-utilities\") pod \"community-operators-r2cw8\" (UID: \"0a932665-8fe2-4db0-b09e-13e7209b8f14\") " pod="openshift-marketplace/community-operators-r2cw8" Sep 30 09:02:15 crc kubenswrapper[4760]: I0930 09:02:15.993819 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0a932665-8fe2-4db0-b09e-13e7209b8f14-catalog-content\") pod \"community-operators-r2cw8\" (UID: \"0a932665-8fe2-4db0-b09e-13e7209b8f14\") " pod="openshift-marketplace/community-operators-r2cw8" Sep 30 09:02:15 crc kubenswrapper[4760]: I0930 09:02:15.993971 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0a932665-8fe2-4db0-b09e-13e7209b8f14-utilities\") pod \"community-operators-r2cw8\" (UID: \"0a932665-8fe2-4db0-b09e-13e7209b8f14\") " pod="openshift-marketplace/community-operators-r2cw8" Sep 30 09:02:16 crc kubenswrapper[4760]: I0930 09:02:16.017618 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8424j\" (UniqueName: \"kubernetes.io/projected/0a932665-8fe2-4db0-b09e-13e7209b8f14-kube-api-access-8424j\") pod \"community-operators-r2cw8\" (UID: \"0a932665-8fe2-4db0-b09e-13e7209b8f14\") " 
pod="openshift-marketplace/community-operators-r2cw8" Sep 30 09:02:16 crc kubenswrapper[4760]: I0930 09:02:16.160868 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-r2cw8" Sep 30 09:02:16 crc kubenswrapper[4760]: I0930 09:02:16.713491 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-r2cw8"] Sep 30 09:02:16 crc kubenswrapper[4760]: W0930 09:02:16.716737 4760 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0a932665_8fe2_4db0_b09e_13e7209b8f14.slice/crio-c5e1e237818981831a9cb48a0db8e36ea01990870b1217b0f218e566720c44c5 WatchSource:0}: Error finding container c5e1e237818981831a9cb48a0db8e36ea01990870b1217b0f218e566720c44c5: Status 404 returned error can't find the container with id c5e1e237818981831a9cb48a0db8e36ea01990870b1217b0f218e566720c44c5 Sep 30 09:02:17 crc kubenswrapper[4760]: I0930 09:02:17.113479 4760 generic.go:334] "Generic (PLEG): container finished" podID="0a932665-8fe2-4db0-b09e-13e7209b8f14" containerID="81c533b8a162e28792f71cb3b5b2f3763af1625cf381470ce22c6bcd56cdf2d5" exitCode=0 Sep 30 09:02:17 crc kubenswrapper[4760]: I0930 09:02:17.113549 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-r2cw8" event={"ID":"0a932665-8fe2-4db0-b09e-13e7209b8f14","Type":"ContainerDied","Data":"81c533b8a162e28792f71cb3b5b2f3763af1625cf381470ce22c6bcd56cdf2d5"} Sep 30 09:02:17 crc kubenswrapper[4760]: I0930 09:02:17.113818 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-r2cw8" event={"ID":"0a932665-8fe2-4db0-b09e-13e7209b8f14","Type":"ContainerStarted","Data":"c5e1e237818981831a9cb48a0db8e36ea01990870b1217b0f218e566720c44c5"} Sep 30 09:02:19 crc kubenswrapper[4760]: I0930 09:02:19.140171 4760 generic.go:334] "Generic (PLEG): container finished" 
podID="0a932665-8fe2-4db0-b09e-13e7209b8f14" containerID="7698e76a1b9bc0f48344b5bfbe792c6302cf52cca973d06404bc6a5d62a8edcf" exitCode=0 Sep 30 09:02:19 crc kubenswrapper[4760]: I0930 09:02:19.140633 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-r2cw8" event={"ID":"0a932665-8fe2-4db0-b09e-13e7209b8f14","Type":"ContainerDied","Data":"7698e76a1b9bc0f48344b5bfbe792c6302cf52cca973d06404bc6a5d62a8edcf"} Sep 30 09:02:21 crc kubenswrapper[4760]: I0930 09:02:21.163552 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-r2cw8" event={"ID":"0a932665-8fe2-4db0-b09e-13e7209b8f14","Type":"ContainerStarted","Data":"ae2661013ebdca3b4be6c4f4bae53f31d32620a03bbfe593accb363283a948a0"} Sep 30 09:02:26 crc kubenswrapper[4760]: I0930 09:02:26.161672 4760 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-r2cw8" Sep 30 09:02:26 crc kubenswrapper[4760]: I0930 09:02:26.162400 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-r2cw8" Sep 30 09:02:26 crc kubenswrapper[4760]: I0930 09:02:26.243797 4760 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-r2cw8" Sep 30 09:02:26 crc kubenswrapper[4760]: I0930 09:02:26.288749 4760 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-r2cw8" podStartSLOduration=8.536212532 podStartE2EDuration="11.288721236s" podCreationTimestamp="2025-09-30 09:02:15 +0000 UTC" firstStartedPulling="2025-09-30 09:02:17.116198402 +0000 UTC m=+5322.759104854" lastFinishedPulling="2025-09-30 09:02:19.868707136 +0000 UTC m=+5325.511613558" observedRunningTime="2025-09-30 09:02:21.18613485 +0000 UTC m=+5326.829041272" watchObservedRunningTime="2025-09-30 09:02:26.288721236 +0000 UTC m=+5331.931627658" Sep 30 
09:02:26 crc kubenswrapper[4760]: I0930 09:02:26.301435 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-r2cw8" Sep 30 09:02:26 crc kubenswrapper[4760]: I0930 09:02:26.498594 4760 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-r2cw8"] Sep 30 09:02:28 crc kubenswrapper[4760]: I0930 09:02:28.251871 4760 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-r2cw8" podUID="0a932665-8fe2-4db0-b09e-13e7209b8f14" containerName="registry-server" containerID="cri-o://ae2661013ebdca3b4be6c4f4bae53f31d32620a03bbfe593accb363283a948a0" gracePeriod=2 Sep 30 09:02:28 crc kubenswrapper[4760]: I0930 09:02:28.863386 4760 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-r2cw8" Sep 30 09:02:28 crc kubenswrapper[4760]: I0930 09:02:28.904197 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8424j\" (UniqueName: \"kubernetes.io/projected/0a932665-8fe2-4db0-b09e-13e7209b8f14-kube-api-access-8424j\") pod \"0a932665-8fe2-4db0-b09e-13e7209b8f14\" (UID: \"0a932665-8fe2-4db0-b09e-13e7209b8f14\") " Sep 30 09:02:28 crc kubenswrapper[4760]: I0930 09:02:28.904362 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0a932665-8fe2-4db0-b09e-13e7209b8f14-utilities\") pod \"0a932665-8fe2-4db0-b09e-13e7209b8f14\" (UID: \"0a932665-8fe2-4db0-b09e-13e7209b8f14\") " Sep 30 09:02:28 crc kubenswrapper[4760]: I0930 09:02:28.904542 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0a932665-8fe2-4db0-b09e-13e7209b8f14-catalog-content\") pod \"0a932665-8fe2-4db0-b09e-13e7209b8f14\" (UID: \"0a932665-8fe2-4db0-b09e-13e7209b8f14\") " Sep 
30 09:02:28 crc kubenswrapper[4760]: I0930 09:02:28.905395 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0a932665-8fe2-4db0-b09e-13e7209b8f14-utilities" (OuterVolumeSpecName: "utilities") pod "0a932665-8fe2-4db0-b09e-13e7209b8f14" (UID: "0a932665-8fe2-4db0-b09e-13e7209b8f14"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 09:02:28 crc kubenswrapper[4760]: I0930 09:02:28.911484 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0a932665-8fe2-4db0-b09e-13e7209b8f14-kube-api-access-8424j" (OuterVolumeSpecName: "kube-api-access-8424j") pod "0a932665-8fe2-4db0-b09e-13e7209b8f14" (UID: "0a932665-8fe2-4db0-b09e-13e7209b8f14"). InnerVolumeSpecName "kube-api-access-8424j". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 09:02:28 crc kubenswrapper[4760]: I0930 09:02:28.957769 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0a932665-8fe2-4db0-b09e-13e7209b8f14-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "0a932665-8fe2-4db0-b09e-13e7209b8f14" (UID: "0a932665-8fe2-4db0-b09e-13e7209b8f14"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 09:02:29 crc kubenswrapper[4760]: I0930 09:02:29.006212 4760 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0a932665-8fe2-4db0-b09e-13e7209b8f14-catalog-content\") on node \"crc\" DevicePath \"\"" Sep 30 09:02:29 crc kubenswrapper[4760]: I0930 09:02:29.006245 4760 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8424j\" (UniqueName: \"kubernetes.io/projected/0a932665-8fe2-4db0-b09e-13e7209b8f14-kube-api-access-8424j\") on node \"crc\" DevicePath \"\"" Sep 30 09:02:29 crc kubenswrapper[4760]: I0930 09:02:29.006259 4760 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0a932665-8fe2-4db0-b09e-13e7209b8f14-utilities\") on node \"crc\" DevicePath \"\"" Sep 30 09:02:29 crc kubenswrapper[4760]: I0930 09:02:29.265965 4760 generic.go:334] "Generic (PLEG): container finished" podID="0a932665-8fe2-4db0-b09e-13e7209b8f14" containerID="ae2661013ebdca3b4be6c4f4bae53f31d32620a03bbfe593accb363283a948a0" exitCode=0 Sep 30 09:02:29 crc kubenswrapper[4760]: I0930 09:02:29.266010 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-r2cw8" event={"ID":"0a932665-8fe2-4db0-b09e-13e7209b8f14","Type":"ContainerDied","Data":"ae2661013ebdca3b4be6c4f4bae53f31d32620a03bbfe593accb363283a948a0"} Sep 30 09:02:29 crc kubenswrapper[4760]: I0930 09:02:29.266047 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-r2cw8" event={"ID":"0a932665-8fe2-4db0-b09e-13e7209b8f14","Type":"ContainerDied","Data":"c5e1e237818981831a9cb48a0db8e36ea01990870b1217b0f218e566720c44c5"} Sep 30 09:02:29 crc kubenswrapper[4760]: I0930 09:02:29.266065 4760 scope.go:117] "RemoveContainer" containerID="ae2661013ebdca3b4be6c4f4bae53f31d32620a03bbfe593accb363283a948a0" Sep 30 09:02:29 crc kubenswrapper[4760]: I0930 
09:02:29.266058 4760 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-r2cw8" Sep 30 09:02:29 crc kubenswrapper[4760]: I0930 09:02:29.297155 4760 scope.go:117] "RemoveContainer" containerID="7698e76a1b9bc0f48344b5bfbe792c6302cf52cca973d06404bc6a5d62a8edcf" Sep 30 09:02:29 crc kubenswrapper[4760]: I0930 09:02:29.309262 4760 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-r2cw8"] Sep 30 09:02:29 crc kubenswrapper[4760]: I0930 09:02:29.324665 4760 scope.go:117] "RemoveContainer" containerID="81c533b8a162e28792f71cb3b5b2f3763af1625cf381470ce22c6bcd56cdf2d5" Sep 30 09:02:29 crc kubenswrapper[4760]: I0930 09:02:29.326655 4760 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-r2cw8"] Sep 30 09:02:29 crc kubenswrapper[4760]: I0930 09:02:29.381434 4760 scope.go:117] "RemoveContainer" containerID="ae2661013ebdca3b4be6c4f4bae53f31d32620a03bbfe593accb363283a948a0" Sep 30 09:02:29 crc kubenswrapper[4760]: E0930 09:02:29.381980 4760 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ae2661013ebdca3b4be6c4f4bae53f31d32620a03bbfe593accb363283a948a0\": container with ID starting with ae2661013ebdca3b4be6c4f4bae53f31d32620a03bbfe593accb363283a948a0 not found: ID does not exist" containerID="ae2661013ebdca3b4be6c4f4bae53f31d32620a03bbfe593accb363283a948a0" Sep 30 09:02:29 crc kubenswrapper[4760]: I0930 09:02:29.382024 4760 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ae2661013ebdca3b4be6c4f4bae53f31d32620a03bbfe593accb363283a948a0"} err="failed to get container status \"ae2661013ebdca3b4be6c4f4bae53f31d32620a03bbfe593accb363283a948a0\": rpc error: code = NotFound desc = could not find container \"ae2661013ebdca3b4be6c4f4bae53f31d32620a03bbfe593accb363283a948a0\": container with ID starting with 
ae2661013ebdca3b4be6c4f4bae53f31d32620a03bbfe593accb363283a948a0 not found: ID does not exist" Sep 30 09:02:29 crc kubenswrapper[4760]: I0930 09:02:29.382046 4760 scope.go:117] "RemoveContainer" containerID="7698e76a1b9bc0f48344b5bfbe792c6302cf52cca973d06404bc6a5d62a8edcf" Sep 30 09:02:29 crc kubenswrapper[4760]: E0930 09:02:29.382488 4760 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7698e76a1b9bc0f48344b5bfbe792c6302cf52cca973d06404bc6a5d62a8edcf\": container with ID starting with 7698e76a1b9bc0f48344b5bfbe792c6302cf52cca973d06404bc6a5d62a8edcf not found: ID does not exist" containerID="7698e76a1b9bc0f48344b5bfbe792c6302cf52cca973d06404bc6a5d62a8edcf" Sep 30 09:02:29 crc kubenswrapper[4760]: I0930 09:02:29.382658 4760 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7698e76a1b9bc0f48344b5bfbe792c6302cf52cca973d06404bc6a5d62a8edcf"} err="failed to get container status \"7698e76a1b9bc0f48344b5bfbe792c6302cf52cca973d06404bc6a5d62a8edcf\": rpc error: code = NotFound desc = could not find container \"7698e76a1b9bc0f48344b5bfbe792c6302cf52cca973d06404bc6a5d62a8edcf\": container with ID starting with 7698e76a1b9bc0f48344b5bfbe792c6302cf52cca973d06404bc6a5d62a8edcf not found: ID does not exist" Sep 30 09:02:29 crc kubenswrapper[4760]: I0930 09:02:29.382800 4760 scope.go:117] "RemoveContainer" containerID="81c533b8a162e28792f71cb3b5b2f3763af1625cf381470ce22c6bcd56cdf2d5" Sep 30 09:02:29 crc kubenswrapper[4760]: E0930 09:02:29.383427 4760 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"81c533b8a162e28792f71cb3b5b2f3763af1625cf381470ce22c6bcd56cdf2d5\": container with ID starting with 81c533b8a162e28792f71cb3b5b2f3763af1625cf381470ce22c6bcd56cdf2d5 not found: ID does not exist" containerID="81c533b8a162e28792f71cb3b5b2f3763af1625cf381470ce22c6bcd56cdf2d5" Sep 30 09:02:29 crc 
kubenswrapper[4760]: I0930 09:02:29.383484 4760 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"81c533b8a162e28792f71cb3b5b2f3763af1625cf381470ce22c6bcd56cdf2d5"} err="failed to get container status \"81c533b8a162e28792f71cb3b5b2f3763af1625cf381470ce22c6bcd56cdf2d5\": rpc error: code = NotFound desc = could not find container \"81c533b8a162e28792f71cb3b5b2f3763af1625cf381470ce22c6bcd56cdf2d5\": container with ID starting with 81c533b8a162e28792f71cb3b5b2f3763af1625cf381470ce22c6bcd56cdf2d5 not found: ID does not exist"
Sep 30 09:02:31 crc kubenswrapper[4760]: I0930 09:02:31.087286 4760 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0a932665-8fe2-4db0-b09e-13e7209b8f14" path="/var/lib/kubelet/pods/0a932665-8fe2-4db0-b09e-13e7209b8f14/volumes"
Sep 30 09:02:59 crc kubenswrapper[4760]: I0930 09:02:59.944520 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-j47v9"]
Sep 30 09:02:59 crc kubenswrapper[4760]: E0930 09:02:59.945542 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0a932665-8fe2-4db0-b09e-13e7209b8f14" containerName="registry-server"
Sep 30 09:02:59 crc kubenswrapper[4760]: I0930 09:02:59.945555 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="0a932665-8fe2-4db0-b09e-13e7209b8f14" containerName="registry-server"
Sep 30 09:02:59 crc kubenswrapper[4760]: E0930 09:02:59.945567 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0a932665-8fe2-4db0-b09e-13e7209b8f14" containerName="extract-utilities"
Sep 30 09:02:59 crc kubenswrapper[4760]: I0930 09:02:59.945573 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="0a932665-8fe2-4db0-b09e-13e7209b8f14" containerName="extract-utilities"
Sep 30 09:02:59 crc kubenswrapper[4760]: E0930 09:02:59.945602 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0a932665-8fe2-4db0-b09e-13e7209b8f14" containerName="extract-content"
Sep 30 09:02:59 crc kubenswrapper[4760]: I0930 09:02:59.945609 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="0a932665-8fe2-4db0-b09e-13e7209b8f14" containerName="extract-content"
Sep 30 09:02:59 crc kubenswrapper[4760]: I0930 09:02:59.945801 4760 memory_manager.go:354] "RemoveStaleState removing state" podUID="0a932665-8fe2-4db0-b09e-13e7209b8f14" containerName="registry-server"
Sep 30 09:02:59 crc kubenswrapper[4760]: I0930 09:02:59.947501 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-j47v9"
Sep 30 09:02:59 crc kubenswrapper[4760]: I0930 09:02:59.964673 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-j47v9"]
Sep 30 09:03:00 crc kubenswrapper[4760]: I0930 09:03:00.123226 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/715ef8cd-1191-42e3-967d-43b1b50fcb22-utilities\") pod \"redhat-operators-j47v9\" (UID: \"715ef8cd-1191-42e3-967d-43b1b50fcb22\") " pod="openshift-marketplace/redhat-operators-j47v9"
Sep 30 09:03:00 crc kubenswrapper[4760]: I0930 09:03:00.123381 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v4dc4\" (UniqueName: \"kubernetes.io/projected/715ef8cd-1191-42e3-967d-43b1b50fcb22-kube-api-access-v4dc4\") pod \"redhat-operators-j47v9\" (UID: \"715ef8cd-1191-42e3-967d-43b1b50fcb22\") " pod="openshift-marketplace/redhat-operators-j47v9"
Sep 30 09:03:00 crc kubenswrapper[4760]: I0930 09:03:00.123450 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/715ef8cd-1191-42e3-967d-43b1b50fcb22-catalog-content\") pod \"redhat-operators-j47v9\" (UID: \"715ef8cd-1191-42e3-967d-43b1b50fcb22\") " pod="openshift-marketplace/redhat-operators-j47v9"
Sep 30 09:03:00 crc kubenswrapper[4760]: I0930 09:03:00.225146 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/715ef8cd-1191-42e3-967d-43b1b50fcb22-utilities\") pod \"redhat-operators-j47v9\" (UID: \"715ef8cd-1191-42e3-967d-43b1b50fcb22\") " pod="openshift-marketplace/redhat-operators-j47v9"
Sep 30 09:03:00 crc kubenswrapper[4760]: I0930 09:03:00.225292 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v4dc4\" (UniqueName: \"kubernetes.io/projected/715ef8cd-1191-42e3-967d-43b1b50fcb22-kube-api-access-v4dc4\") pod \"redhat-operators-j47v9\" (UID: \"715ef8cd-1191-42e3-967d-43b1b50fcb22\") " pod="openshift-marketplace/redhat-operators-j47v9"
Sep 30 09:03:00 crc kubenswrapper[4760]: I0930 09:03:00.225367 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/715ef8cd-1191-42e3-967d-43b1b50fcb22-catalog-content\") pod \"redhat-operators-j47v9\" (UID: \"715ef8cd-1191-42e3-967d-43b1b50fcb22\") " pod="openshift-marketplace/redhat-operators-j47v9"
Sep 30 09:03:00 crc kubenswrapper[4760]: I0930 09:03:00.225787 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/715ef8cd-1191-42e3-967d-43b1b50fcb22-utilities\") pod \"redhat-operators-j47v9\" (UID: \"715ef8cd-1191-42e3-967d-43b1b50fcb22\") " pod="openshift-marketplace/redhat-operators-j47v9"
Sep 30 09:03:00 crc kubenswrapper[4760]: I0930 09:03:00.226030 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/715ef8cd-1191-42e3-967d-43b1b50fcb22-catalog-content\") pod \"redhat-operators-j47v9\" (UID: \"715ef8cd-1191-42e3-967d-43b1b50fcb22\") " pod="openshift-marketplace/redhat-operators-j47v9"
Sep 30 09:03:00 crc kubenswrapper[4760]: I0930 09:03:00.248349 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v4dc4\" (UniqueName: \"kubernetes.io/projected/715ef8cd-1191-42e3-967d-43b1b50fcb22-kube-api-access-v4dc4\") pod \"redhat-operators-j47v9\" (UID: \"715ef8cd-1191-42e3-967d-43b1b50fcb22\") " pod="openshift-marketplace/redhat-operators-j47v9"
Sep 30 09:03:00 crc kubenswrapper[4760]: I0930 09:03:00.347986 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-j47v9"
Sep 30 09:03:00 crc kubenswrapper[4760]: I0930 09:03:00.803584 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-j47v9"]
Sep 30 09:03:01 crc kubenswrapper[4760]: I0930 09:03:01.645234 4760 generic.go:334] "Generic (PLEG): container finished" podID="715ef8cd-1191-42e3-967d-43b1b50fcb22" containerID="6e16229680499cca3f093ef40e761c6bfa6b239579de0a34a0eb70cc22141337" exitCode=0
Sep 30 09:03:01 crc kubenswrapper[4760]: I0930 09:03:01.645357 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-j47v9" event={"ID":"715ef8cd-1191-42e3-967d-43b1b50fcb22","Type":"ContainerDied","Data":"6e16229680499cca3f093ef40e761c6bfa6b239579de0a34a0eb70cc22141337"}
Sep 30 09:03:01 crc kubenswrapper[4760]: I0930 09:03:01.645657 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-j47v9" event={"ID":"715ef8cd-1191-42e3-967d-43b1b50fcb22","Type":"ContainerStarted","Data":"6cd40c086523a6c45136337d064c509ed718a92ff8b56f629548ad9c7a1aa3e1"}
Sep 30 09:03:01 crc kubenswrapper[4760]: I0930 09:03:01.648424 4760 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Sep 30 09:03:02 crc kubenswrapper[4760]: I0930 09:03:02.656046 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-j47v9" event={"ID":"715ef8cd-1191-42e3-967d-43b1b50fcb22","Type":"ContainerStarted","Data":"18df63ba3e3ce1a4fb098cf7669a8353be7f860607f955bd2b16e55390c19dca"}
Sep 30 09:03:03 crc kubenswrapper[4760]: I0930 09:03:03.664273 4760 generic.go:334] "Generic (PLEG): container finished" podID="715ef8cd-1191-42e3-967d-43b1b50fcb22" containerID="18df63ba3e3ce1a4fb098cf7669a8353be7f860607f955bd2b16e55390c19dca" exitCode=0
Sep 30 09:03:03 crc kubenswrapper[4760]: I0930 09:03:03.664340 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-j47v9" event={"ID":"715ef8cd-1191-42e3-967d-43b1b50fcb22","Type":"ContainerDied","Data":"18df63ba3e3ce1a4fb098cf7669a8353be7f860607f955bd2b16e55390c19dca"}
Sep 30 09:03:05 crc kubenswrapper[4760]: I0930 09:03:05.683814 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-j47v9" event={"ID":"715ef8cd-1191-42e3-967d-43b1b50fcb22","Type":"ContainerStarted","Data":"d83e39e9bb3683550d4fce2e260a423b67b0f51a20711d2e12fd8665483d297d"}
Sep 30 09:03:05 crc kubenswrapper[4760]: I0930 09:03:05.706814 4760 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-j47v9" podStartSLOduration=3.701593875 podStartE2EDuration="6.706795491s" podCreationTimestamp="2025-09-30 09:02:59 +0000 UTC" firstStartedPulling="2025-09-30 09:03:01.647855002 +0000 UTC m=+5367.290761444" lastFinishedPulling="2025-09-30 09:03:04.653056658 +0000 UTC m=+5370.295963060" observedRunningTime="2025-09-30 09:03:05.704209916 +0000 UTC m=+5371.347116328" watchObservedRunningTime="2025-09-30 09:03:05.706795491 +0000 UTC m=+5371.349701903"
Sep 30 09:03:10 crc kubenswrapper[4760]: I0930 09:03:10.348577 4760 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-j47v9"
Sep 30 09:03:10 crc kubenswrapper[4760]: I0930 09:03:10.349335 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-j47v9"
Sep 30 09:03:10 crc kubenswrapper[4760]: I0930 09:03:10.401688 4760 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-j47v9"
Sep 30 09:03:10 crc kubenswrapper[4760]: I0930 09:03:10.823824 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-j47v9"
Sep 30 09:03:10 crc kubenswrapper[4760]: I0930 09:03:10.892878 4760 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-j47v9"]
Sep 30 09:03:12 crc kubenswrapper[4760]: I0930 09:03:12.773737 4760 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-j47v9" podUID="715ef8cd-1191-42e3-967d-43b1b50fcb22" containerName="registry-server" containerID="cri-o://d83e39e9bb3683550d4fce2e260a423b67b0f51a20711d2e12fd8665483d297d" gracePeriod=2
Sep 30 09:03:13 crc kubenswrapper[4760]: I0930 09:03:13.354181 4760 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-j47v9"
Sep 30 09:03:13 crc kubenswrapper[4760]: I0930 09:03:13.504482 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/715ef8cd-1191-42e3-967d-43b1b50fcb22-catalog-content\") pod \"715ef8cd-1191-42e3-967d-43b1b50fcb22\" (UID: \"715ef8cd-1191-42e3-967d-43b1b50fcb22\") "
Sep 30 09:03:13 crc kubenswrapper[4760]: I0930 09:03:13.504673 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v4dc4\" (UniqueName: \"kubernetes.io/projected/715ef8cd-1191-42e3-967d-43b1b50fcb22-kube-api-access-v4dc4\") pod \"715ef8cd-1191-42e3-967d-43b1b50fcb22\" (UID: \"715ef8cd-1191-42e3-967d-43b1b50fcb22\") "
Sep 30 09:03:13 crc kubenswrapper[4760]: I0930 09:03:13.504747 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/715ef8cd-1191-42e3-967d-43b1b50fcb22-utilities\") pod \"715ef8cd-1191-42e3-967d-43b1b50fcb22\" (UID: \"715ef8cd-1191-42e3-967d-43b1b50fcb22\") "
Sep 30 09:03:13 crc kubenswrapper[4760]: I0930 09:03:13.506095 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/715ef8cd-1191-42e3-967d-43b1b50fcb22-utilities" (OuterVolumeSpecName: "utilities") pod "715ef8cd-1191-42e3-967d-43b1b50fcb22" (UID: "715ef8cd-1191-42e3-967d-43b1b50fcb22"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Sep 30 09:03:13 crc kubenswrapper[4760]: I0930 09:03:13.512908 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/715ef8cd-1191-42e3-967d-43b1b50fcb22-kube-api-access-v4dc4" (OuterVolumeSpecName: "kube-api-access-v4dc4") pod "715ef8cd-1191-42e3-967d-43b1b50fcb22" (UID: "715ef8cd-1191-42e3-967d-43b1b50fcb22"). InnerVolumeSpecName "kube-api-access-v4dc4". PluginName "kubernetes.io/projected", VolumeGidValue ""
Sep 30 09:03:13 crc kubenswrapper[4760]: I0930 09:03:13.596390 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/715ef8cd-1191-42e3-967d-43b1b50fcb22-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "715ef8cd-1191-42e3-967d-43b1b50fcb22" (UID: "715ef8cd-1191-42e3-967d-43b1b50fcb22"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Sep 30 09:03:13 crc kubenswrapper[4760]: I0930 09:03:13.607344 4760 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/715ef8cd-1191-42e3-967d-43b1b50fcb22-catalog-content\") on node \"crc\" DevicePath \"\""
Sep 30 09:03:13 crc kubenswrapper[4760]: I0930 09:03:13.607384 4760 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v4dc4\" (UniqueName: \"kubernetes.io/projected/715ef8cd-1191-42e3-967d-43b1b50fcb22-kube-api-access-v4dc4\") on node \"crc\" DevicePath \"\""
Sep 30 09:03:13 crc kubenswrapper[4760]: I0930 09:03:13.607396 4760 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/715ef8cd-1191-42e3-967d-43b1b50fcb22-utilities\") on node \"crc\" DevicePath \"\""
Sep 30 09:03:13 crc kubenswrapper[4760]: I0930 09:03:13.791040 4760 generic.go:334] "Generic (PLEG): container finished" podID="715ef8cd-1191-42e3-967d-43b1b50fcb22" containerID="d83e39e9bb3683550d4fce2e260a423b67b0f51a20711d2e12fd8665483d297d" exitCode=0
Sep 30 09:03:13 crc kubenswrapper[4760]: I0930 09:03:13.791098 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-j47v9" event={"ID":"715ef8cd-1191-42e3-967d-43b1b50fcb22","Type":"ContainerDied","Data":"d83e39e9bb3683550d4fce2e260a423b67b0f51a20711d2e12fd8665483d297d"}
Sep 30 09:03:13 crc kubenswrapper[4760]: I0930 09:03:13.791153 4760 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-j47v9"
Sep 30 09:03:13 crc kubenswrapper[4760]: I0930 09:03:13.791178 4760 scope.go:117] "RemoveContainer" containerID="d83e39e9bb3683550d4fce2e260a423b67b0f51a20711d2e12fd8665483d297d"
Sep 30 09:03:13 crc kubenswrapper[4760]: I0930 09:03:13.791164 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-j47v9" event={"ID":"715ef8cd-1191-42e3-967d-43b1b50fcb22","Type":"ContainerDied","Data":"6cd40c086523a6c45136337d064c509ed718a92ff8b56f629548ad9c7a1aa3e1"}
Sep 30 09:03:13 crc kubenswrapper[4760]: I0930 09:03:13.826661 4760 scope.go:117] "RemoveContainer" containerID="18df63ba3e3ce1a4fb098cf7669a8353be7f860607f955bd2b16e55390c19dca"
Sep 30 09:03:13 crc kubenswrapper[4760]: I0930 09:03:13.874663 4760 scope.go:117] "RemoveContainer" containerID="6e16229680499cca3f093ef40e761c6bfa6b239579de0a34a0eb70cc22141337"
Sep 30 09:03:13 crc kubenswrapper[4760]: I0930 09:03:13.876607 4760 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-j47v9"]
Sep 30 09:03:13 crc kubenswrapper[4760]: I0930 09:03:13.887944 4760 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-j47v9"]
Sep 30 09:03:13 crc kubenswrapper[4760]: I0930 09:03:13.916502 4760 scope.go:117] "RemoveContainer" containerID="d83e39e9bb3683550d4fce2e260a423b67b0f51a20711d2e12fd8665483d297d"
Sep 30 09:03:13 crc kubenswrapper[4760]: E0930 09:03:13.916925 4760 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d83e39e9bb3683550d4fce2e260a423b67b0f51a20711d2e12fd8665483d297d\": container with ID starting with d83e39e9bb3683550d4fce2e260a423b67b0f51a20711d2e12fd8665483d297d not found: ID does not exist" containerID="d83e39e9bb3683550d4fce2e260a423b67b0f51a20711d2e12fd8665483d297d"
Sep 30 09:03:13 crc kubenswrapper[4760]: I0930 09:03:13.916987 4760 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d83e39e9bb3683550d4fce2e260a423b67b0f51a20711d2e12fd8665483d297d"} err="failed to get container status \"d83e39e9bb3683550d4fce2e260a423b67b0f51a20711d2e12fd8665483d297d\": rpc error: code = NotFound desc = could not find container \"d83e39e9bb3683550d4fce2e260a423b67b0f51a20711d2e12fd8665483d297d\": container with ID starting with d83e39e9bb3683550d4fce2e260a423b67b0f51a20711d2e12fd8665483d297d not found: ID does not exist"
Sep 30 09:03:13 crc kubenswrapper[4760]: I0930 09:03:13.917023 4760 scope.go:117] "RemoveContainer" containerID="18df63ba3e3ce1a4fb098cf7669a8353be7f860607f955bd2b16e55390c19dca"
Sep 30 09:03:13 crc kubenswrapper[4760]: E0930 09:03:13.917492 4760 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"18df63ba3e3ce1a4fb098cf7669a8353be7f860607f955bd2b16e55390c19dca\": container with ID starting with 18df63ba3e3ce1a4fb098cf7669a8353be7f860607f955bd2b16e55390c19dca not found: ID does not exist" containerID="18df63ba3e3ce1a4fb098cf7669a8353be7f860607f955bd2b16e55390c19dca"
Sep 30 09:03:13 crc kubenswrapper[4760]: I0930 09:03:13.917524 4760 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"18df63ba3e3ce1a4fb098cf7669a8353be7f860607f955bd2b16e55390c19dca"} err="failed to get container status \"18df63ba3e3ce1a4fb098cf7669a8353be7f860607f955bd2b16e55390c19dca\": rpc error: code = NotFound desc = could not find container \"18df63ba3e3ce1a4fb098cf7669a8353be7f860607f955bd2b16e55390c19dca\": container with ID starting with 18df63ba3e3ce1a4fb098cf7669a8353be7f860607f955bd2b16e55390c19dca not found: ID does not exist"
Sep 30 09:03:13 crc kubenswrapper[4760]: I0930 09:03:13.917577 4760 scope.go:117] "RemoveContainer" containerID="6e16229680499cca3f093ef40e761c6bfa6b239579de0a34a0eb70cc22141337"
Sep 30 09:03:13 crc kubenswrapper[4760]: E0930 09:03:13.917840 4760 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6e16229680499cca3f093ef40e761c6bfa6b239579de0a34a0eb70cc22141337\": container with ID starting with 6e16229680499cca3f093ef40e761c6bfa6b239579de0a34a0eb70cc22141337 not found: ID does not exist" containerID="6e16229680499cca3f093ef40e761c6bfa6b239579de0a34a0eb70cc22141337"
Sep 30 09:03:13 crc kubenswrapper[4760]: I0930 09:03:13.917897 4760 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6e16229680499cca3f093ef40e761c6bfa6b239579de0a34a0eb70cc22141337"} err="failed to get container status \"6e16229680499cca3f093ef40e761c6bfa6b239579de0a34a0eb70cc22141337\": rpc error: code = NotFound desc = could not find container \"6e16229680499cca3f093ef40e761c6bfa6b239579de0a34a0eb70cc22141337\": container with ID starting with 6e16229680499cca3f093ef40e761c6bfa6b239579de0a34a0eb70cc22141337 not found: ID does not exist"
Sep 30 09:03:15 crc kubenswrapper[4760]: I0930 09:03:15.086742 4760 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="715ef8cd-1191-42e3-967d-43b1b50fcb22" path="/var/lib/kubelet/pods/715ef8cd-1191-42e3-967d-43b1b50fcb22/volumes"
Sep 30 09:03:49 crc kubenswrapper[4760]: I0930 09:03:49.112897 4760 patch_prober.go:28] interesting pod/machine-config-daemon-f2lrk container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Sep 30 09:03:49 crc kubenswrapper[4760]: I0930 09:03:49.113688 4760 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-f2lrk" podUID="7a9c8270-6964-4886-87d0-227b05b76da4" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Sep 30 09:04:19 crc kubenswrapper[4760]: I0930 09:04:19.113406 4760 patch_prober.go:28] interesting pod/machine-config-daemon-f2lrk container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Sep 30 09:04:19 crc kubenswrapper[4760]: I0930 09:04:19.115928 4760 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-f2lrk" podUID="7a9c8270-6964-4886-87d0-227b05b76da4" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Sep 30 09:04:49 crc kubenswrapper[4760]: I0930 09:04:49.112918 4760 patch_prober.go:28] interesting pod/machine-config-daemon-f2lrk container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Sep 30 09:04:49 crc kubenswrapper[4760]: I0930 09:04:49.113535 4760 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-f2lrk" podUID="7a9c8270-6964-4886-87d0-227b05b76da4" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Sep 30 09:04:49 crc kubenswrapper[4760]: I0930 09:04:49.113585 4760 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-f2lrk"
Sep 30 09:04:49 crc kubenswrapper[4760]: I0930 09:04:49.114432 4760 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"f0ae7ba7d74c28a6e89df3e826469af812a29ed5c702a83eb2ce6b2043289c4e"} pod="openshift-machine-config-operator/machine-config-daemon-f2lrk" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Sep 30 09:04:49 crc kubenswrapper[4760]: I0930 09:04:49.114499 4760 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-f2lrk" podUID="7a9c8270-6964-4886-87d0-227b05b76da4" containerName="machine-config-daemon" containerID="cri-o://f0ae7ba7d74c28a6e89df3e826469af812a29ed5c702a83eb2ce6b2043289c4e" gracePeriod=600
Sep 30 09:04:49 crc kubenswrapper[4760]: I0930 09:04:49.977111 4760 generic.go:334] "Generic (PLEG): container finished" podID="7a9c8270-6964-4886-87d0-227b05b76da4" containerID="f0ae7ba7d74c28a6e89df3e826469af812a29ed5c702a83eb2ce6b2043289c4e" exitCode=0
Sep 30 09:04:49 crc kubenswrapper[4760]: I0930 09:04:49.977170 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-f2lrk" event={"ID":"7a9c8270-6964-4886-87d0-227b05b76da4","Type":"ContainerDied","Data":"f0ae7ba7d74c28a6e89df3e826469af812a29ed5c702a83eb2ce6b2043289c4e"}
Sep 30 09:04:49 crc kubenswrapper[4760]: I0930 09:04:49.977565 4760 scope.go:117] "RemoveContainer" containerID="bb3fab9e5573bca746e5903757fe3de80f4c152222a8ba46a5a998084bf0d4bc"
Sep 30 09:04:50 crc kubenswrapper[4760]: I0930 09:04:50.988951 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-f2lrk" event={"ID":"7a9c8270-6964-4886-87d0-227b05b76da4","Type":"ContainerStarted","Data":"505728dcf5cec3fc994c058239c0150ac7db870b33dd3a6ddbc1c1699cc9b079"}
Sep 30 09:07:08 crc kubenswrapper[4760]: I0930 09:07:08.595897 4760 generic.go:334] "Generic (PLEG): container finished" podID="394b8542-fe18-475f-9374-ce5c7e3820e7" containerID="a5489eda4b2de9c64278d0bb594e161c51f0e89f024771ff637c7cb465e09b5b" exitCode=0
Sep 30 09:07:08 crc kubenswrapper[4760]: I0930 09:07:08.596373 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"394b8542-fe18-475f-9374-ce5c7e3820e7","Type":"ContainerDied","Data":"a5489eda4b2de9c64278d0bb594e161c51f0e89f024771ff637c7cb465e09b5b"}
Sep 30 09:07:09 crc kubenswrapper[4760]: I0930 09:07:09.959346 4760 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/tempest-tests-tempest"
Sep 30 09:07:10 crc kubenswrapper[4760]: I0930 09:07:10.088282 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-whvpw\" (UniqueName: \"kubernetes.io/projected/394b8542-fe18-475f-9374-ce5c7e3820e7-kube-api-access-whvpw\") pod \"394b8542-fe18-475f-9374-ce5c7e3820e7\" (UID: \"394b8542-fe18-475f-9374-ce5c7e3820e7\") "
Sep 30 09:07:10 crc kubenswrapper[4760]: I0930 09:07:10.088784 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/394b8542-fe18-475f-9374-ce5c7e3820e7-test-operator-ephemeral-temporary\") pod \"394b8542-fe18-475f-9374-ce5c7e3820e7\" (UID: \"394b8542-fe18-475f-9374-ce5c7e3820e7\") "
Sep 30 09:07:10 crc kubenswrapper[4760]: I0930 09:07:10.088858 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/394b8542-fe18-475f-9374-ce5c7e3820e7-ssh-key\") pod \"394b8542-fe18-475f-9374-ce5c7e3820e7\" (UID: \"394b8542-fe18-475f-9374-ce5c7e3820e7\") "
Sep 30 09:07:10 crc kubenswrapper[4760]: I0930 09:07:10.088987 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/394b8542-fe18-475f-9374-ce5c7e3820e7-test-operator-ephemeral-workdir\") pod \"394b8542-fe18-475f-9374-ce5c7e3820e7\" (UID: \"394b8542-fe18-475f-9374-ce5c7e3820e7\") "
Sep 30 09:07:10 crc kubenswrapper[4760]: I0930 09:07:10.089050 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/394b8542-fe18-475f-9374-ce5c7e3820e7-ca-certs\") pod \"394b8542-fe18-475f-9374-ce5c7e3820e7\" (UID: \"394b8542-fe18-475f-9374-ce5c7e3820e7\") "
Sep 30 09:07:10 crc kubenswrapper[4760]: I0930 09:07:10.089069 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-logs\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"394b8542-fe18-475f-9374-ce5c7e3820e7\" (UID: \"394b8542-fe18-475f-9374-ce5c7e3820e7\") "
Sep 30 09:07:10 crc kubenswrapper[4760]: I0930 09:07:10.089161 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/394b8542-fe18-475f-9374-ce5c7e3820e7-config-data\") pod \"394b8542-fe18-475f-9374-ce5c7e3820e7\" (UID: \"394b8542-fe18-475f-9374-ce5c7e3820e7\") "
Sep 30 09:07:10 crc kubenswrapper[4760]: I0930 09:07:10.089190 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/394b8542-fe18-475f-9374-ce5c7e3820e7-openstack-config\") pod \"394b8542-fe18-475f-9374-ce5c7e3820e7\" (UID: \"394b8542-fe18-475f-9374-ce5c7e3820e7\") "
Sep 30 09:07:10 crc kubenswrapper[4760]: I0930 09:07:10.089247 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/394b8542-fe18-475f-9374-ce5c7e3820e7-openstack-config-secret\") pod \"394b8542-fe18-475f-9374-ce5c7e3820e7\" (UID: \"394b8542-fe18-475f-9374-ce5c7e3820e7\") "
Sep 30 09:07:10 crc kubenswrapper[4760]: I0930 09:07:10.089660 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/394b8542-fe18-475f-9374-ce5c7e3820e7-test-operator-ephemeral-temporary" (OuterVolumeSpecName: "test-operator-ephemeral-temporary") pod "394b8542-fe18-475f-9374-ce5c7e3820e7" (UID: "394b8542-fe18-475f-9374-ce5c7e3820e7"). InnerVolumeSpecName "test-operator-ephemeral-temporary". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Sep 30 09:07:10 crc kubenswrapper[4760]: I0930 09:07:10.089851 4760 reconciler_common.go:293] "Volume detached for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/394b8542-fe18-475f-9374-ce5c7e3820e7-test-operator-ephemeral-temporary\") on node \"crc\" DevicePath \"\""
Sep 30 09:07:10 crc kubenswrapper[4760]: I0930 09:07:10.091914 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/394b8542-fe18-475f-9374-ce5c7e3820e7-config-data" (OuterVolumeSpecName: "config-data") pod "394b8542-fe18-475f-9374-ce5c7e3820e7" (UID: "394b8542-fe18-475f-9374-ce5c7e3820e7"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Sep 30 09:07:10 crc kubenswrapper[4760]: I0930 09:07:10.096179 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage08-crc" (OuterVolumeSpecName: "test-operator-logs") pod "394b8542-fe18-475f-9374-ce5c7e3820e7" (UID: "394b8542-fe18-475f-9374-ce5c7e3820e7"). InnerVolumeSpecName "local-storage08-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue ""
Sep 30 09:07:10 crc kubenswrapper[4760]: I0930 09:07:10.113385 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/394b8542-fe18-475f-9374-ce5c7e3820e7-kube-api-access-whvpw" (OuterVolumeSpecName: "kube-api-access-whvpw") pod "394b8542-fe18-475f-9374-ce5c7e3820e7" (UID: "394b8542-fe18-475f-9374-ce5c7e3820e7"). InnerVolumeSpecName "kube-api-access-whvpw". PluginName "kubernetes.io/projected", VolumeGidValue ""
Sep 30 09:07:10 crc kubenswrapper[4760]: I0930 09:07:10.123948 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/394b8542-fe18-475f-9374-ce5c7e3820e7-openstack-config-secret" (OuterVolumeSpecName: "openstack-config-secret") pod "394b8542-fe18-475f-9374-ce5c7e3820e7" (UID: "394b8542-fe18-475f-9374-ce5c7e3820e7"). InnerVolumeSpecName "openstack-config-secret". PluginName "kubernetes.io/secret", VolumeGidValue ""
Sep 30 09:07:10 crc kubenswrapper[4760]: I0930 09:07:10.138355 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/394b8542-fe18-475f-9374-ce5c7e3820e7-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "394b8542-fe18-475f-9374-ce5c7e3820e7" (UID: "394b8542-fe18-475f-9374-ce5c7e3820e7"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue ""
Sep 30 09:07:10 crc kubenswrapper[4760]: I0930 09:07:10.141992 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/394b8542-fe18-475f-9374-ce5c7e3820e7-ca-certs" (OuterVolumeSpecName: "ca-certs") pod "394b8542-fe18-475f-9374-ce5c7e3820e7" (UID: "394b8542-fe18-475f-9374-ce5c7e3820e7"). InnerVolumeSpecName "ca-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Sep 30 09:07:10 crc kubenswrapper[4760]: I0930 09:07:10.153903 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/394b8542-fe18-475f-9374-ce5c7e3820e7-openstack-config" (OuterVolumeSpecName: "openstack-config") pod "394b8542-fe18-475f-9374-ce5c7e3820e7" (UID: "394b8542-fe18-475f-9374-ce5c7e3820e7"). InnerVolumeSpecName "openstack-config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Sep 30 09:07:10 crc kubenswrapper[4760]: I0930 09:07:10.195594 4760 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/394b8542-fe18-475f-9374-ce5c7e3820e7-ssh-key\") on node \"crc\" DevicePath \"\""
Sep 30 09:07:10 crc kubenswrapper[4760]: I0930 09:07:10.195657 4760 reconciler_common.go:293] "Volume detached for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/394b8542-fe18-475f-9374-ce5c7e3820e7-ca-certs\") on node \"crc\" DevicePath \"\""
Sep 30 09:07:10 crc kubenswrapper[4760]: I0930 09:07:10.195693 4760 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") on node \"crc\" "
Sep 30 09:07:10 crc kubenswrapper[4760]: I0930 09:07:10.195706 4760 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/394b8542-fe18-475f-9374-ce5c7e3820e7-config-data\") on node \"crc\" DevicePath \"\""
Sep 30 09:07:10 crc kubenswrapper[4760]: I0930 09:07:10.207474 4760 reconciler_common.go:293] "Volume detached for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/394b8542-fe18-475f-9374-ce5c7e3820e7-openstack-config\") on node \"crc\" DevicePath \"\""
Sep 30 09:07:10 crc kubenswrapper[4760]: I0930 09:07:10.207505 4760 reconciler_common.go:293] "Volume detached for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/394b8542-fe18-475f-9374-ce5c7e3820e7-openstack-config-secret\") on node \"crc\" DevicePath \"\""
Sep 30 09:07:10 crc kubenswrapper[4760]: I0930 09:07:10.207524 4760 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-whvpw\" (UniqueName: \"kubernetes.io/projected/394b8542-fe18-475f-9374-ce5c7e3820e7-kube-api-access-whvpw\") on node \"crc\" DevicePath \"\""
Sep 30 09:07:10 crc kubenswrapper[4760]: I0930 09:07:10.229193 4760 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage08-crc" (UniqueName: "kubernetes.io/local-volume/local-storage08-crc") on node "crc"
Sep 30 09:07:10 crc kubenswrapper[4760]: I0930 09:07:10.310211 4760 reconciler_common.go:293] "Volume detached for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") on node \"crc\" DevicePath \"\""
Sep 30 09:07:10 crc kubenswrapper[4760]: I0930 09:07:10.458218 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/394b8542-fe18-475f-9374-ce5c7e3820e7-test-operator-ephemeral-workdir" (OuterVolumeSpecName: "test-operator-ephemeral-workdir") pod "394b8542-fe18-475f-9374-ce5c7e3820e7" (UID: "394b8542-fe18-475f-9374-ce5c7e3820e7"). InnerVolumeSpecName "test-operator-ephemeral-workdir". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Sep 30 09:07:10 crc kubenswrapper[4760]: I0930 09:07:10.514696 4760 reconciler_common.go:293] "Volume detached for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/394b8542-fe18-475f-9374-ce5c7e3820e7-test-operator-ephemeral-workdir\") on node \"crc\" DevicePath \"\""
Sep 30 09:07:10 crc kubenswrapper[4760]: I0930 09:07:10.617967 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"394b8542-fe18-475f-9374-ce5c7e3820e7","Type":"ContainerDied","Data":"ab6cc4e73a67cebb20574aff849f85f2641d5b4a1c900236cbf073b06206a87b"}
Sep 30 09:07:10 crc kubenswrapper[4760]: I0930 09:07:10.618002 4760 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ab6cc4e73a67cebb20574aff849f85f2641d5b4a1c900236cbf073b06206a87b"
Sep 30 09:07:10 crc kubenswrapper[4760]: I0930 09:07:10.618017 4760 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/tempest-tests-tempest"
Sep 30 09:07:12 crc kubenswrapper[4760]: I0930 09:07:12.749957 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"]
Sep 30 09:07:12 crc kubenswrapper[4760]: E0930 09:07:12.750975 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="394b8542-fe18-475f-9374-ce5c7e3820e7" containerName="tempest-tests-tempest-tests-runner"
Sep 30 09:07:12 crc kubenswrapper[4760]: I0930 09:07:12.750993 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="394b8542-fe18-475f-9374-ce5c7e3820e7" containerName="tempest-tests-tempest-tests-runner"
Sep 30 09:07:12 crc kubenswrapper[4760]: E0930 09:07:12.751010 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="715ef8cd-1191-42e3-967d-43b1b50fcb22" containerName="extract-utilities"
Sep 30 09:07:12 crc kubenswrapper[4760]: I0930 09:07:12.751018 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="715ef8cd-1191-42e3-967d-43b1b50fcb22" containerName="extract-utilities"
Sep 30 09:07:12 crc kubenswrapper[4760]: E0930 09:07:12.751052 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="715ef8cd-1191-42e3-967d-43b1b50fcb22" containerName="extract-content"
Sep 30 09:07:12 crc kubenswrapper[4760]: I0930 09:07:12.751062 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="715ef8cd-1191-42e3-967d-43b1b50fcb22" containerName="extract-content"
Sep 30 09:07:12 crc kubenswrapper[4760]: E0930 09:07:12.751098 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="715ef8cd-1191-42e3-967d-43b1b50fcb22" containerName="registry-server"
Sep 30 09:07:12 crc kubenswrapper[4760]: I0930 09:07:12.751104 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="715ef8cd-1191-42e3-967d-43b1b50fcb22" containerName="registry-server"
Sep 30 09:07:12 crc kubenswrapper[4760]: I0930 09:07:12.751370 4760 memory_manager.go:354] "RemoveStaleState removing
state" podUID="715ef8cd-1191-42e3-967d-43b1b50fcb22" containerName="registry-server" Sep 30 09:07:12 crc kubenswrapper[4760]: I0930 09:07:12.751392 4760 memory_manager.go:354] "RemoveStaleState removing state" podUID="394b8542-fe18-475f-9374-ce5c7e3820e7" containerName="tempest-tests-tempest-tests-runner" Sep 30 09:07:12 crc kubenswrapper[4760]: I0930 09:07:12.752108 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Sep 30 09:07:12 crc kubenswrapper[4760]: I0930 09:07:12.762241 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-shxsl" Sep 30 09:07:12 crc kubenswrapper[4760]: I0930 09:07:12.768768 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"] Sep 30 09:07:12 crc kubenswrapper[4760]: I0930 09:07:12.861455 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2txmr\" (UniqueName: \"kubernetes.io/projected/bb327f16-8c82-4829-b400-a5917094069f-kube-api-access-2txmr\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"bb327f16-8c82-4829-b400-a5917094069f\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Sep 30 09:07:12 crc kubenswrapper[4760]: I0930 09:07:12.861542 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"bb327f16-8c82-4829-b400-a5917094069f\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Sep 30 09:07:12 crc kubenswrapper[4760]: I0930 09:07:12.963718 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2txmr\" (UniqueName: 
\"kubernetes.io/projected/bb327f16-8c82-4829-b400-a5917094069f-kube-api-access-2txmr\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"bb327f16-8c82-4829-b400-a5917094069f\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Sep 30 09:07:12 crc kubenswrapper[4760]: I0930 09:07:12.963851 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"bb327f16-8c82-4829-b400-a5917094069f\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Sep 30 09:07:12 crc kubenswrapper[4760]: I0930 09:07:12.964567 4760 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"bb327f16-8c82-4829-b400-a5917094069f\") device mount path \"/mnt/openstack/pv08\"" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Sep 30 09:07:12 crc kubenswrapper[4760]: I0930 09:07:12.987658 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2txmr\" (UniqueName: \"kubernetes.io/projected/bb327f16-8c82-4829-b400-a5917094069f-kube-api-access-2txmr\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"bb327f16-8c82-4829-b400-a5917094069f\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Sep 30 09:07:13 crc kubenswrapper[4760]: I0930 09:07:13.007598 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"bb327f16-8c82-4829-b400-a5917094069f\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Sep 30 09:07:13 
crc kubenswrapper[4760]: I0930 09:07:13.072673 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Sep 30 09:07:13 crc kubenswrapper[4760]: I0930 09:07:13.665001 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"] Sep 30 09:07:14 crc kubenswrapper[4760]: I0930 09:07:14.670722 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" event={"ID":"bb327f16-8c82-4829-b400-a5917094069f","Type":"ContainerStarted","Data":"39d99225efa587cb55b85cb0d9e078c95147327a0f5f9f1979ed89cdfe059848"} Sep 30 09:07:15 crc kubenswrapper[4760]: I0930 09:07:15.684876 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" event={"ID":"bb327f16-8c82-4829-b400-a5917094069f","Type":"ContainerStarted","Data":"550f25cb8d45e04d658d90470f290bf3bb00327aa1e139fda918082df04d7cb1"} Sep 30 09:07:15 crc kubenswrapper[4760]: I0930 09:07:15.710495 4760 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" podStartSLOduration=2.498388686 podStartE2EDuration="3.710473907s" podCreationTimestamp="2025-09-30 09:07:12 +0000 UTC" firstStartedPulling="2025-09-30 09:07:13.685139416 +0000 UTC m=+5619.328045828" lastFinishedPulling="2025-09-30 09:07:14.897224597 +0000 UTC m=+5620.540131049" observedRunningTime="2025-09-30 09:07:15.704978087 +0000 UTC m=+5621.347884559" watchObservedRunningTime="2025-09-30 09:07:15.710473907 +0000 UTC m=+5621.353380329" Sep 30 09:07:19 crc kubenswrapper[4760]: I0930 09:07:19.112837 4760 patch_prober.go:28] interesting pod/machine-config-daemon-f2lrk container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial 
tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 09:07:19 crc kubenswrapper[4760]: I0930 09:07:19.113620 4760 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-f2lrk" podUID="7a9c8270-6964-4886-87d0-227b05b76da4" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 30 09:07:33 crc kubenswrapper[4760]: I0930 09:07:33.266665 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-sjdgg/must-gather-wvnqh"] Sep 30 09:07:33 crc kubenswrapper[4760]: I0930 09:07:33.269612 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-sjdgg/must-gather-wvnqh" Sep 30 09:07:33 crc kubenswrapper[4760]: I0930 09:07:33.272632 4760 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-sjdgg"/"openshift-service-ca.crt" Sep 30 09:07:33 crc kubenswrapper[4760]: I0930 09:07:33.272978 4760 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-sjdgg"/"kube-root-ca.crt" Sep 30 09:07:33 crc kubenswrapper[4760]: I0930 09:07:33.275881 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-sjdgg/must-gather-wvnqh"] Sep 30 09:07:33 crc kubenswrapper[4760]: I0930 09:07:33.278972 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-sjdgg"/"default-dockercfg-x56gb" Sep 30 09:07:33 crc kubenswrapper[4760]: I0930 09:07:33.313868 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/7db4983c-9c7f-45b4-847d-c01071cf4c48-must-gather-output\") pod \"must-gather-wvnqh\" (UID: \"7db4983c-9c7f-45b4-847d-c01071cf4c48\") " pod="openshift-must-gather-sjdgg/must-gather-wvnqh" Sep 30 09:07:33 crc kubenswrapper[4760]: 
I0930 09:07:33.313911 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5d57l\" (UniqueName: \"kubernetes.io/projected/7db4983c-9c7f-45b4-847d-c01071cf4c48-kube-api-access-5d57l\") pod \"must-gather-wvnqh\" (UID: \"7db4983c-9c7f-45b4-847d-c01071cf4c48\") " pod="openshift-must-gather-sjdgg/must-gather-wvnqh" Sep 30 09:07:33 crc kubenswrapper[4760]: I0930 09:07:33.415787 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/7db4983c-9c7f-45b4-847d-c01071cf4c48-must-gather-output\") pod \"must-gather-wvnqh\" (UID: \"7db4983c-9c7f-45b4-847d-c01071cf4c48\") " pod="openshift-must-gather-sjdgg/must-gather-wvnqh" Sep 30 09:07:33 crc kubenswrapper[4760]: I0930 09:07:33.415834 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5d57l\" (UniqueName: \"kubernetes.io/projected/7db4983c-9c7f-45b4-847d-c01071cf4c48-kube-api-access-5d57l\") pod \"must-gather-wvnqh\" (UID: \"7db4983c-9c7f-45b4-847d-c01071cf4c48\") " pod="openshift-must-gather-sjdgg/must-gather-wvnqh" Sep 30 09:07:33 crc kubenswrapper[4760]: I0930 09:07:33.417267 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/7db4983c-9c7f-45b4-847d-c01071cf4c48-must-gather-output\") pod \"must-gather-wvnqh\" (UID: \"7db4983c-9c7f-45b4-847d-c01071cf4c48\") " pod="openshift-must-gather-sjdgg/must-gather-wvnqh" Sep 30 09:07:33 crc kubenswrapper[4760]: I0930 09:07:33.435115 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5d57l\" (UniqueName: \"kubernetes.io/projected/7db4983c-9c7f-45b4-847d-c01071cf4c48-kube-api-access-5d57l\") pod \"must-gather-wvnqh\" (UID: \"7db4983c-9c7f-45b4-847d-c01071cf4c48\") " pod="openshift-must-gather-sjdgg/must-gather-wvnqh" Sep 30 09:07:33 crc kubenswrapper[4760]: I0930 
09:07:33.595659 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-sjdgg/must-gather-wvnqh" Sep 30 09:07:34 crc kubenswrapper[4760]: I0930 09:07:34.075811 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-sjdgg/must-gather-wvnqh"] Sep 30 09:07:34 crc kubenswrapper[4760]: I0930 09:07:34.915113 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-sjdgg/must-gather-wvnqh" event={"ID":"7db4983c-9c7f-45b4-847d-c01071cf4c48","Type":"ContainerStarted","Data":"0fbf3ecf571760041196c70dca2f2d4115ca45051a867a0fa2794ada0f15a4f5"} Sep 30 09:07:38 crc kubenswrapper[4760]: I0930 09:07:38.963902 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-sjdgg/must-gather-wvnqh" event={"ID":"7db4983c-9c7f-45b4-847d-c01071cf4c48","Type":"ContainerStarted","Data":"9b2e7cc7d231ebf8d3d776c12f2df9b9e1d4487a565e367258b5945a8054b142"} Sep 30 09:07:38 crc kubenswrapper[4760]: I0930 09:07:38.964775 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-sjdgg/must-gather-wvnqh" event={"ID":"7db4983c-9c7f-45b4-847d-c01071cf4c48","Type":"ContainerStarted","Data":"518cb33ba9e23c7b18db81504ce9e92e03d66cbec027c61bed6f33c9873a6aec"} Sep 30 09:07:38 crc kubenswrapper[4760]: I0930 09:07:38.986794 4760 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-sjdgg/must-gather-wvnqh" podStartSLOduration=1.794925876 podStartE2EDuration="5.986774409s" podCreationTimestamp="2025-09-30 09:07:33 +0000 UTC" firstStartedPulling="2025-09-30 09:07:34.08366547 +0000 UTC m=+5639.726571912" lastFinishedPulling="2025-09-30 09:07:38.275514023 +0000 UTC m=+5643.918420445" observedRunningTime="2025-09-30 09:07:38.978831937 +0000 UTC m=+5644.621738349" watchObservedRunningTime="2025-09-30 09:07:38.986774409 +0000 UTC m=+5644.629680821" Sep 30 09:07:43 crc kubenswrapper[4760]: I0930 09:07:43.406711 4760 kubelet.go:2421] 
"SyncLoop ADD" source="api" pods=["openshift-must-gather-sjdgg/crc-debug-tzvbh"] Sep 30 09:07:43 crc kubenswrapper[4760]: I0930 09:07:43.415078 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-sjdgg/crc-debug-tzvbh" Sep 30 09:07:43 crc kubenswrapper[4760]: I0930 09:07:43.541917 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/a7326a8e-4da0-47cf-bfd3-d9084fb32702-host\") pod \"crc-debug-tzvbh\" (UID: \"a7326a8e-4da0-47cf-bfd3-d9084fb32702\") " pod="openshift-must-gather-sjdgg/crc-debug-tzvbh" Sep 30 09:07:43 crc kubenswrapper[4760]: I0930 09:07:43.542442 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9wvhj\" (UniqueName: \"kubernetes.io/projected/a7326a8e-4da0-47cf-bfd3-d9084fb32702-kube-api-access-9wvhj\") pod \"crc-debug-tzvbh\" (UID: \"a7326a8e-4da0-47cf-bfd3-d9084fb32702\") " pod="openshift-must-gather-sjdgg/crc-debug-tzvbh" Sep 30 09:07:43 crc kubenswrapper[4760]: I0930 09:07:43.644682 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9wvhj\" (UniqueName: \"kubernetes.io/projected/a7326a8e-4da0-47cf-bfd3-d9084fb32702-kube-api-access-9wvhj\") pod \"crc-debug-tzvbh\" (UID: \"a7326a8e-4da0-47cf-bfd3-d9084fb32702\") " pod="openshift-must-gather-sjdgg/crc-debug-tzvbh" Sep 30 09:07:43 crc kubenswrapper[4760]: I0930 09:07:43.644837 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/a7326a8e-4da0-47cf-bfd3-d9084fb32702-host\") pod \"crc-debug-tzvbh\" (UID: \"a7326a8e-4da0-47cf-bfd3-d9084fb32702\") " pod="openshift-must-gather-sjdgg/crc-debug-tzvbh" Sep 30 09:07:43 crc kubenswrapper[4760]: I0930 09:07:43.644951 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: 
\"kubernetes.io/host-path/a7326a8e-4da0-47cf-bfd3-d9084fb32702-host\") pod \"crc-debug-tzvbh\" (UID: \"a7326a8e-4da0-47cf-bfd3-d9084fb32702\") " pod="openshift-must-gather-sjdgg/crc-debug-tzvbh" Sep 30 09:07:43 crc kubenswrapper[4760]: I0930 09:07:43.665667 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9wvhj\" (UniqueName: \"kubernetes.io/projected/a7326a8e-4da0-47cf-bfd3-d9084fb32702-kube-api-access-9wvhj\") pod \"crc-debug-tzvbh\" (UID: \"a7326a8e-4da0-47cf-bfd3-d9084fb32702\") " pod="openshift-must-gather-sjdgg/crc-debug-tzvbh" Sep 30 09:07:43 crc kubenswrapper[4760]: I0930 09:07:43.746832 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-sjdgg/crc-debug-tzvbh" Sep 30 09:07:44 crc kubenswrapper[4760]: I0930 09:07:44.011777 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-sjdgg/crc-debug-tzvbh" event={"ID":"a7326a8e-4da0-47cf-bfd3-d9084fb32702","Type":"ContainerStarted","Data":"81a604a2a49a57b5a047a650c36263fd98ed13005e353e6d5ad1c78f124a8c9d"} Sep 30 09:07:49 crc kubenswrapper[4760]: I0930 09:07:49.112849 4760 patch_prober.go:28] interesting pod/machine-config-daemon-f2lrk container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 09:07:49 crc kubenswrapper[4760]: I0930 09:07:49.113376 4760 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-f2lrk" podUID="7a9c8270-6964-4886-87d0-227b05b76da4" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 30 09:07:55 crc kubenswrapper[4760]: I0930 09:07:55.122244 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-must-gather-sjdgg/crc-debug-tzvbh" event={"ID":"a7326a8e-4da0-47cf-bfd3-d9084fb32702","Type":"ContainerStarted","Data":"b4f41e3bc512b3212ea9f8e41aadbc661dd84f47ad67be4147ac357a8717711a"} Sep 30 09:07:55 crc kubenswrapper[4760]: I0930 09:07:55.142422 4760 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-sjdgg/crc-debug-tzvbh" podStartSLOduration=1.86210321 podStartE2EDuration="12.142399983s" podCreationTimestamp="2025-09-30 09:07:43 +0000 UTC" firstStartedPulling="2025-09-30 09:07:43.793561178 +0000 UTC m=+5649.436467590" lastFinishedPulling="2025-09-30 09:07:54.073857951 +0000 UTC m=+5659.716764363" observedRunningTime="2025-09-30 09:07:55.133578118 +0000 UTC m=+5660.776484540" watchObservedRunningTime="2025-09-30 09:07:55.142399983 +0000 UTC m=+5660.785306395" Sep 30 09:08:00 crc kubenswrapper[4760]: I0930 09:08:00.313018 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-wjjnr"] Sep 30 09:08:00 crc kubenswrapper[4760]: I0930 09:08:00.315565 4760 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-wjjnr" Sep 30 09:08:00 crc kubenswrapper[4760]: I0930 09:08:00.331220 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-wjjnr"] Sep 30 09:08:00 crc kubenswrapper[4760]: I0930 09:08:00.379605 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/877b4e69-9337-48d0-b822-933a3a44a869-catalog-content\") pod \"redhat-marketplace-wjjnr\" (UID: \"877b4e69-9337-48d0-b822-933a3a44a869\") " pod="openshift-marketplace/redhat-marketplace-wjjnr" Sep 30 09:08:00 crc kubenswrapper[4760]: I0930 09:08:00.380062 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/877b4e69-9337-48d0-b822-933a3a44a869-utilities\") pod \"redhat-marketplace-wjjnr\" (UID: \"877b4e69-9337-48d0-b822-933a3a44a869\") " pod="openshift-marketplace/redhat-marketplace-wjjnr" Sep 30 09:08:00 crc kubenswrapper[4760]: I0930 09:08:00.380228 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jm77j\" (UniqueName: \"kubernetes.io/projected/877b4e69-9337-48d0-b822-933a3a44a869-kube-api-access-jm77j\") pod \"redhat-marketplace-wjjnr\" (UID: \"877b4e69-9337-48d0-b822-933a3a44a869\") " pod="openshift-marketplace/redhat-marketplace-wjjnr" Sep 30 09:08:00 crc kubenswrapper[4760]: I0930 09:08:00.481907 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/877b4e69-9337-48d0-b822-933a3a44a869-catalog-content\") pod \"redhat-marketplace-wjjnr\" (UID: \"877b4e69-9337-48d0-b822-933a3a44a869\") " pod="openshift-marketplace/redhat-marketplace-wjjnr" Sep 30 09:08:00 crc kubenswrapper[4760]: I0930 09:08:00.482251 4760 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/877b4e69-9337-48d0-b822-933a3a44a869-utilities\") pod \"redhat-marketplace-wjjnr\" (UID: \"877b4e69-9337-48d0-b822-933a3a44a869\") " pod="openshift-marketplace/redhat-marketplace-wjjnr" Sep 30 09:08:00 crc kubenswrapper[4760]: I0930 09:08:00.482400 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jm77j\" (UniqueName: \"kubernetes.io/projected/877b4e69-9337-48d0-b822-933a3a44a869-kube-api-access-jm77j\") pod \"redhat-marketplace-wjjnr\" (UID: \"877b4e69-9337-48d0-b822-933a3a44a869\") " pod="openshift-marketplace/redhat-marketplace-wjjnr" Sep 30 09:08:00 crc kubenswrapper[4760]: I0930 09:08:00.482527 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/877b4e69-9337-48d0-b822-933a3a44a869-catalog-content\") pod \"redhat-marketplace-wjjnr\" (UID: \"877b4e69-9337-48d0-b822-933a3a44a869\") " pod="openshift-marketplace/redhat-marketplace-wjjnr" Sep 30 09:08:00 crc kubenswrapper[4760]: I0930 09:08:00.482778 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/877b4e69-9337-48d0-b822-933a3a44a869-utilities\") pod \"redhat-marketplace-wjjnr\" (UID: \"877b4e69-9337-48d0-b822-933a3a44a869\") " pod="openshift-marketplace/redhat-marketplace-wjjnr" Sep 30 09:08:00 crc kubenswrapper[4760]: I0930 09:08:00.502626 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jm77j\" (UniqueName: \"kubernetes.io/projected/877b4e69-9337-48d0-b822-933a3a44a869-kube-api-access-jm77j\") pod \"redhat-marketplace-wjjnr\" (UID: \"877b4e69-9337-48d0-b822-933a3a44a869\") " pod="openshift-marketplace/redhat-marketplace-wjjnr" Sep 30 09:08:00 crc kubenswrapper[4760]: I0930 09:08:00.674275 4760 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-wjjnr" Sep 30 09:08:01 crc kubenswrapper[4760]: I0930 09:08:01.162631 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-wjjnr"] Sep 30 09:08:03 crc kubenswrapper[4760]: I0930 09:08:03.191034 4760 generic.go:334] "Generic (PLEG): container finished" podID="877b4e69-9337-48d0-b822-933a3a44a869" containerID="0148b6f0d763f97f2ef95dfc9693eec24060ecdf4e903058425a55f62403ac86" exitCode=0 Sep 30 09:08:03 crc kubenswrapper[4760]: I0930 09:08:03.191737 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wjjnr" event={"ID":"877b4e69-9337-48d0-b822-933a3a44a869","Type":"ContainerDied","Data":"0148b6f0d763f97f2ef95dfc9693eec24060ecdf4e903058425a55f62403ac86"} Sep 30 09:08:03 crc kubenswrapper[4760]: I0930 09:08:03.191777 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wjjnr" event={"ID":"877b4e69-9337-48d0-b822-933a3a44a869","Type":"ContainerStarted","Data":"a938fda54e98dd7bb1cb4266094e90ba04ce5a52bc5a8ab0111ca14b95fd11bd"} Sep 30 09:08:03 crc kubenswrapper[4760]: I0930 09:08:03.200523 4760 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Sep 30 09:08:05 crc kubenswrapper[4760]: I0930 09:08:05.217585 4760 generic.go:334] "Generic (PLEG): container finished" podID="877b4e69-9337-48d0-b822-933a3a44a869" containerID="054257328d005f1773f8ddc7d4076563e8ec67ddbd93519f80e48d055e64b45d" exitCode=0 Sep 30 09:08:05 crc kubenswrapper[4760]: I0930 09:08:05.217831 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wjjnr" event={"ID":"877b4e69-9337-48d0-b822-933a3a44a869","Type":"ContainerDied","Data":"054257328d005f1773f8ddc7d4076563e8ec67ddbd93519f80e48d055e64b45d"} Sep 30 09:08:06 crc kubenswrapper[4760]: I0930 09:08:06.265476 4760 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-marketplace/redhat-marketplace-wjjnr" event={"ID":"877b4e69-9337-48d0-b822-933a3a44a869","Type":"ContainerStarted","Data":"37f9195b8ae2921118fb90e9f1db01eb4282268e2fec15afe949821093fa2110"} Sep 30 09:08:06 crc kubenswrapper[4760]: I0930 09:08:06.304565 4760 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-wjjnr" podStartSLOduration=3.796156735 podStartE2EDuration="6.304544159s" podCreationTimestamp="2025-09-30 09:08:00 +0000 UTC" firstStartedPulling="2025-09-30 09:08:03.199935254 +0000 UTC m=+5668.842841706" lastFinishedPulling="2025-09-30 09:08:05.708322718 +0000 UTC m=+5671.351229130" observedRunningTime="2025-09-30 09:08:06.284765976 +0000 UTC m=+5671.927672388" watchObservedRunningTime="2025-09-30 09:08:06.304544159 +0000 UTC m=+5671.947450581" Sep 30 09:08:07 crc kubenswrapper[4760]: I0930 09:08:07.109678 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-nlqfl"] Sep 30 09:08:07 crc kubenswrapper[4760]: I0930 09:08:07.112293 4760 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-nlqfl" Sep 30 09:08:07 crc kubenswrapper[4760]: I0930 09:08:07.124989 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-nlqfl"] Sep 30 09:08:07 crc kubenswrapper[4760]: I0930 09:08:07.225573 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c82793a5-07f2-4daf-9413-8bd85869e93e-utilities\") pod \"certified-operators-nlqfl\" (UID: \"c82793a5-07f2-4daf-9413-8bd85869e93e\") " pod="openshift-marketplace/certified-operators-nlqfl" Sep 30 09:08:07 crc kubenswrapper[4760]: I0930 09:08:07.225649 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c82793a5-07f2-4daf-9413-8bd85869e93e-catalog-content\") pod \"certified-operators-nlqfl\" (UID: \"c82793a5-07f2-4daf-9413-8bd85869e93e\") " pod="openshift-marketplace/certified-operators-nlqfl" Sep 30 09:08:07 crc kubenswrapper[4760]: I0930 09:08:07.226084 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rqrnq\" (UniqueName: \"kubernetes.io/projected/c82793a5-07f2-4daf-9413-8bd85869e93e-kube-api-access-rqrnq\") pod \"certified-operators-nlqfl\" (UID: \"c82793a5-07f2-4daf-9413-8bd85869e93e\") " pod="openshift-marketplace/certified-operators-nlqfl" Sep 30 09:08:07 crc kubenswrapper[4760]: I0930 09:08:07.328402 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rqrnq\" (UniqueName: \"kubernetes.io/projected/c82793a5-07f2-4daf-9413-8bd85869e93e-kube-api-access-rqrnq\") pod \"certified-operators-nlqfl\" (UID: \"c82793a5-07f2-4daf-9413-8bd85869e93e\") " pod="openshift-marketplace/certified-operators-nlqfl" Sep 30 09:08:07 crc kubenswrapper[4760]: I0930 09:08:07.328755 4760 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c82793a5-07f2-4daf-9413-8bd85869e93e-utilities\") pod \"certified-operators-nlqfl\" (UID: \"c82793a5-07f2-4daf-9413-8bd85869e93e\") " pod="openshift-marketplace/certified-operators-nlqfl" Sep 30 09:08:07 crc kubenswrapper[4760]: I0930 09:08:07.328817 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c82793a5-07f2-4daf-9413-8bd85869e93e-catalog-content\") pod \"certified-operators-nlqfl\" (UID: \"c82793a5-07f2-4daf-9413-8bd85869e93e\") " pod="openshift-marketplace/certified-operators-nlqfl" Sep 30 09:08:07 crc kubenswrapper[4760]: I0930 09:08:07.329329 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c82793a5-07f2-4daf-9413-8bd85869e93e-utilities\") pod \"certified-operators-nlqfl\" (UID: \"c82793a5-07f2-4daf-9413-8bd85869e93e\") " pod="openshift-marketplace/certified-operators-nlqfl" Sep 30 09:08:07 crc kubenswrapper[4760]: I0930 09:08:07.329908 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c82793a5-07f2-4daf-9413-8bd85869e93e-catalog-content\") pod \"certified-operators-nlqfl\" (UID: \"c82793a5-07f2-4daf-9413-8bd85869e93e\") " pod="openshift-marketplace/certified-operators-nlqfl" Sep 30 09:08:07 crc kubenswrapper[4760]: I0930 09:08:07.350610 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rqrnq\" (UniqueName: \"kubernetes.io/projected/c82793a5-07f2-4daf-9413-8bd85869e93e-kube-api-access-rqrnq\") pod \"certified-operators-nlqfl\" (UID: \"c82793a5-07f2-4daf-9413-8bd85869e93e\") " pod="openshift-marketplace/certified-operators-nlqfl" Sep 30 09:08:07 crc kubenswrapper[4760]: I0930 09:08:07.489596 4760 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-nlqfl" Sep 30 09:08:08 crc kubenswrapper[4760]: I0930 09:08:08.098056 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-nlqfl"] Sep 30 09:08:08 crc kubenswrapper[4760]: I0930 09:08:08.283034 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-nlqfl" event={"ID":"c82793a5-07f2-4daf-9413-8bd85869e93e","Type":"ContainerStarted","Data":"d01abfd41340c7a94a665e24428f99a1b90ab8c13109e7cb6cfce5123949f229"} Sep 30 09:08:09 crc kubenswrapper[4760]: E0930 09:08:09.649214 4760 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc82793a5_07f2_4daf_9413_8bd85869e93e.slice/crio-conmon-135d1d413c0bf008b38db08485153bd8afd9f31c49457438c8d1f895d36c898d.scope\": RecentStats: unable to find data in memory cache]" Sep 30 09:08:10 crc kubenswrapper[4760]: I0930 09:08:10.302733 4760 generic.go:334] "Generic (PLEG): container finished" podID="c82793a5-07f2-4daf-9413-8bd85869e93e" containerID="135d1d413c0bf008b38db08485153bd8afd9f31c49457438c8d1f895d36c898d" exitCode=0 Sep 30 09:08:10 crc kubenswrapper[4760]: I0930 09:08:10.302841 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-nlqfl" event={"ID":"c82793a5-07f2-4daf-9413-8bd85869e93e","Type":"ContainerDied","Data":"135d1d413c0bf008b38db08485153bd8afd9f31c49457438c8d1f895d36c898d"} Sep 30 09:08:10 crc kubenswrapper[4760]: I0930 09:08:10.674837 4760 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-wjjnr" Sep 30 09:08:10 crc kubenswrapper[4760]: I0930 09:08:10.675295 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-wjjnr" Sep 30 09:08:10 crc kubenswrapper[4760]: I0930 
09:08:10.730375 4760 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-wjjnr" Sep 30 09:08:12 crc kubenswrapper[4760]: I0930 09:08:12.143466 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-wjjnr" Sep 30 09:08:12 crc kubenswrapper[4760]: I0930 09:08:12.321329 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-nlqfl" event={"ID":"c82793a5-07f2-4daf-9413-8bd85869e93e","Type":"ContainerStarted","Data":"a7df648a2aa78ecb428251e708f19a7763c731494a1e93d2860de8fcefeefdd2"} Sep 30 09:08:14 crc kubenswrapper[4760]: I0930 09:08:14.339323 4760 generic.go:334] "Generic (PLEG): container finished" podID="c82793a5-07f2-4daf-9413-8bd85869e93e" containerID="a7df648a2aa78ecb428251e708f19a7763c731494a1e93d2860de8fcefeefdd2" exitCode=0 Sep 30 09:08:14 crc kubenswrapper[4760]: I0930 09:08:14.339860 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-nlqfl" event={"ID":"c82793a5-07f2-4daf-9413-8bd85869e93e","Type":"ContainerDied","Data":"a7df648a2aa78ecb428251e708f19a7763c731494a1e93d2860de8fcefeefdd2"} Sep 30 09:08:15 crc kubenswrapper[4760]: I0930 09:08:15.102606 4760 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-wjjnr"] Sep 30 09:08:15 crc kubenswrapper[4760]: I0930 09:08:15.103232 4760 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-wjjnr" podUID="877b4e69-9337-48d0-b822-933a3a44a869" containerName="registry-server" containerID="cri-o://37f9195b8ae2921118fb90e9f1db01eb4282268e2fec15afe949821093fa2110" gracePeriod=2 Sep 30 09:08:15 crc kubenswrapper[4760]: I0930 09:08:15.354771 4760 generic.go:334] "Generic (PLEG): container finished" podID="877b4e69-9337-48d0-b822-933a3a44a869" 
containerID="37f9195b8ae2921118fb90e9f1db01eb4282268e2fec15afe949821093fa2110" exitCode=0 Sep 30 09:08:15 crc kubenswrapper[4760]: I0930 09:08:15.354861 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wjjnr" event={"ID":"877b4e69-9337-48d0-b822-933a3a44a869","Type":"ContainerDied","Data":"37f9195b8ae2921118fb90e9f1db01eb4282268e2fec15afe949821093fa2110"} Sep 30 09:08:15 crc kubenswrapper[4760]: I0930 09:08:15.356849 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-nlqfl" event={"ID":"c82793a5-07f2-4daf-9413-8bd85869e93e","Type":"ContainerStarted","Data":"9c8bc0ca9746ffdf89d49cb3a9dc813449d610fc7f0de80df4b600f3b3d033b1"} Sep 30 09:08:15 crc kubenswrapper[4760]: I0930 09:08:15.400717 4760 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-nlqfl" podStartSLOduration=3.939681283 podStartE2EDuration="8.4006978s" podCreationTimestamp="2025-09-30 09:08:07 +0000 UTC" firstStartedPulling="2025-09-30 09:08:10.304941924 +0000 UTC m=+5675.947848336" lastFinishedPulling="2025-09-30 09:08:14.765958431 +0000 UTC m=+5680.408864853" observedRunningTime="2025-09-30 09:08:15.37353671 +0000 UTC m=+5681.016443142" watchObservedRunningTime="2025-09-30 09:08:15.4006978 +0000 UTC m=+5681.043604212" Sep 30 09:08:15 crc kubenswrapper[4760]: I0930 09:08:15.599848 4760 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-wjjnr" Sep 30 09:08:15 crc kubenswrapper[4760]: I0930 09:08:15.728070 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/877b4e69-9337-48d0-b822-933a3a44a869-utilities\") pod \"877b4e69-9337-48d0-b822-933a3a44a869\" (UID: \"877b4e69-9337-48d0-b822-933a3a44a869\") " Sep 30 09:08:15 crc kubenswrapper[4760]: I0930 09:08:15.728252 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/877b4e69-9337-48d0-b822-933a3a44a869-catalog-content\") pod \"877b4e69-9337-48d0-b822-933a3a44a869\" (UID: \"877b4e69-9337-48d0-b822-933a3a44a869\") " Sep 30 09:08:15 crc kubenswrapper[4760]: I0930 09:08:15.728459 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jm77j\" (UniqueName: \"kubernetes.io/projected/877b4e69-9337-48d0-b822-933a3a44a869-kube-api-access-jm77j\") pod \"877b4e69-9337-48d0-b822-933a3a44a869\" (UID: \"877b4e69-9337-48d0-b822-933a3a44a869\") " Sep 30 09:08:15 crc kubenswrapper[4760]: I0930 09:08:15.728853 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/877b4e69-9337-48d0-b822-933a3a44a869-utilities" (OuterVolumeSpecName: "utilities") pod "877b4e69-9337-48d0-b822-933a3a44a869" (UID: "877b4e69-9337-48d0-b822-933a3a44a869"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 09:08:15 crc kubenswrapper[4760]: I0930 09:08:15.729394 4760 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/877b4e69-9337-48d0-b822-933a3a44a869-utilities\") on node \"crc\" DevicePath \"\"" Sep 30 09:08:15 crc kubenswrapper[4760]: I0930 09:08:15.741077 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/877b4e69-9337-48d0-b822-933a3a44a869-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "877b4e69-9337-48d0-b822-933a3a44a869" (UID: "877b4e69-9337-48d0-b822-933a3a44a869"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 09:08:15 crc kubenswrapper[4760]: I0930 09:08:15.748212 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/877b4e69-9337-48d0-b822-933a3a44a869-kube-api-access-jm77j" (OuterVolumeSpecName: "kube-api-access-jm77j") pod "877b4e69-9337-48d0-b822-933a3a44a869" (UID: "877b4e69-9337-48d0-b822-933a3a44a869"). InnerVolumeSpecName "kube-api-access-jm77j". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 09:08:15 crc kubenswrapper[4760]: I0930 09:08:15.830793 4760 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/877b4e69-9337-48d0-b822-933a3a44a869-catalog-content\") on node \"crc\" DevicePath \"\"" Sep 30 09:08:15 crc kubenswrapper[4760]: I0930 09:08:15.830827 4760 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jm77j\" (UniqueName: \"kubernetes.io/projected/877b4e69-9337-48d0-b822-933a3a44a869-kube-api-access-jm77j\") on node \"crc\" DevicePath \"\"" Sep 30 09:08:16 crc kubenswrapper[4760]: I0930 09:08:16.367858 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wjjnr" event={"ID":"877b4e69-9337-48d0-b822-933a3a44a869","Type":"ContainerDied","Data":"a938fda54e98dd7bb1cb4266094e90ba04ce5a52bc5a8ab0111ca14b95fd11bd"} Sep 30 09:08:16 crc kubenswrapper[4760]: I0930 09:08:16.367925 4760 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-wjjnr" Sep 30 09:08:16 crc kubenswrapper[4760]: I0930 09:08:16.368188 4760 scope.go:117] "RemoveContainer" containerID="37f9195b8ae2921118fb90e9f1db01eb4282268e2fec15afe949821093fa2110" Sep 30 09:08:16 crc kubenswrapper[4760]: I0930 09:08:16.397015 4760 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-wjjnr"] Sep 30 09:08:16 crc kubenswrapper[4760]: I0930 09:08:16.400743 4760 scope.go:117] "RemoveContainer" containerID="054257328d005f1773f8ddc7d4076563e8ec67ddbd93519f80e48d055e64b45d" Sep 30 09:08:16 crc kubenswrapper[4760]: I0930 09:08:16.405755 4760 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-wjjnr"] Sep 30 09:08:16 crc kubenswrapper[4760]: I0930 09:08:16.428967 4760 scope.go:117] "RemoveContainer" containerID="0148b6f0d763f97f2ef95dfc9693eec24060ecdf4e903058425a55f62403ac86" Sep 30 09:08:17 crc kubenswrapper[4760]: I0930 09:08:17.078417 4760 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="877b4e69-9337-48d0-b822-933a3a44a869" path="/var/lib/kubelet/pods/877b4e69-9337-48d0-b822-933a3a44a869/volumes" Sep 30 09:08:17 crc kubenswrapper[4760]: I0930 09:08:17.490692 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-nlqfl" Sep 30 09:08:17 crc kubenswrapper[4760]: I0930 09:08:17.490973 4760 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-nlqfl" Sep 30 09:08:17 crc kubenswrapper[4760]: I0930 09:08:17.551578 4760 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-nlqfl" Sep 30 09:08:19 crc kubenswrapper[4760]: I0930 09:08:19.112618 4760 patch_prober.go:28] interesting pod/machine-config-daemon-f2lrk container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness 
probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 09:08:19 crc kubenswrapper[4760]: I0930 09:08:19.114106 4760 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-f2lrk" podUID="7a9c8270-6964-4886-87d0-227b05b76da4" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 30 09:08:19 crc kubenswrapper[4760]: I0930 09:08:19.114226 4760 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-f2lrk" Sep 30 09:08:19 crc kubenswrapper[4760]: I0930 09:08:19.115048 4760 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"505728dcf5cec3fc994c058239c0150ac7db870b33dd3a6ddbc1c1699cc9b079"} pod="openshift-machine-config-operator/machine-config-daemon-f2lrk" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Sep 30 09:08:19 crc kubenswrapper[4760]: I0930 09:08:19.115173 4760 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-f2lrk" podUID="7a9c8270-6964-4886-87d0-227b05b76da4" containerName="machine-config-daemon" containerID="cri-o://505728dcf5cec3fc994c058239c0150ac7db870b33dd3a6ddbc1c1699cc9b079" gracePeriod=600 Sep 30 09:08:19 crc kubenswrapper[4760]: E0930 09:08:19.272917 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f2lrk_openshift-machine-config-operator(7a9c8270-6964-4886-87d0-227b05b76da4)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-f2lrk" podUID="7a9c8270-6964-4886-87d0-227b05b76da4" Sep 30 09:08:19 crc kubenswrapper[4760]: I0930 09:08:19.394196 4760 generic.go:334] "Generic (PLEG): container finished" podID="7a9c8270-6964-4886-87d0-227b05b76da4" containerID="505728dcf5cec3fc994c058239c0150ac7db870b33dd3a6ddbc1c1699cc9b079" exitCode=0 Sep 30 09:08:19 crc kubenswrapper[4760]: I0930 09:08:19.394282 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-f2lrk" event={"ID":"7a9c8270-6964-4886-87d0-227b05b76da4","Type":"ContainerDied","Data":"505728dcf5cec3fc994c058239c0150ac7db870b33dd3a6ddbc1c1699cc9b079"} Sep 30 09:08:19 crc kubenswrapper[4760]: I0930 09:08:19.394363 4760 scope.go:117] "RemoveContainer" containerID="f0ae7ba7d74c28a6e89df3e826469af812a29ed5c702a83eb2ce6b2043289c4e" Sep 30 09:08:19 crc kubenswrapper[4760]: I0930 09:08:19.394832 4760 scope.go:117] "RemoveContainer" containerID="505728dcf5cec3fc994c058239c0150ac7db870b33dd3a6ddbc1c1699cc9b079" Sep 30 09:08:19 crc kubenswrapper[4760]: E0930 09:08:19.395111 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f2lrk_openshift-machine-config-operator(7a9c8270-6964-4886-87d0-227b05b76da4)\"" pod="openshift-machine-config-operator/machine-config-daemon-f2lrk" podUID="7a9c8270-6964-4886-87d0-227b05b76da4" Sep 30 09:08:27 crc kubenswrapper[4760]: I0930 09:08:27.537616 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-nlqfl" Sep 30 09:08:27 crc kubenswrapper[4760]: I0930 09:08:27.601610 4760 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-nlqfl"] Sep 30 09:08:28 crc kubenswrapper[4760]: I0930 09:08:28.478088 4760 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-nlqfl" podUID="c82793a5-07f2-4daf-9413-8bd85869e93e" containerName="registry-server" containerID="cri-o://9c8bc0ca9746ffdf89d49cb3a9dc813449d610fc7f0de80df4b600f3b3d033b1" gracePeriod=2 Sep 30 09:08:28 crc kubenswrapper[4760]: I0930 09:08:28.942120 4760 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-nlqfl" Sep 30 09:08:29 crc kubenswrapper[4760]: I0930 09:08:29.103266 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rqrnq\" (UniqueName: \"kubernetes.io/projected/c82793a5-07f2-4daf-9413-8bd85869e93e-kube-api-access-rqrnq\") pod \"c82793a5-07f2-4daf-9413-8bd85869e93e\" (UID: \"c82793a5-07f2-4daf-9413-8bd85869e93e\") " Sep 30 09:08:29 crc kubenswrapper[4760]: I0930 09:08:29.103700 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c82793a5-07f2-4daf-9413-8bd85869e93e-utilities\") pod \"c82793a5-07f2-4daf-9413-8bd85869e93e\" (UID: \"c82793a5-07f2-4daf-9413-8bd85869e93e\") " Sep 30 09:08:29 crc kubenswrapper[4760]: I0930 09:08:29.103723 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c82793a5-07f2-4daf-9413-8bd85869e93e-catalog-content\") pod \"c82793a5-07f2-4daf-9413-8bd85869e93e\" (UID: \"c82793a5-07f2-4daf-9413-8bd85869e93e\") " Sep 30 09:08:29 crc kubenswrapper[4760]: I0930 09:08:29.106523 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c82793a5-07f2-4daf-9413-8bd85869e93e-utilities" (OuterVolumeSpecName: "utilities") pod "c82793a5-07f2-4daf-9413-8bd85869e93e" (UID: "c82793a5-07f2-4daf-9413-8bd85869e93e"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 09:08:29 crc kubenswrapper[4760]: I0930 09:08:29.127262 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c82793a5-07f2-4daf-9413-8bd85869e93e-kube-api-access-rqrnq" (OuterVolumeSpecName: "kube-api-access-rqrnq") pod "c82793a5-07f2-4daf-9413-8bd85869e93e" (UID: "c82793a5-07f2-4daf-9413-8bd85869e93e"). InnerVolumeSpecName "kube-api-access-rqrnq". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 09:08:29 crc kubenswrapper[4760]: I0930 09:08:29.180163 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c82793a5-07f2-4daf-9413-8bd85869e93e-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c82793a5-07f2-4daf-9413-8bd85869e93e" (UID: "c82793a5-07f2-4daf-9413-8bd85869e93e"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 09:08:29 crc kubenswrapper[4760]: I0930 09:08:29.206568 4760 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rqrnq\" (UniqueName: \"kubernetes.io/projected/c82793a5-07f2-4daf-9413-8bd85869e93e-kube-api-access-rqrnq\") on node \"crc\" DevicePath \"\"" Sep 30 09:08:29 crc kubenswrapper[4760]: I0930 09:08:29.206621 4760 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c82793a5-07f2-4daf-9413-8bd85869e93e-utilities\") on node \"crc\" DevicePath \"\"" Sep 30 09:08:29 crc kubenswrapper[4760]: I0930 09:08:29.206633 4760 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c82793a5-07f2-4daf-9413-8bd85869e93e-catalog-content\") on node \"crc\" DevicePath \"\"" Sep 30 09:08:29 crc kubenswrapper[4760]: I0930 09:08:29.493146 4760 generic.go:334] "Generic (PLEG): container finished" podID="c82793a5-07f2-4daf-9413-8bd85869e93e" 
containerID="9c8bc0ca9746ffdf89d49cb3a9dc813449d610fc7f0de80df4b600f3b3d033b1" exitCode=0 Sep 30 09:08:29 crc kubenswrapper[4760]: I0930 09:08:29.493202 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-nlqfl" event={"ID":"c82793a5-07f2-4daf-9413-8bd85869e93e","Type":"ContainerDied","Data":"9c8bc0ca9746ffdf89d49cb3a9dc813449d610fc7f0de80df4b600f3b3d033b1"} Sep 30 09:08:29 crc kubenswrapper[4760]: I0930 09:08:29.493227 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-nlqfl" event={"ID":"c82793a5-07f2-4daf-9413-8bd85869e93e","Type":"ContainerDied","Data":"d01abfd41340c7a94a665e24428f99a1b90ab8c13109e7cb6cfce5123949f229"} Sep 30 09:08:29 crc kubenswrapper[4760]: I0930 09:08:29.493246 4760 scope.go:117] "RemoveContainer" containerID="9c8bc0ca9746ffdf89d49cb3a9dc813449d610fc7f0de80df4b600f3b3d033b1" Sep 30 09:08:29 crc kubenswrapper[4760]: I0930 09:08:29.493442 4760 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-nlqfl" Sep 30 09:08:29 crc kubenswrapper[4760]: I0930 09:08:29.528676 4760 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-nlqfl"] Sep 30 09:08:29 crc kubenswrapper[4760]: I0930 09:08:29.536249 4760 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-nlqfl"] Sep 30 09:08:29 crc kubenswrapper[4760]: I0930 09:08:29.539933 4760 scope.go:117] "RemoveContainer" containerID="a7df648a2aa78ecb428251e708f19a7763c731494a1e93d2860de8fcefeefdd2" Sep 30 09:08:29 crc kubenswrapper[4760]: I0930 09:08:29.575871 4760 scope.go:117] "RemoveContainer" containerID="135d1d413c0bf008b38db08485153bd8afd9f31c49457438c8d1f895d36c898d" Sep 30 09:08:29 crc kubenswrapper[4760]: I0930 09:08:29.624031 4760 scope.go:117] "RemoveContainer" containerID="9c8bc0ca9746ffdf89d49cb3a9dc813449d610fc7f0de80df4b600f3b3d033b1" Sep 30 09:08:29 crc kubenswrapper[4760]: E0930 09:08:29.629420 4760 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9c8bc0ca9746ffdf89d49cb3a9dc813449d610fc7f0de80df4b600f3b3d033b1\": container with ID starting with 9c8bc0ca9746ffdf89d49cb3a9dc813449d610fc7f0de80df4b600f3b3d033b1 not found: ID does not exist" containerID="9c8bc0ca9746ffdf89d49cb3a9dc813449d610fc7f0de80df4b600f3b3d033b1" Sep 30 09:08:29 crc kubenswrapper[4760]: I0930 09:08:29.629481 4760 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9c8bc0ca9746ffdf89d49cb3a9dc813449d610fc7f0de80df4b600f3b3d033b1"} err="failed to get container status \"9c8bc0ca9746ffdf89d49cb3a9dc813449d610fc7f0de80df4b600f3b3d033b1\": rpc error: code = NotFound desc = could not find container \"9c8bc0ca9746ffdf89d49cb3a9dc813449d610fc7f0de80df4b600f3b3d033b1\": container with ID starting with 9c8bc0ca9746ffdf89d49cb3a9dc813449d610fc7f0de80df4b600f3b3d033b1 not 
found: ID does not exist" Sep 30 09:08:29 crc kubenswrapper[4760]: I0930 09:08:29.629511 4760 scope.go:117] "RemoveContainer" containerID="a7df648a2aa78ecb428251e708f19a7763c731494a1e93d2860de8fcefeefdd2" Sep 30 09:08:29 crc kubenswrapper[4760]: E0930 09:08:29.630651 4760 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a7df648a2aa78ecb428251e708f19a7763c731494a1e93d2860de8fcefeefdd2\": container with ID starting with a7df648a2aa78ecb428251e708f19a7763c731494a1e93d2860de8fcefeefdd2 not found: ID does not exist" containerID="a7df648a2aa78ecb428251e708f19a7763c731494a1e93d2860de8fcefeefdd2" Sep 30 09:08:29 crc kubenswrapper[4760]: I0930 09:08:29.630706 4760 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a7df648a2aa78ecb428251e708f19a7763c731494a1e93d2860de8fcefeefdd2"} err="failed to get container status \"a7df648a2aa78ecb428251e708f19a7763c731494a1e93d2860de8fcefeefdd2\": rpc error: code = NotFound desc = could not find container \"a7df648a2aa78ecb428251e708f19a7763c731494a1e93d2860de8fcefeefdd2\": container with ID starting with a7df648a2aa78ecb428251e708f19a7763c731494a1e93d2860de8fcefeefdd2 not found: ID does not exist" Sep 30 09:08:29 crc kubenswrapper[4760]: I0930 09:08:29.630748 4760 scope.go:117] "RemoveContainer" containerID="135d1d413c0bf008b38db08485153bd8afd9f31c49457438c8d1f895d36c898d" Sep 30 09:08:29 crc kubenswrapper[4760]: E0930 09:08:29.631869 4760 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"135d1d413c0bf008b38db08485153bd8afd9f31c49457438c8d1f895d36c898d\": container with ID starting with 135d1d413c0bf008b38db08485153bd8afd9f31c49457438c8d1f895d36c898d not found: ID does not exist" containerID="135d1d413c0bf008b38db08485153bd8afd9f31c49457438c8d1f895d36c898d" Sep 30 09:08:29 crc kubenswrapper[4760]: I0930 09:08:29.631895 4760 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"135d1d413c0bf008b38db08485153bd8afd9f31c49457438c8d1f895d36c898d"} err="failed to get container status \"135d1d413c0bf008b38db08485153bd8afd9f31c49457438c8d1f895d36c898d\": rpc error: code = NotFound desc = could not find container \"135d1d413c0bf008b38db08485153bd8afd9f31c49457438c8d1f895d36c898d\": container with ID starting with 135d1d413c0bf008b38db08485153bd8afd9f31c49457438c8d1f895d36c898d not found: ID does not exist" Sep 30 09:08:31 crc kubenswrapper[4760]: I0930 09:08:31.069550 4760 scope.go:117] "RemoveContainer" containerID="505728dcf5cec3fc994c058239c0150ac7db870b33dd3a6ddbc1c1699cc9b079" Sep 30 09:08:31 crc kubenswrapper[4760]: E0930 09:08:31.070298 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f2lrk_openshift-machine-config-operator(7a9c8270-6964-4886-87d0-227b05b76da4)\"" pod="openshift-machine-config-operator/machine-config-daemon-f2lrk" podUID="7a9c8270-6964-4886-87d0-227b05b76da4" Sep 30 09:08:31 crc kubenswrapper[4760]: I0930 09:08:31.082607 4760 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c82793a5-07f2-4daf-9413-8bd85869e93e" path="/var/lib/kubelet/pods/c82793a5-07f2-4daf-9413-8bd85869e93e/volumes" Sep 30 09:08:46 crc kubenswrapper[4760]: I0930 09:08:46.067161 4760 scope.go:117] "RemoveContainer" containerID="505728dcf5cec3fc994c058239c0150ac7db870b33dd3a6ddbc1c1699cc9b079" Sep 30 09:08:46 crc kubenswrapper[4760]: E0930 09:08:46.068037 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-f2lrk_openshift-machine-config-operator(7a9c8270-6964-4886-87d0-227b05b76da4)\"" pod="openshift-machine-config-operator/machine-config-daemon-f2lrk" podUID="7a9c8270-6964-4886-87d0-227b05b76da4" Sep 30 09:09:01 crc kubenswrapper[4760]: I0930 09:09:01.068406 4760 scope.go:117] "RemoveContainer" containerID="505728dcf5cec3fc994c058239c0150ac7db870b33dd3a6ddbc1c1699cc9b079" Sep 30 09:09:01 crc kubenswrapper[4760]: E0930 09:09:01.069272 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f2lrk_openshift-machine-config-operator(7a9c8270-6964-4886-87d0-227b05b76da4)\"" pod="openshift-machine-config-operator/machine-config-daemon-f2lrk" podUID="7a9c8270-6964-4886-87d0-227b05b76da4" Sep 30 09:09:10 crc kubenswrapper[4760]: I0930 09:09:10.274804 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-65c7cb7cc8-cjqdk_59d36e9c-c1d2-4da4-be6b-ddfbc9fd76b3/barbican-api/0.log" Sep 30 09:09:10 crc kubenswrapper[4760]: I0930 09:09:10.337028 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-65c7cb7cc8-cjqdk_59d36e9c-c1d2-4da4-be6b-ddfbc9fd76b3/barbican-api-log/0.log" Sep 30 09:09:10 crc kubenswrapper[4760]: I0930 09:09:10.496659 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-554bb7d464-zcqc8_968a427d-0cee-4775-ab7f-4ec27e535b33/barbican-keystone-listener/0.log" Sep 30 09:09:10 crc kubenswrapper[4760]: I0930 09:09:10.584793 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-554bb7d464-zcqc8_968a427d-0cee-4775-ab7f-4ec27e535b33/barbican-keystone-listener-log/0.log" Sep 30 09:09:10 crc kubenswrapper[4760]: I0930 09:09:10.691993 4760 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_barbican-worker-8479dd9dbc-25wxx_caf10164-5c77-42df-9fdc-b6a1764a0e3d/barbican-worker/0.log" Sep 30 09:09:10 crc kubenswrapper[4760]: I0930 09:09:10.769608 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-8479dd9dbc-25wxx_caf10164-5c77-42df-9fdc-b6a1764a0e3d/barbican-worker-log/0.log" Sep 30 09:09:10 crc kubenswrapper[4760]: I0930 09:09:10.936189 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_bootstrap-edpm-deployment-openstack-edpm-ipam-fnxp6_64e019eb-1763-4e9e-8c00-c4312d782981/bootstrap-edpm-deployment-openstack-edpm-ipam/0.log" Sep 30 09:09:11 crc kubenswrapper[4760]: I0930 09:09:11.144802 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_5c70743c-2be6-4c97-aaa8-fe22bd306c7d/ceilometer-central-agent/0.log" Sep 30 09:09:11 crc kubenswrapper[4760]: I0930 09:09:11.169987 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_5c70743c-2be6-4c97-aaa8-fe22bd306c7d/ceilometer-notification-agent/0.log" Sep 30 09:09:11 crc kubenswrapper[4760]: I0930 09:09:11.224797 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_5c70743c-2be6-4c97-aaa8-fe22bd306c7d/proxy-httpd/0.log" Sep 30 09:09:11 crc kubenswrapper[4760]: I0930 09:09:11.314054 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_5c70743c-2be6-4c97-aaa8-fe22bd306c7d/sg-core/0.log" Sep 30 09:09:11 crc kubenswrapper[4760]: I0930 09:09:11.458810 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_ce9e61a5-93ea-4bc8-bb73-0578fe123aae/cinder-api/0.log" Sep 30 09:09:11 crc kubenswrapper[4760]: I0930 09:09:11.524049 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_ce9e61a5-93ea-4bc8-bb73-0578fe123aae/cinder-api-log/0.log" Sep 30 09:09:11 crc kubenswrapper[4760]: I0930 09:09:11.688607 4760 log.go:25] "Finished parsing 
log file" path="/var/log/pods/openstack_cinder-scheduler-0_d3817395-40ad-472b-b4df-83a7386bb16f/cinder-scheduler/0.log" Sep 30 09:09:11 crc kubenswrapper[4760]: I0930 09:09:11.778477 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_d3817395-40ad-472b-b4df-83a7386bb16f/probe/0.log" Sep 30 09:09:11 crc kubenswrapper[4760]: I0930 09:09:11.899892 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-network-edpm-deployment-openstack-edpm-ipam-sj6c4_e2439cde-d5f2-423a-9e6d-4af8d713c917/configure-network-edpm-deployment-openstack-edpm-ipam/0.log" Sep 30 09:09:12 crc kubenswrapper[4760]: I0930 09:09:12.067540 4760 scope.go:117] "RemoveContainer" containerID="505728dcf5cec3fc994c058239c0150ac7db870b33dd3a6ddbc1c1699cc9b079" Sep 30 09:09:12 crc kubenswrapper[4760]: E0930 09:09:12.067803 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f2lrk_openshift-machine-config-operator(7a9c8270-6964-4886-87d0-227b05b76da4)\"" pod="openshift-machine-config-operator/machine-config-daemon-f2lrk" podUID="7a9c8270-6964-4886-87d0-227b05b76da4" Sep 30 09:09:12 crc kubenswrapper[4760]: I0930 09:09:12.133179 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-os-edpm-deployment-openstack-edpm-ipam-mpw2v_fc1682c5-7e4d-43a1-89f4-b40761683742/configure-os-edpm-deployment-openstack-edpm-ipam/0.log" Sep 30 09:09:12 crc kubenswrapper[4760]: I0930 09:09:12.267230 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-7c8665b49f-cp9sh_5e0ff1e1-cda6-4574-a353-f4a7406326e7/init/0.log" Sep 30 09:09:12 crc kubenswrapper[4760]: I0930 09:09:12.449815 4760 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_dnsmasq-dns-7c8665b49f-cp9sh_5e0ff1e1-cda6-4574-a353-f4a7406326e7/init/0.log" Sep 30 09:09:12 crc kubenswrapper[4760]: I0930 09:09:12.755646 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_download-cache-edpm-deployment-openstack-edpm-ipam-2xvvd_87fc7cca-6571-4e27-ab1e-14648064566e/download-cache-edpm-deployment-openstack-edpm-ipam/0.log" Sep 30 09:09:12 crc kubenswrapper[4760]: I0930 09:09:12.779381 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-7c8665b49f-cp9sh_5e0ff1e1-cda6-4574-a353-f4a7406326e7/dnsmasq-dns/0.log" Sep 30 09:09:12 crc kubenswrapper[4760]: I0930 09:09:12.954410 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_19ddce20-85ae-4537-86f4-33a6b35fef0b/glance-httpd/0.log" Sep 30 09:09:12 crc kubenswrapper[4760]: I0930 09:09:12.997611 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_19ddce20-85ae-4537-86f4-33a6b35fef0b/glance-log/0.log" Sep 30 09:09:13 crc kubenswrapper[4760]: I0930 09:09:13.192721 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_eec5d0ef-04f3-4a34-8575-45e2a88c519f/glance-httpd/0.log" Sep 30 09:09:13 crc kubenswrapper[4760]: I0930 09:09:13.257325 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_eec5d0ef-04f3-4a34-8575-45e2a88c519f/glance-log/0.log" Sep 30 09:09:13 crc kubenswrapper[4760]: I0930 09:09:13.483183 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-75644c8bb4-wrsmv_8b39ba3e-25df-4a22-a1fe-f15e6ca1fada/horizon/0.log" Sep 30 09:09:13 crc kubenswrapper[4760]: I0930 09:09:13.726158 4760 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_install-certs-edpm-deployment-openstack-edpm-ipam-vkv7v_e0989b93-a567-4aa1-886e-43b6fa827891/install-certs-edpm-deployment-openstack-edpm-ipam/0.log" Sep 30 09:09:13 crc kubenswrapper[4760]: I0930 09:09:13.925284 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-os-edpm-deployment-openstack-edpm-ipam-zxgfv_23f502d4-3801-4388-b442-22f60146dcf2/install-os-edpm-deployment-openstack-edpm-ipam/0.log" Sep 30 09:09:14 crc kubenswrapper[4760]: I0930 09:09:14.130218 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-75644c8bb4-wrsmv_8b39ba3e-25df-4a22-a1fe-f15e6ca1fada/horizon-log/0.log" Sep 30 09:09:14 crc kubenswrapper[4760]: I0930 09:09:14.259857 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-cron-29320321-w9cp7_2a6d4144-48a7-412d-9288-a909f1fbd5f4/keystone-cron/0.log" Sep 30 09:09:14 crc kubenswrapper[4760]: I0930 09:09:14.337646 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-cron-29320381-pcwcm_aa755a39-f3ae-49a4-80ce-a0efcfe2566e/keystone-cron/0.log" Sep 30 09:09:14 crc kubenswrapper[4760]: I0930 09:09:14.442584 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-77987c8bb7-t2mw2_9d49cf7a-b821-4677-88fe-8fac1dbced63/keystone-api/0.log" Sep 30 09:09:14 crc kubenswrapper[4760]: I0930 09:09:14.744831 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_kube-state-metrics-0_4fc405db-b61d-4077-b14d-ef2b4eea924c/kube-state-metrics/0.log" Sep 30 09:09:14 crc kubenswrapper[4760]: I0930 09:09:14.792616 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_libvirt-edpm-deployment-openstack-edpm-ipam-s99vh_d904db1f-5f11-47d3-8823-ff59f4bed296/libvirt-edpm-deployment-openstack-edpm-ipam/0.log" Sep 30 09:09:15 crc kubenswrapper[4760]: I0930 09:09:15.409716 4760 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_neutron-55ffd7b5b9-x7zhf_6ba8e9e7-1b15-4283-9a3a-0e9515d10bd2/neutron-httpd/0.log" Sep 30 09:09:15 crc kubenswrapper[4760]: I0930 09:09:15.422538 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-55ffd7b5b9-x7zhf_6ba8e9e7-1b15-4283-9a3a-0e9515d10bd2/neutron-api/0.log" Sep 30 09:09:15 crc kubenswrapper[4760]: I0930 09:09:15.494909 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-metadata-edpm-deployment-openstack-edpm-ipam-sz5l4_5259c092-63b5-4574-b14a-725c45523773/neutron-metadata-edpm-deployment-openstack-edpm-ipam/0.log" Sep 30 09:09:16 crc kubenswrapper[4760]: I0930 09:09:16.424014 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell0-conductor-0_d1669957-f301-409c-8f6b-e1b87dfadeb7/nova-cell0-conductor-conductor/0.log" Sep 30 09:09:17 crc kubenswrapper[4760]: I0930 09:09:17.047119 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-conductor-0_998c9109-185a-425a-bda1-12eb13c83ca7/nova-cell1-conductor-conductor/0.log" Sep 30 09:09:17 crc kubenswrapper[4760]: I0930 09:09:17.095150 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_c27b43ce-27fb-4163-b55a-98a7e9ee7d71/nova-api-log/0.log" Sep 30 09:09:17 crc kubenswrapper[4760]: I0930 09:09:17.388032 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_c27b43ce-27fb-4163-b55a-98a7e9ee7d71/nova-api-api/0.log" Sep 30 09:09:17 crc kubenswrapper[4760]: I0930 09:09:17.444580 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-novncproxy-0_f7ab1b57-8aaa-4360-b024-fa2142ebd994/nova-cell1-novncproxy-novncproxy/0.log" Sep 30 09:09:17 crc kubenswrapper[4760]: I0930 09:09:17.732770 4760 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_nova-edpm-deployment-openstack-edpm-ipam-5ncgd_33cc4d6c-b086-410c-b38e-f6c918657a74/nova-edpm-deployment-openstack-edpm-ipam/0.log" Sep 30 09:09:17 crc kubenswrapper[4760]: I0930 09:09:17.876194 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_10427d42-4cfc-486e-931c-fd62a2a5b1e5/nova-metadata-log/0.log" Sep 30 09:09:18 crc kubenswrapper[4760]: I0930 09:09:18.486839 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-scheduler-0_868868bd-3879-4d24-9dd1-62218a15844c/nova-scheduler-scheduler/0.log" Sep 30 09:09:18 crc kubenswrapper[4760]: I0930 09:09:18.489652 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_894abb89-f647-4143-904c-88b5108982cd/mysql-bootstrap/0.log" Sep 30 09:09:18 crc kubenswrapper[4760]: I0930 09:09:18.751008 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_894abb89-f647-4143-904c-88b5108982cd/mysql-bootstrap/0.log" Sep 30 09:09:18 crc kubenswrapper[4760]: I0930 09:09:18.751585 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_894abb89-f647-4143-904c-88b5108982cd/galera/0.log" Sep 30 09:09:18 crc kubenswrapper[4760]: I0930 09:09:18.985845 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_641818bf-a81e-4654-a8f7-c8d06fbefc6c/mysql-bootstrap/0.log" Sep 30 09:09:19 crc kubenswrapper[4760]: I0930 09:09:19.255880 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_641818bf-a81e-4654-a8f7-c8d06fbefc6c/mysql-bootstrap/0.log" Sep 30 09:09:19 crc kubenswrapper[4760]: I0930 09:09:19.270993 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_641818bf-a81e-4654-a8f7-c8d06fbefc6c/galera/0.log" Sep 30 09:09:19 crc kubenswrapper[4760]: I0930 09:09:19.510105 4760 log.go:25] "Finished parsing log 
file" path="/var/log/pods/openstack_openstackclient_1c5867a3-c734-489e-a6b3-edb023949556/openstackclient/0.log" Sep 30 09:09:19 crc kubenswrapper[4760]: I0930 09:09:19.729068 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-56wgh_159ee554-1b62-4fe3-95c6-e64ab0c58b2d/ovn-controller/0.log" Sep 30 09:09:19 crc kubenswrapper[4760]: I0930 09:09:19.951406 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-k6hgt_83918139-1a35-439f-8f7c-cd46d6e21064/openstack-network-exporter/0.log" Sep 30 09:09:20 crc kubenswrapper[4760]: I0930 09:09:20.195006 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-bwrv9_8735bd7c-231f-47df-a404-b8cab84f0d7b/ovsdb-server-init/0.log" Sep 30 09:09:20 crc kubenswrapper[4760]: I0930 09:09:20.399097 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_10427d42-4cfc-486e-931c-fd62a2a5b1e5/nova-metadata-metadata/0.log" Sep 30 09:09:20 crc kubenswrapper[4760]: I0930 09:09:20.437653 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-bwrv9_8735bd7c-231f-47df-a404-b8cab84f0d7b/ovs-vswitchd/0.log" Sep 30 09:09:20 crc kubenswrapper[4760]: I0930 09:09:20.440002 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-bwrv9_8735bd7c-231f-47df-a404-b8cab84f0d7b/ovsdb-server-init/0.log" Sep 30 09:09:20 crc kubenswrapper[4760]: I0930 09:09:20.607282 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-bwrv9_8735bd7c-231f-47df-a404-b8cab84f0d7b/ovsdb-server/0.log" Sep 30 09:09:20 crc kubenswrapper[4760]: I0930 09:09:20.823091 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-edpm-deployment-openstack-edpm-ipam-69ht5_0f077fda-e7af-42a5-9d0b-f007910f6948/ovn-edpm-deployment-openstack-edpm-ipam/0.log" Sep 30 09:09:20 crc kubenswrapper[4760]: I0930 
09:09:20.927679 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_96ee48d1-c16e-4367-9159-0f9ddaf5e66a/openstack-network-exporter/0.log" Sep 30 09:09:21 crc kubenswrapper[4760]: I0930 09:09:21.036531 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_96ee48d1-c16e-4367-9159-0f9ddaf5e66a/ovn-northd/0.log" Sep 30 09:09:21 crc kubenswrapper[4760]: I0930 09:09:21.157611 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_d25b3b00-98d6-4bfc-8218-9ea7319e1c60/openstack-network-exporter/0.log" Sep 30 09:09:21 crc kubenswrapper[4760]: I0930 09:09:21.254804 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_d25b3b00-98d6-4bfc-8218-9ea7319e1c60/ovsdbserver-nb/0.log" Sep 30 09:09:21 crc kubenswrapper[4760]: I0930 09:09:21.350035 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_d2f7adda-d2ed-4c87-8e63-64e344155305/openstack-network-exporter/0.log" Sep 30 09:09:21 crc kubenswrapper[4760]: I0930 09:09:21.470932 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_d2f7adda-d2ed-4c87-8e63-64e344155305/ovsdbserver-sb/0.log" Sep 30 09:09:21 crc kubenswrapper[4760]: I0930 09:09:21.759944 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-844b758db4-hzncj_a6cfc37b-8ee0-4efe-a43f-b53bafbf4255/placement-api/0.log" Sep 30 09:09:21 crc kubenswrapper[4760]: I0930 09:09:21.977388 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-844b758db4-hzncj_a6cfc37b-8ee0-4efe-a43f-b53bafbf4255/placement-log/0.log" Sep 30 09:09:22 crc kubenswrapper[4760]: I0930 09:09:22.023070 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_33ff1fe6-1a25-49ab-8b11-97ab06ee2e43/init-config-reloader/0.log" Sep 30 09:09:22 crc kubenswrapper[4760]: I0930 09:09:22.155077 4760 
log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_33ff1fe6-1a25-49ab-8b11-97ab06ee2e43/init-config-reloader/0.log" Sep 30 09:09:22 crc kubenswrapper[4760]: I0930 09:09:22.200684 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_33ff1fe6-1a25-49ab-8b11-97ab06ee2e43/config-reloader/0.log" Sep 30 09:09:22 crc kubenswrapper[4760]: I0930 09:09:22.243947 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_33ff1fe6-1a25-49ab-8b11-97ab06ee2e43/prometheus/0.log" Sep 30 09:09:22 crc kubenswrapper[4760]: I0930 09:09:22.384983 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_33ff1fe6-1a25-49ab-8b11-97ab06ee2e43/thanos-sidecar/0.log" Sep 30 09:09:22 crc kubenswrapper[4760]: I0930 09:09:22.473032 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_2750016d-97a4-4e2b-a0e8-a03ddd6d64bb/setup-container/0.log" Sep 30 09:09:22 crc kubenswrapper[4760]: I0930 09:09:22.780953 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_2750016d-97a4-4e2b-a0e8-a03ddd6d64bb/setup-container/0.log" Sep 30 09:09:22 crc kubenswrapper[4760]: I0930 09:09:22.814938 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_2750016d-97a4-4e2b-a0e8-a03ddd6d64bb/rabbitmq/0.log" Sep 30 09:09:23 crc kubenswrapper[4760]: I0930 09:09:23.067224 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_7e7f8143-b3b0-473b-b058-1c2fd9eaa5ac/setup-container/0.log" Sep 30 09:09:23 crc kubenswrapper[4760]: I0930 09:09:23.221542 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_7e7f8143-b3b0-473b-b058-1c2fd9eaa5ac/setup-container/0.log" Sep 30 09:09:23 crc kubenswrapper[4760]: I0930 09:09:23.251635 4760 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_7e7f8143-b3b0-473b-b058-1c2fd9eaa5ac/rabbitmq/0.log" Sep 30 09:09:23 crc kubenswrapper[4760]: I0930 09:09:23.489844 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_reboot-os-edpm-deployment-openstack-edpm-ipam-6xmjs_9efadf79-7f8c-4a83-9788-6f4f0a5ecd77/reboot-os-edpm-deployment-openstack-edpm-ipam/0.log" Sep 30 09:09:23 crc kubenswrapper[4760]: I0930 09:09:23.597081 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_redhat-edpm-deployment-openstack-edpm-ipam-wld9k_36b213a9-6e12-4215-be85-b1a0c647558f/redhat-edpm-deployment-openstack-edpm-ipam/0.log" Sep 30 09:09:23 crc kubenswrapper[4760]: I0930 09:09:23.696238 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_repo-setup-edpm-deployment-openstack-edpm-ipam-hl6sj_65500975-80f6-4dae-a528-33950d370831/repo-setup-edpm-deployment-openstack-edpm-ipam/0.log" Sep 30 09:09:23 crc kubenswrapper[4760]: I0930 09:09:23.962738 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_run-os-edpm-deployment-openstack-edpm-ipam-vznk5_820e5332-bfcf-4cca-8079-e3d26cc62517/run-os-edpm-deployment-openstack-edpm-ipam/0.log" Sep 30 09:09:24 crc kubenswrapper[4760]: I0930 09:09:24.091533 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ssh-known-hosts-edpm-deployment-m7lks_00487f96-583f-4ae8-bd0d-7fb932d86feb/ssh-known-hosts-edpm-deployment/0.log" Sep 30 09:09:24 crc kubenswrapper[4760]: I0930 09:09:24.346742 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-6cc97c56c5-7pkjn_fc788440-e748-4b41-bdb6-23a6764062fd/proxy-server/0.log" Sep 30 09:09:24 crc kubenswrapper[4760]: I0930 09:09:24.466971 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-6cc97c56c5-7pkjn_fc788440-e748-4b41-bdb6-23a6764062fd/proxy-httpd/0.log" Sep 30 09:09:24 crc kubenswrapper[4760]: I0930 09:09:24.567861 4760 
log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-ring-rebalance-4rxf2_cfaf0e86-3b68-4e5c-8caf-c60518a28016/swift-ring-rebalance/0.log" Sep 30 09:09:24 crc kubenswrapper[4760]: I0930 09:09:24.699636 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_db4f0b34-3c4a-4c78-b284-5959e91b00c0/account-auditor/0.log" Sep 30 09:09:24 crc kubenswrapper[4760]: I0930 09:09:24.870669 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_db4f0b34-3c4a-4c78-b284-5959e91b00c0/account-reaper/0.log" Sep 30 09:09:24 crc kubenswrapper[4760]: I0930 09:09:24.969282 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_db4f0b34-3c4a-4c78-b284-5959e91b00c0/account-server/0.log" Sep 30 09:09:24 crc kubenswrapper[4760]: I0930 09:09:24.982098 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_db4f0b34-3c4a-4c78-b284-5959e91b00c0/account-replicator/0.log" Sep 30 09:09:25 crc kubenswrapper[4760]: I0930 09:09:25.096159 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_db4f0b34-3c4a-4c78-b284-5959e91b00c0/container-auditor/0.log" Sep 30 09:09:25 crc kubenswrapper[4760]: I0930 09:09:25.180443 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_db4f0b34-3c4a-4c78-b284-5959e91b00c0/container-server/0.log" Sep 30 09:09:25 crc kubenswrapper[4760]: I0930 09:09:25.275363 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_db4f0b34-3c4a-4c78-b284-5959e91b00c0/container-replicator/0.log" Sep 30 09:09:25 crc kubenswrapper[4760]: I0930 09:09:25.334602 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_db4f0b34-3c4a-4c78-b284-5959e91b00c0/container-updater/0.log" Sep 30 09:09:25 crc kubenswrapper[4760]: I0930 09:09:25.408728 4760 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_swift-storage-0_db4f0b34-3c4a-4c78-b284-5959e91b00c0/object-auditor/0.log" Sep 30 09:09:25 crc kubenswrapper[4760]: I0930 09:09:25.517602 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_db4f0b34-3c4a-4c78-b284-5959e91b00c0/object-expirer/0.log" Sep 30 09:09:25 crc kubenswrapper[4760]: I0930 09:09:25.564675 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_db4f0b34-3c4a-4c78-b284-5959e91b00c0/object-replicator/0.log" Sep 30 09:09:25 crc kubenswrapper[4760]: I0930 09:09:25.657180 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_db4f0b34-3c4a-4c78-b284-5959e91b00c0/object-server/0.log" Sep 30 09:09:25 crc kubenswrapper[4760]: I0930 09:09:25.722456 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_db4f0b34-3c4a-4c78-b284-5959e91b00c0/object-updater/0.log" Sep 30 09:09:25 crc kubenswrapper[4760]: I0930 09:09:25.821104 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_db4f0b34-3c4a-4c78-b284-5959e91b00c0/rsync/0.log" Sep 30 09:09:25 crc kubenswrapper[4760]: I0930 09:09:25.910835 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_db4f0b34-3c4a-4c78-b284-5959e91b00c0/swift-recon-cron/0.log" Sep 30 09:09:26 crc kubenswrapper[4760]: I0930 09:09:26.071643 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_telemetry-edpm-deployment-openstack-edpm-ipam-qpf2g_9d944c3c-b8ab-4a31-a8c6-aa086a0d02fd/telemetry-edpm-deployment-openstack-edpm-ipam/0.log" Sep 30 09:09:26 crc kubenswrapper[4760]: I0930 09:09:26.228727 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_tempest-tests-tempest_394b8542-fe18-475f-9374-ce5c7e3820e7/tempest-tests-tempest-tests-runner/0.log" Sep 30 09:09:26 crc kubenswrapper[4760]: I0930 09:09:26.348216 4760 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_test-operator-logs-pod-tempest-tempest-tests-tempest_bb327f16-8c82-4829-b400-a5917094069f/test-operator-logs-container/0.log" Sep 30 09:09:26 crc kubenswrapper[4760]: I0930 09:09:26.606374 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_validate-network-edpm-deployment-openstack-edpm-ipam-ptr7r_daecc10f-5930-44cc-806b-95012b47df8a/validate-network-edpm-deployment-openstack-edpm-ipam/0.log" Sep 30 09:09:27 crc kubenswrapper[4760]: I0930 09:09:27.067064 4760 scope.go:117] "RemoveContainer" containerID="505728dcf5cec3fc994c058239c0150ac7db870b33dd3a6ddbc1c1699cc9b079" Sep 30 09:09:27 crc kubenswrapper[4760]: E0930 09:09:27.067455 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f2lrk_openshift-machine-config-operator(7a9c8270-6964-4886-87d0-227b05b76da4)\"" pod="openshift-machine-config-operator/machine-config-daemon-f2lrk" podUID="7a9c8270-6964-4886-87d0-227b05b76da4" Sep 30 09:09:27 crc kubenswrapper[4760]: I0930 09:09:27.557287 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_watcher-applier-0_7c12cbe5-a6fb-4ead-bb65-cd13dab410ce/watcher-applier/0.log" Sep 30 09:09:27 crc kubenswrapper[4760]: I0930 09:09:27.889114 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_watcher-api-0_79c24341-f615-4dcf-818f-e1c398e2504d/watcher-api-log/0.log" Sep 30 09:09:29 crc kubenswrapper[4760]: I0930 09:09:29.346559 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_watcher-decision-engine-0_bd8762e1-d4b3-4999-996e-db79b881afec/watcher-decision-engine/0.log" Sep 30 09:09:31 crc kubenswrapper[4760]: I0930 09:09:31.743879 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_watcher-api-0_79c24341-f615-4dcf-818f-e1c398e2504d/watcher-api/0.log" Sep 30 
09:09:32 crc kubenswrapper[4760]: I0930 09:09:32.650490 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_memcached-0_598c1476-b9fa-48c1-a346-80e23448d00f/memcached/0.log" Sep 30 09:09:40 crc kubenswrapper[4760]: I0930 09:09:40.067773 4760 scope.go:117] "RemoveContainer" containerID="505728dcf5cec3fc994c058239c0150ac7db870b33dd3a6ddbc1c1699cc9b079" Sep 30 09:09:40 crc kubenswrapper[4760]: E0930 09:09:40.068665 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f2lrk_openshift-machine-config-operator(7a9c8270-6964-4886-87d0-227b05b76da4)\"" pod="openshift-machine-config-operator/machine-config-daemon-f2lrk" podUID="7a9c8270-6964-4886-87d0-227b05b76da4" Sep 30 09:09:55 crc kubenswrapper[4760]: I0930 09:09:55.079751 4760 scope.go:117] "RemoveContainer" containerID="505728dcf5cec3fc994c058239c0150ac7db870b33dd3a6ddbc1c1699cc9b079" Sep 30 09:09:55 crc kubenswrapper[4760]: E0930 09:09:55.080519 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f2lrk_openshift-machine-config-operator(7a9c8270-6964-4886-87d0-227b05b76da4)\"" pod="openshift-machine-config-operator/machine-config-daemon-f2lrk" podUID="7a9c8270-6964-4886-87d0-227b05b76da4" Sep 30 09:10:05 crc kubenswrapper[4760]: I0930 09:10:05.440558 4760 generic.go:334] "Generic (PLEG): container finished" podID="a7326a8e-4da0-47cf-bfd3-d9084fb32702" containerID="b4f41e3bc512b3212ea9f8e41aadbc661dd84f47ad67be4147ac357a8717711a" exitCode=0 Sep 30 09:10:05 crc kubenswrapper[4760]: I0930 09:10:05.440681 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-sjdgg/crc-debug-tzvbh" 
event={"ID":"a7326a8e-4da0-47cf-bfd3-d9084fb32702","Type":"ContainerDied","Data":"b4f41e3bc512b3212ea9f8e41aadbc661dd84f47ad67be4147ac357a8717711a"} Sep 30 09:10:06 crc kubenswrapper[4760]: I0930 09:10:06.560537 4760 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-sjdgg/crc-debug-tzvbh" Sep 30 09:10:06 crc kubenswrapper[4760]: I0930 09:10:06.596114 4760 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-sjdgg/crc-debug-tzvbh"] Sep 30 09:10:06 crc kubenswrapper[4760]: I0930 09:10:06.603704 4760 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-sjdgg/crc-debug-tzvbh"] Sep 30 09:10:06 crc kubenswrapper[4760]: I0930 09:10:06.708317 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9wvhj\" (UniqueName: \"kubernetes.io/projected/a7326a8e-4da0-47cf-bfd3-d9084fb32702-kube-api-access-9wvhj\") pod \"a7326a8e-4da0-47cf-bfd3-d9084fb32702\" (UID: \"a7326a8e-4da0-47cf-bfd3-d9084fb32702\") " Sep 30 09:10:06 crc kubenswrapper[4760]: I0930 09:10:06.708407 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/a7326a8e-4da0-47cf-bfd3-d9084fb32702-host\") pod \"a7326a8e-4da0-47cf-bfd3-d9084fb32702\" (UID: \"a7326a8e-4da0-47cf-bfd3-d9084fb32702\") " Sep 30 09:10:06 crc kubenswrapper[4760]: I0930 09:10:06.708591 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a7326a8e-4da0-47cf-bfd3-d9084fb32702-host" (OuterVolumeSpecName: "host") pod "a7326a8e-4da0-47cf-bfd3-d9084fb32702" (UID: "a7326a8e-4da0-47cf-bfd3-d9084fb32702"). InnerVolumeSpecName "host". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Sep 30 09:10:06 crc kubenswrapper[4760]: I0930 09:10:06.708926 4760 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/a7326a8e-4da0-47cf-bfd3-d9084fb32702-host\") on node \"crc\" DevicePath \"\"" Sep 30 09:10:06 crc kubenswrapper[4760]: I0930 09:10:06.713501 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a7326a8e-4da0-47cf-bfd3-d9084fb32702-kube-api-access-9wvhj" (OuterVolumeSpecName: "kube-api-access-9wvhj") pod "a7326a8e-4da0-47cf-bfd3-d9084fb32702" (UID: "a7326a8e-4da0-47cf-bfd3-d9084fb32702"). InnerVolumeSpecName "kube-api-access-9wvhj". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 09:10:06 crc kubenswrapper[4760]: I0930 09:10:06.810772 4760 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9wvhj\" (UniqueName: \"kubernetes.io/projected/a7326a8e-4da0-47cf-bfd3-d9084fb32702-kube-api-access-9wvhj\") on node \"crc\" DevicePath \"\"" Sep 30 09:10:07 crc kubenswrapper[4760]: I0930 09:10:07.084179 4760 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a7326a8e-4da0-47cf-bfd3-d9084fb32702" path="/var/lib/kubelet/pods/a7326a8e-4da0-47cf-bfd3-d9084fb32702/volumes" Sep 30 09:10:07 crc kubenswrapper[4760]: I0930 09:10:07.460621 4760 scope.go:117] "RemoveContainer" containerID="b4f41e3bc512b3212ea9f8e41aadbc661dd84f47ad67be4147ac357a8717711a" Sep 30 09:10:07 crc kubenswrapper[4760]: I0930 09:10:07.460647 4760 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-sjdgg/crc-debug-tzvbh" Sep 30 09:10:07 crc kubenswrapper[4760]: I0930 09:10:07.913032 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-sjdgg/crc-debug-qcvjz"] Sep 30 09:10:07 crc kubenswrapper[4760]: E0930 09:10:07.913416 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="877b4e69-9337-48d0-b822-933a3a44a869" containerName="extract-utilities" Sep 30 09:10:07 crc kubenswrapper[4760]: I0930 09:10:07.913430 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="877b4e69-9337-48d0-b822-933a3a44a869" containerName="extract-utilities" Sep 30 09:10:07 crc kubenswrapper[4760]: E0930 09:10:07.913440 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c82793a5-07f2-4daf-9413-8bd85869e93e" containerName="extract-content" Sep 30 09:10:07 crc kubenswrapper[4760]: I0930 09:10:07.913445 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="c82793a5-07f2-4daf-9413-8bd85869e93e" containerName="extract-content" Sep 30 09:10:07 crc kubenswrapper[4760]: E0930 09:10:07.913461 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a7326a8e-4da0-47cf-bfd3-d9084fb32702" containerName="container-00" Sep 30 09:10:07 crc kubenswrapper[4760]: I0930 09:10:07.913467 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="a7326a8e-4da0-47cf-bfd3-d9084fb32702" containerName="container-00" Sep 30 09:10:07 crc kubenswrapper[4760]: E0930 09:10:07.913479 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c82793a5-07f2-4daf-9413-8bd85869e93e" containerName="extract-utilities" Sep 30 09:10:07 crc kubenswrapper[4760]: I0930 09:10:07.913485 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="c82793a5-07f2-4daf-9413-8bd85869e93e" containerName="extract-utilities" Sep 30 09:10:07 crc kubenswrapper[4760]: E0930 09:10:07.913501 4760 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="877b4e69-9337-48d0-b822-933a3a44a869" containerName="extract-content" Sep 30 09:10:07 crc kubenswrapper[4760]: I0930 09:10:07.913507 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="877b4e69-9337-48d0-b822-933a3a44a869" containerName="extract-content" Sep 30 09:10:07 crc kubenswrapper[4760]: E0930 09:10:07.913527 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="877b4e69-9337-48d0-b822-933a3a44a869" containerName="registry-server" Sep 30 09:10:07 crc kubenswrapper[4760]: I0930 09:10:07.913532 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="877b4e69-9337-48d0-b822-933a3a44a869" containerName="registry-server" Sep 30 09:10:07 crc kubenswrapper[4760]: E0930 09:10:07.913545 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c82793a5-07f2-4daf-9413-8bd85869e93e" containerName="registry-server" Sep 30 09:10:07 crc kubenswrapper[4760]: I0930 09:10:07.913550 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="c82793a5-07f2-4daf-9413-8bd85869e93e" containerName="registry-server" Sep 30 09:10:07 crc kubenswrapper[4760]: I0930 09:10:07.913744 4760 memory_manager.go:354] "RemoveStaleState removing state" podUID="c82793a5-07f2-4daf-9413-8bd85869e93e" containerName="registry-server" Sep 30 09:10:07 crc kubenswrapper[4760]: I0930 09:10:07.913770 4760 memory_manager.go:354] "RemoveStaleState removing state" podUID="877b4e69-9337-48d0-b822-933a3a44a869" containerName="registry-server" Sep 30 09:10:07 crc kubenswrapper[4760]: I0930 09:10:07.913780 4760 memory_manager.go:354] "RemoveStaleState removing state" podUID="a7326a8e-4da0-47cf-bfd3-d9084fb32702" containerName="container-00" Sep 30 09:10:07 crc kubenswrapper[4760]: I0930 09:10:07.914476 4760 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-sjdgg/crc-debug-qcvjz" Sep 30 09:10:08 crc kubenswrapper[4760]: I0930 09:10:08.035149 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4hmqx\" (UniqueName: \"kubernetes.io/projected/0c10558b-c5cb-4a75-b650-aab68521da11-kube-api-access-4hmqx\") pod \"crc-debug-qcvjz\" (UID: \"0c10558b-c5cb-4a75-b650-aab68521da11\") " pod="openshift-must-gather-sjdgg/crc-debug-qcvjz" Sep 30 09:10:08 crc kubenswrapper[4760]: I0930 09:10:08.035206 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/0c10558b-c5cb-4a75-b650-aab68521da11-host\") pod \"crc-debug-qcvjz\" (UID: \"0c10558b-c5cb-4a75-b650-aab68521da11\") " pod="openshift-must-gather-sjdgg/crc-debug-qcvjz" Sep 30 09:10:08 crc kubenswrapper[4760]: I0930 09:10:08.066411 4760 scope.go:117] "RemoveContainer" containerID="505728dcf5cec3fc994c058239c0150ac7db870b33dd3a6ddbc1c1699cc9b079" Sep 30 09:10:08 crc kubenswrapper[4760]: E0930 09:10:08.066651 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f2lrk_openshift-machine-config-operator(7a9c8270-6964-4886-87d0-227b05b76da4)\"" pod="openshift-machine-config-operator/machine-config-daemon-f2lrk" podUID="7a9c8270-6964-4886-87d0-227b05b76da4" Sep 30 09:10:08 crc kubenswrapper[4760]: I0930 09:10:08.136798 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4hmqx\" (UniqueName: \"kubernetes.io/projected/0c10558b-c5cb-4a75-b650-aab68521da11-kube-api-access-4hmqx\") pod \"crc-debug-qcvjz\" (UID: \"0c10558b-c5cb-4a75-b650-aab68521da11\") " pod="openshift-must-gather-sjdgg/crc-debug-qcvjz" Sep 30 09:10:08 crc kubenswrapper[4760]: I0930 09:10:08.136867 
4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/0c10558b-c5cb-4a75-b650-aab68521da11-host\") pod \"crc-debug-qcvjz\" (UID: \"0c10558b-c5cb-4a75-b650-aab68521da11\") " pod="openshift-must-gather-sjdgg/crc-debug-qcvjz" Sep 30 09:10:08 crc kubenswrapper[4760]: I0930 09:10:08.136999 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/0c10558b-c5cb-4a75-b650-aab68521da11-host\") pod \"crc-debug-qcvjz\" (UID: \"0c10558b-c5cb-4a75-b650-aab68521da11\") " pod="openshift-must-gather-sjdgg/crc-debug-qcvjz" Sep 30 09:10:08 crc kubenswrapper[4760]: I0930 09:10:08.161970 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4hmqx\" (UniqueName: \"kubernetes.io/projected/0c10558b-c5cb-4a75-b650-aab68521da11-kube-api-access-4hmqx\") pod \"crc-debug-qcvjz\" (UID: \"0c10558b-c5cb-4a75-b650-aab68521da11\") " pod="openshift-must-gather-sjdgg/crc-debug-qcvjz" Sep 30 09:10:08 crc kubenswrapper[4760]: I0930 09:10:08.238784 4760 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-sjdgg/crc-debug-qcvjz" Sep 30 09:10:08 crc kubenswrapper[4760]: I0930 09:10:08.490182 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-sjdgg/crc-debug-qcvjz" event={"ID":"0c10558b-c5cb-4a75-b650-aab68521da11","Type":"ContainerStarted","Data":"4225a012921d3ad23b9209256cf9daca2a479561f16d2956e3662d88be3ffad3"} Sep 30 09:10:09 crc kubenswrapper[4760]: I0930 09:10:09.504814 4760 generic.go:334] "Generic (PLEG): container finished" podID="0c10558b-c5cb-4a75-b650-aab68521da11" containerID="de32625d893599ea58e25f401caafe89e732c3ca86bcbe693edd4cce6be6572e" exitCode=0 Sep 30 09:10:09 crc kubenswrapper[4760]: I0930 09:10:09.505214 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-sjdgg/crc-debug-qcvjz" event={"ID":"0c10558b-c5cb-4a75-b650-aab68521da11","Type":"ContainerDied","Data":"de32625d893599ea58e25f401caafe89e732c3ca86bcbe693edd4cce6be6572e"} Sep 30 09:10:10 crc kubenswrapper[4760]: I0930 09:10:10.623593 4760 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-sjdgg/crc-debug-qcvjz" Sep 30 09:10:10 crc kubenswrapper[4760]: I0930 09:10:10.688051 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4hmqx\" (UniqueName: \"kubernetes.io/projected/0c10558b-c5cb-4a75-b650-aab68521da11-kube-api-access-4hmqx\") pod \"0c10558b-c5cb-4a75-b650-aab68521da11\" (UID: \"0c10558b-c5cb-4a75-b650-aab68521da11\") " Sep 30 09:10:10 crc kubenswrapper[4760]: I0930 09:10:10.688167 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/0c10558b-c5cb-4a75-b650-aab68521da11-host\") pod \"0c10558b-c5cb-4a75-b650-aab68521da11\" (UID: \"0c10558b-c5cb-4a75-b650-aab68521da11\") " Sep 30 09:10:10 crc kubenswrapper[4760]: I0930 09:10:10.688288 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/0c10558b-c5cb-4a75-b650-aab68521da11-host" (OuterVolumeSpecName: "host") pod "0c10558b-c5cb-4a75-b650-aab68521da11" (UID: "0c10558b-c5cb-4a75-b650-aab68521da11"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Sep 30 09:10:10 crc kubenswrapper[4760]: I0930 09:10:10.688702 4760 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/0c10558b-c5cb-4a75-b650-aab68521da11-host\") on node \"crc\" DevicePath \"\"" Sep 30 09:10:10 crc kubenswrapper[4760]: I0930 09:10:10.694046 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0c10558b-c5cb-4a75-b650-aab68521da11-kube-api-access-4hmqx" (OuterVolumeSpecName: "kube-api-access-4hmqx") pod "0c10558b-c5cb-4a75-b650-aab68521da11" (UID: "0c10558b-c5cb-4a75-b650-aab68521da11"). InnerVolumeSpecName "kube-api-access-4hmqx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 09:10:10 crc kubenswrapper[4760]: I0930 09:10:10.790262 4760 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4hmqx\" (UniqueName: \"kubernetes.io/projected/0c10558b-c5cb-4a75-b650-aab68521da11-kube-api-access-4hmqx\") on node \"crc\" DevicePath \"\"" Sep 30 09:10:11 crc kubenswrapper[4760]: I0930 09:10:11.527520 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-sjdgg/crc-debug-qcvjz" event={"ID":"0c10558b-c5cb-4a75-b650-aab68521da11","Type":"ContainerDied","Data":"4225a012921d3ad23b9209256cf9daca2a479561f16d2956e3662d88be3ffad3"} Sep 30 09:10:11 crc kubenswrapper[4760]: I0930 09:10:11.527572 4760 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-sjdgg/crc-debug-qcvjz" Sep 30 09:10:11 crc kubenswrapper[4760]: I0930 09:10:11.527580 4760 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4225a012921d3ad23b9209256cf9daca2a479561f16d2956e3662d88be3ffad3" Sep 30 09:10:19 crc kubenswrapper[4760]: I0930 09:10:19.034796 4760 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-sjdgg/crc-debug-qcvjz"] Sep 30 09:10:19 crc kubenswrapper[4760]: I0930 09:10:19.042122 4760 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-sjdgg/crc-debug-qcvjz"] Sep 30 09:10:19 crc kubenswrapper[4760]: I0930 09:10:19.077711 4760 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0c10558b-c5cb-4a75-b650-aab68521da11" path="/var/lib/kubelet/pods/0c10558b-c5cb-4a75-b650-aab68521da11/volumes" Sep 30 09:10:20 crc kubenswrapper[4760]: I0930 09:10:20.221732 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-sjdgg/crc-debug-vbrjq"] Sep 30 09:10:20 crc kubenswrapper[4760]: E0930 09:10:20.222463 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0c10558b-c5cb-4a75-b650-aab68521da11" 
containerName="container-00" Sep 30 09:10:20 crc kubenswrapper[4760]: I0930 09:10:20.222477 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="0c10558b-c5cb-4a75-b650-aab68521da11" containerName="container-00" Sep 30 09:10:20 crc kubenswrapper[4760]: I0930 09:10:20.222710 4760 memory_manager.go:354] "RemoveStaleState removing state" podUID="0c10558b-c5cb-4a75-b650-aab68521da11" containerName="container-00" Sep 30 09:10:20 crc kubenswrapper[4760]: I0930 09:10:20.223397 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-sjdgg/crc-debug-vbrjq" Sep 30 09:10:20 crc kubenswrapper[4760]: I0930 09:10:20.267041 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/04e4c535-14d9-499c-bcee-1b1c805d3078-host\") pod \"crc-debug-vbrjq\" (UID: \"04e4c535-14d9-499c-bcee-1b1c805d3078\") " pod="openshift-must-gather-sjdgg/crc-debug-vbrjq" Sep 30 09:10:20 crc kubenswrapper[4760]: I0930 09:10:20.267142 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s5tgj\" (UniqueName: \"kubernetes.io/projected/04e4c535-14d9-499c-bcee-1b1c805d3078-kube-api-access-s5tgj\") pod \"crc-debug-vbrjq\" (UID: \"04e4c535-14d9-499c-bcee-1b1c805d3078\") " pod="openshift-must-gather-sjdgg/crc-debug-vbrjq" Sep 30 09:10:20 crc kubenswrapper[4760]: I0930 09:10:20.368525 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s5tgj\" (UniqueName: \"kubernetes.io/projected/04e4c535-14d9-499c-bcee-1b1c805d3078-kube-api-access-s5tgj\") pod \"crc-debug-vbrjq\" (UID: \"04e4c535-14d9-499c-bcee-1b1c805d3078\") " pod="openshift-must-gather-sjdgg/crc-debug-vbrjq" Sep 30 09:10:20 crc kubenswrapper[4760]: I0930 09:10:20.368678 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: 
\"kubernetes.io/host-path/04e4c535-14d9-499c-bcee-1b1c805d3078-host\") pod \"crc-debug-vbrjq\" (UID: \"04e4c535-14d9-499c-bcee-1b1c805d3078\") " pod="openshift-must-gather-sjdgg/crc-debug-vbrjq" Sep 30 09:10:20 crc kubenswrapper[4760]: I0930 09:10:20.368852 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/04e4c535-14d9-499c-bcee-1b1c805d3078-host\") pod \"crc-debug-vbrjq\" (UID: \"04e4c535-14d9-499c-bcee-1b1c805d3078\") " pod="openshift-must-gather-sjdgg/crc-debug-vbrjq" Sep 30 09:10:20 crc kubenswrapper[4760]: I0930 09:10:20.390255 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s5tgj\" (UniqueName: \"kubernetes.io/projected/04e4c535-14d9-499c-bcee-1b1c805d3078-kube-api-access-s5tgj\") pod \"crc-debug-vbrjq\" (UID: \"04e4c535-14d9-499c-bcee-1b1c805d3078\") " pod="openshift-must-gather-sjdgg/crc-debug-vbrjq" Sep 30 09:10:20 crc kubenswrapper[4760]: I0930 09:10:20.543807 4760 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-sjdgg/crc-debug-vbrjq" Sep 30 09:10:20 crc kubenswrapper[4760]: W0930 09:10:20.608421 4760 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod04e4c535_14d9_499c_bcee_1b1c805d3078.slice/crio-eed6e6aae3e64917e31c12593d29e31402c19bd3af21a7ca46decab77f87d63a WatchSource:0}: Error finding container eed6e6aae3e64917e31c12593d29e31402c19bd3af21a7ca46decab77f87d63a: Status 404 returned error can't find the container with id eed6e6aae3e64917e31c12593d29e31402c19bd3af21a7ca46decab77f87d63a Sep 30 09:10:21 crc kubenswrapper[4760]: I0930 09:10:21.624133 4760 generic.go:334] "Generic (PLEG): container finished" podID="04e4c535-14d9-499c-bcee-1b1c805d3078" containerID="832354125c7faf0e990f5fa40f4f6b9cf125b638e35a83c94b475e4bc1fd9dcd" exitCode=0 Sep 30 09:10:21 crc kubenswrapper[4760]: I0930 09:10:21.624252 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-sjdgg/crc-debug-vbrjq" event={"ID":"04e4c535-14d9-499c-bcee-1b1c805d3078","Type":"ContainerDied","Data":"832354125c7faf0e990f5fa40f4f6b9cf125b638e35a83c94b475e4bc1fd9dcd"} Sep 30 09:10:21 crc kubenswrapper[4760]: I0930 09:10:21.625994 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-sjdgg/crc-debug-vbrjq" event={"ID":"04e4c535-14d9-499c-bcee-1b1c805d3078","Type":"ContainerStarted","Data":"eed6e6aae3e64917e31c12593d29e31402c19bd3af21a7ca46decab77f87d63a"} Sep 30 09:10:21 crc kubenswrapper[4760]: I0930 09:10:21.727237 4760 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-sjdgg/crc-debug-vbrjq"] Sep 30 09:10:21 crc kubenswrapper[4760]: I0930 09:10:21.737905 4760 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-sjdgg/crc-debug-vbrjq"] Sep 30 09:10:22 crc kubenswrapper[4760]: I0930 09:10:22.736234 4760 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-sjdgg/crc-debug-vbrjq" Sep 30 09:10:22 crc kubenswrapper[4760]: I0930 09:10:22.923910 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s5tgj\" (UniqueName: \"kubernetes.io/projected/04e4c535-14d9-499c-bcee-1b1c805d3078-kube-api-access-s5tgj\") pod \"04e4c535-14d9-499c-bcee-1b1c805d3078\" (UID: \"04e4c535-14d9-499c-bcee-1b1c805d3078\") " Sep 30 09:10:22 crc kubenswrapper[4760]: I0930 09:10:22.924186 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/04e4c535-14d9-499c-bcee-1b1c805d3078-host\") pod \"04e4c535-14d9-499c-bcee-1b1c805d3078\" (UID: \"04e4c535-14d9-499c-bcee-1b1c805d3078\") " Sep 30 09:10:22 crc kubenswrapper[4760]: I0930 09:10:22.924615 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/04e4c535-14d9-499c-bcee-1b1c805d3078-host" (OuterVolumeSpecName: "host") pod "04e4c535-14d9-499c-bcee-1b1c805d3078" (UID: "04e4c535-14d9-499c-bcee-1b1c805d3078"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Sep 30 09:10:22 crc kubenswrapper[4760]: I0930 09:10:22.941941 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/04e4c535-14d9-499c-bcee-1b1c805d3078-kube-api-access-s5tgj" (OuterVolumeSpecName: "kube-api-access-s5tgj") pod "04e4c535-14d9-499c-bcee-1b1c805d3078" (UID: "04e4c535-14d9-499c-bcee-1b1c805d3078"). InnerVolumeSpecName "kube-api-access-s5tgj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 09:10:23 crc kubenswrapper[4760]: I0930 09:10:23.025975 4760 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/04e4c535-14d9-499c-bcee-1b1c805d3078-host\") on node \"crc\" DevicePath \"\"" Sep 30 09:10:23 crc kubenswrapper[4760]: I0930 09:10:23.026006 4760 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s5tgj\" (UniqueName: \"kubernetes.io/projected/04e4c535-14d9-499c-bcee-1b1c805d3078-kube-api-access-s5tgj\") on node \"crc\" DevicePath \"\"" Sep 30 09:10:23 crc kubenswrapper[4760]: I0930 09:10:23.066752 4760 scope.go:117] "RemoveContainer" containerID="505728dcf5cec3fc994c058239c0150ac7db870b33dd3a6ddbc1c1699cc9b079" Sep 30 09:10:23 crc kubenswrapper[4760]: E0930 09:10:23.067063 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f2lrk_openshift-machine-config-operator(7a9c8270-6964-4886-87d0-227b05b76da4)\"" pod="openshift-machine-config-operator/machine-config-daemon-f2lrk" podUID="7a9c8270-6964-4886-87d0-227b05b76da4" Sep 30 09:10:23 crc kubenswrapper[4760]: I0930 09:10:23.076649 4760 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="04e4c535-14d9-499c-bcee-1b1c805d3078" path="/var/lib/kubelet/pods/04e4c535-14d9-499c-bcee-1b1c805d3078/volumes" Sep 30 09:10:23 crc kubenswrapper[4760]: E0930 09:10:23.208676 4760 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod04e4c535_14d9_499c_bcee_1b1c805d3078.slice\": RecentStats: unable to find data in memory cache]" Sep 30 09:10:23 crc kubenswrapper[4760]: I0930 09:10:23.347218 4760 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_2be1a55d13f98b28512c5bf715627f2108773b920b5e68da320e70991fjxzrh_59175508-6983-4846-a813-05181244346d/util/0.log" Sep 30 09:10:23 crc kubenswrapper[4760]: I0930 09:10:23.564484 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_2be1a55d13f98b28512c5bf715627f2108773b920b5e68da320e70991fjxzrh_59175508-6983-4846-a813-05181244346d/pull/0.log" Sep 30 09:10:23 crc kubenswrapper[4760]: I0930 09:10:23.570074 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_2be1a55d13f98b28512c5bf715627f2108773b920b5e68da320e70991fjxzrh_59175508-6983-4846-a813-05181244346d/util/0.log" Sep 30 09:10:23 crc kubenswrapper[4760]: I0930 09:10:23.570822 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_2be1a55d13f98b28512c5bf715627f2108773b920b5e68da320e70991fjxzrh_59175508-6983-4846-a813-05181244346d/pull/0.log" Sep 30 09:10:23 crc kubenswrapper[4760]: I0930 09:10:23.647314 4760 scope.go:117] "RemoveContainer" containerID="832354125c7faf0e990f5fa40f4f6b9cf125b638e35a83c94b475e4bc1fd9dcd" Sep 30 09:10:23 crc kubenswrapper[4760]: I0930 09:10:23.647334 4760 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-sjdgg/crc-debug-vbrjq" Sep 30 09:10:23 crc kubenswrapper[4760]: I0930 09:10:23.747837 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_2be1a55d13f98b28512c5bf715627f2108773b920b5e68da320e70991fjxzrh_59175508-6983-4846-a813-05181244346d/util/0.log" Sep 30 09:10:23 crc kubenswrapper[4760]: I0930 09:10:23.752066 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_2be1a55d13f98b28512c5bf715627f2108773b920b5e68da320e70991fjxzrh_59175508-6983-4846-a813-05181244346d/pull/0.log" Sep 30 09:10:23 crc kubenswrapper[4760]: I0930 09:10:23.784052 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_2be1a55d13f98b28512c5bf715627f2108773b920b5e68da320e70991fjxzrh_59175508-6983-4846-a813-05181244346d/extract/0.log" Sep 30 09:10:23 crc kubenswrapper[4760]: I0930 09:10:23.956498 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-6ff8b75857-n528w_90fe11d3-6b6b-46c3-9833-d68d080144b9/kube-rbac-proxy/0.log" Sep 30 09:10:23 crc kubenswrapper[4760]: I0930 09:10:23.975808 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-644bddb6d8-s4vf9_7fdb76d3-726a-416a-9b64-df2d6a67d88a/kube-rbac-proxy/0.log" Sep 30 09:10:23 crc kubenswrapper[4760]: I0930 09:10:23.986763 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-6ff8b75857-n528w_90fe11d3-6b6b-46c3-9833-d68d080144b9/manager/0.log" Sep 30 09:10:24 crc kubenswrapper[4760]: I0930 09:10:24.171510 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-644bddb6d8-s4vf9_7fdb76d3-726a-416a-9b64-df2d6a67d88a/manager/0.log" Sep 30 09:10:24 crc kubenswrapper[4760]: I0930 09:10:24.176480 4760 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_designate-operator-controller-manager-84f4f7b77b-2zwsf_0673130a-0175-41b4-a8d8-188c7a39caa0/manager/0.log" Sep 30 09:10:24 crc kubenswrapper[4760]: I0930 09:10:24.184663 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-84f4f7b77b-2zwsf_0673130a-0175-41b4-a8d8-188c7a39caa0/kube-rbac-proxy/0.log" Sep 30 09:10:24 crc kubenswrapper[4760]: I0930 09:10:24.359784 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-84958c4d49-mcrx5_93eb25ad-5a9d-4044-ba79-8869b28787dd/kube-rbac-proxy/0.log" Sep 30 09:10:24 crc kubenswrapper[4760]: I0930 09:10:24.424680 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-84958c4d49-mcrx5_93eb25ad-5a9d-4044-ba79-8869b28787dd/manager/0.log" Sep 30 09:10:24 crc kubenswrapper[4760]: I0930 09:10:24.525059 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-5d889d78cf-7brm5_fff7432b-8ea3-4b35-8726-640f02bd8d58/kube-rbac-proxy/0.log" Sep 30 09:10:24 crc kubenswrapper[4760]: I0930 09:10:24.557513 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-5d889d78cf-7brm5_fff7432b-8ea3-4b35-8726-640f02bd8d58/manager/0.log" Sep 30 09:10:24 crc kubenswrapper[4760]: I0930 09:10:24.612034 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-9f4696d94-j5vp7_28a5f605-2c82-4747-8b3d-2704804e81ec/kube-rbac-proxy/0.log" Sep 30 09:10:24 crc kubenswrapper[4760]: I0930 09:10:24.734783 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-9f4696d94-j5vp7_28a5f605-2c82-4747-8b3d-2704804e81ec/manager/0.log" Sep 30 09:10:24 crc kubenswrapper[4760]: I0930 09:10:24.771483 
4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-7d857cc749-nqdfs_18445f5d-fbcd-4bdb-9b9b-ed15e8a6b75b/kube-rbac-proxy/0.log" Sep 30 09:10:24 crc kubenswrapper[4760]: I0930 09:10:24.941367 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-7975b88857-7d78g_626c03a1-0630-42af-a1c4-af6e2c3584a5/kube-rbac-proxy/0.log" Sep 30 09:10:24 crc kubenswrapper[4760]: I0930 09:10:24.942505 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-7d857cc749-nqdfs_18445f5d-fbcd-4bdb-9b9b-ed15e8a6b75b/manager/0.log" Sep 30 09:10:24 crc kubenswrapper[4760]: I0930 09:10:24.980625 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-7975b88857-7d78g_626c03a1-0630-42af-a1c4-af6e2c3584a5/manager/0.log" Sep 30 09:10:25 crc kubenswrapper[4760]: I0930 09:10:25.146687 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-5bd55b4bff-rvrn9_27eb71ec-2145-426e-86fd-f31166b969e8/kube-rbac-proxy/0.log" Sep 30 09:10:25 crc kubenswrapper[4760]: I0930 09:10:25.203795 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-5bd55b4bff-rvrn9_27eb71ec-2145-426e-86fd-f31166b969e8/manager/0.log" Sep 30 09:10:25 crc kubenswrapper[4760]: I0930 09:10:25.298572 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-6d68dbc695-5k4x2_26e8229c-cd7b-4eab-a36c-e94d5a367224/kube-rbac-proxy/0.log" Sep 30 09:10:25 crc kubenswrapper[4760]: I0930 09:10:25.338540 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-6d68dbc695-5k4x2_26e8229c-cd7b-4eab-a36c-e94d5a367224/manager/0.log" Sep 30 09:10:25 crc 
kubenswrapper[4760]: I0930 09:10:25.393985 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-88c7-69mqs_c4bab529-6936-4f18-b4c9-4d8202e1cf6a/kube-rbac-proxy/0.log" Sep 30 09:10:25 crc kubenswrapper[4760]: I0930 09:10:25.467544 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-88c7-69mqs_c4bab529-6936-4f18-b4c9-4d8202e1cf6a/manager/0.log" Sep 30 09:10:25 crc kubenswrapper[4760]: I0930 09:10:25.572034 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-64d7b59854-gnfhj_070b883a-da84-454e-a2d3-cc43fbf5251a/kube-rbac-proxy/0.log" Sep 30 09:10:25 crc kubenswrapper[4760]: I0930 09:10:25.643230 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-64d7b59854-gnfhj_070b883a-da84-454e-a2d3-cc43fbf5251a/manager/0.log" Sep 30 09:10:25 crc kubenswrapper[4760]: I0930 09:10:25.708878 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-c7c776c96-95qms_c0be2186-ebe8-4634-942e-fcf6f5c0fdf6/kube-rbac-proxy/0.log" Sep 30 09:10:25 crc kubenswrapper[4760]: I0930 09:10:25.887276 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-c7c776c96-95qms_c0be2186-ebe8-4634-942e-fcf6f5c0fdf6/manager/0.log" Sep 30 09:10:25 crc kubenswrapper[4760]: I0930 09:10:25.895922 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-76fcc6dc7c-p2qg2_be15e869-eae3-4164-a9b3-ba2d16238186/kube-rbac-proxy/0.log" Sep 30 09:10:25 crc kubenswrapper[4760]: I0930 09:10:25.941853 4760 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-76fcc6dc7c-p2qg2_be15e869-eae3-4164-a9b3-ba2d16238186/manager/0.log" Sep 30 09:10:26 crc kubenswrapper[4760]: I0930 09:10:26.090864 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-6d776955-qdj9t_7fac6c59-9344-46b8-b4ce-30b80c6a8b53/manager/0.log" Sep 30 09:10:26 crc kubenswrapper[4760]: I0930 09:10:26.091121 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-6d776955-qdj9t_7fac6c59-9344-46b8-b4ce-30b80c6a8b53/kube-rbac-proxy/0.log" Sep 30 09:10:26 crc kubenswrapper[4760]: I0930 09:10:26.189641 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-df6bd9948-rjq2r_fe420b73-f7ff-40e5-8b63-475e61942e3d/kube-rbac-proxy/0.log" Sep 30 09:10:26 crc kubenswrapper[4760]: I0930 09:10:26.328325 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-operator-799b749c5f-bxgqk_ecfaa27d-a6ba-432f-8a63-80706fcdf76a/kube-rbac-proxy/0.log" Sep 30 09:10:26 crc kubenswrapper[4760]: I0930 09:10:26.575369 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-operator-799b749c5f-bxgqk_ecfaa27d-a6ba-432f-8a63-80706fcdf76a/operator/0.log" Sep 30 09:10:26 crc kubenswrapper[4760]: I0930 09:10:26.580489 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-index-hpsst_c076d5bf-bc69-4e76-b891-4d5c4387d68c/registry-server/0.log" Sep 30 09:10:26 crc kubenswrapper[4760]: I0930 09:10:26.779671 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-9976ff44c-64gg8_8bc774be-38eb-4c0a-9c02-fb39c645cc28/kube-rbac-proxy/0.log" Sep 30 09:10:26 crc kubenswrapper[4760]: I0930 
09:10:26.844003 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-9976ff44c-64gg8_8bc774be-38eb-4c0a-9c02-fb39c645cc28/manager/0.log" Sep 30 09:10:26 crc kubenswrapper[4760]: I0930 09:10:26.946558 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-589c58c6c-mzwdq_000713c9-22e2-4251-b81d-e1d47a48184e/kube-rbac-proxy/0.log" Sep 30 09:10:27 crc kubenswrapper[4760]: I0930 09:10:27.047616 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-589c58c6c-mzwdq_000713c9-22e2-4251-b81d-e1d47a48184e/manager/0.log" Sep 30 09:10:27 crc kubenswrapper[4760]: I0930 09:10:27.132926 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_rabbitmq-cluster-operator-manager-79d8469568-whhn2_f7777c80-60ad-47c2-a76a-002f99b89d61/operator/0.log" Sep 30 09:10:27 crc kubenswrapper[4760]: I0930 09:10:27.277554 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-bc7dc7bd9-kbsdn_7a562d30-ce00-4dca-9792-6687cf729825/kube-rbac-proxy/0.log" Sep 30 09:10:27 crc kubenswrapper[4760]: I0930 09:10:27.350462 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-bc7dc7bd9-kbsdn_7a562d30-ce00-4dca-9792-6687cf729825/manager/0.log" Sep 30 09:10:27 crc kubenswrapper[4760]: I0930 09:10:27.400551 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-b8d54b5d7-wwngp_31832467-ab15-475b-a71b-7263e64cdff9/kube-rbac-proxy/0.log" Sep 30 09:10:27 crc kubenswrapper[4760]: I0930 09:10:27.622493 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-f66b554c6-l4g6m_cd75a50f-b3a1-4bef-ac18-e574ef6815ec/kube-rbac-proxy/0.log" Sep 30 
09:10:27 crc kubenswrapper[4760]: I0930 09:10:27.632983 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-df6bd9948-rjq2r_fe420b73-f7ff-40e5-8b63-475e61942e3d/manager/0.log" Sep 30 09:10:27 crc kubenswrapper[4760]: I0930 09:10:27.681015 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-f66b554c6-l4g6m_cd75a50f-b3a1-4bef-ac18-e574ef6815ec/manager/0.log" Sep 30 09:10:27 crc kubenswrapper[4760]: I0930 09:10:27.758042 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-b8d54b5d7-wwngp_31832467-ab15-475b-a71b-7263e64cdff9/manager/0.log" Sep 30 09:10:27 crc kubenswrapper[4760]: I0930 09:10:27.824654 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-6c459b467f-xxhnp_08bd2560-a223-4d1d-abf6-cf3686f1ded2/kube-rbac-proxy/0.log" Sep 30 09:10:27 crc kubenswrapper[4760]: I0930 09:10:27.891627 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-6c459b467f-xxhnp_08bd2560-a223-4d1d-abf6-cf3686f1ded2/manager/0.log" Sep 30 09:10:34 crc kubenswrapper[4760]: I0930 09:10:34.066727 4760 scope.go:117] "RemoveContainer" containerID="505728dcf5cec3fc994c058239c0150ac7db870b33dd3a6ddbc1c1699cc9b079" Sep 30 09:10:34 crc kubenswrapper[4760]: E0930 09:10:34.067341 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f2lrk_openshift-machine-config-operator(7a9c8270-6964-4886-87d0-227b05b76da4)\"" pod="openshift-machine-config-operator/machine-config-daemon-f2lrk" podUID="7a9c8270-6964-4886-87d0-227b05b76da4" Sep 30 09:10:42 crc kubenswrapper[4760]: I0930 09:10:42.882501 4760 
log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-78cbb6b69f-tlrrq_a96a9516-5f80-4391-a1f2-f4b7531e65fa/control-plane-machine-set-operator/0.log" Sep 30 09:10:43 crc kubenswrapper[4760]: I0930 09:10:43.051154 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-4lv9x_2fb43f32-6ad4-4450-8a05-80570020d5e8/kube-rbac-proxy/0.log" Sep 30 09:10:43 crc kubenswrapper[4760]: I0930 09:10:43.072628 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-4lv9x_2fb43f32-6ad4-4450-8a05-80570020d5e8/machine-api-operator/0.log" Sep 30 09:10:45 crc kubenswrapper[4760]: I0930 09:10:45.074074 4760 scope.go:117] "RemoveContainer" containerID="505728dcf5cec3fc994c058239c0150ac7db870b33dd3a6ddbc1c1699cc9b079" Sep 30 09:10:45 crc kubenswrapper[4760]: E0930 09:10:45.074551 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f2lrk_openshift-machine-config-operator(7a9c8270-6964-4886-87d0-227b05b76da4)\"" pod="openshift-machine-config-operator/machine-config-daemon-f2lrk" podUID="7a9c8270-6964-4886-87d0-227b05b76da4" Sep 30 09:10:54 crc kubenswrapper[4760]: I0930 09:10:54.478657 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-5b446d88c5-7f49c_6c1a0dc3-5f08-4216-99c7-ef1889df0775/cert-manager-controller/0.log" Sep 30 09:10:54 crc kubenswrapper[4760]: I0930 09:10:54.651888 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-cainjector-7f985d654d-9kkx2_59941c26-7746-44e9-8453-21d64dbdb91b/cert-manager-cainjector/0.log" Sep 30 09:10:54 crc kubenswrapper[4760]: I0930 09:10:54.714592 4760 log.go:25] "Finished parsing log file" 
path="/var/log/pods/cert-manager_cert-manager-webhook-5655c58dd6-zptcg_49a05254-e89a-4b7c-b128-0a50daab0f7d/cert-manager-webhook/0.log" Sep 30 09:11:00 crc kubenswrapper[4760]: I0930 09:11:00.067293 4760 scope.go:117] "RemoveContainer" containerID="505728dcf5cec3fc994c058239c0150ac7db870b33dd3a6ddbc1c1699cc9b079" Sep 30 09:11:00 crc kubenswrapper[4760]: E0930 09:11:00.068264 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f2lrk_openshift-machine-config-operator(7a9c8270-6964-4886-87d0-227b05b76da4)\"" pod="openshift-machine-config-operator/machine-config-daemon-f2lrk" podUID="7a9c8270-6964-4886-87d0-227b05b76da4" Sep 30 09:11:06 crc kubenswrapper[4760]: I0930 09:11:06.193144 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-console-plugin-864bb6dfb5-9kztb_a755b893-8456-4ee5-88cd-6e38a665c659/nmstate-console-plugin/0.log" Sep 30 09:11:06 crc kubenswrapper[4760]: I0930 09:11:06.319999 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-handler-fqhzn_9c04978e-fe0a-4324-b6ce-b9b6b70bf305/nmstate-handler/0.log" Sep 30 09:11:06 crc kubenswrapper[4760]: I0930 09:11:06.348090 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-58fcddf996-lt9bl_5f924622-9974-450d-b3a1-bb5fc8100ad6/kube-rbac-proxy/0.log" Sep 30 09:11:06 crc kubenswrapper[4760]: I0930 09:11:06.414387 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-58fcddf996-lt9bl_5f924622-9974-450d-b3a1-bb5fc8100ad6/nmstate-metrics/0.log" Sep 30 09:11:06 crc kubenswrapper[4760]: I0930 09:11:06.522753 4760 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-nmstate_nmstate-operator-5d6f6cfd66-g68gd_40eb0a4f-5fde-42fa-a5c0-283ccab9a683/nmstate-operator/0.log" Sep 30 09:11:06 crc kubenswrapper[4760]: I0930 09:11:06.614559 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-webhook-6d689559c5-p594d_759b23d3-847f-4d3a-9141-5c2cfad8664b/nmstate-webhook/0.log" Sep 30 09:11:15 crc kubenswrapper[4760]: I0930 09:11:15.076859 4760 scope.go:117] "RemoveContainer" containerID="505728dcf5cec3fc994c058239c0150ac7db870b33dd3a6ddbc1c1699cc9b079" Sep 30 09:11:15 crc kubenswrapper[4760]: E0930 09:11:15.077805 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f2lrk_openshift-machine-config-operator(7a9c8270-6964-4886-87d0-227b05b76da4)\"" pod="openshift-machine-config-operator/machine-config-daemon-f2lrk" podUID="7a9c8270-6964-4886-87d0-227b05b76da4" Sep 30 09:11:19 crc kubenswrapper[4760]: I0930 09:11:19.980415 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-5d688f5ffc-ldwsd_670bf4cb-7ea4-4ffb-af92-0f727878a518/kube-rbac-proxy/0.log" Sep 30 09:11:20 crc kubenswrapper[4760]: I0930 09:11:20.201501 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-5d688f5ffc-ldwsd_670bf4cb-7ea4-4ffb-af92-0f727878a518/controller/0.log" Sep 30 09:11:20 crc kubenswrapper[4760]: I0930 09:11:20.262270 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-tqbpr_213dbc76-320c-463f-8133-946be9ece565/cp-frr-files/0.log" Sep 30 09:11:20 crc kubenswrapper[4760]: I0930 09:11:20.462328 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-tqbpr_213dbc76-320c-463f-8133-946be9ece565/cp-reloader/0.log" Sep 30 09:11:20 crc kubenswrapper[4760]: I0930 09:11:20.487073 4760 
log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-tqbpr_213dbc76-320c-463f-8133-946be9ece565/cp-frr-files/0.log" Sep 30 09:11:20 crc kubenswrapper[4760]: I0930 09:11:20.504361 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-tqbpr_213dbc76-320c-463f-8133-946be9ece565/cp-metrics/0.log" Sep 30 09:11:20 crc kubenswrapper[4760]: I0930 09:11:20.505061 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-tqbpr_213dbc76-320c-463f-8133-946be9ece565/cp-reloader/0.log" Sep 30 09:11:20 crc kubenswrapper[4760]: I0930 09:11:20.631027 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-tqbpr_213dbc76-320c-463f-8133-946be9ece565/cp-frr-files/0.log" Sep 30 09:11:20 crc kubenswrapper[4760]: I0930 09:11:20.680404 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-tqbpr_213dbc76-320c-463f-8133-946be9ece565/cp-metrics/0.log" Sep 30 09:11:20 crc kubenswrapper[4760]: I0930 09:11:20.692523 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-tqbpr_213dbc76-320c-463f-8133-946be9ece565/cp-reloader/0.log" Sep 30 09:11:20 crc kubenswrapper[4760]: I0930 09:11:20.699607 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-tqbpr_213dbc76-320c-463f-8133-946be9ece565/cp-metrics/0.log" Sep 30 09:11:20 crc kubenswrapper[4760]: I0930 09:11:20.886194 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-tqbpr_213dbc76-320c-463f-8133-946be9ece565/cp-frr-files/0.log" Sep 30 09:11:20 crc kubenswrapper[4760]: I0930 09:11:20.909790 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-tqbpr_213dbc76-320c-463f-8133-946be9ece565/cp-metrics/0.log" Sep 30 09:11:20 crc kubenswrapper[4760]: I0930 09:11:20.938036 4760 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_frr-k8s-tqbpr_213dbc76-320c-463f-8133-946be9ece565/controller/0.log" Sep 30 09:11:20 crc kubenswrapper[4760]: I0930 09:11:20.943007 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-tqbpr_213dbc76-320c-463f-8133-946be9ece565/cp-reloader/0.log" Sep 30 09:11:21 crc kubenswrapper[4760]: I0930 09:11:21.104186 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-tqbpr_213dbc76-320c-463f-8133-946be9ece565/frr-metrics/0.log" Sep 30 09:11:21 crc kubenswrapper[4760]: I0930 09:11:21.154800 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-tqbpr_213dbc76-320c-463f-8133-946be9ece565/kube-rbac-proxy/0.log" Sep 30 09:11:21 crc kubenswrapper[4760]: I0930 09:11:21.181126 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-tqbpr_213dbc76-320c-463f-8133-946be9ece565/kube-rbac-proxy-frr/0.log" Sep 30 09:11:21 crc kubenswrapper[4760]: I0930 09:11:21.325199 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-tqbpr_213dbc76-320c-463f-8133-946be9ece565/reloader/0.log" Sep 30 09:11:21 crc kubenswrapper[4760]: I0930 09:11:21.381930 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-webhook-server-5478bdb765-69f49_ffa35c7d-6788-4902-8863-7346389154cd/frr-k8s-webhook-server/0.log" Sep 30 09:11:21 crc kubenswrapper[4760]: I0930 09:11:21.600021 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-controller-manager-7db464cf7c-k5lfr_900aa033-c62f-42f8-a964-9d0e113eca21/manager/0.log" Sep 30 09:11:21 crc kubenswrapper[4760]: I0930 09:11:21.811184 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-webhook-server-54c4f7bf85-ndtrq_be9587c7-9bbb-48ad-867a-1830129f24b3/webhook-server/0.log" Sep 30 09:11:21 crc kubenswrapper[4760]: I0930 09:11:21.853954 4760 log.go:25] 
"Finished parsing log file" path="/var/log/pods/metallb-system_speaker-ldvzg_e89684e7-e01f-4427-9479-999c5f101902/kube-rbac-proxy/0.log" Sep 30 09:11:22 crc kubenswrapper[4760]: I0930 09:11:22.494136 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-ldvzg_e89684e7-e01f-4427-9479-999c5f101902/speaker/0.log" Sep 30 09:11:22 crc kubenswrapper[4760]: I0930 09:11:22.740773 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-tqbpr_213dbc76-320c-463f-8133-946be9ece565/frr/0.log" Sep 30 09:11:27 crc kubenswrapper[4760]: I0930 09:11:27.067173 4760 scope.go:117] "RemoveContainer" containerID="505728dcf5cec3fc994c058239c0150ac7db870b33dd3a6ddbc1c1699cc9b079" Sep 30 09:11:27 crc kubenswrapper[4760]: E0930 09:11:27.068175 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f2lrk_openshift-machine-config-operator(7a9c8270-6964-4886-87d0-227b05b76da4)\"" pod="openshift-machine-config-operator/machine-config-daemon-f2lrk" podUID="7a9c8270-6964-4886-87d0-227b05b76da4" Sep 30 09:11:34 crc kubenswrapper[4760]: I0930 09:11:34.275816 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bc9zdf8_309b7e9d-3273-4a4c-865d-9287bab3988f/util/0.log" Sep 30 09:11:34 crc kubenswrapper[4760]: I0930 09:11:34.457680 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bc9zdf8_309b7e9d-3273-4a4c-865d-9287bab3988f/util/0.log" Sep 30 09:11:34 crc kubenswrapper[4760]: I0930 09:11:34.464593 4760 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bc9zdf8_309b7e9d-3273-4a4c-865d-9287bab3988f/pull/0.log" Sep 30 09:11:34 crc kubenswrapper[4760]: I0930 09:11:34.546607 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bc9zdf8_309b7e9d-3273-4a4c-865d-9287bab3988f/pull/0.log" Sep 30 09:11:34 crc kubenswrapper[4760]: I0930 09:11:34.675391 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bc9zdf8_309b7e9d-3273-4a4c-865d-9287bab3988f/util/0.log" Sep 30 09:11:34 crc kubenswrapper[4760]: I0930 09:11:34.679432 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bc9zdf8_309b7e9d-3273-4a4c-865d-9287bab3988f/pull/0.log" Sep 30 09:11:34 crc kubenswrapper[4760]: I0930 09:11:34.685336 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bc9zdf8_309b7e9d-3273-4a4c-865d-9287bab3988f/extract/0.log" Sep 30 09:11:34 crc kubenswrapper[4760]: I0930 09:11:34.830320 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dzf76f_32cb0fa2-d830-4589-8379-418cf93913d5/util/0.log" Sep 30 09:11:34 crc kubenswrapper[4760]: I0930 09:11:34.981947 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dzf76f_32cb0fa2-d830-4589-8379-418cf93913d5/util/0.log" Sep 30 09:11:35 crc kubenswrapper[4760]: I0930 09:11:35.014795 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dzf76f_32cb0fa2-d830-4589-8379-418cf93913d5/pull/0.log" Sep 30 
09:11:35 crc kubenswrapper[4760]: I0930 09:11:35.035045 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dzf76f_32cb0fa2-d830-4589-8379-418cf93913d5/pull/0.log" Sep 30 09:11:35 crc kubenswrapper[4760]: I0930 09:11:35.201785 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dzf76f_32cb0fa2-d830-4589-8379-418cf93913d5/util/0.log" Sep 30 09:11:35 crc kubenswrapper[4760]: I0930 09:11:35.248534 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dzf76f_32cb0fa2-d830-4589-8379-418cf93913d5/pull/0.log" Sep 30 09:11:35 crc kubenswrapper[4760]: I0930 09:11:35.278762 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dzf76f_32cb0fa2-d830-4589-8379-418cf93913d5/extract/0.log" Sep 30 09:11:35 crc kubenswrapper[4760]: I0930 09:11:35.393320 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-nkbp8_3f9a39d0-3377-49d8-b54f-6cfac198199f/extract-utilities/0.log" Sep 30 09:11:35 crc kubenswrapper[4760]: I0930 09:11:35.534292 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-nkbp8_3f9a39d0-3377-49d8-b54f-6cfac198199f/extract-content/0.log" Sep 30 09:11:35 crc kubenswrapper[4760]: I0930 09:11:35.574364 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-nkbp8_3f9a39d0-3377-49d8-b54f-6cfac198199f/extract-content/0.log" Sep 30 09:11:35 crc kubenswrapper[4760]: I0930 09:11:35.596728 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-nkbp8_3f9a39d0-3377-49d8-b54f-6cfac198199f/extract-utilities/0.log" Sep 30 
09:11:35 crc kubenswrapper[4760]: I0930 09:11:35.757946 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-nkbp8_3f9a39d0-3377-49d8-b54f-6cfac198199f/extract-content/0.log" Sep 30 09:11:35 crc kubenswrapper[4760]: I0930 09:11:35.769792 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-nkbp8_3f9a39d0-3377-49d8-b54f-6cfac198199f/extract-utilities/0.log" Sep 30 09:11:35 crc kubenswrapper[4760]: I0930 09:11:35.958135 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-zmsbv_8065abae-3351-4daf-9aff-8bf97affce6a/extract-utilities/0.log" Sep 30 09:11:36 crc kubenswrapper[4760]: I0930 09:11:36.221601 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-zmsbv_8065abae-3351-4daf-9aff-8bf97affce6a/extract-content/0.log" Sep 30 09:11:36 crc kubenswrapper[4760]: I0930 09:11:36.226259 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-zmsbv_8065abae-3351-4daf-9aff-8bf97affce6a/extract-utilities/0.log" Sep 30 09:11:36 crc kubenswrapper[4760]: I0930 09:11:36.237563 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-zmsbv_8065abae-3351-4daf-9aff-8bf97affce6a/extract-content/0.log" Sep 30 09:11:36 crc kubenswrapper[4760]: I0930 09:11:36.476198 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-zmsbv_8065abae-3351-4daf-9aff-8bf97affce6a/extract-content/0.log" Sep 30 09:11:36 crc kubenswrapper[4760]: I0930 09:11:36.484044 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-zmsbv_8065abae-3351-4daf-9aff-8bf97affce6a/extract-utilities/0.log" Sep 30 09:11:36 crc kubenswrapper[4760]: I0930 09:11:36.705085 4760 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d967k2mh_a0b17021-6ad1-473c-ba06-7d4ba8eb162a/util/0.log" Sep 30 09:11:36 crc kubenswrapper[4760]: I0930 09:11:36.713162 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-nkbp8_3f9a39d0-3377-49d8-b54f-6cfac198199f/registry-server/0.log" Sep 30 09:11:36 crc kubenswrapper[4760]: I0930 09:11:36.838234 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-zmsbv_8065abae-3351-4daf-9aff-8bf97affce6a/registry-server/0.log" Sep 30 09:11:36 crc kubenswrapper[4760]: I0930 09:11:36.911909 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d967k2mh_a0b17021-6ad1-473c-ba06-7d4ba8eb162a/pull/0.log" Sep 30 09:11:36 crc kubenswrapper[4760]: I0930 09:11:36.929414 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d967k2mh_a0b17021-6ad1-473c-ba06-7d4ba8eb162a/pull/0.log" Sep 30 09:11:36 crc kubenswrapper[4760]: I0930 09:11:36.960916 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d967k2mh_a0b17021-6ad1-473c-ba06-7d4ba8eb162a/util/0.log" Sep 30 09:11:37 crc kubenswrapper[4760]: I0930 09:11:37.176519 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d967k2mh_a0b17021-6ad1-473c-ba06-7d4ba8eb162a/util/0.log" Sep 30 09:11:37 crc kubenswrapper[4760]: I0930 09:11:37.184420 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d967k2mh_a0b17021-6ad1-473c-ba06-7d4ba8eb162a/extract/0.log" Sep 30 09:11:37 crc kubenswrapper[4760]: I0930 
09:11:37.191038 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d967k2mh_a0b17021-6ad1-473c-ba06-7d4ba8eb162a/pull/0.log" Sep 30 09:11:37 crc kubenswrapper[4760]: I0930 09:11:37.343939 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-cqqsj_97e466f4-974e-4d3c-b041-c4d01ad15fb4/marketplace-operator/0.log" Sep 30 09:11:37 crc kubenswrapper[4760]: I0930 09:11:37.370332 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-4k7l5_9b11c132-f36f-49dd-af15-c0d78004c669/extract-utilities/0.log" Sep 30 09:11:37 crc kubenswrapper[4760]: I0930 09:11:37.571822 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-4k7l5_9b11c132-f36f-49dd-af15-c0d78004c669/extract-content/0.log" Sep 30 09:11:37 crc kubenswrapper[4760]: I0930 09:11:37.576327 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-4k7l5_9b11c132-f36f-49dd-af15-c0d78004c669/extract-utilities/0.log" Sep 30 09:11:37 crc kubenswrapper[4760]: I0930 09:11:37.579253 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-4k7l5_9b11c132-f36f-49dd-af15-c0d78004c669/extract-content/0.log" Sep 30 09:11:37 crc kubenswrapper[4760]: I0930 09:11:37.743791 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-4k7l5_9b11c132-f36f-49dd-af15-c0d78004c669/extract-content/0.log" Sep 30 09:11:37 crc kubenswrapper[4760]: I0930 09:11:37.820727 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-wh9l2_aaececc4-56fa-4aba-959e-0595d2cb7270/extract-utilities/0.log" Sep 30 09:11:37 crc kubenswrapper[4760]: I0930 09:11:37.838831 4760 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_redhat-marketplace-4k7l5_9b11c132-f36f-49dd-af15-c0d78004c669/extract-utilities/0.log" Sep 30 09:11:37 crc kubenswrapper[4760]: I0930 09:11:37.996032 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-wh9l2_aaececc4-56fa-4aba-959e-0595d2cb7270/extract-utilities/0.log" Sep 30 09:11:38 crc kubenswrapper[4760]: I0930 09:11:38.002141 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-4k7l5_9b11c132-f36f-49dd-af15-c0d78004c669/registry-server/0.log" Sep 30 09:11:38 crc kubenswrapper[4760]: I0930 09:11:38.036972 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-wh9l2_aaececc4-56fa-4aba-959e-0595d2cb7270/extract-content/0.log" Sep 30 09:11:38 crc kubenswrapper[4760]: I0930 09:11:38.042577 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-wh9l2_aaececc4-56fa-4aba-959e-0595d2cb7270/extract-content/0.log" Sep 30 09:11:38 crc kubenswrapper[4760]: I0930 09:11:38.198550 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-wh9l2_aaececc4-56fa-4aba-959e-0595d2cb7270/extract-content/0.log" Sep 30 09:11:38 crc kubenswrapper[4760]: I0930 09:11:38.202625 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-wh9l2_aaececc4-56fa-4aba-959e-0595d2cb7270/extract-utilities/0.log" Sep 30 09:11:38 crc kubenswrapper[4760]: I0930 09:11:38.416118 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-wh9l2_aaececc4-56fa-4aba-959e-0595d2cb7270/registry-server/0.log" Sep 30 09:11:42 crc kubenswrapper[4760]: I0930 09:11:42.067155 4760 scope.go:117] "RemoveContainer" containerID="505728dcf5cec3fc994c058239c0150ac7db870b33dd3a6ddbc1c1699cc9b079" Sep 30 09:11:42 crc kubenswrapper[4760]: E0930 
09:11:42.068079 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f2lrk_openshift-machine-config-operator(7a9c8270-6964-4886-87d0-227b05b76da4)\"" pod="openshift-machine-config-operator/machine-config-daemon-f2lrk" podUID="7a9c8270-6964-4886-87d0-227b05b76da4" Sep 30 09:11:50 crc kubenswrapper[4760]: I0930 09:11:50.955655 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-7c8cf85677-gvl7c_c35951af-e973-4663-9db5-2c5ac164bbba/prometheus-operator/0.log" Sep 30 09:11:51 crc kubenswrapper[4760]: I0930 09:11:51.222993 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-6899d445c8-g78wr_b7b18a96-fb82-48a3-a34e-ebea9ef3eb75/prometheus-operator-admission-webhook/0.log" Sep 30 09:11:51 crc kubenswrapper[4760]: I0930 09:11:51.230648 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-6899d445c8-k9fmb_790a604d-1726-4fc9-8e29-e30af2f26616/prometheus-operator-admission-webhook/0.log" Sep 30 09:11:51 crc kubenswrapper[4760]: I0930 09:11:51.404822 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_observability-operator-cc5f78dfc-cpxd4_04a90715-31eb-49fb-9682-0a211630eede/operator/0.log" Sep 30 09:11:51 crc kubenswrapper[4760]: I0930 09:11:51.470869 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_perses-operator-54bc95c9fb-4pgmn_fed44a9b-44ce-4650-b854-6c84c8536c57/perses-operator/0.log" Sep 30 09:11:53 crc kubenswrapper[4760]: I0930 09:11:53.067510 4760 scope.go:117] "RemoveContainer" containerID="505728dcf5cec3fc994c058239c0150ac7db870b33dd3a6ddbc1c1699cc9b079" Sep 30 09:11:53 crc kubenswrapper[4760]: E0930 09:11:53.068520 4760 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f2lrk_openshift-machine-config-operator(7a9c8270-6964-4886-87d0-227b05b76da4)\"" pod="openshift-machine-config-operator/machine-config-daemon-f2lrk" podUID="7a9c8270-6964-4886-87d0-227b05b76da4" Sep 30 09:12:04 crc kubenswrapper[4760]: I0930 09:12:04.067156 4760 scope.go:117] "RemoveContainer" containerID="505728dcf5cec3fc994c058239c0150ac7db870b33dd3a6ddbc1c1699cc9b079" Sep 30 09:12:04 crc kubenswrapper[4760]: E0930 09:12:04.068189 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f2lrk_openshift-machine-config-operator(7a9c8270-6964-4886-87d0-227b05b76da4)\"" pod="openshift-machine-config-operator/machine-config-daemon-f2lrk" podUID="7a9c8270-6964-4886-87d0-227b05b76da4" Sep 30 09:12:15 crc kubenswrapper[4760]: I0930 09:12:15.076962 4760 scope.go:117] "RemoveContainer" containerID="505728dcf5cec3fc994c058239c0150ac7db870b33dd3a6ddbc1c1699cc9b079" Sep 30 09:12:15 crc kubenswrapper[4760]: E0930 09:12:15.078954 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f2lrk_openshift-machine-config-operator(7a9c8270-6964-4886-87d0-227b05b76da4)\"" pod="openshift-machine-config-operator/machine-config-daemon-f2lrk" podUID="7a9c8270-6964-4886-87d0-227b05b76da4" Sep 30 09:12:29 crc kubenswrapper[4760]: I0930 09:12:29.068840 4760 scope.go:117] "RemoveContainer" containerID="505728dcf5cec3fc994c058239c0150ac7db870b33dd3a6ddbc1c1699cc9b079" Sep 30 09:12:29 crc kubenswrapper[4760]: E0930 
09:12:29.070775 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f2lrk_openshift-machine-config-operator(7a9c8270-6964-4886-87d0-227b05b76da4)\"" pod="openshift-machine-config-operator/machine-config-daemon-f2lrk" podUID="7a9c8270-6964-4886-87d0-227b05b76da4" Sep 30 09:12:40 crc kubenswrapper[4760]: I0930 09:12:40.067417 4760 scope.go:117] "RemoveContainer" containerID="505728dcf5cec3fc994c058239c0150ac7db870b33dd3a6ddbc1c1699cc9b079" Sep 30 09:12:40 crc kubenswrapper[4760]: E0930 09:12:40.068541 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f2lrk_openshift-machine-config-operator(7a9c8270-6964-4886-87d0-227b05b76da4)\"" pod="openshift-machine-config-operator/machine-config-daemon-f2lrk" podUID="7a9c8270-6964-4886-87d0-227b05b76da4" Sep 30 09:12:51 crc kubenswrapper[4760]: I0930 09:12:51.069466 4760 scope.go:117] "RemoveContainer" containerID="505728dcf5cec3fc994c058239c0150ac7db870b33dd3a6ddbc1c1699cc9b079" Sep 30 09:12:51 crc kubenswrapper[4760]: E0930 09:12:51.070229 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f2lrk_openshift-machine-config-operator(7a9c8270-6964-4886-87d0-227b05b76da4)\"" pod="openshift-machine-config-operator/machine-config-daemon-f2lrk" podUID="7a9c8270-6964-4886-87d0-227b05b76da4" Sep 30 09:13:03 crc kubenswrapper[4760]: I0930 09:13:03.905410 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-6ns58"] Sep 30 09:13:03 crc kubenswrapper[4760]: 
E0930 09:13:03.906954 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="04e4c535-14d9-499c-bcee-1b1c805d3078" containerName="container-00" Sep 30 09:13:03 crc kubenswrapper[4760]: I0930 09:13:03.906974 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="04e4c535-14d9-499c-bcee-1b1c805d3078" containerName="container-00" Sep 30 09:13:03 crc kubenswrapper[4760]: I0930 09:13:03.907410 4760 memory_manager.go:354] "RemoveStaleState removing state" podUID="04e4c535-14d9-499c-bcee-1b1c805d3078" containerName="container-00" Sep 30 09:13:03 crc kubenswrapper[4760]: I0930 09:13:03.909668 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-6ns58" Sep 30 09:13:03 crc kubenswrapper[4760]: I0930 09:13:03.930425 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-6ns58"] Sep 30 09:13:04 crc kubenswrapper[4760]: I0930 09:13:04.042900 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d9501998-5e1e-4199-b05c-56e913ff558f-utilities\") pod \"redhat-operators-6ns58\" (UID: \"d9501998-5e1e-4199-b05c-56e913ff558f\") " pod="openshift-marketplace/redhat-operators-6ns58" Sep 30 09:13:04 crc kubenswrapper[4760]: I0930 09:13:04.043287 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d9501998-5e1e-4199-b05c-56e913ff558f-catalog-content\") pod \"redhat-operators-6ns58\" (UID: \"d9501998-5e1e-4199-b05c-56e913ff558f\") " pod="openshift-marketplace/redhat-operators-6ns58" Sep 30 09:13:04 crc kubenswrapper[4760]: I0930 09:13:04.043503 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2dhbx\" (UniqueName: 
\"kubernetes.io/projected/d9501998-5e1e-4199-b05c-56e913ff558f-kube-api-access-2dhbx\") pod \"redhat-operators-6ns58\" (UID: \"d9501998-5e1e-4199-b05c-56e913ff558f\") " pod="openshift-marketplace/redhat-operators-6ns58" Sep 30 09:13:04 crc kubenswrapper[4760]: I0930 09:13:04.145997 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d9501998-5e1e-4199-b05c-56e913ff558f-utilities\") pod \"redhat-operators-6ns58\" (UID: \"d9501998-5e1e-4199-b05c-56e913ff558f\") " pod="openshift-marketplace/redhat-operators-6ns58" Sep 30 09:13:04 crc kubenswrapper[4760]: I0930 09:13:04.146161 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d9501998-5e1e-4199-b05c-56e913ff558f-catalog-content\") pod \"redhat-operators-6ns58\" (UID: \"d9501998-5e1e-4199-b05c-56e913ff558f\") " pod="openshift-marketplace/redhat-operators-6ns58" Sep 30 09:13:04 crc kubenswrapper[4760]: I0930 09:13:04.146220 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2dhbx\" (UniqueName: \"kubernetes.io/projected/d9501998-5e1e-4199-b05c-56e913ff558f-kube-api-access-2dhbx\") pod \"redhat-operators-6ns58\" (UID: \"d9501998-5e1e-4199-b05c-56e913ff558f\") " pod="openshift-marketplace/redhat-operators-6ns58" Sep 30 09:13:04 crc kubenswrapper[4760]: I0930 09:13:04.146940 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d9501998-5e1e-4199-b05c-56e913ff558f-utilities\") pod \"redhat-operators-6ns58\" (UID: \"d9501998-5e1e-4199-b05c-56e913ff558f\") " pod="openshift-marketplace/redhat-operators-6ns58" Sep 30 09:13:04 crc kubenswrapper[4760]: I0930 09:13:04.146948 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/d9501998-5e1e-4199-b05c-56e913ff558f-catalog-content\") pod \"redhat-operators-6ns58\" (UID: \"d9501998-5e1e-4199-b05c-56e913ff558f\") " pod="openshift-marketplace/redhat-operators-6ns58" Sep 30 09:13:04 crc kubenswrapper[4760]: I0930 09:13:04.174528 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2dhbx\" (UniqueName: \"kubernetes.io/projected/d9501998-5e1e-4199-b05c-56e913ff558f-kube-api-access-2dhbx\") pod \"redhat-operators-6ns58\" (UID: \"d9501998-5e1e-4199-b05c-56e913ff558f\") " pod="openshift-marketplace/redhat-operators-6ns58" Sep 30 09:13:04 crc kubenswrapper[4760]: I0930 09:13:04.235783 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-6ns58" Sep 30 09:13:04 crc kubenswrapper[4760]: I0930 09:13:04.720390 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-6ns58"] Sep 30 09:13:05 crc kubenswrapper[4760]: I0930 09:13:05.357641 4760 generic.go:334] "Generic (PLEG): container finished" podID="d9501998-5e1e-4199-b05c-56e913ff558f" containerID="2cf06ba66b6d849e03077d470cad5650ac1e2d8e248b6e64df354ee7ea2a1e5e" exitCode=0 Sep 30 09:13:05 crc kubenswrapper[4760]: I0930 09:13:05.357723 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6ns58" event={"ID":"d9501998-5e1e-4199-b05c-56e913ff558f","Type":"ContainerDied","Data":"2cf06ba66b6d849e03077d470cad5650ac1e2d8e248b6e64df354ee7ea2a1e5e"} Sep 30 09:13:05 crc kubenswrapper[4760]: I0930 09:13:05.358144 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6ns58" event={"ID":"d9501998-5e1e-4199-b05c-56e913ff558f","Type":"ContainerStarted","Data":"e2b9ae2d1e09066206a0df6d0088ab69c324877f5b5639bfdcccc69146ae29e8"} Sep 30 09:13:05 crc kubenswrapper[4760]: I0930 09:13:05.360678 4760 provider.go:102] Refreshing cache for provider: 
*credentialprovider.defaultDockerConfigProvider Sep 30 09:13:06 crc kubenswrapper[4760]: I0930 09:13:06.066992 4760 scope.go:117] "RemoveContainer" containerID="505728dcf5cec3fc994c058239c0150ac7db870b33dd3a6ddbc1c1699cc9b079" Sep 30 09:13:06 crc kubenswrapper[4760]: E0930 09:13:06.067893 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f2lrk_openshift-machine-config-operator(7a9c8270-6964-4886-87d0-227b05b76da4)\"" pod="openshift-machine-config-operator/machine-config-daemon-f2lrk" podUID="7a9c8270-6964-4886-87d0-227b05b76da4" Sep 30 09:13:06 crc kubenswrapper[4760]: I0930 09:13:06.367460 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6ns58" event={"ID":"d9501998-5e1e-4199-b05c-56e913ff558f","Type":"ContainerStarted","Data":"b9cf258e980e8501a5bb480a1ed65fe0398df927c3d621611677c9b2ca76535c"} Sep 30 09:13:08 crc kubenswrapper[4760]: I0930 09:13:08.391353 4760 generic.go:334] "Generic (PLEG): container finished" podID="d9501998-5e1e-4199-b05c-56e913ff558f" containerID="b9cf258e980e8501a5bb480a1ed65fe0398df927c3d621611677c9b2ca76535c" exitCode=0 Sep 30 09:13:08 crc kubenswrapper[4760]: I0930 09:13:08.391433 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6ns58" event={"ID":"d9501998-5e1e-4199-b05c-56e913ff558f","Type":"ContainerDied","Data":"b9cf258e980e8501a5bb480a1ed65fe0398df927c3d621611677c9b2ca76535c"} Sep 30 09:13:09 crc kubenswrapper[4760]: I0930 09:13:09.402507 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6ns58" event={"ID":"d9501998-5e1e-4199-b05c-56e913ff558f","Type":"ContainerStarted","Data":"5cb0968b9ec0f50d629c5f0427dc0078f226bc202c6d0315af230e682815aebe"} Sep 30 09:13:09 crc kubenswrapper[4760]: I0930 09:13:09.432738 
4760 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-6ns58" podStartSLOduration=2.968471395 podStartE2EDuration="6.432716176s" podCreationTimestamp="2025-09-30 09:13:03 +0000 UTC" firstStartedPulling="2025-09-30 09:13:05.360179817 +0000 UTC m=+5971.003086259" lastFinishedPulling="2025-09-30 09:13:08.824424628 +0000 UTC m=+5974.467331040" observedRunningTime="2025-09-30 09:13:09.429510204 +0000 UTC m=+5975.072416616" watchObservedRunningTime="2025-09-30 09:13:09.432716176 +0000 UTC m=+5975.075622588" Sep 30 09:13:11 crc kubenswrapper[4760]: I0930 09:13:11.252493 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-4pjs2"] Sep 30 09:13:11 crc kubenswrapper[4760]: I0930 09:13:11.254859 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-4pjs2" Sep 30 09:13:11 crc kubenswrapper[4760]: I0930 09:13:11.336142 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-4pjs2"] Sep 30 09:13:11 crc kubenswrapper[4760]: I0930 09:13:11.387499 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b60bd70e-764e-4eda-a145-a5a8cd93536b-utilities\") pod \"community-operators-4pjs2\" (UID: \"b60bd70e-764e-4eda-a145-a5a8cd93536b\") " pod="openshift-marketplace/community-operators-4pjs2" Sep 30 09:13:11 crc kubenswrapper[4760]: I0930 09:13:11.387559 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b60bd70e-764e-4eda-a145-a5a8cd93536b-catalog-content\") pod \"community-operators-4pjs2\" (UID: \"b60bd70e-764e-4eda-a145-a5a8cd93536b\") " pod="openshift-marketplace/community-operators-4pjs2" Sep 30 09:13:11 crc kubenswrapper[4760]: I0930 09:13:11.387609 4760 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-274xl\" (UniqueName: \"kubernetes.io/projected/b60bd70e-764e-4eda-a145-a5a8cd93536b-kube-api-access-274xl\") pod \"community-operators-4pjs2\" (UID: \"b60bd70e-764e-4eda-a145-a5a8cd93536b\") " pod="openshift-marketplace/community-operators-4pjs2" Sep 30 09:13:11 crc kubenswrapper[4760]: I0930 09:13:11.490537 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b60bd70e-764e-4eda-a145-a5a8cd93536b-utilities\") pod \"community-operators-4pjs2\" (UID: \"b60bd70e-764e-4eda-a145-a5a8cd93536b\") " pod="openshift-marketplace/community-operators-4pjs2" Sep 30 09:13:11 crc kubenswrapper[4760]: I0930 09:13:11.490597 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b60bd70e-764e-4eda-a145-a5a8cd93536b-catalog-content\") pod \"community-operators-4pjs2\" (UID: \"b60bd70e-764e-4eda-a145-a5a8cd93536b\") " pod="openshift-marketplace/community-operators-4pjs2" Sep 30 09:13:11 crc kubenswrapper[4760]: I0930 09:13:11.490635 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-274xl\" (UniqueName: \"kubernetes.io/projected/b60bd70e-764e-4eda-a145-a5a8cd93536b-kube-api-access-274xl\") pod \"community-operators-4pjs2\" (UID: \"b60bd70e-764e-4eda-a145-a5a8cd93536b\") " pod="openshift-marketplace/community-operators-4pjs2" Sep 30 09:13:11 crc kubenswrapper[4760]: I0930 09:13:11.491042 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b60bd70e-764e-4eda-a145-a5a8cd93536b-utilities\") pod \"community-operators-4pjs2\" (UID: \"b60bd70e-764e-4eda-a145-a5a8cd93536b\") " pod="openshift-marketplace/community-operators-4pjs2" Sep 30 09:13:11 crc kubenswrapper[4760]: I0930 09:13:11.491077 4760 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b60bd70e-764e-4eda-a145-a5a8cd93536b-catalog-content\") pod \"community-operators-4pjs2\" (UID: \"b60bd70e-764e-4eda-a145-a5a8cd93536b\") " pod="openshift-marketplace/community-operators-4pjs2" Sep 30 09:13:11 crc kubenswrapper[4760]: I0930 09:13:11.523432 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-274xl\" (UniqueName: \"kubernetes.io/projected/b60bd70e-764e-4eda-a145-a5a8cd93536b-kube-api-access-274xl\") pod \"community-operators-4pjs2\" (UID: \"b60bd70e-764e-4eda-a145-a5a8cd93536b\") " pod="openshift-marketplace/community-operators-4pjs2" Sep 30 09:13:11 crc kubenswrapper[4760]: I0930 09:13:11.574642 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-4pjs2" Sep 30 09:13:12 crc kubenswrapper[4760]: I0930 09:13:12.167624 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-4pjs2"] Sep 30 09:13:12 crc kubenswrapper[4760]: I0930 09:13:12.429880 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4pjs2" event={"ID":"b60bd70e-764e-4eda-a145-a5a8cd93536b","Type":"ContainerStarted","Data":"57e8e523752d8c8c6c4b16f8160081ace4ae43d0b0874a15993a7e8365494e0c"} Sep 30 09:13:13 crc kubenswrapper[4760]: I0930 09:13:13.441759 4760 generic.go:334] "Generic (PLEG): container finished" podID="b60bd70e-764e-4eda-a145-a5a8cd93536b" containerID="71986b50faba1e65436c39ab612fbf21bb949d08a5f063154a470f793f398ae2" exitCode=0 Sep 30 09:13:13 crc kubenswrapper[4760]: I0930 09:13:13.441954 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4pjs2" event={"ID":"b60bd70e-764e-4eda-a145-a5a8cd93536b","Type":"ContainerDied","Data":"71986b50faba1e65436c39ab612fbf21bb949d08a5f063154a470f793f398ae2"} Sep 30 
09:13:14 crc kubenswrapper[4760]: I0930 09:13:14.236233 4760 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-6ns58" Sep 30 09:13:14 crc kubenswrapper[4760]: I0930 09:13:14.236550 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-6ns58" Sep 30 09:13:14 crc kubenswrapper[4760]: I0930 09:13:14.290073 4760 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-6ns58" Sep 30 09:13:14 crc kubenswrapper[4760]: I0930 09:13:14.495088 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-6ns58" Sep 30 09:13:15 crc kubenswrapper[4760]: I0930 09:13:15.445976 4760 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-6ns58"] Sep 30 09:13:15 crc kubenswrapper[4760]: I0930 09:13:15.467825 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4pjs2" event={"ID":"b60bd70e-764e-4eda-a145-a5a8cd93536b","Type":"ContainerStarted","Data":"62348d17939c2f2fba7b06af529a321c04d12420165dd33fef83ccefc2b5a960"} Sep 30 09:13:16 crc kubenswrapper[4760]: I0930 09:13:16.482242 4760 generic.go:334] "Generic (PLEG): container finished" podID="b60bd70e-764e-4eda-a145-a5a8cd93536b" containerID="62348d17939c2f2fba7b06af529a321c04d12420165dd33fef83ccefc2b5a960" exitCode=0 Sep 30 09:13:16 crc kubenswrapper[4760]: I0930 09:13:16.482377 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4pjs2" event={"ID":"b60bd70e-764e-4eda-a145-a5a8cd93536b","Type":"ContainerDied","Data":"62348d17939c2f2fba7b06af529a321c04d12420165dd33fef83ccefc2b5a960"} Sep 30 09:13:16 crc kubenswrapper[4760]: I0930 09:13:16.483056 4760 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-marketplace/redhat-operators-6ns58" podUID="d9501998-5e1e-4199-b05c-56e913ff558f" containerName="registry-server" containerID="cri-o://5cb0968b9ec0f50d629c5f0427dc0078f226bc202c6d0315af230e682815aebe" gracePeriod=2 Sep 30 09:13:17 crc kubenswrapper[4760]: I0930 09:13:17.136741 4760 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-6ns58" Sep 30 09:13:17 crc kubenswrapper[4760]: I0930 09:13:17.223868 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2dhbx\" (UniqueName: \"kubernetes.io/projected/d9501998-5e1e-4199-b05c-56e913ff558f-kube-api-access-2dhbx\") pod \"d9501998-5e1e-4199-b05c-56e913ff558f\" (UID: \"d9501998-5e1e-4199-b05c-56e913ff558f\") " Sep 30 09:13:17 crc kubenswrapper[4760]: I0930 09:13:17.223936 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d9501998-5e1e-4199-b05c-56e913ff558f-catalog-content\") pod \"d9501998-5e1e-4199-b05c-56e913ff558f\" (UID: \"d9501998-5e1e-4199-b05c-56e913ff558f\") " Sep 30 09:13:17 crc kubenswrapper[4760]: I0930 09:13:17.223983 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d9501998-5e1e-4199-b05c-56e913ff558f-utilities\") pod \"d9501998-5e1e-4199-b05c-56e913ff558f\" (UID: \"d9501998-5e1e-4199-b05c-56e913ff558f\") " Sep 30 09:13:17 crc kubenswrapper[4760]: I0930 09:13:17.225338 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d9501998-5e1e-4199-b05c-56e913ff558f-utilities" (OuterVolumeSpecName: "utilities") pod "d9501998-5e1e-4199-b05c-56e913ff558f" (UID: "d9501998-5e1e-4199-b05c-56e913ff558f"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 09:13:17 crc kubenswrapper[4760]: I0930 09:13:17.229696 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d9501998-5e1e-4199-b05c-56e913ff558f-kube-api-access-2dhbx" (OuterVolumeSpecName: "kube-api-access-2dhbx") pod "d9501998-5e1e-4199-b05c-56e913ff558f" (UID: "d9501998-5e1e-4199-b05c-56e913ff558f"). InnerVolumeSpecName "kube-api-access-2dhbx". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 09:13:17 crc kubenswrapper[4760]: I0930 09:13:17.326021 4760 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d9501998-5e1e-4199-b05c-56e913ff558f-utilities\") on node \"crc\" DevicePath \"\"" Sep 30 09:13:17 crc kubenswrapper[4760]: I0930 09:13:17.326415 4760 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2dhbx\" (UniqueName: \"kubernetes.io/projected/d9501998-5e1e-4199-b05c-56e913ff558f-kube-api-access-2dhbx\") on node \"crc\" DevicePath \"\"" Sep 30 09:13:17 crc kubenswrapper[4760]: I0930 09:13:17.492548 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4pjs2" event={"ID":"b60bd70e-764e-4eda-a145-a5a8cd93536b","Type":"ContainerStarted","Data":"1d309751524719d31ab875d495719d48913ceb41c2e433bbcdef5351beb07922"} Sep 30 09:13:17 crc kubenswrapper[4760]: I0930 09:13:17.497134 4760 generic.go:334] "Generic (PLEG): container finished" podID="d9501998-5e1e-4199-b05c-56e913ff558f" containerID="5cb0968b9ec0f50d629c5f0427dc0078f226bc202c6d0315af230e682815aebe" exitCode=0 Sep 30 09:13:17 crc kubenswrapper[4760]: I0930 09:13:17.497218 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6ns58" event={"ID":"d9501998-5e1e-4199-b05c-56e913ff558f","Type":"ContainerDied","Data":"5cb0968b9ec0f50d629c5f0427dc0078f226bc202c6d0315af230e682815aebe"} Sep 30 09:13:17 crc kubenswrapper[4760]: 
I0930 09:13:17.497541 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6ns58" event={"ID":"d9501998-5e1e-4199-b05c-56e913ff558f","Type":"ContainerDied","Data":"e2b9ae2d1e09066206a0df6d0088ab69c324877f5b5639bfdcccc69146ae29e8"} Sep 30 09:13:17 crc kubenswrapper[4760]: I0930 09:13:17.497251 4760 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-6ns58" Sep 30 09:13:17 crc kubenswrapper[4760]: I0930 09:13:17.497619 4760 scope.go:117] "RemoveContainer" containerID="5cb0968b9ec0f50d629c5f0427dc0078f226bc202c6d0315af230e682815aebe" Sep 30 09:13:17 crc kubenswrapper[4760]: I0930 09:13:17.523415 4760 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-4pjs2" podStartSLOduration=2.987724913 podStartE2EDuration="6.52339186s" podCreationTimestamp="2025-09-30 09:13:11 +0000 UTC" firstStartedPulling="2025-09-30 09:13:13.444444248 +0000 UTC m=+5979.087350660" lastFinishedPulling="2025-09-30 09:13:16.980111165 +0000 UTC m=+5982.623017607" observedRunningTime="2025-09-30 09:13:17.514031192 +0000 UTC m=+5983.156937644" watchObservedRunningTime="2025-09-30 09:13:17.52339186 +0000 UTC m=+5983.166298272" Sep 30 09:13:17 crc kubenswrapper[4760]: I0930 09:13:17.541050 4760 scope.go:117] "RemoveContainer" containerID="b9cf258e980e8501a5bb480a1ed65fe0398df927c3d621611677c9b2ca76535c" Sep 30 09:13:17 crc kubenswrapper[4760]: I0930 09:13:17.563561 4760 scope.go:117] "RemoveContainer" containerID="2cf06ba66b6d849e03077d470cad5650ac1e2d8e248b6e64df354ee7ea2a1e5e" Sep 30 09:13:17 crc kubenswrapper[4760]: I0930 09:13:17.610119 4760 scope.go:117] "RemoveContainer" containerID="5cb0968b9ec0f50d629c5f0427dc0078f226bc202c6d0315af230e682815aebe" Sep 30 09:13:17 crc kubenswrapper[4760]: E0930 09:13:17.610748 4760 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not 
find container \"5cb0968b9ec0f50d629c5f0427dc0078f226bc202c6d0315af230e682815aebe\": container with ID starting with 5cb0968b9ec0f50d629c5f0427dc0078f226bc202c6d0315af230e682815aebe not found: ID does not exist" containerID="5cb0968b9ec0f50d629c5f0427dc0078f226bc202c6d0315af230e682815aebe" Sep 30 09:13:17 crc kubenswrapper[4760]: I0930 09:13:17.610804 4760 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5cb0968b9ec0f50d629c5f0427dc0078f226bc202c6d0315af230e682815aebe"} err="failed to get container status \"5cb0968b9ec0f50d629c5f0427dc0078f226bc202c6d0315af230e682815aebe\": rpc error: code = NotFound desc = could not find container \"5cb0968b9ec0f50d629c5f0427dc0078f226bc202c6d0315af230e682815aebe\": container with ID starting with 5cb0968b9ec0f50d629c5f0427dc0078f226bc202c6d0315af230e682815aebe not found: ID does not exist" Sep 30 09:13:17 crc kubenswrapper[4760]: I0930 09:13:17.610829 4760 scope.go:117] "RemoveContainer" containerID="b9cf258e980e8501a5bb480a1ed65fe0398df927c3d621611677c9b2ca76535c" Sep 30 09:13:17 crc kubenswrapper[4760]: E0930 09:13:17.611208 4760 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b9cf258e980e8501a5bb480a1ed65fe0398df927c3d621611677c9b2ca76535c\": container with ID starting with b9cf258e980e8501a5bb480a1ed65fe0398df927c3d621611677c9b2ca76535c not found: ID does not exist" containerID="b9cf258e980e8501a5bb480a1ed65fe0398df927c3d621611677c9b2ca76535c" Sep 30 09:13:17 crc kubenswrapper[4760]: I0930 09:13:17.611265 4760 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b9cf258e980e8501a5bb480a1ed65fe0398df927c3d621611677c9b2ca76535c"} err="failed to get container status \"b9cf258e980e8501a5bb480a1ed65fe0398df927c3d621611677c9b2ca76535c\": rpc error: code = NotFound desc = could not find container \"b9cf258e980e8501a5bb480a1ed65fe0398df927c3d621611677c9b2ca76535c\": 
container with ID starting with b9cf258e980e8501a5bb480a1ed65fe0398df927c3d621611677c9b2ca76535c not found: ID does not exist" Sep 30 09:13:17 crc kubenswrapper[4760]: I0930 09:13:17.611298 4760 scope.go:117] "RemoveContainer" containerID="2cf06ba66b6d849e03077d470cad5650ac1e2d8e248b6e64df354ee7ea2a1e5e" Sep 30 09:13:17 crc kubenswrapper[4760]: E0930 09:13:17.611805 4760 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2cf06ba66b6d849e03077d470cad5650ac1e2d8e248b6e64df354ee7ea2a1e5e\": container with ID starting with 2cf06ba66b6d849e03077d470cad5650ac1e2d8e248b6e64df354ee7ea2a1e5e not found: ID does not exist" containerID="2cf06ba66b6d849e03077d470cad5650ac1e2d8e248b6e64df354ee7ea2a1e5e" Sep 30 09:13:17 crc kubenswrapper[4760]: I0930 09:13:17.611861 4760 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2cf06ba66b6d849e03077d470cad5650ac1e2d8e248b6e64df354ee7ea2a1e5e"} err="failed to get container status \"2cf06ba66b6d849e03077d470cad5650ac1e2d8e248b6e64df354ee7ea2a1e5e\": rpc error: code = NotFound desc = could not find container \"2cf06ba66b6d849e03077d470cad5650ac1e2d8e248b6e64df354ee7ea2a1e5e\": container with ID starting with 2cf06ba66b6d849e03077d470cad5650ac1e2d8e248b6e64df354ee7ea2a1e5e not found: ID does not exist" Sep 30 09:13:17 crc kubenswrapper[4760]: I0930 09:13:17.727768 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d9501998-5e1e-4199-b05c-56e913ff558f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "d9501998-5e1e-4199-b05c-56e913ff558f" (UID: "d9501998-5e1e-4199-b05c-56e913ff558f"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 09:13:17 crc kubenswrapper[4760]: I0930 09:13:17.740229 4760 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d9501998-5e1e-4199-b05c-56e913ff558f-catalog-content\") on node \"crc\" DevicePath \"\"" Sep 30 09:13:17 crc kubenswrapper[4760]: I0930 09:13:17.849975 4760 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-6ns58"] Sep 30 09:13:17 crc kubenswrapper[4760]: I0930 09:13:17.864532 4760 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-6ns58"] Sep 30 09:13:19 crc kubenswrapper[4760]: I0930 09:13:19.079346 4760 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d9501998-5e1e-4199-b05c-56e913ff558f" path="/var/lib/kubelet/pods/d9501998-5e1e-4199-b05c-56e913ff558f/volumes" Sep 30 09:13:21 crc kubenswrapper[4760]: I0930 09:13:21.069248 4760 scope.go:117] "RemoveContainer" containerID="505728dcf5cec3fc994c058239c0150ac7db870b33dd3a6ddbc1c1699cc9b079" Sep 30 09:13:21 crc kubenswrapper[4760]: I0930 09:13:21.569034 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-f2lrk" event={"ID":"7a9c8270-6964-4886-87d0-227b05b76da4","Type":"ContainerStarted","Data":"4a9ddef5425fff579d2f6273dac3ac3872c271566f3c6e2d32ae0cd93a2ffc46"} Sep 30 09:13:21 crc kubenswrapper[4760]: I0930 09:13:21.576902 4760 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-4pjs2" Sep 30 09:13:21 crc kubenswrapper[4760]: I0930 09:13:21.576950 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-4pjs2" Sep 30 09:13:21 crc kubenswrapper[4760]: I0930 09:13:21.681362 4760 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openshift-marketplace/community-operators-4pjs2" Sep 30 09:13:22 crc kubenswrapper[4760]: I0930 09:13:22.619760 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-4pjs2" Sep 30 09:13:23 crc kubenswrapper[4760]: I0930 09:13:23.661416 4760 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-4pjs2"] Sep 30 09:13:24 crc kubenswrapper[4760]: I0930 09:13:24.604437 4760 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-4pjs2" podUID="b60bd70e-764e-4eda-a145-a5a8cd93536b" containerName="registry-server" containerID="cri-o://1d309751524719d31ab875d495719d48913ceb41c2e433bbcdef5351beb07922" gracePeriod=2 Sep 30 09:13:25 crc kubenswrapper[4760]: I0930 09:13:25.179010 4760 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-4pjs2" Sep 30 09:13:25 crc kubenswrapper[4760]: I0930 09:13:25.298591 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b60bd70e-764e-4eda-a145-a5a8cd93536b-utilities\") pod \"b60bd70e-764e-4eda-a145-a5a8cd93536b\" (UID: \"b60bd70e-764e-4eda-a145-a5a8cd93536b\") " Sep 30 09:13:25 crc kubenswrapper[4760]: I0930 09:13:25.299427 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b60bd70e-764e-4eda-a145-a5a8cd93536b-catalog-content\") pod \"b60bd70e-764e-4eda-a145-a5a8cd93536b\" (UID: \"b60bd70e-764e-4eda-a145-a5a8cd93536b\") " Sep 30 09:13:25 crc kubenswrapper[4760]: I0930 09:13:25.299730 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-274xl\" (UniqueName: \"kubernetes.io/projected/b60bd70e-764e-4eda-a145-a5a8cd93536b-kube-api-access-274xl\") pod 
\"b60bd70e-764e-4eda-a145-a5a8cd93536b\" (UID: \"b60bd70e-764e-4eda-a145-a5a8cd93536b\") " Sep 30 09:13:25 crc kubenswrapper[4760]: I0930 09:13:25.300044 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b60bd70e-764e-4eda-a145-a5a8cd93536b-utilities" (OuterVolumeSpecName: "utilities") pod "b60bd70e-764e-4eda-a145-a5a8cd93536b" (UID: "b60bd70e-764e-4eda-a145-a5a8cd93536b"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 09:13:25 crc kubenswrapper[4760]: I0930 09:13:25.300691 4760 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b60bd70e-764e-4eda-a145-a5a8cd93536b-utilities\") on node \"crc\" DevicePath \"\"" Sep 30 09:13:25 crc kubenswrapper[4760]: I0930 09:13:25.307716 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b60bd70e-764e-4eda-a145-a5a8cd93536b-kube-api-access-274xl" (OuterVolumeSpecName: "kube-api-access-274xl") pod "b60bd70e-764e-4eda-a145-a5a8cd93536b" (UID: "b60bd70e-764e-4eda-a145-a5a8cd93536b"). InnerVolumeSpecName "kube-api-access-274xl". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 09:13:25 crc kubenswrapper[4760]: I0930 09:13:25.362779 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b60bd70e-764e-4eda-a145-a5a8cd93536b-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b60bd70e-764e-4eda-a145-a5a8cd93536b" (UID: "b60bd70e-764e-4eda-a145-a5a8cd93536b"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 09:13:25 crc kubenswrapper[4760]: I0930 09:13:25.403751 4760 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b60bd70e-764e-4eda-a145-a5a8cd93536b-catalog-content\") on node \"crc\" DevicePath \"\"" Sep 30 09:13:25 crc kubenswrapper[4760]: I0930 09:13:25.403838 4760 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-274xl\" (UniqueName: \"kubernetes.io/projected/b60bd70e-764e-4eda-a145-a5a8cd93536b-kube-api-access-274xl\") on node \"crc\" DevicePath \"\"" Sep 30 09:13:25 crc kubenswrapper[4760]: I0930 09:13:25.614635 4760 generic.go:334] "Generic (PLEG): container finished" podID="b60bd70e-764e-4eda-a145-a5a8cd93536b" containerID="1d309751524719d31ab875d495719d48913ceb41c2e433bbcdef5351beb07922" exitCode=0 Sep 30 09:13:25 crc kubenswrapper[4760]: I0930 09:13:25.614681 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4pjs2" event={"ID":"b60bd70e-764e-4eda-a145-a5a8cd93536b","Type":"ContainerDied","Data":"1d309751524719d31ab875d495719d48913ceb41c2e433bbcdef5351beb07922"} Sep 30 09:13:25 crc kubenswrapper[4760]: I0930 09:13:25.614712 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4pjs2" event={"ID":"b60bd70e-764e-4eda-a145-a5a8cd93536b","Type":"ContainerDied","Data":"57e8e523752d8c8c6c4b16f8160081ace4ae43d0b0874a15993a7e8365494e0c"} Sep 30 09:13:25 crc kubenswrapper[4760]: I0930 09:13:25.614747 4760 scope.go:117] "RemoveContainer" containerID="1d309751524719d31ab875d495719d48913ceb41c2e433bbcdef5351beb07922" Sep 30 09:13:25 crc kubenswrapper[4760]: I0930 09:13:25.615266 4760 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-4pjs2" Sep 30 09:13:25 crc kubenswrapper[4760]: I0930 09:13:25.658889 4760 scope.go:117] "RemoveContainer" containerID="62348d17939c2f2fba7b06af529a321c04d12420165dd33fef83ccefc2b5a960" Sep 30 09:13:25 crc kubenswrapper[4760]: I0930 09:13:25.670067 4760 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-4pjs2"] Sep 30 09:13:25 crc kubenswrapper[4760]: I0930 09:13:25.686738 4760 scope.go:117] "RemoveContainer" containerID="71986b50faba1e65436c39ab612fbf21bb949d08a5f063154a470f793f398ae2" Sep 30 09:13:25 crc kubenswrapper[4760]: I0930 09:13:25.693797 4760 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-4pjs2"] Sep 30 09:13:25 crc kubenswrapper[4760]: I0930 09:13:25.719855 4760 scope.go:117] "RemoveContainer" containerID="1d309751524719d31ab875d495719d48913ceb41c2e433bbcdef5351beb07922" Sep 30 09:13:25 crc kubenswrapper[4760]: E0930 09:13:25.724727 4760 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1d309751524719d31ab875d495719d48913ceb41c2e433bbcdef5351beb07922\": container with ID starting with 1d309751524719d31ab875d495719d48913ceb41c2e433bbcdef5351beb07922 not found: ID does not exist" containerID="1d309751524719d31ab875d495719d48913ceb41c2e433bbcdef5351beb07922" Sep 30 09:13:25 crc kubenswrapper[4760]: I0930 09:13:25.724796 4760 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1d309751524719d31ab875d495719d48913ceb41c2e433bbcdef5351beb07922"} err="failed to get container status \"1d309751524719d31ab875d495719d48913ceb41c2e433bbcdef5351beb07922\": rpc error: code = NotFound desc = could not find container \"1d309751524719d31ab875d495719d48913ceb41c2e433bbcdef5351beb07922\": container with ID starting with 1d309751524719d31ab875d495719d48913ceb41c2e433bbcdef5351beb07922 not 
found: ID does not exist" Sep 30 09:13:25 crc kubenswrapper[4760]: I0930 09:13:25.724837 4760 scope.go:117] "RemoveContainer" containerID="62348d17939c2f2fba7b06af529a321c04d12420165dd33fef83ccefc2b5a960" Sep 30 09:13:25 crc kubenswrapper[4760]: E0930 09:13:25.725253 4760 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"62348d17939c2f2fba7b06af529a321c04d12420165dd33fef83ccefc2b5a960\": container with ID starting with 62348d17939c2f2fba7b06af529a321c04d12420165dd33fef83ccefc2b5a960 not found: ID does not exist" containerID="62348d17939c2f2fba7b06af529a321c04d12420165dd33fef83ccefc2b5a960" Sep 30 09:13:25 crc kubenswrapper[4760]: I0930 09:13:25.725385 4760 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"62348d17939c2f2fba7b06af529a321c04d12420165dd33fef83ccefc2b5a960"} err="failed to get container status \"62348d17939c2f2fba7b06af529a321c04d12420165dd33fef83ccefc2b5a960\": rpc error: code = NotFound desc = could not find container \"62348d17939c2f2fba7b06af529a321c04d12420165dd33fef83ccefc2b5a960\": container with ID starting with 62348d17939c2f2fba7b06af529a321c04d12420165dd33fef83ccefc2b5a960 not found: ID does not exist" Sep 30 09:13:25 crc kubenswrapper[4760]: I0930 09:13:25.725479 4760 scope.go:117] "RemoveContainer" containerID="71986b50faba1e65436c39ab612fbf21bb949d08a5f063154a470f793f398ae2" Sep 30 09:13:25 crc kubenswrapper[4760]: E0930 09:13:25.725835 4760 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"71986b50faba1e65436c39ab612fbf21bb949d08a5f063154a470f793f398ae2\": container with ID starting with 71986b50faba1e65436c39ab612fbf21bb949d08a5f063154a470f793f398ae2 not found: ID does not exist" containerID="71986b50faba1e65436c39ab612fbf21bb949d08a5f063154a470f793f398ae2" Sep 30 09:13:25 crc kubenswrapper[4760]: I0930 09:13:25.725873 4760 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"71986b50faba1e65436c39ab612fbf21bb949d08a5f063154a470f793f398ae2"} err="failed to get container status \"71986b50faba1e65436c39ab612fbf21bb949d08a5f063154a470f793f398ae2\": rpc error: code = NotFound desc = could not find container \"71986b50faba1e65436c39ab612fbf21bb949d08a5f063154a470f793f398ae2\": container with ID starting with 71986b50faba1e65436c39ab612fbf21bb949d08a5f063154a470f793f398ae2 not found: ID does not exist" Sep 30 09:13:27 crc kubenswrapper[4760]: I0930 09:13:27.086526 4760 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b60bd70e-764e-4eda-a145-a5a8cd93536b" path="/var/lib/kubelet/pods/b60bd70e-764e-4eda-a145-a5a8cd93536b/volumes" Sep 30 09:14:05 crc kubenswrapper[4760]: I0930 09:14:05.129988 4760 generic.go:334] "Generic (PLEG): container finished" podID="7db4983c-9c7f-45b4-847d-c01071cf4c48" containerID="518cb33ba9e23c7b18db81504ce9e92e03d66cbec027c61bed6f33c9873a6aec" exitCode=0 Sep 30 09:14:05 crc kubenswrapper[4760]: I0930 09:14:05.130015 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-sjdgg/must-gather-wvnqh" event={"ID":"7db4983c-9c7f-45b4-847d-c01071cf4c48","Type":"ContainerDied","Data":"518cb33ba9e23c7b18db81504ce9e92e03d66cbec027c61bed6f33c9873a6aec"} Sep 30 09:14:05 crc kubenswrapper[4760]: I0930 09:14:05.133164 4760 scope.go:117] "RemoveContainer" containerID="518cb33ba9e23c7b18db81504ce9e92e03d66cbec027c61bed6f33c9873a6aec" Sep 30 09:14:05 crc kubenswrapper[4760]: I0930 09:14:05.948427 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-sjdgg_must-gather-wvnqh_7db4983c-9c7f-45b4-847d-c01071cf4c48/gather/0.log" Sep 30 09:14:14 crc kubenswrapper[4760]: I0930 09:14:14.424596 4760 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-sjdgg/must-gather-wvnqh"] Sep 30 09:14:14 crc kubenswrapper[4760]: I0930 09:14:14.425248 4760 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-must-gather-sjdgg/must-gather-wvnqh" podUID="7db4983c-9c7f-45b4-847d-c01071cf4c48" containerName="copy" containerID="cri-o://9b2e7cc7d231ebf8d3d776c12f2df9b9e1d4487a565e367258b5945a8054b142" gracePeriod=2 Sep 30 09:14:14 crc kubenswrapper[4760]: I0930 09:14:14.435870 4760 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-sjdgg/must-gather-wvnqh"] Sep 30 09:14:14 crc kubenswrapper[4760]: I0930 09:14:14.846048 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-sjdgg_must-gather-wvnqh_7db4983c-9c7f-45b4-847d-c01071cf4c48/copy/0.log" Sep 30 09:14:14 crc kubenswrapper[4760]: I0930 09:14:14.846574 4760 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-sjdgg/must-gather-wvnqh" Sep 30 09:14:15 crc kubenswrapper[4760]: I0930 09:14:15.002127 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/7db4983c-9c7f-45b4-847d-c01071cf4c48-must-gather-output\") pod \"7db4983c-9c7f-45b4-847d-c01071cf4c48\" (UID: \"7db4983c-9c7f-45b4-847d-c01071cf4c48\") " Sep 30 09:14:15 crc kubenswrapper[4760]: I0930 09:14:15.002344 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5d57l\" (UniqueName: \"kubernetes.io/projected/7db4983c-9c7f-45b4-847d-c01071cf4c48-kube-api-access-5d57l\") pod \"7db4983c-9c7f-45b4-847d-c01071cf4c48\" (UID: \"7db4983c-9c7f-45b4-847d-c01071cf4c48\") " Sep 30 09:14:15 crc kubenswrapper[4760]: I0930 09:14:15.008944 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7db4983c-9c7f-45b4-847d-c01071cf4c48-kube-api-access-5d57l" (OuterVolumeSpecName: "kube-api-access-5d57l") pod "7db4983c-9c7f-45b4-847d-c01071cf4c48" (UID: "7db4983c-9c7f-45b4-847d-c01071cf4c48"). 
InnerVolumeSpecName "kube-api-access-5d57l". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 09:14:15 crc kubenswrapper[4760]: I0930 09:14:15.105826 4760 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5d57l\" (UniqueName: \"kubernetes.io/projected/7db4983c-9c7f-45b4-847d-c01071cf4c48-kube-api-access-5d57l\") on node \"crc\" DevicePath \"\"" Sep 30 09:14:15 crc kubenswrapper[4760]: I0930 09:14:15.217604 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7db4983c-9c7f-45b4-847d-c01071cf4c48-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "7db4983c-9c7f-45b4-847d-c01071cf4c48" (UID: "7db4983c-9c7f-45b4-847d-c01071cf4c48"). InnerVolumeSpecName "must-gather-output". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 09:14:15 crc kubenswrapper[4760]: I0930 09:14:15.262698 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-sjdgg_must-gather-wvnqh_7db4983c-9c7f-45b4-847d-c01071cf4c48/copy/0.log" Sep 30 09:14:15 crc kubenswrapper[4760]: I0930 09:14:15.268045 4760 generic.go:334] "Generic (PLEG): container finished" podID="7db4983c-9c7f-45b4-847d-c01071cf4c48" containerID="9b2e7cc7d231ebf8d3d776c12f2df9b9e1d4487a565e367258b5945a8054b142" exitCode=143 Sep 30 09:14:15 crc kubenswrapper[4760]: I0930 09:14:15.268351 4760 scope.go:117] "RemoveContainer" containerID="9b2e7cc7d231ebf8d3d776c12f2df9b9e1d4487a565e367258b5945a8054b142" Sep 30 09:14:15 crc kubenswrapper[4760]: I0930 09:14:15.268570 4760 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-sjdgg/must-gather-wvnqh" Sep 30 09:14:15 crc kubenswrapper[4760]: I0930 09:14:15.302206 4760 scope.go:117] "RemoveContainer" containerID="518cb33ba9e23c7b18db81504ce9e92e03d66cbec027c61bed6f33c9873a6aec" Sep 30 09:14:15 crc kubenswrapper[4760]: I0930 09:14:15.323827 4760 reconciler_common.go:293] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/7db4983c-9c7f-45b4-847d-c01071cf4c48-must-gather-output\") on node \"crc\" DevicePath \"\"" Sep 30 09:14:15 crc kubenswrapper[4760]: I0930 09:14:15.384367 4760 scope.go:117] "RemoveContainer" containerID="9b2e7cc7d231ebf8d3d776c12f2df9b9e1d4487a565e367258b5945a8054b142" Sep 30 09:14:15 crc kubenswrapper[4760]: E0930 09:14:15.384768 4760 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9b2e7cc7d231ebf8d3d776c12f2df9b9e1d4487a565e367258b5945a8054b142\": container with ID starting with 9b2e7cc7d231ebf8d3d776c12f2df9b9e1d4487a565e367258b5945a8054b142 not found: ID does not exist" containerID="9b2e7cc7d231ebf8d3d776c12f2df9b9e1d4487a565e367258b5945a8054b142" Sep 30 09:14:15 crc kubenswrapper[4760]: I0930 09:14:15.384799 4760 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9b2e7cc7d231ebf8d3d776c12f2df9b9e1d4487a565e367258b5945a8054b142"} err="failed to get container status \"9b2e7cc7d231ebf8d3d776c12f2df9b9e1d4487a565e367258b5945a8054b142\": rpc error: code = NotFound desc = could not find container \"9b2e7cc7d231ebf8d3d776c12f2df9b9e1d4487a565e367258b5945a8054b142\": container with ID starting with 9b2e7cc7d231ebf8d3d776c12f2df9b9e1d4487a565e367258b5945a8054b142 not found: ID does not exist" Sep 30 09:14:15 crc kubenswrapper[4760]: I0930 09:14:15.384821 4760 scope.go:117] "RemoveContainer" containerID="518cb33ba9e23c7b18db81504ce9e92e03d66cbec027c61bed6f33c9873a6aec" Sep 30 09:14:15 crc kubenswrapper[4760]: E0930 
09:14:15.385612 4760 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"518cb33ba9e23c7b18db81504ce9e92e03d66cbec027c61bed6f33c9873a6aec\": container with ID starting with 518cb33ba9e23c7b18db81504ce9e92e03d66cbec027c61bed6f33c9873a6aec not found: ID does not exist" containerID="518cb33ba9e23c7b18db81504ce9e92e03d66cbec027c61bed6f33c9873a6aec" Sep 30 09:14:15 crc kubenswrapper[4760]: I0930 09:14:15.385658 4760 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"518cb33ba9e23c7b18db81504ce9e92e03d66cbec027c61bed6f33c9873a6aec"} err="failed to get container status \"518cb33ba9e23c7b18db81504ce9e92e03d66cbec027c61bed6f33c9873a6aec\": rpc error: code = NotFound desc = could not find container \"518cb33ba9e23c7b18db81504ce9e92e03d66cbec027c61bed6f33c9873a6aec\": container with ID starting with 518cb33ba9e23c7b18db81504ce9e92e03d66cbec027c61bed6f33c9873a6aec not found: ID does not exist" Sep 30 09:14:17 crc kubenswrapper[4760]: I0930 09:14:17.078338 4760 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7db4983c-9c7f-45b4-847d-c01071cf4c48" path="/var/lib/kubelet/pods/7db4983c-9c7f-45b4-847d-c01071cf4c48/volumes" Sep 30 09:14:54 crc kubenswrapper[4760]: I0930 09:14:54.143873 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-8vlhv/must-gather-wqqmm"] Sep 30 09:14:54 crc kubenswrapper[4760]: E0930 09:14:54.144939 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7db4983c-9c7f-45b4-847d-c01071cf4c48" containerName="gather" Sep 30 09:14:54 crc kubenswrapper[4760]: I0930 09:14:54.144954 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="7db4983c-9c7f-45b4-847d-c01071cf4c48" containerName="gather" Sep 30 09:14:54 crc kubenswrapper[4760]: E0930 09:14:54.144987 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b60bd70e-764e-4eda-a145-a5a8cd93536b" 
containerName="extract-utilities" Sep 30 09:14:54 crc kubenswrapper[4760]: I0930 09:14:54.144995 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="b60bd70e-764e-4eda-a145-a5a8cd93536b" containerName="extract-utilities" Sep 30 09:14:54 crc kubenswrapper[4760]: E0930 09:14:54.145007 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b60bd70e-764e-4eda-a145-a5a8cd93536b" containerName="registry-server" Sep 30 09:14:54 crc kubenswrapper[4760]: I0930 09:14:54.145016 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="b60bd70e-764e-4eda-a145-a5a8cd93536b" containerName="registry-server" Sep 30 09:14:54 crc kubenswrapper[4760]: E0930 09:14:54.145032 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7db4983c-9c7f-45b4-847d-c01071cf4c48" containerName="copy" Sep 30 09:14:54 crc kubenswrapper[4760]: I0930 09:14:54.145039 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="7db4983c-9c7f-45b4-847d-c01071cf4c48" containerName="copy" Sep 30 09:14:54 crc kubenswrapper[4760]: E0930 09:14:54.145061 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b60bd70e-764e-4eda-a145-a5a8cd93536b" containerName="extract-content" Sep 30 09:14:54 crc kubenswrapper[4760]: I0930 09:14:54.145069 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="b60bd70e-764e-4eda-a145-a5a8cd93536b" containerName="extract-content" Sep 30 09:14:54 crc kubenswrapper[4760]: E0930 09:14:54.145097 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d9501998-5e1e-4199-b05c-56e913ff558f" containerName="extract-content" Sep 30 09:14:54 crc kubenswrapper[4760]: I0930 09:14:54.145105 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="d9501998-5e1e-4199-b05c-56e913ff558f" containerName="extract-content" Sep 30 09:14:54 crc kubenswrapper[4760]: E0930 09:14:54.145123 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d9501998-5e1e-4199-b05c-56e913ff558f" 
containerName="extract-utilities" Sep 30 09:14:54 crc kubenswrapper[4760]: I0930 09:14:54.145130 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="d9501998-5e1e-4199-b05c-56e913ff558f" containerName="extract-utilities" Sep 30 09:14:54 crc kubenswrapper[4760]: E0930 09:14:54.145142 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d9501998-5e1e-4199-b05c-56e913ff558f" containerName="registry-server" Sep 30 09:14:54 crc kubenswrapper[4760]: I0930 09:14:54.145151 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="d9501998-5e1e-4199-b05c-56e913ff558f" containerName="registry-server" Sep 30 09:14:54 crc kubenswrapper[4760]: I0930 09:14:54.145428 4760 memory_manager.go:354] "RemoveStaleState removing state" podUID="7db4983c-9c7f-45b4-847d-c01071cf4c48" containerName="gather" Sep 30 09:14:54 crc kubenswrapper[4760]: I0930 09:14:54.145447 4760 memory_manager.go:354] "RemoveStaleState removing state" podUID="d9501998-5e1e-4199-b05c-56e913ff558f" containerName="registry-server" Sep 30 09:14:54 crc kubenswrapper[4760]: I0930 09:14:54.145465 4760 memory_manager.go:354] "RemoveStaleState removing state" podUID="b60bd70e-764e-4eda-a145-a5a8cd93536b" containerName="registry-server" Sep 30 09:14:54 crc kubenswrapper[4760]: I0930 09:14:54.145483 4760 memory_manager.go:354] "RemoveStaleState removing state" podUID="7db4983c-9c7f-45b4-847d-c01071cf4c48" containerName="copy" Sep 30 09:14:54 crc kubenswrapper[4760]: I0930 09:14:54.146753 4760 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-8vlhv/must-gather-wqqmm" Sep 30 09:14:54 crc kubenswrapper[4760]: I0930 09:14:54.149734 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-8vlhv"/"default-dockercfg-tsmts" Sep 30 09:14:54 crc kubenswrapper[4760]: I0930 09:14:54.150088 4760 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-8vlhv"/"openshift-service-ca.crt" Sep 30 09:14:54 crc kubenswrapper[4760]: I0930 09:14:54.150439 4760 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-8vlhv"/"kube-root-ca.crt" Sep 30 09:14:54 crc kubenswrapper[4760]: I0930 09:14:54.159363 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/7de214e2-d30b-4290-a9ea-64bed22298e0-must-gather-output\") pod \"must-gather-wqqmm\" (UID: \"7de214e2-d30b-4290-a9ea-64bed22298e0\") " pod="openshift-must-gather-8vlhv/must-gather-wqqmm" Sep 30 09:14:54 crc kubenswrapper[4760]: I0930 09:14:54.159415 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hcrcg\" (UniqueName: \"kubernetes.io/projected/7de214e2-d30b-4290-a9ea-64bed22298e0-kube-api-access-hcrcg\") pod \"must-gather-wqqmm\" (UID: \"7de214e2-d30b-4290-a9ea-64bed22298e0\") " pod="openshift-must-gather-8vlhv/must-gather-wqqmm" Sep 30 09:14:54 crc kubenswrapper[4760]: I0930 09:14:54.169189 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-8vlhv/must-gather-wqqmm"] Sep 30 09:14:54 crc kubenswrapper[4760]: I0930 09:14:54.261205 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/7de214e2-d30b-4290-a9ea-64bed22298e0-must-gather-output\") pod \"must-gather-wqqmm\" (UID: \"7de214e2-d30b-4290-a9ea-64bed22298e0\") " 
pod="openshift-must-gather-8vlhv/must-gather-wqqmm" Sep 30 09:14:54 crc kubenswrapper[4760]: I0930 09:14:54.261262 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hcrcg\" (UniqueName: \"kubernetes.io/projected/7de214e2-d30b-4290-a9ea-64bed22298e0-kube-api-access-hcrcg\") pod \"must-gather-wqqmm\" (UID: \"7de214e2-d30b-4290-a9ea-64bed22298e0\") " pod="openshift-must-gather-8vlhv/must-gather-wqqmm" Sep 30 09:14:54 crc kubenswrapper[4760]: I0930 09:14:54.262028 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/7de214e2-d30b-4290-a9ea-64bed22298e0-must-gather-output\") pod \"must-gather-wqqmm\" (UID: \"7de214e2-d30b-4290-a9ea-64bed22298e0\") " pod="openshift-must-gather-8vlhv/must-gather-wqqmm" Sep 30 09:14:54 crc kubenswrapper[4760]: I0930 09:14:54.285003 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hcrcg\" (UniqueName: \"kubernetes.io/projected/7de214e2-d30b-4290-a9ea-64bed22298e0-kube-api-access-hcrcg\") pod \"must-gather-wqqmm\" (UID: \"7de214e2-d30b-4290-a9ea-64bed22298e0\") " pod="openshift-must-gather-8vlhv/must-gather-wqqmm" Sep 30 09:14:54 crc kubenswrapper[4760]: I0930 09:14:54.488190 4760 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-8vlhv/must-gather-wqqmm" Sep 30 09:14:54 crc kubenswrapper[4760]: I0930 09:14:54.992728 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-8vlhv/must-gather-wqqmm"] Sep 30 09:14:55 crc kubenswrapper[4760]: I0930 09:14:55.722443 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-8vlhv/must-gather-wqqmm" event={"ID":"7de214e2-d30b-4290-a9ea-64bed22298e0","Type":"ContainerStarted","Data":"e3c9eed636e9a52d22554f034f5abed6d00d4faf7600cee3ca414f250933e307"} Sep 30 09:14:55 crc kubenswrapper[4760]: I0930 09:14:55.723449 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-8vlhv/must-gather-wqqmm" event={"ID":"7de214e2-d30b-4290-a9ea-64bed22298e0","Type":"ContainerStarted","Data":"551cd00196f68620a332e6be64b24e9c17ca55bf363fa96b6da3e8006a504b05"} Sep 30 09:14:55 crc kubenswrapper[4760]: I0930 09:14:55.723469 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-8vlhv/must-gather-wqqmm" event={"ID":"7de214e2-d30b-4290-a9ea-64bed22298e0","Type":"ContainerStarted","Data":"e936ea754002bef8c4342cdc923287c253033578b762c97e5206643546f13baf"} Sep 30 09:14:55 crc kubenswrapper[4760]: I0930 09:14:55.747180 4760 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-8vlhv/must-gather-wqqmm" podStartSLOduration=1.747160317 podStartE2EDuration="1.747160317s" podCreationTimestamp="2025-09-30 09:14:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 09:14:55.735844989 +0000 UTC m=+6081.378751411" watchObservedRunningTime="2025-09-30 09:14:55.747160317 +0000 UTC m=+6081.390066729" Sep 30 09:14:58 crc kubenswrapper[4760]: I0930 09:14:58.851286 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-8vlhv/crc-debug-s94t5"] Sep 30 09:14:58 crc kubenswrapper[4760]: 
I0930 09:14:58.853210 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-8vlhv/crc-debug-s94t5" Sep 30 09:14:58 crc kubenswrapper[4760]: I0930 09:14:58.977848 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/81124952-71e6-4e83-9f31-423ee4cde133-host\") pod \"crc-debug-s94t5\" (UID: \"81124952-71e6-4e83-9f31-423ee4cde133\") " pod="openshift-must-gather-8vlhv/crc-debug-s94t5" Sep 30 09:14:58 crc kubenswrapper[4760]: I0930 09:14:58.978356 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s6bnb\" (UniqueName: \"kubernetes.io/projected/81124952-71e6-4e83-9f31-423ee4cde133-kube-api-access-s6bnb\") pod \"crc-debug-s94t5\" (UID: \"81124952-71e6-4e83-9f31-423ee4cde133\") " pod="openshift-must-gather-8vlhv/crc-debug-s94t5" Sep 30 09:14:59 crc kubenswrapper[4760]: I0930 09:14:59.080033 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/81124952-71e6-4e83-9f31-423ee4cde133-host\") pod \"crc-debug-s94t5\" (UID: \"81124952-71e6-4e83-9f31-423ee4cde133\") " pod="openshift-must-gather-8vlhv/crc-debug-s94t5" Sep 30 09:14:59 crc kubenswrapper[4760]: I0930 09:14:59.080144 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s6bnb\" (UniqueName: \"kubernetes.io/projected/81124952-71e6-4e83-9f31-423ee4cde133-kube-api-access-s6bnb\") pod \"crc-debug-s94t5\" (UID: \"81124952-71e6-4e83-9f31-423ee4cde133\") " pod="openshift-must-gather-8vlhv/crc-debug-s94t5" Sep 30 09:14:59 crc kubenswrapper[4760]: I0930 09:14:59.080197 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/81124952-71e6-4e83-9f31-423ee4cde133-host\") pod \"crc-debug-s94t5\" (UID: \"81124952-71e6-4e83-9f31-423ee4cde133\") 
" pod="openshift-must-gather-8vlhv/crc-debug-s94t5" Sep 30 09:14:59 crc kubenswrapper[4760]: I0930 09:14:59.109962 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s6bnb\" (UniqueName: \"kubernetes.io/projected/81124952-71e6-4e83-9f31-423ee4cde133-kube-api-access-s6bnb\") pod \"crc-debug-s94t5\" (UID: \"81124952-71e6-4e83-9f31-423ee4cde133\") " pod="openshift-must-gather-8vlhv/crc-debug-s94t5" Sep 30 09:14:59 crc kubenswrapper[4760]: I0930 09:14:59.178718 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-8vlhv/crc-debug-s94t5" Sep 30 09:14:59 crc kubenswrapper[4760]: W0930 09:14:59.219505 4760 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod81124952_71e6_4e83_9f31_423ee4cde133.slice/crio-290b4dbbce7156e6958a963dc0e49a5fabdfb946342497a7228f59fe2fd508f3 WatchSource:0}: Error finding container 290b4dbbce7156e6958a963dc0e49a5fabdfb946342497a7228f59fe2fd508f3: Status 404 returned error can't find the container with id 290b4dbbce7156e6958a963dc0e49a5fabdfb946342497a7228f59fe2fd508f3 Sep 30 09:14:59 crc kubenswrapper[4760]: I0930 09:14:59.758283 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-8vlhv/crc-debug-s94t5" event={"ID":"81124952-71e6-4e83-9f31-423ee4cde133","Type":"ContainerStarted","Data":"cd48d0d93161a5653ba46822c329ddef4de9a7347e71432ce72258e2c257992e"} Sep 30 09:14:59 crc kubenswrapper[4760]: I0930 09:14:59.758828 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-8vlhv/crc-debug-s94t5" event={"ID":"81124952-71e6-4e83-9f31-423ee4cde133","Type":"ContainerStarted","Data":"290b4dbbce7156e6958a963dc0e49a5fabdfb946342497a7228f59fe2fd508f3"} Sep 30 09:14:59 crc kubenswrapper[4760]: I0930 09:14:59.778004 4760 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-8vlhv/crc-debug-s94t5" 
podStartSLOduration=1.777984587 podStartE2EDuration="1.777984587s" podCreationTimestamp="2025-09-30 09:14:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 09:14:59.768829774 +0000 UTC m=+6085.411736196" watchObservedRunningTime="2025-09-30 09:14:59.777984587 +0000 UTC m=+6085.420890999" Sep 30 09:15:00 crc kubenswrapper[4760]: I0930 09:15:00.162694 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29320395-jf4ml"] Sep 30 09:15:00 crc kubenswrapper[4760]: I0930 09:15:00.164710 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29320395-jf4ml" Sep 30 09:15:00 crc kubenswrapper[4760]: I0930 09:15:00.167640 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Sep 30 09:15:00 crc kubenswrapper[4760]: I0930 09:15:00.167829 4760 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Sep 30 09:15:00 crc kubenswrapper[4760]: I0930 09:15:00.171487 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29320395-jf4ml"] Sep 30 09:15:00 crc kubenswrapper[4760]: I0930 09:15:00.301184 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l72z6\" (UniqueName: \"kubernetes.io/projected/10799bad-eac6-4db4-86e0-377983dd439d-kube-api-access-l72z6\") pod \"collect-profiles-29320395-jf4ml\" (UID: \"10799bad-eac6-4db4-86e0-377983dd439d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320395-jf4ml" Sep 30 09:15:00 crc kubenswrapper[4760]: I0930 09:15:00.301319 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/10799bad-eac6-4db4-86e0-377983dd439d-secret-volume\") pod \"collect-profiles-29320395-jf4ml\" (UID: \"10799bad-eac6-4db4-86e0-377983dd439d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320395-jf4ml" Sep 30 09:15:00 crc kubenswrapper[4760]: I0930 09:15:00.301493 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/10799bad-eac6-4db4-86e0-377983dd439d-config-volume\") pod \"collect-profiles-29320395-jf4ml\" (UID: \"10799bad-eac6-4db4-86e0-377983dd439d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320395-jf4ml" Sep 30 09:15:00 crc kubenswrapper[4760]: I0930 09:15:00.403526 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/10799bad-eac6-4db4-86e0-377983dd439d-config-volume\") pod \"collect-profiles-29320395-jf4ml\" (UID: \"10799bad-eac6-4db4-86e0-377983dd439d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320395-jf4ml" Sep 30 09:15:00 crc kubenswrapper[4760]: I0930 09:15:00.403624 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l72z6\" (UniqueName: \"kubernetes.io/projected/10799bad-eac6-4db4-86e0-377983dd439d-kube-api-access-l72z6\") pod \"collect-profiles-29320395-jf4ml\" (UID: \"10799bad-eac6-4db4-86e0-377983dd439d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320395-jf4ml" Sep 30 09:15:00 crc kubenswrapper[4760]: I0930 09:15:00.403709 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/10799bad-eac6-4db4-86e0-377983dd439d-secret-volume\") pod \"collect-profiles-29320395-jf4ml\" (UID: \"10799bad-eac6-4db4-86e0-377983dd439d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320395-jf4ml" Sep 
30 09:15:00 crc kubenswrapper[4760]: I0930 09:15:00.404591 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/10799bad-eac6-4db4-86e0-377983dd439d-config-volume\") pod \"collect-profiles-29320395-jf4ml\" (UID: \"10799bad-eac6-4db4-86e0-377983dd439d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320395-jf4ml" Sep 30 09:15:00 crc kubenswrapper[4760]: I0930 09:15:00.411007 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/10799bad-eac6-4db4-86e0-377983dd439d-secret-volume\") pod \"collect-profiles-29320395-jf4ml\" (UID: \"10799bad-eac6-4db4-86e0-377983dd439d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320395-jf4ml" Sep 30 09:15:00 crc kubenswrapper[4760]: I0930 09:15:00.420678 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l72z6\" (UniqueName: \"kubernetes.io/projected/10799bad-eac6-4db4-86e0-377983dd439d-kube-api-access-l72z6\") pod \"collect-profiles-29320395-jf4ml\" (UID: \"10799bad-eac6-4db4-86e0-377983dd439d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320395-jf4ml" Sep 30 09:15:00 crc kubenswrapper[4760]: I0930 09:15:00.492160 4760 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29320395-jf4ml" Sep 30 09:15:01 crc kubenswrapper[4760]: I0930 09:15:01.009221 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29320395-jf4ml"] Sep 30 09:15:01 crc kubenswrapper[4760]: I0930 09:15:01.789904 4760 generic.go:334] "Generic (PLEG): container finished" podID="10799bad-eac6-4db4-86e0-377983dd439d" containerID="a968c1512deb33917728e136fe8142d293548e0c5ec207f040eeeba46a574093" exitCode=0 Sep 30 09:15:01 crc kubenswrapper[4760]: I0930 09:15:01.789957 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29320395-jf4ml" event={"ID":"10799bad-eac6-4db4-86e0-377983dd439d","Type":"ContainerDied","Data":"a968c1512deb33917728e136fe8142d293548e0c5ec207f040eeeba46a574093"} Sep 30 09:15:01 crc kubenswrapper[4760]: I0930 09:15:01.790461 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29320395-jf4ml" event={"ID":"10799bad-eac6-4db4-86e0-377983dd439d","Type":"ContainerStarted","Data":"a305f17af2f82e6259f670d6da903522557374a6331528e1a30868bc012e2b35"} Sep 30 09:15:03 crc kubenswrapper[4760]: I0930 09:15:03.195152 4760 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29320395-jf4ml" Sep 30 09:15:03 crc kubenswrapper[4760]: I0930 09:15:03.374283 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l72z6\" (UniqueName: \"kubernetes.io/projected/10799bad-eac6-4db4-86e0-377983dd439d-kube-api-access-l72z6\") pod \"10799bad-eac6-4db4-86e0-377983dd439d\" (UID: \"10799bad-eac6-4db4-86e0-377983dd439d\") " Sep 30 09:15:03 crc kubenswrapper[4760]: I0930 09:15:03.374391 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/10799bad-eac6-4db4-86e0-377983dd439d-secret-volume\") pod \"10799bad-eac6-4db4-86e0-377983dd439d\" (UID: \"10799bad-eac6-4db4-86e0-377983dd439d\") " Sep 30 09:15:03 crc kubenswrapper[4760]: I0930 09:15:03.374484 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/10799bad-eac6-4db4-86e0-377983dd439d-config-volume\") pod \"10799bad-eac6-4db4-86e0-377983dd439d\" (UID: \"10799bad-eac6-4db4-86e0-377983dd439d\") " Sep 30 09:15:03 crc kubenswrapper[4760]: I0930 09:15:03.375354 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/10799bad-eac6-4db4-86e0-377983dd439d-config-volume" (OuterVolumeSpecName: "config-volume") pod "10799bad-eac6-4db4-86e0-377983dd439d" (UID: "10799bad-eac6-4db4-86e0-377983dd439d"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 09:15:03 crc kubenswrapper[4760]: I0930 09:15:03.380607 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/10799bad-eac6-4db4-86e0-377983dd439d-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "10799bad-eac6-4db4-86e0-377983dd439d" (UID: "10799bad-eac6-4db4-86e0-377983dd439d"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 09:15:03 crc kubenswrapper[4760]: I0930 09:15:03.384464 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/10799bad-eac6-4db4-86e0-377983dd439d-kube-api-access-l72z6" (OuterVolumeSpecName: "kube-api-access-l72z6") pod "10799bad-eac6-4db4-86e0-377983dd439d" (UID: "10799bad-eac6-4db4-86e0-377983dd439d"). InnerVolumeSpecName "kube-api-access-l72z6". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 09:15:03 crc kubenswrapper[4760]: I0930 09:15:03.476898 4760 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l72z6\" (UniqueName: \"kubernetes.io/projected/10799bad-eac6-4db4-86e0-377983dd439d-kube-api-access-l72z6\") on node \"crc\" DevicePath \"\"" Sep 30 09:15:03 crc kubenswrapper[4760]: I0930 09:15:03.476941 4760 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/10799bad-eac6-4db4-86e0-377983dd439d-secret-volume\") on node \"crc\" DevicePath \"\"" Sep 30 09:15:03 crc kubenswrapper[4760]: I0930 09:15:03.476953 4760 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/10799bad-eac6-4db4-86e0-377983dd439d-config-volume\") on node \"crc\" DevicePath \"\"" Sep 30 09:15:03 crc kubenswrapper[4760]: I0930 09:15:03.808700 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29320395-jf4ml" event={"ID":"10799bad-eac6-4db4-86e0-377983dd439d","Type":"ContainerDied","Data":"a305f17af2f82e6259f670d6da903522557374a6331528e1a30868bc012e2b35"} Sep 30 09:15:03 crc kubenswrapper[4760]: I0930 09:15:03.808739 4760 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a305f17af2f82e6259f670d6da903522557374a6331528e1a30868bc012e2b35" Sep 30 09:15:03 crc kubenswrapper[4760]: I0930 09:15:03.808768 4760 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29320395-jf4ml" Sep 30 09:15:04 crc kubenswrapper[4760]: I0930 09:15:04.267385 4760 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29320350-mwrrt"] Sep 30 09:15:04 crc kubenswrapper[4760]: I0930 09:15:04.274790 4760 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29320350-mwrrt"] Sep 30 09:15:05 crc kubenswrapper[4760]: I0930 09:15:05.078613 4760 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="27bcbc4c-bacd-4e8f-a452-26ab86ade323" path="/var/lib/kubelet/pods/27bcbc4c-bacd-4e8f-a452-26ab86ade323/volumes" Sep 30 09:15:48 crc kubenswrapper[4760]: I0930 09:15:48.288036 4760 scope.go:117] "RemoveContainer" containerID="1bbf173643742246f3e6b7636e45bed0c56428287bfc93470061397958c914bb" Sep 30 09:15:49 crc kubenswrapper[4760]: I0930 09:15:49.118327 4760 patch_prober.go:28] interesting pod/machine-config-daemon-f2lrk container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 09:15:49 crc kubenswrapper[4760]: I0930 09:15:49.118596 4760 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-f2lrk" podUID="7a9c8270-6964-4886-87d0-227b05b76da4" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 30 09:16:18 crc kubenswrapper[4760]: I0930 09:16:18.870354 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-65c7cb7cc8-cjqdk_59d36e9c-c1d2-4da4-be6b-ddfbc9fd76b3/barbican-api-log/0.log" Sep 30 09:16:18 crc kubenswrapper[4760]: I0930 09:16:18.931168 4760 log.go:25] "Finished parsing log 
file" path="/var/log/pods/openstack_barbican-api-65c7cb7cc8-cjqdk_59d36e9c-c1d2-4da4-be6b-ddfbc9fd76b3/barbican-api/0.log" Sep 30 09:16:19 crc kubenswrapper[4760]: I0930 09:16:19.110816 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-554bb7d464-zcqc8_968a427d-0cee-4775-ab7f-4ec27e535b33/barbican-keystone-listener/0.log" Sep 30 09:16:19 crc kubenswrapper[4760]: I0930 09:16:19.112675 4760 patch_prober.go:28] interesting pod/machine-config-daemon-f2lrk container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 09:16:19 crc kubenswrapper[4760]: I0930 09:16:19.112855 4760 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-f2lrk" podUID="7a9c8270-6964-4886-87d0-227b05b76da4" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 30 09:16:19 crc kubenswrapper[4760]: I0930 09:16:19.188527 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-554bb7d464-zcqc8_968a427d-0cee-4775-ab7f-4ec27e535b33/barbican-keystone-listener-log/0.log" Sep 30 09:16:19 crc kubenswrapper[4760]: I0930 09:16:19.380375 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-8479dd9dbc-25wxx_caf10164-5c77-42df-9fdc-b6a1764a0e3d/barbican-worker/0.log" Sep 30 09:16:19 crc kubenswrapper[4760]: I0930 09:16:19.446338 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-8479dd9dbc-25wxx_caf10164-5c77-42df-9fdc-b6a1764a0e3d/barbican-worker-log/0.log" Sep 30 09:16:19 crc kubenswrapper[4760]: I0930 09:16:19.898547 4760 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_bootstrap-edpm-deployment-openstack-edpm-ipam-fnxp6_64e019eb-1763-4e9e-8c00-c4312d782981/bootstrap-edpm-deployment-openstack-edpm-ipam/0.log" Sep 30 09:16:20 crc kubenswrapper[4760]: I0930 09:16:20.034440 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_5c70743c-2be6-4c97-aaa8-fe22bd306c7d/ceilometer-central-agent/0.log" Sep 30 09:16:20 crc kubenswrapper[4760]: I0930 09:16:20.123486 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_5c70743c-2be6-4c97-aaa8-fe22bd306c7d/ceilometer-notification-agent/0.log" Sep 30 09:16:20 crc kubenswrapper[4760]: I0930 09:16:20.156149 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_5c70743c-2be6-4c97-aaa8-fe22bd306c7d/proxy-httpd/0.log" Sep 30 09:16:20 crc kubenswrapper[4760]: I0930 09:16:20.244827 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_5c70743c-2be6-4c97-aaa8-fe22bd306c7d/sg-core/0.log" Sep 30 09:16:20 crc kubenswrapper[4760]: I0930 09:16:20.448540 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_ce9e61a5-93ea-4bc8-bb73-0578fe123aae/cinder-api/0.log" Sep 30 09:16:20 crc kubenswrapper[4760]: I0930 09:16:20.499702 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_ce9e61a5-93ea-4bc8-bb73-0578fe123aae/cinder-api-log/0.log" Sep 30 09:16:20 crc kubenswrapper[4760]: I0930 09:16:20.700351 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_d3817395-40ad-472b-b4df-83a7386bb16f/cinder-scheduler/0.log" Sep 30 09:16:20 crc kubenswrapper[4760]: I0930 09:16:20.786705 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_d3817395-40ad-472b-b4df-83a7386bb16f/probe/0.log" Sep 30 09:16:20 crc kubenswrapper[4760]: I0930 09:16:20.966924 4760 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_configure-network-edpm-deployment-openstack-edpm-ipam-sj6c4_e2439cde-d5f2-423a-9e6d-4af8d713c917/configure-network-edpm-deployment-openstack-edpm-ipam/0.log" Sep 30 09:16:21 crc kubenswrapper[4760]: I0930 09:16:21.099924 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-os-edpm-deployment-openstack-edpm-ipam-mpw2v_fc1682c5-7e4d-43a1-89f4-b40761683742/configure-os-edpm-deployment-openstack-edpm-ipam/0.log" Sep 30 09:16:21 crc kubenswrapper[4760]: I0930 09:16:21.289504 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-7c8665b49f-cp9sh_5e0ff1e1-cda6-4574-a353-f4a7406326e7/init/0.log" Sep 30 09:16:21 crc kubenswrapper[4760]: I0930 09:16:21.451719 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-7c8665b49f-cp9sh_5e0ff1e1-cda6-4574-a353-f4a7406326e7/init/0.log" Sep 30 09:16:21 crc kubenswrapper[4760]: I0930 09:16:21.697716 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-7c8665b49f-cp9sh_5e0ff1e1-cda6-4574-a353-f4a7406326e7/dnsmasq-dns/0.log" Sep 30 09:16:21 crc kubenswrapper[4760]: I0930 09:16:21.741171 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_download-cache-edpm-deployment-openstack-edpm-ipam-2xvvd_87fc7cca-6571-4e27-ab1e-14648064566e/download-cache-edpm-deployment-openstack-edpm-ipam/0.log" Sep 30 09:16:21 crc kubenswrapper[4760]: I0930 09:16:21.969123 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_19ddce20-85ae-4537-86f4-33a6b35fef0b/glance-log/0.log" Sep 30 09:16:22 crc kubenswrapper[4760]: I0930 09:16:22.002987 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_19ddce20-85ae-4537-86f4-33a6b35fef0b/glance-httpd/0.log" Sep 30 09:16:22 crc kubenswrapper[4760]: I0930 09:16:22.161552 4760 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_glance-default-internal-api-0_eec5d0ef-04f3-4a34-8575-45e2a88c519f/glance-httpd/0.log" Sep 30 09:16:22 crc kubenswrapper[4760]: I0930 09:16:22.202335 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_eec5d0ef-04f3-4a34-8575-45e2a88c519f/glance-log/0.log" Sep 30 09:16:22 crc kubenswrapper[4760]: I0930 09:16:22.363617 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-75644c8bb4-wrsmv_8b39ba3e-25df-4a22-a1fe-f15e6ca1fada/horizon/0.log" Sep 30 09:16:22 crc kubenswrapper[4760]: I0930 09:16:22.538520 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-certs-edpm-deployment-openstack-edpm-ipam-vkv7v_e0989b93-a567-4aa1-886e-43b6fa827891/install-certs-edpm-deployment-openstack-edpm-ipam/0.log" Sep 30 09:16:22 crc kubenswrapper[4760]: I0930 09:16:22.925723 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-os-edpm-deployment-openstack-edpm-ipam-zxgfv_23f502d4-3801-4388-b442-22f60146dcf2/install-os-edpm-deployment-openstack-edpm-ipam/0.log" Sep 30 09:16:23 crc kubenswrapper[4760]: I0930 09:16:23.250720 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-75644c8bb4-wrsmv_8b39ba3e-25df-4a22-a1fe-f15e6ca1fada/horizon-log/0.log" Sep 30 09:16:23 crc kubenswrapper[4760]: I0930 09:16:23.358477 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-cron-29320321-w9cp7_2a6d4144-48a7-412d-9288-a909f1fbd5f4/keystone-cron/0.log" Sep 30 09:16:23 crc kubenswrapper[4760]: I0930 09:16:23.453050 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-77987c8bb7-t2mw2_9d49cf7a-b821-4677-88fe-8fac1dbced63/keystone-api/0.log" Sep 30 09:16:23 crc kubenswrapper[4760]: I0930 09:16:23.478679 4760 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_keystone-cron-29320381-pcwcm_aa755a39-f3ae-49a4-80ce-a0efcfe2566e/keystone-cron/0.log" Sep 30 09:16:23 crc kubenswrapper[4760]: I0930 09:16:23.629039 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_kube-state-metrics-0_4fc405db-b61d-4077-b14d-ef2b4eea924c/kube-state-metrics/0.log" Sep 30 09:16:23 crc kubenswrapper[4760]: I0930 09:16:23.754826 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_libvirt-edpm-deployment-openstack-edpm-ipam-s99vh_d904db1f-5f11-47d3-8823-ff59f4bed296/libvirt-edpm-deployment-openstack-edpm-ipam/0.log" Sep 30 09:16:24 crc kubenswrapper[4760]: I0930 09:16:24.321753 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-55ffd7b5b9-x7zhf_6ba8e9e7-1b15-4283-9a3a-0e9515d10bd2/neutron-httpd/0.log" Sep 30 09:16:24 crc kubenswrapper[4760]: I0930 09:16:24.391234 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-55ffd7b5b9-x7zhf_6ba8e9e7-1b15-4283-9a3a-0e9515d10bd2/neutron-api/0.log" Sep 30 09:16:24 crc kubenswrapper[4760]: I0930 09:16:24.441456 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-metadata-edpm-deployment-openstack-edpm-ipam-sz5l4_5259c092-63b5-4574-b14a-725c45523773/neutron-metadata-edpm-deployment-openstack-edpm-ipam/0.log" Sep 30 09:16:25 crc kubenswrapper[4760]: I0930 09:16:25.434651 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell0-conductor-0_d1669957-f301-409c-8f6b-e1b87dfadeb7/nova-cell0-conductor-conductor/0.log" Sep 30 09:16:25 crc kubenswrapper[4760]: I0930 09:16:25.938033 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_c27b43ce-27fb-4163-b55a-98a7e9ee7d71/nova-api-log/0.log" Sep 30 09:16:26 crc kubenswrapper[4760]: I0930 09:16:26.111929 4760 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_nova-cell1-conductor-0_998c9109-185a-425a-bda1-12eb13c83ca7/nova-cell1-conductor-conductor/0.log" Sep 30 09:16:26 crc kubenswrapper[4760]: I0930 09:16:26.461008 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_c27b43ce-27fb-4163-b55a-98a7e9ee7d71/nova-api-api/0.log" Sep 30 09:16:26 crc kubenswrapper[4760]: I0930 09:16:26.499907 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-novncproxy-0_f7ab1b57-8aaa-4360-b024-fa2142ebd994/nova-cell1-novncproxy-novncproxy/0.log" Sep 30 09:16:26 crc kubenswrapper[4760]: I0930 09:16:26.944712 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-edpm-deployment-openstack-edpm-ipam-5ncgd_33cc4d6c-b086-410c-b38e-f6c918657a74/nova-edpm-deployment-openstack-edpm-ipam/0.log" Sep 30 09:16:27 crc kubenswrapper[4760]: I0930 09:16:27.038620 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_10427d42-4cfc-486e-931c-fd62a2a5b1e5/nova-metadata-log/0.log" Sep 30 09:16:27 crc kubenswrapper[4760]: I0930 09:16:27.655025 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-scheduler-0_868868bd-3879-4d24-9dd1-62218a15844c/nova-scheduler-scheduler/0.log" Sep 30 09:16:27 crc kubenswrapper[4760]: I0930 09:16:27.717994 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_894abb89-f647-4143-904c-88b5108982cd/mysql-bootstrap/0.log" Sep 30 09:16:27 crc kubenswrapper[4760]: I0930 09:16:27.912810 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_894abb89-f647-4143-904c-88b5108982cd/mysql-bootstrap/0.log" Sep 30 09:16:27 crc kubenswrapper[4760]: I0930 09:16:27.983638 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_894abb89-f647-4143-904c-88b5108982cd/galera/0.log" Sep 30 09:16:28 crc kubenswrapper[4760]: I0930 09:16:28.210031 
4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_641818bf-a81e-4654-a8f7-c8d06fbefc6c/mysql-bootstrap/0.log" Sep 30 09:16:28 crc kubenswrapper[4760]: I0930 09:16:28.440473 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_641818bf-a81e-4654-a8f7-c8d06fbefc6c/mysql-bootstrap/0.log" Sep 30 09:16:28 crc kubenswrapper[4760]: I0930 09:16:28.465589 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_641818bf-a81e-4654-a8f7-c8d06fbefc6c/galera/0.log" Sep 30 09:16:28 crc kubenswrapper[4760]: I0930 09:16:28.688998 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstackclient_1c5867a3-c734-489e-a6b3-edb023949556/openstackclient/0.log" Sep 30 09:16:28 crc kubenswrapper[4760]: I0930 09:16:28.941338 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-56wgh_159ee554-1b62-4fe3-95c6-e64ab0c58b2d/ovn-controller/0.log" Sep 30 09:16:29 crc kubenswrapper[4760]: I0930 09:16:29.144641 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-k6hgt_83918139-1a35-439f-8f7c-cd46d6e21064/openstack-network-exporter/0.log" Sep 30 09:16:29 crc kubenswrapper[4760]: I0930 09:16:29.453929 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-bwrv9_8735bd7c-231f-47df-a404-b8cab84f0d7b/ovsdb-server-init/0.log" Sep 30 09:16:29 crc kubenswrapper[4760]: I0930 09:16:29.637216 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_10427d42-4cfc-486e-931c-fd62a2a5b1e5/nova-metadata-metadata/0.log" Sep 30 09:16:29 crc kubenswrapper[4760]: I0930 09:16:29.782333 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-bwrv9_8735bd7c-231f-47df-a404-b8cab84f0d7b/ovs-vswitchd/0.log" Sep 30 09:16:29 crc kubenswrapper[4760]: I0930 09:16:29.821809 4760 log.go:25] "Finished 
parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-bwrv9_8735bd7c-231f-47df-a404-b8cab84f0d7b/ovsdb-server-init/0.log" Sep 30 09:16:29 crc kubenswrapper[4760]: I0930 09:16:29.845991 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-bwrv9_8735bd7c-231f-47df-a404-b8cab84f0d7b/ovsdb-server/0.log" Sep 30 09:16:30 crc kubenswrapper[4760]: I0930 09:16:30.148388 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-edpm-deployment-openstack-edpm-ipam-69ht5_0f077fda-e7af-42a5-9d0b-f007910f6948/ovn-edpm-deployment-openstack-edpm-ipam/0.log" Sep 30 09:16:30 crc kubenswrapper[4760]: I0930 09:16:30.508160 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_96ee48d1-c16e-4367-9159-0f9ddaf5e66a/openstack-network-exporter/0.log" Sep 30 09:16:30 crc kubenswrapper[4760]: I0930 09:16:30.644740 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_96ee48d1-c16e-4367-9159-0f9ddaf5e66a/ovn-northd/0.log" Sep 30 09:16:30 crc kubenswrapper[4760]: I0930 09:16:30.758606 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_d25b3b00-98d6-4bfc-8218-9ea7319e1c60/openstack-network-exporter/0.log" Sep 30 09:16:30 crc kubenswrapper[4760]: I0930 09:16:30.898739 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_d25b3b00-98d6-4bfc-8218-9ea7319e1c60/ovsdbserver-nb/0.log" Sep 30 09:16:30 crc kubenswrapper[4760]: I0930 09:16:30.927651 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_d2f7adda-d2ed-4c87-8e63-64e344155305/openstack-network-exporter/0.log" Sep 30 09:16:31 crc kubenswrapper[4760]: I0930 09:16:31.166789 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_d2f7adda-d2ed-4c87-8e63-64e344155305/ovsdbserver-sb/0.log" Sep 30 09:16:31 crc kubenswrapper[4760]: I0930 09:16:31.515906 4760 
log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-844b758db4-hzncj_a6cfc37b-8ee0-4efe-a43f-b53bafbf4255/placement-api/0.log" Sep 30 09:16:31 crc kubenswrapper[4760]: I0930 09:16:31.672992 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-844b758db4-hzncj_a6cfc37b-8ee0-4efe-a43f-b53bafbf4255/placement-log/0.log" Sep 30 09:16:31 crc kubenswrapper[4760]: I0930 09:16:31.683400 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_33ff1fe6-1a25-49ab-8b11-97ab06ee2e43/init-config-reloader/0.log" Sep 30 09:16:31 crc kubenswrapper[4760]: I0930 09:16:31.856465 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_33ff1fe6-1a25-49ab-8b11-97ab06ee2e43/init-config-reloader/0.log" Sep 30 09:16:31 crc kubenswrapper[4760]: I0930 09:16:31.901700 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_33ff1fe6-1a25-49ab-8b11-97ab06ee2e43/config-reloader/0.log" Sep 30 09:16:31 crc kubenswrapper[4760]: I0930 09:16:31.935195 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_33ff1fe6-1a25-49ab-8b11-97ab06ee2e43/prometheus/0.log" Sep 30 09:16:32 crc kubenswrapper[4760]: I0930 09:16:32.034436 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_33ff1fe6-1a25-49ab-8b11-97ab06ee2e43/thanos-sidecar/0.log" Sep 30 09:16:32 crc kubenswrapper[4760]: I0930 09:16:32.200704 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_2750016d-97a4-4e2b-a0e8-a03ddd6d64bb/setup-container/0.log" Sep 30 09:16:32 crc kubenswrapper[4760]: I0930 09:16:32.368278 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_2750016d-97a4-4e2b-a0e8-a03ddd6d64bb/setup-container/0.log" Sep 30 09:16:32 crc kubenswrapper[4760]: I0930 
09:16:32.402342 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_2750016d-97a4-4e2b-a0e8-a03ddd6d64bb/rabbitmq/0.log" Sep 30 09:16:32 crc kubenswrapper[4760]: I0930 09:16:32.634074 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_7e7f8143-b3b0-473b-b058-1c2fd9eaa5ac/setup-container/0.log" Sep 30 09:16:32 crc kubenswrapper[4760]: I0930 09:16:32.773051 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_7e7f8143-b3b0-473b-b058-1c2fd9eaa5ac/setup-container/0.log" Sep 30 09:16:32 crc kubenswrapper[4760]: I0930 09:16:32.808487 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_7e7f8143-b3b0-473b-b058-1c2fd9eaa5ac/rabbitmq/0.log" Sep 30 09:16:32 crc kubenswrapper[4760]: I0930 09:16:32.996146 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_reboot-os-edpm-deployment-openstack-edpm-ipam-6xmjs_9efadf79-7f8c-4a83-9788-6f4f0a5ecd77/reboot-os-edpm-deployment-openstack-edpm-ipam/0.log" Sep 30 09:16:33 crc kubenswrapper[4760]: I0930 09:16:33.080457 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_redhat-edpm-deployment-openstack-edpm-ipam-wld9k_36b213a9-6e12-4215-be85-b1a0c647558f/redhat-edpm-deployment-openstack-edpm-ipam/0.log" Sep 30 09:16:33 crc kubenswrapper[4760]: I0930 09:16:33.249738 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_repo-setup-edpm-deployment-openstack-edpm-ipam-hl6sj_65500975-80f6-4dae-a528-33950d370831/repo-setup-edpm-deployment-openstack-edpm-ipam/0.log" Sep 30 09:16:33 crc kubenswrapper[4760]: I0930 09:16:33.441083 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_run-os-edpm-deployment-openstack-edpm-ipam-vznk5_820e5332-bfcf-4cca-8079-e3d26cc62517/run-os-edpm-deployment-openstack-edpm-ipam/0.log" Sep 30 09:16:33 crc kubenswrapper[4760]: I0930 09:16:33.574620 4760 log.go:25] "Finished 
parsing log file" path="/var/log/pods/openstack_ssh-known-hosts-edpm-deployment-m7lks_00487f96-583f-4ae8-bd0d-7fb932d86feb/ssh-known-hosts-edpm-deployment/0.log" Sep 30 09:16:33 crc kubenswrapper[4760]: I0930 09:16:33.814449 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-6cc97c56c5-7pkjn_fc788440-e748-4b41-bdb6-23a6764062fd/proxy-server/0.log" Sep 30 09:16:34 crc kubenswrapper[4760]: I0930 09:16:34.020184 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-6cc97c56c5-7pkjn_fc788440-e748-4b41-bdb6-23a6764062fd/proxy-httpd/0.log" Sep 30 09:16:34 crc kubenswrapper[4760]: I0930 09:16:34.030054 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-ring-rebalance-4rxf2_cfaf0e86-3b68-4e5c-8caf-c60518a28016/swift-ring-rebalance/0.log" Sep 30 09:16:34 crc kubenswrapper[4760]: I0930 09:16:34.238639 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_db4f0b34-3c4a-4c78-b284-5959e91b00c0/account-reaper/0.log" Sep 30 09:16:34 crc kubenswrapper[4760]: I0930 09:16:34.279627 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_db4f0b34-3c4a-4c78-b284-5959e91b00c0/account-auditor/0.log" Sep 30 09:16:34 crc kubenswrapper[4760]: I0930 09:16:34.403443 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_db4f0b34-3c4a-4c78-b284-5959e91b00c0/account-replicator/0.log" Sep 30 09:16:34 crc kubenswrapper[4760]: I0930 09:16:34.465803 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_db4f0b34-3c4a-4c78-b284-5959e91b00c0/account-server/0.log" Sep 30 09:16:34 crc kubenswrapper[4760]: I0930 09:16:34.496471 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_db4f0b34-3c4a-4c78-b284-5959e91b00c0/container-auditor/0.log" Sep 30 09:16:34 crc kubenswrapper[4760]: I0930 09:16:34.664366 4760 log.go:25] "Finished 
parsing log file" path="/var/log/pods/openstack_swift-storage-0_db4f0b34-3c4a-4c78-b284-5959e91b00c0/container-replicator/0.log" Sep 30 09:16:34 crc kubenswrapper[4760]: I0930 09:16:34.718059 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_db4f0b34-3c4a-4c78-b284-5959e91b00c0/container-updater/0.log" Sep 30 09:16:34 crc kubenswrapper[4760]: I0930 09:16:34.736952 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_db4f0b34-3c4a-4c78-b284-5959e91b00c0/container-server/0.log" Sep 30 09:16:34 crc kubenswrapper[4760]: I0930 09:16:34.939953 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_db4f0b34-3c4a-4c78-b284-5959e91b00c0/object-expirer/0.log" Sep 30 09:16:34 crc kubenswrapper[4760]: I0930 09:16:34.952961 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_db4f0b34-3c4a-4c78-b284-5959e91b00c0/object-auditor/0.log" Sep 30 09:16:34 crc kubenswrapper[4760]: I0930 09:16:34.979888 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_db4f0b34-3c4a-4c78-b284-5959e91b00c0/object-replicator/0.log" Sep 30 09:16:35 crc kubenswrapper[4760]: I0930 09:16:35.143540 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_db4f0b34-3c4a-4c78-b284-5959e91b00c0/object-server/0.log" Sep 30 09:16:35 crc kubenswrapper[4760]: I0930 09:16:35.171450 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_db4f0b34-3c4a-4c78-b284-5959e91b00c0/rsync/0.log" Sep 30 09:16:35 crc kubenswrapper[4760]: I0930 09:16:35.193647 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_db4f0b34-3c4a-4c78-b284-5959e91b00c0/object-updater/0.log" Sep 30 09:16:35 crc kubenswrapper[4760]: I0930 09:16:35.330830 4760 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_swift-storage-0_db4f0b34-3c4a-4c78-b284-5959e91b00c0/swift-recon-cron/0.log" Sep 30 09:16:35 crc kubenswrapper[4760]: I0930 09:16:35.435656 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_telemetry-edpm-deployment-openstack-edpm-ipam-qpf2g_9d944c3c-b8ab-4a31-a8c6-aa086a0d02fd/telemetry-edpm-deployment-openstack-edpm-ipam/0.log" Sep 30 09:16:35 crc kubenswrapper[4760]: I0930 09:16:35.598444 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_tempest-tests-tempest_394b8542-fe18-475f-9374-ce5c7e3820e7/tempest-tests-tempest-tests-runner/0.log" Sep 30 09:16:35 crc kubenswrapper[4760]: I0930 09:16:35.856745 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_test-operator-logs-pod-tempest-tempest-tests-tempest_bb327f16-8c82-4829-b400-a5917094069f/test-operator-logs-container/0.log" Sep 30 09:16:35 crc kubenswrapper[4760]: I0930 09:16:35.926085 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_validate-network-edpm-deployment-openstack-edpm-ipam-ptr7r_daecc10f-5930-44cc-806b-95012b47df8a/validate-network-edpm-deployment-openstack-edpm-ipam/0.log" Sep 30 09:16:36 crc kubenswrapper[4760]: I0930 09:16:36.977489 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_watcher-applier-0_7c12cbe5-a6fb-4ead-bb65-cd13dab410ce/watcher-applier/0.log" Sep 30 09:16:37 crc kubenswrapper[4760]: I0930 09:16:37.093883 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_watcher-api-0_79c24341-f615-4dcf-818f-e1c398e2504d/watcher-api-log/0.log" Sep 30 09:16:38 crc kubenswrapper[4760]: I0930 09:16:38.594549 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_watcher-decision-engine-0_bd8762e1-d4b3-4999-996e-db79b881afec/watcher-decision-engine/0.log" Sep 30 09:16:40 crc kubenswrapper[4760]: I0930 09:16:40.967177 4760 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_watcher-api-0_79c24341-f615-4dcf-818f-e1c398e2504d/watcher-api/0.log" Sep 30 09:16:43 crc kubenswrapper[4760]: I0930 09:16:43.182020 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_memcached-0_598c1476-b9fa-48c1-a346-80e23448d00f/memcached/0.log" Sep 30 09:16:48 crc kubenswrapper[4760]: I0930 09:16:48.370000 4760 scope.go:117] "RemoveContainer" containerID="de32625d893599ea58e25f401caafe89e732c3ca86bcbe693edd4cce6be6572e" Sep 30 09:16:49 crc kubenswrapper[4760]: I0930 09:16:49.113488 4760 patch_prober.go:28] interesting pod/machine-config-daemon-f2lrk container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 09:16:49 crc kubenswrapper[4760]: I0930 09:16:49.114260 4760 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-f2lrk" podUID="7a9c8270-6964-4886-87d0-227b05b76da4" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 30 09:16:49 crc kubenswrapper[4760]: I0930 09:16:49.114452 4760 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-f2lrk" Sep 30 09:16:49 crc kubenswrapper[4760]: I0930 09:16:49.115339 4760 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"4a9ddef5425fff579d2f6273dac3ac3872c271566f3c6e2d32ae0cd93a2ffc46"} pod="openshift-machine-config-operator/machine-config-daemon-f2lrk" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Sep 30 09:16:49 crc kubenswrapper[4760]: I0930 09:16:49.115489 4760 kuberuntime_container.go:808] "Killing container with a grace 
period" pod="openshift-machine-config-operator/machine-config-daemon-f2lrk" podUID="7a9c8270-6964-4886-87d0-227b05b76da4" containerName="machine-config-daemon" containerID="cri-o://4a9ddef5425fff579d2f6273dac3ac3872c271566f3c6e2d32ae0cd93a2ffc46" gracePeriod=600 Sep 30 09:16:49 crc kubenswrapper[4760]: I0930 09:16:49.833792 4760 generic.go:334] "Generic (PLEG): container finished" podID="7a9c8270-6964-4886-87d0-227b05b76da4" containerID="4a9ddef5425fff579d2f6273dac3ac3872c271566f3c6e2d32ae0cd93a2ffc46" exitCode=0 Sep 30 09:16:49 crc kubenswrapper[4760]: I0930 09:16:49.834009 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-f2lrk" event={"ID":"7a9c8270-6964-4886-87d0-227b05b76da4","Type":"ContainerDied","Data":"4a9ddef5425fff579d2f6273dac3ac3872c271566f3c6e2d32ae0cd93a2ffc46"} Sep 30 09:16:49 crc kubenswrapper[4760]: I0930 09:16:49.834091 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-f2lrk" event={"ID":"7a9c8270-6964-4886-87d0-227b05b76da4","Type":"ContainerStarted","Data":"149756a888246643c052c2b04ebbc33cdfe99da6c87001e418b7e5ba856a4272"} Sep 30 09:16:49 crc kubenswrapper[4760]: I0930 09:16:49.834116 4760 scope.go:117] "RemoveContainer" containerID="505728dcf5cec3fc994c058239c0150ac7db870b33dd3a6ddbc1c1699cc9b079" Sep 30 09:17:00 crc kubenswrapper[4760]: I0930 09:17:00.944577 4760 generic.go:334] "Generic (PLEG): container finished" podID="81124952-71e6-4e83-9f31-423ee4cde133" containerID="cd48d0d93161a5653ba46822c329ddef4de9a7347e71432ce72258e2c257992e" exitCode=0 Sep 30 09:17:00 crc kubenswrapper[4760]: I0930 09:17:00.944841 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-8vlhv/crc-debug-s94t5" event={"ID":"81124952-71e6-4e83-9f31-423ee4cde133","Type":"ContainerDied","Data":"cd48d0d93161a5653ba46822c329ddef4de9a7347e71432ce72258e2c257992e"} Sep 30 09:17:02 crc kubenswrapper[4760]: I0930 
09:17:02.086639 4760 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-8vlhv/crc-debug-s94t5" Sep 30 09:17:02 crc kubenswrapper[4760]: I0930 09:17:02.118557 4760 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-8vlhv/crc-debug-s94t5"] Sep 30 09:17:02 crc kubenswrapper[4760]: I0930 09:17:02.126329 4760 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-8vlhv/crc-debug-s94t5"] Sep 30 09:17:02 crc kubenswrapper[4760]: I0930 09:17:02.174474 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s6bnb\" (UniqueName: \"kubernetes.io/projected/81124952-71e6-4e83-9f31-423ee4cde133-kube-api-access-s6bnb\") pod \"81124952-71e6-4e83-9f31-423ee4cde133\" (UID: \"81124952-71e6-4e83-9f31-423ee4cde133\") " Sep 30 09:17:02 crc kubenswrapper[4760]: I0930 09:17:02.174701 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/81124952-71e6-4e83-9f31-423ee4cde133-host\") pod \"81124952-71e6-4e83-9f31-423ee4cde133\" (UID: \"81124952-71e6-4e83-9f31-423ee4cde133\") " Sep 30 09:17:02 crc kubenswrapper[4760]: I0930 09:17:02.174745 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/81124952-71e6-4e83-9f31-423ee4cde133-host" (OuterVolumeSpecName: "host") pod "81124952-71e6-4e83-9f31-423ee4cde133" (UID: "81124952-71e6-4e83-9f31-423ee4cde133"). InnerVolumeSpecName "host". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Sep 30 09:17:02 crc kubenswrapper[4760]: I0930 09:17:02.175227 4760 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/81124952-71e6-4e83-9f31-423ee4cde133-host\") on node \"crc\" DevicePath \"\"" Sep 30 09:17:02 crc kubenswrapper[4760]: I0930 09:17:02.180590 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/81124952-71e6-4e83-9f31-423ee4cde133-kube-api-access-s6bnb" (OuterVolumeSpecName: "kube-api-access-s6bnb") pod "81124952-71e6-4e83-9f31-423ee4cde133" (UID: "81124952-71e6-4e83-9f31-423ee4cde133"). InnerVolumeSpecName "kube-api-access-s6bnb". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 09:17:02 crc kubenswrapper[4760]: I0930 09:17:02.277005 4760 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s6bnb\" (UniqueName: \"kubernetes.io/projected/81124952-71e6-4e83-9f31-423ee4cde133-kube-api-access-s6bnb\") on node \"crc\" DevicePath \"\"" Sep 30 09:17:02 crc kubenswrapper[4760]: I0930 09:17:02.972687 4760 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="290b4dbbce7156e6958a963dc0e49a5fabdfb946342497a7228f59fe2fd508f3" Sep 30 09:17:02 crc kubenswrapper[4760]: I0930 09:17:02.973025 4760 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-8vlhv/crc-debug-s94t5" Sep 30 09:17:03 crc kubenswrapper[4760]: I0930 09:17:03.079146 4760 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="81124952-71e6-4e83-9f31-423ee4cde133" path="/var/lib/kubelet/pods/81124952-71e6-4e83-9f31-423ee4cde133/volumes" Sep 30 09:17:03 crc kubenswrapper[4760]: I0930 09:17:03.319910 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-8vlhv/crc-debug-g7jrm"] Sep 30 09:17:03 crc kubenswrapper[4760]: E0930 09:17:03.320334 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="10799bad-eac6-4db4-86e0-377983dd439d" containerName="collect-profiles" Sep 30 09:17:03 crc kubenswrapper[4760]: I0930 09:17:03.320347 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="10799bad-eac6-4db4-86e0-377983dd439d" containerName="collect-profiles" Sep 30 09:17:03 crc kubenswrapper[4760]: E0930 09:17:03.320384 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="81124952-71e6-4e83-9f31-423ee4cde133" containerName="container-00" Sep 30 09:17:03 crc kubenswrapper[4760]: I0930 09:17:03.320389 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="81124952-71e6-4e83-9f31-423ee4cde133" containerName="container-00" Sep 30 09:17:03 crc kubenswrapper[4760]: I0930 09:17:03.320571 4760 memory_manager.go:354] "RemoveStaleState removing state" podUID="10799bad-eac6-4db4-86e0-377983dd439d" containerName="collect-profiles" Sep 30 09:17:03 crc kubenswrapper[4760]: I0930 09:17:03.320592 4760 memory_manager.go:354] "RemoveStaleState removing state" podUID="81124952-71e6-4e83-9f31-423ee4cde133" containerName="container-00" Sep 30 09:17:03 crc kubenswrapper[4760]: I0930 09:17:03.321421 4760 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-8vlhv/crc-debug-g7jrm" Sep 30 09:17:03 crc kubenswrapper[4760]: I0930 09:17:03.499855 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/74ba0c37-2d28-4cd5-b35c-9e5cab28aba3-host\") pod \"crc-debug-g7jrm\" (UID: \"74ba0c37-2d28-4cd5-b35c-9e5cab28aba3\") " pod="openshift-must-gather-8vlhv/crc-debug-g7jrm" Sep 30 09:17:03 crc kubenswrapper[4760]: I0930 09:17:03.500162 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jlvs5\" (UniqueName: \"kubernetes.io/projected/74ba0c37-2d28-4cd5-b35c-9e5cab28aba3-kube-api-access-jlvs5\") pod \"crc-debug-g7jrm\" (UID: \"74ba0c37-2d28-4cd5-b35c-9e5cab28aba3\") " pod="openshift-must-gather-8vlhv/crc-debug-g7jrm" Sep 30 09:17:03 crc kubenswrapper[4760]: I0930 09:17:03.603466 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jlvs5\" (UniqueName: \"kubernetes.io/projected/74ba0c37-2d28-4cd5-b35c-9e5cab28aba3-kube-api-access-jlvs5\") pod \"crc-debug-g7jrm\" (UID: \"74ba0c37-2d28-4cd5-b35c-9e5cab28aba3\") " pod="openshift-must-gather-8vlhv/crc-debug-g7jrm" Sep 30 09:17:03 crc kubenswrapper[4760]: I0930 09:17:03.603719 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/74ba0c37-2d28-4cd5-b35c-9e5cab28aba3-host\") pod \"crc-debug-g7jrm\" (UID: \"74ba0c37-2d28-4cd5-b35c-9e5cab28aba3\") " pod="openshift-must-gather-8vlhv/crc-debug-g7jrm" Sep 30 09:17:03 crc kubenswrapper[4760]: I0930 09:17:03.604057 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/74ba0c37-2d28-4cd5-b35c-9e5cab28aba3-host\") pod \"crc-debug-g7jrm\" (UID: \"74ba0c37-2d28-4cd5-b35c-9e5cab28aba3\") " pod="openshift-must-gather-8vlhv/crc-debug-g7jrm" Sep 30 09:17:03 crc 
kubenswrapper[4760]: I0930 09:17:03.624446 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jlvs5\" (UniqueName: \"kubernetes.io/projected/74ba0c37-2d28-4cd5-b35c-9e5cab28aba3-kube-api-access-jlvs5\") pod \"crc-debug-g7jrm\" (UID: \"74ba0c37-2d28-4cd5-b35c-9e5cab28aba3\") " pod="openshift-must-gather-8vlhv/crc-debug-g7jrm" Sep 30 09:17:03 crc kubenswrapper[4760]: I0930 09:17:03.644140 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-8vlhv/crc-debug-g7jrm" Sep 30 09:17:03 crc kubenswrapper[4760]: W0930 09:17:03.675463 4760 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod74ba0c37_2d28_4cd5_b35c_9e5cab28aba3.slice/crio-72479c2905c1d3298bc38bd548ada7d4eaafbcee96a89f9d2c23fde21f44a1e5 WatchSource:0}: Error finding container 72479c2905c1d3298bc38bd548ada7d4eaafbcee96a89f9d2c23fde21f44a1e5: Status 404 returned error can't find the container with id 72479c2905c1d3298bc38bd548ada7d4eaafbcee96a89f9d2c23fde21f44a1e5 Sep 30 09:17:03 crc kubenswrapper[4760]: I0930 09:17:03.988528 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-8vlhv/crc-debug-g7jrm" event={"ID":"74ba0c37-2d28-4cd5-b35c-9e5cab28aba3","Type":"ContainerStarted","Data":"6ed1c7f02d2f92a96b93d2fca1139367d337ffbee9185d71b74e77b99825fcb4"} Sep 30 09:17:03 crc kubenswrapper[4760]: I0930 09:17:03.988998 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-8vlhv/crc-debug-g7jrm" event={"ID":"74ba0c37-2d28-4cd5-b35c-9e5cab28aba3","Type":"ContainerStarted","Data":"72479c2905c1d3298bc38bd548ada7d4eaafbcee96a89f9d2c23fde21f44a1e5"} Sep 30 09:17:04 crc kubenswrapper[4760]: I0930 09:17:04.009395 4760 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-8vlhv/crc-debug-g7jrm" podStartSLOduration=1.009362748 podStartE2EDuration="1.009362748s" 
podCreationTimestamp="2025-09-30 09:17:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 09:17:04.007905241 +0000 UTC m=+6209.650811653" watchObservedRunningTime="2025-09-30 09:17:04.009362748 +0000 UTC m=+6209.652269220" Sep 30 09:17:05 crc kubenswrapper[4760]: I0930 09:17:05.001826 4760 generic.go:334] "Generic (PLEG): container finished" podID="74ba0c37-2d28-4cd5-b35c-9e5cab28aba3" containerID="6ed1c7f02d2f92a96b93d2fca1139367d337ffbee9185d71b74e77b99825fcb4" exitCode=0 Sep 30 09:17:05 crc kubenswrapper[4760]: I0930 09:17:05.002260 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-8vlhv/crc-debug-g7jrm" event={"ID":"74ba0c37-2d28-4cd5-b35c-9e5cab28aba3","Type":"ContainerDied","Data":"6ed1c7f02d2f92a96b93d2fca1139367d337ffbee9185d71b74e77b99825fcb4"} Sep 30 09:17:06 crc kubenswrapper[4760]: I0930 09:17:06.138028 4760 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-8vlhv/crc-debug-g7jrm" Sep 30 09:17:06 crc kubenswrapper[4760]: I0930 09:17:06.244569 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jlvs5\" (UniqueName: \"kubernetes.io/projected/74ba0c37-2d28-4cd5-b35c-9e5cab28aba3-kube-api-access-jlvs5\") pod \"74ba0c37-2d28-4cd5-b35c-9e5cab28aba3\" (UID: \"74ba0c37-2d28-4cd5-b35c-9e5cab28aba3\") " Sep 30 09:17:06 crc kubenswrapper[4760]: I0930 09:17:06.244989 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/74ba0c37-2d28-4cd5-b35c-9e5cab28aba3-host\") pod \"74ba0c37-2d28-4cd5-b35c-9e5cab28aba3\" (UID: \"74ba0c37-2d28-4cd5-b35c-9e5cab28aba3\") " Sep 30 09:17:06 crc kubenswrapper[4760]: I0930 09:17:06.245170 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/74ba0c37-2d28-4cd5-b35c-9e5cab28aba3-host" (OuterVolumeSpecName: "host") pod "74ba0c37-2d28-4cd5-b35c-9e5cab28aba3" (UID: "74ba0c37-2d28-4cd5-b35c-9e5cab28aba3"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Sep 30 09:17:06 crc kubenswrapper[4760]: I0930 09:17:06.247893 4760 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/74ba0c37-2d28-4cd5-b35c-9e5cab28aba3-host\") on node \"crc\" DevicePath \"\"" Sep 30 09:17:06 crc kubenswrapper[4760]: I0930 09:17:06.250432 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/74ba0c37-2d28-4cd5-b35c-9e5cab28aba3-kube-api-access-jlvs5" (OuterVolumeSpecName: "kube-api-access-jlvs5") pod "74ba0c37-2d28-4cd5-b35c-9e5cab28aba3" (UID: "74ba0c37-2d28-4cd5-b35c-9e5cab28aba3"). InnerVolumeSpecName "kube-api-access-jlvs5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 09:17:06 crc kubenswrapper[4760]: I0930 09:17:06.349347 4760 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jlvs5\" (UniqueName: \"kubernetes.io/projected/74ba0c37-2d28-4cd5-b35c-9e5cab28aba3-kube-api-access-jlvs5\") on node \"crc\" DevicePath \"\"" Sep 30 09:17:07 crc kubenswrapper[4760]: I0930 09:17:07.031223 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-8vlhv/crc-debug-g7jrm" event={"ID":"74ba0c37-2d28-4cd5-b35c-9e5cab28aba3","Type":"ContainerDied","Data":"72479c2905c1d3298bc38bd548ada7d4eaafbcee96a89f9d2c23fde21f44a1e5"} Sep 30 09:17:07 crc kubenswrapper[4760]: I0930 09:17:07.031268 4760 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="72479c2905c1d3298bc38bd548ada7d4eaafbcee96a89f9d2c23fde21f44a1e5" Sep 30 09:17:07 crc kubenswrapper[4760]: I0930 09:17:07.031354 4760 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-8vlhv/crc-debug-g7jrm" Sep 30 09:17:13 crc kubenswrapper[4760]: I0930 09:17:13.572748 4760 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-8vlhv/crc-debug-g7jrm"] Sep 30 09:17:13 crc kubenswrapper[4760]: I0930 09:17:13.580693 4760 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-8vlhv/crc-debug-g7jrm"] Sep 30 09:17:14 crc kubenswrapper[4760]: I0930 09:17:14.765431 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-8vlhv/crc-debug-hv4ss"] Sep 30 09:17:14 crc kubenswrapper[4760]: E0930 09:17:14.766204 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="74ba0c37-2d28-4cd5-b35c-9e5cab28aba3" containerName="container-00" Sep 30 09:17:14 crc kubenswrapper[4760]: I0930 09:17:14.766219 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="74ba0c37-2d28-4cd5-b35c-9e5cab28aba3" containerName="container-00" Sep 30 09:17:14 crc 
kubenswrapper[4760]: I0930 09:17:14.766500 4760 memory_manager.go:354] "RemoveStaleState removing state" podUID="74ba0c37-2d28-4cd5-b35c-9e5cab28aba3" containerName="container-00" Sep 30 09:17:14 crc kubenswrapper[4760]: I0930 09:17:14.767354 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-8vlhv/crc-debug-hv4ss" Sep 30 09:17:14 crc kubenswrapper[4760]: I0930 09:17:14.833739 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x7qvr\" (UniqueName: \"kubernetes.io/projected/99def55d-dfeb-4eea-ab3c-e2fb4ce88346-kube-api-access-x7qvr\") pod \"crc-debug-hv4ss\" (UID: \"99def55d-dfeb-4eea-ab3c-e2fb4ce88346\") " pod="openshift-must-gather-8vlhv/crc-debug-hv4ss" Sep 30 09:17:14 crc kubenswrapper[4760]: I0930 09:17:14.833860 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/99def55d-dfeb-4eea-ab3c-e2fb4ce88346-host\") pod \"crc-debug-hv4ss\" (UID: \"99def55d-dfeb-4eea-ab3c-e2fb4ce88346\") " pod="openshift-must-gather-8vlhv/crc-debug-hv4ss" Sep 30 09:17:14 crc kubenswrapper[4760]: I0930 09:17:14.935809 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/99def55d-dfeb-4eea-ab3c-e2fb4ce88346-host\") pod \"crc-debug-hv4ss\" (UID: \"99def55d-dfeb-4eea-ab3c-e2fb4ce88346\") " pod="openshift-must-gather-8vlhv/crc-debug-hv4ss" Sep 30 09:17:14 crc kubenswrapper[4760]: I0930 09:17:14.935968 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/99def55d-dfeb-4eea-ab3c-e2fb4ce88346-host\") pod \"crc-debug-hv4ss\" (UID: \"99def55d-dfeb-4eea-ab3c-e2fb4ce88346\") " pod="openshift-must-gather-8vlhv/crc-debug-hv4ss" Sep 30 09:17:14 crc kubenswrapper[4760]: I0930 09:17:14.936006 4760 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-x7qvr\" (UniqueName: \"kubernetes.io/projected/99def55d-dfeb-4eea-ab3c-e2fb4ce88346-kube-api-access-x7qvr\") pod \"crc-debug-hv4ss\" (UID: \"99def55d-dfeb-4eea-ab3c-e2fb4ce88346\") " pod="openshift-must-gather-8vlhv/crc-debug-hv4ss" Sep 30 09:17:14 crc kubenswrapper[4760]: I0930 09:17:14.955159 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x7qvr\" (UniqueName: \"kubernetes.io/projected/99def55d-dfeb-4eea-ab3c-e2fb4ce88346-kube-api-access-x7qvr\") pod \"crc-debug-hv4ss\" (UID: \"99def55d-dfeb-4eea-ab3c-e2fb4ce88346\") " pod="openshift-must-gather-8vlhv/crc-debug-hv4ss" Sep 30 09:17:15 crc kubenswrapper[4760]: I0930 09:17:15.080962 4760 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="74ba0c37-2d28-4cd5-b35c-9e5cab28aba3" path="/var/lib/kubelet/pods/74ba0c37-2d28-4cd5-b35c-9e5cab28aba3/volumes" Sep 30 09:17:15 crc kubenswrapper[4760]: I0930 09:17:15.094190 4760 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-8vlhv/crc-debug-hv4ss" Sep 30 09:17:16 crc kubenswrapper[4760]: I0930 09:17:16.112199 4760 generic.go:334] "Generic (PLEG): container finished" podID="99def55d-dfeb-4eea-ab3c-e2fb4ce88346" containerID="40128ef38c6517d6d8007edc46a8b3e20a2950a85b4c663a41a35aeeebf8d2ae" exitCode=0 Sep 30 09:17:16 crc kubenswrapper[4760]: I0930 09:17:16.112358 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-8vlhv/crc-debug-hv4ss" event={"ID":"99def55d-dfeb-4eea-ab3c-e2fb4ce88346","Type":"ContainerDied","Data":"40128ef38c6517d6d8007edc46a8b3e20a2950a85b4c663a41a35aeeebf8d2ae"} Sep 30 09:17:16 crc kubenswrapper[4760]: I0930 09:17:16.112614 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-8vlhv/crc-debug-hv4ss" event={"ID":"99def55d-dfeb-4eea-ab3c-e2fb4ce88346","Type":"ContainerStarted","Data":"3d109a6d1662bcf8c128c11baac20876746b2e90464a9f8b53a4a1a9d489f0e7"} Sep 30 09:17:16 crc kubenswrapper[4760]: I0930 09:17:16.157079 4760 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-8vlhv/crc-debug-hv4ss"] Sep 30 09:17:16 crc kubenswrapper[4760]: I0930 09:17:16.171209 4760 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-8vlhv/crc-debug-hv4ss"] Sep 30 09:17:17 crc kubenswrapper[4760]: I0930 09:17:17.227582 4760 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-8vlhv/crc-debug-hv4ss" Sep 30 09:17:17 crc kubenswrapper[4760]: I0930 09:17:17.282264 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/99def55d-dfeb-4eea-ab3c-e2fb4ce88346-host\") pod \"99def55d-dfeb-4eea-ab3c-e2fb4ce88346\" (UID: \"99def55d-dfeb-4eea-ab3c-e2fb4ce88346\") " Sep 30 09:17:17 crc kubenswrapper[4760]: I0930 09:17:17.282373 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/99def55d-dfeb-4eea-ab3c-e2fb4ce88346-host" (OuterVolumeSpecName: "host") pod "99def55d-dfeb-4eea-ab3c-e2fb4ce88346" (UID: "99def55d-dfeb-4eea-ab3c-e2fb4ce88346"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Sep 30 09:17:17 crc kubenswrapper[4760]: I0930 09:17:17.282409 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x7qvr\" (UniqueName: \"kubernetes.io/projected/99def55d-dfeb-4eea-ab3c-e2fb4ce88346-kube-api-access-x7qvr\") pod \"99def55d-dfeb-4eea-ab3c-e2fb4ce88346\" (UID: \"99def55d-dfeb-4eea-ab3c-e2fb4ce88346\") " Sep 30 09:17:17 crc kubenswrapper[4760]: I0930 09:17:17.282995 4760 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/99def55d-dfeb-4eea-ab3c-e2fb4ce88346-host\") on node \"crc\" DevicePath \"\"" Sep 30 09:17:17 crc kubenswrapper[4760]: I0930 09:17:17.291560 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/99def55d-dfeb-4eea-ab3c-e2fb4ce88346-kube-api-access-x7qvr" (OuterVolumeSpecName: "kube-api-access-x7qvr") pod "99def55d-dfeb-4eea-ab3c-e2fb4ce88346" (UID: "99def55d-dfeb-4eea-ab3c-e2fb4ce88346"). InnerVolumeSpecName "kube-api-access-x7qvr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 09:17:17 crc kubenswrapper[4760]: I0930 09:17:17.385204 4760 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x7qvr\" (UniqueName: \"kubernetes.io/projected/99def55d-dfeb-4eea-ab3c-e2fb4ce88346-kube-api-access-x7qvr\") on node \"crc\" DevicePath \"\"" Sep 30 09:17:17 crc kubenswrapper[4760]: I0930 09:17:17.783232 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_2be1a55d13f98b28512c5bf715627f2108773b920b5e68da320e70991fjxzrh_59175508-6983-4846-a813-05181244346d/util/0.log" Sep 30 09:17:17 crc kubenswrapper[4760]: I0930 09:17:17.945748 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_2be1a55d13f98b28512c5bf715627f2108773b920b5e68da320e70991fjxzrh_59175508-6983-4846-a813-05181244346d/util/0.log" Sep 30 09:17:17 crc kubenswrapper[4760]: I0930 09:17:17.952310 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_2be1a55d13f98b28512c5bf715627f2108773b920b5e68da320e70991fjxzrh_59175508-6983-4846-a813-05181244346d/pull/0.log" Sep 30 09:17:17 crc kubenswrapper[4760]: I0930 09:17:17.983601 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_2be1a55d13f98b28512c5bf715627f2108773b920b5e68da320e70991fjxzrh_59175508-6983-4846-a813-05181244346d/pull/0.log" Sep 30 09:17:18 crc kubenswrapper[4760]: I0930 09:17:18.129616 4760 scope.go:117] "RemoveContainer" containerID="40128ef38c6517d6d8007edc46a8b3e20a2950a85b4c663a41a35aeeebf8d2ae" Sep 30 09:17:18 crc kubenswrapper[4760]: I0930 09:17:18.129643 4760 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-8vlhv/crc-debug-hv4ss" Sep 30 09:17:18 crc kubenswrapper[4760]: I0930 09:17:18.159673 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_2be1a55d13f98b28512c5bf715627f2108773b920b5e68da320e70991fjxzrh_59175508-6983-4846-a813-05181244346d/pull/0.log" Sep 30 09:17:18 crc kubenswrapper[4760]: I0930 09:17:18.178087 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_2be1a55d13f98b28512c5bf715627f2108773b920b5e68da320e70991fjxzrh_59175508-6983-4846-a813-05181244346d/util/0.log" Sep 30 09:17:18 crc kubenswrapper[4760]: I0930 09:17:18.202697 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_2be1a55d13f98b28512c5bf715627f2108773b920b5e68da320e70991fjxzrh_59175508-6983-4846-a813-05181244346d/extract/0.log" Sep 30 09:17:18 crc kubenswrapper[4760]: I0930 09:17:18.327000 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-6ff8b75857-n528w_90fe11d3-6b6b-46c3-9833-d68d080144b9/kube-rbac-proxy/0.log" Sep 30 09:17:18 crc kubenswrapper[4760]: I0930 09:17:18.392887 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-6ff8b75857-n528w_90fe11d3-6b6b-46c3-9833-d68d080144b9/manager/0.log" Sep 30 09:17:18 crc kubenswrapper[4760]: I0930 09:17:18.430436 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-644bddb6d8-s4vf9_7fdb76d3-726a-416a-9b64-df2d6a67d88a/kube-rbac-proxy/0.log" Sep 30 09:17:18 crc kubenswrapper[4760]: I0930 09:17:18.541536 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-644bddb6d8-s4vf9_7fdb76d3-726a-416a-9b64-df2d6a67d88a/manager/0.log" Sep 30 09:17:18 crc kubenswrapper[4760]: I0930 09:17:18.603502 4760 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_designate-operator-controller-manager-84f4f7b77b-2zwsf_0673130a-0175-41b4-a8d8-188c7a39caa0/kube-rbac-proxy/0.log" Sep 30 09:17:18 crc kubenswrapper[4760]: I0930 09:17:18.688333 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-84f4f7b77b-2zwsf_0673130a-0175-41b4-a8d8-188c7a39caa0/manager/0.log" Sep 30 09:17:18 crc kubenswrapper[4760]: I0930 09:17:18.772151 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-84958c4d49-mcrx5_93eb25ad-5a9d-4044-ba79-8869b28787dd/kube-rbac-proxy/0.log" Sep 30 09:17:18 crc kubenswrapper[4760]: I0930 09:17:18.884656 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-84958c4d49-mcrx5_93eb25ad-5a9d-4044-ba79-8869b28787dd/manager/0.log" Sep 30 09:17:18 crc kubenswrapper[4760]: I0930 09:17:18.956709 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-5d889d78cf-7brm5_fff7432b-8ea3-4b35-8726-640f02bd8d58/kube-rbac-proxy/0.log" Sep 30 09:17:19 crc kubenswrapper[4760]: I0930 09:17:19.000282 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-5d889d78cf-7brm5_fff7432b-8ea3-4b35-8726-640f02bd8d58/manager/0.log" Sep 30 09:17:19 crc kubenswrapper[4760]: I0930 09:17:19.078555 4760 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="99def55d-dfeb-4eea-ab3c-e2fb4ce88346" path="/var/lib/kubelet/pods/99def55d-dfeb-4eea-ab3c-e2fb4ce88346/volumes" Sep 30 09:17:19 crc kubenswrapper[4760]: I0930 09:17:19.127520 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-9f4696d94-j5vp7_28a5f605-2c82-4747-8b3d-2704804e81ec/kube-rbac-proxy/0.log" Sep 30 09:17:19 crc kubenswrapper[4760]: I0930 09:17:19.212445 
4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-9f4696d94-j5vp7_28a5f605-2c82-4747-8b3d-2704804e81ec/manager/0.log" Sep 30 09:17:19 crc kubenswrapper[4760]: I0930 09:17:19.331571 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-7d857cc749-nqdfs_18445f5d-fbcd-4bdb-9b9b-ed15e8a6b75b/kube-rbac-proxy/0.log" Sep 30 09:17:19 crc kubenswrapper[4760]: I0930 09:17:19.457092 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-7975b88857-7d78g_626c03a1-0630-42af-a1c4-af6e2c3584a5/kube-rbac-proxy/0.log" Sep 30 09:17:19 crc kubenswrapper[4760]: I0930 09:17:19.522640 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-7d857cc749-nqdfs_18445f5d-fbcd-4bdb-9b9b-ed15e8a6b75b/manager/0.log" Sep 30 09:17:19 crc kubenswrapper[4760]: I0930 09:17:19.577810 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-7975b88857-7d78g_626c03a1-0630-42af-a1c4-af6e2c3584a5/manager/0.log" Sep 30 09:17:19 crc kubenswrapper[4760]: I0930 09:17:19.649505 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-5bd55b4bff-rvrn9_27eb71ec-2145-426e-86fd-f31166b969e8/kube-rbac-proxy/0.log" Sep 30 09:17:19 crc kubenswrapper[4760]: I0930 09:17:19.790098 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-5bd55b4bff-rvrn9_27eb71ec-2145-426e-86fd-f31166b969e8/manager/0.log" Sep 30 09:17:19 crc kubenswrapper[4760]: I0930 09:17:19.799096 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-6d68dbc695-5k4x2_26e8229c-cd7b-4eab-a36c-e94d5a367224/kube-rbac-proxy/0.log" Sep 30 09:17:19 crc 
kubenswrapper[4760]: I0930 09:17:19.854188 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-6d68dbc695-5k4x2_26e8229c-cd7b-4eab-a36c-e94d5a367224/manager/0.log" Sep 30 09:17:19 crc kubenswrapper[4760]: I0930 09:17:19.985387 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-88c7-69mqs_c4bab529-6936-4f18-b4c9-4d8202e1cf6a/manager/0.log" Sep 30 09:17:20 crc kubenswrapper[4760]: I0930 09:17:20.002421 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-88c7-69mqs_c4bab529-6936-4f18-b4c9-4d8202e1cf6a/kube-rbac-proxy/0.log" Sep 30 09:17:20 crc kubenswrapper[4760]: I0930 09:17:20.209692 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-64d7b59854-gnfhj_070b883a-da84-454e-a2d3-cc43fbf5251a/kube-rbac-proxy/0.log" Sep 30 09:17:20 crc kubenswrapper[4760]: I0930 09:17:20.216638 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-64d7b59854-gnfhj_070b883a-da84-454e-a2d3-cc43fbf5251a/manager/0.log" Sep 30 09:17:20 crc kubenswrapper[4760]: I0930 09:17:20.258585 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-c7c776c96-95qms_c0be2186-ebe8-4634-942e-fcf6f5c0fdf6/kube-rbac-proxy/0.log" Sep 30 09:17:20 crc kubenswrapper[4760]: I0930 09:17:20.479241 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-c7c776c96-95qms_c0be2186-ebe8-4634-942e-fcf6f5c0fdf6/manager/0.log" Sep 30 09:17:20 crc kubenswrapper[4760]: I0930 09:17:20.493251 4760 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-76fcc6dc7c-p2qg2_be15e869-eae3-4164-a9b3-ba2d16238186/kube-rbac-proxy/0.log" Sep 30 09:17:20 crc kubenswrapper[4760]: I0930 09:17:20.503628 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-76fcc6dc7c-p2qg2_be15e869-eae3-4164-a9b3-ba2d16238186/manager/0.log" Sep 30 09:17:20 crc kubenswrapper[4760]: I0930 09:17:20.650445 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-6d776955-qdj9t_7fac6c59-9344-46b8-b4ce-30b80c6a8b53/manager/0.log" Sep 30 09:17:20 crc kubenswrapper[4760]: I0930 09:17:20.661657 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-6d776955-qdj9t_7fac6c59-9344-46b8-b4ce-30b80c6a8b53/kube-rbac-proxy/0.log" Sep 30 09:17:20 crc kubenswrapper[4760]: I0930 09:17:20.803075 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-df6bd9948-rjq2r_fe420b73-f7ff-40e5-8b63-475e61942e3d/kube-rbac-proxy/0.log" Sep 30 09:17:20 crc kubenswrapper[4760]: I0930 09:17:20.911655 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-operator-799b749c5f-bxgqk_ecfaa27d-a6ba-432f-8a63-80706fcdf76a/kube-rbac-proxy/0.log" Sep 30 09:17:21 crc kubenswrapper[4760]: I0930 09:17:21.111629 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-index-hpsst_c076d5bf-bc69-4e76-b891-4d5c4387d68c/registry-server/0.log" Sep 30 09:17:21 crc kubenswrapper[4760]: I0930 09:17:21.144189 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-operator-799b749c5f-bxgqk_ecfaa27d-a6ba-432f-8a63-80706fcdf76a/operator/0.log" Sep 30 09:17:21 crc kubenswrapper[4760]: 
I0930 09:17:21.353289 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-9976ff44c-64gg8_8bc774be-38eb-4c0a-9c02-fb39c645cc28/kube-rbac-proxy/0.log" Sep 30 09:17:21 crc kubenswrapper[4760]: I0930 09:17:21.368340 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-9976ff44c-64gg8_8bc774be-38eb-4c0a-9c02-fb39c645cc28/manager/0.log" Sep 30 09:17:21 crc kubenswrapper[4760]: I0930 09:17:21.531620 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-589c58c6c-mzwdq_000713c9-22e2-4251-b81d-e1d47a48184e/kube-rbac-proxy/0.log" Sep 30 09:17:21 crc kubenswrapper[4760]: I0930 09:17:21.584424 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-589c58c6c-mzwdq_000713c9-22e2-4251-b81d-e1d47a48184e/manager/0.log" Sep 30 09:17:21 crc kubenswrapper[4760]: I0930 09:17:21.615970 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_rabbitmq-cluster-operator-manager-79d8469568-whhn2_f7777c80-60ad-47c2-a76a-002f99b89d61/operator/0.log" Sep 30 09:17:21 crc kubenswrapper[4760]: I0930 09:17:21.843775 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-bc7dc7bd9-kbsdn_7a562d30-ce00-4dca-9792-6687cf729825/manager/0.log" Sep 30 09:17:21 crc kubenswrapper[4760]: I0930 09:17:21.848669 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-bc7dc7bd9-kbsdn_7a562d30-ce00-4dca-9792-6687cf729825/kube-rbac-proxy/0.log" Sep 30 09:17:22 crc kubenswrapper[4760]: I0930 09:17:22.108868 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-f66b554c6-l4g6m_cd75a50f-b3a1-4bef-ac18-e574ef6815ec/kube-rbac-proxy/0.log" Sep 30 
09:17:22 crc kubenswrapper[4760]: I0930 09:17:22.126633 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-b8d54b5d7-wwngp_31832467-ab15-475b-a71b-7263e64cdff9/kube-rbac-proxy/0.log" Sep 30 09:17:22 crc kubenswrapper[4760]: I0930 09:17:22.223236 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-df6bd9948-rjq2r_fe420b73-f7ff-40e5-8b63-475e61942e3d/manager/0.log" Sep 30 09:17:22 crc kubenswrapper[4760]: I0930 09:17:22.301174 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-b8d54b5d7-wwngp_31832467-ab15-475b-a71b-7263e64cdff9/manager/0.log" Sep 30 09:17:22 crc kubenswrapper[4760]: I0930 09:17:22.337234 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-6c459b467f-xxhnp_08bd2560-a223-4d1d-abf6-cf3686f1ded2/kube-rbac-proxy/0.log" Sep 30 09:17:22 crc kubenswrapper[4760]: I0930 09:17:22.339642 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-f66b554c6-l4g6m_cd75a50f-b3a1-4bef-ac18-e574ef6815ec/manager/0.log" Sep 30 09:17:22 crc kubenswrapper[4760]: I0930 09:17:22.465155 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-6c459b467f-xxhnp_08bd2560-a223-4d1d-abf6-cf3686f1ded2/manager/0.log" Sep 30 09:17:37 crc kubenswrapper[4760]: I0930 09:17:37.121834 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-78cbb6b69f-tlrrq_a96a9516-5f80-4391-a1f2-f4b7531e65fa/control-plane-machine-set-operator/0.log" Sep 30 09:17:37 crc kubenswrapper[4760]: I0930 09:17:37.289030 4760 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-4lv9x_2fb43f32-6ad4-4450-8a05-80570020d5e8/kube-rbac-proxy/0.log" Sep 30 09:17:37 crc kubenswrapper[4760]: I0930 09:17:37.335787 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-4lv9x_2fb43f32-6ad4-4450-8a05-80570020d5e8/machine-api-operator/0.log" Sep 30 09:17:48 crc kubenswrapper[4760]: I0930 09:17:48.968779 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-5b446d88c5-7f49c_6c1a0dc3-5f08-4216-99c7-ef1889df0775/cert-manager-controller/0.log" Sep 30 09:17:49 crc kubenswrapper[4760]: I0930 09:17:49.084628 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-cainjector-7f985d654d-9kkx2_59941c26-7746-44e9-8453-21d64dbdb91b/cert-manager-cainjector/0.log" Sep 30 09:17:49 crc kubenswrapper[4760]: I0930 09:17:49.159849 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-webhook-5655c58dd6-zptcg_49a05254-e89a-4b7c-b128-0a50daab0f7d/cert-manager-webhook/0.log" Sep 30 09:18:01 crc kubenswrapper[4760]: I0930 09:18:01.220721 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-console-plugin-864bb6dfb5-9kztb_a755b893-8456-4ee5-88cd-6e38a665c659/nmstate-console-plugin/0.log" Sep 30 09:18:01 crc kubenswrapper[4760]: I0930 09:18:01.448109 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-handler-fqhzn_9c04978e-fe0a-4324-b6ce-b9b6b70bf305/nmstate-handler/0.log" Sep 30 09:18:01 crc kubenswrapper[4760]: I0930 09:18:01.482656 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-58fcddf996-lt9bl_5f924622-9974-450d-b3a1-bb5fc8100ad6/kube-rbac-proxy/0.log" Sep 30 09:18:01 crc kubenswrapper[4760]: I0930 09:18:01.534597 4760 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-nmstate_nmstate-metrics-58fcddf996-lt9bl_5f924622-9974-450d-b3a1-bb5fc8100ad6/nmstate-metrics/0.log" Sep 30 09:18:01 crc kubenswrapper[4760]: I0930 09:18:01.683862 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-operator-5d6f6cfd66-g68gd_40eb0a4f-5fde-42fa-a5c0-283ccab9a683/nmstate-operator/0.log" Sep 30 09:18:01 crc kubenswrapper[4760]: I0930 09:18:01.760143 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-webhook-6d689559c5-p594d_759b23d3-847f-4d3a-9141-5c2cfad8664b/nmstate-webhook/0.log" Sep 30 09:18:13 crc kubenswrapper[4760]: I0930 09:18:13.937083 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-ptpvl"] Sep 30 09:18:13 crc kubenswrapper[4760]: E0930 09:18:13.938287 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="99def55d-dfeb-4eea-ab3c-e2fb4ce88346" containerName="container-00" Sep 30 09:18:13 crc kubenswrapper[4760]: I0930 09:18:13.938327 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="99def55d-dfeb-4eea-ab3c-e2fb4ce88346" containerName="container-00" Sep 30 09:18:13 crc kubenswrapper[4760]: I0930 09:18:13.938624 4760 memory_manager.go:354] "RemoveStaleState removing state" podUID="99def55d-dfeb-4eea-ab3c-e2fb4ce88346" containerName="container-00" Sep 30 09:18:13 crc kubenswrapper[4760]: I0930 09:18:13.940656 4760 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-ptpvl" Sep 30 09:18:13 crc kubenswrapper[4760]: I0930 09:18:13.957454 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-ptpvl"] Sep 30 09:18:14 crc kubenswrapper[4760]: I0930 09:18:14.079197 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fb8ffd63-30ae-4174-951e-587b5d657cfc-catalog-content\") pod \"redhat-marketplace-ptpvl\" (UID: \"fb8ffd63-30ae-4174-951e-587b5d657cfc\") " pod="openshift-marketplace/redhat-marketplace-ptpvl" Sep 30 09:18:14 crc kubenswrapper[4760]: I0930 09:18:14.079719 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wzs8f\" (UniqueName: \"kubernetes.io/projected/fb8ffd63-30ae-4174-951e-587b5d657cfc-kube-api-access-wzs8f\") pod \"redhat-marketplace-ptpvl\" (UID: \"fb8ffd63-30ae-4174-951e-587b5d657cfc\") " pod="openshift-marketplace/redhat-marketplace-ptpvl" Sep 30 09:18:14 crc kubenswrapper[4760]: I0930 09:18:14.079791 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fb8ffd63-30ae-4174-951e-587b5d657cfc-utilities\") pod \"redhat-marketplace-ptpvl\" (UID: \"fb8ffd63-30ae-4174-951e-587b5d657cfc\") " pod="openshift-marketplace/redhat-marketplace-ptpvl" Sep 30 09:18:14 crc kubenswrapper[4760]: I0930 09:18:14.181588 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fb8ffd63-30ae-4174-951e-587b5d657cfc-catalog-content\") pod \"redhat-marketplace-ptpvl\" (UID: \"fb8ffd63-30ae-4174-951e-587b5d657cfc\") " pod="openshift-marketplace/redhat-marketplace-ptpvl" Sep 30 09:18:14 crc kubenswrapper[4760]: I0930 09:18:14.181687 4760 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-wzs8f\" (UniqueName: \"kubernetes.io/projected/fb8ffd63-30ae-4174-951e-587b5d657cfc-kube-api-access-wzs8f\") pod \"redhat-marketplace-ptpvl\" (UID: \"fb8ffd63-30ae-4174-951e-587b5d657cfc\") " pod="openshift-marketplace/redhat-marketplace-ptpvl" Sep 30 09:18:14 crc kubenswrapper[4760]: I0930 09:18:14.181721 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fb8ffd63-30ae-4174-951e-587b5d657cfc-utilities\") pod \"redhat-marketplace-ptpvl\" (UID: \"fb8ffd63-30ae-4174-951e-587b5d657cfc\") " pod="openshift-marketplace/redhat-marketplace-ptpvl" Sep 30 09:18:14 crc kubenswrapper[4760]: I0930 09:18:14.182785 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fb8ffd63-30ae-4174-951e-587b5d657cfc-utilities\") pod \"redhat-marketplace-ptpvl\" (UID: \"fb8ffd63-30ae-4174-951e-587b5d657cfc\") " pod="openshift-marketplace/redhat-marketplace-ptpvl" Sep 30 09:18:14 crc kubenswrapper[4760]: I0930 09:18:14.182882 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fb8ffd63-30ae-4174-951e-587b5d657cfc-catalog-content\") pod \"redhat-marketplace-ptpvl\" (UID: \"fb8ffd63-30ae-4174-951e-587b5d657cfc\") " pod="openshift-marketplace/redhat-marketplace-ptpvl" Sep 30 09:18:14 crc kubenswrapper[4760]: I0930 09:18:14.205596 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wzs8f\" (UniqueName: \"kubernetes.io/projected/fb8ffd63-30ae-4174-951e-587b5d657cfc-kube-api-access-wzs8f\") pod \"redhat-marketplace-ptpvl\" (UID: \"fb8ffd63-30ae-4174-951e-587b5d657cfc\") " pod="openshift-marketplace/redhat-marketplace-ptpvl" Sep 30 09:18:14 crc kubenswrapper[4760]: I0930 09:18:14.275703 4760 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-ptpvl" Sep 30 09:18:14 crc kubenswrapper[4760]: I0930 09:18:14.531776 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-stc4g"] Sep 30 09:18:14 crc kubenswrapper[4760]: I0930 09:18:14.534570 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-stc4g" Sep 30 09:18:14 crc kubenswrapper[4760]: I0930 09:18:14.547049 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-stc4g"] Sep 30 09:18:14 crc kubenswrapper[4760]: I0930 09:18:14.590962 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fsxlf\" (UniqueName: \"kubernetes.io/projected/9d4a0ebe-b5b4-498e-81b6-5e7062d6d151-kube-api-access-fsxlf\") pod \"certified-operators-stc4g\" (UID: \"9d4a0ebe-b5b4-498e-81b6-5e7062d6d151\") " pod="openshift-marketplace/certified-operators-stc4g" Sep 30 09:18:14 crc kubenswrapper[4760]: I0930 09:18:14.591053 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9d4a0ebe-b5b4-498e-81b6-5e7062d6d151-utilities\") pod \"certified-operators-stc4g\" (UID: \"9d4a0ebe-b5b4-498e-81b6-5e7062d6d151\") " pod="openshift-marketplace/certified-operators-stc4g" Sep 30 09:18:14 crc kubenswrapper[4760]: I0930 09:18:14.591118 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9d4a0ebe-b5b4-498e-81b6-5e7062d6d151-catalog-content\") pod \"certified-operators-stc4g\" (UID: \"9d4a0ebe-b5b4-498e-81b6-5e7062d6d151\") " pod="openshift-marketplace/certified-operators-stc4g" Sep 30 09:18:14 crc kubenswrapper[4760]: I0930 09:18:14.692837 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"kube-api-access-fsxlf\" (UniqueName: \"kubernetes.io/projected/9d4a0ebe-b5b4-498e-81b6-5e7062d6d151-kube-api-access-fsxlf\") pod \"certified-operators-stc4g\" (UID: \"9d4a0ebe-b5b4-498e-81b6-5e7062d6d151\") " pod="openshift-marketplace/certified-operators-stc4g" Sep 30 09:18:14 crc kubenswrapper[4760]: I0930 09:18:14.692945 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9d4a0ebe-b5b4-498e-81b6-5e7062d6d151-utilities\") pod \"certified-operators-stc4g\" (UID: \"9d4a0ebe-b5b4-498e-81b6-5e7062d6d151\") " pod="openshift-marketplace/certified-operators-stc4g" Sep 30 09:18:14 crc kubenswrapper[4760]: I0930 09:18:14.693007 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9d4a0ebe-b5b4-498e-81b6-5e7062d6d151-catalog-content\") pod \"certified-operators-stc4g\" (UID: \"9d4a0ebe-b5b4-498e-81b6-5e7062d6d151\") " pod="openshift-marketplace/certified-operators-stc4g" Sep 30 09:18:14 crc kubenswrapper[4760]: I0930 09:18:14.694018 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9d4a0ebe-b5b4-498e-81b6-5e7062d6d151-catalog-content\") pod \"certified-operators-stc4g\" (UID: \"9d4a0ebe-b5b4-498e-81b6-5e7062d6d151\") " pod="openshift-marketplace/certified-operators-stc4g" Sep 30 09:18:14 crc kubenswrapper[4760]: I0930 09:18:14.694134 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9d4a0ebe-b5b4-498e-81b6-5e7062d6d151-utilities\") pod \"certified-operators-stc4g\" (UID: \"9d4a0ebe-b5b4-498e-81b6-5e7062d6d151\") " pod="openshift-marketplace/certified-operators-stc4g" Sep 30 09:18:14 crc kubenswrapper[4760]: I0930 09:18:14.712918 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fsxlf\" (UniqueName: 
\"kubernetes.io/projected/9d4a0ebe-b5b4-498e-81b6-5e7062d6d151-kube-api-access-fsxlf\") pod \"certified-operators-stc4g\" (UID: \"9d4a0ebe-b5b4-498e-81b6-5e7062d6d151\") " pod="openshift-marketplace/certified-operators-stc4g" Sep 30 09:18:14 crc kubenswrapper[4760]: I0930 09:18:14.755674 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-ptpvl"] Sep 30 09:18:14 crc kubenswrapper[4760]: I0930 09:18:14.861631 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-stc4g" Sep 30 09:18:15 crc kubenswrapper[4760]: I0930 09:18:15.420085 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-stc4g"] Sep 30 09:18:15 crc kubenswrapper[4760]: I0930 09:18:15.692445 4760 generic.go:334] "Generic (PLEG): container finished" podID="fb8ffd63-30ae-4174-951e-587b5d657cfc" containerID="8a230c7d56d7d879360868c65df144fc32769468da8ceeb8199edac631dc5262" exitCode=0 Sep 30 09:18:15 crc kubenswrapper[4760]: I0930 09:18:15.692560 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ptpvl" event={"ID":"fb8ffd63-30ae-4174-951e-587b5d657cfc","Type":"ContainerDied","Data":"8a230c7d56d7d879360868c65df144fc32769468da8ceeb8199edac631dc5262"} Sep 30 09:18:15 crc kubenswrapper[4760]: I0930 09:18:15.692887 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ptpvl" event={"ID":"fb8ffd63-30ae-4174-951e-587b5d657cfc","Type":"ContainerStarted","Data":"113e5feb7de30fd4f2ccef0fa4dfd5d9680c8af8e21a91f80d839fa9ab78b94d"} Sep 30 09:18:15 crc kubenswrapper[4760]: I0930 09:18:15.694613 4760 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Sep 30 09:18:15 crc kubenswrapper[4760]: I0930 09:18:15.695203 4760 generic.go:334] "Generic (PLEG): container finished" podID="9d4a0ebe-b5b4-498e-81b6-5e7062d6d151" 
containerID="89675e08d7fe061bc5f890bdcba453bd8a8101b84601b534af70cb8e4277abba" exitCode=0 Sep 30 09:18:15 crc kubenswrapper[4760]: I0930 09:18:15.695244 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-stc4g" event={"ID":"9d4a0ebe-b5b4-498e-81b6-5e7062d6d151","Type":"ContainerDied","Data":"89675e08d7fe061bc5f890bdcba453bd8a8101b84601b534af70cb8e4277abba"} Sep 30 09:18:15 crc kubenswrapper[4760]: I0930 09:18:15.695284 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-stc4g" event={"ID":"9d4a0ebe-b5b4-498e-81b6-5e7062d6d151","Type":"ContainerStarted","Data":"29a0a541a87c13098bd916525ea99c5ba28c543f6b2f50f2d5b49648f5b4d0bf"} Sep 30 09:18:16 crc kubenswrapper[4760]: I0930 09:18:16.196656 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-5d688f5ffc-ldwsd_670bf4cb-7ea4-4ffb-af92-0f727878a518/kube-rbac-proxy/0.log" Sep 30 09:18:16 crc kubenswrapper[4760]: I0930 09:18:16.455355 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-5d688f5ffc-ldwsd_670bf4cb-7ea4-4ffb-af92-0f727878a518/controller/0.log" Sep 30 09:18:16 crc kubenswrapper[4760]: I0930 09:18:16.498851 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-tqbpr_213dbc76-320c-463f-8133-946be9ece565/cp-frr-files/0.log" Sep 30 09:18:16 crc kubenswrapper[4760]: I0930 09:18:16.689780 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-tqbpr_213dbc76-320c-463f-8133-946be9ece565/cp-reloader/0.log" Sep 30 09:18:16 crc kubenswrapper[4760]: I0930 09:18:16.702676 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-tqbpr_213dbc76-320c-463f-8133-946be9ece565/cp-frr-files/0.log" Sep 30 09:18:16 crc kubenswrapper[4760]: I0930 09:18:16.730907 4760 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_frr-k8s-tqbpr_213dbc76-320c-463f-8133-946be9ece565/cp-reloader/0.log" Sep 30 09:18:16 crc kubenswrapper[4760]: I0930 09:18:16.734596 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-tqbpr_213dbc76-320c-463f-8133-946be9ece565/cp-metrics/0.log" Sep 30 09:18:16 crc kubenswrapper[4760]: I0930 09:18:16.864619 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-tqbpr_213dbc76-320c-463f-8133-946be9ece565/cp-frr-files/0.log" Sep 30 09:18:16 crc kubenswrapper[4760]: I0930 09:18:16.883222 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-tqbpr_213dbc76-320c-463f-8133-946be9ece565/cp-reloader/0.log" Sep 30 09:18:16 crc kubenswrapper[4760]: I0930 09:18:16.923555 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-tqbpr_213dbc76-320c-463f-8133-946be9ece565/cp-metrics/0.log" Sep 30 09:18:16 crc kubenswrapper[4760]: I0930 09:18:16.965663 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-tqbpr_213dbc76-320c-463f-8133-946be9ece565/cp-metrics/0.log" Sep 30 09:18:17 crc kubenswrapper[4760]: I0930 09:18:17.088331 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-tqbpr_213dbc76-320c-463f-8133-946be9ece565/cp-frr-files/0.log" Sep 30 09:18:17 crc kubenswrapper[4760]: I0930 09:18:17.112437 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-tqbpr_213dbc76-320c-463f-8133-946be9ece565/cp-metrics/0.log" Sep 30 09:18:17 crc kubenswrapper[4760]: I0930 09:18:17.136185 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-tqbpr_213dbc76-320c-463f-8133-946be9ece565/cp-reloader/0.log" Sep 30 09:18:17 crc kubenswrapper[4760]: I0930 09:18:17.164497 4760 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_frr-k8s-tqbpr_213dbc76-320c-463f-8133-946be9ece565/controller/0.log" Sep 30 09:18:17 crc kubenswrapper[4760]: I0930 09:18:17.320885 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-tqbpr_213dbc76-320c-463f-8133-946be9ece565/frr-metrics/0.log" Sep 30 09:18:17 crc kubenswrapper[4760]: I0930 09:18:17.343869 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-tqbpr_213dbc76-320c-463f-8133-946be9ece565/kube-rbac-proxy/0.log" Sep 30 09:18:17 crc kubenswrapper[4760]: I0930 09:18:17.392282 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-tqbpr_213dbc76-320c-463f-8133-946be9ece565/kube-rbac-proxy-frr/0.log" Sep 30 09:18:17 crc kubenswrapper[4760]: I0930 09:18:17.559849 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-tqbpr_213dbc76-320c-463f-8133-946be9ece565/reloader/0.log" Sep 30 09:18:17 crc kubenswrapper[4760]: I0930 09:18:17.650738 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-webhook-server-5478bdb765-69f49_ffa35c7d-6788-4902-8863-7346389154cd/frr-k8s-webhook-server/0.log" Sep 30 09:18:17 crc kubenswrapper[4760]: I0930 09:18:17.724360 4760 generic.go:334] "Generic (PLEG): container finished" podID="fb8ffd63-30ae-4174-951e-587b5d657cfc" containerID="38476f9025da89838f5ba1d2fcbd8ec6f87c990f85226f705d3acf646a70d9b8" exitCode=0 Sep 30 09:18:17 crc kubenswrapper[4760]: I0930 09:18:17.724415 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ptpvl" event={"ID":"fb8ffd63-30ae-4174-951e-587b5d657cfc","Type":"ContainerDied","Data":"38476f9025da89838f5ba1d2fcbd8ec6f87c990f85226f705d3acf646a70d9b8"} Sep 30 09:18:17 crc kubenswrapper[4760]: I0930 09:18:17.726851 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-stc4g" 
event={"ID":"9d4a0ebe-b5b4-498e-81b6-5e7062d6d151","Type":"ContainerStarted","Data":"4de845206427af6afd20d1bea13bf90fc8856c5428d6234425b95415669984ba"} Sep 30 09:18:17 crc kubenswrapper[4760]: I0930 09:18:17.780665 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-controller-manager-7db464cf7c-k5lfr_900aa033-c62f-42f8-a964-9d0e113eca21/manager/0.log" Sep 30 09:18:17 crc kubenswrapper[4760]: I0930 09:18:17.943827 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-webhook-server-54c4f7bf85-ndtrq_be9587c7-9bbb-48ad-867a-1830129f24b3/webhook-server/0.log" Sep 30 09:18:18 crc kubenswrapper[4760]: I0930 09:18:18.065374 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-ldvzg_e89684e7-e01f-4427-9479-999c5f101902/kube-rbac-proxy/0.log" Sep 30 09:18:18 crc kubenswrapper[4760]: I0930 09:18:18.739234 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ptpvl" event={"ID":"fb8ffd63-30ae-4174-951e-587b5d657cfc","Type":"ContainerStarted","Data":"45eb51a662c5ec2f948710ecd933edea756c6f79412ed86faaac1a68d94ef9e9"} Sep 30 09:18:18 crc kubenswrapper[4760]: I0930 09:18:18.743860 4760 generic.go:334] "Generic (PLEG): container finished" podID="9d4a0ebe-b5b4-498e-81b6-5e7062d6d151" containerID="4de845206427af6afd20d1bea13bf90fc8856c5428d6234425b95415669984ba" exitCode=0 Sep 30 09:18:18 crc kubenswrapper[4760]: I0930 09:18:18.743903 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-stc4g" event={"ID":"9d4a0ebe-b5b4-498e-81b6-5e7062d6d151","Type":"ContainerDied","Data":"4de845206427af6afd20d1bea13bf90fc8856c5428d6234425b95415669984ba"} Sep 30 09:18:18 crc kubenswrapper[4760]: I0930 09:18:18.775352 4760 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-ptpvl" podStartSLOduration=3.204309782 
podStartE2EDuration="5.775320569s" podCreationTimestamp="2025-09-30 09:18:13 +0000 UTC" firstStartedPulling="2025-09-30 09:18:15.694206171 +0000 UTC m=+6281.337112593" lastFinishedPulling="2025-09-30 09:18:18.265216968 +0000 UTC m=+6283.908123380" observedRunningTime="2025-09-30 09:18:18.765830348 +0000 UTC m=+6284.408736760" watchObservedRunningTime="2025-09-30 09:18:18.775320569 +0000 UTC m=+6284.418226981" Sep 30 09:18:18 crc kubenswrapper[4760]: I0930 09:18:18.801557 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-ldvzg_e89684e7-e01f-4427-9479-999c5f101902/speaker/0.log" Sep 30 09:18:18 crc kubenswrapper[4760]: I0930 09:18:18.934037 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-tqbpr_213dbc76-320c-463f-8133-946be9ece565/frr/0.log" Sep 30 09:18:19 crc kubenswrapper[4760]: I0930 09:18:19.754515 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-stc4g" event={"ID":"9d4a0ebe-b5b4-498e-81b6-5e7062d6d151","Type":"ContainerStarted","Data":"dc9d3029d094f09c44be2002dfb0cdcec3314104d36699e3066bd908460a9e32"} Sep 30 09:18:19 crc kubenswrapper[4760]: I0930 09:18:19.781515 4760 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-stc4g" podStartSLOduration=2.250589529 podStartE2EDuration="5.781497175s" podCreationTimestamp="2025-09-30 09:18:14 +0000 UTC" firstStartedPulling="2025-09-30 09:18:15.696870569 +0000 UTC m=+6281.339776991" lastFinishedPulling="2025-09-30 09:18:19.227778225 +0000 UTC m=+6284.870684637" observedRunningTime="2025-09-30 09:18:19.77302125 +0000 UTC m=+6285.415927672" watchObservedRunningTime="2025-09-30 09:18:19.781497175 +0000 UTC m=+6285.424403587" Sep 30 09:18:24 crc kubenswrapper[4760]: I0930 09:18:24.275853 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-ptpvl" Sep 30 09:18:24 crc 
kubenswrapper[4760]: I0930 09:18:24.276338 4760 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-ptpvl" Sep 30 09:18:24 crc kubenswrapper[4760]: I0930 09:18:24.357916 4760 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-ptpvl" Sep 30 09:18:24 crc kubenswrapper[4760]: I0930 09:18:24.862753 4760 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-stc4g" Sep 30 09:18:24 crc kubenswrapper[4760]: I0930 09:18:24.865767 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-stc4g" Sep 30 09:18:24 crc kubenswrapper[4760]: I0930 09:18:24.895030 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-ptpvl" Sep 30 09:18:24 crc kubenswrapper[4760]: I0930 09:18:24.929260 4760 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-stc4g" Sep 30 09:18:24 crc kubenswrapper[4760]: I0930 09:18:24.965493 4760 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-ptpvl"] Sep 30 09:18:25 crc kubenswrapper[4760]: I0930 09:18:25.874356 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-stc4g" Sep 30 09:18:26 crc kubenswrapper[4760]: I0930 09:18:26.832337 4760 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-ptpvl" podUID="fb8ffd63-30ae-4174-951e-587b5d657cfc" containerName="registry-server" containerID="cri-o://45eb51a662c5ec2f948710ecd933edea756c6f79412ed86faaac1a68d94ef9e9" gracePeriod=2 Sep 30 09:18:27 crc kubenswrapper[4760]: I0930 09:18:27.303262 4760 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-ptpvl" Sep 30 09:18:27 crc kubenswrapper[4760]: I0930 09:18:27.329552 4760 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-stc4g"] Sep 30 09:18:27 crc kubenswrapper[4760]: I0930 09:18:27.489809 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fb8ffd63-30ae-4174-951e-587b5d657cfc-catalog-content\") pod \"fb8ffd63-30ae-4174-951e-587b5d657cfc\" (UID: \"fb8ffd63-30ae-4174-951e-587b5d657cfc\") " Sep 30 09:18:27 crc kubenswrapper[4760]: I0930 09:18:27.489875 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wzs8f\" (UniqueName: \"kubernetes.io/projected/fb8ffd63-30ae-4174-951e-587b5d657cfc-kube-api-access-wzs8f\") pod \"fb8ffd63-30ae-4174-951e-587b5d657cfc\" (UID: \"fb8ffd63-30ae-4174-951e-587b5d657cfc\") " Sep 30 09:18:27 crc kubenswrapper[4760]: I0930 09:18:27.489909 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fb8ffd63-30ae-4174-951e-587b5d657cfc-utilities\") pod \"fb8ffd63-30ae-4174-951e-587b5d657cfc\" (UID: \"fb8ffd63-30ae-4174-951e-587b5d657cfc\") " Sep 30 09:18:27 crc kubenswrapper[4760]: I0930 09:18:27.490809 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fb8ffd63-30ae-4174-951e-587b5d657cfc-utilities" (OuterVolumeSpecName: "utilities") pod "fb8ffd63-30ae-4174-951e-587b5d657cfc" (UID: "fb8ffd63-30ae-4174-951e-587b5d657cfc"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 09:18:27 crc kubenswrapper[4760]: I0930 09:18:27.502433 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fb8ffd63-30ae-4174-951e-587b5d657cfc-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "fb8ffd63-30ae-4174-951e-587b5d657cfc" (UID: "fb8ffd63-30ae-4174-951e-587b5d657cfc"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 09:18:27 crc kubenswrapper[4760]: I0930 09:18:27.505070 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fb8ffd63-30ae-4174-951e-587b5d657cfc-kube-api-access-wzs8f" (OuterVolumeSpecName: "kube-api-access-wzs8f") pod "fb8ffd63-30ae-4174-951e-587b5d657cfc" (UID: "fb8ffd63-30ae-4174-951e-587b5d657cfc"). InnerVolumeSpecName "kube-api-access-wzs8f". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 09:18:27 crc kubenswrapper[4760]: I0930 09:18:27.592526 4760 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fb8ffd63-30ae-4174-951e-587b5d657cfc-catalog-content\") on node \"crc\" DevicePath \"\"" Sep 30 09:18:27 crc kubenswrapper[4760]: I0930 09:18:27.592576 4760 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wzs8f\" (UniqueName: \"kubernetes.io/projected/fb8ffd63-30ae-4174-951e-587b5d657cfc-kube-api-access-wzs8f\") on node \"crc\" DevicePath \"\"" Sep 30 09:18:27 crc kubenswrapper[4760]: I0930 09:18:27.592591 4760 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fb8ffd63-30ae-4174-951e-587b5d657cfc-utilities\") on node \"crc\" DevicePath \"\"" Sep 30 09:18:27 crc kubenswrapper[4760]: I0930 09:18:27.844267 4760 generic.go:334] "Generic (PLEG): container finished" podID="fb8ffd63-30ae-4174-951e-587b5d657cfc" 
containerID="45eb51a662c5ec2f948710ecd933edea756c6f79412ed86faaac1a68d94ef9e9" exitCode=0 Sep 30 09:18:27 crc kubenswrapper[4760]: I0930 09:18:27.844346 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ptpvl" event={"ID":"fb8ffd63-30ae-4174-951e-587b5d657cfc","Type":"ContainerDied","Data":"45eb51a662c5ec2f948710ecd933edea756c6f79412ed86faaac1a68d94ef9e9"} Sep 30 09:18:27 crc kubenswrapper[4760]: I0930 09:18:27.844789 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ptpvl" event={"ID":"fb8ffd63-30ae-4174-951e-587b5d657cfc","Type":"ContainerDied","Data":"113e5feb7de30fd4f2ccef0fa4dfd5d9680c8af8e21a91f80d839fa9ab78b94d"} Sep 30 09:18:27 crc kubenswrapper[4760]: I0930 09:18:27.844821 4760 scope.go:117] "RemoveContainer" containerID="45eb51a662c5ec2f948710ecd933edea756c6f79412ed86faaac1a68d94ef9e9" Sep 30 09:18:27 crc kubenswrapper[4760]: I0930 09:18:27.844376 4760 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-ptpvl" Sep 30 09:18:27 crc kubenswrapper[4760]: I0930 09:18:27.888442 4760 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-ptpvl"] Sep 30 09:18:27 crc kubenswrapper[4760]: I0930 09:18:27.891179 4760 scope.go:117] "RemoveContainer" containerID="38476f9025da89838f5ba1d2fcbd8ec6f87c990f85226f705d3acf646a70d9b8" Sep 30 09:18:27 crc kubenswrapper[4760]: I0930 09:18:27.922722 4760 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-ptpvl"] Sep 30 09:18:27 crc kubenswrapper[4760]: I0930 09:18:27.938021 4760 scope.go:117] "RemoveContainer" containerID="8a230c7d56d7d879360868c65df144fc32769468da8ceeb8199edac631dc5262" Sep 30 09:18:27 crc kubenswrapper[4760]: I0930 09:18:27.965846 4760 scope.go:117] "RemoveContainer" containerID="45eb51a662c5ec2f948710ecd933edea756c6f79412ed86faaac1a68d94ef9e9" Sep 30 09:18:27 crc kubenswrapper[4760]: E0930 09:18:27.966417 4760 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"45eb51a662c5ec2f948710ecd933edea756c6f79412ed86faaac1a68d94ef9e9\": container with ID starting with 45eb51a662c5ec2f948710ecd933edea756c6f79412ed86faaac1a68d94ef9e9 not found: ID does not exist" containerID="45eb51a662c5ec2f948710ecd933edea756c6f79412ed86faaac1a68d94ef9e9" Sep 30 09:18:27 crc kubenswrapper[4760]: I0930 09:18:27.966447 4760 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"45eb51a662c5ec2f948710ecd933edea756c6f79412ed86faaac1a68d94ef9e9"} err="failed to get container status \"45eb51a662c5ec2f948710ecd933edea756c6f79412ed86faaac1a68d94ef9e9\": rpc error: code = NotFound desc = could not find container \"45eb51a662c5ec2f948710ecd933edea756c6f79412ed86faaac1a68d94ef9e9\": container with ID starting with 45eb51a662c5ec2f948710ecd933edea756c6f79412ed86faaac1a68d94ef9e9 not found: 
ID does not exist" Sep 30 09:18:27 crc kubenswrapper[4760]: I0930 09:18:27.966467 4760 scope.go:117] "RemoveContainer" containerID="38476f9025da89838f5ba1d2fcbd8ec6f87c990f85226f705d3acf646a70d9b8" Sep 30 09:18:27 crc kubenswrapper[4760]: E0930 09:18:27.966746 4760 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"38476f9025da89838f5ba1d2fcbd8ec6f87c990f85226f705d3acf646a70d9b8\": container with ID starting with 38476f9025da89838f5ba1d2fcbd8ec6f87c990f85226f705d3acf646a70d9b8 not found: ID does not exist" containerID="38476f9025da89838f5ba1d2fcbd8ec6f87c990f85226f705d3acf646a70d9b8" Sep 30 09:18:27 crc kubenswrapper[4760]: I0930 09:18:27.966766 4760 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"38476f9025da89838f5ba1d2fcbd8ec6f87c990f85226f705d3acf646a70d9b8"} err="failed to get container status \"38476f9025da89838f5ba1d2fcbd8ec6f87c990f85226f705d3acf646a70d9b8\": rpc error: code = NotFound desc = could not find container \"38476f9025da89838f5ba1d2fcbd8ec6f87c990f85226f705d3acf646a70d9b8\": container with ID starting with 38476f9025da89838f5ba1d2fcbd8ec6f87c990f85226f705d3acf646a70d9b8 not found: ID does not exist" Sep 30 09:18:27 crc kubenswrapper[4760]: I0930 09:18:27.966779 4760 scope.go:117] "RemoveContainer" containerID="8a230c7d56d7d879360868c65df144fc32769468da8ceeb8199edac631dc5262" Sep 30 09:18:27 crc kubenswrapper[4760]: E0930 09:18:27.967032 4760 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8a230c7d56d7d879360868c65df144fc32769468da8ceeb8199edac631dc5262\": container with ID starting with 8a230c7d56d7d879360868c65df144fc32769468da8ceeb8199edac631dc5262 not found: ID does not exist" containerID="8a230c7d56d7d879360868c65df144fc32769468da8ceeb8199edac631dc5262" Sep 30 09:18:27 crc kubenswrapper[4760]: I0930 09:18:27.967053 4760 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8a230c7d56d7d879360868c65df144fc32769468da8ceeb8199edac631dc5262"} err="failed to get container status \"8a230c7d56d7d879360868c65df144fc32769468da8ceeb8199edac631dc5262\": rpc error: code = NotFound desc = could not find container \"8a230c7d56d7d879360868c65df144fc32769468da8ceeb8199edac631dc5262\": container with ID starting with 8a230c7d56d7d879360868c65df144fc32769468da8ceeb8199edac631dc5262 not found: ID does not exist" Sep 30 09:18:28 crc kubenswrapper[4760]: I0930 09:18:28.855358 4760 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-stc4g" podUID="9d4a0ebe-b5b4-498e-81b6-5e7062d6d151" containerName="registry-server" containerID="cri-o://dc9d3029d094f09c44be2002dfb0cdcec3314104d36699e3066bd908460a9e32" gracePeriod=2 Sep 30 09:18:29 crc kubenswrapper[4760]: I0930 09:18:29.080961 4760 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fb8ffd63-30ae-4174-951e-587b5d657cfc" path="/var/lib/kubelet/pods/fb8ffd63-30ae-4174-951e-587b5d657cfc/volumes" Sep 30 09:18:29 crc kubenswrapper[4760]: I0930 09:18:29.326562 4760 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-stc4g" Sep 30 09:18:29 crc kubenswrapper[4760]: I0930 09:18:29.427144 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9d4a0ebe-b5b4-498e-81b6-5e7062d6d151-catalog-content\") pod \"9d4a0ebe-b5b4-498e-81b6-5e7062d6d151\" (UID: \"9d4a0ebe-b5b4-498e-81b6-5e7062d6d151\") " Sep 30 09:18:29 crc kubenswrapper[4760]: I0930 09:18:29.427461 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9d4a0ebe-b5b4-498e-81b6-5e7062d6d151-utilities\") pod \"9d4a0ebe-b5b4-498e-81b6-5e7062d6d151\" (UID: \"9d4a0ebe-b5b4-498e-81b6-5e7062d6d151\") " Sep 30 09:18:29 crc kubenswrapper[4760]: I0930 09:18:29.427762 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fsxlf\" (UniqueName: \"kubernetes.io/projected/9d4a0ebe-b5b4-498e-81b6-5e7062d6d151-kube-api-access-fsxlf\") pod \"9d4a0ebe-b5b4-498e-81b6-5e7062d6d151\" (UID: \"9d4a0ebe-b5b4-498e-81b6-5e7062d6d151\") " Sep 30 09:18:29 crc kubenswrapper[4760]: I0930 09:18:29.428262 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9d4a0ebe-b5b4-498e-81b6-5e7062d6d151-utilities" (OuterVolumeSpecName: "utilities") pod "9d4a0ebe-b5b4-498e-81b6-5e7062d6d151" (UID: "9d4a0ebe-b5b4-498e-81b6-5e7062d6d151"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 09:18:29 crc kubenswrapper[4760]: I0930 09:18:29.428751 4760 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9d4a0ebe-b5b4-498e-81b6-5e7062d6d151-utilities\") on node \"crc\" DevicePath \"\"" Sep 30 09:18:29 crc kubenswrapper[4760]: I0930 09:18:29.450486 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d4a0ebe-b5b4-498e-81b6-5e7062d6d151-kube-api-access-fsxlf" (OuterVolumeSpecName: "kube-api-access-fsxlf") pod "9d4a0ebe-b5b4-498e-81b6-5e7062d6d151" (UID: "9d4a0ebe-b5b4-498e-81b6-5e7062d6d151"). InnerVolumeSpecName "kube-api-access-fsxlf". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 09:18:29 crc kubenswrapper[4760]: I0930 09:18:29.475336 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9d4a0ebe-b5b4-498e-81b6-5e7062d6d151-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "9d4a0ebe-b5b4-498e-81b6-5e7062d6d151" (UID: "9d4a0ebe-b5b4-498e-81b6-5e7062d6d151"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 09:18:29 crc kubenswrapper[4760]: I0930 09:18:29.530841 4760 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fsxlf\" (UniqueName: \"kubernetes.io/projected/9d4a0ebe-b5b4-498e-81b6-5e7062d6d151-kube-api-access-fsxlf\") on node \"crc\" DevicePath \"\"" Sep 30 09:18:29 crc kubenswrapper[4760]: I0930 09:18:29.530890 4760 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9d4a0ebe-b5b4-498e-81b6-5e7062d6d151-catalog-content\") on node \"crc\" DevicePath \"\"" Sep 30 09:18:29 crc kubenswrapper[4760]: I0930 09:18:29.869039 4760 generic.go:334] "Generic (PLEG): container finished" podID="9d4a0ebe-b5b4-498e-81b6-5e7062d6d151" containerID="dc9d3029d094f09c44be2002dfb0cdcec3314104d36699e3066bd908460a9e32" exitCode=0 Sep 30 09:18:29 crc kubenswrapper[4760]: I0930 09:18:29.869086 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-stc4g" event={"ID":"9d4a0ebe-b5b4-498e-81b6-5e7062d6d151","Type":"ContainerDied","Data":"dc9d3029d094f09c44be2002dfb0cdcec3314104d36699e3066bd908460a9e32"} Sep 30 09:18:29 crc kubenswrapper[4760]: I0930 09:18:29.869174 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-stc4g" event={"ID":"9d4a0ebe-b5b4-498e-81b6-5e7062d6d151","Type":"ContainerDied","Data":"29a0a541a87c13098bd916525ea99c5ba28c543f6b2f50f2d5b49648f5b4d0bf"} Sep 30 09:18:29 crc kubenswrapper[4760]: I0930 09:18:29.869214 4760 scope.go:117] "RemoveContainer" containerID="dc9d3029d094f09c44be2002dfb0cdcec3314104d36699e3066bd908460a9e32" Sep 30 09:18:29 crc kubenswrapper[4760]: I0930 09:18:29.869126 4760 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-stc4g" Sep 30 09:18:29 crc kubenswrapper[4760]: I0930 09:18:29.894098 4760 scope.go:117] "RemoveContainer" containerID="4de845206427af6afd20d1bea13bf90fc8856c5428d6234425b95415669984ba" Sep 30 09:18:29 crc kubenswrapper[4760]: I0930 09:18:29.919136 4760 scope.go:117] "RemoveContainer" containerID="89675e08d7fe061bc5f890bdcba453bd8a8101b84601b534af70cb8e4277abba" Sep 30 09:18:29 crc kubenswrapper[4760]: I0930 09:18:29.927805 4760 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-stc4g"] Sep 30 09:18:29 crc kubenswrapper[4760]: I0930 09:18:29.935112 4760 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-stc4g"] Sep 30 09:18:30 crc kubenswrapper[4760]: I0930 09:18:30.013211 4760 scope.go:117] "RemoveContainer" containerID="dc9d3029d094f09c44be2002dfb0cdcec3314104d36699e3066bd908460a9e32" Sep 30 09:18:30 crc kubenswrapper[4760]: E0930 09:18:30.014946 4760 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dc9d3029d094f09c44be2002dfb0cdcec3314104d36699e3066bd908460a9e32\": container with ID starting with dc9d3029d094f09c44be2002dfb0cdcec3314104d36699e3066bd908460a9e32 not found: ID does not exist" containerID="dc9d3029d094f09c44be2002dfb0cdcec3314104d36699e3066bd908460a9e32" Sep 30 09:18:30 crc kubenswrapper[4760]: I0930 09:18:30.015002 4760 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dc9d3029d094f09c44be2002dfb0cdcec3314104d36699e3066bd908460a9e32"} err="failed to get container status \"dc9d3029d094f09c44be2002dfb0cdcec3314104d36699e3066bd908460a9e32\": rpc error: code = NotFound desc = could not find container \"dc9d3029d094f09c44be2002dfb0cdcec3314104d36699e3066bd908460a9e32\": container with ID starting with dc9d3029d094f09c44be2002dfb0cdcec3314104d36699e3066bd908460a9e32 not 
found: ID does not exist" Sep 30 09:18:30 crc kubenswrapper[4760]: I0930 09:18:30.015035 4760 scope.go:117] "RemoveContainer" containerID="4de845206427af6afd20d1bea13bf90fc8856c5428d6234425b95415669984ba" Sep 30 09:18:30 crc kubenswrapper[4760]: E0930 09:18:30.015558 4760 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4de845206427af6afd20d1bea13bf90fc8856c5428d6234425b95415669984ba\": container with ID starting with 4de845206427af6afd20d1bea13bf90fc8856c5428d6234425b95415669984ba not found: ID does not exist" containerID="4de845206427af6afd20d1bea13bf90fc8856c5428d6234425b95415669984ba" Sep 30 09:18:30 crc kubenswrapper[4760]: I0930 09:18:30.015602 4760 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4de845206427af6afd20d1bea13bf90fc8856c5428d6234425b95415669984ba"} err="failed to get container status \"4de845206427af6afd20d1bea13bf90fc8856c5428d6234425b95415669984ba\": rpc error: code = NotFound desc = could not find container \"4de845206427af6afd20d1bea13bf90fc8856c5428d6234425b95415669984ba\": container with ID starting with 4de845206427af6afd20d1bea13bf90fc8856c5428d6234425b95415669984ba not found: ID does not exist" Sep 30 09:18:30 crc kubenswrapper[4760]: I0930 09:18:30.015630 4760 scope.go:117] "RemoveContainer" containerID="89675e08d7fe061bc5f890bdcba453bd8a8101b84601b534af70cb8e4277abba" Sep 30 09:18:30 crc kubenswrapper[4760]: E0930 09:18:30.015892 4760 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"89675e08d7fe061bc5f890bdcba453bd8a8101b84601b534af70cb8e4277abba\": container with ID starting with 89675e08d7fe061bc5f890bdcba453bd8a8101b84601b534af70cb8e4277abba not found: ID does not exist" containerID="89675e08d7fe061bc5f890bdcba453bd8a8101b84601b534af70cb8e4277abba" Sep 30 09:18:30 crc kubenswrapper[4760]: I0930 09:18:30.015922 4760 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"89675e08d7fe061bc5f890bdcba453bd8a8101b84601b534af70cb8e4277abba"} err="failed to get container status \"89675e08d7fe061bc5f890bdcba453bd8a8101b84601b534af70cb8e4277abba\": rpc error: code = NotFound desc = could not find container \"89675e08d7fe061bc5f890bdcba453bd8a8101b84601b534af70cb8e4277abba\": container with ID starting with 89675e08d7fe061bc5f890bdcba453bd8a8101b84601b534af70cb8e4277abba not found: ID does not exist" Sep 30 09:18:31 crc kubenswrapper[4760]: I0930 09:18:31.077095 4760 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9d4a0ebe-b5b4-498e-81b6-5e7062d6d151" path="/var/lib/kubelet/pods/9d4a0ebe-b5b4-498e-81b6-5e7062d6d151/volumes" Sep 30 09:18:31 crc kubenswrapper[4760]: I0930 09:18:31.196854 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bc9zdf8_309b7e9d-3273-4a4c-865d-9287bab3988f/util/0.log" Sep 30 09:18:31 crc kubenswrapper[4760]: I0930 09:18:31.531756 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bc9zdf8_309b7e9d-3273-4a4c-865d-9287bab3988f/util/0.log" Sep 30 09:18:31 crc kubenswrapper[4760]: I0930 09:18:31.535531 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bc9zdf8_309b7e9d-3273-4a4c-865d-9287bab3988f/pull/0.log" Sep 30 09:18:31 crc kubenswrapper[4760]: I0930 09:18:31.544316 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bc9zdf8_309b7e9d-3273-4a4c-865d-9287bab3988f/pull/0.log" Sep 30 09:18:31 crc kubenswrapper[4760]: I0930 09:18:31.720916 4760 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bc9zdf8_309b7e9d-3273-4a4c-865d-9287bab3988f/util/0.log" Sep 30 09:18:31 crc kubenswrapper[4760]: I0930 09:18:31.730971 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bc9zdf8_309b7e9d-3273-4a4c-865d-9287bab3988f/pull/0.log" Sep 30 09:18:31 crc kubenswrapper[4760]: I0930 09:18:31.753528 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bc9zdf8_309b7e9d-3273-4a4c-865d-9287bab3988f/extract/0.log" Sep 30 09:18:31 crc kubenswrapper[4760]: I0930 09:18:31.956009 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dzf76f_32cb0fa2-d830-4589-8379-418cf93913d5/util/0.log" Sep 30 09:18:32 crc kubenswrapper[4760]: I0930 09:18:32.122108 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dzf76f_32cb0fa2-d830-4589-8379-418cf93913d5/util/0.log" Sep 30 09:18:32 crc kubenswrapper[4760]: I0930 09:18:32.129055 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dzf76f_32cb0fa2-d830-4589-8379-418cf93913d5/pull/0.log" Sep 30 09:18:32 crc kubenswrapper[4760]: I0930 09:18:32.134148 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dzf76f_32cb0fa2-d830-4589-8379-418cf93913d5/pull/0.log" Sep 30 09:18:32 crc kubenswrapper[4760]: I0930 09:18:32.327001 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dzf76f_32cb0fa2-d830-4589-8379-418cf93913d5/pull/0.log" Sep 30 
09:18:32 crc kubenswrapper[4760]: I0930 09:18:32.352992 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dzf76f_32cb0fa2-d830-4589-8379-418cf93913d5/extract/0.log" Sep 30 09:18:32 crc kubenswrapper[4760]: I0930 09:18:32.380577 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dzf76f_32cb0fa2-d830-4589-8379-418cf93913d5/util/0.log" Sep 30 09:18:32 crc kubenswrapper[4760]: I0930 09:18:32.522370 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-nkbp8_3f9a39d0-3377-49d8-b54f-6cfac198199f/extract-utilities/0.log" Sep 30 09:18:32 crc kubenswrapper[4760]: I0930 09:18:32.813950 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-nkbp8_3f9a39d0-3377-49d8-b54f-6cfac198199f/extract-utilities/0.log" Sep 30 09:18:32 crc kubenswrapper[4760]: I0930 09:18:32.815271 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-nkbp8_3f9a39d0-3377-49d8-b54f-6cfac198199f/extract-content/0.log" Sep 30 09:18:32 crc kubenswrapper[4760]: I0930 09:18:32.862655 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-nkbp8_3f9a39d0-3377-49d8-b54f-6cfac198199f/extract-content/0.log" Sep 30 09:18:33 crc kubenswrapper[4760]: I0930 09:18:33.098474 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-nkbp8_3f9a39d0-3377-49d8-b54f-6cfac198199f/extract-content/0.log" Sep 30 09:18:33 crc kubenswrapper[4760]: I0930 09:18:33.198352 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-nkbp8_3f9a39d0-3377-49d8-b54f-6cfac198199f/extract-utilities/0.log" Sep 30 09:18:33 crc kubenswrapper[4760]: I0930 09:18:33.345495 
4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-zmsbv_8065abae-3351-4daf-9aff-8bf97affce6a/extract-utilities/0.log" Sep 30 09:18:33 crc kubenswrapper[4760]: I0930 09:18:33.586439 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-zmsbv_8065abae-3351-4daf-9aff-8bf97affce6a/extract-content/0.log" Sep 30 09:18:33 crc kubenswrapper[4760]: I0930 09:18:33.645168 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-zmsbv_8065abae-3351-4daf-9aff-8bf97affce6a/extract-utilities/0.log" Sep 30 09:18:33 crc kubenswrapper[4760]: I0930 09:18:33.718538 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-zmsbv_8065abae-3351-4daf-9aff-8bf97affce6a/extract-content/0.log" Sep 30 09:18:33 crc kubenswrapper[4760]: I0930 09:18:33.944958 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-nkbp8_3f9a39d0-3377-49d8-b54f-6cfac198199f/registry-server/0.log" Sep 30 09:18:33 crc kubenswrapper[4760]: I0930 09:18:33.962071 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-zmsbv_8065abae-3351-4daf-9aff-8bf97affce6a/extract-content/0.log" Sep 30 09:18:33 crc kubenswrapper[4760]: I0930 09:18:33.975395 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-zmsbv_8065abae-3351-4daf-9aff-8bf97affce6a/extract-utilities/0.log" Sep 30 09:18:34 crc kubenswrapper[4760]: I0930 09:18:34.220684 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d967k2mh_a0b17021-6ad1-473c-ba06-7d4ba8eb162a/util/0.log" Sep 30 09:18:34 crc kubenswrapper[4760]: I0930 09:18:34.396344 4760 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_community-operators-zmsbv_8065abae-3351-4daf-9aff-8bf97affce6a/registry-server/0.log" Sep 30 09:18:34 crc kubenswrapper[4760]: I0930 09:18:34.435181 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d967k2mh_a0b17021-6ad1-473c-ba06-7d4ba8eb162a/pull/0.log" Sep 30 09:18:34 crc kubenswrapper[4760]: I0930 09:18:34.443366 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d967k2mh_a0b17021-6ad1-473c-ba06-7d4ba8eb162a/pull/0.log" Sep 30 09:18:34 crc kubenswrapper[4760]: I0930 09:18:34.453765 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d967k2mh_a0b17021-6ad1-473c-ba06-7d4ba8eb162a/util/0.log" Sep 30 09:18:34 crc kubenswrapper[4760]: I0930 09:18:34.642556 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d967k2mh_a0b17021-6ad1-473c-ba06-7d4ba8eb162a/pull/0.log" Sep 30 09:18:34 crc kubenswrapper[4760]: I0930 09:18:34.660994 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d967k2mh_a0b17021-6ad1-473c-ba06-7d4ba8eb162a/util/0.log" Sep 30 09:18:34 crc kubenswrapper[4760]: I0930 09:18:34.662625 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d967k2mh_a0b17021-6ad1-473c-ba06-7d4ba8eb162a/extract/0.log" Sep 30 09:18:34 crc kubenswrapper[4760]: I0930 09:18:34.798021 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-cqqsj_97e466f4-974e-4d3c-b041-c4d01ad15fb4/marketplace-operator/0.log" Sep 30 09:18:34 crc kubenswrapper[4760]: 
I0930 09:18:34.868990 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-4k7l5_9b11c132-f36f-49dd-af15-c0d78004c669/extract-utilities/0.log" Sep 30 09:18:35 crc kubenswrapper[4760]: I0930 09:18:35.013552 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-4k7l5_9b11c132-f36f-49dd-af15-c0d78004c669/extract-utilities/0.log" Sep 30 09:18:35 crc kubenswrapper[4760]: I0930 09:18:35.059795 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-4k7l5_9b11c132-f36f-49dd-af15-c0d78004c669/extract-content/0.log" Sep 30 09:18:35 crc kubenswrapper[4760]: I0930 09:18:35.073044 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-4k7l5_9b11c132-f36f-49dd-af15-c0d78004c669/extract-content/0.log" Sep 30 09:18:35 crc kubenswrapper[4760]: I0930 09:18:35.232086 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-4k7l5_9b11c132-f36f-49dd-af15-c0d78004c669/extract-content/0.log" Sep 30 09:18:35 crc kubenswrapper[4760]: I0930 09:18:35.242905 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-4k7l5_9b11c132-f36f-49dd-af15-c0d78004c669/extract-utilities/0.log" Sep 30 09:18:35 crc kubenswrapper[4760]: I0930 09:18:35.297204 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-wh9l2_aaececc4-56fa-4aba-959e-0595d2cb7270/extract-utilities/0.log" Sep 30 09:18:35 crc kubenswrapper[4760]: I0930 09:18:35.494668 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-4k7l5_9b11c132-f36f-49dd-af15-c0d78004c669/registry-server/0.log" Sep 30 09:18:35 crc kubenswrapper[4760]: I0930 09:18:35.516436 4760 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_redhat-operators-wh9l2_aaececc4-56fa-4aba-959e-0595d2cb7270/extract-utilities/0.log" Sep 30 09:18:35 crc kubenswrapper[4760]: I0930 09:18:35.571313 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-wh9l2_aaececc4-56fa-4aba-959e-0595d2cb7270/extract-content/0.log" Sep 30 09:18:35 crc kubenswrapper[4760]: I0930 09:18:35.600131 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-wh9l2_aaececc4-56fa-4aba-959e-0595d2cb7270/extract-content/0.log" Sep 30 09:18:35 crc kubenswrapper[4760]: I0930 09:18:35.774398 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-wh9l2_aaececc4-56fa-4aba-959e-0595d2cb7270/extract-utilities/0.log" Sep 30 09:18:35 crc kubenswrapper[4760]: I0930 09:18:35.796837 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-wh9l2_aaececc4-56fa-4aba-959e-0595d2cb7270/extract-content/0.log" Sep 30 09:18:36 crc kubenswrapper[4760]: I0930 09:18:36.067064 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-wh9l2_aaececc4-56fa-4aba-959e-0595d2cb7270/registry-server/0.log" Sep 30 09:18:48 crc kubenswrapper[4760]: I0930 09:18:48.684090 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-7c8cf85677-gvl7c_c35951af-e973-4663-9db5-2c5ac164bbba/prometheus-operator/0.log" Sep 30 09:18:48 crc kubenswrapper[4760]: I0930 09:18:48.925196 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-6899d445c8-g78wr_b7b18a96-fb82-48a3-a34e-ebea9ef3eb75/prometheus-operator-admission-webhook/0.log" Sep 30 09:18:48 crc kubenswrapper[4760]: I0930 09:18:48.961460 4760 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-6899d445c8-k9fmb_790a604d-1726-4fc9-8e29-e30af2f26616/prometheus-operator-admission-webhook/0.log" Sep 30 09:18:49 crc kubenswrapper[4760]: I0930 09:18:49.096831 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_observability-operator-cc5f78dfc-cpxd4_04a90715-31eb-49fb-9682-0a211630eede/operator/0.log" Sep 30 09:18:49 crc kubenswrapper[4760]: I0930 09:18:49.113374 4760 patch_prober.go:28] interesting pod/machine-config-daemon-f2lrk container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 09:18:49 crc kubenswrapper[4760]: I0930 09:18:49.113430 4760 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-f2lrk" podUID="7a9c8270-6964-4886-87d0-227b05b76da4" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 30 09:18:49 crc kubenswrapper[4760]: I0930 09:18:49.130797 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_perses-operator-54bc95c9fb-4pgmn_fed44a9b-44ce-4650-b854-6c84c8536c57/perses-operator/0.log" Sep 30 09:19:19 crc kubenswrapper[4760]: I0930 09:19:19.113004 4760 patch_prober.go:28] interesting pod/machine-config-daemon-f2lrk container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 09:19:19 crc kubenswrapper[4760]: I0930 09:19:19.113536 4760 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-f2lrk" podUID="7a9c8270-6964-4886-87d0-227b05b76da4" 
containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 30 09:19:49 crc kubenswrapper[4760]: I0930 09:19:49.113850 4760 patch_prober.go:28] interesting pod/machine-config-daemon-f2lrk container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 09:19:49 crc kubenswrapper[4760]: I0930 09:19:49.114988 4760 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-f2lrk" podUID="7a9c8270-6964-4886-87d0-227b05b76da4" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 30 09:19:49 crc kubenswrapper[4760]: I0930 09:19:49.115393 4760 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-f2lrk" Sep 30 09:19:49 crc kubenswrapper[4760]: I0930 09:19:49.116642 4760 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"149756a888246643c052c2b04ebbc33cdfe99da6c87001e418b7e5ba856a4272"} pod="openshift-machine-config-operator/machine-config-daemon-f2lrk" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Sep 30 09:19:49 crc kubenswrapper[4760]: I0930 09:19:49.116754 4760 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-f2lrk" podUID="7a9c8270-6964-4886-87d0-227b05b76da4" containerName="machine-config-daemon" containerID="cri-o://149756a888246643c052c2b04ebbc33cdfe99da6c87001e418b7e5ba856a4272" gracePeriod=600 Sep 30 09:19:49 crc kubenswrapper[4760]: E0930 
09:19:49.249418 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f2lrk_openshift-machine-config-operator(7a9c8270-6964-4886-87d0-227b05b76da4)\"" pod="openshift-machine-config-operator/machine-config-daemon-f2lrk" podUID="7a9c8270-6964-4886-87d0-227b05b76da4" Sep 30 09:19:49 crc kubenswrapper[4760]: I0930 09:19:49.758142 4760 generic.go:334] "Generic (PLEG): container finished" podID="7a9c8270-6964-4886-87d0-227b05b76da4" containerID="149756a888246643c052c2b04ebbc33cdfe99da6c87001e418b7e5ba856a4272" exitCode=0 Sep 30 09:19:49 crc kubenswrapper[4760]: I0930 09:19:49.758279 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-f2lrk" event={"ID":"7a9c8270-6964-4886-87d0-227b05b76da4","Type":"ContainerDied","Data":"149756a888246643c052c2b04ebbc33cdfe99da6c87001e418b7e5ba856a4272"} Sep 30 09:19:49 crc kubenswrapper[4760]: I0930 09:19:49.758687 4760 scope.go:117] "RemoveContainer" containerID="4a9ddef5425fff579d2f6273dac3ac3872c271566f3c6e2d32ae0cd93a2ffc46" Sep 30 09:19:49 crc kubenswrapper[4760]: I0930 09:19:49.760290 4760 scope.go:117] "RemoveContainer" containerID="149756a888246643c052c2b04ebbc33cdfe99da6c87001e418b7e5ba856a4272" Sep 30 09:19:49 crc kubenswrapper[4760]: E0930 09:19:49.761131 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f2lrk_openshift-machine-config-operator(7a9c8270-6964-4886-87d0-227b05b76da4)\"" pod="openshift-machine-config-operator/machine-config-daemon-f2lrk" podUID="7a9c8270-6964-4886-87d0-227b05b76da4" Sep 30 09:20:03 crc kubenswrapper[4760]: I0930 09:20:03.068216 4760 scope.go:117] "RemoveContainer" 
containerID="149756a888246643c052c2b04ebbc33cdfe99da6c87001e418b7e5ba856a4272" Sep 30 09:20:03 crc kubenswrapper[4760]: E0930 09:20:03.069265 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f2lrk_openshift-machine-config-operator(7a9c8270-6964-4886-87d0-227b05b76da4)\"" pod="openshift-machine-config-operator/machine-config-daemon-f2lrk" podUID="7a9c8270-6964-4886-87d0-227b05b76da4" Sep 30 09:20:14 crc kubenswrapper[4760]: I0930 09:20:14.067694 4760 scope.go:117] "RemoveContainer" containerID="149756a888246643c052c2b04ebbc33cdfe99da6c87001e418b7e5ba856a4272" Sep 30 09:20:14 crc kubenswrapper[4760]: E0930 09:20:14.068694 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f2lrk_openshift-machine-config-operator(7a9c8270-6964-4886-87d0-227b05b76da4)\"" pod="openshift-machine-config-operator/machine-config-daemon-f2lrk" podUID="7a9c8270-6964-4886-87d0-227b05b76da4" Sep 30 09:20:28 crc kubenswrapper[4760]: I0930 09:20:28.067146 4760 scope.go:117] "RemoveContainer" containerID="149756a888246643c052c2b04ebbc33cdfe99da6c87001e418b7e5ba856a4272" Sep 30 09:20:28 crc kubenswrapper[4760]: E0930 09:20:28.068129 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f2lrk_openshift-machine-config-operator(7a9c8270-6964-4886-87d0-227b05b76da4)\"" pod="openshift-machine-config-operator/machine-config-daemon-f2lrk" podUID="7a9c8270-6964-4886-87d0-227b05b76da4" Sep 30 09:20:43 crc kubenswrapper[4760]: I0930 09:20:43.067710 4760 scope.go:117] 
"RemoveContainer" containerID="149756a888246643c052c2b04ebbc33cdfe99da6c87001e418b7e5ba856a4272" Sep 30 09:20:43 crc kubenswrapper[4760]: E0930 09:20:43.069047 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f2lrk_openshift-machine-config-operator(7a9c8270-6964-4886-87d0-227b05b76da4)\"" pod="openshift-machine-config-operator/machine-config-daemon-f2lrk" podUID="7a9c8270-6964-4886-87d0-227b05b76da4" Sep 30 09:20:54 crc kubenswrapper[4760]: I0930 09:20:54.067535 4760 scope.go:117] "RemoveContainer" containerID="149756a888246643c052c2b04ebbc33cdfe99da6c87001e418b7e5ba856a4272" Sep 30 09:20:54 crc kubenswrapper[4760]: E0930 09:20:54.068580 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f2lrk_openshift-machine-config-operator(7a9c8270-6964-4886-87d0-227b05b76da4)\"" pod="openshift-machine-config-operator/machine-config-daemon-f2lrk" podUID="7a9c8270-6964-4886-87d0-227b05b76da4" Sep 30 09:21:01 crc kubenswrapper[4760]: I0930 09:21:01.539630 4760 generic.go:334] "Generic (PLEG): container finished" podID="7de214e2-d30b-4290-a9ea-64bed22298e0" containerID="551cd00196f68620a332e6be64b24e9c17ca55bf363fa96b6da3e8006a504b05" exitCode=0 Sep 30 09:21:01 crc kubenswrapper[4760]: I0930 09:21:01.539748 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-8vlhv/must-gather-wqqmm" event={"ID":"7de214e2-d30b-4290-a9ea-64bed22298e0","Type":"ContainerDied","Data":"551cd00196f68620a332e6be64b24e9c17ca55bf363fa96b6da3e8006a504b05"} Sep 30 09:21:01 crc kubenswrapper[4760]: I0930 09:21:01.541379 4760 scope.go:117] "RemoveContainer" 
containerID="551cd00196f68620a332e6be64b24e9c17ca55bf363fa96b6da3e8006a504b05" Sep 30 09:21:01 crc kubenswrapper[4760]: I0930 09:21:01.707668 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-8vlhv_must-gather-wqqmm_7de214e2-d30b-4290-a9ea-64bed22298e0/gather/0.log" Sep 30 09:21:06 crc kubenswrapper[4760]: I0930 09:21:06.067806 4760 scope.go:117] "RemoveContainer" containerID="149756a888246643c052c2b04ebbc33cdfe99da6c87001e418b7e5ba856a4272" Sep 30 09:21:06 crc kubenswrapper[4760]: E0930 09:21:06.068753 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f2lrk_openshift-machine-config-operator(7a9c8270-6964-4886-87d0-227b05b76da4)\"" pod="openshift-machine-config-operator/machine-config-daemon-f2lrk" podUID="7a9c8270-6964-4886-87d0-227b05b76da4" Sep 30 09:21:15 crc kubenswrapper[4760]: I0930 09:21:15.566969 4760 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-8vlhv/must-gather-wqqmm"] Sep 30 09:21:15 crc kubenswrapper[4760]: I0930 09:21:15.569659 4760 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-must-gather-8vlhv/must-gather-wqqmm" podUID="7de214e2-d30b-4290-a9ea-64bed22298e0" containerName="copy" containerID="cri-o://e3c9eed636e9a52d22554f034f5abed6d00d4faf7600cee3ca414f250933e307" gracePeriod=2 Sep 30 09:21:15 crc kubenswrapper[4760]: I0930 09:21:15.577223 4760 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-8vlhv/must-gather-wqqmm"] Sep 30 09:21:16 crc kubenswrapper[4760]: I0930 09:21:16.055062 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-8vlhv_must-gather-wqqmm_7de214e2-d30b-4290-a9ea-64bed22298e0/copy/0.log" Sep 30 09:21:16 crc kubenswrapper[4760]: I0930 09:21:16.055612 4760 util.go:48] "No ready sandbox for 
pod can be found. Need to start a new one" pod="openshift-must-gather-8vlhv/must-gather-wqqmm"
Sep 30 09:21:16 crc kubenswrapper[4760]: I0930 09:21:16.169354 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hcrcg\" (UniqueName: \"kubernetes.io/projected/7de214e2-d30b-4290-a9ea-64bed22298e0-kube-api-access-hcrcg\") pod \"7de214e2-d30b-4290-a9ea-64bed22298e0\" (UID: \"7de214e2-d30b-4290-a9ea-64bed22298e0\") "
Sep 30 09:21:16 crc kubenswrapper[4760]: I0930 09:21:16.169539 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/7de214e2-d30b-4290-a9ea-64bed22298e0-must-gather-output\") pod \"7de214e2-d30b-4290-a9ea-64bed22298e0\" (UID: \"7de214e2-d30b-4290-a9ea-64bed22298e0\") "
Sep 30 09:21:16 crc kubenswrapper[4760]: I0930 09:21:16.192041 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7de214e2-d30b-4290-a9ea-64bed22298e0-kube-api-access-hcrcg" (OuterVolumeSpecName: "kube-api-access-hcrcg") pod "7de214e2-d30b-4290-a9ea-64bed22298e0" (UID: "7de214e2-d30b-4290-a9ea-64bed22298e0"). InnerVolumeSpecName "kube-api-access-hcrcg". PluginName "kubernetes.io/projected", VolumeGidValue ""
Sep 30 09:21:16 crc kubenswrapper[4760]: I0930 09:21:16.272345 4760 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hcrcg\" (UniqueName: \"kubernetes.io/projected/7de214e2-d30b-4290-a9ea-64bed22298e0-kube-api-access-hcrcg\") on node \"crc\" DevicePath \"\""
Sep 30 09:21:16 crc kubenswrapper[4760]: I0930 09:21:16.389664 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7de214e2-d30b-4290-a9ea-64bed22298e0-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "7de214e2-d30b-4290-a9ea-64bed22298e0" (UID: "7de214e2-d30b-4290-a9ea-64bed22298e0"). InnerVolumeSpecName "must-gather-output". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Sep 30 09:21:16 crc kubenswrapper[4760]: I0930 09:21:16.477389 4760 reconciler_common.go:293] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/7de214e2-d30b-4290-a9ea-64bed22298e0-must-gather-output\") on node \"crc\" DevicePath \"\""
Sep 30 09:21:16 crc kubenswrapper[4760]: I0930 09:21:16.692682 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-8vlhv_must-gather-wqqmm_7de214e2-d30b-4290-a9ea-64bed22298e0/copy/0.log"
Sep 30 09:21:16 crc kubenswrapper[4760]: I0930 09:21:16.693200 4760 generic.go:334] "Generic (PLEG): container finished" podID="7de214e2-d30b-4290-a9ea-64bed22298e0" containerID="e3c9eed636e9a52d22554f034f5abed6d00d4faf7600cee3ca414f250933e307" exitCode=143
Sep 30 09:21:16 crc kubenswrapper[4760]: I0930 09:21:16.693282 4760 scope.go:117] "RemoveContainer" containerID="e3c9eed636e9a52d22554f034f5abed6d00d4faf7600cee3ca414f250933e307"
Sep 30 09:21:16 crc kubenswrapper[4760]: I0930 09:21:16.693329 4760 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-8vlhv/must-gather-wqqmm"
Sep 30 09:21:16 crc kubenswrapper[4760]: I0930 09:21:16.710643 4760 scope.go:117] "RemoveContainer" containerID="551cd00196f68620a332e6be64b24e9c17ca55bf363fa96b6da3e8006a504b05"
Sep 30 09:21:16 crc kubenswrapper[4760]: I0930 09:21:16.792598 4760 scope.go:117] "RemoveContainer" containerID="e3c9eed636e9a52d22554f034f5abed6d00d4faf7600cee3ca414f250933e307"
Sep 30 09:21:16 crc kubenswrapper[4760]: E0930 09:21:16.793065 4760 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e3c9eed636e9a52d22554f034f5abed6d00d4faf7600cee3ca414f250933e307\": container with ID starting with e3c9eed636e9a52d22554f034f5abed6d00d4faf7600cee3ca414f250933e307 not found: ID does not exist" containerID="e3c9eed636e9a52d22554f034f5abed6d00d4faf7600cee3ca414f250933e307"
Sep 30 09:21:16 crc kubenswrapper[4760]: I0930 09:21:16.793110 4760 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e3c9eed636e9a52d22554f034f5abed6d00d4faf7600cee3ca414f250933e307"} err="failed to get container status \"e3c9eed636e9a52d22554f034f5abed6d00d4faf7600cee3ca414f250933e307\": rpc error: code = NotFound desc = could not find container \"e3c9eed636e9a52d22554f034f5abed6d00d4faf7600cee3ca414f250933e307\": container with ID starting with e3c9eed636e9a52d22554f034f5abed6d00d4faf7600cee3ca414f250933e307 not found: ID does not exist"
Sep 30 09:21:16 crc kubenswrapper[4760]: I0930 09:21:16.793138 4760 scope.go:117] "RemoveContainer" containerID="551cd00196f68620a332e6be64b24e9c17ca55bf363fa96b6da3e8006a504b05"
Sep 30 09:21:16 crc kubenswrapper[4760]: E0930 09:21:16.793537 4760 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"551cd00196f68620a332e6be64b24e9c17ca55bf363fa96b6da3e8006a504b05\": container with ID starting with 551cd00196f68620a332e6be64b24e9c17ca55bf363fa96b6da3e8006a504b05 not found: ID does not exist" containerID="551cd00196f68620a332e6be64b24e9c17ca55bf363fa96b6da3e8006a504b05"
Sep 30 09:21:16 crc kubenswrapper[4760]: I0930 09:21:16.793566 4760 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"551cd00196f68620a332e6be64b24e9c17ca55bf363fa96b6da3e8006a504b05"} err="failed to get container status \"551cd00196f68620a332e6be64b24e9c17ca55bf363fa96b6da3e8006a504b05\": rpc error: code = NotFound desc = could not find container \"551cd00196f68620a332e6be64b24e9c17ca55bf363fa96b6da3e8006a504b05\": container with ID starting with 551cd00196f68620a332e6be64b24e9c17ca55bf363fa96b6da3e8006a504b05 not found: ID does not exist"
Sep 30 09:21:17 crc kubenswrapper[4760]: I0930 09:21:17.067438 4760 scope.go:117] "RemoveContainer" containerID="149756a888246643c052c2b04ebbc33cdfe99da6c87001e418b7e5ba856a4272"
Sep 30 09:21:17 crc kubenswrapper[4760]: E0930 09:21:17.067819 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f2lrk_openshift-machine-config-operator(7a9c8270-6964-4886-87d0-227b05b76da4)\"" pod="openshift-machine-config-operator/machine-config-daemon-f2lrk" podUID="7a9c8270-6964-4886-87d0-227b05b76da4"
Sep 30 09:21:17 crc kubenswrapper[4760]: I0930 09:21:17.080836 4760 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7de214e2-d30b-4290-a9ea-64bed22298e0" path="/var/lib/kubelet/pods/7de214e2-d30b-4290-a9ea-64bed22298e0/volumes"
Sep 30 09:21:32 crc kubenswrapper[4760]: I0930 09:21:32.067478 4760 scope.go:117] "RemoveContainer" containerID="149756a888246643c052c2b04ebbc33cdfe99da6c87001e418b7e5ba856a4272"
Sep 30 09:21:32 crc kubenswrapper[4760]: E0930 09:21:32.068381 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f2lrk_openshift-machine-config-operator(7a9c8270-6964-4886-87d0-227b05b76da4)\"" pod="openshift-machine-config-operator/machine-config-daemon-f2lrk" podUID="7a9c8270-6964-4886-87d0-227b05b76da4"
Sep 30 09:21:45 crc kubenswrapper[4760]: I0930 09:21:45.075880 4760 scope.go:117] "RemoveContainer" containerID="149756a888246643c052c2b04ebbc33cdfe99da6c87001e418b7e5ba856a4272"
Sep 30 09:21:45 crc kubenswrapper[4760]: E0930 09:21:45.076688 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f2lrk_openshift-machine-config-operator(7a9c8270-6964-4886-87d0-227b05b76da4)\"" pod="openshift-machine-config-operator/machine-config-daemon-f2lrk" podUID="7a9c8270-6964-4886-87d0-227b05b76da4"
Sep 30 09:21:48 crc kubenswrapper[4760]: I0930 09:21:48.600196 4760 scope.go:117] "RemoveContainer" containerID="cd48d0d93161a5653ba46822c329ddef4de9a7347e71432ce72258e2c257992e"
Sep 30 09:21:56 crc kubenswrapper[4760]: I0930 09:21:56.066896 4760 scope.go:117] "RemoveContainer" containerID="149756a888246643c052c2b04ebbc33cdfe99da6c87001e418b7e5ba856a4272"
Sep 30 09:21:56 crc kubenswrapper[4760]: E0930 09:21:56.067956 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f2lrk_openshift-machine-config-operator(7a9c8270-6964-4886-87d0-227b05b76da4)\"" pod="openshift-machine-config-operator/machine-config-daemon-f2lrk" podUID="7a9c8270-6964-4886-87d0-227b05b76da4"
Sep 30 09:22:07 crc kubenswrapper[4760]: I0930 09:22:07.067976 4760 scope.go:117] "RemoveContainer" containerID="149756a888246643c052c2b04ebbc33cdfe99da6c87001e418b7e5ba856a4272"
Sep 30 09:22:07 crc kubenswrapper[4760]: E0930 09:22:07.068874 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f2lrk_openshift-machine-config-operator(7a9c8270-6964-4886-87d0-227b05b76da4)\"" pod="openshift-machine-config-operator/machine-config-daemon-f2lrk" podUID="7a9c8270-6964-4886-87d0-227b05b76da4"
Sep 30 09:22:18 crc kubenswrapper[4760]: I0930 09:22:18.067421 4760 scope.go:117] "RemoveContainer" containerID="149756a888246643c052c2b04ebbc33cdfe99da6c87001e418b7e5ba856a4272"
Sep 30 09:22:18 crc kubenswrapper[4760]: E0930 09:22:18.070479 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f2lrk_openshift-machine-config-operator(7a9c8270-6964-4886-87d0-227b05b76da4)\"" pod="openshift-machine-config-operator/machine-config-daemon-f2lrk" podUID="7a9c8270-6964-4886-87d0-227b05b76da4"
Sep 30 09:22:31 crc kubenswrapper[4760]: I0930 09:22:31.068067 4760 scope.go:117] "RemoveContainer" containerID="149756a888246643c052c2b04ebbc33cdfe99da6c87001e418b7e5ba856a4272"
Sep 30 09:22:31 crc kubenswrapper[4760]: E0930 09:22:31.071015 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f2lrk_openshift-machine-config-operator(7a9c8270-6964-4886-87d0-227b05b76da4)\"" pod="openshift-machine-config-operator/machine-config-daemon-f2lrk" podUID="7a9c8270-6964-4886-87d0-227b05b76da4"